The most common perception of AI is that it is used either for detection and recognition or for trend analysis. Challenging that narrative, Google launched its “AI for good” program.
Google has collaborated with a group of cetologists to analyze years of undersea recordings using AI.
The company aims to build a machine learning system that can identify humpback whale calls in those recordings.
What are whale calls?
Whales travel through deep waters in search of food. As they travel, they produce patterned sounds, known as calls, which they use to communicate with other whales. The whales themselves are hard to locate and track in deep water, but their calls can be monitored to follow where they have gone. The humpback is one particular species of whale, and Google is working to pick out its calls from the undersea data.
What did Google do?
Google has partnered with NOAA, which has collected this data from the ocean floor over many years. Google aims to analyze the deep-sea audio to see whether useful patterns of sound pointing to humpback whale calls can be extracted from it. The data comes from a worldwide network of hydrophones planted along the ocean floor to track whale movements.
Google saw this data as a great testbed for machine learning, even though researchers had already analyzed a large amount of it by hand. To prove that an AI system is capable of extracting patterns from noisy data, the company is relying on the system to perform the task on its own.
Google started by training the AI system on sample whale calls whose frequencies and audio lengths suited the model. The system was also taught to pick out the frequencies that belong to humpback calls, so that background noise could be ignored.
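One simple way to illustrate the idea of focusing on call frequencies is to compare the energy inside an assumed humpback band against the total spectral energy. The sketch below is a toy detector, not Google's actual model; the band limits (100–2000 Hz) and the synthetic signals are illustrative assumptions:

```python
import numpy as np

def band_energy_ratio(signal, rate, low_hz=100, high_hz=2000):
    """Fraction of spectral energy inside an assumed humpback call band."""
    spectrum = np.abs(np.fft.rfft(signal)) ** 2
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / rate)
    in_band = (freqs >= low_hz) & (freqs <= high_hz)
    return spectrum[in_band].sum() / spectrum.sum()

rate = 8000                                   # samples per second
t = np.arange(rate) / rate                    # one second of audio
rng = np.random.default_rng(0)

call = np.sin(2 * np.pi * 400 * t)            # a tone inside the band
noise = rng.standard_normal(rate)             # broadband noise

print(band_energy_ratio(call, rate))          # close to 1: energy concentrated in band
print(band_energy_ratio(noise, rate))         # much lower: energy spread across spectrum
```

A real system learns far subtler cues than a fixed band, but the same principle applies: a candidate call concentrates energy where humpback vocalizations live, while ambient ocean noise does not.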
After a successful training phase, the AI system was tested on 15 years of deep-sea audio, for which it produced spectrograms. Spectrograms are a good way to analyze patterns visually, and they helped identify whale-call patterns in the deep-sea recordings. The system ultimately distilled the 15 years of data into a 75-second recording, skimming past all the noise and irrelevant sounds.
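A spectrogram is simply the magnitude of short-time Fourier transforms stacked over time, turning audio into an image where call patterns become visible. A minimal NumPy sketch follows; the frame length, hop size, and synthetic 300 Hz "call" are illustrative assumptions, not parameters from Google's pipeline:

```python
import numpy as np

def spectrogram(signal, frame_len=256, hop=128):
    """Magnitude spectrogram: one windowed FFT per overlapping frame."""
    window = np.hanning(frame_len)
    n_frames = 1 + (len(signal) - frame_len) // hop
    frames = np.stack([signal[i * hop : i * hop + frame_len] * window
                       for i in range(n_frames)])
    # rfft keeps only the non-negative frequencies of a real signal
    return np.abs(np.fft.rfft(frames, axis=1))   # shape: (time, frequency)

rate = 4000                                      # samples per second
t = np.arange(rate) / rate                       # one second of audio
rng = np.random.default_rng(0)

# A tone buried in noise, standing in for a call in ocean ambience.
audio = np.sin(2 * np.pi * 300 * t) + 0.5 * rng.standard_normal(rate)

spec = spectrogram(audio)
peak_bin = spec.mean(axis=0).argmax()
peak_hz = peak_bin * rate / 256                  # bin width = rate / frame_len
print(peak_hz)                                   # the tone stands out near 300 Hz
```

Viewed as an image, the tone appears as a bright horizontal stripe against speckled noise, which is why image-style pattern recognition works well on spectrograms.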
This exercise assured researchers that they can rely on AI to handle the statistics and analysis, freeing them to focus on interpreting the results.
AI for good, Yay or Nay?
Share Your Thoughts