AI is better than humans at detecting blue whale calls

Whale researchers could soon be out of a job – or at least a tedious and repetitive one – by applying artificial intelligence (AI) to their research.

Using machine learning, a team from the Australian Antarctic Division, the K. Lisa Yang Center for Conservation Bioacoustics at Cornell University and Curtin University trained an algorithm to detect the “D calls” of blue whales in sound recordings, with greater accuracy and speed than human experts.

Whale acoustician Dr Brian Miller said the technology will make it easier for scientists to analyze hundreds of thousands of hours of recordings from these elusive and hard-to-study whales, to better understand trends in their populations as they recover from whaling.

“By analyzing our recordings for D calls and other sounds, we get a more complete picture of the behavior of these animals, as well as trends and potential changes in their behavior,” Dr Miller said.

“The deep learning algorithm we applied to this task outperforms experienced whale acousticians in accuracy, is much faster, and does not fatigue.

“So that frees us up to think about other big picture issues.”

Social calls

D calls are considered “social” calls made by both male and female whales on feeding grounds. Unlike the “songs” of male blue whales, which have a regular and predictable pattern, D calls are highly variable from whale to whale, from season to season and from year to year.

This variability makes automating the analysis of recordings more difficult than it would be for more consistent sounds, such as song.

To overcome this, the team trained the algorithm on a comprehensive library of around 5,000 D calls, captured in 2,000 hours of sound recorded at sites around Antarctica between 2005 and 2017.

“The library covered different seasons and the range of habitats in which we would expect to find Antarctic blue whales, to ensure that we captured the variability of D calls as well as the varying soundscapes through which whales travel,” said Dr Miller.

Before the training could begin, however, six different human analysts went through the recordings and identified or “annotated” the D calls.

Rather than analyzing the raw audio, the team converted each call into a “spectrogram” – a visual representation of the call and its duration.
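As an illustration, here is a minimal sketch of how a sound recording can be turned into a spectrogram in Python. The file name and STFT settings are assumptions made for the example, not the study's actual parameters:

```python
# Minimal sketch: converting audio to a spectrogram image.
# Assumes a mono WAV recording; window settings are illustrative.
import numpy as np
from scipy.io import wavfile
from scipy import signal
import matplotlib.pyplot as plt

rate, audio = wavfile.read("recording.wav")  # hypothetical input file

# Short-time Fourier transform: window length and overlap control the
# time/frequency resolution of the resulting image.
freqs, times, sxx = signal.spectrogram(
    audio.astype(np.float64), fs=rate, nperseg=1024, noverlap=512)

# Plot power on a log scale so quiet calls remain visible.
plt.pcolormesh(times, freqs, 10 * np.log10(sxx + 1e-12), shading="auto")
plt.xlabel("Time (s)")
plt.ylabel("Frequency (Hz)")
plt.show()
```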

Using machine learning techniques, the algorithm was trained to identify D calls from 85% of the library data, with the remaining 15% held back to validate and refine its performance.
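A hedged sketch of what such an 85/15 split and training setup could look like, using scikit-learn and Keras – the input files, network shape and hyperparameters here are illustrative assumptions, not the study's published architecture:

```python
# Minimal sketch: 85/15 train/validation split and a small convolutional
# classifier over spectrogram images. All names and settings are assumed.
import numpy as np
from sklearn.model_selection import train_test_split
import tensorflow as tf

# Hypothetical pre-computed arrays: spectrograms shaped
# (n_examples, height, width, 1); labels 1 = D call, 0 = background.
spectrograms = np.load("spectrograms.npy")
labels = np.load("labels.npy")

# test_size=0.15 gives the 85/15 split described in the article.
x_train, x_val, y_train, y_val = train_test_split(
    spectrograms, labels, test_size=0.15, random_state=0)

model = tf.keras.Sequential([
    tf.keras.layers.Conv2D(16, 3, activation="relu",
                           input_shape=x_train.shape[1:]),
    tf.keras.layers.MaxPooling2D(),
    tf.keras.layers.Conv2D(32, 3, activation="relu"),
    tf.keras.layers.MaxPooling2D(),
    tf.keras.layers.Flatten(),
    tf.keras.layers.Dense(1, activation="sigmoid"),  # D call / not D call
])
model.compile(optimizer="adam", loss="binary_crossentropy",
              metrics=["accuracy"])

# The held-back 15% guides training without being learned from directly.
model.fit(x_train, y_train, validation_data=(x_val, y_val), epochs=10)
```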

Human versus machine

The trained AI was then given a test dataset of 187 hours of annotated recordings from a year-long deployment at Casey in 2019.

The research team compared the D-call detections made by the AI with those identified by the human experts, to see where they disagreed.

An independent human judge (Dr Miller) then decided whether each disputed detection was in fact a D call, to determine which was more accurate.
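One way to set up such a comparison is to match the two sets of detections by call time and set aside the unmatched ones for adjudication. The sketch below assumes detections are simple start times in seconds and uses an arbitrary two-second matching tolerance, neither of which comes from the study:

```python
# Minimal sketch: pairing AI and human detections and isolating disagreements.

def match_detections(ai_times, human_times, tolerance=2.0):
    """Pair detections within `tolerance` seconds of each other;
    everything left unpaired is a disagreement to adjudicate."""
    human_left = sorted(human_times)
    agreed, ai_only = [], []
    for t in sorted(ai_times):
        hit = next((h for h in human_left if abs(h - t) <= tolerance), None)
        if hit is None:
            ai_only.append(t)       # flagged by the AI only
        else:
            agreed.append((t, hit))
            human_left.remove(hit)  # each human detection matches once
    return agreed, ai_only, human_left  # human_left = human-only detections

agreed, ai_only, human_only = match_detections(
    ai_times=[12.1, 58.4, 301.0], human_times=[12.5, 300.2, 410.7])
print(len(agreed), "agreed;", len(ai_only), "AI-only;",
      len(human_only), "human-only")
```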

“The AI found about 90% of the D calls and the human experts found just over 70%, and the AI was better at detecting very quiet sounds,” Dr Miller said.

“It took about 10 hours of human effort to annotate the test dataset, but it took the AI 30 seconds to analyze that data – 1,200 times faster.”

The team has made their AI available to other whale researchers around the world, to train on other whale sounds and soundscapes.

“Now that we have this power to analyze thousands of hours of sound very quickly, it would be great to establish more recording sites and larger recording networks, and to develop a long-term monitoring project to look at trends in blue whales and other species,” Dr Miller said.

The research is published in Remote Sensing in Ecology and Conservation.
