
Understanding Reef Fish Sounds through Explainable Artificial Intelligence

A new study explores how explainable artificial intelligence can classify and interpret reef fish sounds, revealing which acoustic features best describe each sound type.

From the crackle of snapping shrimp to the calls and clicks of fish, reefs are alive with sound. These underwater soundscapes aren’t just background noise; they’re critical to understanding how marine ecosystems function. But as researchers collect vast amounts of acoustic data, manually identifying which fish are making which sounds has become nearly impossible.

That’s where artificial intelligence comes in.

A new study by Viviane R. Barroso, Marine Bioacoustics and Oceanography Lead at Diatom Group, and colleagues Aléxia Lessa (Bioacoustics and Fish Ecology Lead) and Fabio Contrera (Bioacoustics and AI Lead), published in Philosophical Transactions of the Royal Society B, used supervised learning algorithms to classify unknown fish sounds in the subtropical reef of Arraial do Cabo, Brazil. The study explores how machine learning can not only classify reef fish sounds but also explain how it makes its decisions, a crucial step toward building trust and interpretability in AI-based ecology.

https://doi.org/10.1098/rstb.2024.0055

The authors performed a multiclass classification of four types of pulsed fish sounds using four supervised learning algorithms (Naive Bayes, Random Forest, Decision Trees, and Multilayer Perceptron). To better understand the influence of each acoustic feature on the models' predictions, they applied Explainable Artificial Intelligence (XAI) methods, specifically SHapley Additive exPlanations (SHAP). This approach revealed how much each individual feature contributed to the prediction of each sound class.
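The published analysis involves more steps than fit here, but a minimal sketch of this kind of workflow, assuming a Python stack with scikit-learn and the shap package, might look like the following. The feature table, file path, and parameter values are hypothetical illustrations, not taken from the study.

```python
# Sketch: multiclass classification of pulsed fish sounds from acoustic
# features, explained per class with SHAP. Data and paths are placeholders.
import pandas as pd
import shap
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

# Hypothetical feature table: one row per detected sound, columns holding
# acoustic features (e.g. peak frequency, pulse rate, MFCC coefficients)
# plus a "sound_type" label with four pulsed-sound classes.
df = pd.read_csv("fish_sound_features.csv")  # placeholder path
X = df.drop(columns=["sound_type"])
y = df["sound_type"]

X_train, X_test, y_train, y_test = train_test_split(
    X, y, stratify=y, random_state=0
)

# Random Forest was one of the four supervised algorithms compared.
model = RandomForestClassifier(n_estimators=500, random_state=0)
model.fit(X_train, y_train)

# TreeExplainer computes SHAP values efficiently for tree ensembles.
explainer = shap.TreeExplainer(model)
shap_values = explainer.shap_values(X_test)

# Depending on the shap version, the result is a list with one array per
# class or a single 3-D array indexed by class; either way, feature
# contributions can be inspected separately for each sound class.
class_0 = shap_values[0] if isinstance(shap_values, list) else shap_values[:, :, 0]
shap.summary_plot(class_0, X_test)
```

The per-class view is the point here: rather than one global feature ranking, SHAP shows which features push a sound toward or away from each of the four classes.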

A key innovation was interpreting how each Mel-frequency cepstral coefficient (MFCC) contributed to each class of fish sound. These coefficients are widely used in speech recognition, including to study emotions and even diseases such as depression, with specific coefficients acting as indicators of particular disorders. For fish, they may be important for recognising species-specific sounds or indicating behaviours.
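For illustration, MFCCs can be extracted from a recording with an audio library such as librosa; the file name and parameter values below are hypothetical, not the study's settings.

```python
# Sketch: turn a clip containing a detected fish sound into an MFCC
# feature vector suitable as classifier input.
import librosa

# sr=None keeps the recording's native sample rate instead of resampling.
audio, sr = librosa.load("pulsed_fish_sound.wav", sr=None)

# Compute 13 MFCCs per analysis frame, then average over time so the
# whole sound is summarised by one coefficient vector.
mfcc = librosa.feature.mfcc(y=audio, sr=sr, n_mfcc=13)
mfcc_features = mfcc.mean(axis=1)  # shape: (13,)

print(mfcc_features)
```

Feeding vectors like this into the classifiers above is what lets SHAP attribute predictions coefficient by coefficient.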

Integrating interpretability into the complex process of classifying fish sounds is a significant step towards creating trustworthy AI applications in bioacoustics. Explainable models can also enhance our understanding of the acoustic diversity and ecological functions of reef fish. Recognising and characterising these sounds helps us better understand diel behaviours and important ecological processes in reef systems.

By teaching machines to listen and explain what they hear, this research opens a new window into the underwater world. Each click, pop, and pulse tells part of the story of reef life: who's there, what they're doing, and how healthy their home is. As coral reefs face mounting threats from warming oceans and human activity, being able to "hear" their hidden conversations could become one of our most powerful tools for protecting them. Thanks to AI, we're learning not just to study the ocean, but to truly listen to it.