Researchers have successfully implemented an AI algorithm that gauges the vitality of coral reefs through their soundscape, opening new avenues in environmental monitoring and restoration initiatives.
We have long been resigned to living in a world where AI listens to our every move: on the Internet, at the cashier’s till, in what we watch on cable TV, and in too many other places. The power that data confers is breathtaking, but what if we could apply data science to some of the seemingly intractable environmental problems facing us?
In a trailblazing initiative, scientists have harnessed the power of sound recordings and AI to monitor the health of coral reefs. The innovative machine-learning algorithm can effectively distinguish between healthy and degraded coral reefs by analyzing the ecosystem’s acoustic footprint, according to a recent report by Mongabay.
Previous research suggested a compelling link between the acoustic characteristics of a healthy coral reef and one that has successfully recuperated from degradation. However, interpreting the acoustic data to validate this connection was a time-consuming process, not suitable for efficient and wide-scale application. The introduction of a new AI algorithm represents a significant leap forward in this context, efficiently processing the acoustic data and determining the success of reef restoration efforts.
Despite its potential, researchers insist that additional analysis is necessary. The algorithm, primarily tested in the Pacific Coral Triangle, still needs to be assessed in other geographic regions to verify that it generalizes to other reefs.
Underwater, the vibrant colors and teeming marine life of a healthy coral reef are instantly recognizable. Recent studies conducted in Indonesia’s Sangkarang Archipelago have found that these reefs are acoustically distinctive as well, given the right AI tool to listen.
The research, spearheaded by doctoral student Ben Williams from University College London and the Zoological Society of London, builds upon a concept introduced in a 2021 study. That study found that the underwater sounds of mature restored reefs closely resemble those of healthy ones. These sounds, made by various marine species and ranging from pops and grunts to scrapes and whistles, are detectable by underwater microphones known as hydrophones.
Williams, inspired by a similar challenge from his undergraduate studies, in which he worked with complex datasets to identify heart disease patients, employed machine learning to analyze these soundscapes. His algorithm was designed to categorize each audio sample as emanating from either a healthy or a degraded reef.
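The article does not describe the pipeline's internals, but the general approach can be sketched as follows: summarize each hydrophone clip with spectral features, then train a binary classifier on labeled examples. This is a minimal illustration, assuming simple band-energy features, synthetic stand-in audio, and a random-forest classifier; none of these specifics come from the study itself.

```python
# Toy sketch of soundscape classification (healthy vs. degraded reef).
# Features, classifier, and audio are illustrative assumptions, not the
# study's actual method.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

def spectral_features(audio, n_bands=16):
    """Summarize a clip as mean log-magnitude in evenly split frequency bands."""
    spectrum = np.abs(np.fft.rfft(audio))
    bands = np.array_split(spectrum, n_bands)
    return np.array([np.log1p(b.mean()) for b in bands])

# Synthetic stand-ins for labeled hydrophone clips: in this toy setup,
# "healthy" clips are louder and more broadband than "degraded" ones.
rng = np.random.default_rng(0)
def fake_clip(healthy, sr=16000):
    t = np.linspace(0, 1, sr)
    tone = np.sin(2 * np.pi * (300 if healthy else 80) * t)
    noise = rng.normal(scale=1.0 if healthy else 0.2, size=t.size)
    return tone + noise

labels = [1] * 50 + [0] * 50          # 1 = healthy, 0 = degraded
X = np.array([spectral_features(fake_clip(h)) for h in labels])
y = np.array(labels)

clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X, y)
print(clf.score(X, y))  # training accuracy on the toy data
```

In practice the real system would face far messier inputs (boat noise, weather, overlapping species calls), which is why validating the model across regions matters.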
Williams applied his model to audio samples from three distinct reef sites within the Sangkarang Archipelago. Findings showed that most recordings from mature restoration sites echoed the soundscape of a healthy reef, whereas recordings from a recently restored site reflected a degraded state.
Aaron Rice, a research scientist unaffiliated with the study, noted that these findings were an important milestone for coral reef restoration. The ability to use AI to interpret acoustic data and assess restoration progress provides a promising tool in the pursuit of environmental recovery.
Interestingly, it is not just scientists who listen to the sounds of these coral reefs. Juvenile fish and coral larvae rely on the acoustic signature of a healthy reef when choosing their habitat. A diverse and bustling soundscape is often an indicator of a vibrant ecosystem teeming with marine life.
According to Williams, the next logical step is to identify the specific sounds that differentiate healthy from degraded reefs. This complex task is often compared to sifting through the cacophonous audio environment of a bustling city. Understanding these sounds and how they travel through the underwater environment could potentially hold the key to deciphering the state of the reef’s health.
In a bid to uncover the sound of reef recovery across diverse ecosystems, the research team has initiated projects in various locations, including the Great Barrier Reef, Mexico, and the Maldives. This will allow them to test the applicability of their AI model across multiple geographical regions and diverse marine biodiversity.
The effectiveness of this bioacoustic data collection method lies in its relative simplicity and cost-effectiveness. Hydrophones are easy to install, can collect data for prolonged periods, and even operate during adverse conditions when direct human observation is unfeasible.
However, as Williams points out, while restoration efforts are crucial, they won’t single-handedly secure the future of the world’s coral reefs. Climate change continues to pose a significant threat to these vital ecosystems. Thus, while AI offers hopeful tools, curbing greenhouse gas emissions remains paramount to prevent the forecasted loss of almost all of the world’s reefs by the end of this century.