30 January, 2026
What if protecting the rainforest started with just listening? Google is betting on it. Its new Forest Listeners experiment turns anyone with an internet connection into a citizen scientist, helping train AI to recognise animal calls from Brazil’s rainforests. The result could be the largest verified database of rainforest sounds ever created.
Listening matters because sound is often the only practical way to “see” what’s happening in dense, remote forests: birds, frogs, insects, mammals, and even logging or vehicles all leave acoustic fingerprints that reveal who’s there and how healthy the ecosystem is. Rich, varied soundscapes tend to signal higher biodiversity, while quieter or more monotonous ones can warn of species loss long before it’s obvious on the ground.
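To make that idea concrete, here is a minimal sketch of one common way ecoacoustics researchers quantify soundscape "richness": the Shannon entropy of how acoustic energy is spread across frequency bands. This is an illustrative toy, not the method used by Forest Listeners or Perch; the function name and the example band energies are made up.

```python
# Toy illustration of an acoustic diversity index: Shannon entropy (in bits)
# of the energy distribution across frequency bands. A varied soundscape
# spreads energy across many bands and scores high; a quiet or monotone
# one concentrates energy and scores low. All values below are invented.
import math

def soundscape_entropy(band_energies):
    """Shannon entropy of the normalised per-band energy distribution."""
    total = sum(band_energies)
    if total == 0:
        return 0.0  # silence: no diversity to measure
    probs = [e / total for e in band_energies if e > 0]
    return -sum(p * math.log2(p) for p in probs)

# Varied soundscape: energy spread across bands (birds, frogs, insects...)
rich = [1.0, 0.9, 1.1, 1.0, 0.95, 1.05]
# Monotonous soundscape: almost all energy in one band
poor = [5.0, 0.1, 0.0, 0.0, 0.0, 0.0]

print(soundscape_entropy(rich) > soundscape_entropy(poor))
```

Indices along these lines are why a forest that simply *sounds* varied can flag higher biodiversity before anyone counts species on the ground.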
Here’s how the project works: users step into a 3D virtual forest and listen to short clips recorded in the Atlantic Forest and Amazon biomes. Each “yes” or “no” answer to whether a clip contains a given species’ call becomes a verified label that helps train Perch, Google DeepMind’s AI model built to track biodiversity automatically. It’s simple, quite fun, and it teaches the AI what real ecosystems sound like at scale.
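The step from individual clicks to a *verified* database usually involves aggregating many listeners’ votes per clip. Here is a hedged sketch of how such crowd-label aggregation might look; the `Vote` type, `aggregate_labels` function, and thresholds are assumptions for illustration, not Google’s actual pipeline.

```python
# Hypothetical sketch: turning citizen-science "yes"/"no" votes into
# verified labels. A (clip, species) pair only gets a label when enough
# listeners answered and a large majority of them agree.
from collections import Counter
from dataclasses import dataclass

@dataclass
class Vote:
    clip_id: str   # which audio clip was played
    species: str   # species the listener was asked about
    heard: bool    # the listener's "yes" (True) or "no" (False)

def aggregate_labels(votes, min_votes=5, agreement=0.8):
    """Return {(clip_id, species): bool} for pairs with a clear consensus."""
    yes_counts = Counter()
    totals = Counter()
    for v in votes:
        key = (v.clip_id, v.species)
        totals[key] += 1
        yes_counts[key] += v.heard
    labels = {}
    for key, n in totals.items():
        if n < min_votes:
            continue  # too few listeners to trust this pair yet
        frac_yes = yes_counts[key] / n
        if frac_yes >= agreement:
            labels[key] = True    # confident positive training label
        elif frac_yes <= 1 - agreement:
            labels[key] = False   # confident negative training label
    return labels
```

Only the high-agreement labels would then feed a model like Perch as training data; ambiguous or thinly voted clips stay unlabelled until more people listen.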
The project also fills a major gap in conservation research. Scientists already collect millions of hours of audio from remote forests, but analysing that data manually takes years.
By turning rainforest sound analysis into an accessible, game‑like experience, Google is testing a new way to scale environmental science through global participation. If that model works, it could change how we monitor fragile ecosystems everywhere, using human attention and machine learning together to spot risks earlier and respond faster.
By Emma Alajarin