For many years, scientists have been fascinated by the clicks and whistles of dolphins. Figuring out what these sounds mean has been a major goal. Now, technology, including Google’s Pixel phones, is playing a key role in trying to decode these complex marine communications.

Researchers from the Wild Dolphin Project (WDP) have spent decades studying Atlantic spotted dolphins in the Bahamas. They’ve gathered huge amounts of underwater audio and video, carefully linking sounds to specific dolphin behaviors. This long-term work provides crucial context, identifying sounds like unique signature whistles used between mothers and calves, or specific squawks heard during disputes.

Analyzing this massive dataset is a huge undertaking, and this is where Google comes in. The company developed an AI model called DolphinGemma, trained specifically on WDP’s dolphin recordings. The model learns the structure of dolphin sounds, helping identify patterns and predict what sound might come next, much like language models predict the next word in a sentence.
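To get a feel for how next-sound prediction works in principle, here is a deliberately tiny sketch. It is not DolphinGemma (which is a far more sophisticated neural model); it just treats each sound as a token, counts which token tends to follow which, and predicts the most frequent successor. All token names and sequences below are invented for illustration.

```python
from collections import Counter, defaultdict

# Invented example sequences: each dolphin sound is treated as a token.
sequences = [
    ["whistle_A", "click", "whistle_A", "squawk"],
    ["whistle_A", "click", "click", "whistle_B"],
    ["whistle_A", "click", "whistle_A", "whistle_B"],
]

# Count how often each token follows each other token (bigram counts).
bigrams = defaultdict(Counter)
for seq in sequences:
    for prev, nxt in zip(seq, seq[1:]):
        bigrams[prev][nxt] += 1

def predict_next(token):
    """Return the most frequent successor of `token`, or None if unseen."""
    followers = bigrams.get(token)
    return followers.most_common(1)[0][0] if followers else None

print(predict_next("whistle_A"))  # prints "click": it follows whistle_A most often
```

A real language model replaces these raw counts with a neural network that conditions on the whole preceding context, but the core idea is the same: learn the statistics of what comes next.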

But how do Pixel phones fit in? Researchers are also exploring two-way communication using a system called CHAT (Cetacean Hearing Augmentation Telemetry), developed with Georgia Tech. CHAT aims to establish a simpler, shared vocabulary using synthetic whistles linked to objects dolphins enjoy, like sargassum or scarves.

The system needs to detect a dolphin mimicking these sounds amid noisy underwater environments and identify the sound in real time. Initially, a Google Pixel 6 handled this high-fidelity analysis; the project is now moving towards the upcoming Google Pixel 9. These phones are powerful enough to run the necessary AI models directly in the field. Using Pixel smartphones significantly reduces the need for bulky, custom hardware, making the research equipment smaller, more affordable, and easier to maintain in the ocean.

DolphinGemma helps make the CHAT system faster by predicting potential mimics earlier. This allows researchers to respond more quickly, reinforcing the learning process for the dolphins. The AI powering this is related to the Gemini models, and while its application here is specialized, it highlights Google’s push for integrating AI, similar to how features like Gemini Live are expected on the Pixel 9.

Google plans to release DolphinGemma as an open model, hoping it will help scientists worldwide study other dolphin species. Combining dedicated field research with advanced AI and accessible hardware like Pixel phones is bringing us closer to understanding our intelligent marine neighbors.

Dwayne Cubbins
1858 Posts

My fascination with Android phones began the moment I got my hands on one. Since then, I've been on a journey to decode the ever-evolving tech landscape, fueled by a passion for both the "how" and the "why." Since 2018, I've been crafting content that empowers users and demystifies the tech world. From in-depth how-to guides that unlock your phone's potential to breaking news based on original research, I strive to make tech accessible and engaging.

