

If We Could Talk to the Animals



Can artificial intelligence really help us talk to the animals?

Credit: Phillip Lay/Observer Design; Alamy; Getty

Understanding what animals think and how they communicate has fascinated humankind for centuries. The concept has spawned articles, books, recordings, and movies. Yet decoding these complex vocalizations, from squeaks and barks to snorts and growls, has generally fallen outside the bounds of what is possible.

Now, artificial intelligence (AI) has joined the conversation and is helping scientists learn how to talk to animals through a variety of methods, including large language models (LLMs).

"If we are able to understand what animals are thinking and saying, in many cases we can improve their welfare and well-being," says Elodie Floriane Mandel-Briefer, an associate professor in the department of biology at Denmark's University of Copenhagen.

An understanding of animal languages would have a profound impact on the way people interact with household and farm animals, and it could significantly change the way humans interact with mammals and birds in the wild, including endangered species. As a result, researchers are exploring ways to communicate with mice, pigs, dolphins, whales, bats, and other animals.

"Deciphering animal sounds and transforming them into actual syntax is a remarkably difficult task, but we have begun to make progress," says Kevin Coffey, a research assistant professor in the department of psychiatry and behavioral sciences at the University of Washington. He and others already have developed algorithms that have begun to give animals a voice humans can understand.

A Voice of Reason

Humans communicating with other animals is not a revolutionary concept. Dogs can typically understand close to 100 words, and apes can communicate with humans through sign language and even iPads. Yet, translating sounds, words, and ideas into functional communication across the animal kingdom is a remarkably complex task.

For one thing, the range of sounds animals make, from chirps to roars, is unlike any form of human communication, and the underlying patterns cannot be mapped or translated into any known language. Some scientists even question whether animal communication constitutes a language system at all, or merely a set of basic calls.

For another, cataloging sounds is extraordinarily difficult. It is not unusual for animal sounds to extend beyond the human capacity to hear them, and even when they are audible there's ambient environmental noise to consider. As a result, recordings often are not clear or definitive.

The resulting complexity makes human interpretation extraordinarily difficult, even with AI. "Different animal species make discrete sounds and use different patterns to communicate, so there's no way to develop a single algorithm that works for all groups of animals," Coffey explains. So far, he and other scientists have had good results applying machine vision algorithms to the vocalizations of animals whose calls have a distinct structure, such as mice, rats, and birds.
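Why machine vision? A spectrogram renders sound as an image, so image-style detection can scan it for call shapes. What follows is a minimal sketch of that idea, assuming a hypothetical recording file, a rough ultrasonic band, and an arbitrary energy threshold; it is not any group's published detector.

```python
# Minimal sketch: treat a recording's spectrogram as an image and flag
# time windows with unusual energy in a species-specific frequency band.
# The file name, band limits, and threshold are illustrative assumptions.
import numpy as np
from scipy.io import wavfile
from scipy.signal import spectrogram

rate, audio = wavfile.read("recording.wav")   # hypothetical field recording
if audio.ndim > 1:                            # keep a single channel
    audio = audio[:, 0]

freqs, times, power = spectrogram(audio, fs=rate, nperseg=1024)

# Mouse ultrasonic calls sit roughly in the 30-110 kHz range, which
# assumes the recording was sampled fast enough to capture it.
band = (freqs >= 30_000) & (freqs <= 110_000)
band_energy = power[band].sum(axis=0)

# Flag windows whose in-band energy rises well above the noise floor.
threshold = 5 * np.median(band_energy)
call_times = times[band_energy > threshold]
print(f"{len(call_times)} candidate call windows detected")
```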

For example, Coffey and fellow researchers have developed DeepSqueak, an algorithm that detects and deciphers ultrasonic calls from rodents and attempts to determine their mental state. "A happier research colony equals better outcomes," he says. "We can detect about 95% of the calls and categorize them in a way that feels satisfying to humans, but we still don't know how precise this understanding is."
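Categorizing the detected calls is typically a separate, unsupervised step: each call is summarized as a small feature vector, and similar vectors are grouped together. The toy sketch below illustrates that step with k-means clustering on made-up snippets; DeepSqueak itself is built on deep neural networks, so this is not its code.

```python
# Toy sketch of call categorization: summarize each detected call as a
# small feature vector, then cluster the vectors. The random "snippets"
# below stand in for real spectrogram excerpts of individual calls.
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(42)
snippets = [rng.random((64, rng.integers(10, 40))) for _ in range(200)]

def call_features(snippet):
    """Crude summary: duration, overall energy, mean dominant-frequency bin."""
    return np.array([
        snippet.shape[1],               # length in time bins
        snippet.mean(),                 # average energy
        snippet.argmax(axis=0).mean(),  # where the loudest frequency sits
    ])

features = np.stack([call_features(s) for s in snippets])
labels = KMeans(n_clusters=5, n_init=10).fit_predict(features)  # 5 is arbitrary
print(np.bincount(labels))              # how many calls fell in each category
```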

Meanwhile, Mandel-Briefer and her colleagues have developed an algorithm that gauges how pigs are feeling through the oinks, squeals, and grunts they emit. The research group studied pigs from birth to death, analyzing 7,414 calls from 411 animals, and reported in March 2022 in the journal Scientific Reports that an algorithm could be trained to match pig sounds to emotions with about 92% accuracy. Mandel-Briefer is also examining the communication patterns of other mammals and birds.
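A result like that comes from supervised learning: calls recorded in contexts known to be positive or negative become labeled training data for a classifier. The sketch below shows the shape of such a setup with placeholder random data and an off-the-shelf classifier; it illustrates the approach rather than reproducing the published model.

```python
# Shape of a supervised valence classifier: features per call, a label per
# call (positive or negative context), cross-validated accuracy. All data
# here is random placeholder, so the score will hover near chance rather
# than the roughly 92% the published study reports.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
X = rng.normal(size=(7414, 20))       # one 20-number feature vector per call
y = rng.integers(0, 2, size=7414)     # 0 = negative context, 1 = positive

model = RandomForestClassifier(n_estimators=200, random_state=0)
scores = cross_val_score(model, X, y, cv=5)
print(f"Cross-validated accuracy: {scores.mean():.1%}")
```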

Several other projects have taken shape. For instance, CETI (the Cetacean Translation Initiative), which ties together researchers from the University of Oxford, the University of Haifa, Google Research, and Amazon Web Services, is studying how sperm whales communicate. The Bat Lab for Neuro-Ecology at Tel Aviv University is studying the language of fruit bats in the wild. The Earth Species Project (ESP) is coordinating 40 research efforts globally, including projects focused on birds, whales, monkeys, and elephants.

Call of the Wild

A couple of factors are driving research in this space. One is the emergence of advanced recording capabilities, which can incorporate aerial drones, swimming robots, and IoT devices. These systems can detect sounds and audio ranges beyond the capabilities of human ears, and in places humans cannot normally reach. The other is far more advanced machine learning and deep learning, which make it possible to analyze bioacoustic data and find patterns that map to identifiable thoughts and feelings.

For example, the open source analysis tool Voxaboxen pares away background and environmental noise, sorts through various calls and sounds, and finds patterns that can be mapped to behavioral elements. Another software program called Batalef, developed by researchers at Tel Aviv University, taps into miniature sensors, including ultrasonic microphones attached to bats, to identify patterns and context that would otherwise fly beneath human radar.
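One common way to pare noise is spectral gating: estimate the noise floor in each frequency band and suppress time-frequency bins that do not rise above it. The sketch below is a generic illustration with placeholder audio and an assumed sample rate, not Voxaboxen's actual pipeline.

```python
# Generic spectral-gating sketch: estimate a per-frequency noise floor and
# zero out time-frequency bins that do not rise well above it. The audio
# and sample rate are placeholder assumptions.
import numpy as np
from scipy.signal import stft, istft

rate = 250_000                                      # assumed ultrasonic sample rate
audio = np.random.default_rng(1).normal(size=rate)  # one second of noise

freqs, times, Z = stft(audio, fs=rate, nperseg=1024)
magnitude = np.abs(Z)

# Use a low percentile across time as the noise-floor estimate per band.
noise_floor = np.percentile(magnitude, 20, axis=1, keepdims=True)
mask = magnitude > 2.0 * noise_floor        # keep only bins above the floor
_, cleaned = istft(Z * mask, fs=rate, nperseg=1024)
```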

After feeding more than 15,000 bat calls into the software, Yossi Yovel and other researchers at Tel Aviv University discovered unique vocalizations for fighting, mating, and jostling for perch positions. They also found the bats alter their vocal patterns depending on whether they are communicating with a familiar or unfamiliar bat.

The ultimate goal is to establish a tool like Google Translate for animals. Understanding animal vocal patterns more holistically could aid in conservation efforts and help protect endangered species. It might also shape sentiment about animal welfare, in much the same way that biologist Roger Payne changed perceptions about whales with the 1970 album Songs of the Humpback Whale. It became the best-selling nature album in history and helped spur a global moratorium on commercial whaling.

The biggest challenge involves melding bioacoustics and behavioral analysis. "It is extremely difficult to draw specific semantic representations," Coffey says. In fact, some scientists believe that inter-species communication may never be fully possible. Most animals possess a limited ability to communicate, even within their own species (apes and monkeys are an exception), so extracting deeper meaning from a cacophony of meows or bow-wows may remain out of reach.

At the same time, there are ethical concerns surrounding the inadvertent or intentional misuse of the technology, including dubious software and products marketed to pet owners, poachers using the technology for monetary gain, and the general idea of manipulating animals, perhaps with the wrong messages. "It's still too early to know whether AI will lead to positive results," Mandel-Briefer says.

Nevertheless, the science of animal linguistics marches forward. As Aza Raskin, co-founder of the Earth Species Project, has put it: "Our goal isn't just 'can we learn to listen to animals,' but 'can we unlock communication and transform our relationship with the rest of nature?'"
 

Samuel Greengard is an author and journalist based in West Linn, OR, USA.


 
