Can AI help us talk to animals?

At first glance, it might not seem like Dr. Dolittle and artificial intelligence (AI) have much in common. One belongs to the children’s literature of the early 1900s, while the other is firmly rooted in the 21st century. One is a doctor-turned-veterinarian who can talk to animals; the other is a technology that cannot. Unless…

AI has already given us the ability to bark instructions at voice assistants like Siri and Alexa – could its potential be extended to the animal kingdom? Could it help us unravel some of the mysteries of the natural world and perhaps one day allow us to “talk” to animals?

Certainly, there are those who think so, and progress has already been made in decoding animal communication using AI. It may be a while before you can dish the dirt with your dog or spill the tea with your tortoise, but technology has improved – and, we hope, will continue to improve – our understanding of other species and how they interact. When it comes to communicating with animals, Dolittle may have walked (and talked) so that AI could run.

Do animals use language?

The first hurdle to “translating” animal communication is understanding what that communication looks like. Human language is made up of verbal and non-verbal cues, and animal communication is no different.

Dogs wag their tails, for example, to convey a range of emotions. Bees dance to let other bees know where to find a good source of nectar or pollen. Dolphins use clicks and whistles to relay information.

However, there is some debate as to whether it can be considered a “language” – a debate that Dr. Denise Herzing, research director of the Wild Dolphin Project, says AI could help settle.

“We currently don’t know if animals have language,” Herzing told IFLScience. “[But] AI can help us search for language-like structures that might suggest animals have parts of a language.”

How can AI “translate” animal communication?

“Research in bioacoustics has shown that animal vocalizations convey many types of information, from identity to status, internal state, and sometimes external objects or events,” Elodie F. Briefer, associate professor in animal behavior and communication at the University of Copenhagen, told IFLScience. “All of these could be picked up by AI.”

More precisely, by machine learning, a form of AI that can identify patterns in data without following explicit instructions. In theory, it could be used to process recordings of animal communication and build language models from those recordings.

“Machine learning is a powerful tool because it can be trained to identify patterns in very large datasets, so it could allow us to process large amounts of data and gain crucial insights into how the information in animal sounds changes over time,” Briefer added.
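To make that idea concrete, here is a minimal, hypothetical sketch of the kind of pattern-finding Briefer describes: a tiny k-means clustering routine that groups unlabeled “calls” into types. Everything here – the two features, their values, and the two-cluster setup – is invented for illustration; real bioacoustic pipelines extract features from spectrograms of actual recordings.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic "call features": two call types described by, say,
# (duration in seconds, peak frequency in kHz). Purely illustrative values.
type_a = rng.normal(loc=[0.2, 2.0], scale=0.05, size=(50, 2))
type_b = rng.normal(loc=[0.8, 6.0], scale=0.05, size=(50, 2))
calls = np.vstack([type_a, type_b])

def kmeans(X, k, iters=20):
    """Tiny k-means: group unlabeled calls into k candidate call types."""
    # Simplified initialization: use the first and last calls as seeds
    # (a sketch, not a robust init strategy).
    centroids = X[[0, -1]].copy()
    for _ in range(iters):
        # Assign each call to its nearest centroid.
        labels = np.argmin(((X[:, None] - centroids) ** 2).sum(-1), axis=1)
        # Move each centroid to the mean of its assigned calls.
        centroids = np.array([X[labels == j].mean(axis=0) for j in range(k)])
    return labels, centroids

labels, centroids = kmeans(calls, k=2)
# With well-separated call types, the two groups are recovered cleanly.
print(labels[:50].tolist() == [labels[0]] * 50)
```

The appeal of an unsupervised method like this is exactly what Briefer notes: it scales to datasets far too large for a human to sort by ear.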

It’s the same technology we use every day to power predictive text, Google Translate, and voice assistants. Applying it to animal communication may prove more difficult, but that hasn’t stopped researchers from trying.

“There are many different techniques and ways to approach the science,” Herzing told IFLScience. These will differ, she added, “[depending] on the data, the AI technique, or even the understanding of the animals themselves.”

The Earth Species Project, for example, is a non-profit organization “dedicated to decoding non-human language.” So far, they have focused on cetaceans and primates, but they expect to eventually expand to other animals, including corvids.

The project uses a machine learning technique that treats a language as a shape, “like a galaxy where each star is a word and the distance and direction between the stars encodes the relational meaning”. Languages can then be “translated by matching their structures to each other”.
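The “matching shapes” idea can be illustrated at toy scale. In the hypothetical sketch below, a second “embedding space” is just a rotated copy of the first, and orthogonal Procrustes analysis (a standard SVD-based alignment) recovers the rotation. Real cross-species work is far harder – among other things, the point-to-point correspondences assumed here are exactly what is unknown in practice.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy "embedding space": 40 points in 3-D stand in for word vectors.
X = rng.normal(size=(40, 3))

# Pretend a second species' space has the same shape, just rotated:
# build a random orthogonal matrix and apply it.
Q, _ = np.linalg.qr(rng.normal(size=(3, 3)))
Y = X @ Q

# Orthogonal Procrustes: the rotation R minimizing ||X R - Y||
# is U @ Vt, where U, Vt come from the SVD of X.T @ Y.
U, _, Vt = np.linalg.svd(X.T @ Y)
R = U @ Vt

# After alignment, each point of X R lands on its counterpart in Y.
err = np.abs(X @ R - Y).max()
print(err < 1e-8)
```

When the two “galaxies” really do share a structure, matching them reduces to finding the transformation that overlays one on the other – the geometric intuition behind the project’s approach.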

Optimistically, Britt Selvitelle, co-founder of the Earth Species Project, believes this approach could help decode the first non-human language within the next decade, according to The New Yorker. Others, however, are more skeptical of AI as a tool to unravel animal communication.

Analyzing recordings is all well and good, but it means little without context, says Julia Fischer of the German Primate Center in Göttingen. “[AI] is not a magic wand that gives you an answer to biological questions or questions of meaning,” she told New Scientist.

Going back to nature and correlating recordings with real-world observations remains essential – and that’s no small feat.


As very social animals, cetaceans are a good place to start when trying to chat with animals. Image credit: F Photography R

What has been achieved so far?

Many projects are currently working to unlock the secrets of animal communication with the help of AI, the Earth Species Project being one of them. Last December, the project published a paper claiming to have solved the “cocktail party problem” – the difficulty of picking out a single sound source from many overlapping ones.

Imagine a cocktail party, if you will. Amid the chatter and background noise, it’s nearly impossible to tell exactly who is calling for another espresso martini. The same problem arises when deciphering animal communication.

In the study, the researchers describe an experimental algorithm – which they applied to species such as macaques, bottlenose dolphins, and Egyptian fruit bats – that allowed them to identify which individual in a group of noisy animals is “talking”.

AI is also emerging as a valuable tool in other areas of zoology. “[It] has been used especially in a relatively new field called ‘ecoacoustics’, which monitors biodiversity through passive acoustic monitoring and requires very large datasets,” Briefer told IFLScience.

“People have also used it to extract information from long-term recordings (for example, identifying marine mammals from underwater recordings). More recently, it has been used to identify patterns in other contexts as well, such as identifying the underlying emotions in [pigs and chickens].”

Briefer’s work includes one such study: she and her co-authors trained an AI system to recognize positive and negative emotions in the grunts, squeals, and snorts of pigs.
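As a rough illustration of that kind of supervised task, the hypothetical sketch below trains a nearest-centroid classifier on synthetic two-feature “calls”. The features, their values, and the clean class separation are all invented for illustration; the published work used machine-learning models trained on real pig recordings.

```python
import numpy as np

rng = np.random.default_rng(2)

# Synthetic stand-ins for acoustic features of pig calls, e.g.
# (duration in seconds, mean pitch in kHz). Illustrative values only.
pos = rng.normal(loc=[0.2, 0.5], scale=0.05, size=(60, 2))  # "positive" calls
neg = rng.normal(loc=[0.7, 2.0], scale=0.05, size=(60, 2))  # "negative" calls

# Nearest-centroid classifier: the simplest supervised baseline.
# Train on the first 50 calls of each class.
centroid_pos = pos[:50].mean(axis=0)
centroid_neg = neg[:50].mean(axis=0)

def predict(call):
    """Label a call by whichever class centroid it sits closest to."""
    d_pos = np.linalg.norm(call - centroid_pos)
    d_neg = np.linalg.norm(call - centroid_neg)
    return "positive" if d_pos < d_neg else "negative"

# Evaluate on held-out calls (the last 10 of each class).
preds = [predict(c) for c in pos[50:]] + [predict(c) for c in neg[50:]]
print(preds.count("positive"), preds.count("negative"))  # → 10 10
```

The real difficulty, of course, lies not in the classifier but in labeling the training data – deciding which real-world contexts count as positive or negative for the animal in the first place.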

In rodents, software called DeepSqueak has been used to judge whether an animal is under stress based on its ultrasonic calls. These sounds, imperceptible to the human ear, are how rodents communicate socially. The software has also been used on primates and dolphins to help researchers automatically label recordings of animal calls.


AI is being used to help label primate calls. Image credit: Gudkov Andrey

The nonprofit Wild Dolphin Project, founded by Herzing, aims to use AI to uncover patterns in dolphin calls and explore communication between dolphins and humans. In 2013, after teaching a pod of dolphins to associate a particular whistle with a type of algae, researchers used a machine learning algorithm to identify and translate the sound in the wild.

Meanwhile, the CETI (Cetacean Translation Initiative) project is trying to decode the communication of sperm whales, using language models to decipher their codas – patterned sequences of clicks – and map out their “language”.

Why these animals?

No species is superior when it comes to decoding communication, Briefer believes, but some have nonetheless attracted more attention from researchers than others.

“When considering acoustic communication, of course, the most interesting are those that are very vocal (e.g. birds, pigs, meerkats) and those that have a large vocal repertoire,” Briefer told IFLScience.

Likewise, social animals, such as primates, whales, and dolphins, are more likely to have well-developed communication systems, making them ideal for study.

“Dolphins live in highly social societies, are long-lived, and have long memories, suggesting they have complex relationships to communicate about,” Herzing explained. Intelligence may also play a role.

“Cetaceans, or at least dolphins, are known to have a high EQ [encephalization quotient, a measure of relative brain size], the ability to learn artificial languages, to understand abstract ideas, and to recognize themselves in a mirror,” Herzing added. “These are some of the foundations of intelligence.”

What are the benefits of understanding animal communication?

Aside from the obvious – finally finding out what your cat really thinks of you – there are many ways a better understanding of animal communication can be beneficial, for humans and animals alike.

“For both captive and wild species, it gives us a better understanding of them and when they are thriving or suffering,” Briefer told IFLScience. “This is crucial for the species we have around us (pets and farm animals for example), because their well-being depends on us.”

Not only could this make us better pet owners, but it has the potential to change our relationship with all animals forever. “Knowing that animals have language would hopefully help humans understand that we are not the only sentient species on the planet,” Herzing added.

At the very least, it might inspire more sympathy for other species and cause us to rethink how we treat them. This could have far-reaching implications, especially for the use of animals in sport, entertainment and research.

It could even trigger a complete overhaul of animal agriculture. With a better knowledge of the animals around us, can we still justify practices that exploit and kill them? From a human point of view, there is also a lot to learn, not only about animals, but also about ourselves and, perhaps, other forms of life.

Understanding animal communication could also tell us more about the evolution of language, Briefer told IFLScience. “The tools we develop with species on Earth could apply to distant worlds if we encounter other life forms,” Herzing speculated, adding that these tools could help us assess their intelligence and whether we could communicate with them.

Jumping from animal communication to extraterrestrial communication is certainly a leap that Dolittle never made. Should the fictional vet fear being outdone? Not yet. Using AI to actually speak to animals, let alone aliens, is still a long way off. But it certainly has the potential to improve our understanding of other species and the world around us – in fact, it already has. Maybe Dolittle should start feeling a little nervous after all.

This article first appeared in issue 3 of CURIOUS, IFLScience’s electronic magazine. Subscribe now to receive every issue for free, straight to your inbox.
