University of Michigan Researchers Use AI to Decode Dog Barks


Researchers at the University of Michigan are harnessing artificial intelligence (AI) to decode what a dog’s bark conveys, particularly whether it indicates playfulness or anger. They are also investigating whether AI can accurately determine a dog’s age, sex, and breed from its vocalizations.

Scientists have made significant strides towards understanding canine communication by repurposing existing computer models trained on human speech. “Advances in AI can be used to revolutionize our understanding of animal communication,” said Rada Mihalcea, head of the University of Michigan AI Laboratory. “Our research opens a new window into how we can leverage what we built so far in speech processing to start understanding the nuances of dog barks.”

AI has advanced significantly in understanding the subtleties of human speech, distinguishing nuances in tone, pitch, and accent, and enabling technologies such as voice recognition software. These systems have achieved sophistication by being trained on extensive databases of human voices. However, no comparable database exists for dogs, as animal vocalizations are more challenging to solicit and record, noted Artem Abzaliev, the study’s lead author.

To overcome this data gap, Abzaliev’s team collected barks, growls, and whimpers from 74 dogs of various breeds, ages, and sexes in different contexts. They fed these sounds into a machine-learning model initially designed to analyze human speech. Remarkably, the model performed well in interpreting what the dogs communicated, achieving an average accuracy of 70% across various tests.
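The approach described above is a form of transfer learning: a model pretrained on abundant human-speech data supplies acoustic representations, and only a lightweight classifier is trained on the scarce dog recordings. The sketch below illustrates that idea in miniature. It is not the study’s actual pipeline: the frozen random projection stands in for a real pretrained speech encoder, and the synthetic "bark" features and nearest-centroid classifier are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Stand-in for a frozen, pretrained speech encoder. In the actual study a
# model trained on human voices would produce the embeddings; this fixed
# random projection is purely illustrative of "frozen encoder" reuse.
N_RAW, N_EMB = 64, 16
W_frozen = rng.normal(size=(N_RAW, N_EMB))

def embed(raw_features):
    """Map raw acoustic features to embeddings; encoder weights stay frozen."""
    return raw_features @ W_frozen

# Synthetic "barks": two contexts (0 = playful, 1 = aggressive) whose raw
# acoustic features differ by a fixed offset -- a toy version of
# context-dependent vocalizations.
def make_barks(n, label):
    offset = 0.0 if label == 0 else 1.5
    return rng.normal(loc=offset, size=(n, N_RAW))

X_train = np.vstack([make_barks(40, 0), make_barks(40, 1)])
y_train = np.array([0] * 40 + [1] * 40)
X_test = np.vstack([make_barks(10, 0), make_barks(10, 1)])
y_test = np.array([0] * 10 + [1] * 10)

# Only a lightweight head is trained on top of the frozen embeddings:
# here, a nearest-centroid classifier in embedding space.
E_train = embed(X_train)
centroids = np.stack([E_train[y_train == c].mean(axis=0) for c in (0, 1)])

def predict(raw):
    # Assign each clip to the class whose embedding centroid is closest.
    dists = np.linalg.norm(embed(raw)[:, None, :] - centroids[None, :, :], axis=2)
    return dists.argmin(axis=1)

accuracy = (predict(X_test) == y_test).mean()
print(f"context accuracy: {accuracy:.0%}")
```

The design point this sketch captures is why the data gap is surmountable: the expensive part (learning general acoustic representations) is done once on human speech, so the dog-specific classifier needs only the 74 dogs' worth of labeled recordings.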

“This is the first time that techniques optimized for human speech have been built upon to help with the decoding of animal communication,” said Mihalcea. “Our results show that the sounds and patterns derived from human speech can serve as a foundation for analyzing and understanding the acoustic patterns of other sounds, such as animal vocalizations.”

The researchers believe their findings could have significant implications for animal welfare. A better understanding of the nuances in animal vocalizations could improve how humans interpret and respond to their emotional and physical needs.

The results of the study were presented at the Joint International Conference on Computational Linguistics, Language Resources and Evaluation. The project was a collaboration between the University of Michigan and Mexico’s National Institute of Astrophysics, Optics and Electronics.
