
Study finds that dogs use computations and brain regions similar to humans' to extract words from continuous speech

A new study combining EEG and fMRI, by researchers at the Department of Ethology of Eötvös Loránd University in Hungary, found that dogs use computations and brain regions similar to humans' to extract words from continuous speech. This is the first demonstration that a non-human mammal can use complex statistics to learn word boundaries.


Human babies can pick out new words in the speech stream before they learn what those words mean. To tell where one word ends and the next begins, babies perform complex calculations that track syllable patterns: syllables that usually appear together are likely to form words, while those that rarely co-occur are likely not. A new brain-imaging study by Hungarian researchers has found that dogs may also recognize this complex regularity in speech.

"Tracking patterns is not unique to humans: many animals learn from such regularities in the world around them, a process known as statistical learning. What makes speech special is that processing it efficiently requires complex computations."

To learn new words from continuous speech, simply counting how often certain syllables occur together is not enough. It is far more efficient to compute the probability that those syllables occur together.

"This is exactly how humans, even 8-month-old infants, solve the seemingly difficult task of word segmentation: they compute complex statistics on the probability that one syllable follows another," explains Marianna Boros, one of the study's lead authors and a postdoctoral researcher at the Neuroethology of Communication Research Group of the Department of Ethology at Eötvös Loránd University.
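The kind of statistic described above can be sketched in a few lines of code. The following is a minimal illustration (not from the study; the "words" and syllables are made up) of how transitional probabilities between adjacent syllables separate within-word pairs from word boundaries:

```python
from collections import Counter

def transitional_probabilities(syllables):
    """Estimate P(next syllable | current syllable) for adjacent pairs.

    High-probability pairs tend to fall inside words; lower-probability
    pairs tend to mark word boundaries.
    """
    pair_counts = Counter(zip(syllables, syllables[1:]))
    first_counts = Counter(syllables[:-1])
    return {pair: count / first_counts[pair[0]]
            for pair, count in pair_counts.items()}

# A toy stream built by concatenating made-up trisyllabic "words"
# (go-la-tu, bi-da-ku, pa-do-ti) in varying order, with no pauses:
stream = ("go la tu bi da ku go la tu pa do ti "
          "bi da ku pa do ti go la tu").split()
tp = transitional_probabilities(stream)

# Within-word pairs such as ("go", "la") have probability 1.0, while
# cross-boundary pairs such as ("tu", "bi") come out lower (0.5 here),
# so a dip in transitional probability signals a word boundary.
```

Note that raw pair frequency alone would not work: a boundary pair can occur just as often as a within-word pair, yet its transitional probability is still lower because the first syllable is also followed by other syllables elsewhere in the stream.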


"Until now, we did not know whether any other mammal could use such complex computations to extract words from speech. We decided to test the ability of the domestic dog's brain to perform statistical learning from speech. Dogs were the first animal species to be domesticated and are perhaps the animals we talk to most often. Still, we know very little about the neural processes underlying their word-learning abilities."

"To figure out what kind of statistics dogs compute while listening to speech, we first measured their brain's electrical activity with EEG," said the other lead author, Lilla Magyari, a postdoctoral researcher in the same research group, who laid the methodological foundations for non-invasive electrophysiology in awake, untrained, cooperating dogs.

"Interestingly, we saw differences in dogs' brainwaves for frequent words compared to rare words. But even more surprisingly, we also saw differences in brainwaves between syllables that always appeared together and syllables that appeared together only occasionally, even when their total frequencies were the same. So it turns out that dogs track not only simple statistics (how often a word occurs) but also complex statistics (the probability that a word's syllables occur together). This has never been observed in any other non-human mammal before. It is exactly the kind of complex statistics that human infants use to extract words from continuous speech."

To explore how similar the brain regions behind this complex computational capacity in dogs are to those in humans, the researchers also tested the dogs with functional magnetic resonance imaging (fMRI). This test, too, was performed on awake, cooperating, unrestrained animals; for fMRI, the dogs had been trained beforehand to lie motionless during the measurement.


"We know that in humans, both brain regions related to general learning and brain regions related to language are involved in this process. And we found the same duality in dogs," Boros explains. "Both a general-purpose and a specialized brain region seem to be involved in statistical learning from speech, but their activation patterns differ. The general-purpose region, the so-called basal ganglia, responded more strongly to a random speech stream (where no words can be found using syllable statistics) than to a structured speech stream (where words are easily discovered by computing syllable statistics alone). The specialized region, the auditory cortex, which plays a key role in statistical learning from speech in humans, showed a different pattern: here, brain activity increased over time for the structured speech stream but not for the random one. We believe this increase in activity is a trace of word learning in the auditory cortex."

"We are now beginning to understand that some of the computational and neural processes known to support human language acquisition may not be unique to humans after all," said Attila Andics, principal investigator of the Neuroethology of Communication Research Group.

"But we still do not know how these human-analogous word-learning brain mechanisms emerged in dogs. Do they reflect skills developed by living in a language-rich environment, skills shaped over thousands of years of domestication, or an ancient mammalian capacity? By studying speech processing in dogs, and especially in dog breeds with different communication skills and in other species living close to humans, we can trace the origins of the human specializations for speech perception."
