Communicating in animal language: something humans can't do, but AI can?

Source: Academic Headlines

Author: Green Apple

Editor: HS

Layout: Li Xuewei

Have you seen "Hachi: A Dog's Tale"?

This highly rated film beautifully portrays the bond between Hachi, an Akita, and his owner Parker. Every day, rain or shine, Hachi went to the station to wait for his deceased owner, and he kept waiting for ten years.

Many other excellent films and TV works explore the same theme: communication between people and animals.

Understanding animals is a puzzle. First of all, do animals even have "language"? And if they do, do they have much more to say than the basics of survival?

AI is helping us answer these questions.

Using AI to explore animal language makes sense; after all, AI has already proven effective at deciphering ancient human languages.

An international team of scientists recently launched an ambitious effort: Project CETI.

The subject they chose lives in the deep blue ocean, a highly emotional, intelligent, and fascinating "mysterious alien" of our own planet: the sperm whale (Physeter macrocephalus).

The sperm whale is enormous: it weighs more than 50 tons, can reach 18 meters in length, and is the largest of the toothed whales. Fish-like in shape, it nonetheless breathes with lungs. Its huge head can make up a third of its body; the lower jaw is small and is the only jaw that bears teeth; the neck is short; the blowhole sits at the front of the snout; the forelimbs are flippers, the hind limbs are vestigial, and there is no dorsal fin; the fish-like tail swings to propel it. It is also a formidable diver, ranking first among mammals in both dive depth and dive duration.

Can you guess its staple food? Its favorite is the giant squid of the deep sea.

Inspired by the "search for extraterrestrial intelligence" (SETI), project leader David Gruber joined forces with scientists and researchers in an unprecedented quest: deciphering sperm whale language to enable human-whale dialogue. Thus was born the Cetacean Translation Initiative (Project CETI).

By listening to and interpreting the sounds of sperm whales, the project hopes to open a channel of communication with them, using cutting-edge technology to benefit both humanity and the other creatures of the Earth: cherishing life, protecting it, and fostering harmonious coexistence between humans and nature.

Organized as a nonprofit, the project brings together a multidisciplinary group of experts from prestigious universities and research and environmental institutions, including the City University of New York, UC Berkeley, MIT, Harvard, Google Research, and National Geographic: top cryptographers, roboticists, linguists, AI experts, technologists, and marine biologists.

They agree that only an interdisciplinary approach, integrating expertise across these fields, can yield a comprehensive, in-depth understanding of cetacean language. The key steps are as follows (a minimal sketch of this loop appears after the list):

Record: collect large-scale, longitudinal, multimodal whale communication and behavioral data from a variety of sensors;

Process: align and coordinate the data streams coming from the multiple sensors;

Decode: with the help of machine learning (ML), build a model of whale communication and describe its structure in close relation to behavior;

Encode and play back: improve and refine the whale language model through repeated interactive playback experiments.
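
To make the workflow concrete, here is a minimal, runnable Python sketch of that record-process-decode-playback loop. This is purely illustrative: the data is simulated and every function name is a placeholder of my own, not Project CETI's actual code.

```python
import random

# Illustrative sketch of the record -> process -> decode -> playback loop.
# All data is simulated; the real CETI pipeline is far more sophisticated.

def record(num_sensors=3, clicks_per_sensor=20):
    """Simulate multimodal recording: each sensor yields click timestamps (s)."""
    return [sorted(random.uniform(0, 60) for _ in range(clicks_per_sensor))
            for _ in range(num_sensors)]

def process(streams):
    """Merge and time-sort click events from all sensors onto one timeline."""
    return sorted(t for stream in streams for t in stream)

def decode(timeline, gap=1.0):
    """Group clicks separated by less than `gap` seconds into candidate codas."""
    codas, current = [], [timeline[0]]
    for t in timeline[1:]:
        if t - current[-1] < gap:
            current.append(t)
        else:
            codas.append(current)
            current = [t]
    codas.append(current)
    return codas

def playback(codas):
    """Stand-in for interactive playback: report what would be played back."""
    for i, coda in enumerate(codas[:3]):
        print(f"coda {i}: {len(coda)} clicks over {coda[-1] - coda[0]:.2f} s")

playback(decode(process(record())))
```

In the real project, each pass through such a loop would feed the whales' responses to playback back into the model.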

As a result, a range of cutting-edge technologies comes into play, covering AI, machine learning (ML), cryptography, and robotics.

Like the Apollo program's "Earthrise" photograph, CETI's discoveries and advances have the potential to reshape humanity's understanding of its place on the planet.

The main research work of the project is summarized as follows:

Develop sophisticated robotics, in collaboration with the National Geographic Society's Exploration Technology Lab, to listen to whales while capturing the context of their sounds.

Deploy the Core Whale Listening System, a novel hydrophone array for studying groups of whales across a 20 × 20 km field site (a toy localization sketch follows this list).

Continue the Dominica Sperm Whale Project, which has been very successful, capturing a wealth of data on whale sounds, social lives, and behavior.

Create big-data pipelines to examine the recorded data and decode it using advanced ML, natural language processing (NLP), and data science (DS).

Strengthen partnerships and launch public interfaces, data visualizations, communication platforms, and leadership initiatives to engage global communities.
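
Why an array rather than a single hydrophone? With several synchronized hydrophones, the differences in a click's arrival times reveal where the whale is. Here is a hedged, toy illustration of time-difference-of-arrival (TDOA) localization; the positions, sound speed, and solver are my simplifying assumptions, not CETI's actual setup.

```python
import numpy as np
from scipy.optimize import least_squares

# Toy TDOA localization: find a clicking whale from arrival-time differences
# at a small hydrophone array. Positions in meters; all values are invented.

C = 1500.0  # approximate speed of sound in seawater (m/s)
hydrophones = np.array([[0, 0], [10_000, 0], [0, 10_000], [10_000, 10_000]])
true_source = np.array([6_000.0, 3_000.0])

# Simulated arrival times (s) of one click at each hydrophone.
arrivals = np.linalg.norm(hydrophones - true_source, axis=1) / C

def residuals(xy):
    """Mismatch between predicted and observed TDOAs w.r.t. hydrophone 0."""
    pred = np.linalg.norm(hydrophones - xy, axis=1) / C
    return (pred - pred[0]) - (arrivals - arrivals[0])

estimate = least_squares(residuals, x0=np.array([5_000.0, 5_000.0])).x
print("estimated source position (m):", estimate.round(1))
```

With noise-free simulated timings the solver recovers the source exactly; real deployments must contend with clock drift, refraction, and depth.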

Figure | Schematic diagram of whale bioacoustic data collection

Here we answer the first question on your mind: of all the animals humans could try to communicate with using AI, why choose sperm whales?

In fact, sperm whales have the largest brain of any living creature, and they share surprisingly many characteristics with humans.

Sperm whales exhibit higher-level functions such as conscious thought and planning for the future, and they have rich emotional lives, expressing and feeling compassion, love, pain, and intuition.

Its bioacoustic system is shown in the figure below. In panel A, the sperm whale's head contains the spermaceti organ (c), a cavity holding nearly 2,000 liters of waxy liquid, and the junk (f), which contains a series of wafer-like bodies that act as acoustic lenses. These two connected tubes form a bent, cone-shaped horn, roughly 10 meters long and about 0.8 meters across in large mature males. Sound produced by the "phonic lips" (i) at the front of the head travels through this bent horn and emerges as a flat wavefront at the exit surface. Panel B shows the temporal structure of sperm whale echolocation and coda clicks.

Figure | The sperm whale's bioacoustic system (A: sound-production anatomy; B: click structure)
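
That temporal structure is exactly what gets analyzed: a coda is a short, stereotyped rhythm of clicks, often characterized by its inter-click intervals (ICIs). As a hedged illustration (not CETI's published method), the following sketch turns click timestamps into a tempo-independent rhythm signature by which coda types could be compared.

```python
import numpy as np

# Illustrative only: characterize a coda by its inter-click intervals (ICIs).
# Click times are in seconds; a real system would first detect clicks in audio.

def coda_signature(click_times):
    """Return ICIs normalized by total coda duration, one common way to
    compare coda rhythms independent of tempo."""
    clicks = np.sort(np.asarray(click_times))
    icis = np.diff(clicks)        # gaps between successive clicks
    return icis / icis.sum()      # normalize so the rhythm is tempo-free

# Example: a hypothetical 5-click coda (timestamps are invented).
coda = [0.00, 0.20, 0.42, 0.58, 0.74]
print(coda_signature(coda))       # relative rhythm of the coda
```

Clustering such signatures across thousands of recordings is one way researchers build a repertoire of coda types for a whale clan.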

In addition, they live in matriarchal and multicultural societies, with dialects and strong multigenerational family ties.

Modern whales, super "stewards" of the marine environment, have a history of more than 30 million years, roughly five times as long as that of the earliest hominids, and our understanding of these animals is only just beginning.

The story begins with a discovery made in the late 1960s.

At the time, scientists, including Dr. Roger Payne, CETI's lead advisor, discovered that whales sing to each other.

His recording Songs of the Humpback Whale set off the massive "Save the Whales" movement, one of the most successful conservation campaigns in history.

The movement culminated in the enactment of the Marine Mammal Protection Act, which marked the end of the era of mass whaling, effectively saved several endangered whale populations, and preserved some of the most mysterious sounds on Earth.

Advances in engineering, AI, and linguistics have now made it possible to study the communication of whales and other animals far more deeply, bringing within reach what was once unattainable.

In this project, the team will use natural language technology to collect and analyze some 4 billion sperm whale communication codes, linking each sound to a specific context, a process expected to take at least five years. If they achieve these goals, the next step will be to develop and deploy an interactive chatbot that converses with sperm whales living in the wild.
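
To give a flavor of how natural language techniques might apply, here is a toy sketch, my illustration rather than CETI's method, that treats coda types as tokens and estimates bigram transition probabilities over coda sequences, a first step one might also take with an undeciphered text corpus. The sequences and coda-type names below are invented placeholders.

```python
from collections import Counter

# Toy illustration: treat coda types as tokens and count bigram statistics,
# a first step toward modeling sequential structure in an unknown "language".
# These sequences and coda-type names are invented placeholders.

sequences = [
    ["1+3", "1+3", "4R", "5R"],
    ["1+3", "4R", "4R", "5R"],
    ["5R", "1+3", "4R"],
]

bigrams = Counter()
for seq in sequences:
    bigrams.update(zip(seq, seq[1:]))

# Estimate P(next coda | current coda) from the bigram counts.
totals = Counter(prev for prev, _ in bigrams.elements())
for (prev, nxt), n in sorted(bigrams.items()):
    print(f"P({nxt} | {prev}) = {n / totals[prev]:.2f}")
```

Real decoding would go far beyond bigrams, but even such simple statistics reveal whether a signal system has non-random sequential structure.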

We all know that humans are often very good at picking out acoustic differences in animal sounds they are familiar with.

And as signal-based classification algorithms grow more advanced, there is reason to believe that AI will soon do this better than humans can.

Some signs of success are now being seen. In 2017, scientists developed a program that was able to identify many different marmoset calls with about 90 percent accuracy.

Marmosets are social animals that live in groups. Their "vocabulary" consists of 10 to 15 calls, each with its own meaning. Studies have shown that, like human babies, baby marmosets learn to communicate by hearing other marmosets "talk" to them. This communication system makes marmosets popular among scientists who study language, social communication, and vocalization, and marmosets carrying autism-related mutations also serve as useful models for studying the disorder.

An MIT team developed an algorithm that converts the frequency patterns of marmoset calls into images and passes these letter-like pictures to an artificial neural network for classification. The algorithm separated the monkeys' calls from background noise with 80 percent accuracy and correctly identified which call a monkey made in more than 90 percent of cases.
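
The call-to-image idea is easy to sketch. Below is a minimal, hypothetical example, not the MIT team's code: it converts a synthetic "call" into a log spectrogram and feeds it to a tiny, untrained convolutional network standing in for the real classifier. The sample rate, window size, and architecture are all my assumptions.

```python
import numpy as np
from scipy.signal import spectrogram
import torch
import torch.nn as nn

# Hypothetical sketch: turn a call into a spectrogram "image" and classify it
# with a placeholder CNN. The network is untrained, so the output is random;
# this only demonstrates the data flow, not a working call classifier.

fs = 48_000                                   # assumed sample rate (Hz)
t = np.linspace(0, 0.5, int(fs * 0.5))
call = np.sin(2 * np.pi * 7000 * t) * np.hanning(t.size)  # synthetic "call"

f, times, sxx = spectrogram(call, fs=fs, nperseg=512)
image = torch.tensor(np.log1p(sxx), dtype=torch.float32)[None, None]  # (1,1,F,T)

classifier = nn.Sequential(                   # generic placeholder network
    nn.Conv2d(1, 8, kernel_size=3), nn.ReLU(),
    nn.AdaptiveAvgPool2d(1), nn.Flatten(),
    nn.Linear(8, 12),                         # e.g. a 10-15-call vocabulary
)
logits = classifier(image)
print("predicted call type:", logits.argmax(dim=1).item())
```

Training such a network on labeled spectrograms is what turns this skeleton into the kind of classifier the study describes.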

That same year, another team trained an AI to identify whether a sheep was in pain based solely on its facial expressions.

In the study, the University of Cambridge team first defined several facial "action units" (AUs) associated with different levels of pain in sheep, then manually labeled these AUs in 480 sheep photos: nostril deformation, rotation of each ear, and narrowing of each eye.

They then trained a machine learning algorithm on 90 percent of the photos and their labels, and tested it on the remaining 10 percent. The program's average accuracy in identifying AUs was 67 percent, similar to that of an average person, and improved training procedures pushed accuracy higher. The team believes the approach can be applied to other animals as well, which could lead to better diagnosis and treatment options.
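
The 90/10 protocol is a standard supervised-learning setup. As a generic illustration (the Cambridge team's actual features and model differ), here is how such a split-and-evaluate loop looks with scikit-learn, using random stand-in features in place of real facial measurements.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

# Generic illustration of the 90/10 protocol; features and labels are
# random stand-ins, not real sheep facial-action-unit data.
rng = np.random.default_rng(0)
X = rng.normal(size=(480, 16))        # 480 "photos", 16 stand-in features
y = rng.integers(0, 2, size=480)      # AU present (1) or absent (0)

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.1, random_state=0)   # 90% train / 10% test

model = RandomForestClassifier(random_state=0).fit(X_train, y_train)
print(f"test accuracy: {accuracy_score(y_test, model.predict(X_test)):.2f}")
```

With random features the accuracy hovers near chance; with real AU measurements, the same protocol yields the 67 percent reported in the study.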

In the future, combining sound and images may give us an even more comprehensive picture of what animals are trying to say.

An AI-powered "Google Translate" for animal languages would be a wonderful thing in itself. More importantly, some species are at a critical moment for their survival, and as the technology matures, it can help us build a brighter, closer future for humans and animals alike.
