Like a bat, Daniel Kish, a blind man, relies on echolocation to navigate new environments, describe the shape of a car, pick out the architectural features of a distant building, and even ride a bicycle. In his TED talk, he shares this remarkable life experience. Now researchers have found that a "superpower" like Kish's can be learned: ordinary people can echolocate like bats, too. So what have the rest of us been missing?

The waveform diagram in the figure reflects the researchers' findings: the red line shows the typical waveform of a click made by the human mouth, and the blue pattern shows how the sound waves spread through space after the click is made.
For a blind person, Daniel Kish's "vision" is beyond what you might imagine. Like many blind people, Kish relies on non-visual senses to perceive, describe, and move through the world. What sets him apart is how extraordinary that ability is: Kish can echolocate, just like a bat.
As a child, Kish taught himself to make sharp clicks with his mouth and to turn the echoes reflected off surrounding objects into spatial information. In one video, Kish uses this ability to navigate a new environment, describe the shape of a car, identify the architectural features of a distant building, and even ride a bicycle.
Despite this extraordinary ability, Kish believes he is nothing special. "Blind people have long used various forms of echolocation, with varying degrees of success," he said.
Moreover, echolocation capabilities can be acquired.
Kish speaks at TED
As president of World Access for the Blind, part of Kish's mission is to help blind people learn to cook, travel, and hike by sound, and to live more independent lives. "But we've never systematically studied how we perform echolocation, how it works, or how to get the best results from it."
A study recently published in the journal PLOS Computational Biology offers a first answer to these questions. The researchers measured the mouth clicks made by Kish and two other echolocators and then turned those measurements into computer-generated signals.
Lore Thaler, a psychologist at Durham University, led a team of researchers in conducting the study in an anechoic chamber. The room has double-layered walls and a heavy iron door, all lined with sound-absorbing materials such as foam. Inside the anechoic chamber you cannot hear anything from outside, and when you speak there is almost no echo.
So how do you echolocate in an anechoic chamber?
I asked Kish how it felt, expecting him to describe it as sensory deprivation. I was wrong. To him, Kish said, the place "sounded" as though he were standing in the middle of an endless meadow with a wire fence in front of him.
Waveforms of three clicks emitted by each echolocator
In this unusual space, Thaler's team was able to record and analyze thousands of the clicks that Kish and the two other echolocators made with their mouths. The researchers used miniature microphones: one placed near the subject's mouth, and the rest arranged in a circle around the subject, spaced 10 degrees apart and suspended at different heights on thin wires. The microphones and wires had to be small, because the larger the equipment, the more sound it reflects, which would distort the measurements.
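As a rough illustration of the recording geometry described above, the short Python sketch below lays out a ring of microphone positions spaced 10 degrees apart; the radius and coordinate convention are assumptions for illustration, not values reported in the study.

```python
import math

# Hypothetical sketch of the recording geometry: a ring of microphones around
# the subject, one every 10 degrees, at an assumed fixed radius.
RADIUS_M = 1.5    # assumed distance from the subject's mouth (not from the study)
STEP_DEG = 10     # angular spacing mentioned in the text

def ring_positions(radius=RADIUS_M, step_deg=STEP_DEG):
    """Return (angle_deg, x, y) for microphones spaced step_deg apart."""
    positions = []
    for angle_deg in range(0, 360, step_deg):
        theta = math.radians(angle_deg)
        positions.append((angle_deg, radius * math.cos(theta), radius * math.sin(theta)))
    return positions

if __name__ == "__main__":
    for angle, x, y in ring_positions()[:4]:
        print(f"{angle:3d} deg -> x={x:+.2f} m, y={y:+.2f} m")
```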
Thaler's team initially expected the clicks from different echolocators to have different acoustic characteristics. In fact, the sounds were strikingly similar. Thaler found that the clicks were bright, with two high-frequency peaks at roughly 3 kHz and 10 kHz, and fleeting, dying away after only about 3 milliseconds.
The researchers also analyzed how the sound waves traveled through space after an echolocator made a click. "You can think of it as a sound flashlight," Thaler said. "When you turn on a flashlight, the light spreads through the space: most of it shoots forward, but some also spills out to the left and right. The click's beam is distributed through space in a similar way, except that one is light and the other is sound."
The researchers found that after the subject made a click, the beam was concentrated roughly within a 60-degree cone, narrower than the spread of ordinary speech. Thaler attributed this narrowing to the click's bright tone: high-frequency sounds tend to be more directional than low-frequency ones. If you have ever set up a surround-sound system, you know that the placement of the tweeters matters more than the placement of the woofer.
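To illustrate why higher frequencies are more directional, here is a minimal physics sketch (not the model from the study) that treats the sound source as an idealized circular piston; the aperture radius is an assumed stand-in for an open mouth.

```python
import numpy as np
from scipy.special import j1

# Idealized circular-piston beam pattern: D(theta) = |2*J1(k*a*sin(theta)) / (k*a*sin(theta))|.
# This is a textbook illustration, not the beam model used in the paper.
SPEED_OF_SOUND = 343.0    # m/s
APERTURE_RADIUS = 0.03    # m, assumed stand-in for an open mouth

def directivity(freq_hz, theta_deg, a=APERTURE_RADIUS, c=SPEED_OF_SOUND):
    """Relative pressure amplitude at theta_deg off-axis (1.0 on-axis)."""
    k = 2 * np.pi * freq_hz / c
    x = k * a * np.sin(np.radians(theta_deg))
    if abs(x) < 1e-9:
        return 1.0                 # on-axis limit of 2*J1(x)/x
    return abs(2 * j1(x) / x)

if __name__ == "__main__":
    for f in (500, 3000, 10000):
        # A value near 1.0 means the sound spreads almost evenly; smaller values
        # mean the beam is more tightly focused forward.
        print(f"{f:5d} Hz: relative level 30 deg off-axis = {directivity(f, 30):.2f}")
```

Running this shows the off-axis level dropping as the frequency rises, which is the same trend the researchers describe for the bright, high-frequency click.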
Using these measurements, Thaler's team synthesized a sound with acoustic properties similar to a real click.
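As a rough idea of what such a synthetic click might look like, the sketch below combines two decaying sine waves near the 3 kHz and 10 kHz peaks quoted above and keeps the sound to about 3 milliseconds; it is an illustrative stand-in, not the exact signal model published by Thaler's team.

```python
import numpy as np

# Illustrative synthetic mouth click: energy near 3 kHz and 10 kHz, ~3 ms long.
SAMPLE_RATE = 96_000  # Hz, assumed

def synthetic_click(duration_s=0.003, fs=SAMPLE_RATE):
    t = np.arange(int(duration_s * fs)) / fs
    envelope = np.exp(-t / 0.0007)               # fast decay so the click dies out in ~3 ms
    click = envelope * (np.sin(2 * np.pi * 3_000 * t) +
                        0.5 * np.sin(2 * np.pi * 10_000 * t))
    return click / np.max(np.abs(click))          # normalize to +/-1

if __name__ == "__main__":
    click = synthetic_click()
    print(f"{len(click)} samples, {len(click) / SAMPLE_RATE * 1000:.1f} ms long")
```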
These synthetic clicks can be used to study human echolocation. Because echolocation experts like Kish are few and far between, such research has long been limited.
"Now, we can use a horn in the real world, or a computer in a virtual environment, to simulate echolocation capabilities, so that we can come up with various hypotheses before experimenting on humans." "We can create virtual characters, objects, and environments, just like you do in video games, and then model the sounds that virtual characters hear," Taylor said. With such preliminary studies, Taylor and other researchers could refine their hypothesis before inviting echolocation experts to participate in experiments to see how well the model matched the real situation."
But these models won't be perfect. To keep the measurements consistent, Kish and the two other subjects had to stand still in the anechoic chamber. "But in the real world, they turn their heads, which changes the acoustics of the click and gives them more information about their surroundings," said Cynthia Moss, a neuroscientist at Johns Hopkins University.
Still, Moss says the research is valuable for understanding how humans echolocate, and even for developing new devices that would let more people echolocate. After all, not everyone can click with their mouth as skillfully as Kish. "I studied a guy who snapped his fingers, but his hands got tired quickly," Moss said. A device that could emit a signal with exactly the right tone, so that users no longer had to learn to click with their mouths, would be a real blessing for the blind.
I asked Kish what he thought of a device that could make sounds the way he does. He said something like that already exists. About a third of his students are unable or unwilling to click with their mouths. "But if you put a castanet in their hands, you can see the effect right away," he said. "The sound they make is very pleasant: surprisingly bright, clear, and consistent."
But Kish said he is all for developing more devices and doing more research. "We know these signals are critical to the echolocation process. Bats use them, whales use them, and humans use them. So we should study, understand, and optimize those signals." With the model built by Thaler and her colleagues, Kish's wish may finally come true.
Translation: Yu Bo
Source: wired