The "frozen man" cannot move a single muscle in his body, yet a brain implant let him say whole sentences!

Reporting by XinZhiyuan

Editor: Yuan Xie

【XinZhiyuan Introduction】A brain-computer interface can now let a "frozen man" who cannot move a single muscle say whole sentences.

In recent years, paralyzed patients and "frozen people" using brain-computer interface (BCI) technology to write, tweet, and perform other surprising feats has no longer been news.

Indeed, the most practical and nearest commercial application of BCI technology is precisely to bring convenience to these people.

Breakthroughs do come every year. But a BCI experiment enabling a patient who truly cannot move a single muscle to express his meaning had never been done before.

On March 22, 2022, however, that record was broken.

A "frozen person" who cannot even move his eyes cannot use eye-tracking AI to communicate

Thanks to famous publicity campaigns such as the "Ice Bucket Challenge," most people now know what amyotrophic lateral sclerosis (ALS) is and how the life of a "frozen person" ends, even though ALS remains a rare and incurable disease.

Simply put, ALS is a neurodegenerative disease: it attacks the motor neurons in the brain and spinal cord that tell the muscles what to do, which is why its patients are nicknamed "frozen people."

As the disease progresses, degenerating motor neurons stop relaying commands to the body's muscles. The muscles atrophy, depriving the patient of the ability to move arms, legs, and torso, until eventually no muscle in the body can be controlled at all.

Patients also lose the ability to speak, walk, breathe, and swallow. Most die of respiratory failure within three to five years of the first symptoms, once the muscles that drive the lungs can no longer move.

It is therefore logical that brain-computer interface companies and research projects prefer such groups as test subjects:

"Frozen people" urgently need new technology to improve their situation, and for BCI researchers they are ideal subjects, whose muscles cannot move but whose brain activity is not yet significantly impaired.

Yet consider the most famous BCI breakthroughs of recent years:

The cyborg self-transformation project of the Anglo-American Peter Scott-Morgan, around 2020;

In May 2021, a Stanford University team enabled a person paralyzed by spinal cord injury to become the first in history to convert imagined "handwriting" in the brain into on-screen text, using a BCI and machine-learning algorithms, with an accuracy above 99 percent;

At the end of December 2021, ALS patient Philip O'Keefe became the first person in the world to post a tweet via a brain-computer interface.

These projects all share one thing: their subjects, though paralyzed, could still move at least some muscles.

This small difference marks a hard technical obstacle: every successful human experiment so far in which a BCI drives an external device has relied, to some degree, on the assistance of eye-tracking technology.

The self-transforming cyborg, the paralyzed mind-writer, and the tweeting "frozen man" all depended on eye-tracking AI, a technology refined over many years since Stephen Hawking's era, to type on screens, move cursors, or assist with error correction.

That is a great technical achievement in itself, but it is useless for late-stage "frozen people" who have lost control of the muscles around the eyeball and orbit. They still gradually lose the ability to breathe, in a silence that even high technology cannot break.

Using a brain-computer interface and auditory feedback training, researchers taught the "frozen man" to make a machine play high and low tones

This obstacle was crossed by researchers in Germany and Switzerland in March 2022.

On March 22, 2022, the Wyss Center for Bio and Neuroengineering in Switzerland and the University of Tübingen in Germany published a study in which, for the first time, a patient in a completely locked-in state, unable to control any voluntary muscle or even move his eyes, used an implanted brain-computer interface to select letters and form sentences.

The study's paper was published in Nature Communications.

The subject of the study was a 36-year-old "frozen man" who, when he began working with the research team in 2018, could still express "yes" and "no" with eye movements and select letters on a screen via non-invasive electrooculography (EOG) and electroencephalography (EEG).

As his condition deteriorated into a completely locked-in state, however, he lost all ability to communicate with the outside world.

It had not been clear to the scientific community whether a brain that has completely lost control of the body can still signal its intentions well enough for meaningful communication.

In March 2019, the research team implanted two microelectrode arrays, each 3.2 x 3.2 mm with 64 needle-like electrodes, into the motor cortex of the patient's brain to record neural signals.

The researchers report that when they asked the man to try to move his hands, feet, head, and eyes, his neural signals were too inconsistent to distinguish an intended "yes" from a "no."

After nearly three months of fruitless attempts, the team switched to training the subject with auditory neurofeedback.

In this approach the subject tries to modulate his own brain signals while the researchers measure them and feed back, in real time, whether he is succeeding, allowing him to gradually learn to control his brain activity.

Schematic of the experimental setup

In the study, when the system detected accelerated neuronal firing near the subject's brain implant, an external computer played a higher-pitched tone; when the firing rate slowed, it played a lower-pitched tone.

The researchers asked the subject to use any strategy he liked to change the pitch of the feedback tone, for example by imagining moving his eyes.
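The paper's actual decoder is not reproduced here, but the rate-to-pitch feedback described above can be sketched in a few lines of Python; the baseline rate, base pitch, and gain below are illustrative assumptions, not values from the study:

```python
def feedback_tone(rate_hz, baseline_hz, base_pitch=440.0, gain=4.0):
    """Map a neuronal firing rate to the pitch of an audio feedback tone.

    Firing faster than baseline raises the pitch; firing slower lowers it.
    All parameters here are hypothetical, for illustration only.
    """
    return base_pitch + gain * (rate_hz - baseline_hz)
```

With these toy numbers, firing 10 Hz above baseline raises a 440 Hz tone to 480 Hz, giving the subject immediate audible evidence that his mental strategy is working.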

On the first day of this new procedure, the subject succeeded in changing the pitch of the feedback tone.

On day 12, he for the first time adjusted his brain activity so that the feedback tone matched a target pitch.

Experimental records

Over the course of the study, the researchers tuned the communication system by mapping which neurons responded most strongly and tracking how their firing patterns changed with the patient's efforts.

In this way the subject received timely auditory feedback on his neural activity, and the researchers could coach him to steer his neural firing rate until the frequency of the feedback tone matched the target pitch.

By making the machine play high and low tones, the "frozen man" answered yes-or-no questions, and so chose letters and spelled sentences

The next step was for the subject to control his neural activity so that the firing rate near the implant stayed at the high or low end of a given range for more than 250 milliseconds.

The feedback tone would then be correspondingly high or low, interpreted as "yes" or "no" respectively.
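Under the same caveat, that 250 ms hold rule can be sketched as a simple run-length check over firing-rate samples; the thresholds and sampling interval here are invented for illustration:

```python
def decode_answer(rate_samples, dt_ms, high_thresh, low_thresh, hold_ms=250):
    """Classify a window of firing-rate samples as "yes", "no", or undecided.

    Returns "yes" if the rate stays at or above high_thresh for longer than
    hold_ms, "no" if it stays at or below low_thresh that long, else None.
    Thresholds and the sampling interval dt_ms are hypothetical.
    """
    run_high = run_low = 0  # milliseconds spent continuously in each band
    for rate in rate_samples:
        run_high = run_high + dt_ms if rate >= high_thresh else 0
        run_low = run_low + dt_ms if rate <= low_thresh else 0
        if run_high > hold_ms:
            return "yes"
        if run_low > hold_ms:
            return "no"
    return None
```

A rate that merely brushes the threshold resets the counter, so only a sustained effort registers as an answer, which matches the paper's emphasis on holding the rate for over 250 milliseconds.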

The research team divided the alphabet into five groups. The system first read out one group of letters and asked the subject to answer, via the feedback tone, whether the letter he wanted was in that group.

The letters of the chosen group were then read out one by one for the subject to accept or reject with "yes" or "no," and the whole procedure was repeated to spell words and phrases and form sentences.
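This two-stage group-then-letter scheme amounts to a simple decision procedure. A minimal simulation, with an assumed five-way split of the alphabet and a cooperative "subject" function standing in for the real yes/no tones:

```python
GROUPS = ["ABCDE", "FGHIJ", "KLMNO", "PQRST", "UVWXYZ"]  # assumed split

def select_letter(answer_yes, groups=GROUPS):
    """Pick one letter via yes/no answers: first find the group, then the letter.

    answer_yes(candidates) stands in for the subject's high/low feedback tone.
    """
    for group in groups:
        if answer_yes(group):           # "Is your letter in this group?"
            for letter in group:
                if answer_yes(letter):  # "Is it this letter?"
                    return letter
    return None

def spell(word, groups=GROUPS):
    """Spell a word with a simulated subject who always answers correctly."""
    return "".join(select_letter(lambda cands: t in cands, groups) for t in word)
```

Each letter costs several yes/no answers, and in the real experiment every answer required seconds of sustained neural control, which is why spelling was so slow.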

Records of neural activity during spelling

After about three weeks of training, the subject produced his first intelligible sentence: a request for his caregivers to readjust his lying position.

Over the following year he spelled words and phrases at roughly one character per minute, laboriously producing dozens of utterances: "Goulash soup and sweet pea soup," "Mommy head massage," "I would like to listen to the album by Tool loud," "I love my cool son," and so on.

The subject did not always succeed, however. Of the 135 days of the experiment, on only 107 did his neurofeedback training reach an accuracy above 80 percent in matching the target pitch, the threshold for attempting a spelling session.

And of those 107 days of spelling sessions, he produced an intelligible sentence on only 44.

Neural activity levels of the subject

Researchers at Utrecht University in the Netherlands, who work on similar systems, speculated that on the failed days the subject may have been asleep or not in the mood, or his brain signals may have been too weak or unstable for the system to calibrate.

One of the paper's authors suggested another possibility: the relevant neurons may have drifted out of range of the implanted electrodes.

Nonetheless, the study demonstrates that patients in a completely locked-in state can still communicate with the outside world.

ALS Voice, the German nonprofit where the paper's lead author now works, is seeking funding to provide similar implants to more ALS patients. He estimates the system would cost close to $500,000 for the first two years.

Before it can be extended to clinical use, however, its long-term stability, its suitability for other patients, and the safety and efficacy of the BCI components all need further demonstration.

The authors say the Wyss Center researchers continue to work with the subject, but his spelling ability has declined and he now mainly answers yes-or-no questions.

Part of the reason, they say, may be scar tissue forming around the implant and masking neural signals; cognitive factors may also play a role, since a brain that goes years without controlling a body may lose the ability to produce control signals.

The authors say the research team has promised to maintain the brain-computer interface device for as long as the subject continues to use it.
