Every person will eventually die.
This fact may sound sad, but the opening of the movie "The Wandering Earth 2" offers another possibility: mad scientists conduct digital life experiments, connecting electrodes to the brain to upload a person's consciousness to a computer and thereby make them immortal.
The Indian scientist at the beginning of the movie seems mentally unstable, but his idea of keeping people in the world in digital form is surprisingly pragmatic.
First, digitalization breaks the natural limitations of the physical body - cells will age, and organs will fail. Digital immortality is undoubtedly a more reliable way to live forever.
Second, mind uploading, also known as Whole Brain Emulation (WBE), maps a person's thoughts, personality, emotions, and memories onto another carrier such as a computer, a robot, or even a clone. It is a recurring theme in science fiction, and a compelling concept.
As mysterious as "digital immortality" sounds, like a pitch too outlandish even for telecom scammers targeting the elderly, the technology may be closer than we think.
On digital immortality, Liu Cixin, author of the original novel, said in a recent interview that it requires joint progress in information technology and brain science. Information technology advances rapidly while brain science progresses slowly; if the information in the brain cannot be retrieved, immortality is impossible.
Seen this way, the key to whether digital immortality can come true lies in the progress of brain science.
Breaking the problem down further, digital immortality comes in two forms, one-way and two-way, whose processes differ.
Two-way immortality means the digital avatar can interact with and respond to people. In "The Wandering Earth 2", Tuya's thought fragments are transferred to a digital life card, and with the support of a powerful quantum computer she can interact with people and with events in the outside world.
One-way immortality refers to uploading thoughts to a non-biological medium, such as a chip or a computer, where they exist in a passive, "read-only" form.
It is not hard to see that two-way immortality requires completing the entire whole-brain-emulation pipeline of thought copying, uploading, preservation, and transfer, and a failure at any link could interrupt the reading of the digital avatar and doom the attempt. A two-way interactive digital life like Tuya's is therefore still quite far from reality.
If we look at one-way immortality instead, we find that the latest advances in brain-computer interface technology are letting thought reach into the real world, becoming a ladder toward uncovering the mystery of immortality.
The first step to climbing the ladder of immortality: put a "signal tower" on the brain
Readers who love science fiction or follow technology news have probably long heard of the "brain-computer interface". Elon Musk's BCI company, Neuralink, may be the hottest company in this space. At the end of 2022, Musk demonstrated a monkey using a brain-computer interface to control a cursor and type at a press event.
Neuroscience research has also shown that electrodes and nanosensors can record neurons and create a complete map of the brain.
All in all, acquiring brain signals through a BCI chip is the first step of mind uploading, and it is theoretically possible.
(Figure: a brain-computer interface (BCI) implementation)
A BCI chip is like placing a "signal tower" in the brain, analogous to a mobile base station, except that what it receives are the electrical signals fired by brain neurons.
It can also be done non-invasively, with sensors and electrodes placed directly on the scalp, easy to put on and take off. But just as a phone's signal is stronger the closer it is to the base station, a non-invasive BCI is blocked by the skull, so the electrical signals it picks up are blurry and imprecise. Cursors, robotic arms, and the like cannot reliably capture what the brain is sending, and so cannot be controlled freely.
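The weak, noisy signal picked up through the scalp is typically cleaned up by filtering it into the classic EEG frequency bands before any decoding. A minimal sketch in Python, using synthetic data rather than any real headset's output:

```python
import numpy as np
from scipy.signal import butter, filtfilt

def bandpass(signal, low_hz, high_hz, fs):
    """Butterworth bandpass filter, e.g. to isolate the 8-12 Hz alpha band."""
    nyq = fs / 2
    b, a = butter(4, [low_hz / nyq, high_hz / nyq], btype="band")
    return filtfilt(b, a, signal)

fs = 250  # sampling rate in Hz, typical for consumer EEG devices
t = np.arange(0, 2, 1 / fs)
# synthetic "EEG": a 10 Hz alpha rhythm buried in broadband noise
eeg = np.sin(2 * np.pi * 10 * t) + 2.0 * np.random.randn(len(t))

# isolate the alpha band; most of the broadband noise is filtered out
alpha = bandpass(eeg, 8, 12, fs)
```

Real pipelines add artifact rejection (eye blinks, muscle activity) before decoding, which is a large part of why scalp signals support only coarse control.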
Of course, Neuralink's vision of "symbiosis between the human brain and artificial intelligence" is far ahead of its time. The next step for brain-computer interfaces may be to let some people master "brain control" first.
The second step of climbing the ladder of immortality: let some people learn "brain control" first
In fact, the earliest research on brain-computer interfaces was military. UCLA began working on BCI in 1970, funded by the National Science Foundation, and was later awarded a contract by the Defense Advanced Research Projects Agency (DARPA). Brain-controlled helicopters, password cracking, and similar operations belong to a rarefied world far from ordinary life, so we will not dwell on these "military tech ghost stories" here.
At present, the ordinary people who can quickly benefit from brain-computer interfaces fall mainly into two groups:
The first group of people - patients.
We know that some diseases destroy nerves and cause severe disability, blocking many functions of normal life: paralysis from nerve damage leaves a patient's arm or leg unable to move, while in conditions such as Alzheimer's disease and epilepsy, patients may lose control of their limbs, shaking violently and unable to care for themselves. As an advanced form of neural communication, brain-computer interfaces can help patients restore some functions through brain-controlled wheelchairs, robotic arms, keyboards, and so on.
In recent years, medical applications of the brain-computer interface have begun to change. First, from inefficient to efficient.
In the past, patients using BCIs to control complex machinery were thought to tire quickly and work slowly, but accuracy, precision, and computing speed keep improving, and brain control is no longer a mere curiosity in the news.
In 2017, Peter Scott-Morgan was diagnosed with ALS and had to undergo a total laryngectomy, which would leave him unable to speak again. Before the operation he recorded 15 hours of audio corpus for AI training. After the operation, a brain-computer interface collected his brain signals, the AI learned his habits of expression and used context to predict his next word, and synthesized speech spoke for him, greatly improving output efficiency and reducing his physical burden.
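The context-based word prediction described above can be illustrated with a toy model. This is only a sketch of the idea using a bigram frequency table; the actual assistive systems use far larger language models, and the corpus here is invented:

```python
from collections import Counter, defaultdict

class NextWordPredictor:
    """Toy bigram model: predicts likely next words from a personal corpus."""

    def __init__(self):
        # maps each word to a Counter of the words that follow it
        self.bigrams = defaultdict(Counter)

    def train(self, text):
        words = text.lower().split()
        for prev, nxt in zip(words, words[1:]):
            self.bigrams[prev][nxt] += 1

    def predict(self, word, k=3):
        """Return up to k most frequent continuations of `word`."""
        return [w for w, _ in self.bigrams[word.lower()].most_common(k)]

# hypothetical personal corpus standing in for recorded speech transcripts
model = NextWordPredictor()
model.train("i would like some water please . i would like to rest now .")
print(model.predict("would"))  # → ['like']
```

Offering the user a short list of predicted words, rather than requiring each letter to be selected by brain signal, is what cuts the physical effort of communication.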
Second, from invasive to non-invasive. Professional medical applications of brain control have long relied on invasive BCIs. In recent years, however, non-invasive BCIs have also begun to show results in medical settings, which will help improve many more patients' lives; after all, invasive surgery is riskier and more expensive.
A study published in the journal iScience in 2022 reported that several tetraplegic patients successfully steered a brain-controlled wheelchair through a room full of obstacles, the first demonstration of a brain-controlled wheelchair using a non-invasive BCI.
The second group of people - geeks.
The signals acquired by non-invasive brain-computer interfaces are crude and cannot support precise manipulation, but as wearable devices they are adequate for simple interaction and entertainment. For novelties such as brain-controlled games and brain-controlled metaverses, futurists and geeks play the role of early adopters.
South Korea's Looxid Labs has embedded brainwave sensors in VR glasses to collect users' emotional data and determine their emotional state while watching ads, thereby supporting better advertising strategies.
Neurable developed the world's first brain-controlled VR game, in which players wearing an EEG headset can drive a car remotely while sitting at a computer. In 2021 the company launched Enten headphones, which detect attention and help users build the habit of concentration; it sounds like a good fit for student education.
At CES 2020, NextMind launched the NextMind Dev Kit, a wearable device that records electrical activity in the brain and serves as an upgraded, non-invasive alternative to eye-tracking software. The company was acquired by Snap in 2022.
China also has a group of pioneers such as DAMO Academy, Tianqiao Research Institute, iFLYTEK, and Hanvon Technology, which have carried out research in the fields of brain-computer interface and artificial intelligence.
As Hawking said, the future of communication is the brain-computer interface, a tool of the new technological revolution that can improve human life.
The near-term potential of brain-computer interfaces may lie not in distant immortality but in serving as a bridge between the human brain and external devices such as smartphones, headsets, and VR, allowing some people to hear the wind again, post on Weibo, or pour themselves a glass of water.
Perhaps the metaphor is this: only by valuing people's happiness and comfort in the here and now can we finally open the door to eternal life. As Liu Cixin once wrote: give civilization to the years, not years to civilization.
The third step in climbing the ladder of immortality: make the brain-computer interface work better
So, what can the tech industry do to make brain-computer interfaces better serve people today?
At present, there are three directions worth looking forward to:
1. Material evolution.
In the past, invasive brain-computer interfaces used rigid devices such as metal probes, and BCI chips were bulky; they could damage delicate brain tissue, so most people stayed away.
In 2015, when an MIT team measured brain dopamine levels with an electrode, the electrode lasted only a day, and because it was so large, the brain formed scar tissue around it.
Now, however, invasive BCIs are adopting new technologies such as flexible materials and thin-film devices, and manufacturing methods keep improving, greatly increasing biocompatibility while shrinking size and improving durability and stretchability. This makes invasive BCIs less traumatic and more stable in the body, and even enables non-invasive neuromodulation.
For example, researchers at the University of California, Berkeley created an implantable sensor the size of a grain of sand, dubbed "neural dust", an approach Musk's Neuralink has also drawn on.
MIT's newly developed 10-micron probe sensor was tested for reliability in the brains of experimental animals for 393 days; during that time it worked without interruption, and no scar tissue was found in the brain.
These breakthroughs mean invasive BCIs are becoming safer and safer.
2. AI empowerment.
For a long time, BCI research lurched through cycles of "progress today, setback tomorrow". In 2016, deep-learning-centered AI took off and has continued to penetrate the brain-computer interface field; AI algorithms and computing power can improve the performance of BCI systems, and the next generation of AI-enhanced brain-computer interfaces is attracting more and more attention.
For example, deep neural networks (DNNs) can extract features from brain signals and decode brain states, allowing more accurate assessment of a person's psychology and cognition. Signals of conscious activity are high-dimensional and complex; neural networks have been used to build emotion-recognition transfer models based on electroencephalography (EEG) to gain insight into mental activity.
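The decoding pipeline the paragraph describes, turning raw EEG into features that separate mental states, can be sketched with a classic hand-crafted front end: band power per frequency band. Real emotion-recognition systems learn such features with deep networks trained on labeled recordings; the signals below are synthetic:

```python
import numpy as np

def band_power(epoch, fs, low, high):
    """Average spectral power of an EEG epoch in [low, high) Hz via the FFT."""
    freqs = np.fft.rfftfreq(len(epoch), 1 / fs)
    psd = np.abs(np.fft.rfft(epoch)) ** 2
    mask = (freqs >= low) & (freqs < high)
    return psd[mask].mean()

def features(epoch, fs):
    """Feature vector of (theta, alpha, beta) band power."""
    bands = [(4, 8), (8, 13), (13, 30)]
    return np.array([band_power(epoch, fs, lo, hi) for lo, hi in bands])

# toy one-second epochs: an alpha-dominant vs a beta-dominant state
fs = 128
t = np.arange(0, 1, 1 / fs)
calm = np.sin(2 * np.pi * 10 * t)      # strong 10 Hz alpha rhythm
excited = np.sin(2 * np.pi * 20 * t)   # strong 20 Hz beta rhythm

f_calm, f_excited = features(calm, fs), features(excited, fs)
```

A classifier then maps such feature vectors to emotion labels; deep models skip the hand-crafted step and learn the mapping end to end from raw signals.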
In addition, psychological activity is reflected in data of many kinds: skin responses, facial expressions, eye movements, changes in body temperature, and so on. Classifying, processing, and jointly analyzing these huge, heterogeneous datasets is an enormous task; with AI, processing such multimodal data becomes far more efficient, which is also an important research direction in the BCI field.
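One simple way to combine such multimodal streams is late fusion: each modality produces its own class scores, which are then merged with weights reflecting how much each is trusted. A minimal sketch with hypothetical probabilities (the modality names and weights are illustrative, not from any particular system):

```python
import numpy as np

def late_fusion(scores, weights):
    """Combine per-modality class probabilities by weighted averaging."""
    scores = np.asarray(scores, dtype=float)    # shape: (modalities, classes)
    weights = np.asarray(weights, dtype=float)  # one weight per modality
    return (weights[:, None] * scores).sum(axis=0) / weights.sum()

# hypothetical per-modality probabilities over (negative, neutral, positive)
eeg_probs = [0.2, 0.3, 0.5]
face_probs = [0.1, 0.2, 0.7]
gsr_probs = [0.4, 0.4, 0.2]  # skin response: noisier, so down-weighted

fused = late_fusion([eeg_probs, face_probs, gsr_probs], weights=[1.0, 1.0, 0.5])
```

More sophisticated systems fuse earlier, feeding all modalities into one network, but the weighted-average version shows the core idea: no single noisy channel decides alone.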
The AI-driven paradigm shift in scientific research will also bring disruptive change to the high-precision technology of brain-computer interfaces.
3. Accelerate innovation in commercial products.
It was previously believed that invasive BCIs offered a poor return on investment: subjects took the risk of brain-implant surgery, yet scientists understand only a sliver of the complex workings of the brain's tens of billions of neurons, so brain-computer interfaces could not greatly improve subjects' quality of life and stayed mostly in laboratories and clinical trials.
With the gradual maturing of the technology, however, brain-computer interfaces have shown real practicality, and public understanding of the concept has grown more mature and rational. The cochlear implant, for example, is a brain-computer interface that restores hearing to deaf people; it is widely accepted and widely used.
McKinsey's "The Bio Revolution" report estimates that brain-computer interfaces will generate $70 billion to $200 billion in global economic value over the next 10 to 20 years. We can expect more institutions to join the BCI field, in both medical applications and everyday wearables, accelerating product innovation and iterative optimization; BCI products will become more numerous, easier to use, and more inclusive.
Perhaps in the near future, patients will use brain signals to control robotic arms, feed and dress themselves, and communicate freely with the outside world; such devices will become as common as cochlear implants rather than the privilege of a few.
Ray Kurzweil, co-founder of Singularity University, once offered a path to digital immortality that most people can follow in "Fantastic Voyage: Live Long Enough to Live Forever": live well and limit the effects of aging and disease as much as possible.
If we live long enough, perhaps we really can wait for the day digital immortality becomes reality.
Uncharted territory above the ladder of immortality
Having covered so much of the present reality, the prospects for brain-computer interfaces may seem bright and smooth.
But the tech industry has never been only about the technology itself. The ethical and moral challenges of brain-computer interfaces, the industrial chain needed for mass commercialization, mature business models, and more will all require a long period of iteration, exploration, and negotiation.
It has to be said that the idea of digital immortality raises many questions worth serious thought. To what extent do people remain themselves after certain devices are implanted?
Will technology slow aging and even death exacerbate social, economic and class disparities?
How will a society with both living humans and immortal digital life work?
Liu Cixin imagined such a scene in "Time Migration": 1,000 years from now, human society has entered the "invisible era". Embodied humans live in the tangible world, but a great many people choose digital immortality, giving up even a machine body to live inside a quantum chip as a cluster of quantum pulses.
In the digital world, people can truly do whatever they want and create whatever they want, more powerful than gods. In "The Wandering Earth 2", Tuya and Tu Hengyu gain eternal life and live happily together forever, an ending that comforted many viewers.
In short, compared with the troubled real world, the invisible world is as tempting as a drug. Is it heaven, or the end of humanity? That question belongs to the philosophers.
Of course, if the string of technical challenges facing brain-computer interfaces, and digital immortality itself, is never solved, all of these questions are moot. At least in the realm of technology, the curtain on immortality has been lifted.