
Advanced stage of AI development: brain-computer integration enables harmonious human-machine symbiosis and mind control at will

Author: Global Technology Map

With the continued application of 5G, 6G, cloud computing, AI (artificial intelligence), big data, and other technologies across many fields, AI robots' capabilities in perception, rational cognition, analysis and judgment, search and retrieval, synthesis, and autonomous decision-making keep improving. This will promote harmonious coexistence between humans and unmanned systems and turn the human-machine-integrated society of the future from fantasy into reality.

Since the term artificial intelligence (AI) was coined at the Dartmouth Conference in 1956, researchers have developed new digital products such as robots, speech recognition, image recognition, natural language processing, intelligent expert systems, chatbots, and broadcast robots, roughly advancing AI from version 1.0 to 2.0.

An AI robot is a robot built with artificial intelligence technology. Compared with an ordinary robot, it has a fairly developed "brain" in the form of a central computer control system, along with various internal and external information sensors. It can understand human language, converse directly with its operators in that language, and take over tasks such as drafting text in any style.

AI robots were originally conceived as assistants to humans. At first they had no consciousness or thinking of their own; like a fixed support platform, they executed pre-programmed routines to complete various support tasks on people's behalf. Later, assisted by intelligent expert systems running on computer systems, AI robots began to compute and transmit information much as humans do, helping people store and rapidly process massive amounts of data; they had simple human-like capabilities but no human perception or cognition, and could not escape human control. Today, AI robots can perceive, comprehend, analyze, judge, and decide autonomously. They can efficiently take over the work of "seeing" and "listening", acting according to human control procedures, and can also make autonomous decisions without human commands.

First, the AI race is driving robot "brains" toward human-brain-like thinking

At present, in the AI competition under way around the world, the computational thinking of AI systems cannot yet match the human brain, but researchers can make these systems competitive with humans by implanting the latest algorithms and training them through "deep learning".

In January 2016, AlphaGo shocked the world with its victory over the European Go champion; it then defeated South Korea's Lee Sedol in a landmark match in Seoul. In 2022, the AI-generated painting "Théâtre D'opéra Spatial" ("Space Opera Theatre") took first place in the "Digital Art / Digitally Manipulated Photography" category of the Colorado State Fair fine arts competition in the United States. On March 15, 2023, the US artificial intelligence research company OpenAI announced GPT-4, the latest in its series of AI language models powering applications such as ChatGPT and the new Bing. Compared with the model behind ChatGPT, GPT-4 can process image content and answers with improved accuracy. According to OpenAI, GPT-4 is a large multimodal model that, while inferior to humans in many real-world scenarios, has outperformed the vast majority of humans on many professional tests. OpenAI has since announced plans for GPT-5, triggering a new round of AI competition that is reshaping the information-age way of life (people in the lead, intelligent technology in support) across the smart society, smart factories, smart industries, smart supermarkets, smart workshops, smart editorial offices, and more.

During the 2023 National Two Sessions, in CCTV's "Guan Observes the Two Sessions" program, the hyper-realistic virtual anchor "AI Wang Guan" once again shared the stage with anchor Wang Guan, communicating anytime and anywhere and opening a new way of reporting on hot topics. In the pilot short video for the "AI Wang Guan 2.0" reporting mode, "AI Wang Guan" said it was most interested in the field of scientific and technological innovation: "If my computing power improves further, my performance will surpass yours." During the Two Sessions, "AI Wang Guan" performed remarkably: its expression of information was accurate and clear, and its hosting ability was strong. Supported by deep learning algorithms, "AI Wang Guan" can continuously learn and iterate on its own functions.

Foreign giants such as Google, IBM, Microsoft, Facebook, Yahoo, and Amazon, as well as domestic platforms such as Baidu, Tencent, and Alibaba, have all invested in the AI robot industry. Many media outlets and news websites have accelerated their deployment of AI writing robots, which have been integrated into digital news reporting, while new-media live-broadcast tools such as VR, AR, and H5 have advanced by leaps and bounds. Riding the express train of big data plus cloud computing, AI writing robots are profoundly changing how human editors and reporters in converged media interview and report, opening a new model of digital news produced jointly by humans (editors and reporters) and machines (AI writing robots), while also breeding a fear of AI among some practitioners.

Take ChatGPT. Some practitioners do feel uneasy after reading the reports, worried that the human brain's dominance will be replaced and marginalized by AI robots; others fear ChatGPT will take their jobs; still others are tempted to rely on it to cut corners. In short, ChatGPT has been deified, and humanity has developed a sense of crisis. It is undeniable that, as a front-runner in the AI race, the AI robot ChatGPT can rapidly capture and process large amounts of information and complete tasks such as writing. It can also play the role of a "virtual assistant", holding human-like interactive conversations with users, chatting like a real person. However, ChatGPT is not a panacea, and it carries potential risks and challenges. Moreover, ChatGPT is controlled by the human brain: however encyclopedic it seems, it is ultimately a program written by humans, inseparable from human control, an auxiliary tool for improving the quality and output of work; what distinguishes people is human thinking. This means smarter people will be able to use ChatGPT better. It can therefore be asserted that ChatGPT may replace many human jobs, but certainly not the human brain. No matter how intelligent AI robots become, people's decisive role and status in the intelligent society have not changed; what is changing is people's growing dependence on AI robots and the ever-higher demands on people's digital literacy.

In the near future, AI robots will be able to engage users in human-computer dialogue, begin to understand, analyze, think, and decide like humans, even exhibit superhuman capabilities, and gradually progress from human control and human-computer interaction to mind control and human-machine integration.

Second, the movie "Avatar" sparked the development of "human brain-machine" interface mind-control robot technology

Those who have seen the movie "Avatar" will not forget this scene: on Pandora, Jake Sully, a paralyzed former Marine, lies in a sealed pod and uses the complex equipment worn on his head to operate an artificial hybrid Avatar body, performing the various tasks its owner wants done. Of course, Pandora and the Avatar are director James Cameron's fiction, and controlling an Avatar by thought alone is naturally impossible. But you may not know that manipulating objects with thoughts is no longer a human fantasy.

At present, brainwave-related technologies are being applied in medicine, for example in treating patients with brain diseases such as epilepsy. Several scientific institutions have claimed to have developed mind-controlled prosthetic limbs, though most remain at the laboratory stage. Although multimodal brain-feature extraction has been achieved, inherent limitations of the brain-computer interface, such as slow speed and insufficient accuracy, remain a major obstacle in international research on brain-control technology. To address this, the School of Electronic Information at Northwestern Polytechnical University in China took the lead in proposing the concept of "brain-computer integration", strongly advancing the practical application of brain-control technology.

Today, exploiting the fact that brain waves change with mental state, humans have developed "human brain-machine" interface mind-control robot technology.

A brain-computer interface (BCI), sometimes called a brain port or brain-machine fusion interface, is a direct communication pathway established between the brain of a human or animal (or a culture of brain cells) and an external device. With a one-way BCI, the computer either receives commands from the brain or sends signals to the brain (such as video reconstruction), but cannot do both at once. A bidirectional BCI allows two-way exchange of information between the brain and external devices.

On February 21, 2012, the brain-computer interface research team at Zhejiang University's Qiushi Academy for Advanced Study announced that they had used computer information technology to extract and decode the neural signals of a monkey's brain for four gestures (grabbing, hooking, gripping, and pinching), so that the monkey's "mind" could directly control external machinery. In 2012, a mind-control system developed by Zhejiang University students won third prize in the National College Student Virtual Instrument Competition; it selected system functions via eye blinks, executed them via concentration, and enabled mind control of a manipulator, audio and video playback, Internet messaging, a wheelchair model, and games. In November 2013, Zhejiang University students developed a Wi-Fi-based mind-controlled video car: the user controls the car's speed through attention, and camera video is streamed in real time to PCs and mobile phones. The design won first prize in the National College Students Measurement Control and Instrumentation Design Competition for its innovation and technology, and obtained a national patent. In January 2014, Zhejiang University's Qingmang innovation and entrepreneurship team developed a mind-controlled fan light, mind-controlled racing cars, mind-controlled aircraft, and more, exhibited at the Zhejiang Science and Technology Museum. With the fan light, the user controls the fan's rotation through attention, and once the fan spins, preset text is displayed on it. With the racing car, the user controls start, stop, and speed through attention. With the aircraft, the user controls altitude through attention: the more focused the user, the higher the aircraft flies.
In March 2014, Zhejiang University developed an Emotiv-based mind-controlled vehicle-mounted manipulator, allowing users to move the manipulator through motor imagery and complete specific actions through facial-expression control.

In June 2015, the School of Mechanical and Power Engineering at Shanghai Jiao Tong University succeeded in using human brain signals to remotely control a live cockroach. Under the human brain's command, the cockroach completed tasks such as walking S-shaped and Z-shaped trajectories.

At a demonstration of a purely mind-controlled artificial neural rehabilitation robot system jointly held by Tianjin University and Tianjin People's Hospital, Ms. Dong, paralyzed by a stroke, was able to command her previously immobile limbs to complete the corresponding movements just by thinking, realizing the dream of paralyzed patients to move at will, with thought and action as one. This is the latest research result jointly developed by Tianjin University's neural engineering research team and Tianjin People's Hospital, named "Shengong No. 1". Compared with the brain-controlled mechanical exoskeleton that appeared at the World Cup, "Shengong No. 1" embodies pure thought control: the user's imagination is not merely a simple control command but a pattern of brain-area activation corresponding to the actual bodily action to be performed, truly achieving synchronous coupling between cerebral cortex and muscle activity. The system comprises six modules: non-invasive EEG sensing, imagined-action feature detection, motor-intent recognition, instruction coding, stimulus-information conditioning, and stimulus-current output. The user wears an electrode-equipped EEG detector on the head and has electrodes attached to the muscles of the affected limb; once connected to "Shengong No. 1", the user can move the otherwise immobile limb by thought.
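The six-module chain described above can be read as a signal-processing pipeline from scalp to muscle. The following is a minimal Python sketch of such a pipeline; the function names, features, and thresholds are illustrative assumptions for exposition, not the actual design of "Shengong No. 1".

```python
import numpy as np

def sense_eeg(raw_window: np.ndarray) -> np.ndarray:
    """Non-invasive EEG sensing: clean the raw scalp signal (DC removal stand-in)."""
    return raw_window - raw_window.mean()

def detect_imagery_features(eeg: np.ndarray) -> float:
    """Imagined-action feature detection: signal power as a crude stand-in feature."""
    return float(np.mean(eeg ** 2))

def recognize_intent(feature: float, threshold: float = 1.0) -> str:
    """Motor-intent recognition: map the feature to a discrete intent."""
    return "move" if feature > threshold else "rest"

def encode_instruction(intent: str) -> int:
    """Instruction coding: intent -> stimulation command code."""
    return {"rest": 0, "move": 1}[intent]

def condition_stimulus(code: int) -> float:
    """Stimulus-information conditioning: command code -> safe current level (mA)."""
    return 0.0 if code == 0 else 5.0

def output_stimulus(current_ma: float) -> str:
    """Stimulus-current output: drive the limb electrodes (stub)."""
    return f"stimulating at {current_ma:.1f} mA"

def pipeline(raw_window: np.ndarray) -> str:
    """Run one EEG window through all six stages."""
    eeg = sense_eeg(raw_window)
    feature = detect_imagery_features(eeg)
    intent = recognize_intent(feature)
    code = encode_instruction(intent)
    current = condition_stimulus(code)
    return output_stimulus(current)
```

The point of the sketch is the coupling the article describes: the decoded motor intent is closed back onto the user's own muscles as a stimulation current, rather than onto an external manipulator.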

"Shengong No. 2", the purely mind-controlled artificial neurorehabilitation robot developed by Tianjin University, was released in Yantai, Shandong Province, and has entered clinical use, allowing some paralyzed patients to resume movement.

In 2008, scientists at the University of Pittsburgh in the United States demonstrated that monkeys could use their "minds" to control the movement of a robotic arm. At the 2009 Consumer Electronics Show, Mattel, the world's largest toy manufacturer, launched MindFlex, a toy based on brainwave technology.

MindFlex is a brainwave-controlled toy in which the player levitates a ball by willpower: the more focused the mind, the higher the ball floats. With auxiliary manual controls, players can steer the ball through various obstacles. Just five weeks after MindFlex launched, the first batch sold out, and its runaway popularity earned it the top spot on Amazon's 2009 Christmas toy shopping list. In October 2011, scientists at Duke University Medical Center reported in the journal Nature that they could not only let monkeys move a virtual palm with their minds but also let them feel tactile signals when the virtual palm touched objects.

Russia's "Foundation for Future Research" has mastered brain-computer interface technology for mind-controlled machinery.

British researchers have developed a brain-computer interface device for controlling spacecraft simulators; worn on a tester's head, it can successfully control the flight of spacecraft models, and it is expected to help integrate humans with unmanned systems.

Third, brain-computer integration will allow soldiers to remotely operate their "Avatar" stand-ins on the battlefield with their minds

Brain-computer integration combines the intelligence of the brain with computer-based artificial intelligence through a brain-computer interface, using the brain as a nerve center within the computer control system. The result is a new kind of system with both the brain's flexibility and intelligence and the computer's speed and capacity, used to control various devices and systems; it relies neither solely on the "brain" nor entirely on the computer. For example, in future unmanned ground combat vehicles, operation will rest mainly on computer control based on artificial intelligence technology, but an AI system's "intelligence" depends on the battlefield situations it has experienced. Battlefield conditions are complex and changeable, and conventional intelligent systems cannot adapt to all of them, which is why unmanned combat vehicles sometimes fail to operate as humans intend. Adding brain control allows warnings and mode-switching instructions to be issued promptly in possible emergencies, avoiding accidents. The neuro-information team at Northwestern Polytechnical University made a new breakthrough, controlling an entire drone formation with thoughts.
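The division of labor described above, with the AI handling routine control and the brain intervening in emergencies, can be sketched as a simple arbiter. This is a hypothetical illustration of the concept only; the class names, situations, and commands are invented for the example.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class BrainSignal:
    """A decoded operator state: an alert flag plus an optional direct command."""
    alert: bool
    command: Optional[str] = None

def ai_policy(situation: str) -> str:
    """Conventional autonomous policy: handles situations seen in training,
    and falls back to a conservative action when the scene is novel."""
    known = {"clear_road": "advance", "known_obstacle": "detour"}
    return known.get(situation, "stop_and_wait")

def integrated_control(situation: str, brain: BrainSignal) -> str:
    """Brain-computer integration arbiter: a brain-issued alert overrides
    the AI policy; otherwise the AI retains routine control."""
    if brain.alert:
        return brain.command or "emergency_stop"
    return ai_policy(situation)
```

The design choice mirrors the article's argument: the AI policy alone cannot cover novel battlefield situations, so the human brain acts as a high-priority override channel rather than a constant pilot.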

Since 2004, the Defense Advanced Research Projects Agency (DARPA) of the US Department of Defense has invested heavily in "mind-controlled robot" research at six laboratories across the United States, including the Center for Neuroengineering at Duke University. Although this "ultimate goal" remains distant, scientists have already made breakthroughs. In 2008, scientists in North Carolina had a macaque walk upright on a treadmill while acquiring neural signals from electrodes implanted in its brain; the signals, along with video, were sent over the Internet to a laboratory in Japan, and the American macaque ultimately achieved "mind control" of a robot there. Building on more than a decade of animal experiments, early implantable devices for use in the human body by the US military were designed and manufactured, and study began on the BCI (brain-computer interface) technology seen in "Avatar", with the intent of one day creating the giant "mechanical warriors" of the movie and letting soldiers remotely operate their "Avatar" stand-ins in battle with their minds.

In June 2013, a team of Chinese scientists at the University of Minnesota presented its thought-control research results. Unlike earlier thought-control technologies that required electrodes implanted in the brain, this technology is completely non-invasive: the user simply wears a cap, and electrodes on the cap record the user's brain waves. The EEG scanning cap carries 64 electrodes pressed against the scalp. These electrodes monitor the brain's electrical activity and relay the signals (or their interruptions) to a computer, which processes the data, converts it into another electronic signal, and transmits it via Wi-Fi to the aircraft's receiver to control its flight movements. The team showed how users could make a model helicopter fly, dive, and climb with their thoughts alone. Through brain-computer interface technology, not only UAVs and other aircraft but also various unmanned ground systems can be controlled by thought: single-electrode brainwave headsets equipped with advanced brainwave-processing chips and high-precision gyroscopes can output head position, attention, relaxation, raw brainwaves, and other parameters over Bluetooth to control ground vehicles.
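The signal path described above, scalp electrodes to spectral features to a scalar "attention" estimate to a flight command, can be sketched as follows. The band limits and the beta-over-alpha attention heuristic are common EEG conventions used here as assumptions, not the Minnesota team's published method, and the command names are invented.

```python
import numpy as np

FS = 256  # sampling rate in Hz (assumed)

def band_power(signal: np.ndarray, lo: float, hi: float) -> float:
    """Mean power of `signal` in the [lo, hi] Hz band via an FFT periodogram."""
    freqs = np.fft.rfftfreq(signal.size, d=1.0 / FS)
    power = np.abs(np.fft.rfft(signal)) ** 2
    mask = (freqs >= lo) & (freqs <= hi)
    return float(power[mask].mean())

def attention_index(channel: np.ndarray) -> float:
    """Beta (13-30 Hz) over alpha (8-12 Hz) power: higher means more focused."""
    return band_power(channel, 13, 30) / (band_power(channel, 8, 12) + 1e-12)

def flight_command(channels: np.ndarray, threshold: float = 1.0) -> str:
    """Average the attention index over all electrode channels and map it
    to a discrete command for the aircraft's receiver."""
    score = float(np.mean([attention_index(ch) for ch in channels]))
    return "ascend" if score > threshold else "descend"
```

In a real 64-electrode system the per-channel features would be far richer (motor-imagery classifiers rather than a single attention ratio), but the structure, acquire, extract features, classify, transmit a command, is the same loop.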

In the future, an individual soldier could control dozens of battlefield robots, assigning each a specific task and a degree of autonomy within the mission framework, and commanding and controlling the surrounding robots by thought. Robots could also operate independently as battlefield conditions change, or work autonomously alongside manned troops and other unmanned combat units. Commanders and operators could sit in a command post, watching screen monitors to direct formed robot units; individual soldiers could wear smart helmets with mind-control technology that not only rotate camera and sensor platforms to observe the robots' behavior but also control the robots' actions. Manned pilots wearing mind-controlled smart helmet displays could command and control a small group of drones flying nearby, or multiple small drones launched from various platforms.

Of course, while minds control AI robots, the robots can also act autonomously, and even display superior wit and superhuman skill. As the intelligent "brains" of unmanned systems evolve toward human-level intelligence, they will increasingly break through the boundaries of military imagination; one day machine intelligence may substitute for human brainpower, and superhuman intelligence is no fantasy. In August 2020, an AI (artificial intelligence) defeated a top F-16 fighter pilot in the US Air Force's simulated air combat; the human pilot said "nothing we do as fighter pilots works." In an April 6, 2023 exercise, the US military tested the interoperability of AI pilots and human operators on beyond-visual-range missions, the first use of low-Earth-orbit satellite communication (LEO SATCOM) in drone history. Commands from human operators were entered through a "hands-on throttle and stick" (HOTAS) interface and sent to the AI pilot via LEO SATCOM; the AI pilot autonomously tracked targets, maneuvered dynamically, and updated relevant information through HOTAS. The information provided by the AI pilot was displayed on the HUD, and the human operator reassigned tasks to the AI through HOTAS based on it. In this test, with LEO SATCOM, ground operators could rapidly train and deploy AI pilots as the drones took off, demonstrating GA-ASI's ability to update AI pilots within minutes.

In early June 2021, the US Navy's Surface Development Squadron One, operating from the Unmanned Operations Center at Naval Base San Diego, California, remotely sailed the USV Nomad, the second vessel in the US military's "Ghost Fleet" test program, from the Gulf of Mexico through the Panama Canal into the Pacific, with 98 percent of the voyage in autonomous mode. The US Navy's unmanned warship Sea Hunter has successfully completed its first sea trial and can sail thousands of miles with no personnel on board.

The Navy's future F/A-XX sixth-generation fighter will fight in future carrier operations as the "quarterback" (the on-field commander in American football) of manned and unmanned aircraft teams, using manned/unmanned teaming for greater lethality and survivability. The American company Austal has formally delivered the expeditionary fast transport Apalachicola to the US Navy. It is the 13th Spearhead-class expeditionary fast transport commissioned by the US Navy and is expected to be the largest unmanned-capable ship in the Navy's fleet. Launched in November 2021, the autonomous Apalachicola sailed 678 nautical miles (1 nautical mile equals 1.852 kilometers) from Mobile, Alabama, to the Port of Miami, Florida, during acceptance and unmanned trials on September 12, 2022, operating autonomously 85 percent of the time. The US Navy has announced plans to build an unmanned fleet of 10 large unmanned surface vessels over the next five years for independent operations or joint operations with surface forces.

In 2015, after Russia deployed robot detachments to the Syrian battlefield for the first time in organized form, it used a robot force of six Platform-M tracked robots, four Argo wheeled robots, one "Akatsiya" automated artillery group, and several drones, coordinated through the Andromeda-D command system, to conduct air-ground integrated human-machine joint operations against terrorist organizations, achieving combat effects that shocked outside observers. In February 2021, Russian media disclosed for the first time footage of the Orion drone in actual combat in Syria, showing the desert-camouflaged aircraft carrying out precision strikes on terrorist ground targets.

In October 2022, the Ukrainian Navy used suicide unmanned boats to raid the Sevastopol naval base where Russia's Black Sea Fleet is stationed, which is seen as an innovative use of such unmanned combat equipment. On February 10, 2023, the Russian army responded in kind, using unmanned boats to attack a strategic bridge connecting Ukraine and Moldova. Western media claimed this was the first time the Russian army had used unmanned boats. The Kremlin did not comment on the matter.

The Russian Navy is also testing underwater drones (unmanned underwater vehicles) that can effectively monitor the underwater environment, detect approaching enemies, guard ships and underwater facilities, find mines, and protect ports from intrusion and sabotage. Underwater drones can be run by an operator or act autonomously in designated areas. For example, the "Harpsichord-1R", Russia's first heavy autonomous unmanned underwater vehicle fit for practical use, passed navigation tests in the Far East in 2005-2006 and was test-deployed in the Arctic in 2007; the "Vityaz-D" autonomous unmanned underwater vehicle, capable of operating in the deepest parts of the world's oceans, was tested in 2019 and descended to the bottom of the Mariana Trench in May 2020, recording depths of more than 10 kilometers.

Israel, a country that places special emphasis on reducing casualties among its soldiers, pioneered the modern Protector unmanned surface vessel program, used to patrol the Lebanese coast and monitor Hezbollah's activities and armament. In 2024, Turkey's Baykar will begin production of its new autonomous "Kizilelma" drone, which is planned to fly in formation with other UAVs.

In 2016, the Royal Navy tested an unmanned fleet, including unmanned ships and drones, off the coasts of Scotland and West Wales. The Royal Navy hopes to build fully automatic, unmanned "Spectre" warships armed with lasers and guided missiles to replace manned frigates and destroyers within the next 10 years. It is also exploring the feasibility of unmanned-system technology for the Queen Elizabeth-class aircraft carriers and is currently testing the Banshee fixed-wing drone, developed by QinetiQ, on the carrier HMS Prince of Wales.

Iran has fielded several series of drones; in May 2022, its "313" underground drone base was disclosed to the public for the first time. As Iran's first dedicated underground UAV combat facility, the base stores hundreds of large, medium, and small reconnaissance-strike UAVs, including the "Ababil-5", "Mohajer-6", "Fotros", and "Kaman-22" series. On July 15, 2022, the Iranian Navy announced the creation of its first naval drone division.

The evolution from 5G to 6G will let unmanned system platforms and warfighters interconnect across a wider space, enabling military commanders to hand ever more command and decision-making authority to new artificial intelligence algorithms. A commander could simply sit at a computer, watch the video transmitted from the cameras, and use thoughts and expressions to exercise various controls over robot soldiers or unmanned system platforms. Unmanned troops going into battle in formation will then become reality. It is foreseeable that the future army will be a combination of man and machine: squad, platoon, and company commanders may gradually be replaced by robots, and intelligent command posts, intelligent opposing forces, unmanned military camps, and the like will emerge.

Disclaimer: This article is reprinted from Strategic Frontier Technology; the original author is Wei Yuejiang. The content reflects the original author's personal views, and this public account compiles/reprints it only to share and convey different perspectives. If you have any objections, please contact us.

Source | Strategic Frontier Technology

Author | Wei Yuejiang


About the Institute

Founded in November 1985, the International Institute of Technology and Economics (IITE) is a non-profit research institution affiliated with the Development Research Center of the State Council. Its main functions are to study major policy, strategic, and forward-looking issues in China's economic, scientific, and technological development, to track and analyze world trends in science, technology, and the economy, and to provide decision-making consulting services for the central government and relevant ministries and commissions. "Global Technology Map" is the official WeChat account of the International Institute of Technology and Economics, dedicated to conveying cutting-edge technology information and insights on scientific and technological innovation to the public.

Address: Building A, Building 20, Xiaonanzhuang, Haidian District, Beijing

Tel: 010-82635522

WeChat: iite_er