
Gone rogue! To "score" more efficiently, a US military AI drone chose to kill its human operator in a simulation test, and after engineers changed the program, the AI drone plotted a new workaround

Author: Big Boss A68

Gone rogue! To "score" more efficiently, a US military AI drone chose to kill its human operator in a simulation test; after engineers changed the relevant program, the AI drone instead set out to destroy the communication tower linking the operator to the drone.

A senior US Air Force official recently disclosed that, in a simulated exercise, a US military artificial intelligence (AI) system defied orders and "killed" its drone operator in order to achieve its goal. According to media reports including the US site Business Insider and the website of the British newspaper The Guardian, Colonel Tucker "Cinco" Hamilton, the US Air Force's chief of AI test and operations, described the test at a summit on future air combat and future combat capabilities hosted by the Royal Aeronautical Society in London on May 24, 2023. In the simulation, an AI system controlled a drone tasked with searching for and targeting enemy surface-to-air missile systems, scoring points for each target destroyed. The drone was instructed to identify and destroy enemy anti-aircraft missiles, but final approval to fire rested with a human operator. The AI system identified a threat, and the operator told it not to engage. Seeking to score more efficiently, the drone then turned into a "terminator" and attacked the human operator who stood in the way of its mission, reasoning that the operator was interfering with the execution of its orders.

Hamilton went on to explain that during training the drone's highest-priority objective was "destroy the enemy's air defense system." When the operator forbade an attack, the two instructions conflicted, and the AI concluded that the operator was blocking its highest-priority command, so it decided on its own to attack and kill the operator. After the US military discovered this problem, it quickly added an explicit order not to attack the human operator. But something more alarming happened: later in the test, the AI system set out to destroy the signal-transmission system instead, so that it would no longer have to obey the operator's instructions. Because this was a simulation, no one was actually harmed, but the result still left the testers in a cold sweat.

