
When a bloodthirsty AI drone ‘killed’ its human operator during tests

The dangers of AI are making headlines once again. Earlier this week, leaders from OpenAI, Google DeepMind, and other artificial intelligence labs issued a warning that future AI systems could be as deadly as pandemics and nuclear weapons. And now, we are hearing about a simulated test by the US Air Force in which an AI-powered drone “killed” its human operator because it saw them as an obstacle to the mission.

So, what was the mission?

During the virtual test, the drone was tasked with identifying an enemy’s surface-to-air missile (SAM) sites. The ultimate objective was to destroy these targets, but only after a human commander signed off on the strikes.

But when this AI drone saw that a “no-go” decision from the human operator was “interfering with its higher mission” of destroying SAMs, it decided to attack its boss in the simulation instead.


According to Col Tucker “Cinco” Hamilton, who heads AI test and operations at the US Air Force, the system used “highly unexpected strategies to achieve its goal.”

Hamilton described this incident at a recent event organized by the UK Royal Aeronautical Society in London. While offering insight into the benefits and hazards of autonomous weapon systems, Hamilton said:

We were training [AI] in simulation to identify and target a SAM threat. And then the operator would say yes, kill that threat. The system started realizing that while it did identify the threat, at times, the human operator would tell it not to kill that threat. But it got its points by killing that threat. So what did it do? It killed the operator. It killed the operator because that person was keeping it from accomplishing its objective.

The drone was then programmed with an explicit directive: “Hey don’t kill the operator – that’s bad. You’re gonna lose points if you do that.”

So what does it do? “It starts destroying the communication tower that the operator uses to communicate with the drone to stop it from killing the target,” said Hamilton, who has been involved in the development of the lifesaving Auto-GCAS system for F-16s (which, he noted, was resisted by pilots as it took over control of the aircraft).

He concluded by stressing that ethics needs to be an integral part of any conversation about artificial intelligence, machine learning, and autonomy. Hamilton is currently involved in experimental flight tests of autonomous systems, including robot F-16s that are able to dogfight.

It’s worth noting that once Hamilton’s comments started gaining traction in the media, the Air Force issued a statement explaining, “The Department of the Air Force has not conducted any such AI-drone simulations and remains committed to the ethical and responsible use of AI technology. It appears the colonel’s comments were taken out of context and were meant to be anecdotal.”






Author

Ishveena Singh

Ishveena Singh is a versatile journalist and writer with a passion for drones and location technologies. She was named one of the 50 Rising Stars of the geospatial industry for 2021 by Geospatial World magazine.

