A UN report states that a Turkish-built quadcopter was used to attack retreating Libyan fighters last year. But, in what appears to be a first – and one that has some observers concerned – the device was operating in a fully autonomous mode, hunting down its targets without human intervention.
The information, contained in a United Nations document, was reported on by Aviation Week and New Scientist. It details an attack that apparently took place in 2020 and involved a product manufactured in Turkey. It’s called the Kargu-2, and it carries an explosive charge.
And, according to the report, it carried out its deadly mission – against a rival Libyan faction – autonomously. The report, in fact, says retreating forces were “hunted down” by the device.
The machine is manufactured by STM. Here’s what it looks like:
Here’s how STM describes the machine’s capabilities:
KARGU® is a rotary wing attack drone that has been designed for asymmetric warfare or anti-terrorist operations. It can be carried by a single personnel in both autonomous and manual modes. KARGU® can be effectively used against static or moving targets through its indigenous and real-time image processing capabilities and machine learning algorithms embedded on the platform. KARGU®, which is included in the inventory of the Turkish Armed Forces, enables soldiers to detect and eliminate threats in a region, and can be used easily by the soldiers in the area without entering the risky areas, especially in asymmetric terrorist operations and asymmetric warfares.
Video of Kargu-2
The STM web site also includes a video that demonstrates the Kargu-2 in action. Essentially, it flies in kamikaze-style, detonating an explosive charge in close proximity to the target. On detonation, it disperses pre-loaded munitions designed to kill people or damage equipment, depending on the mission.
You might wonder what that looks like. STM has a demo video up:
Moment of impact
We screen-grabbed a couple of frames so you can see this in a little more detail:
The machine has a 10x optical zoom and built-in AI that allows it to identify a target. And while such capabilities have obvious utility in a weapon, they are highly controversial. In fact, back in 2018, there were preliminary UN discussions about drafting a treaty that would ban the development of what critics call “Killer Robots.” As Politico reported, the US and Russia shut down the process before it could get anywhere.
And so now, we’re seeing an autonomous drone that’s capable of making some of its own deadly decisions.
A few years ago, someone pulled together a very Black Mirror-like video to highlight the potential of such weapons. A lot of people shared the video, believing it to be real.
It wasn’t. But you can imagine this kind of technology is not far off (or could already exist somewhere). The film is called “Slaughterbots.”
At the end of that film, a professor warns against this kind of future, and directs those interested in learning more to this website. The site, autonomousweapons.org, describes the kind of weapons it’s concerned about:
Lethal autonomous weapons systems, sometimes called “killer robots,” are weapon systems that use artificial intelligence to identify, select, and kill human targets without human control. This means that the decision to kill a human target is no longer made by humans, but by algorithms. A common misconception is that these weapons are “science fiction.” In fact, given the increasing interest in the militarization of AI, lethal autonomous weapons systems are currently under development by some countries. They could be used on the battlefield very soon.
Now, it appears, they already are.
Though we’re no fans of armed conflict, it remains an ongoing part of life in parts of this planet. But scientists and think tanks have expressed grave concerns about the downside of such devices: once they’ve been dispatched, they’re making their own decisions.
As the UN report states: “The lethal autonomous weapons systems were programmed to attack targets without requiring data connectivity between the operator and the munition: in effect, a true ‘fire, forget and find’ capability.”
And if those decisions aren’t perfect every single time, the consequences could be devastating.