‘Suicide drone’ that picks own targets seen in Ukraine in horror AI breakthrough

A “suicide drone” that uses artificial intelligence to recognise targets and destroy them without being controlled by a human operator has reportedly been spotted in Ukraine

Ukraine’s Internal Affairs Minister Anton Gerashchenko posted a photo of the KUB-BLA device to Telegram.

The six-foot drone is made by ZALA Aero, a subsidiary of famed Russian arms manufacturer Kalashnikov. After being fired from a portable launcher, the KUB-BLA can loiter over a target area for up to half an hour, flying at speeds of around 80mph.

Once it has recognised a suitable target it deliberately crashes into it, detonating its seven-pound high explosive payload.

The images shared by Mr Gerashchenko appear to show a drone that has been damaged or shot down, but have yet to be officially verified.

At a demonstration of the deadly drone in 2019, ZALA claimed that it featured “intelligent detection and recognition of objects by class and type in real time”.

Noting the effectiveness of Ukraine’s portable anti-aircraft missiles, military expert Samuel Bendett says the killer drones are “an extraordinarily cheap alternative to flying manned missions”.

“They are very effective both militarily and of course psychologically,” he told Wired.

A UN report published last year concluded that a similar drone may have been used to 'hunt down' ground troops in the Libyan civil war. Russian leader Vladimir Putin has called for a ban on such weapons in the past.

Many experts fear that handing over the final decision about life and death could have horrifying consequences. “Giving machines the power to decide who lives and dies on the battlefield would take technology too far,” says Steve Goose, Arms Division director at Human Rights Watch.

“Human control of robotic warfare is essential to minimising civilian deaths and injuries,” he added.

Zachary Kallenborn, a research affiliate with the National Consortium for the Study of Terrorism and Responses to Terrorism, warns that the deployment of the KUB-BLA in a war zone is a significant development in warfare.

“The notion of a killer robot—where you have artificial intelligence fused with weapons—that technology is here, and it's being used,” he said.

A spokesperson from the Stop Killer Robots Coalition said: "Autonomous weapons would lack the human judgment necessary to evaluate the proportionality of an attack, distinguish civilian from combatant, and abide by other core principles of the laws of war."

MIT professor Max Tegmark has campaigned against the use of fully autonomous weapons. He said that unless a worldwide ban is introduced, the development and deployment of the technology will continue.

“We’ll see even more proliferation of such lethal autonomous weapons unless more Western nations start supporting a ban on them.”