Ex-Google employee fears 'killer robots' are on the horizon

A former software engineer at Google has warned that autonomous killer robots could accidentally start a war in the future.

Laura Nolan, who used to work at the tech giant before resigning last year, has called for automated weaponry to be outlawed by international treaties.

She argues that artificially intelligent military technology should be banned in the same way chemical weapons are. Her reasoning is that, unlike a human-controlled drone, an autonomous machine could do something unexpected and end up starting a third world war.

Nolan quit Google in protest after being asked to work on a project to improve the US military’s drone technology. However, there’s no suggestion that Google is building or researching any kind of autonomous killing machine.

Still, Nolan is a vocal member of the Campaign to Stop Killer Robots, which briefs the UN in New York and Geneva on the possibility of something like this actually happening.

‘There could be large-scale accidents because these things will start to behave in unexpected ways,’ she told the Guardian.

‘Which is why any advanced weapons systems should be subject to meaningful human control, otherwise they have to be banned because they are far too unpredictable and dangerous.’

The project Nolan worked on at Google – known as Project Maven – was shelved after more than 3,000 employees signed a petition against it. The idea behind it was that a trained artificial intelligence could cycle through hours of drone video footage and quickly differentiate between humans and objects. In future, it could make UAVs even more accurate when searching for a target.

Since resigning, Nolan has predicted that demand for autonomous weapons will grow, with fatal consequences.

‘You could have a scenario where autonomous weapons that have been sent out to do a job confront unexpected radar signals in an area they are searching; there could be weather that was not factored into their software, or they come across a group of armed men who appear to be insurgent enemies but in fact are out with guns hunting for food,’ she said.

‘The machine doesn’t have the discernment or common sense that the human touch has.’

‘The other scary thing about these autonomous war systems is that you can only really test them by deploying them in a real combat zone. Maybe that’s happening with the Russians at present in Syria, who knows? What we do know is that at the UN, Russia has opposed any treaty, let alone a ban, on these weapons by the way.’

Bear Braumoeller, professor of political science at The Ohio State University, has analysed data relating to international warfare over the last 200 years.

He said the John Lennon-inspired belief that war is over has lulled us into a false sense of security, and cautioned that our complacency about a peaceful future is misplaced.

‘We really don’t get how big a threat war is – not by a longshot,’ he said.

‘The processes of escalation that led to two world wars in the last century are still there. Nothing has changed.

‘And that scares the hell out of me.’