I've spent my life working on artificial intelligence (AI), and there are many reasons why I am fearful of the development of killer robots. Here are five of them.
1. Killer robots are near
You might be thinking of "Terminator" -- a robot which, if you believe the movie, will be available in 2029. But the reality is that killer robots will be much simpler to begin with and are, at best, only a few years away. Think Predator drone and its aptly named Hellfire missiles, but with the human controller replaced by a computer program. This is technically possible today.
2. There will be an arms race
Once this genie is out of the bottle, there will be an arms race to improve on the initially rather crude robots. And the end point of such an arms race is precisely the sort of terrifying technology you see in "Terminator." Hollywood got that part right.
Moore's Law predicts that the number of transistors on a computer chip doubles roughly every two years. We're likely to see similar exponential growth with killer robots. I vote to call this "Schwarzenegger's Law" to remind us of where it will end.
3. Killer robots will proliferate
Killer robots will be cheap. And they'll only get cheaper. Just look at the speed with which drones have dropped in price over the last few years. They'll also be easy to make, at least crudely.
Get yourself a quadcopter, and add a smartphone and a gun or a small bomb. Then all you need is someone like me to write you some AI software.
And the military will love them, at least at first, as they don't need sleep or rest, long and expensive training, or evacuation from the battlefield when damaged.
However, once the military start having to defend themselves against killer robots, they might change their mind.
4. Killer robots will be killing lots of civilians
According to The Intercept, during a five-month stretch of a 2011-13 U.S. military operation against the Taliban and al Qaeda in the Hindu Kush, "nearly nine out of 10 people" who died in drone strikes "were not the Americans' direct targets."
This is when we still have a human in the loop, making that final life-or-death decision. The current state of the art in AI does not approach the situational awareness or decision-making of a human drone pilot.
The statistics for a fully autonomous drone will therefore likely be even worse.
Over time, they'll get better, and I fully expect them to equal, if not exceed, human pilots.
Different arguments then come into play. For example, killer robots will surely fall into the wrong hands, including people who have no qualms about using them against civilians.
They are a perfect weapon of terror. Killer robots will also lower the barriers to war. By further distancing us from the battlefield, they'll turn war into a very real video game.
5. Killer robots will be hard to regulate
Tesla can update its Model S to drive autonomously on the highway with a simple software update delivered over the air.
We have to expect therefore that simple software updates will in the future be able to turn systems that are either not autonomous or not lethal into lethal autonomous weapons. This is going to make it very hard to control killer robots.
And we are going to want the technologies that go into killer robots. They are much the same technologies that go into autonomous cars, most of which already exist.
Each year, roughly 30,000 people die on the roads of the United States, and 1.2 million worldwide. This statistic will plummet once autonomous cars are common.
But just because something is going to be hard doesn't mean we shouldn't try. And even a ban that is only partially effective, like that for anti-personnel mines, is going to be worth having.
My view that we need to regulate killer robots to prevent an arms race -- and that we need to act quickly -- is shared by many others in the know. An open letter calling for such a ban was released in July this year.
The signatories include many leading researchers in AI and robotics, the CEOs of Google's DeepMind, Facebook's AI Research Lab, and the Allen Institute for AI, as well as thousands of others from around the world.
In November, the U.N. Convention on Certain Conventional Weapons meets again in Geneva to decide whether to continue with this issue, and whether to take the next step towards a ban. For the world's sake, I hope they do.
Source: CNN