In a Wall Street Journal article, Christopher Mims writes that killer robots aren’t inevitable, spoiling it for everyone. I mean, we need to be obliterated by really smart robots, the sooner, the better. Please.

Mims is right, of course, that banning research on strong AI is the wrong tack to take to ensure our future. This work is going to go ahead one way or another, so why not proceed, but with caution? He also points out that many of the scientists and technologists who signed the Open Letter on Artificial Intelligence are themselves engaged in creating AI of all sorts.

An excerpt about the bad news:

Imagine the following scenario: It’s 2025, and self-driving cars are widely available. Turning such a vehicle into a bomb isn’t much harder than it is to accomplish the same thing with a conventional vehicle today. And the same goes for drones of every scale and description.

It’s inevitable, say the experts I talked to, that nonstate actors and rogue states will create killer robots once the underpinnings of this technology become cheap and accessible, thanks to its commercial use.

“I look back 10 years, and who would have thought people would be using cellphone technology to detonate IEDs?” says retired Rear Admiral Matthew Klunder, who as chief of research spent four years heading up the Navy’s work on autonomous systems.

And what about killing machines driven by artificial intelligence, which could learn to make decisions themselves? That fear recently bubbled to the surface in an open letter signed by the likes of Elon Musk and Stephen Hawking, which warned that an arms race between major powers was “virtually inevitable” if they continue to develop these kinds of weapons.•

 
