- The human brain is the most amazingly complex machine, until the day it becomes a simple one. If we last long enough, that moment will arrive, and consciousness will no longer be the hard problem or any problem at all.
- I don’t think intelligent machines are happening anytime soon, but they’re likely if the Anthropocene or some other age doesn’t claim us first. In fact, we may ultimately become them, more or less. But I’m not talking about today or tomorrow. In the meanwhile, Weak AI will be enough of a boon and bane to occupy us.
- The problem I have with concerned technologists attempting to curb tomorrow’s superintelligence today is that any prescripts we create now will become moot soon enough as realities shift. New answers will alter old questions. It’s better to take an incremental approach to these challenges: think through them wisely in our era and trust future humans to do the same in theirs.
From Jane Wakefield’s BBC article “Intelligent Machines: Do We Really Need to Fear AI?”:
Already operating on the South Korean border is a sentry robot, dubbed SGR-1. Its heat-and-motion sensors can identify potential targets more than two miles away. Currently it requires a human before it shoots the machine gun that it carries, but it raises the question – who will be responsible if the robots begin to kill without human intervention?
The use of autonomous weapons is something that the UN is currently discussing and has concluded that humans must always have meaningful control over machines.
Noel Sharkey co-founded the Campaign to Stop Killer Robots and believes there are several reasons why we must set rules for future battlefield bots now.
“One of the first rules of many countries is about preserving the dignity of human life and it is the ultimate human indignity to have a machine kill you,” he said.
But beyond that moral argument is a more strategic one which he hopes military leaders will take on board.
“The military leaders might say that you save soldiers’ lives by sending in machines instead but that is an extremely blinkered view. Every country, including China, Russia and South Korea, is developing this technology and in the long run, it is going to disrupt global security,” he said.
“What kind of war will be initiated when we have robots fighting other robots? No-one will know how the other ones are programmed and we simply can’t predict the outcome.”•
Tags: Jane Wakefield, Noel Sharkey