“I Don’t Believe Consciousness Is Necessary For Human-Level Artificial Intelligence”


In a smart Gizmodo post, George Dvorsky rifles through numerous myths about AI, separating what he believes to be fact from fiction. One item particularly caught my eye. It has to do with machines “coming alive,” achieving consciousness.

Working to understand consciousness in humans is a fascinating pursuit, and trying to transfer this state onto machines is a fraught if likewise absorbing business. But do machines need to be self-aware, as we are, in order to surpass us? Probably not.

I think such a passing of the torch is possible in the very long term, but it’s probably no more needed for AI to knock us from atop the food chain than it is for planes to flap their wings like birds to fly. Machines will attain superintelligence long, long before consciousness.

An excerpt:

Myth: “Artificial intelligence will be conscious.”

Reality: A common assumption about machine intelligence is that it’ll be conscious—that is, it’ll actually think the way humans do. What’s more, critics like Microsoft co-founder Paul Allen believe that we’ve yet to achieve artificial general intelligence (AGI), i.e. an intelligence capable of performing any intellectual task that a human can, because we lack a scientific theory of consciousness. But as Imperial College London cognitive roboticist Murray Shanahan points out, we should avoid conflating these two concepts.

“Consciousness is certainly a fascinating and important subject—but I don’t believe consciousness is necessary for human-level artificial intelligence,” he told Gizmodo. “Or, to be more precise, we use the word consciousness to indicate several psychological and cognitive attributes, and these come bundled together in humans.”

It’s possible to imagine a very intelligent machine that lacks one or more of these attributes. Eventually, we may build an AI that’s extremely smart, but incapable of experiencing the world in a self-aware, subjective, and conscious way. Shanahan said it may be possible to couple intelligence and consciousness in a machine, but that we shouldn’t lose sight of the fact that they’re two separate concepts.

And just because a machine passes the Turing Test—in which a computer is indistinguishable from a human—that doesn’t mean it’s conscious. To us, an advanced AI may give the impression of consciousness, but it will be no more aware of itself than a rock or a calculator.•