Vernor Vinge

In a Popular Science piece, Erik Sofge offers a smackdown of the Singularity, thinking it less science than fiction. An excerpt:

“The most urgent feature of the Singularity is its near-term certainty. [Vernor] Vinge believed it would appear by 2023, or 2030 at the absolute latest. Ray Kurzweil, an accomplished futurist, author (his 2005 book The Singularity Is Near popularized the theory) and recent Google hire, has pegged 2029 as the year when computers will match and exceed human intelligence. Depending on which luminary you agree with, that gives humans somewhere between 9 and 16 good years, before a pantheon of machine deities gets to decide what to do with us.

If you’re wondering why the human race is handling the news of its impending irrelevance with such quiet composure, it’s possible that the species is simply in denial. Maybe we’re unwilling to accept the hard truths preached by Vinge, Kurzweil and other bright minds.

Just as possible, though, is another form of denial. Maybe no one in power cares about the Singularity, because they recognize it as science fiction. It’s a theory that was proposed by an SF writer. Its ramifications are couched in the imagery and language of SF. To believe in the Singularity, you have to believe in one of the greatest myths ever told by SF—that robots are smart, and always on the verge of becoming smarter than us.

More than 60 years of AI research indicates otherwise.”

From Vernor Vinge’s famous 1993 essay, “The Coming Technological Singularity: How to Survive in the Post-Human Era,” which popularized the term for the moment when machine intelligence surpasses the human kind:

“What are the consequences of this event? When greater-than-human intelligence drives progress, that progress will be much more rapid. In fact, there seems no reason why progress itself would not involve the creation of still more intelligent entities — on a still-shorter time scale. The best analogy that I see is with the evolutionary past: Animals can adapt to problems and make inventions, but often no faster than natural selection can do its work — the world acts as its own simulator in the case of natural selection. We humans have the ability to internalize the world and conduct ‘what if’s’ in our heads; we can solve many problems thousands of times faster than natural selection. Now, by creating the means to execute those simulations at much higher speeds, we are entering a regime as radically different from our human past as we humans are from the lower animals.

From the human point of view this change will be a throwing away of all the previous rules, perhaps in the blink of an eye, an exponential runaway beyond any hope of control. Developments that before were thought might only happen in ‘a million years’ (if ever) will likely happen in the next century.

I think it’s fair to call this event a singularity (‘the Singularity’ for the purposes of this paper). It is a point where our models must be discarded and a new reality rules. As we move closer and closer to this point, it will loom vaster and vaster over human affairs till the notion becomes a commonplace. Yet when it finally happens it may still be a great surprise and a greater unknown.”
