Brian Christian


The real shift in our time isn’t only that we’ve stopped worrying about surveillance, exhibitionism and a lack of privacy, but that we’ve embraced these things–demanded them, even. There must have been something lacking in our lives, something gone unfulfilled. But is this intimacy with technology and the sense of connection and friendship and relationship that attends it–often merely a likeness of love–an evolutionary correction or merely a desperate swipe in the wrong direction?

The opening of Brian Christian’s New Yorker piece about Spike Jonze’s Her, a film about love in the time of simulacra, in which a near-future man is wowed by a “woman” who seems to him like more than just another pretty interface:

“In 1966, Joseph Weizenbaum, a professor of computer science at M.I.T., wrote a computer program called Eliza, which was designed to engage in casual conversation with anybody who sat down to type with it. Eliza worked by latching on to keywords in the user’s dialogue and then, in a kind of automated Mad Libs, slotted them into open-ended responses, in the manner of a so-called non-directive therapist. (Weizenbaum wrote that Eliza’s script, which he called Doctor, was a parody of the method of the psychologist Carl Rogers.) ‘I’m depressed,’ a user might type. ‘I’m sorry to hear you are depressed,’ Eliza would respond.

Eliza was a milestone in computer understanding of natural language. Yet Weizenbaum was more concerned with how users seemed to form an emotional relationship with the program, which consisted of nothing more than a few hundred lines of code. ‘I was startled to see how quickly and how very deeply people conversing with DOCTOR became emotionally involved with the computer and how unequivocally they anthropomorphized it,’ he wrote. ‘Once my secretary, who had watched me work on the program for many months and therefore surely knew it to be merely a computer program, started conversing with it. After only a few interchanges with it, she asked me to leave the room.’ He continued, ‘What I had not realized is that extremely short exposures to a relatively simple computer program could induce powerful delusional thinking in quite normal people.’

The idea that people might be unable to distinguish a conversation with a person from a conversation with a machine is rooted in the earliest days of artificial-intelligence research.”
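The passage above describes the whole mechanism: latch onto a keyword in the user's input, reflect the rest of the sentence back, and slot it into a canned, open-ended reply. As a rough illustration, here is a minimal Python sketch of that keyword-and-template technique; the rule names, patterns, and responses are invented for the example and are not Weizenbaum's actual DOCTOR script.

```python
import re

# Pronoun "reflection" so echoed fragments read naturally ("i am" -> "you are").
REFLECTIONS = {
    "i": "you",
    "me": "you",
    "my": "your",
    "am": "are",
    "i'm": "you're",
}

# Keyword patterns paired with open-ended response templates; {0} is filled
# with the reflected remainder of the user's sentence (the "Mad Libs" slot).
RULES = [
    (re.compile(r"\bi am (.*)", re.I), "I'm sorry to hear you are {0}."),
    (re.compile(r"\bi'm (.*)", re.I), "I'm sorry to hear you are {0}."),
    (re.compile(r"\bi need (.*)", re.I), "Why do you need {0}?"),
    (re.compile(r"\bmy (.*)", re.I), "Tell me more about your {0}."),
]

FALLBACK = "Please, go on."  # non-directive nudge when no keyword matches


def reflect(fragment: str) -> str:
    """Swap first-person words for second-person ones in the echoed fragment."""
    words = fragment.lower().rstrip(".!?").split()
    return " ".join(REFLECTIONS.get(w, w) for w in words)


def respond(utterance: str) -> str:
    """Return a canned response keyed on the first matching keyword pattern."""
    for pattern, template in RULES:
        match = pattern.search(utterance)
        if match:
            return template.format(reflect(match.group(1)))
    return FALLBACK


if __name__ == "__main__":
    print(respond("I'm depressed"))       # I'm sorry to hear you are depressed.
    print(respond("I need a vacation."))  # Why do you need a vacation?
```

A handful of rules like these is already enough to reproduce the "I'm depressed" exchange quoted above.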


The opening of “Mind vs. Machine,” Brian Christian’s recent Atlantic article about the author’s participation in an annual Turing Test competition, in which computers compete to exhibit intelligent behavior that can pass for human:

“BRIGHTON, ENGLAND, SEPTEMBER 2009. I wake up in a hotel room 5,000 miles from my home in Seattle. After breakfast, I step out into the salty air and walk the coastline of the country that invented my language, though I find I can’t understand a good portion of the signs I pass on my way—LET AGREED, one says, prominently, in large print, and it means nothing to me.

I pause, and stare dumbly at the sea for a moment, parsing and reparsing the sign. Normally these kinds of linguistic curiosities and cultural gaps intrigue me; today, though, they are mostly a cause for concern. In two hours, I will sit down at a computer and have a series of five-minute instant-message chats with several strangers. At the other end of these chats will be a psychologist, a linguist, a computer scientist, and the host of a popular British technology show. Together they form a judging panel, evaluating my ability to do one of the strangest things I’ve ever been asked to do.

I must convince them that I’m human.

Fortunately, I am human; unfortunately, it’s not clear how much that will help.” (Thanks to The Electric Typewriter.)

••••••••••


"That it could spin half-discernible essays on postmodern theory before it could be shown a chair and say, as most toddlers can, 'chair'?"

From “Mind vs. Machine,” an article in the Atlantic by Brian Christian.

“As for the prospects of AI, some people imagine the future of computing as a kind of heaven. Rallying behind an idea called ‘The Singularity,’ people like Ray Kurzweil (in The Singularity Is Near) and his cohort of believers envision a moment when we make smarter-than-us machines, which make machines smarter than themselves, and so on, and the whole thing accelerates exponentially toward a massive ultra-intelligence that we can barely fathom. Such a time will become, in their view, a kind of techno-Rapture, in which humans can upload their consciousness onto the Internet and get assumed—if not bodily, then at least mentally—into an eternal, imperishable afterlife in the world of electricity.

Others imagine the future of computing as a kind of hell. Machines black out the sun, level our cities, seal us in hyperbaric chambers, and siphon our body heat forever.

I’m no futurist, but I suppose if anything, I prefer to think of the long-term future of AI as a kind of purgatory: a place where the flawed but good-hearted go to be purified—and tested—and come out better on the other side.

Who would have imagined that the computer’s earliest achievements would be in the domain of logical analysis, a capacity once held to be what made us most different from everything else on the planet? That it could fly a plane and guide a missile before it could ride a bike? That it could create plausible preludes in the style of Bach before it could make plausible small talk? That it could translate before it could paraphrase? That it could spin half-discernible essays on postmodern theory before it could be shown a chair and say, as most toddlers can, ‘chair’?

As computers have mastered rarefied domains once thought to be uniquely human, they simultaneously have failed to master the ground-floor basics of the human experience—spatial orientation, object recognition, natural language, adaptive goal-setting—and in so doing, have shown us how impressive, computationally and otherwise, such minute-to-minute fundamentals truly are.

We forget how impressive we are. Computers are reminding us.”
