Steven Levy


Nobody calls anymore: It’s all texts, tweets and emojis. Phones are ever-more sophisticated, but most functions are silent. There are attempts, however, to remake the 139-year-old tool to fit the more fluid demands of what is becoming a post-voice world, though privacy may again suffer collateral damage. The opening of “Brave New Phone Call,” a just-published Medium piece by Steven Levy, the leading tech journalist of the personal-computing era:

“It is a gorgeous late summer afternoon, and I am sitting with Ray Ozzie in his spacious home office in Manchester-by-the-Sea, 30 miles up the coast from Boston. The software visionary who created Lotus Notes and who later succeeded Bill Gates as Microsoft’s chief software architect is explaining to me how the humble phone call is not dying, as many might believe, but is busy being reborn.

It’s not an abstract subject for the 58-year-old entrepreneur. For the past few weeks I have been using the app his company is announcing today, called Talko. It’s a weird, almost magical, combination of phone calling, text messaging, virtual conferencing and Instagram-ish photo sharing. Depending on how you view it, Talko is three or 39 years in the making.

At one point, Ozzie wants to show me something on the app. We both pull out our iPhones and connect with each other; actually, in that moment, we reconnect to a conversation we’ve been having all month that’s been recorded and archived in the app. I think my editor might be interested in the discussion, so we expand the conversation to include him. He’s unable to join us at the moment—I should have known, because the app lets me see that he’s walking around somewhere on the West Coast—but I shoot a photo for him to look at anyway, and Ozzie and I continue talking. Later, my editor will listen to that part of the conversation and see the picture at the moment we shot it. And he’ll have the option to comment, perhaps kicking off a longer discussion down the road, either by convening us together in real time or continuing in the same piecemeal fashion as today.

That’s a typical Talko phone call—mixing presence and playback for a totally new experience. God knows that the old experience of a phone call is getting tired.

A few days later, to note the irony of it all, Ozzie sends me a photo in the same ongoing conversation. It’s a plaque in downtown Boston, a block from a Talko engineering office there:

BIRTHPLACE OF THE TELEPHONE
Here on June 2, 1875, Alexander Graham Bell and Thomas A. Watson first transmitted sound over wires.

That phone call represented an amazing advance in communications. But Ozzie considers it equally amazing that in the 139 years since ‘Mr. Watson, come here,’ phone calls haven’t changed much.”

________________________________

Debut of the Picturephone, 1970:


As absolutely everyone has mentioned, it’s the 30th anniversary of the Apple Macintosh and the great “1984” ad that introduced it to the masses. The Mac, even if it wasn’t a tremendous success in and of itself, changed everything by popularizing the GUI, making text-only interfaces feel obsolete, and coming up with a design that aspired to update the modernist beauty of Olivetti.

Steven Levy, the best tech journalist of the personal-computing age, is releasing an unexpurgated version of an interview he did with Steve Jobs just as the Mac was about to drop. In that conversation, the Apple co-founder asserts that the invention of the light bulb influenced the course of history more than Marxism. I probably disagree with that, though a lot fewer people died by misuse of the light bulb. From Nick Bilton at the New York Times:

“There are some aspects of the 30-year-old interview that might answer some unanswerable questions about what Mr. Jobs would have done with his life if he were still alive today.

When Mr. Levy told Mr. Jobs that there was ‘speculation’ that he might go into politics, Mr. Jobs replied that he had no desire to enter the public sector and noted that the private sector could have a greater influence on society. ‘I’m one of those people that think Thomas Edison and the light bulb changed the world more than Karl Marx ever did,’ Mr. Jobs said.

One thing Mr. Levy was continually searching for in the interview was what was driving Mr. Jobs — a question that was echoed in 2011 in Steve Jobs, the biography written by Walter Isaacson.

In the 1983 interview, it’s clear that money isn’t the answer. Mr. Jobs talked about his net worth falling by $250 million in six months. ‘I’ve lost a quarter billion dollars! You know, that’s very character building,’ he said, noting that at some point, counting your millions of dollars is ‘just stupid.’

Mr. Levy pressed again. ‘The question I was getting at is, what’s driving you here?’

‘Well, it’s like computers and society are out on a first date in this decade, and for some crazy reason we’re just in the right place at the right time to make that romance blossom,’ Mr. Jobs replied, noting that the 1980s were the beginning of the computing revolution. ‘We can make them great, we can make a great product that people can easily use.’

Such passion is something that would follow Mr. Jobs through his career, and what he said next seemed to be the driving force behind that passion.”


In an addendum of sorts to his recent Wired article, “How the NSA Almost Killed the Internet,” Steven Levy, who wrote one of my favorite books ever, has published some takeaways from his recent conversations with the embattled government organization. One example about a certain freelancer:

“They really hate Snowden. The NSA is clearly, madly, deeply furious at the man whose actions triggered the biggest crisis in its history. Even while contending they welcome the debate that now engages the nation, they say that they hate the way it was triggered. The NSA has an admittedly insular culture — the officials described it as almost like a family. Morale suffers when friends and neighbors think that NSA employees are sitting around reading grandma’s email. Also, the agency believes that the Snowden leaks have seriously hurt national security (though others dispute this). NSA officials are infuriated that all this havoc was caused by some random contractor. They suggest that had Snowden been familiar with the culture and the ethos of the agency, understood the level of training undergone by its employees, seen the level of regulations and oversight, he would have been less likely to abscond with all those documents. (Snowden’s interviews indicate otherwise.) Still, they are stunned that someone ‘inside the fence’ would do what Snowden did. Even if Snowden is eventually pardoned, he’d do well to steer clear of Fort Meade.”


Garry Kasparov’s defeat at the hands–well, not exactly hands–of Deep Blue was supposed to have delivered a message to humans that we needed to dedicate ourselves to other things–but the coup de grace was ignored. In fact, computers have only enhanced our chess acumen, making it clear that thus far a hybrid is better than either carbon or silicon alone. In the wake of Computer Age child Magnus Carlsen becoming the greatest human player on Earth, Christopher Chabris and David Goodman of the Wall Street Journal look at the surprising resilience of chess in these digital times. The opening:

“In the world chess championship match that ended Friday in India, Norway’s Magnus Carlsen, the cool, charismatic 22-year-old challenger and the highest-rated player in chess history, defeated local hero Viswanathan Anand, the 43-year-old champion. Mr. Carlsen’s winning score of three wins and seven draws will cement his place among the game’s all-time greats. But his success also illustrates a paradoxical development: Chess-playing computers, far from revealing the limits of human ability, have actually pushed it to new heights.

The last chess match to get as much publicity as Mr. Carlsen’s triumph was the 1997 contest between then-champion Garry Kasparov and International Business Machines Corp.’s Deep Blue computer in New York City. Some observers saw that battle as a historic test for human intelligence. The outcome could be seen as an ‘early indication of how well our species might maintain its identity, let alone its superiority, in the years and centuries to come,’ wrote Steven Levy in a Newsweek cover story titled ‘The Brain’s Last Stand.’ 

But after Mr. Kasparov lost to Deep Blue in dramatic fashion, a funny thing happened: nothing.”•

_________________________________________

“In Norway, you’ve got two big sports–chess and sadness”:


The opening of “Twilight of the Trucks,” Steven Levy’s new Wired piece about Apple’s further shift from the desk and the lap into the pocket:

“Almost exactly 2 years ago, Steve Jobs outlined his view of personal computing. We used to be an agrarian nation, he explained, and as a result our vehicles were largely trucks. As the country became more urban and suburban, we moved to an era where the highways were dominated by cars, not their lumbering counterparts.

The same thing was happening in the technology world — and, with the iPhone and the iPad, the movement was accelerating. ‘PCs are going to be like trucks,’ Jobs said at the 2010 All Things D Conference. ‘They are still going to be around, but only one out of x people will need them.’ Clearly, he didn’t expect the percentage to be a big number.”


From Steven Levy’s new book about Google, In the Plex, comes this conversation between company co-founders Larry Page and Sergey Brin:

“It will be included in people’s brains,” said Page. “When you think about something and don’t really know much about it, you will automatically get information.”

“That’s true,” said Brin. “Ultimately I view Google as a way to augment your brain with the knowledge of the world. Right now you go into your computer and type a phrase, but you can imagine that it could be easier in the future, that you can have just devices you talk into, or you can have computers that pay attention to what’s going on around them.”

Page said, “Eventually you’ll have the implant, where if you think about a fact, it will just tell you the answer.” (Thanks NYRB.)•


"A mathematician, a former peacenik, and an enemy of exclusive government control of encryption systems."

Whitfield Diffie created a tool to help him explain a product, but it was the tool itself that was the great product. To understand how Diffie never made a cent from a creation that pointed the way to the game-changing PowerPoint, read this 2001 article by the excellent New Yorker writer Ian Parker. An excerpt:

“In 1980, though, it was clear that a future of widespread personal computers—and laser printers and screens that showed the very thing you were about to print—was tantalizingly close. In the Mountain View, California, laboratory of Bell-Northern Research, computer-research scientists had set up a great mainframe computer, a graphics workstation, a phototypesetter, and the earliest Canon laser printer, which was the size of a bathtub and took six men to carry into the building—together, a cumbersome approximation of what would later fit on a coffee table and cost a thousand dollars. With much trial and error, and jogging from one room to another, you could use this collection of machines as a kind of word processor.

Whitfield Diffie had access to this equipment. A mathematician, a former peacenik, and an enemy of exclusive government control of encryption systems, Diffie had secured a place for himself in computing legend in 1976, when he and a colleague, Martin Hellman, announced the discovery of a new method of protecting secrets electronically—public-key cryptography. At Bell-Northern, Diffie was researching the security of telephone systems. In 1981, preparing to give a presentation with 35-mm. slides, he wrote a little program, tinkering with some graphics software designed by a B.N.R. colleague, that allowed you to draw a black frame on a piece of paper. Diffie expanded it so that the page could show a number of frames, and text inside each frame, with space for commentary around them. In other words, he produced a storyboard—a slide show on paper—that could be sent to the designers who made up the slides, and that would also serve as a script for his lecture. (At this stage, he wasn’t photocopying what he had produced to make overhead transparencies, although scientists in other facilities were doing that.) With a few days’ effort, Diffie had pointed the way to PowerPoint.” (Thanks Longform.)
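Parker mentions public-key cryptography only in passing, so here is a minimal sketch of the idea Diffie and Hellman announced in 1976, written as a toy Diffie-Hellman key exchange in Python. Everything in it is illustrative rather than drawn from the article: the tiny prime, the generator, and the variable names are my own stand-ins, and real systems use large, vetted parameters and audited libraries.

```python
# Toy Diffie-Hellman key exchange: two parties derive the same shared
# secret over a public channel without ever transmitting their private keys.
# The small prime is for illustration only; do not use this for real security.
import secrets

p = 0xFFFFFFFB  # public prime (2**32 - 5), far too small for real use
g = 5           # public generator

# Each side picks a private exponent and publishes g**x mod p.
a = secrets.randbelow(p - 2) + 1
b = secrets.randbelow(p - 2) + 1
A = pow(g, a, p)   # Alice sends A over the open channel
B = pow(g, b, p)   # Bob sends B over the open channel

# Both sides arrive at the same value: (g**b)**a == (g**a)**b (mod p).
shared_alice = pow(B, a, p)
shared_bob = pow(A, b, p)
assert shared_alice == shared_bob
print("shared secret:", hex(shared_alice))
```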

••••••••••

More about Whitfield Diffie from Steven Levy: “Mary Fischer loathed Whitfield Diffie on sight. He was a type she knew all too well, an MIT brainiac whose arrogance was a smoke screen for a massive personality disorder. The year of the meeting was 1969; the location a hardware store near Central Square in Cambridge, Massachusetts. Over his shoulder he carried a length of wire apparently destined for service as caging material for some sort of pet. This was a typical purchase for Diffie, whose exotic animal collection included a nine-foot python, a skunk, and a rare genetta genetta, a furry mongooselike creature whose gland secretions commonly evoked severe allergic reactions in people. It lived on a diet of live rats and at unpredictable moments would nip startled human admirers with needlelike fangs.”



Roomba can't intellectualize vacuuming, but it gets the job done. (Image by Larry D. Moore.)

Steven Levy has an excellent piece, “The AI Revolution Is On,” in the current Wired. In it, Levy points out that artificial intelligence has turned out to be markedly different from what scientists in the ’50s and ’60s predicted. The reason is that yesterday’s scientists tried to make machines emulate the human brain. But since we still don’t really know how that organ operates, researchers threw away the playbook during the ’80s and have since focused on allowing computers to be “themselves.”

“AI researchers began to devise a raft of new techniques that were decidedly not modeled on human intelligence. By using probability-based algorithms to derive meaning from huge amounts of data, researchers discovered that they didn’t need to teach a computer how to accomplish a task; they could just show it what people did and let the machine figure out how to emulate that behavior under similar circumstances. They used genetic algorithms, which comb through randomly generated chunks of code, skim the highest-performing ones, and splice them together to spawn new code. As the process is repeated, the evolved programs become amazingly effective, often comparable to the output of the most experienced coders.

MIT’s Rodney Brooks also took a biologically inspired approach to robotics. His lab programmed six-legged buglike creatures by breaking down insect behavior into a series of simple commands—for instance, ‘If you run into an obstacle, lift your legs higher.’ When the programmers got the rules right, the gizmos could figure out for themselves how to navigate even complicated terrain. (It’s no coincidence that iRobot, the company Brooks cofounded with his MIT students, produced the Roomba autonomous vacuum cleaner, which doesn’t initially know the location of all the objects in a room or the best way to traverse it but knows how to keep itself moving.)

The fruits of the AI revolution are now all around us. Once researchers were freed from the burden of building a whole mind, they could construct a rich bestiary of digital fauna, which few would dispute possess something approaching intelligence. ‘If you told somebody in 1978, ‘You’re going to have this machine, and you’ll be able to type a few words and instantly get all of the world’s knowledge on that topic,’ they would probably consider that to be AI,’ Google cofounder Larry Page says. ‘That seems routine now, but it’s a really big deal.'”
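The genetic-algorithm loop Levy describes (generate random candidates, skim the highest performers, splice them together, repeat) is simple enough to sketch in a few lines. Below is a minimal, illustrative Python version; the bit-string representation and the count-the-ones fitness function are stand-ins of my choosing, not anything from the article, which is about evolving program code rather than bit strings.

```python
# A toy genetic algorithm: evolve bit strings toward all-ones.
# Representation and fitness are illustrative stand-ins only.
import random

GENOME_LEN = 32
POP_SIZE = 50
GENERATIONS = 100
MUTATION_RATE = 0.01

def random_genome():
    return [random.randint(0, 1) for _ in range(GENOME_LEN)]

def fitness(genome):
    # Stand-in objective: more ones is better.
    return sum(genome)

def crossover(a, b):
    # Splice two high performers at a random cut point.
    cut = random.randrange(1, GENOME_LEN)
    return a[:cut] + b[cut:]

def mutate(genome):
    return [bit ^ 1 if random.random() < MUTATION_RATE else bit for bit in genome]

def evolve():
    population = [random_genome() for _ in range(POP_SIZE)]
    for _ in range(GENERATIONS):
        # "Skim the highest-performing ones": keep the top quarter.
        population.sort(key=fitness, reverse=True)
        survivors = population[: POP_SIZE // 4]
        if fitness(survivors[0]) == GENOME_LEN:
            break
        # Splice survivors together to spawn the next generation.
        population = survivors + [
            mutate(crossover(random.choice(survivors), random.choice(survivors)))
            for _ in range(POP_SIZE - len(survivors))
        ]
    return max(population, key=fitness)

if __name__ == "__main__":
    best = evolve()
    print("best fitness:", fitness(best), "of", GENOME_LEN)
```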


Steven Levy's next book, about Google, is to be published in 2011.

A few months ago, I excerpted a Wired article in which Steven Levy revisited some subjects profiled in his great 1984 book, Hackers: Heroes of the Computer Revolution. That book looked at the pioneers from the ’50s, ’60s, and ’70s who built the foundation of today’s interconnected technology. I’m rereading Hackers now, so I thought I’d provide a passage. This sequence is about the moment when computers passed from institutions into the hands of Berkeley hackers. Eventually, some of the folks who cut their teeth on this XDS-940 Bay Area behemoth would help personal computing take quantum leaps forward, but initially the work was as unglamorous as it was idealistic and exciting. An excerpt:

“The first public terminal of the Community Memory project was an ugly machine in a cluttered foyer on the second floor of a beat-up building in the spaciest town in the United States of America: Berkeley, California. It was inevitable that computers would come to ‘the people’ in Berkeley. Everything else did, from gourmet food to local government. And if, in August 1973, computers were generally regarded as inhuman, unyielding, warmongering and nonorganic, the imposition of a terminal connected to one of those Orwellian monsters in a normally good-vibes zone like the foyer outside of Leopold’s Records on Durant Avenue was not necessarily a threat to anyone else’s well-being. It was yet another kind of flow to go with.

A faded photo of the Community Memory project in action in Berkeley during the 1970s.

Outrageous, in a sense. Sort of a squashed piano, the height of a Fender Rhodes, with a typewriter keyboard instead of a musical one. The computer was protected by a cardboard box casing, with a plate of glass set in its front. To touch the keys, you had to stick your hands through little holes, as if you were offering yourself for imprisonment in an electronic stockade. But the people standing by the terminal were familiar Berkeley types, with long stringy hair, jeans, and a demented gleam in their eyes that you would mistake for a drug reaction if you did not know them well. Those who did know them well realized that the group was high on technology. They were getting off like they had never gotten off before, dealing the hacker dream as if it were the most potent strain of sinsemilla in the Bay Area.

The name of the group was Community Memory, and according to a handout they distributed, the terminal was ‘a communication system which allows people to make contact with each other on the basis of mutually expressed interests, without having to cede judgements to third parties.’ The idea was to speed the flow of information in a decentralized, non-bureaucratic system. An idea born from computers, an idea executable only by computers, in this case a time-shared XDS-940 mainframe machine in the basement of a warehouse in San Francisco. By opening a hands-on computer facility to let people reach each other, a living metaphor would be created, a testament to the way computer technology could be used as guerrilla warfare for people against bureaucracies.”


Richard Stallman, pioneer hacker, at the University of Calgary in 2009. (Image by D'Arcy Norman.)

Wired has a great piece online in which journalist Steven Levy looks back on the flowering of the Information Age 25 years after the publication of his landmark book, Hackers: Heroes of the Computer Revolution.

Back in the good old days, hackers weren’t criminals stealing and spying; they were the nerdy genius programmers who remade the way we think, live and communicate. Levy looks back at the monsters of the industry who became household names–Gates, Wozniak, etc.–but also revisits some of those who never spent time hanging with Bono or dancing with the stars.

One passage that’s particularly interesting focuses on legendary hacker Richard Stallman, a brilliant and belligerent soul who despises the commercialization of what the geeks brought to life. An excerpt about him from Levy’s Wired article:

“I first met Richard Stallman, a denizen of MIT’s AI Lab, in 1983. Even then he was bemoaning the sad decline of hacker culture and felt that the commercialization of software was a crime. When I spoke to him that year, as the computer industry was soaring, he looked me in the eye and said, ‘I don’t believe that software can be owned.’ I called him ‘the last of the true hackers’ and assumed the world would soon squash him.

Was I ever wrong. Stallman’s crusade for free software has continued to inform the ongoing struggles over intellectual property and won him a MacArthur Foundation ‘genius grant.’ He founded the Free Software Foundation and wrote the GNU operating system, which garnered widespread adoption after Linus Torvalds wrote Linux to run with it; the combination is used in millions of devices. More important, perhaps, is that Stallman provided the intellectual framework that led to the open source movement, a critical element of modern software and the Internet itself. If the software world had saints, Stallman would have been beatified long ago.

Yet he is almost as famous for his unyielding personality. In 2002, Creative Commons evangelist Lawrence Lessig wrote, ‘I don’t know Stallman well. I know him well enough to know he is a hard man to like.’ (And that was in the preface to Stallman’s own book.) Time has not softened him. In our original interview, Stallman said, ‘I’m the last survivor of a dead culture. And I don’t really belong in the world anymore. And in some ways I feel I ought to be dead.’ Now, meeting over Chinese food, he reaffirms this. ‘I have certainly wished I had killed myself when I was born,’ he says. ‘In terms of effect on the world, it’s very good that I’ve lived. And so I guess, if I could go back in time and prevent my birth, I wouldn’t do it. But I sure wish I hadn’t had so much pain.'”

