Walter Isaacson


In a Medium piece, Gerald Huff answers the points made by writer Walter Isaacson and roboticist Pippa Malmgren during a recent London debate, in which they argued against the likelihood of large-scale technological unemployment. Isaacson’s touting of work created by the so-called Sharing Economy, contingent jobs that squeeze laborers, was either his least-researched response or his most disingenuous one.

From Huff:

What is different about the technologies emerging now from academia and tech companies large and small is the extent to which they can substitute for or eliminate jobs that previously only humans could do. Over the course of thousands of years, human brawn was replaced by animal power, then wind and water power, then steam, internal combustion and electric motors. But the human brain and human hands — with their capabilities to perceive, move in and manipulate unstructured environments, process information, make decisions, and communicate with other people — had no substitute. The technologies emerging today — artificial intelligence fed by big data and the internet of things and robotics made practical by cheap sensors and massive processing power — change the equation. Many of the tasks that simply had to be done by humans will in the coming decades fall within the capabilities of these emerging technologies.

When Isaacson says “it always has, and I submit always will produce more jobs, because it produces…more things that we can make and buy” he is falling into the Labor Content Fallacy. Without repeating the entire argument in the linked article, there is no law of economics that says a product or service must require human labor. The simplest example is a digital download of a song or game, which has essentially zero marginal labor content. In the coming decades, for the first time in history, we will be able to “make and buy” a huge variety of goods and services without the need to employ people. The historical correlation between more human jobs due to increased demand for goods and services from a rising population will be broken.•


There’s good stuff in James B. Stewart’s New York Times piece “How, and Why, Apple Overtook Microsoft,” though it oversimplifies the reasons for the heavenly resuscitation of Jobs’ near-dead company and the purgatory that Bill Gates’ once-mighty empire is now experiencing. In one passage, it reduces the reversal of fortunes to a “vision thing,” making it seem as if Gates was taken unawares by a mobile-dominated future. Oh, Gates knew. From his 1995 book The Road Ahead:

What do you carry on your person now? Probably at least keys, identification, money, and a watch. And maybe credit cards, a checkbook, traveler’s checks, an address book, an appointment book, a notepad, something to read, a camera, a pocket tape recorder, a cellular phone, a pager, concert tickets, a map, a compass, a calculator, an electronic entry card, photographs, and maybe a loud whistle to call for help.

You’ll be able to keep equivalent necessities — and more — in an information appliance I call the wallet PC. It will be about the same size as a wallet, which means you’ll be able to carry it in your pocket or purse. It will display messages and schedules and let you read or send electronic mail and faxes, monitor weather and stock reports, and play both simple and sophisticated games. At a meeting, you might take notes, check your appointments, browse information if you’re bored, or choose from among thousands of easy-to-call-up photos of your kids.•

The real distinction between the companies wasn’t vision but execution. Microsoft was too huge to pivot, though Apple might have won even if its rival wasn’t “encumbered” by success. From Stewart:

The most successful companies need a vision, and both Apple and Microsoft have one. But Apple’s was more radical and, as it turns out, more farsighted. Microsoft foresaw a computer on every person’s desk, a radical idea when IBM mainframes took up entire rooms. But Apple went a big step further: Its vision was a computer in every pocket. That computer also just happened to be a phone, the most ubiquitous consumer device in the world. Apple ended up disrupting two huge markets.

“Apple has been very visionary in creating and expanding significant new consumer electronics categories,” [Bernstein analyst Toni] Sacconaghi said. “Unique, disruptive innovation is really hard to do. Doing it multiple times, as Apple has, is extremely difficult. It’s the equivalent of Pixar producing one hit after another. You have to give kudos to Apple.”

Walter Isaacson, who interviewed Mr. Jobs for his biography of the Apple co-founder and chief executive, said: “Steve believed the world was going mobile, and he was right. And he believed that beauty matters. He was deeply moved by beautiful design. Objects of great functionality also had to be objects of desire.”•


Walter Isaacson, who has written his second Silicon Valley book, The Innovators, just conducted an AMA at Reddit. Elon Musk will no doubt be pleased with the headline quote, though for all his accomplishments, he certainly hasn’t emulated Benjamin Franklin’s political achievements, nor is he likely to. A few exchanges follow.

_________________________

Question:

Hey Walter, who is the Ben Franklin of 2014?

Walter Isaacson:

The Ben Franklin of today is Elon Musk.

_________________________

Question:

I thoroughly enjoyed your biography on Steve Jobs! Thank you for your diligence!

I know you talked about how you had never done a biography on a living person before. Was it easier to feel like you could get a more accurate picture of a living subject? Did you have a system in place that you felt would prevent the tainting of your perspective based on the bias of the person you were interviewing?

Walter Isaacson:

I have done living people before: Kissinger, the Wise Men. With a living subject, you get to know (if you take time to do a lot of personal interviews and listen) a hundred times more than you can learn about a historic person. I know much more about the chamfers of the original Mac than about all of Ben Franklin’s lightning rod and kite-flying experiments. I tend to be a bit soft when writing about someone alive, because I tend to like most people I get to know.

_________________________

Question:

I’m surprised to see computers have not evolved beyond silicon in nearly 30-40 years. What are your thoughts?

Walter Isaacson:

It would be interesting if we built computers not based on digital circuits using binary logic — and instead tried to replicate the human mind in a carbon-based and wetware chemical system, perhaps even an analog one, like nature did it!

_________________________

Question:

What are your thoughts on singularity? Do you think it will happen, and if so, when? 

Walter Isaacson:

The theme of my book is that human minds and computers bring different strengths to the party. The pursuit of strong Artificial Intelligence has been a bit of a mirage — starting in the 1950s, it’s always seemed to be 20 years away. But the combination of humans and machines in more intimate partnership — what JCR Licklider called symbiosis and what Peter Thiel calls complementarity — has proven more fruitful. Indeed amazing. So I suspect that for the indefinite future, the combination of human minds and machine power will be more powerful than aiming for artificial intelligence and a singularity.•


Walter Isaacson is regarded, with some validity, as a proponent of the Great Man Theory, which is why Steve Jobs, with no shortage of hubris, asked him to be his biographer. Albert Einstein and Ben Franklin and me, he thought. Jobs, who deserves massive recognition for the cleverness of his creations, was also known as a guy who sometimes took credit for the work of others, and he sold his author some lines. Bright guy that he is, though, Isaacson knows the new technologies and their applications have many parents, and he’s cast his net wider in The Innovators. An excerpt from his just-published book, via The Daily Beast, in which he describes the evolution of Wikipedia’s hive mind:

“One month after Wikipedia’s launch, it had a thousand articles, approximately seventy times the number that Nupedia had after a full year. By September 2001, after eight months in existence, it had ten thousand articles. That month, when the September 11 attacks occurred, Wikipedia showed its nimbleness and usefulness; contributors scrambled to create new pieces on such topics as the World Trade Center and its architect. A year after that, the article total reached forty thousand, more than were in the World Book that [Jimmy] Wales’s mother had bought. By March 2003 the number of articles in the English-language edition had reached 100,000, with close to five hundred active editors working almost every day. At that point, Wales decided to shut Nupedia down.

By then [Larry] Sanger had been gone for a year. Wales had let him go. They had increasingly clashed on fundamental issues, such as Sanger’s desire to give more deference to experts and scholars. In Wales’s view, ‘people who expect deference because they have a Ph.D. and don’t want to deal with ordinary people tend to be annoying.’ Sanger felt, to the contrary, that it was the nonacademic masses who tended to be annoying. ‘As a community, Wikipedia lacks the habit or tradition of respect for expertise,’ he wrote in a New Year’s Eve 2004 manifesto that was one of many attacks he leveled after he left. ‘A policy that I attempted to institute in Wikipedia’s first year, but for which I did not muster adequate support, was the policy of respecting and deferring politely to experts.’ Sanger’s elitism was rejected not only by Wales but by the Wikipedia community. ‘Consequently, nearly everyone with much expertise but little patience will avoid editing Wikipedia,’ Sanger lamented.

Sanger turned out to be wrong. The uncredentialed crowd did not run off the experts. Instead the crowd itself became the expert, and the experts became part of the crowd. Early in Wikipedia’s development, I was researching a book about Albert Einstein and I noticed that the Wikipedia entry on him claimed that he had traveled to Albania in 1935 so that King Zog could help him escape the Nazis by getting him a visa to the United States. This was completely untrue, even though the passage included citations to obscure Albanian websites where this was proudly proclaimed, usually based on some third-hand series of recollections about what someone’s uncle once said a friend had told him. Using both my real name and a Wikipedia handle, I deleted the assertion from the article, only to watch it reappear. On the discussion page, I provided sources for where Einstein actually was during the time in question (Princeton) and what passport he was using (Swiss). But tenacious Albanian partisans kept reinserting the claim. The Einstein-in-Albania tug-of-war lasted weeks. I became worried that the obstinacy of a few passionate advocates could undermine Wikipedia’s reliance on the wisdom of crowds. But after a while, the edit wars ended, and the article no longer had Einstein going to Albania. At first I didn’t credit that success to the wisdom of crowds, since the push for a fix had come from me and not from the crowd. Then I realized that I, like thousands of others, was in fact a part of the crowd, occasionally adding a tiny bit to its wisdom.”


I haven’t yet read Walter Isaacson’s new Silicon Valley history, The Innovators, but I would be indebted if it answers the question of how instrumental Gary Kildall’s software was to Microsoft’s rise. Was Bill Gates and Paul Allen’s immense success built on intellectual thievery? Has the story been mythologized beyond realistic proportion? An excerpt from Brendan Koerner’s New York Times review of the book:

“The digital revolution germinated not only at button-down Silicon Valley firms like Fairchild, but also in the hippie enclaves up the road in San Francisco. The intellectually curious denizens of these communities ‘shared a resistance to power elites and a desire to control their own access to information.’ Their freewheeling culture would give rise to the personal computer, the laptop and the concept of the Internet as a tool for the Everyman rather than scientists. Though Isaacson is clearly fond of these unconventional souls, his description of their world suffers from a certain anthropological detachment. Perhaps because he’s accustomed to writing biographies of men who operated inside the corridors of power — Benjamin Franklin, Henry Kissinger, Jobs — Isaacson seems a bit baffled by committed outsiders like Stewart Brand, an LSD-inspired futurist who predicted the democratization of computing. He also does himself no favors by frequently citing the work of John Markoff and Tom Wolfe, two writers who have produced far more intimate portraits of ’60s counterculture.

Yet this minor shortcoming is quickly forgiven when The Innovators segues into its rollicking last act, in which hardware becomes commoditized and software goes on the ascent. The star here is Bill Gates, whom Isaacson depicts as something close to a punk — a spoiled brat and compulsive gambler who ‘was rebellious just for the hell of it.’ Like Paul Baran before him, Gates encountered an appalling lack of vision in the corporate realm — in his case at IBM, which failed to realize that its flagship personal computer would be cloned into oblivion if the company permitted Microsoft to license the machine’s MS-DOS operating system at will. Gates pounced on this mistake with a feral zeal that belies his current image as a sweater-clad humanitarian.”


At Medium, Walter Isaacson posted a new excerpt (which awaits your crowdsourcing) from his forthcoming book on Silicon Valley creators. His latest segment concerns the famed Homebrew Computer Club, the original cult of the microprocessor, which was spread across the country with a Johnny Appleseed approach several years before there was an Apple Computer. The first two paragraphs:

The Homebrew Mentality

“In June 1975, the month that Gates first moved down to Albuquerque, Ed Roberts decided to send the Altair on the road, showing off its marvels as if it were a carney show exhibit. His goal was to create computer clubs in towns across America, preferably filled with Altair loyalists. He tricked out a Dodge camper van, dubbed it the MITS Mobile, and sent it on a sixty-town tour up the coast of California then down to the Southeast, hitting such hotspots as Little Rock, Baton Rouge, Macon, Huntsville, and Knoxville. Gates, who went along for part of the ride, thought it was an amazingly neat marketing ploy. ‘They bought this big blue van and they went around the country and created computer clubs everyplace they went,’ he marveled. ‘Most of the computer clubs in America were created by MITS.’ Gates was at the shows in Texas, and Allen joined for the ride when they got to Alabama. At the Huntsville Holiday Inn, sixty people — a mix of hippyish hobbyists and crew-cut engineers — paid $10 to attend, then about four times the cost of a movie. The presentation lasted three hours. At the end of a display of a lunar landing game, doubters came and looked under the table assuming there were cables to some bigger minicomputer hidden underneath. ‘But once they saw it was real,’ Allen recalled, ‘the engineers became almost giddy with enthusiasm.’

One of the stops that summer was Rickey’s Hyatt House in Palo Alto. There a fateful encounter occurred after Microsoft BASIC was demonstrated to a group of hackers and hobbyists from a newly formed local group known as the Homebrew Computer Club. ‘The room was packed with amateurs and experimenters eager to find out about this new electronic toy,’ the club’s newsletter reported. Some of them were also eager to act on the hacker credo that software, like information, should be free. This was not surprising given the social and cultural attitudes — so different from the entrepreneurial zeal of those who had migrated up from Albuquerque — which had flowed together in the early 1970s leading up to the formation of the Homebrew Computer Club.”


Walter Isaacson, who’s writing a book about Silicon Valley creators, knows firsthand that sometimes such people take credit that may not be coming to them. So he’s done a wise thing and put a draft of part of his book online, so that crowdsourcing can do its magic. As he puts it: “I am sketching a draft of my next book on the innovators of the digital age. Here’s a rough draft of a section that sets the scene in Silicon Valley in the 1970s. I would appreciate notes, comments, corrections.” The opening paragraphs of his draft at Medium:

“The idea of a personal computer, one that ordinary individuals could own and operate and keep in their homes, was envisioned in 1945 by Vannevar Bush. After building his Differential Analyzer at MIT and helping to create the military-industrial-academic triangle, he wrote an essay for the July 1945 issue of the Atlantic titled ‘As We May Think.’ In it he conjured up the possibility of a personal machine, which he dubbed a memex, that would not only do mathematical tasks but also store and retrieve a person’s words, pictures and other information. ‘Consider a future device for individual use, which is a sort of mechanized private file and library,’ he wrote. ‘A memex is a device in which an individual stores all his books, records, and communications, and which is mechanized so that it may be consulted with exceeding speed and flexibility. It is an enlarged intimate supplement to his memory.’

Bush imagined that the device would have a ‘direct entry’ mechanism so you could put information and all your records into its memory. He even predicted hypertext links, file sharing, and collaborative knowledge accumulation. ‘Wholly new forms of encyclopedias will appear, ready made with a mesh of associative trails running through them, ready to be dropped into the memex and there amplified,’ he wrote, anticipating Wikipedia by a half century.

As it turned out, computers did not evolve the way that Bush envisioned, at least not initially. Instead of becoming personal tools and memory banks for individuals to use, they became hulking industrial and military colossi that researchers could time share but the average person could not touch. In the early 1970s, companies such as DEC began to make minicomputers, the size of a small refrigerator, but they dismissed the idea that there would be a market for even smaller ones that could be owned and operated by ordinary folks. ‘I can’t see any reason that anyone would want a computer of his own,’ DEC president Ken Olsen declared at a May 1974 meeting where his operations committee was debating whether to create a smaller version of its PDP-8 for personal consumers. As a result, the personal computer revolution, when it erupted in the mid-1970s, was led by scruffy entrepreneurs who started companies in strip malls and garages with names like Altair and Apple.

Once again, innovation was spurred by the right combination of technological advances, new ideas, and social desires. The development of the microprocessor, which made it technologically possible to invent a personal computer, occurred at a time of rich cultural ferment in Silicon Valley in the late 1960s, one that created a cauldron suitable for homebrewed machines. There was the engineering culture that arose during World War II with the growth of defense contractors, such as Westinghouse and Lockheed, followed by electronics companies such as Fairchild and its fairchildren. There was the startup culture, exemplified by Intel and Atari, where creativity was encouraged and stultifying bureaucracies disdained. Stanford and its industrial park had lured west a great silicon rush of pioneers, many of them hackers and hobbyists who, with their hands-on imperative, had a craving for computers that they could touch and play with. In addition there was a subculture populated by wireheads, phreakers, and cyberpunks, who got their kicks hacking into the Bell System’s phone lines or the timeshared computers of big corporations.

Added to this mix were two countercultural strands: the hippies, born out of the Bay Area’s beat generation, and the antiwar activists, born out of the Free Speech Movement at Berkeley. The antiauthoritarian and power-to-the-people mindset of the late 1960s youth culture, along with its celebration of rebellion and free expression, helped lay the ground for the next wave of computing. As John Markoff wrote in What the Dormouse Said, ‘Personal computers that were designed for and belonged to single individuals would emerge initially in concert with a counterculture that rejected authority and believed the human spirit would triumph over corporate technology.'”


There are fewer postcards and hand-written notes today, but I don’t think anyone would argue against the idea that more people in the world are writing more in the Internet Age than at any moment in history. What we’re writing is largely bullshit, sure, but not all of it is. It’s really the full flowering of democracy, like it or not. From Walter Isaacson’s New York Times review of Clive Thompson’s glass-half-full tech book, Smarter Than You Think:

“Thompson also celebrates the fact that digital tools and networks are allowing us to share ideas with others as never before. It’s easy (and not altogether incorrect) to denigrate much of the blathering that occurs each day in blogs and tweets. But that misses a more significant phenomenon: the type of people who 50 years ago were likely to be sitting immobile in front of television sets all evening are now expressing their ideas, tailoring them for public consumption and getting feedback. This change is a cause for derision among intellectual sophisticates partly because they (we) have not noticed what a social transformation it represents. ‘Before the Internet came along, most people rarely wrote anything at all for pleasure or intellectual satisfaction after graduating from high school or college,’ Thompson notes. ‘This is something that’s particularly hard to grasp for professionals whose jobs require incessant writing, like academics, journalists, lawyers or marketers. For them, the act of writing and hashing out your ideas seems commonplace. But until the late 1990s, this simply wasn’t true of the average nonliterary person.'”


Following up his authorized Steve Jobs bio, Walter Isaacson is writing a book about the icons of the Digital Era. Let’s hope he employs a large team of fact-checkers because such people tend to be fabulists. There’s an excerpt at Harvard Magazine from the forthcoming volume, about Bill Gates, who’s told a yarn or two in his day and is no stranger to the author. The opening:

“IT MAY HAVE BEEN the most momentous purchase of a magazine in the history of the Out of Town Newsstand in Harvard Square. Paul Allen, a college dropout from Seattle, wandered into the cluttered kiosk one snowy day in December 1974 and saw that the new issue of Popular Electronics featured a home computer for hobbyists, called the Altair, that was just coming on the market. He was both exhilarated and dismayed. Although thrilled that the era of the ‘personal’ computer seemed to have arrived, he was afraid that he was going to miss the party. Slapping down 75 cents, he grabbed the issue and trotted through the slush to the Currier House room of Bill Gates, a Harvard sophomore and fellow computer fanatic from Lakeside High School in Seattle, who had convinced Allen to drop out of college and move to Cambridge. ‘Hey, this thing is happening without us,’ Allen declared. Gates began to rock back and forth, as he often did during moments of intensity. When he finished the article, he realized that Allen was right. For the next eight weeks, the two of them embarked on a frenzy of code writing that would change the nature of the computer business.

What Gates and Allen set out to do, during the Christmas break of 1974 and the subsequent January reading period when Gates was supposed to be studying for exams, was to create the software for personal computers. ‘When Paul showed me that magazine, there was no such thing as a software industry,’ Gates recalled. ‘We had the insight that you could create one. And we did.’ Years later, reflecting on his innovations, he said, ‘That was the most important idea that I ever had.’

In high school, Gates had formed the Lakeside Programming Group, which made money writing computer code for companies in the Pacific Northwest. As a senior, he applied only to three colleges—Harvard, Yale, and Princeton—and he took different approaches to each. ‘I was born to apply for college,’ he said, fully aware of his ability to ace meritocratic processes. For Yale he cast himself as an aspiring political type and emphasized the month he had spent in Washington as a congressional page. For Princeton, he focused only on his desire to be a computer engineer. And for Harvard, he said his passion was math. He had also considered MIT, but at the last moment blew off the interview to play pinball. He was accepted to all three, and chose Harvard. ‘There are going to be some guys at Harvard who are smarter than you,’ Allen warned him. Gates replied, ‘No way! No way!'”


Not everyone believes in the Tao of Steve, but Walter Isaacson, Steve Jobs’ authorized biographer, feels, as many of us do, that Apple has limped along since its co-founder’s death, offering new iterations instead of innovations. In a Financial Times article about the state of Apple and other topics, the former Time Managing Editor also analyzes the tense situation in Syria, seeing an intersection of Russian and American interests. An excerpt:

“I was at a dinner in Manhattan a few weeks ago, just as the Syria issue was heating up, with one of my previous biography subjects, Henry Kissinger. He gave a dazzling analysis (I would call it ‘incredible’ except that it was, in fact, exceedingly credible) of how Russia would see its strategic interests, and predicted that Russia’s president would soon insert himself into the situation by calling for an international approach to the problem. So I was impressed but not surprised when Vladimir Putin did precisely that a week later.

On some of the TV shows I went on to talk about Steve Jobs, I was asked instead about Syria – and the question was usually about whether we could possibly trust the Russians. Most of the guests got worked into a lather, saying that Barack Obama was being horribly naive to trust them. But I think it is perfectly sensible to trust the Russians: we can trust them to do what they perceive to be in their own strategic interest.

Some of Russia’s strategic interests clash with ours: they want to protect their client state Syria and minimise US influence in the region (and yank America’s chain when possible). But to a great extent, Russia’s interests in this situation actually coincide with ours – at least for the moment. Russia fears as much as the US does the rise of radical Islam just south of its borders. It doesn’t want chemical weapons to fall into the hands of terrorists. And it would like to keep President Bashar al-Assad in power.

That last interest seems to conflict with ours, since the US has called for regime change. But the Russians believe that toppling Assad is not the best idea when that might lead to al-Qaeda and other jihadist forces taking over much of Syria and getting control of some of the chemical weapons. Thus it is in Russia’s interest to get Assad to surrender his chemical weapons, rather than summarily topple him. That might actually be in the west’s interests as well.”


Thanks to the Browser for pointing me in the direction of Evgeny Morozov’s long New Republic consideration of Walter Isaacson’s Steve Jobs bio. The article, largely critical of Isaacson’s work, also devotes space to how much Jobs was influenced by Bauhaus and Braun designs. An excerpt:

“I DO NOT MEAN to be pedantic. The question of essence and form, of purity and design, may seem abstract and obscure, but it lies at the heart of the Apple ethos. Apple’s metaphysics, as it might be called, did not originate in religion, but rather in architecture and design. It’s these two disciplines that supplied Jobs with his intellectual ambition. John Sculley, Apple’s former CEO, who ousted Jobs from his own company in the mid-1980s, maintained that ‘everything at Apple can be best understood through the lens of designing.’ You cannot grasp how Apple thinks about the world—and about its own role in the world—without engaging with its design philosophy.

Isaacson gets closer to the heart of the matter when he discusses Jobs’s interest in the Bauhaus, as well as his and Ive’s obsession with Braun, but he does not push this line of inquiry far enough. Nor does he ask an obvious philosophical question: since essences do not drop from the sky, where do they come from? How can a non-existent product—say, the iPad—have an essence that can be discovered and then implemented in form? Is the iPad’s essence something that was dreamed up by Jobs and Ive, or does it exist independently of them in some kind of empyrean that they—by training or by visionary intuition—uniquely inhabit?

The idea that the form of a product should correspond to its essence does not simply mean that products should be designed with their intended use in mind. That a knife needs to be sharp so as to cut things is a non-controversial point accepted by most designers. The notion of essence as invoked by Jobs and Ive is more interesting and significant—more intellectually ambitious—because it is linked to the ideal of purity. No matter how trivial the object, there is nothing trivial about the pursuit of perfection. On closer analysis, the testimonies of both Jobs and Ive suggest that they did see essences existing independently of the designer—a position that is hard for a modern secular mind to accept, because it is, if not religious, then, as I say, startlingly Platonic.

This is where Apple’s intellectual patrimony—which spans the Bauhaus, its postwar successor in the Ulm School of Design, and Braun (Ulm’s closest collaborator in the corporate world)—comes into play. Those modernist institutions proclaimed and practiced an aesthetic of minimalism, and tried to strip their products of superfluous content and ornament (though not without internal disagreements over how to define the superfluous). All of them sought to marry technology to the arts. Jobs’s rhetorical attempt to present Apple as a company that bridges the worlds of technology and liberal arts was a Californian reiteration of the Bauhaus’s call to unite technology and the arts. As Walter Gropius, the founder of the Bauhaus, declared, ‘Art and technology—a new unity.'”


Walter Isaacson, a writer who can communicate complicated ideas lucidly, was the perfect biographer for Steve Jobs, a technologist who could make complex functions work simply. Steven Johnson offers up his thoughts on Isaacson’s Jobs bio immediately after reading it. An excerpt:

“While Jobs historically had a reputation for being a nightmare to work with, in fact one of the defining patterns of his career was his capacity for deep and generative partnerships with one or two other (often very different) people. That, of course, is the story of Jobs and Woz in the early days of Apple, but it’s also the story of his collaboration with Lasseter at Pixar, and Jony Ive at Apple in the second act. (One interesting tidbit from the book is that Jobs would have lunch with Ive almost every day he was on the Apple campus.) In my experience, egomaniacal people who are nonetheless genuinely talented have a hard time establishing those kinds of collaborations, in part because it involves acknowledging that someone else has skills that you don’t possess. But for all his obnoxiousness with his colleagues (and the book has endless anecdotes documenting those traits), Jobs had a rich collaborative streak as well. He was enough of an egomaniac to think of himself as another John Lennon, but he was always looking for McCartneys to go along for the ride with him.”


In 1981, William F. Buckley and Diana Trilling investigated the ramifications of the murder of Dr. Herman Tarnower by his longtime companion, Jean Harris, a slaying that awakened all sorts of emotions about the dynamics between men and women.

From “Jean Harris: Murder with Intent to Love,” the 1981 Time article by Walter Isaacson and James Wilde: “Prosecutor George Bolen, 34, was cold and indignant in his summation, insisting that jealousy over Tarnower’s affair with his lab assistant, Lynne Tryforos, 38, was the motivating factor for murder. Argued Bolen: ‘There was dual intent, to take her own life, but also an intent to do something else . . . to punish Herman Tarnower . . . to kill him and keep him from Lynne Tryforos.’ Bolen ridiculed the notion that Harris fired her .32-cal. revolver by accident. He urged the jury to examine the gun while deliberating. Said he: ‘Try pulling the trigger. It has 14 pounds of pull. Just see how difficult it would be to pull, double action, four times by accident.’ Bolen, who was thought by his superiors to be too gentle when he cross-examined Harris earlier in the trial, showed little mercy as he painted a vivid picture of what he claims happened that night. He dramatically raised his hand in the defensive stance he says Tarnower used when Harris pointed the gun at him. When the judge sustained an objection by Aurnou that Bolen’s version went beyond the evidence presented, the taut Harris applauded until her body shook.”
