Books


Michael Crichton was one of the more unusual entertainers of his time, a pulp-ish storyteller with an elite education who had no taste–or talent?–for the highbrow. He made a lot of people happy, though scientists and anthropologists were not often among them. The following excerpt, from a 1981 People portrait of him by Andrea Chambers, reveals Crichton (unsurprisingly) as an early adopter of personal computing.

_____________________________

At 38, he has already been educated as an M.D. at Harvard (but never practiced), written 15 books (among them bestsellers like The Andromeda Strain and The Terminal Man), and directed three moneymaking movies (Westworld, Coma and The Great Train Robbery). He is a devoted paladin of modern painting whose collection, which includes works by Frank Stella and Robert Rauschenberg, recently toured California museums. In 1977, because the subject intrigued him, Crichton wrote the catalogue for a Jasper Johns retrospective at Manhattan’s Whitney Museum. “Art interviewers tend to be more formal and discuss esthetics—’Why did you put the red here and the blue there?’ ” says Johns. “But Michael was trying to relate me to my work. He is a novelist and he brings that different perspective.”

Crichton’s latest literary enterprise is Congo (Alfred A. Knopf), a technology-packed adventure tale about a computer-led diamond hunt in the wilds of Africa. Accompanied by a friendly gorilla named Amy, Crichton’s characters confront everything from an erupting volcano to ferocious apes bred to destroy anyone who approaches the diamonds. The novel has bobbed onto best-seller lists, despite critical sneers that it is “entertaining trash.” (A New York Times reviewer called it “literarily vapid and scientifically more anthropomorphic than Dumbo.”)

Crichton cheerfully admits that Congo owes more than its exotic locale to Sir Henry Rider Haggard’s classic King Solomon’s Mines. “All the books I’ve written play with preexisting literary forms,” Crichton says. A model for The Andromeda Strain was H.G. Wells’ The War of the Worlds. The Terminal Man was based on Frankenstein’s monster. Crichton’s 1976 novel Eaters of the Dead was inspired by Beowulf. “The challenge is in revitalizing the old forms,” he explains.

Crichton taps out his books on an Olivetti word processor (price: $13,500) and bombards readers with high-density scientific data and jargon, only some of which is real. “I did check on the rapids in the Congo,” he says. “They exist, but not where I put them.” His impressive description of a cannibal tribe is similarly fabricated. “It amused me to make a complete ethnography of a nonexistent tribe,” he notes. “I like to make up something to seem real.”•


I think Pastoralia is still my favorite George Saunders short-story collection, though I really love them all, their devastating deadpan and deep humanity. A very cool T magazine feature publishes new annotations made by authors in 75 first editions to be auctioned at Sotheby’s to benefit the PEN American Center. An example from Saunders’ oeuvre follows.

__________________________

CivilWarLand in Bad Decline by George Saunders, published in 1996.

At the end, Saunders writes: “Closing thought: I like the audacity of this book. I like less the places where it feels like I went into Auto-Quirky Mode. Ah youth! Some issues: Life amid limitations; paucity. Various tonalities of defense. Pain; humiliation inflicted on hapless workers – some of us turn on one another. Early on, this read, could really feel this young writer’s aversion to anything mild or typical or bland. Feeling, at first, like a tic. But then it started to grow on me — around ‘400 Pound CEO.’ This performative thing then starts to feel essential; organic somehow – a way to get to the moral outrage. I kept thinking of the word ‘immoderation.’ Like the yelp of someone who’s just been burned.”


The original “gone girl,” mystery writer Agatha Christie abandoned her automobile one day in 1926 and vanished without a trace. For 11 days, a large-scale womanhunt fanned out across the English countryside, with a thousand police officers and 15,000 volunteers searching for a body, dead or alive. Screaming headlines everywhere expressed concern for her (and provided lurid entertainment). When Christie was discovered alive and living quietly at the Swan Hydropathic Hotel in Yorkshire, the press and people turned, angered by what they thought was perhaps a publicity stunt–and maybe just a little disappointed subconsciously that the story’s final chapter wouldn’t have the worst possible end. Was it marital discord or amnesia or electroshock therapy or something else that drove the novelist from her life? Nobody really knows. Two Brooklyn Daily Eagle articles follow about the aftermath of the case.

From the December 14, 1926 Brooklyn Daily Eagle:

From the December 15, 1926 Brooklyn Daily Eagle:


I’m sorry, but I’m just not reading books on a phone. Of course, what’s true for me may not be so for the broader world.

From Ellis Hamburger’s Verge interview with Willem Van Lancker, co-founder of the Oyster book app:

Question:

Where are people reading more, tablets, phones, or on the web?

Willem Van Lancker:

We’ve always been really big believers that the device of the future for books is the phone. That’s the first thing we went to publishers with when we started talking about the differentiation of Oyster, that we can provide the best possible mobile experience.

It’s hard to get the data on this with Android, because, what is a tablet? But between iPhone and iPad, it’s a 50 / 50 split. It might even be higher on the phone in recent months over the iPad. This is an app that people use on their phone constantly, and we see the actual activity spiking during the week at lunchtime, and through the evening and peaks around midnight, and on the weekends it’s pretty sustained. Unlike a lot of products, our biggest days are Saturdays and Sundays, but when we added the web reader, you see it spiking on weekdays because people are reading during work.

We thought about making a button you could hit that would make Oyster look like Microsoft Word like they do for March Madness. It would be funny to bring that to books.

Question:

Why did your gut tell you that people are going to be reading on phones in the future?

Willem Van Lancker:

It was my own behavior. Even when I’m in bed at night, I have an iPad mini with Retina and I still use my phone. And I have an iPhone 6 now, which is even better.•


The Internet of Things is wonderful–and terrible. How valuable the aggregated information will be when all objects report back to the cloud and the network effect takes hold; how impossible it will be to opt out; how unfortunate some of the uses of that information will be. If we go from the ten million sensors currently connected to the Internet to 100 trillion by 2030, as theorist Jeremy Rifkin predicts, the next digital revolution will have taken place, with all the good and bad that entails. The opening of Sue Halpern’s New York Review of Books analysis of a slew of new titles about how tomorrow may find us all tethered:

“Every day a piece of computer code is sent to me by e-mail from a website to which I subscribe called IFTTT. Those letters stand for the phrase ‘if this then that,’ and the code is in the form of a ‘recipe’ that has the power to animate it. Recently, for instance, I chose to enable an IFTTT recipe that read, ‘if the temperature in my house falls below 45 degrees Fahrenheit, then send me a text message.’ It’s a simple command that heralds a significant change in how we will be living our lives when much of the material world is connected—like my thermostat—to the Internet.

It is already possible to buy Internet-enabled light bulbs that turn on when your car signals your home that you are a certain distance away and coffeemakers that sync to the alarm on your phone, as well as WiFi washer-dryers that know you are away and periodically fluff your clothes until you return, and Internet-connected slow cookers, vacuums, and refrigerators. ‘Check the morning weather, browse the web for recipes, explore your social networks or leave notes for your family—all from the refrigerator door,’ reads the ad for one.

Welcome to the beginning of what is being touted as the Internet’s next wave by technologists, investment bankers, research organizations, and the companies that stand to rake in some of an estimated $14.4 trillion by 2022—what they call the Internet of Things (IoT). Cisco Systems, which is one of those companies, and whose CEO came up with that multitrillion-dollar figure, takes it a step further and calls this wave ‘the Internet of Everything,’ which is both aspirational and telling. The writer and social thinker Jeremy Rifkin, whose consulting firm is working with businesses and governments to hurry this new wave along, describes it like this:

The Internet of Things will connect every thing with everyone in an integrated global network. People, machines, natural resources, production lines, logistics networks, consumption habits, recycling flows, and virtually every other aspect of economic and social life will be linked via sensors and software to the IoT platform, continually feeding Big Data to every node—businesses, homes, vehicles—moment to moment, in real time. Big Data, in turn, will be processed with advanced analytics, transformed into predictive algorithms, and programmed into automated systems to improve thermodynamic efficiencies, dramatically increase productivity, and reduce the marginal cost of producing and delivering a full range of goods and services to near zero across the entire economy.

In Rifkin’s estimation, all this connectivity will bring on the ‘Third Industrial Revolution,’ poised as he believes it is to not merely redefine our relationship to machines and their relationship to one another, but to overtake and overthrow capitalism once the efficiencies of the Internet of Things undermine the market system, dropping the cost of producing goods to, basically, nothing. His recent book, The Zero Marginal Cost Society: The Internet of Things, the Collaborative Commons, and the Eclipse of Capitalism, is a paean to this coming epoch.

It is also deeply wishful, as many prospective arguments are, even when they start from fact.”
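Halpern’s IFTTT recipe is, at bottom, a trigger-action rule: poll a condition, and when it flips, fire an action. Here is a minimal sketch of that pattern–purely illustrative, with hypothetical read_thermostat() and send_text() helpers standing in for the real device and messaging services:

```python
import time

THRESHOLD_F = 45.0  # "if the temperature in my house falls below 45 degrees Fahrenheit..."

def read_thermostat() -> float:
    """Hypothetical stand-in for querying an Internet-connected thermostat."""
    raise NotImplementedError("wire up a real device API here")

def send_text(message: str) -> None:
    """Hypothetical stand-in for an SMS or push-notification service."""
    raise NotImplementedError("wire up a real messaging API here")

def run_recipe(poll_seconds: int = 300) -> None:
    """Poll the trigger; fire the action once each time the reading crosses below the threshold."""
    was_below = False
    while True:
        temp = read_thermostat()
        is_below = temp < THRESHOLD_F
        if is_below and not was_below:
            send_text(f"House temperature has fallen to {temp:.0f}F")  # "...then send me a text message"
        was_below = is_below
        time.sleep(poll_seconds)

# run_recipe() would loop forever once the two stubs above are wired to real services.
```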


Erik Brynjolfsson and Andrew McAfee’s The Second Machine Age, a first-rate look at the technological revolution’s complicated short- and mid-term implications for economics, is one of the best books I’ve read in 2014. The authors make a compelling case that the Industrial Revolution bent time more substantially than anything humans had previously done, and that we’re living through a similarly dramatic departure right now, one that may prove more profound than the first, for both good and bad reasons. In a post at his new Financial Times blog, McAfee takes on Peter Thiel’s contention that monopolies are an overall win for society. An excerpt:

“His provocation in Zero to One is that tech monopolies are generally good news since they spend heavily to keep innovating (and sometimes do cool things unrelated to their main businesses such as building driverless cars) and these innovations benefit all of us. If they stop investing and innovating, or if they miss something big, they quickly become irrelevant.

For example, Microsoft’s dominance of the PC industry was once so worrying the US government went after it in an antitrust battle that lasted two decades. Microsoft still controls more than 75 per cent of the market for desktop operating systems today, but nobody is now worried about the company’s ability to stifle tech innovation. Thiel paraphrases Leo Tolstoy’s most famous sentence: ‘All happy companies are different: each one earns a monopoly by solving a unique problem. All failed companies are the same: they failed to escape competition.’

I like Thiel’s attempt to calm the worries about today’s tech giants. Big does not always mean bad and, in the high-tech industries, big today certainly does not guarantee big tomorrow. But I’m not as blithe about monopolies as Thiel. The US cable company Comcast qualifies as a tech monopoly (it’s my only choice for a fast internet service provider) and I struggle mightily to perceive any benefit to consumers and society from its power. And there are other legitimate concerns about monopsonists (monopoly buyers), media ownership concentration and so on.

I once heard the Yale law professor Stephen Carter lay down a general rule: we should be vigilant about all great concentrations of power. We won’t need to take action against all of them but nor should we assume that they’ll always operate to our benefit.”


Walter Isaacson, who has written his second Silicon Valley book, The Innovators, just conducted an AMA at Reddit. Elon Musk will no doubt be pleased with the headline quote, though for all his accomplishments, he certainly hasn’t emulated Benjamin Franklin’s political achievements, nor is he likely to. A few exchanges follow.

_________________________

Question:

Hey Walter, who is the Ben Franklin of 2014?

Walter Isaacson:

The Ben Franklin of today is Elon Musk.

_________________________

Question:

I thoroughly enjoyed your biography on Steve Jobs! Thank you for your diligence!

I know you talked about how you had never done a biography on a living person before. Was it easier to feel like you could get a more accurate picture of a living subject? Did you have a system in place that you felt would prevent the tainting of your perspective based on the bias of the person you were interviewing?

Walter Isaacson:

I have done living people before: Kissinger, the Wise Men. With a living subject, you get to know (if you take time to do a lot of personal interviews and listen) a hundred times more than you can learn about a historic person. I know much more about the chamfers of the original Mac than about all of Ben Franklin’s lightning rod and kite-flying experiments. I tend to be a bit soft when writing about someone alive, because I tend to like most people I get to know.

_________________________

Question:

I’m surprised to see computers have not evolved beyond silicon in nearly 30-40 years. What are your thoughts?

Walter Isaacson:

It would be interesting if we built computers not based on digital circuits using binary logic — and instead tried to replicate the human mind in a carbon-based and wetware chemical system, perhaps even an analog one, like nature did it!

_________________________

Question:

What are your thoughts on singularity? Do you think it will happen, and if so, when? 

Walter Isaacson:

The theme of my book is that human minds and computers bring different strengths to the party. The pursuit of strong Artificial Intelligence has been a bit of a mirage — starting in the 1950s, it’s always seemed to be 20 years away. But the combination of humans and machines in more intimate partnership — what JCR Licklider called symbiosis and what Peter Thiel calls complementarity — has proven more fruitful. Indeed amazing. So I suspect that for the indefinite future, the combination of human minds and machine power will be more powerful than aiming for artificial intelligence and a singularity.•


Just because Julian Assange is a megalomaniacal creepbag doesn’t mean he’s wrong about everything. He’s most certainly not. In a Newsweek excerpt from his book When Google Met Wikileaks, Assange recounts his 2011 meeting with that company’s Executive Chairman Eric Schmidt and Ideas Director Jared Cohen, and his subsequent realization that the search giant enjoys a cozy relationship with the inner sanctums of D.C.’s biggest power brokers, even the White House. I don’t doubt that Google, the de facto Bell Labs of our time and likely in possession of more information than any other entity in the history of Earth, is indeed ensconced in politics (and vice versa), though I would caution against thinking the Silicon Valley behemoth is some sort of shadow government. In his black-and-white way of viewing the world, Assange needs his foes to be as massive as his ego, and he wants to see Google as an indomitable force shaping our world. While it has some influence–and I wish corporations didn’t have any entrée into such quarters–I think Assange is overestimating the company’s importance as a world-maker to inflate his own. In fact, if Google is mainly a search company a decade or two from now, it won’t have much sway at all–it’ll probably be in a lot of trouble. A passage about Assange’s research into Cohen’s role in geopolitics:

“Looking for something more concrete, I began to search in WikiLeaks’ archive for information on Cohen. State Department cables released as part of Cablegate reveal that Cohen had been in Afghanistan in 2009, trying to convince the four major Afghan mobile phone companies to move their antennas onto U.S. military bases. In Lebanon, he quietly worked to establish an intellectual and clerical rival to Hezbollah, the ‘Higher Shia League.’ And in London he offered Bollywood movie executives funds to insert anti-extremist content into their films, and promised to connect them to related networks in Hollywood.

Three days after he visited me at Ellingham Hall, Jared Cohen flew to Ireland to direct the ‘Save Summit,’ an event co-sponsored by Google Ideas and the Council on Foreign Relations. Gathering former inner-city gang members, right-wing militants, violent nationalists and ‘religious extremists’ from all over the world together in one place, the event aimed to workshop technological solutions to the problem of ‘violent extremism.’ What could go wrong?

Cohen’s world seems to be one event like this after another: endless soirees for the cross-fertilization of influence between elites and their vassals, under the pious rubric of ‘civil society.’ The received wisdom in advanced capitalist societies is that there still exists an organic ‘civil society sector’ in which institutions form autonomously and come together to manifest the interests and will of citizens. The fable has it that the boundaries of this sector are respected by actors from government and the ‘private sector,’ leaving a safe space for NGOs and nonprofits to advocate for things like human rights, free speech and accountable government.

This sounds like a great idea. But if it was ever true, it has not been for decades. Since at least the 1970s, authentic actors like unions and churches have folded under a sustained assault by free-market statism, transforming ‘civil society’ into a buyer’s market for political factions and corporate interests looking to exert influence at arm’s length. The last forty years have seen a huge proliferation of think tanks and political NGOs whose purpose, beneath all the verbiage, is to execute political agendas by proxy.”


Few academics sweep as widely across the past or rankle as much in the present as Jared Diamond, the UCLA professor most famous (and infamous) for Guns, Germs and Steel, a book that elides superiority–and often volition–from the history of some humans conquering others. It’s a tricky premise to prove if you apply it to the present: More-developed countries have better weapons than some other states, but it still requires will to use them. Of course, Diamond’s views are more complex than that black-and-white picture. Two excerpts follow from Oliver Burkeman’s very good new Guardian article about the scholar.

_______________________________

In person, Diamond is a fastidiously courteous 77-year-old with a Quaker-style beard sans moustache, and archaic New England vowels: “often” becomes “orphan,” “area” becomes “eerier.” There’s no computer: despite his children’s best efforts, he admits he’s never learned to use one.

Diamond’s first big hit, The Third Chimpanzee (1992), which won a Royal Society prize, has just been reissued in an adaptation for younger readers. Like the others, it starts with a mystery. By some measures, humans share more than 97% of our DNA with chimpanzees – by any commonsense classification, we are another kind of chimpanzee – and for millions of years our achievements hardly distinguished us from chimps, either. “If some creature had come from outer space 150,000 years ago, humans probably wouldn’t figure on their list of the five most interesting species on Earth,” he says. Then, within the last 1% of our evolutionary history, we became exceptional, developing tools and artwork and literature, dominating the planet, and now perhaps on course to destroy it. What changed, Diamond argues, was a seemingly minor set of mutations in our larynxes, permitting control over spoken sounds, and thus spoken language; spoken language permitted much of the rest.

_______________________________

Geography sometimes plays a huge role; sometimes none at all. Diamond’s most vivid illustration of the latter is the former practice, in two New Guinean tribes, of strangling the widows of deceased men, usually with the widows’ consent. Other nearby tribes that have developed in the same landscape don’t do it, so a geographical argument can’t work. On the other hand: “If you ask why the Inuit, living above the Arctic Circle, wear clothes, while New Guineans often don’t, I would say culture makes a negligible contribution. I would encourage anyone who disagrees to try standing around in Greenland in January without clothes.” And human choices really matter: once the Spanish encountered the Incas, Diamond argues, the Spanish were always going to win the fight, but that doesn’t mean brutal genocide was inevitable. “Colonising peoples had massive superiority, but they had choices in how they were going to treat the people over whom they had massive superiority.”

It is clear that behind these disputes is a more general distrust among academics of the “big-picture” approach Diamond has made his own.•


In his TED Talk, “New Thoughts on Capital in the Twenty-First Century,” Thomas Piketty has good and bad news. The good: Wealth inequality, although severe now, is not as deep as a century ago. The bad: The shrunken wealth gap post-World War II was an outlier, not a norm that will reestablish itself for any long period under the present system.


An algorithmic miscue worthy of 1999, this book suggestion was on my Amazon home page yesterday. Fucking Bezos.

________________________

Featured Recommendation:

Prepper’s Pantry: The Survival Guide To Emergency Water & Food Storage
by Ron Johnson (October 6, 2014)
Auto-delivered wirelessly
Kindle Price: $2.99

In the event of an emergency, having an adequate supply of food could mean the difference between life and death!

Are you prepared for any disaster that is about to happen? Do you already have emergency supplies? Is it enough to sustain you and your family’s life for an extended period, when help from others would be close to impossible? Have you discussed and implemented the emergency plans with your family?

Why recommended?

Because you purchased… 

Roughing It [Kindle Edition]
Mark Twain (Author)

The Wild West as Mark Twain lived it

In 1861, Mark Twain joined his older brother Orion, the newly appointed secretary of the Nevada Territory, on a stagecoach journey from Missouri to Carson City, Nevada. Planning to be gone for three months, Twain spent the next “six or seven years” exploring the great American frontier, from the monumental vistas of the Rocky Mountains to the lush landscapes of Hawaii. Along the way, he made and lost a theoretical fortune, danced like a kangaroo in the finest hotels of San Francisco, and came to terms with freezing to death in a snow bank—only to discover, in the light of morning, that he was fifteen steps from a comfortable inn.

As a record of the “variegated vagabondizing” that characterized his early years—before he became a national treasure—Roughing It is an indispensable chapter in the biography of Mark Twain. It is also, a century and a half after it was first published, both a fascinating history of the American West and a laugh-out-loud good time.

In a 1974 People article, Joan Oliver profiled Peter Benchley after his novel Jaws had become a huge bestseller, but before anyone knew that its adaptation would forever change Hollywood. An excerpt:

“The book is the tale of a great white shark which cruises Long Island’s South Shore, gobbling up unwary swimmers, while a resort town’s police chief, civic leaders and citizens battle angrily over which is more important—the safety of the residents or the tourist-based economy of the swank community in its high season.

Jaws grew out of young Benchley’s fascination with sharks, triggered by family swordfishing expeditions off Nantucket. ‘We couldn’t find any swordfish,’ he recalled recently, ‘but the ocean was littered with sharks, so we started catching them.’

As Benchley became a successful journalist—reporter on the Washington Post, free-lancer for such magazines as Life and The New Yorker, an editor of Newsweek—his shark-watching continued. In the 1960s he capitalized on his interest with two magazine articles, not long after a 4,500-pound great white shark was taken off Long Island’s Montauk Point. A few years later he was assigned to do a piece about Southampton—Long Island’s tony watering place. Benchley remembers thinking, ‘My God, if that kind of thing can happen around the beaches of Long Island, and I know Southampton, why not put the two together.’

The star attraction of Benchley’s book is the marauding monster whose savage attacks Benchley describes with horrifying clarity. On the fate of a child snatched from a raft, he writes: ‘Nearly half the fish had come clear of the water, and it slid forward and down in a belly-flopping motion, grinding the mass of flesh and bone and rubber. The boy’s legs were severed at the hips, and they sank, spinning slowly, to the bottom.'”

________________________

“I wrote a novel about a great white shark”:


Technology Review has published “On Creativity,” a 1959 essay by Isaac Asimov that has never previously run anywhere. The opening: 

“How do people get new ideas?

Presumably, the process of creativity, whatever it is, is essentially the same in all its branches and varieties, so that the evolution of a new art form, a new gadget, a new scientific principle, all involve common factors. We are most interested in the ‘creation’ of a new scientific principle or a new application of an old one, but we can be general here.

One way of investigating the problem is to consider the great ideas of the past and see just how they were generated. Unfortunately, the method of generation is never clear even to the ‘generators’ themselves.

But what if the same earth-shaking idea occurred to two men, simultaneously and independently? Perhaps, the common factors involved would be illuminating. Consider the theory of evolution by natural selection, independently created by Charles Darwin and Alfred Wallace.

There is a great deal in common there. Both traveled to far places, observing strange species of plants and animals and the manner in which they varied from place to place. Both were keenly interested in finding an explanation for this, and both failed until each happened to read Malthus’s ‘Essay on Population.’

Both then saw how the notion of overpopulation and weeding out (which Malthus had applied to human beings) would fit into the doctrine of evolution by natural selection (if applied to species generally).

Obviously, then, what is needed is not only people with a good background in a particular field, but also people capable of making a connection between item 1 and item 2 which might not ordinarily seem connected.

Undoubtedly in the first half of the 19th century, a great many naturalists had studied the manner in which species were differentiated among themselves. A great many people had read Malthus. Perhaps some both studied species and read Malthus. But what you needed was someone who studied species, read Malthus, and had the ability to make a cross-connection.

That is the crucial point that is the rare characteristic that must be found.”


Walter Isaacson is thought of, with some validity, as a proponent of the Great Man Theory, which is why Steve Jobs, with no shortage of hubris, asked him to be his biographer. Albert Einstein and Ben Franklin and me, he thought. Jobs, who deserves massive recognition for the cleverness of his creations, was also known as a guy who sometimes took credit for the work of others, and he sold his author some lines. Bright guy that he is, though, Isaacson knows the new technologies and their applications have many parents, and he’s cast his net wider in The Innovators. An excerpt from his just-published book, via The Daily Beast, in which he describes the evolution of Wikipedia’s hive mind:

“One month after Wikipedia’s launch, it had a thousand articles, approximately seventy times the number that Nupedia had after a full year. By September 2001, after eight months in existence, it had ten thousand articles. That month, when the September 11 attacks occurred, Wikipedia showed its nimbleness and usefulness; contributors scrambled to create new pieces on such topics as the World Trade Center and its architect. A year after that, the article total reached forty thousand, more than were in the World Book that [Jimmy] Wales’s mother had bought. By March 2003 the number of articles in the English-language edition had reached 100,000, with close to five hundred active editors working almost every day. At that point, Wales decided to shut Nupedia down.

By then [Larry] Sanger had been gone for a year. Wales had let him go. They had increasingly clashed on fundamental issues, such as Sanger’s desire to give more deference to experts and scholars. In Wales’s view, ‘people who expect deference because they have a Ph.D. and don’t want to deal with ordinary people tend to be annoying.’ Sanger felt, to the contrary, that it was the nonacademic masses who tended to be annoying. ‘As a community, Wikipedia lacks the habit or tradition of respect for expertise,’ he wrote in a New Year’s Eve 2004 manifesto that was one of many attacks he leveled after he left. ‘A policy that I attempted to institute in Wikipedia’s first year, but for which I did not muster adequate support, was the policy of respecting and deferring politely to experts.’ Sanger’s elitism was rejected not only by Wales but by the Wikipedia community. ‘Consequently, nearly everyone with much expertise but little patience will avoid editing Wikipedia,’ Sanger lamented.

Sanger turned out to be wrong. The uncredentialed crowd did not run off the experts. Instead the crowd itself became the expert, and the experts became part of the crowd. Early in Wikipedia’s development, I was researching a book about Albert Einstein and I noticed that the Wikipedia entry on him claimed that he had traveled to Albania in 1935 so that King Zog could help him escape the Nazis by getting him a visa to the United States. This was completely untrue, even though the passage included citations to obscure Albanian websites where this was proudly proclaimed, usually based on some third-hand series of recollections about what someone’s uncle once said a friend had told him. Using both my real name and a Wikipedia handle, I deleted the assertion from the article, only to watch it reappear. On the discussion page, I provided sources for where Einstein actually was during the time in question (Princeton) and what passport he was using (Swiss). But tenacious Albanian partisans kept reinserting the claim. The Einstein-in-Albania tug-of-war lasted weeks. I became worried that the obstinacy of a few passionate advocates could undermine Wikipedia’s reliance on the wisdom of crowds. But after a while, the edit wars ended, and the article no longer had Einstein going to Albania. At first I didn’t credit that success to the wisdom of crowds, since the push for a fix had come from me and not from the crowd. Then I realized that I, like thousands of others, was in fact a part of the crowd, occasionally adding a tiny bit to its wisdom.”



Following up on Franklin Foer’s New Republic call to arms about Amazon’s price-setting power, here’s an excerpt from Paul Krugman’s balanced look in the New York Times at the robber baron of books:

“Does Amazon really have robber-baron-type market power? When it comes to books, definitely. Amazon overwhelmingly dominates online book sales, with a market share comparable to Standard Oil’s share of the refined oil market when it was broken up in 1911. Even if you look at total book sales, Amazon is by far the largest player.

So far Amazon has not tried to exploit consumers. In fact, it has systematically kept prices low, to reinforce its dominance. What it has done, instead, is use its market power to put a squeeze on publishers, in effect driving down the prices it pays for books — hence the fight with Hachette. In economics jargon, Amazon is not, at least so far, acting like a monopolist, a dominant seller with the power to raise prices. Instead, it is acting as a monopsonist, a dominant buyer with the power to push prices down.

And on that front its power is really immense — in fact, even greater than the market share numbers indicate. Book sales depend crucially on buzz and word of mouth (which is why authors are often sent on grueling book tours); you buy a book because you’ve heard about it, because other people are reading it, because it’s a topic of conversation, because it’s made the best-seller list. And what Amazon possesses is the power to kill the buzz. It’s definitely possible, with some extra effort, to buy a book you’ve heard about even if Amazon doesn’t carry it — but if Amazon doesn’t carry that book, you’re much less likely to hear about it in the first place.

So can we trust Amazon not to abuse that power?”


During the insane time in California in the 1960s and early 1970s, when motorcycle gangs dropped LSD and a psychedelic pop band named itself “The Peanut Butter Conspiracy,” Joan Didion was the poet laureate of the Lost Children of the West Coast, runaways who’d run smack into a strange moment in America when all the clocks were broken. But Renata Adler also took a pretty fair shot at that title with her 1967 New Yorker article “Fly Trans-Love Airways” (gated), collected in her subsequent book Toward a Radical Middle, which examined the tensions between hippie kids and law enforcement on the Sunset Strip. A passage in which the journalist tried to make sense of it all:

Some middle-hairs who were previously uncommitted made their choice–and thereby made more acute a division that had already existed between them. At Palisades High School, in a high-income suburb of Los Angeles, members of the football team shaved their heads by way of counter-protest to the incursions of the longhairs. The longhairs, meanwhile, withdrew from the competitive life of what they refer to as the Yahoos–sports, grades, class elections, popularity contests–to devote themselves to music, poetry, and contemplation. It is not unlikely that a prosperous, more automated economy will make it possible for this split to persist into adult life: the Yahoos, an essentially military model, occupying jobs; the longhairs, on an artistic model, devising ways of spending leisure time. At the moment, however, there is a growing fringe of waifs, vaguely committed to a moral drift that emerged for them from the confrontations on the Strip and from the general climate of the events. The drift is Love; and the word, as it is now used among the teen-agers of California (and as it appears in the lyrics of their songs), embodies dreams of sexual liberation, sweetness, peace on earth, equality–and, strangely, drugs.

The way drugs came into Love seems to be this: As the waifs abandoned the social mystique of their elders (work, repression, the power struggle), they looked for new magic and new mysteries. And the prophets of chemical insight, who claimed the same devotion to Love and the same lack of interest in the power struggle as the waifs, were only too glad to supply them. Allen Ginsberg, in an article entitled “Renaissance or Die,” which appeared in the Los Angeles Free Press (a local New Left newspaper) last December, urged that “…everybody who hears my voice, directly or indirectly, try the chemical LSD at least once, every man, woman, and child American in good health over the age of fourteen.” Richard Alpert (the former psychedelic teammate of Timothy Leary), in an article in Oracle (a newspaper of the hallucinogenic set), promised, “In about seven or eight years the psychedelic population of the United States will be able to vote anybody into office they want to, right?” The new waifs, who, like many others in an age of ambiguities, are drawn to any expression of certainty or confidence, any semblance of vitality or inner happiness, have, under pressure and on the strength of such promises, gradually dropped out, in the Leary sense, to the point where they are economically unfit, devoutly bent on powerlessness, and where they can be used. They are used by the Left and the drug cultists to swell their ranks. They are used by the politicians of the Right to attack the Left. And they are used by their more conventional peers just to brighten the landscape and slow down the race a little. The waifs drift about the centers of longhair activism, proselytizing for LSD and Methedrine (with arguments only slightly more extreme than the ones liberals use on behalf of fluoridation), and there is a strong possibility that although they speak of ruling the world with Love, they will simply vanish, like the children of the Children’s Crusade, leaving just a trace of color and gentleness in their wake.•


Brilliant writer though he was, Sir Arthur Conan Doyle was gullible to lots of complete bullshit, mostly centered around spiritualist shenanigans, even believing frenemy Harry Houdini was doomed to an early death due to his skepticism. In an article in the April 10, 1922 Brooklyn Daily Eagle, the creator of Sherlock Holmes explained his vision of the afterlife, which he believed to be a childless place where a man could trade in his wife for a new model.


The opening of Dwight Garner’s lively New York Times Book Review piece about the latest volume, a career summation of sorts, by Edward O. Wilson, a biologist who has watched ants have sex:

“The best natural scientists, when they aren’t busy filling us with awe, are busy reminding us how small and pointless we are. Stephen Hawking has called humankind ‘just an advanced breed of monkeys on a minor planet of a very average star.’ The biologist and naturalist Edward O. Wilson, in his new book, which is modestly titled The Meaning of Human Existence, puts our pygmy planet in a different context.

‘Let me offer a metaphor,’ he says. ‘Earth relates to the universe as the second segment of the left antenna of an aphid sitting on a flower petal in a garden in Teaneck, N.J., for a few hours this afternoon.’ The Jersey aspect of that put-down really drives in the nail.

Mr. Wilson’s slim new book is a valedictory work. The author, now 85 and retired from Harvard for nearly two decades, chews over issues that have long concentrated his mind: the environment; the biological basis of our behavior; the necessity of science and humanities finding common cause; the way religion poisons almost everything; and the things we can learn from ants, about which Mr. Wilson is the world’s leading expert.

Mr. Wilson remains very clever on ants. Among the questions he is most asked, he says, is: ‘What can we learn of moral value from the ants?’ His response is pretty direct: ‘Nothing. Nothing at all can be learned from ants that our species should even consider imitating.’

He explains that while female ants do all the work, the pitiful males are merely ‘robot flying sexual missiles’ with huge genitalia. (This is not worth imitating?) During battle, they eat their injured. ‘Where we send our young men to war,’ Mr. Wilson writes, ‘ants send their old ladies.’ Ants: moral idiots.

The sections about ants remind you what a lively writer Mr. Wilson can be. This two-time winner of the Pulitzer Prize in nonfiction stands above the crowd of biology writers the way John le Carré stands above spy writers. He’s wise, learned, wicked, vivid, oracular.”


Long before Moneyball, Earnshaw Cook was, in Frank Deford’s words, the “scholar-heretic” of baseball, a statistician who proved the game’s strategy was backwards. Many of his innovations are common knowledge in the sport today (e.g., sacrifice bunts are usually unproductive), though others are still strangely not implemented. For instance: If you’re in a pivotal post-season game played under National League rules, why not use a relief pitcher who matches up well with the opposing lineup at the beginning of the game, and then pinch hit for him the first time he’s to bat so that you get an extra plate appearance by a good hitter? Then you can insert your “starting” pitcher. Makes sense. Many of the stat man’s other strategies have been disproven, but his underlying message–that the sport was being played more from tradition than wisdom–was correct.

Deford profiled Cook in Sports Illustrated for the first time in 1964, and while that piece doesn’t seem to be online, here’s the opening of his 1972 portrait, “It Ain’t Necessarily So, and Never Was,” which ran just prior to the publication of Cook’s Percentage Baseball and the Computer, when the numbers whiz went digital:

For more than a decade Earnshaw Cook, a retired Baltimore metallurgist, has been trying to convince baseball’s bosses that playing the sacred percentages is, to be blunt, dumb baseball. In 1964 Cook brought out a 345-page book, Percentage Baseball, that was full of charts, curves, tables and complicated formulas that sometimes went on for the better part of a page. The book dared to suggest that either: a) baseball is not using the best possible odds on the field, or b) mathematics is a fake.

Nothing has happened since to convince Cook that ‘a’ is wrong and ‘b’ is right. ‘As in the world around us,’ he says, ‘baseball offers a completely balanced, highly complicated statistical system, demonstrably controlled in all its interactions of play by the random operations of the laws of chance. As such, it becomes a fascinating illustration of a process readily susceptible to reliable mathematical analysis. Baseball also furnishes a classic example of the utter contempt of its unsophisticated protagonists for the scientific method.’

That last sentence is Cook’s way of saying that the national pastime thinks he is as nutty as a fruitcake. Since 1964 nobody has dared test out his conclusions even in, say, a winter rookie league. Oh yes, the managers in 1964 were named: Berra, Bauer, Pesky, Lopez, Tebbetts, Dressen, Hodges, Lopat, Rigney, Mele, Kennedy, Hutchinson, Craft, Alston, Bragan, Stengel, Murtaugh, Keane, Mauch and Dark. They all stayed faithful to the memory of Connie Mack—but only Alston is still managing at the same major league shop.

Cook has had some nibbles from the baseball Establishment. The Houston Astros approached him shortly after his book came out and inquired if he thought he could apply his figures in such a way that he could make judgments about minor league prospects. Cook said he would try. He checked the player records Houston sent him, and said that his evaluation indicated the two best prospects were named Jim Wynn and Rusty Staub. This was not bad figuring, as Wynn and Staub are probably still the two best players ever to wear Houston uniforms, but Cook never heard from the Astros again. He also got feelers from the Cubs and Phillies, but nothing came of those.

Ignored, Cook went back to his numbers, and this April his second volume on the subject, Percentage Baseball and the Computer, is scheduled for publication. Basically, it is 207 pages of computer proof that everything he wrote eight years ago was qualitatively correct. Well, not quite everything. The computer has found that Cook’s percentage lineup—with the best hitter leading off, the second best batting second, etc.—is, over a season, 12 runs less effective than the traditional lineup.

Otherwise the computer solidly supports the way Cook says baseball should be played.•
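For flavor, here is what the bare bones of the kind of computer experiment Deford describes might look like: a toy Monte Carlo comparison of Cook’s “percentage” batting order against a traditional one. The on-base numbers are assumed and the all-singles scoring model is deliberately crude–this sketches the method, not Cook’s actual apparatus or his 12-run finding:

```python
import random

# Assumed on-base probabilities for a hypothetical nine-man lineup.
SKILLS = [0.38, 0.36, 0.35, 0.33, 0.32, 0.31, 0.30, 0.28, 0.26]

def simulate_game(order, rng):
    """Runs scored over nine innings; every successful plate appearance
    is treated as a single that advances each runner one base."""
    runs, batter = 0, 0
    for _ in range(9):
        outs, bases = 0, [False, False, False]  # first, second, third
        while outs < 3:
            if rng.random() < order[batter % 9]:
                if bases[2]:                     # runner on third scores on a single
                    runs += 1
                bases = [True, bases[0], bases[1]]
            else:
                outs += 1
            batter += 1                          # the order carries over between innings
    return runs

def mean_runs(order, games=20000, seed=1):
    """Average runs per game over many simulated games."""
    rng = random.Random(seed)
    return sum(simulate_game(order, rng) for _ in range(games)) / games

# Cook's percentage lineup: best hitter leading off, second best batting second, etc.
percentage = sorted(SKILLS, reverse=True)
# A stand-in traditional lineup: the best hitters batting third and fourth.
traditional = [SKILLS[3], SKILLS[2], SKILLS[0], SKILLS[1]] + SKILLS[4:]

print("percentage lineup :", mean_runs(percentage))
print("traditional lineup:", mean_runs(traditional))
```

Whichever order the toy model happens to favor, the point is the method: play out thousands of games under the laws of chance and compare the run totals, rather than deferring to the memory of Connie Mack.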


Innovation, a word that applies sparingly but is ascribed constantly, is truly the proper description for the work of inventor Charles Babbage and mathematician Ada Lovelace, a Victorian Jobs and Woz. As Steven Johnson points out in his latest book, How We Got to Now, excerpted in a Financial Times article, new inventions usually are born to many parents working within the same base of knowledge, but the Victorian duo thought completely outside of the box, leaping a full century ahead of everyone else with their ideas about computers. From the FT:

“Most important innovations – in modern times at least – arrive in clusters of simultaneous discovery. The conceptual and technological pieces come together to make a certain idea imaginable – artificial refrigeration, say, or the lightbulb – and around the world people work on the problem, and usually approach it with the same fundamental assumptions about how it can be solved.

Thomas Edison and his peers may have disagreed about the importance of the vacuum or the carbon filament in inventing the electric lightbulb, but none of them was working on an LED. As the writer Kevin Kelly, co-founder of Wired magazine, has observed, the predominance of simultaneous, multiple invention in the historical record has interesting implications for the philosophy of history and science: to what extent is the sequence of invention set in stone by the basic laws of physics or information or the biological and chemical constraints of the environment?

If simultaneous invention is the rule, what about the exceptions? What about Babbage and Lovelace, who were a century ahead of just about every other human on the planet? Most innovation happens in the present tense of possibility, working with tools and concepts that are available in that time. But every now and then an individual or group makes a leap that seems almost like time travelling. What allows them to see past the boundaries of the adjacent possible, when their contemporaries fail to do so? That may be the greatest mystery of all.”


We tend to equate wealth with intelligence in America, and that’s often a false association. Hiltons and Johnsons who inherit money often seem as dumb as posts, and even someone who has basic smarts like Mike Bloomberg has had many points added to his IQ erroneously because he amassed vast wealth by identifying an exploitable gap in the market for financial information. He was really great at one particular endeavor, much the same way as Harlan Sanders was with chicken, not an amazing Renaissance Man. It showed in the very uneven job he did as NYC mayor.

So it’s best not to take as gospel the opinions of the super-rich because knowing one thing isn’t knowing everything. That said, I’ll grant Bill Gates is far more intelligent and intellectually curious than your average person, monied or not. Here’s the opening of his review of Thomas Piketty’s Capital in the Twenty-First Century, which he agrees with overall:

“A 700-page treatise on economics translated from French is not exactly a light summer read—even for someone with an admittedly high geek quotient. But this past July, I felt compelled to read Thomas Piketty’s Capital in the Twenty-First Century after reading several reviews and hearing about it from friends.

I’m glad I did. I encourage you to read it too, or at least a good summary, like this one from The Economist. Piketty was nice enough to talk with me about his work on a Skype call last month. As I told him, I agree with his most important conclusions, and I hope his work will draw more smart people into the study of wealth and income inequality—because the more we understand about the causes and cures, the better. I also said I have concerns about some elements of his analysis, which I’ll share below.

I very much agree with Piketty that:

  • High levels of inequality are a problem—messing up economic incentives, tilting democracies in favor of powerful interests, and undercutting the ideal that all people are created equal.
  • Capitalism does not self-correct toward greater equality—that is, excess wealth concentration can have a snowball effect if left unchecked.
  • Governments can play a constructive role in offsetting the snowballing tendencies if and when they choose to do so.

To be clear, when I say that high levels of inequality are a problem, I don’t want to imply that the world is getting worse. In fact, thanks to the rise of the middle class in countries like China, Mexico, Colombia, Brazil, and Thailand, the world as a whole is actually becoming more egalitarian, and that positive global trend is likely to continue.

But extreme inequality should not be ignored—or worse, celebrated as a sign that we have a high-performing economy and healthy society.”


Google does many great things, but its corporate leaders want you to trust them with your private information–because they are the good guys–and you should never trust any corporation with such material. The thing is, it’s increasingly difficult to opt out of the modern arrangement, algorithms snaking their way into all corners of our lives. The excellent documentarian Eugene Jarecki has penned a Time essay about Google and Wikileaks and what the two say about the future. An excerpt follows.

_________________________

I interviewed notorious Wikileaks founder Julian Assange by hologram, beamed in from his place of asylum in the Ecuadorian Embassy in London. News coverage the next day focused in one way or another on the spectacular and mischievous angle that Assange had, in effect, managed to escape his quarantine and laugh in the face of those who wish to extradite him by appearing full-bodied in Nantucket before a packed house of exhilarated conference attendees.

Beyond the spectacle, though, what got less attention was what the interview was actually about, namely the future of our civilization in an increasingly digital world. What does it mean for us as people to see the traditional town square go digital, with online banking displacing bricks and mortar, just as email did snail mail, Wikipedia did the local library, and eBay the mom and pop shop? The subject of our ever-digitizing lives is one that has been gaining currency over the past year, fueled by news stories about Google Glasses, self-driving cars, sky-rocketing rates of online addiction and, most recently, the scandal of NSA abuse. But the need to better understand the implications of our digital transformation was further underscored in the days preceding the event with the publication of two books: one by Assange and the other by Google Executive Chairman, Eric Schmidt.

Assange’s book, When Google Met Wikileaks, is the transcript (with commentary by Assange) of a secret meeting between the two that took place on June 23, 2011, when Schmidt visited Assange in England. In his commentary, Assange explores the troubling implications of Google’s vast reach, including its relationships with international authorities, particularly in the U.S., of which the public is largely unaware. Schmidt’s book, How Google Works, is a broader, sunnier look at how technology has presumably shifted the balance of power from companies to people. It tells the story of how Google rose from a nerdy young tech startup to become a nerdy behemoth astride the globe. Read together, the two books offer an unsettling portrait both of our unpreparedness for what lies ahead and of the utopian spin with which Google (and others in the digital world) package tomorrow. While Assange’s book accuses Google of operating as a kind of “‘Don’t Be Evil’ empire,” Schmidt’s book fulfills Assange’s worst fears, presenting pseudo-irreverent business maxims in an “aw shucks” tone that seems willfully ignorant of the inevitable implications of any company coming to so sweepingly dominate our lives in unprecedented and often legally uncharted ways.•


Somebody makes money from book sales, but those people, most of them, are not writers. Plenty of authors actually lose money publishing their titles, having to pay their own expenses and taxes. At The Popcorn Chronicles, novelist Patrick Wensink reveals the earnings for his Amazon bestseller, Broken Piano for President, and they truly are revealing. An excerpt:

“Even when there’s money in writing, there’s not much money.

I was reminded of a single page in A Heartbreaking Work of Staggering Genius; specifically, the section where Dave Eggers breaks down his $100,000 advance on sales from his publisher. He then lists all his expenses. In the end the author banked a little less than half. It wasn’t bad money — just not the ‘I bet Dave Eggers totally owns a Jaguar’-type of income I expected. I mean, his name was on the cover of a book! He must be rich.

That honesty was refreshing and voyeuristic. I always said if I ever had a chance, I’d make a similar gesture. As a person learning about writing and publishing, there was something helpful about Eggers’ transparency. So here is my stab at similar honesty: the sugar bowls full of cocaine, bathtubs full of whiskey, semi-nude bookstore employees scattered throughout my bedroom tale of bestseller riches.

This is what it’s like, financially, to have the indie book publicity story of the year and be near the top of the bestseller list.

Drum roll.

$12,000.

Hi-hat crash.

I just started getting my royalty checks from July the other day (the publishing industry is slow like that). From what I can tell so far, I made about $12,000 from Broken Piano sales. That comes directly to me without all those pesky taxes taken out yet (the IRS is helpful like that).

Don’t get me wrong; as a guy with a couple of books out on an independent publisher I never thought I’d see that kind of money. Previously, my largest royalty check was about $153. I’m thrilled and very proud to say I earned any money as a writer. That’s a miracle. It’s just not the jewel-encrusted miracle most people think bestseller bank accounts are made from.

The book sold plus or minus 4,000 copies. (The publishing industry is hazy like that. What with sales in fishy-sounding third-world countries like Germany and England.) Being on an indie press I receive a more generous royalty split than most: 50 percent after expenses were deducted.

You can do the math.”
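And the math, from his own figures, is quick: $12,000 across roughly 4,000 copies works out to about $3 to the author per copy sold, before taxes; a 50 percent split taken after expenses implies only about $6 per copy survived the deductions in the first place. (Back-of-the-envelope arithmetic from the numbers he gives, not an accounting of his contract.)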


If you’re wondering what Aleksandr Solzhenitsyn would have thought of Putin’s engagement in Ukraine, he pretty much answered the question during a 1994 interview, and he was fully in favor of regathering the Russian state. From Paul Klebnikov’s discussion with the once and former dissident, which was reprinted in Forbes at the time of his death in 2008:

Forbes:

Tension is mounting between Russia and the now independent Ukraine, with the West strongly backing Ukrainian territorial integrity. Henry Kissinger argues that Russia will always threaten the interests of the West, no matter what kind of government it has.

Aleksandr Solzhenitsyn:

Henry Kissinger, Zbigniew Brzezinski, [historian] Richard Pipes and many other American politicians and publicists are frozen in a mode of thought they developed a long time ago. With unchanging blindness and stubbornness they keep repeating and repeating this theory about the supposed age-old aggressiveness of Russia, without taking into consideration today’s reality.

Forbes:

Well, what about Ukraine? Hasn’t Russia made threats toward several of the former U.S.S.R. member states?

Aleksandr Solzhenitsyn:

Imagine that one not very fine day two or three of your states in the Southwest, in the space of 24 hours, declare themselves independent of the U.S. They declare themselves a fully sovereign nation, decreeing that Spanish will be the only language. All English-speaking residents, even if their ancestors have lived there for 200 years, have to take a test in the Spanish language within one or two years and swear allegiance to the new nation. Otherwise they will not receive citizenship and be deprived of civic, property and employment rights.

Forbes:

What would be the reaction of the United States? I have no doubt that it would be immediate military intervention.

Aleksandr Solzhenitsyn:

But today Russia faces precisely this scenario. In 24 hours she lost eight to 10 purely Russian provinces, 25 million ethnic Russians who have ended up in this very way–as ‘undesirable aliens.’ In places where their fathers, grandfathers, great-grandfathers have lived since way back–even from the 17th century–they face persecution in their jobs and the suppression of their culture, education and language.

Meanwhile, in Central Asia, those wishing to leave are not permitted to take even their personal property. The authorities tell them, ‘There is no such concept as ‘personal property’!’

And in this situation ‘imperialist Russia’ has not made a single forceful move to rectify this monstrous mess. Without a murmur she has given away 25 million of her compatriots–the largest diaspora in the world!

Forbes:

You see Russia as the victim of aggression, not as the aggressor.

Aleksandr Solzhenitsyn:

Who can find in world history another such example of peaceful conduct? And if Russia keeps the peace in the single most vital question that concerns her, why should one expect her to be aggressive in secondary issues?

Forbes:

With Russia in chaos, it does sound a bit far-fetched to see her as an aggressor.

Aleksandr Solzhenitsyn:

Russia today is terribly sick. Her people are sick to the point of total exhaustion. But even so, have a conscience and don’t demand that–just to please America–Russia throw away the last vestiges of her concern for her security and her unprecedented collapse. After all, this concern in no way threatens the United States.•


With computers so small they all but disappear, the infrastructure silently becoming more and more automated, what else will vanish from our lives and ourselves? I’m someone who loves the new normal of decentralized, free-flowing media, who thinks the gains are far greater than the losses, but it’s a question worth asking. Via Longreads, an excerpt from The Glass Cage, a new book by that Information Age designated mourner Nicholas Carr:

“There’s a big difference between a set of tools and an infrastructure. The Industrial Revolution gained its full force only after its operational assumptions were built into expansive systems and networks. The construction of the railroads in the middle of the nineteenth century enlarged the markets companies could serve, providing the impetus for mechanized mass production. The creation of the electric grid a few decades later opened the way for factory assembly lines and made all sorts of home appliances feasible and affordable. These new networks of transport and power, together with the telegraph, telephone, and broadcasting systems that arose alongside them, gave society a different character. They altered the way people thought about work, entertainment, travel, education, even the organization of communities and families. They transformed the pace and texture of life in ways that went well beyond what steam-powered factory machines had done.

The historian Thomas Hughes, in reviewing the arrival of the electric grid in his book Networks of Power, described how first the engineering culture, then the business culture, and finally the general culture shaped themselves to the new system. ‘Men and institutions developed characteristics that suited them to the characteristics of the technology,’ he wrote. ‘And the systematic interaction of men, ideas, and institutions, both technical and nontechnical, led to the development of a supersystem—a sociotechnical one—with mass movement and direction.’ It was at this point that what Hughes termed ‘technological momentum’ took hold, both for the power industry and for the modes of production and living it supported. ‘The universal system gathered a conservative momentum. Its growth generally was steady, and change became a diversification of function.’ Progress had found its groove.

We’ve reached a similar juncture in the history of automation. Society is adapting to the universal computing infrastructure—more quickly than it adapted to the electric grid—and a new status quo is taking shape. …

The science-fiction writer Arthur C. Clarke once asked, ‘Can the synthesis of man and machine ever be stable, or will the purely organic component become such a hindrance that it has to be discarded?’ In the business world at least, no stability in the division of work between human and computer seems in the offing. The prevailing methods of computerized communication and coordination pretty much ensure that the role of people will go on shrinking. We’ve designed a system that discards us. If unemployment worsens in the years ahead, it may be more a result of our new, subterranean infrastructure of automation than of any particular installation of robots in factories or software applications in offices. The robots and applications are the visible flora of automation’s deep, extensive, and invasive root system.”

