
Kevin Kelly asked his readers to concisely predict the technological future in “The Coming Hundred Years in One Hundred Words.” One of the most implausible scenarios is one of my favorites:

“You will sleep in a sort of bathtub for taking care of your skin. The bathtub will be enclosed in an atmosphere enriched with substances to take care of your organs. You will never have to take a bath again. Your clothes will be made from a special polymer and you choose from more than 1,000 looks, and the fabrics will be molded to the look you choose. You will eat all the food you like. You will have special lanes for those who prefer to drive, but 80% choose self-driven cars. People will work 4 hours/week. No Police and no Politics.”


Two passages, one from five years ago and one from today, about how the anarchy of the Internet has released the devil inside us all.


In 2009, Jim Windolf of Vanity Fair wrote a good article about Internet trolls even before the term existed. Although biting blogs have since been supplanted by social media, his prediction about the decline of anonymous online hating has not come true, at least not yet; in fact, it has taken on new and even more hurtful forms. An excerpt from his piece:

“Online rudeness probably won’t last forever. I think it’s just a fashion. Things change. Stuff that seems cool gets stale. It feels like it won’t, but it does. So it seems reasonable to guess that online nastiness will fade—not through any enforcement, but just because it will go out of style. There will always be flame wars. There will always be online lunkheads and goons. But in a few years maybe you won’t really want to be the one calling someone else a douche-tard in a comments section.”


From Alex Hern’s Guardian article, Tim Berners-Lee on what he hath wrought, completely unintentionally:

“Tim Berners-Lee has expressed sadness that the web has mirrored the dark side of humanity, as well as enabling its ‘wonderful side’ to flourish.

The developer, who created the web in 1990 while working for the particle collider project Cern in Switzerland, said that the web is a reflection of human nature elsewhere, but that he had hoped ‘that the web would provide tools and fora and new ways of communicating that would break down national barriers and allow us to just get to a better global understanding.

‘Well, maybe it’ll happen in the future … Maybe we will be able to build web-based tools that help us keep people on the path of collaborating rather than fighting.’

Speaking to BBC News, Berners-Lee said that it was ‘staggering’ that people ‘who clearly must have been brought up like anybody else will suddenly become very polarised in their opinions, will suddenly become very hateful rather than very loving.'”


An Economist article responds to Lee Gomes’ Slate piece about the difficulty of making driverless vehicles truly autonomous, suggesting that the real impediment to such machines might be the large stock of cheap labor created by the disruptive qualities of other technologies. An excerpt:

“Writing at Slate, for instance, Lee Gomes frets that driverless vehicles struggle in unfamiliar territory when they lack good maps, can make errors when sun blinds their cameras, and are occasionally caught out by the unexpected appearance of new traffic signals. Human drivers, of course, share these weaknesses, and others: like difficulty operating in adverse weather conditions. The big difference between driverless vehicles and humans, in these cases, is that the computer can be programmed to behave cautiously when stumped, while humans often plough ahead heedlessly. When critiquing driverless cars it is often useful to recall that human drivers kill and maim millions of people each year.

Ironically, the biggest obstacle to widespread use of driverless vehicles, over the next decade or two at any rate, may be the effects of rapid technological progress in other parts of the economy. As a recent special report explains, technological change over the last generation has wiped out many middle-skill jobs, pushing millions of workers into competition for low-wage work. That glut has contributed to stagnant wages for most workers, and low pay has in turn reduced the incentive to firms to deploy labour-saving technology. Why automate, when there is an enormous stock of cheap labour available? At the same time firms like Uber are making the use of hired cars cheaper and more convenient, reducing the attraction to many households of owning and driving their own personal vehicles.

The combination of Uber and cheap labour could pose a formidable threat to the driverless car. The cost of the sensors and processors needed to pilot an autonomous vehicle is falling and is likely to fall much more as production ramps up. Yet the technology is still pricey, especially compared with a human, which, after all, is a rather efficient package of sensory and information-processing equipment. At low wages, a smartphone-enabled human driver is formidable competition for a driverless vehicle.

It would be a remarkable irony if the driverless car—in many ways the symbol of the technological revolution that is now reshaping modern economies—fails to materialise as an economic reality thanks to the disemploying power of other technologies.”


The type of buoyant journalism career the late Ben Bradlee enjoyed barely exists anymore, and that’s both a good and bad thing. It’s great that American media is in far more hands now in our decentralized world, the “barbarians” having stormed the gates, though it would be better if more of those thumbing at keypads aspired to greatness. Of course, that’s not so easy with that industry’s currently complicated financial picture.

In David Remnick’s excellent New Yorker post-mortem of his late Washington Post boss, he shares that Jason Robards’s big-screen depiction of Bradlee was more restrained than the reality: “Younger people watching the actor Jason Robards’s portrayal of Bradlee in All the President’s Men can be forgiven for thinking it is a broad caricature, an exaggeration of his cement-mixer voice, his cocky ebullience, his ferocious instinct for a political story, and his astonishing support for his reporters. In fact, Robards underplayed Bradlee.”

Bradlee’s reaction to the film’s D.C. premiere, recorded in an April 19, 1976, People article, was far less revealing. An excerpt about Bradlee and his “chum” Sally Quinn:

“Jason Robards’ portrayal of Washington Post executive editor Ben Bradlee drew raves from Sally Quinn, a Post reporter who is a close Bradlee chum. ‘Amazing,’ she gushed. ‘Robards only met him twice, but he had his mannerisms down to a T.’ Bradlee himself would say only: ‘It was an interesting film.’ (The Post reviewer was less kind—he called the movie ‘absorbing,’ but carped at its lack of drama.)

Washington Post publisher Katherine Graham, who declined to be portrayed in the movie, said she wanted to play down the paper’s role. ‘We just kept the story alive,’ she said, ‘until the process took over and worked.’ Then, asked whether the film would lure hordes of young people into investigative reporting, she gulped, ‘God, I hope not.'”


The Ebola “crisis” in America is an example of more than one bias at play. It’s Availability Bias, with so much media focus on an illness that has killed exactly zero American citizens on U.S. soil, when the flu season will likely claim hundreds as it did last year. It’s also Confirmation Bias, with those opposed to President Obama angling to position this domestic “plague” as a lack of leadership on his part. The more important news of the success of the Affordable Care Act, which raises the threshold for plague in this country, is lost in the hollering.

Ebola and ISIS beheadings and other modern challenges deserve attention, to be sure, but there is a more hopeful parallel narrative we often ignore. From a New Statesman article by Matthew Barzun, the U.S. Ambassador to Great Britain:

“We live in challenging, complex, even confusing times. Our world is in constant flux. Charles Dickens’s description of the French Revolution seems just as appropriate today: it is the worst of times. Indeed, it may be even more true now, as the changes are global, rather than confined to one or two countries. Newspaper headlines suggest as much. They are littered with demoralising words such as ‘beheadings,’ ‘aggression,’ ‘hatred’ and ‘fever.’ Of course, ISIL is engaged in barbarity in the Middle East that is reminiscent of some of the most grotesque of the 20th century, while the ebola virus poses a global public health threat on a scale as large as anything we’ve seen in recent decades.

At the same time, the number of refugees and internally displaced people presents a great humanitarian challenge. And human rights violations abound in many parts of the world. But here is an equally valid and, I concede, sweeping narrative that suggests this is also the best of times.

It is a time of levelling. The world has reduced extreme poverty by half since 1990. Global primary education for boys and girls is now equal.

It is a time of enduring. The number of deaths among children under five has been cut in half since 1990, meaning about 17,000 fewer children die each day. And mothers are surviving at a nearly equal rate.

It is a time of flourishing. Deaths from malaria dropped by 42 per cent between 2000 and 2012. HIV infections are declining in most regions.

It is a time of strengthening. Africa is above the poverty line for the first time. Tens of millions have been lifted out of poverty in China. The debt burden on developing coun­tries has dropped 75 per cent since 2000.

It is a time of healing. The ozone layer is showing signs of recovery thanks to global action. And all the while, the technological and communications revolution is making more people better informed than at any time in history.

So why are we intent on fixing our lens on the chaotic?”


Via Google driverless-car consultant Brad Templeton, a report about Singapore’s limited test run of autonomous public transport:

“In late August, I visited Singapore to give an address at a special conference announcing a government sponsored collaboration involving their Ministry of Transport, the Land Transport Authority and A-STAR, the government funded national R&D centre. I got a chance to meet the minister and sit down with officials and talk about their plans, and 6 months earlier I got the chance to visit A-Star and also the car project at the National University of Singapore. At the conference, there were demos of vehicles, including one from Singapore Technologies, which primarily does military contracting.

Things are moving fast there, and this week, the NUS team announced they will be doing a live public demo of their autonomous golf carts and they have made much progress. They will be running the carts over a course with 10 stops in the Singapore Chinese and Japanese Gardens. The public will be able to book rides online, and then come and summon and direct the vehicles with their phones. The vehicles will have a touch tablet where the steering wheel will go. Rides will be free. Earlier, they demonstrated not just detecting pedestrians but driving around them (if they stay still) but I don’t know if this project includes that.

This is not the first such public demo – the CityMobil2 demonstration in Sardinia ran in August, on a stretch of beachfront road blocked to cars but open to bicycles, service vehicles and pedestrians. This project slowed itself to unacceptably slow speeds and offered a linear route.

The Singapore project will also mix with pedestrians, but the area is closed to cars and bicycles. There will be two safety officers on bicycles riding behind the golf carts, able to shut them down if any problem presents, and speed will also be limited.”


Uber could use all the good publicity it can get right now, its business plan a threat to licensed drivers, its surge pricing unpopular and its own operators prone to screaming headlines for any misdeeds. Even the libertarian Peter Thiel thinks the company may be flouting regulations too much, careering onto a self-destructive Napster path. Perhaps on-demand flu shots, which Uber tried for a day in several U.S. cities, can help. From Dan Diamond at Forbes:

“Uber’s latest one-day promotion kicked off on Thursday: UberHEALTH, the company’s first concerted effort to move into health care delivery.

The company announced that Uber users in Boston, New York City, or Washington, D.C., could order a free flu shot between the hours of 10 a.m. and 3 p.m.

And Flüber’s terms sounded appealing.

‘We’re leveraging the reliability and efficiency of the Uber platform and launching a one-day pilot program — UberHEALTH — in select cities today,’ an Uber blog post reads. ‘Together with our partner Vaccine Finder we will bring flu prevention packs and shots directly to you – at the single touch of a button.’

To be clear, Uber drivers weren’t administering the shots; they’d transport registered nurses to a user’s location, and the nurses could give up to 10 flu shots.”


A fun thing to speculate about, though it will never happen: Florida, that strange entity, splitting into two distinct states à la the Dakotas, the politically disparate Texas-ish north and New York-esque south going their separate ways, at least metaphorically. From Allie Conti in Vice:

“Florida is like a parfait. The bottom layer is made up of Miami, gays, and rich people; the middle is basically Disney World, stucco palaces, and suburban sprawl; and the top is more or less South Georgia run-off. In the mind of the average citizen, the state is essentially three different places with distinct cultures—or lack thereof. But what would happen if a man with a vision decided he wanted to make the idea of multiple Floridas a reality?

On October 7, the city of South Miami’s vice mayor proposed just that. His resolution, which passed 3-2, suggests that the new state of South Florida would start from Orlando and go all the way to the Keys. And although the city of North Lauderdale passed a similar resolution in 2008, that version was largely symbolic. This one, according to its author, Walter Harris, is deadly serious. But Harris’s determination doesn’t make the split any more plausible, and the likelihood of South Florida becoming the 51st state is slim, to say the least. As the Sun Sentinel notes, ‘In order for secession to be enacted… the measure would require electorate approval from the entire state and Congressional approval.’

Nevertheless, one can’t blame Harris—or anyone, for that matter—for at least trying to secede from Florida. And his issues with his northern neighbors are valid. One of the main themes in the resolution is that, despite generating 69% of the state’s revenue, southern Florida doesn’t feel the government in Tallahassee is doing enough to address the unique problems that climate change pose to them. ‘South Florida’s situation is very precarious,’ the resolution reads, ‘and in need of immediate attention. Many of the issues facing south Florida are not political, but are now significant safety issues.’ One of those issues, of course, is the sea-level change that some say will soon cause places like Miami to sink into the ocean.”


My definition of a genius is someone who can creatively make connections among disparate things in a way that others can’t, piecing together a new reality, a new “language,” in art or physics or anything. They see it.

In a Nautilus blog post, Claire Cameron asked for a description of such a person from five members of Mensa, an organization that measures IQ, which is a different thing. The responses:


“Can you define ‘genius’ for me, or describe what a genius is?

Richard Hunter (retired finance director): An exceptional ability perhaps? That would satisfy if you were a member of Mensa—you know you have an exceptional ability in IQ if you get in to it. It is one type of genius, but genius takes many forms. An example would be Dave Johnson. He was a famous decathlete in the 80s and 90s. He was clearly a genius athlete: He ran, he could throw javelin, he could do all these things, and he won the Olympic gold decathlon. That must be genius in the sporting field. I am nothing like Dave Johnson—it is far more complicated than one thing or another.

Bikram Rana (director at a business consulting firm): It’s something that you see and you know it when you see it. I think a modern-day genius would be someone like Steve Jobs. It’s someone who has captured the imagination, done something groundbreaking.

Jack Williams (journalist): Oh god. I have no idea. I actually couldn’t. It comes in different forms. I don’t think being a Mensan makes you a genius, as I prove on a weekly basis on a Saturday night. I think there is a creative, innovative element there as well. Genius pushes the boundaries.

John Sheehan (clinical hypnotist): I don’t think you can say there is a ‘typical’ genius. There isn’t a typicalness to it, bar that one exception: Great intellectual ability. Genius has gone from ‘having a [kind of] genius’ to ‘being’ one. I think the word genius now comes from the popular press, it’s easy to say, it’s got a cachet to it. It’s easy, but among the people whose careers are invested in giftedness, high intelligence, then the word ‘genius’ is not often used. It is something I was born with, and that I have had all my life. I don’t think about it until someone asks me, because it is all I know. I think about this a lot though.

LaRae Bakerink (business consultant): It is what you do with your life that defines whether you are a genius. A genius is someone who can create something new.”


A multi-planet humanity is a hedge against an Earth catastrophe eradicating our species, sure, but there are financial considerations as well to interplanetary development. From Tim Fernholz’s new Atlantic article about Elon Musk’s SpaceX:

“With 33 commercial launches on its manifest in the next four years, a plan to launch manned missions by 2017, and subsidies from Texas to build its own spaceport there after several years of leasing government facilities, SpaceX is now a serious competitor in the launch industry. That’s a validation for NASA’s public-private partnership, which was focused on developing a business, not a product.

But the question for Musk and his investors now is whether he can be more than just a better rocket builder. They want to unlock something far more challenging: A space economy where humans can vastly increase their productivity in the vacuum around our tiny world and beyond, even if nobody is quite sure how yet. Nolan of Founders Fund compares this hopeful uncertainty to the founding of the internet. ‘It wasn’t clear exactly what kind of business can come out of exchanging information really rapidly,’ he says.

For example, if it weren’t so pricey, investors could imagine putting up hundreds of new satellites in lower orbits than existing ones, making their communications and imaging far more powerful. Because of the high launch costs, current satellites aren’t upgraded frequently and are stationed relatively far from earth so that they can last longer—the closer a satellite flies to earth, the faster its orbit decays, leading to its eventual demise. As a result, the electronics in them are relatively old technology.

Cheap enough launches could also enable terrestrial flights that hop up over the atmosphere, turning a day-long flight around the world into a matter of hours. Space tourism is often cited as a possible source of revenue, as is commercial research, even asteroid mining, but making any of those sustainable will mean—you guessed it—far lower costs, as NASA has found in its failure to drum up much commercial research at the ISS.

Can the $6 million launch—or even cheaper—replace the $60 million launch?”


Understanding today or tomorrow is an almost impossible task, the present being anything but a sitting duck and the future a black swan. Even those who have a better-than-average idea of where things stand and where they’re heading can misread the details, all but neutralizing their knowledge. From “Nothing You Think Matters Today Will Matter the Same Way Tomorrow,” Frank Rich’s New York magazine look at the last 50 years of American history and what it tells us about prognostication:

“It was a time when many in my boomer generation fell in love with the idea that change was something you could believe in—a particularly liberal notion that has taken hold in other generations, too, whether in the age of Roosevelt or Obama. Even as we recognize that the calendar makes for a crude and arbitrary marker, we like to think that history visibly marches on, on a schedule we can codify.

The more I dove back into the weeds of 1964, the more I realized that this is both wishful thinking and an optical illusion. I came away with a new appreciation of how selective our collective memory is, and of just how glacially history moves, despite the can-do optimism of a modern America besotted with the pursuit of instant gratification. Asked at the time of the 1964 World’s Fair to anticipate 2014, Isaac Asimov got some things right (miniaturized computers, online education, flat-screen television, and what we now know as Skype), but many of his utopian predictions were delusional. His wrong calls included not just his interplanetary fantasies but his vision of underground suburbs that would protect mankind from war, rampaging weather, and the tyranny of the automobile. Asimov also thought birth control would find international acceptance. It was no doubt beyond even his imagination that a half-century hence American lawmakers would introduce ‘personhood’ amendments attempting to all but outlaw contraception.

The screenwriter William Goldman famously summed up Hollywood in three words: ‘Nobody knows anything.’ Would that this aphorism were applicable, as he intended, solely to the make-believe of show business. It often seems that nobody knew anything about anything in 1964. Most everyone was certain that the big political developments of the time, epitomized by LBJ’s victories for civil rights and against Goldwater, would be transformational. Many of the same seers saw the year’s cultural upheavals, starting with the Beatles, as ephemera. More often than not, the reverse has turned out to be true. Are we so much smarter in 2014?”


In a Slate piece, Lee Gomes wonders whether the Google driverless car will ever be a reality, one impediment being the need for real-time maps able to track constantly shifting infrastructure on a national level. His comparison of the search giant’s autonomous-vehicle plans to the Apple Newton seems a self-defeating argument, however, since all the elements of that ill-fated invention were realized soon thereafter in other tools. The opening:

“A good technology demonstration so wows you with what the product can do that you might forget to ask about what it can’t. Case in point: Google’s self-driving car. There is a surprisingly long list of the things the car can’t do, like avoid potholes or operate in heavy rain or snow. Yet a consensus has emerged among many technologists, policymakers, and journalists that Google has essentially solved—or is on the verge of solving—all of the major issues involved with robotic driving. The Economist believes that ‘the technology seems likely to be ready before all the questions of regulation and liability have been sorted out.’ The New York Times declared that ‘autonomous vehicles like the one Google is building will be able to pack roads more efficiently’—up to eight times so. Google co-founder Sergey Brin forecast in 2012 that self-driving cars would be ready in five years, and in May, said he still hoped that his original prediction would come true.

But what Google is working on may instead result in the automotive equivalent of the Apple Newton, what one Web commenter called a ‘timid, skittish robot car whose inferior level of intelligence becomes a daily annoyance.’ To be able to handle the everyday stresses and strains of the real driving world, the Google car will require a computer with a level of intelligence that machines won’t have for many years, if ever.”


Nelson Bunker Hunt, who sprang from the right-wing nut H.L. Hunt, lived a life as large as Texas. He was born with a silver spoon in his mouth and nearly lost everything trying to corner the silver market. Hunt would have been considered just another eccentric oilman were it not for the anti-Semitism, his boner for the John Birch Society and other unsavory politics. From his lively New York Times obituary by Robert D. McFadden:

“Bunker Hunt was a jovial 275-pound eccentric who looked a bit like the actor Burl Ives. In the 1960s and ’70s, he was one of the world’s richest men, worth up to $16 billion by some estimates. With his five siblings, heirs of the oil billionaire H. L. Hunt, who sired 15 children by three women and died in 1974, he controlled a staggering family fortune whose value was not publicly reported.

In his heyday, Bunker Hunt owned five million acres of grazing land in Australia, 1,000 thoroughbreds on farms from Ireland to New Zealand, eight million acres of oil fields in Libya, offshore wells in the Philippines and Mexico, and an empire of skyscrapers, cattle ranches, mining interests and other holdings. Home was a French provincial mansion in a Dallas suburb and his 2,000-acre Circle T Ranch 30 miles out of town.

Often likened to Jett Rink, the antihero of Edna Ferber’s Giant, or the scheming J. R. Ewing of the long-running CBS television drama Dallas, he was a nonsmoking teetotaler who cultivated a devil-may-care Texas mystique by inhabiting cheap suits, a battered seven-year-old Cadillac, economy-class airline seats, burger and chili joints, and dusty barnyards in the raucous company of ranch hands.

He was an evangelical Christian close to the Rev. Jerry Falwell and Pat Robertson and supported right-wing politicians and causes, including the John Birch Society. He loathed the federal government, warned of international communist conspiracies, spouted anti-Semitic sentiments, did business with the Saudi royal family and bankrolled expeditions to salvage the Titanic and to find Noah’s Ark.”


An algorithmic miscue worthy of 1999, this book suggestion was on my Amazon home page yesterday. Fucking Bezos.


Featured Recommendation:

Prepper’s Pantry: The Survival Guide To Emergency Water & Food Storage
by Ron Johnson (October 6, 2014)
Auto-delivered wirelessly
Kindle Price: $2.99

In the event of an emergency having an adequate supply of food could mean the difference between life and death!

Are you prepared for any disaster that is about to happen? Do you already have emergency supplies? Is it enough to sustain you and your family’s life for an extended period, when help from others would be close to impossible? Have you discussed and implemented the emergency plans with your family?

Why recommended?

Because you purchased… 

Roughing It [Kindle Edition]
Mark Twain (Author)

The Wild West as Mark Twain lived it

In 1861, Mark Twain joined his older brother Orion, the newly appointed secretary of the Nevada Territory, on a stagecoach journey from Missouri to Carson City, Nevada. Planning to be gone for three months, Twain spent the next “six or seven years” exploring the great American frontier, from the monumental vistas of the Rocky Mountains to the lush landscapes of Hawaii. Along the way, he made and lost a theoretical fortune, danced like a kangaroo in the finest hotels of San Francisco, and came to terms with freezing to death in a snow bank—only to discover, in the light of morning, that he was fifteen steps from a comfortable inn.

As a record of the “variegated vagabondizing” that characterized his early years—before he became a national treasure—Roughing It is an indispensable chapter in the biography of Mark Twain. It is also, a century and a half after it was first published, both a fascinating history of the American West and a laugh-out-loud good time.

In a 1974 People article, Joan Oliver profiled Peter Benchley after his novel Jaws had become a huge bestseller, but before anyone knew that its adaptation would forever change Hollywood. An excerpt:

“The book is the tale of a great white shark which cruises Long Island’s South Shore, gobbling up unwary swimmers, while a resort town’s police chief, civic leaders and citizens battle angrily over which is more important—the safety of the residents or the tourist-based economy of the swank community in its high season.

Jaws grew out of young Benchley’s fascination with sharks, triggered by family swordfishing expeditions off Nantucket. ‘We couldn’t find any swordfish,’ he recalled recently, ‘but the ocean was littered with sharks, so we started catching them.’

As Benchley became a successful journalist—reporter on the Washington Post, free-lancer for such magazines as Life and The New Yorker, an editor of Newsweek—his shark-watching continued. In the 1960s he capitalized on his interest with two magazine articles, not long after a 4,500-pound great white shark was taken off Long Island’s Montauk Point. A few years later he was assigned to do a piece about Southampton—Long Island’s tony watering place. Benchley remembers thinking, ‘My God, if that kind of thing can happen around the beaches of Long Island, and I know Southampton, why not put the two together.’

The star attraction of Benchley’s book is the marauding monster whose savage attacks Benchley describes with horrifying clarity. On the fate of a child snatched from a raft, he writes: ‘Nearly half the fish had come clear of the water, and it slid forward and down in a belly-flopping motion, grinding the mass of flesh and bone and rubber. The boy’s legs were severed at the hips, and they sank, spinning slowly, to the bottom.'”


“I wrote a novel about a great white shark”:


Technology Review has published “On Creativity,” a 1959 essay by Isaac Asimov that has never previously run anywhere. The opening: 

“How do people get new ideas?

Presumably, the process of creativity, whatever it is, is essentially the same in all its branches and varieties, so that the evolution of a new art form, a new gadget, a new scientific principle, all involve common factors. We are most interested in the ‘creation’ of a new scientific principle or a new application of an old one, but we can be general here.

One way of investigating the problem is to consider the great ideas of the past and see just how they were generated. Unfortunately, the method of generation is never clear even to the ‘generators’ themselves.

But what if the same earth-shaking idea occurred to two men, simultaneously and independently? Perhaps, the common factors involved would be illuminating. Consider the theory of evolution by natural selection, independently created by Charles Darwin and Alfred Wallace.

There is a great deal in common there. Both traveled to far places, observing strange species of plants and animals and the manner in which they varied from place to place. Both were keenly interested in finding an explanation for this, and both failed until each happened to read Malthus’s ‘Essay on Population.’

Both then saw how the notion of overpopulation and weeding out (which Malthus had applied to human beings) would fit into the doctrine of evolution by natural selection (if applied to species generally).

Obviously, then, what is needed is not only people with a good background in a particular field, but also people capable of making a connection between item 1 and item 2 which might not ordinarily seem connected.

Undoubtedly in the first half of the 19th century, a great many naturalists had studied the manner in which species were differentiated among themselves. A great many people had read Malthus. Perhaps some both studied species and read Malthus. But what you needed was someone who studied species, read Malthus, and had the ability to make a cross-connection.

That is the crucial point that is the rare characteristic that must be found.”


Kate Greene, who wrote an Aeon essay about living on “Mars” in a Hawaiian simulation, has a brief piece in Wired about the domed habitat designed to keep the participants sane during the next 50th state “space mission.” The opening:

“I’d always wanted to visit Mars. Instead I got Hawaii. There, about 8,200 feet above sea level on Mauna Loa, sits a geodesically domed habitat for testing crew psychology and technologies for boldly going. I did a four-month tour at the NASA-funded HI-SEAS—that’s Hawaii Space Exploration Analog and Simulation—in 2013, and a new 8-month mission is scheduled to start in October. It’s a long time to be cooped up, ‘so the psychological impacts are extremely important,’ habitat designer Vincent Paul Ponthieux says. The key to keeping everybody sane? A sense of airiness. Yep—even on Mars, you’re going to need more space.”

Tags: ,

While it’s not as serious a suck on California’s water supply as swimming pools, illegal marijuana farming is part of the problem. From Pilita Clark in the Financial Times:

“Jerry Brown, California’s governor, declared a state of emergency in January after the driest year on record in 2013, but as the annual wet season beckons, the prospect of a complete drought recovery this winter is highly unlikely, government officials say.

‘Marijuana cultivation is the biggest drought-related crime we’re facing right now,’ says Lt Nores as he pokes at a heap of plastic piping the growers used to divert water from a dried-up creek near the plantation.

But California’s drought is exposing a series of problems in the US’s most populous state that are a reminder of an adage popularised by Michael Kinsley, the columnist: the scandal is often not what is illegal but what is legal.

Growing competition

The theft of 80m gallons of water a day by heavily armed marijuana cartels is undoubtedly a serious concern, not least when the entire state is affected by drought and 58 per cent is categorised as being in ‘exceptional drought,’ as defined by the government-funded US Drought Monitor.

However, this is a tiny fraction of the water used legally every day in a state that, like so many other parts of the world, has a swelling population driving rising competition for more heavily regulated supplies that have long been taken for granted and may face added risks as the climate changes.”

Tags: , ,

While the number of U.S. citizens who’ve died on American soil from Ebola remains constant, Liberia is truly in the grip of a deadly epidemic. It’s a horror of human suffering and loss, and it must be surreal for a nation that was just starting to rise economically. Marcus DiPaola, a freelance journalist who just returned from reporting in Liberia, did an Ask Me Anything at Reddit about this modern plague. A few exchanges follow.



Are people optimistic that Ebola will be defeated?

Marcus DiPaola:

Optimistic is… not the right word. They’re expecting it to be defeated. You gotta realize how much public knowledge there is about the outbreak here. NO ONE is shaking hands. NO ONE is touching each other. All the radio stations play Ebola songs.



Is it as bad as the media is making it out to be?

Marcus DiPaola:

I am a member of the media, and I think it’s pretty damn bad.

Locals treat every dead body with suspicion now. A crowd gathered in Central Monrovia after an alleged thief jumped into the river and died. Bystanders said he was not a suspected Ebola case, but many people were unwilling to take the risk of pulling out the body. As a result, the Liberian Red Cross Ebola burial team was called and arrived in hazmat suits to collect the body.

Cellcom Liberia, one of the country’s largest cell phone providers, has a worker checking temperatures before shoppers are even allowed to enter its parking lot. The worker writes the temperature on a name tag, which is checked by security guards as shoppers enter the parking lot, again as they line up to enter the store, and again at the door.

Many restaurants, hotels, banks, and stores have hand-washing stations installed and require you to wash your hands before entering. Some businesses’ attempts fall short of the goal, as many of the buckets contain plain water, with no soap available or chlorine mixed in.



Do you personally think things will continue to escalate? Or does it seem like things are more under control than the American media is making them out to be?

Marcus DiPaola:

No, I think things are escalating right now. The burial teams are working flat out, and there are about 10 of them. I think the numbers are being under-reported, especially by the Liberian government.



Should America be worried about Ebola?

Marcus DiPaola:

Yes. But not in the way you might think… it’s going to keep coming here, I have no doubt of it, but I’m willing to bet we’ll never see a full-blown outbreak. The experts I was with made it clear to me that you really really really have to work at it to get it as a non-caretaker. You have to touch bodily fluids. Fomites can theoretically do it, but everyone there kinda didn’t really worry about that, and they’ve been doing this since March.•


Chicago plans to make many of its streetlights “smart,” capable of recording data almost continuously and of moderating the flow of auto and pedestrian traffic and other currently fixed processes, something that will likely become standard on American streets before long. From Liz Stinson at Wired:

“There you are, standing on a street corner surrounded by a mob of people waiting for the walk signal. In front of you, a single car gets the green light. Again. For all the talk of smart cities, they can be infuriatingly dumb at times. But imagine if your city could monitor the flow of pedestrians and optimize its traffic signals for walkers, not drivers? That’s exactly what Chicago is looking to do.

Later this fall, the Windy City will install a network of 40 sensor nodes on light poles at the University of Chicago, the School of the Art Institute of Chicago and Argonne National Laboratory. The goal is to eventually expand the system to 1,000 sensors (enough to cover the Chicago Loop) over the next few years. Spearheaded by the University of Chicago’s Urban Center for Computation and Data, it’s called the Array of Things initiative, and the goal is to gather an unprecedented set of ambient data to help government officials and residents understand how their city ticks so they can make it a happier, healthier, and smarter place to live.

Every 15 seconds these sensors will gather information like temperature, humidity, carbon monoxide, vibrations, light, and sound—pretty standard stuff.”


Not having a television makes it easier to live outside the idiot broadcast and cable news culture populated by loudmouths like Sean Hannity, who looks like the equipment manager of a lacrosse team that’s been suspended for hazing, and Chuck Fucking Todd, who announced that the President’s administration was “over” when there were still three full years to go. The important work continues apace despite what that sputtering Van Dyke says. The opening of “Obama’s Moonshot Probes the Space Inside Our Skulls,” Anjana Ahuja’s Financial Times piece:

“Every president needs a blockbuster science project to crown his time in the White House. While John Kennedy reached for the stars, Barack Obama decided to reach for the synapses. A year on from his pledge to unravel the secrets of the human brain, researchers are beginning to spell out how they will tackle this challenge.

The breathtaking implications of the research are also becoming clear; within a generation science may acquire the power to predict a person’s future capacities and possibly determine their life chances. The central idea of Mr Obama’s Brain Research through Advancing Innovative Neurotechnologies Initiative is to understand how the human brain choreographs all the astonishing feats it pulls off. It is the seat of learning and the organ of memory. It generates our character and identity, and influences our behaviour. It is our brains, more than anything else, that make us who we are today – and shape who we will become.

But that neural versatility acts as a veil. How does the squidgy, three-pound lump between your ears – filled with nerve cells (neurons) firing tiny impulses of electricity across junctions called synapses – accomplish such a wide array of functions? Research grants announced this month shed light on how Mr Obama’s vaunted cerebral exploration will proceed. One avenue will be to catalogue the differences between healthy neurons and diseased ones. This will spur research into neurodegenerative diseases, such as Alzheimer’s and Parkinson’s, which cast a shadow over ageing societies.”

Tags: , , ,

Walter Isaacson is regarded, with some validity, as a proponent of the Great Man Theory, which is why Steve Jobs, with no shortage of hubris, asked him to be his biographer. Albert Einstein and Ben Franklin and me, he thought. Jobs, who deserves massive recognition for the cleverness of his creations, was also known as a guy who sometimes took credit for the work of others, and he sold his author some lines. Bright guy that he is, though, Isaacson knows the new technologies and their applications have many parents, and he’s cast his net wider in The Innovators. An excerpt from his just-published book, via The Daily Beast, in which he describes the evolution of Wikipedia’s hive mind:

“One month after Wikipedia’s launch, it had a thousand articles, approximately seventy times the number that Nupedia had after a full year. By September 2001, after eight months in existence, it had ten thousand articles. That month, when the September 11 attacks occurred, Wikipedia showed its nimbleness and usefulness; contribu­tors scrambled to create new pieces on such topics as the World Trade Center and its architect. A year after that, the article total reached forty thousand, more than were in the World Book that [Jimmy] Wales’s mother had bought. By March 2003 the number of articles in the English-language edition had reached 100,000, with close to five hundred ac­tive editors working almost every day. At that point, Wales decided to shut Nupedia down.

By then [Larry] Sanger had been gone for a year. Wales had let him go. They had increasingly clashed on fundamental issues, such as Sanger’s desire to give more deference to experts and scholars. In Wales’s view, ‘people who expect deference because they have a Ph.D. and don’t want to deal with ordinary people tend to be annoying.’ Sanger felt, to the contrary, that it was the nonacademic masses who tended to be annoying. ‘As a community, Wikipedia lacks the habit or tra­dition of respect for expertise,’ he wrote in a New Year’s Eve 2004 manifesto that was one of many attacks he leveled after he left. ‘A policy that I attempted to institute in Wikipedia’s first year, but for which I did not muster adequate support, was the policy of respect­ing and deferring politely to experts.’ Sanger’s elitism was rejected not only by Wales but by the Wikipedia community. ‘Consequently, nearly everyone with much expertise but little patience will avoid ed­iting Wikipedia,’ Sanger lamented.

Sanger turned out to be wrong. The uncredentialed crowd did not run off the experts. Instead the crowd itself became the expert, and the experts became part of the crowd. Early in Wikipedia’s devel­opment, I was researching a book about Albert Einstein and I noticed that the Wikipedia entry on him claimed that he had traveled to Al­bania in 1935 so that King Zog could help him escape the Nazis by getting him a visa to the United States. This was completely untrue, even though the passage included citations to obscure Albanian websites where this was proudly proclaimed, usually based on some third-hand series of recollections about what someone’s uncle once said a friend had told him. Using both my real name and a Wikipedia han­dle, I deleted the assertion from the article, only to watch it reappear. On the discussion page, I provided sources for where Einstein actu­ally was during the time in question (Princeton) and what passport he was using (Swiss). But tenacious Albanian partisans kept reinserting the claim. The Einstein-in-Albania tug-of-war lasted weeks. I became worried that the obstinacy of a few passionate advocates could under­mine Wikipedia’s reliance on the wisdom of crowds. But after a while, the edit wars ended, and the article no longer had Einstein going to Albania. At first I didn’t credit that success to the wisdom of crowds, since the push for a fix had come from me and not from the crowd. Then I realized that I, like thousands of others, was in fact a part of the crowd, occasionally adding a tiny bit to its wisdom.”

Tags: , ,


Following up on Franklin Foer’s New Republic call to arms about Amazon’s price-setting power, here’s an excerpt from Paul Krugman’s balanced look in the New York Times at the robber baron of books:

“Does Amazon really have robber-baron-type market power? When it comes to books, definitely. Amazon overwhelmingly dominates online book sales, with a market share comparable to Standard Oil’s share of the refined oil market when it was broken up in 1911. Even if you look at total book sales, Amazon is by far the largest player.

So far Amazon has not tried to exploit consumers. In fact, it has systematically kept prices low, to reinforce its dominance. What it has done, instead, is use its market power to put a squeeze on publishers, in effect driving down the prices it pays for books — hence the fight with Hachette. In economics jargon, Amazon is not, at least so far, acting like a monopolist, a dominant seller with the power to raise prices. Instead, it is acting as a monopsonist, a dominant buyer with the power to push prices down.

And on that front its power is really immense — in fact, even greater than the market share numbers indicate. Book sales depend crucially on buzz and word of mouth (which is why authors are often sent on grueling book tours); you buy a book because you’ve heard about it, because other people are reading it, because it’s a topic of conversation, because it’s made the best-seller list. And what Amazon possesses is the power to kill the buzz. It’s definitely possible, with some extra effort, to buy a book you’ve heard about even if Amazon doesn’t carry it — but if Amazon doesn’t carry that book, you’re much less likely to hear about it in the first place.

So can we trust Amazon not to abuse that power?”


During the insane time in California in the 1960s and early 1970s, when motorcycle gangs dropped LSD and a psychedelic pop band named itself “The Peanut Butter Conspiracy,” Joan Didion was the poet laureate of the Lost Children of the West Coast, runaways who’d run smack into a strange moment in America when all the clocks were broken. But Renata Adler also took a pretty fair shot at that title with her 1967 New Yorker article “Fly Trans-Love Airways” (gated), collected in her subsequent book Toward a Radical Middle, which examined the tensions between hippie kids and law enforcement on the Sunset Strip. A passage in which the journalist tried to make sense of it all:

“Some middle-hairs who were previously uncommitted made their choice–and thereby made more acute a division that had already existed between them. At Palisades High School, in a high-income suburb of Los Angeles, members of the football team shaved their heads by way of counter-protest to the incursions of the longhairs. The longhairs, meanwhile, withdrew from the competitive life of what they refer to as the Yahoos–sports, grades, class elections, popularity contests–to devote themselves to music, poetry, and contemplation. It is not unlikely that a prosperous, more automated economy will make it possible for this split to persist into adult life: the Yahoos, an essentially military model, occupying jobs; the longhairs, on an artistic model, devising ways of spending leisure time. At the moment, however, there is a growing fringe of waifs, vaguely committed to a moral drift that emerged for them from the confrontations on the Strip and from the general climate of the events. The drift is Love; and the word, as it is now used among the teen-agers of California (and as it appears in the lyrics of their songs), embodies dreams of sexual liberation, sweetness, peace on earth, equality–and, strangely, drugs.

The way drugs came into Love seems to be this: As the waifs abandoned the social mystique of their elders (work, repression, the power struggle), they looked for new magic and new mysteries. And the prophets of chemical insight, who claimed the same devotion to Love and the same lack of interest in the power struggle as the waifs, were only too glad to supply them. Allen Ginsberg, in an article entitled ‘Renaissance or Die,’ which appeared in the Los Angeles Free Press (a local New Left newspaper) last December, urged that ‘…everybody who hears my voice, directly or indirectly, try the chemical LSD at least once, every man, woman, and child American in good health over the age of fourteen.’ Richard Alpert (the former psychedelic teammate of Timothy Leary), in an article in Oracle (a newspaper of the hallucinogenic set), promised, ‘In about seven or eight years the psychedelic population of the United States will be able to vote anybody into office they want to, right?’ The new waifs, who, like many others in an age of ambiguities, are drawn to any expression of certainty or confidence, any semblance of vitality or inner happiness, have, under pressure and on the strength of such promises, gradually dropped out, in the Leary sense, to the point where they are economically unfit, devoutly bent on powerlessness, and where they can be used. They are used by the Left and the drug cultists to swell their ranks. They are used by the politicians of the Right to attack the Left. And they are used by their more conventional peers just to brighten the landscape and slow down the race a little.
The waifs drift about the centers of longhair activism, proselytizing for LSD and Methedrine (with arguments only slightly more extreme than the ones liberals use on behalf of fluoridation), and there is a strong possibility that although they speak of ruling the world with Love, they will simply vanish, like the children of the Children’s Crusade, leaving just a trace of color and gentleness in their wake.”


The opening of Dwight Garner’s lively New York Times Book Review piece about the latest volume, a career summation of sorts, by Edward O. Wilson, a biologist who has watched ants have sex:

“The best natural scientists, when they aren’t busy filling us with awe, are busy reminding us how small and pointless we are. Stephen Hawking has called humankind ‘just an advanced breed of monkeys on a minor planet of a very average star.’ The biologist and naturalist Edward O. Wilson, in his new book, which is modestly titled The Meaning of Human Existence, puts our pygmy planet in a different context.

‘Let me offer a metaphor,’ he says. ‘Earth relates to the universe as the second segment of the left antenna of an aphid sitting on a flower petal in a garden in Teaneck, N.J., for a few hours this afternoon.’ The Jersey aspect of that put-down really drives in the nail.

Mr. Wilson’s slim new book is a valedictory work. The author, now 85 and retired from Harvard for nearly two decades, chews over issues that have long concentrated his mind: the environment; the biological basis of our behavior; the necessity of science and humanities finding common cause; the way religion poisons almost everything; and the things we can learn from ants, about which Mr. Wilson is the world’s leading expert.

Mr. Wilson remains very clever on ants. Among the questions he is most asked, he says, is: ‘What can we learn of moral value from the ants?’ His response is pretty direct: ‘Nothing. Nothing at all can be learned from ants that our species should even consider imitating.’

He explains that while female ants do all the work, the pitiful males are merely ‘robot flying sexual missiles’ with huge genitalia. (This is not worth imitating?) During battle, they eat their injured. ‘Where we send our young men to war,’ Mr. Wilson writes, ‘ants send their old ladies.’ Ants: moral idiots.

The sections about ants remind you what a lively writer Mr. Wilson can be. This two-time winner of the Pulitzer Prize in nonfiction stands above the crowd of biology writers the way John le Carré stands above spy writers. He’s wise, learned, wicked, vivid, oracular.”

Tags: , ,

« Older entries § Newer entries »