Excerpts


William Butler Yeats famously pined for his muse, Maud Gonne, who rejected him. When her daughter, Iseult, turned 22, the now-midlife poet tried for her hand and was likewise turned away. While apparently no one in the family would fuck Yeats, Maud did have sex in the crypt of her infant son, who had died at two, believing some mystical hooey which held that the soul of the deceased boy would transmigrate into a new baby if she conceived next to his coffin. Well, okay. From Hugh Schofield at the BBC:

Actress, activist, feminist, mystic, Maud Gonne was also the muse and inspiration for the poet W B Yeats, who immortalised her in some of his most famous verses.

After the Free State was established in 1922, Maud Gonne remained a vocal figure in Irish politics and civil rights. Born in 1866, she died in Dublin in 1953.

But for many years in her youth and early adulthood, Maud Gonne lived in France.

Of this part of her life, much less is known. There is one long-secret and bizarre episode, however, that has now been established as almost certainly true.

This was the attempt in late 1893 to reincarnate her two-year-old son, through an act of sexual intercourse next to the dead infant’s coffin. …

Having inherited a large sum of money on the death of her father, she paid for a memorial chapel – the biggest in the cemetery. In a crypt beneath, the child’s coffin was laid.

In late 1893 Gonne re-contacted Lucien Millevoye, from whom she had separated after Georges’ death.

She asked him to meet her in Samois-sur-Seine. First the couple entered the small chapel, then opened the metal doors leading down to the crypt.

They descended the small metal ladder – just five or six steps. And then – next to the dead baby’s coffin – they had sexual intercourse.•


There was a time not too long ago, before the words selfie and Kardashian were household names, when Neil Hamburger, the alter-ego stand-up persona of Gregg Turkington, was even sadder than the rest of America, though we seem to have caught up. Through excruciatingly terrible jokes, he coldly points out that much of our pop culture exists merely because of how depressed and horny we are, vomiting forth the unbearable heaviness of our being. When the audience turns on his flailing, coughing, anti-comedy Pupkin-ness, Hamburger tries to manipulate mercy from them, claiming to have cancer. He isn’t feeling well, and how exactly are you and I and our chaturbating buddies?

Hamburger’s horribleness has hatched a movie which is currently at Sundance. From Matt Patches at Grantland:

In hell’s dingy comedy club, Neil Hamburger takes the stage each night, forcing audiences to confront the life they once lived. Still there are laughs — after all, this is a crowd that wound up in hell. A sample of his devilish comedy:

What’s the difference between Courtney Love and the American flag?

It would be wrong to urinate on the American flag.

For the living willing to challenge themselves, comedian Gregg Turkington tours the country as his tuxedoed alter ego Hamburger, delivering one-liners with nasally sadness. Audiences shell out to see Hamburger nose-dive with sets that would make Rupert Pupkin bite his lip. Though Turkington’s found success in his own shoes, acting in film and television, writing, and working with musicians, his weaponized jokester is the star. Neil Hamburger commands attention and remains an ever-changing creature, 20 years spent warping American pop culture with a fun-house mirror.

In the age of adaptation, any recognizable face is a movie waiting to happen. Borat got a movie. MacGruber got a movie. The Lego “minifig” got a movie. Fred Figglehorn got three nightmare-fuel movies (ask your kids). Saturday Night Live was in the character exploitation business before it was cool, churning out movies like It’s Pat, Coneheads, Stuart Saves His Family, and Wayne’s World. Despite a shtick that sends sensitive souls directly to therapy, Neil Hamburger’s day in the cinematic sun was inevitable. And now it’s here: Entertainment, a 2015 Sundance Film Festival premiere that extrapolates Turkington’s ongoing work into a bleak vessel of human failure. That Neil Hamburger show from hell? We’re already living it.•

 


Peyote has historically been central to the Native American Church meeting. In a 1967 Psychedelic Review article, Stewart Brand, Prankster and prophet, wrote of one such congregation he attended. An excerpt:

“The meeting is mandala-form, a circle with a doorway to the east. The roadman will sit opposite the door, the moon-crescent altar in front of him. To his left sits the cedarman, to his right the drummer. On the right side as you enter will be the fireman. The people sit around the circle. In the middle is the fire.

A while after dark they go in. This may be formal, filling in clockwise around the circle in order. The roadman may pray outside beforehand, asking that the place and the people and the occasion be blessed.

Beginning a meeting is as conscious and routine as a space launch countdown. At this time the fireman is busy starting the fire and seeing that things and people are in their places. The cedarman drops a little powder of cedar needles [on the fire]. … [Of the forms the peyote is taken in,] the little balls go down the quickest. In all cases, the white fluff should be removed. There is usually a pot of peyote tea, kept near the fire, which is passed occasionally during the night. Each person takes as much medicine as he wants and can ask for more at any time. Four buttons is a common start. Women usually take less than the men. Children have only a little, unless they are sick.

Everything is happening briskly at this point. People swallow and pass the peyote with minimum fuss. The drummer and roadman go right into the starting song. The roadman, kneeling on one or both knees, begins it with the rattle in his right hand. The drummer picks up the quick beat, and the roadman gently begins the song. His left hand holds the staff, a feather fan, and some sage. He sings four times, ending each section with a steady quick rattle as a signal for the drummer to pause or re-wet the drumhead before resuming the beat. Using his thumb on the drumhead, the drummer adjusts the beat of his song. When the roadman finishes he passes the staff, gourd, fan and sage to the cedarman, who sings four times with the random drumming. So it goes, the drum following the staff to the left around the circle, so each man sings and drums many times during the night.•

As a follow-up to the post which quoted former Google and current Baidu AI research scientist Andrew Ng, here’s a fuller explanation of his thoughts about the existential threat of intelligent machines, from a Backchannel interview by Caleb Garling:

Caleb Garling:

Do you see AI as a potential threat?

Andrew Ng:

I’m optimistic about the potential of AI to make lives better for hundreds of millions of people. I wouldn’t work on it if I didn’t fundamentally believe that to be true. Imagine if we can just talk to our computers and have it understand “please schedule a meeting with Bob for next week.” Or if each child could have a personalized tutor. Or if self-driving cars could save all of us hours of driving.

I think the fears about “evil killer robots” are overblown. There’s a big difference between intelligence and sentience. Our software is becoming more intelligent, but that does not imply it is about to become sentient.

The biggest problem that technology has posed for centuries is the challenge to labor. For example, there are 3.5 million truck drivers in the US, whose jobs may be affected if we ever manage to develop self-driving cars. I think we need government and business leaders to have a serious conversation about that, and think the hype about “evil killer robots” is an unnecessary distraction.•


In the wake of the 2008 economic collapse, austerity felt to many like the right thing to do: We needed to punish ourselves. But that policy was moralistic and incorrect, since what we actually needed was to borrow and spend. Is our view of labor also driven by a misplaced sense of morality? Brian Dean asks this question and others in “Antiwork,” a Contributoria essay which reconsiders the meaning of toil. An excerpt:

“Work” is seen as a virtue, but it covers the moral spectrum from charity and art to forced labour and banking. Belief in the inherent moral good of work has been used historically in social engineering, notably during the shift from agriculture to industry, when the Protestant work ethic was used to motivate workers and to justify punishment, including whipping and imprisonment of “idlers”. (In The Making of the English Working Class, historian EP Thompson describes how the ethos of Protestant sects such as Methodism effectively provided the prototype of the disciplined, punctual worker required by the factory owners.)

Work’s assumed virtue has always been about more than its utility or market value. George Lakoff, the cognitive linguist, provided a clue in the frame of “work as obedience.” The first virtue we learn as children is obeying our parents, particularly in performing tasks we don’t enjoy. Later, as adults, we’re paid to obey our employers – it’s called work. Work and virtue are thus connected in our neurology in terms of obedience to authority. That’s not the only cognitive frame we have for the virtue of work, but it’s the one that is constantly reinforced by what Lakoff calls the “strict father” conservative moral system.

This “strictness” moral framing is implicit, for example, in the current welfare system. An increasingly punitive approach is adopted towards those who don’t follow the prescribed “job-seeking” regimen – a trend that most political parties seem to approve of. Politicians boast of getting “tough on dependency culture”, and when they talk of “clamping down” on the “hardcore unemployed”, you’d think they were referring to criminals.

Emphasis on punishment is the sign of an obedience frame. Work itself has a long history as punishment for disobedience, as the Book of Genesis illustrates – Adam and Eve had no work until they disobeyed God, who imposed it as their punishment: “Cursed is the ground because of you; in toil you shall eat of it all the days of your life.” Unpaid work, or “community service,” is still sometimes dictated as punishment by courts. Workfare programmes similarly involve mandatory work without wages – it looks very much like punishment for the “sin” of unemployment.

Workfare illustrates a difference between framing and spin. The cognitive frame is paternalistic, morally strict, punishment-based (much like “community service”), while the political spin is all about “helping” people “integrate” back into society. Genuine help, of course, shouldn’t require the threat of losing what little income one has.

Morally, it seems that politicians, most of the media and a large section of the public are still stuck in the Puritan codes and scripts that, following the Reformation and into the industrial revolution, dominated social attitudes to work and idleness in England, America and much of Europe.•


I agree with two very smart people working in Artificial Intelligence, Andrew Ng and Hod Lipson, when I say that I’m not worried about any near-term scenario in which Strong AI drives Homo sapiens extinct the way we did the Neanderthals. It’s not that it’s theoretically impossible in the long run, but we would likely first need to know precisely how the human brain operates, to understand the very nature of consciousness, to give “life” to our eliminators. While lesser AI than that could certainly be dangerous on a large scale, I don’t think it’s moving us back down the food chain today or tomorrow.

But like Ng and Lipson, the explosion of Weak AI throughout society in the form of autonomous machines is very concerning to me. It’s an incredible victory of ingenuity that can become a huge loss if we aren’t able to politically reconcile free-market societies with highly autonomous ones. An excerpt from Robert Hof at Forbes’ horribly designed site:

“Historically technology has created challenges for labor,” [Ng] noted. But while previous technological revolutions also eliminated many types of jobs and created some displacement, the shift happened slowly enough to provide new opportunities to successive generations of workers. “The U.S. took 200 years to get from 98% to 2% farming employment,” he said. “Over that span of 200 years we could retrain the descendants of farmers.”

But he says the rapid pace of technological change today has changed everything. “With this technology today, that transformation might happen much faster,” he said. Self-driving cars, he suggested, could quickly put 5 million truck drivers out of work.

Retraining is a solution often suggested by the technology optimists. But Ng, who knows a little about education thanks to his cofounding of Coursera, doesn’t believe retraining can be done quickly enough. “What our educational system has never done is train many people who are alive today. Things like Coursera are our best shot, but I don’t think they’re sufficient. People in the government and academia should have serious discussions about this.”

His concerns were echoed by Hod Lipson, director of Cornell University’s Creative Machines Lab. “If AI is going to threaten humanity, it’s going to be through the fact that it does almost everything better than almost anyone,” he said.•


Carl Djerassi, the chemist credited with creating the birth-control pill and abetting the women’s movement and sexual revolution of the 1960s, just passed away. A true polymath, he was devoted to writing plays and collecting art just as much as to rewriting the rules of mating. He was later thwarted by pharmaceutical companies when he wanted to create a male pill. In a 1976 People article, Nancy Faber profiled Djerassi during his tenure as a Stanford professor and recalled his discombobulating relationship with President Nixon. An excerpt:

Stanford Professor Carl Djerassi invited some students to his house for an evening conference and two of them showed up with a gift. Not exactly an apple for the teacher. It was a box of pink condoms. Djerassi was delighted.

It was the perfect token of esteem for a well-liked faculty member who also happens to be the research chemist who developed the birth control pill. His course in human biology was examining various methods of controlling population. (The unusual gift was brought back from Kenya where the two students had gone to study birth control techniques.)

“I don’t think there is such a thing as one best method of birth control,” Djerassi tells his classes. “If the most important thing is to be 100 percent effective, then the Pill is the best we have. If you are more concerned about side effects, then a condom is a hell of a lot better.” He adds: “It is unrealistic not to expect some side effects. You get them with tobacco, alcohol and penicillin.”

The professor, 52, is not at all reluctant to plunge into the Pill controversy. At a recent campus colloquium, he heard one young woman charge: “Sure, we have control of our fertility now, but at the cost of our health. What kind of control do we really have if we have to make that kind of bargain?” After listening to Djerassi on the subject, another participant admitted: “I’m really surprised that he is so receptive to other ideas. He advocates what is called the cafeteria approach to birth control—whatever works.”

Students are often surprised to learn that Djerassi’s career is rooted in academe as well as in the drug industry. Born in Vienna in 1923, he was educated in the United States (Kenyon College and the University of Wisconsin) after he emigrated when he was 16. He had his Ph.D. by his 22nd birthday. Five years later, in 1951, as an employee of the Mexico City-based Syntex Corporation, Djerassi led the research team that synthesized the first contraceptive pill. …

Restlessly energetic even in his leisure hours, Djerassi hikes and skis despite a fused knee suffered in a skiing accident. Rather than drop either sport, Djerassi collaborated with one of his students in designing a special boot to compensate for the knee’s loss of mobility. When he travels, the professor gets a letter from airline presidents guaranteeing him an aisle seat so he can stretch out his leg.

Djerassi has accumulated an extensive art collection weighted toward pre-Columbian artifacts and an equally impressive number of honors from every corner of the scientific community. He recalls none of the testimonials as vividly as the National Medal of Science awarded him by Richard Nixon in 1973. Two weeks later Djerassi discovered his name on the notorious White House enemies list.•

 


In his New Atlantis piece, “Losing Liberty in an Age of Access,” James Poulos writes of returning to live in his former neighborhood of Downtown Los Angeles and finding a new order–and one that isn’t limited to that city’s former ghost town. He examines the modern landscape, in which we’re all connected but there are no strings attached, a rental economy elbowing aside the buying one. The Great Recession may have hastened the new normal of access over ownership, of time itself being commodified and valued over stability, but it wasn’t the driving force behind the Uberization of cosmopolitan life, a more rootless and less cumbersome thing, in which everything (and seemingly everyone) is for rent. Technology has mostly propelled the change of heart. What has been gained and what has been lost? An excerpt about the transformation that’s taken hold in DTLA:

In an age when ownership meant everything, downtown Los Angeles languished. Today, current tastes and modern technology have made access, not ownership, culturally all-important, and LA’s “historic core” is the hottest neighborhood around. Likewise, from flashy metros like San Francisco to beleaguered cities like Pittsburgh, rising generations are driving economic growth by paying to access experiences instead of buying to own.

Nationwide, the line between downsizing hipsters and upwardly mobile yuppies is blurring — an indication of potent social and economic change. America’s hipsters and yuppies seem to be making property ownership uncool. But they’re just the fashionable, visible tip of a much bigger iceberg.

Rather than a fad, the access economy has emerged organically from the customs and habits of “the cheapest generation” — as it has been dubbed in The Atlantic, the leading magazine tracking upper-middle-class cultural trends. Writers Derek Thompson and Jordan Weissman recount that, in 2010, Americans aged 21 to 34 “bought just 27 percent of all new vehicles sold in America, down from the peak of 38 percent in 1985.” From 1998 to 2008, the share of teenagers with a driver’s license dropped by more than a fourth. And it isn’t just cars and driving: Thompson and Weissman cite a 2012 paper written by a Federal Reserve economist showing that the proportion of new young homeowners during the period from 2009 to 2011 was at a level less than half that of a decade earlier. It’s not quite a stampede from ownership, but it’s close.

In part, these changes can be chalked up to the post-Great Recession economy, which has left Millennials facing bleak job prospects while carrying heavy loads of student debt. But those economic conditions have been reinforced by other incentives to create a new way of thinking among Millennials. They are more interested than previous generations in paying to use cars and houses instead of buying them outright. Buying means responsibility and risk. Renting means never being stuck with what you don’t want or can’t afford. It remains to be seen how durable these judgments will be, but they are sharpened by technology and tastes, which affect not just the purchase of big-ticket items like cars and houses but also life’s daily decisions. Ride-sharing apps like Uber and Lyft and car-sharing services like Zipcar are biting into car sales. Vacation-home apps like Airbnb have become virtual rent-sharing apps. There’s something powerfully convenient about the logic of choosing to access stuff instead of owning it. Its applications are limited only by the imagination.

That is why we are witnessing more than just a minor shift in the way Americans do business. It is a transformation. Commerce is being remade in the image of a new age. Once associated with ubiquitous private property, capitalism is becoming a game of renting access to goods and services, not purchasing them for possession.•


I can understand Slavoj Žižek looking at China and seeing capitalism stripped of democracy as an impressive beast, but the same was said of Fascism, even Nazism, in the 1930s. They were machines, many thought–even many American business leaders–which could not be stopped. Those states were driven by madmen and China is not, but perhaps there’s ultimately something antithetical to the human spirit embedded inside them all. Well, we shall see. From a recent Žižek address transcribed at Disinformation:

Well, people often ask me how can you be so stupid and still proclaim yourself a communist. What do you mean by this? Well, I always have to emphasize that, first, I am well aware that – let’s call it like this – the twentieth century’s over. Which means not only the communist solution but all the big leftist projects of the twentieth century failed. Not only Stalinist communism, although there its failure is much more paradoxical. Most of the countries where communists are still in power, like China and Vietnam – their communists in power appear to be the most efficient managers of a very wildly productive capitalism. So okay, that one failed. I think that also – and here I, in a very respectful way, disagree with your, and by your I mean American, neo-Keynesian leftists, Krugman, Stiglitz and so on – I also think that this Keynesian welfare state model is passé. In the conditions of today’s global economy it no longer works. For the welfare state to work you need a strong nation state which can impose a certain fiscal politics and so on and so on. When you have a global market it doesn’t work. And the third point, which is most problematic for my friends, the third leftist vision which is deep in the heart of all leftists that I know – this idea of critically rejecting alienated representative democracy and arguing for local grass-roots democracy, where it’s not that you just delegate to the others, your representatives, to act for you, but people immediately engage in locally managing their affairs and so on.

I think this is a nice idea as far as it goes but it’s not the solution. It’s a very limited one. And if I may be really evil here, frankly I wouldn’t like to live in a stupid society where I would have to be all the time engaged in local communitarian politics and so on and so on. My idea is to live in a society where some invisible alienated machinery takes care of things so that I can do whatever I want – watch movies, read and write philosophical books and so on. But so I’m well aware that in all its versions the radical left projects of the twentieth century came to an end, and for one decade, maybe, we were all Fukuyamaists – for the nineties. By Fukuyamaism I mean the idea that basically we found if not the best formula at least the least bad formula: liberal democratic capitalism with elements of welfare state and so on and so on. And even the left played this game. You know, we were fighting for less racism, women’s rights, gay rights, whatever tolerance. But basically we accepted the system. I think – and even Fukuyama himself is no longer a Fukuyamaist, as far as I know – that if there is a lesson of September 11 and other events, it is that, no, we don’t have the answer. That not only is liberal democratic capitalism not the universal model; it is not just a matter of slow historical progress for it to be accepted everywhere. But again, China now, Singapore and other examples of very successful economies today demonstrate that this, let’s call it ironically, eternal marriage between democracy and capitalism is coming to an end.

What we are more and more getting today is a capitalism which is brutally efficient but it no longer needs democracy for its functioning.•


The capacity for watch-computer hybrids has grown exponentially in the 20 years since the failure of the Timex Data Link, though I still don’t have any interest in the iWatch or whatever Apple will brand its forthcoming wrist-worn device. Of course, my opinion means nothing. I think everyone knows that Beats by Dre doesn’t make the best headphones on the market, but that’s mattered little. Apple products and Beats (now an Apple product) are being purchased from the U.S. to China for reasons in addition to function. Regardless of the motivations, I’m pretty confident Angela Ahrendts will make the new watch the most handsome mass-marketed wearable yet. From Rupert Neate of the Guardian, a question about the item Tim Cook hopes will expand the Apple juggernaut beyond the iPhone:

Can it afford for the Apple Watch to fail?

It has been five years since Apple launched its latest truly new product – the iPad – in 2010. To live up to its name for innovation, and diversify revenues away from reliance on the iPhone, Apple needs the Apple Watch to be an unqualified success.

Cook announced that the watch would go on sale in April, giving the company a boost in its third quarter when it will not benefit from Christmas or the Chinese new year, which will have helped the previous two quarters. “We’re making great progress in the development of it,” he said.

Apple describes the new product – often referred to as the iWatch, although it has not been officially named – as the “most personal device ever” and it is thought it will be able to monitor its wearer’s health as well as connect to an iPhone to provide several other functions. Cook said app developers had already impressed him with “some incredible innovation”.

Carolina Milanesi at Kantar Worldpanel ComTech says the watch will help Apple extend its sales into a much wider market. “They have been very smart in pushing it as jewellery and design rather than how technologically smart it is,” she says. “They are concentrating more on impressing the design and fashion world than the tech bloggers.

“I think this will be a much more irrational buy than with an iPad. With an iPad you wanted an iPad: this is going to be more of a fashion statement.”

She said the launch would benefit from the fashion and marketing skills of Angela Ahrendts, the former Burberry boss Apple hired last year on a $73m pay package as its head of retail.

Apple poached a string of big names from fashion and design to join its watch team, including Patrick Pruniaux, former vice-president of sales at Tag Heuer and former Yves Saint Laurent boss Paul Deneve, who is now Apple’s “vice president of special projects.”•


As Thanksgiving is a relaxation of violent impulses (not including those aimed at turkeys, of course), Super Bowl Sunday, that other great American holiday, is an orgy of them. What will become of the game now that parents know that they’re inviting brain injuries on their children if they let them play? There’s no helmet that can protect against concussions, since it’s mostly an injury of whiplash, the brain sloshing around inside the skull. Will the pipeline of talent run dry even as the league is at its financial zenith? Cricket, once a hugely popular game in America, disappeared in just about two decades. Organized football, much wealthier and more powerful, won’t vanish, but will it decline in the coming decades? From the Economist:

The NFL players’ union says that the average length of a professional career is just under three and a half years. Watching a big hit on a player now comes with the same twinge of guilt as watching clips of Muhammad Ali being pummelled. Though high-school players are less likely to suffer brain damage, some school teams were forced to end their seasons early last year because so many children had been injured. Almost half of parents say they would not allow their sons to play the game, a feeling shared by Barack Obama. Nor is it easy to see how the rules could be changed to reduce the risk of brain damage in the professional game to an acceptable level.

Yet the sport will not continue to be both as popular as it is now and as dangerous. Those who dismiss football-bashers like Malcolm Gladwell, who compared the sport to dog-fighting in the New Yorker, as elitist east-coast types should remember that football began as a form of organised riot on the campuses of elitist east-coast colleges. Changes in taste can trickle down as well as bubble up. During the second half of the 20th century boxing went from being a sport watched together by fathers and sons to something that dwells among the hookers and slot machines of Nevada. Hollywood’s output of Westerns peaked in the late 1960s, after which the appeal of spending a couple of hours watching tight-lipped gunslingers in pursuit of an ethnic minority waned. Football will go the same way.•

Via the beautiful 3 Quarks Daily, I came across psychiatrist Darold Treffert’s Scientific American post about a priori knowledge, which we know exists because of savants who bring talents already formed to the world. And we all likely have such gifts of genetic memory, dormant though they usually are. An excerpt:

Whether called genetic, ancestral or racial memory, or intuitions or congenital gifts, the concept of a genetic transmission of sophisticated knowledge well beyond instincts, is necessary to explain how prodigious savants can know things they never learned.

We tend to think of ourselves as being born with a magnificent and intricate piece of organic machinery (“hardware”) we call the brain, along with a massive but blank hard drive (memory). What we become, it is commonly believed, is an accumulation and culmination of our continuous learning and life experiences, which are added one by one to memory. But the prodigious savant apparently comes already programmed with a vast amount of innate skill and knowledge in his or her area of expertise – factory-installed “software,” one might say – which accounts for the extraordinary abilities over which the savant innately shows mastery in the face of often massive cognitive and other learning handicaps. It is an area of memory function worthy of much more exploration and study.

Indeed recent cases of “acquired savants” or “accidental genius” have convinced me that we all have such factory-installed software. I discussed some of those cases in detail in the August issue of Scientific American under the title “Accidental Genius”. In short, certain persons, after head injury or disease, show explosive and sometimes prodigious musical, art or mathematical ability, which lies dormant until released by a process of recruitment of still intact and uninjured brain areas, rewiring to those newly recruited areas and releasing the until then latent capacity contained therein.

Finally, the animal kingdom provides ample examples of complex inherited capacities beyond physical characteristics. Monarch butterflies each year make a 2,500-mile journey from Canada to a small plot of land in Mexico where they winter. In spring they begin the long journey back north, but it takes three generations to do so. So no butterfly making the return journey has flown that entire route before. How do they “know” a route they never learned? It has to be an inherited GPS-like software, not a learned route.•


There’s good stuff in James B. Stewart’s New York Times piece “How, And Why, Apple Overtook Microsoft,” though it oversimplifies the reasons for the heavenly resuscitation of Jobs’ near-dead company and the purgatory Bill Gates’ once-mighty empire is now experiencing. In one passage, it reduces the reversal of fortunes to a “vision thing,” making it seem as if Gates was taken unawares by a mobile-dominated future. Oh, Gates knew. From his 1995 book The Road Ahead:

What do you carry on your person now? Probably at least keys, identification, money, and a watch. And maybe credit cards, a checkbook, traveler’s checks, an address book, an appointment book, a notepad, something to read, a camera, a pocket tape recorder, a cellular phone, a pager, concert tickets, a map, a compass, a calculator, an electronic entry card, photographs, and maybe a loud whistle to call for help.

You’ll be able to keep equivalent necessities — and more — in an information appliance I call the wallet PC. It will be about the same size as a wallet, which means you’ll be able to carry it in your pocket or purse. It will display messages and schedules and let you read or send electronic mail and faxes, monitor weather and stock reports, and play both simple and sophisticated games. At a meeting, you might take notes, check your appointments, browse information if you’re bored, or choose from among thousands of easy-to-call-up photos of your kids.•

The real distinction between the companies wasn’t vision but execution. Microsoft was too huge to pivot, though Apple might have won even if its rival wasn’t “encumbered” by success. From Stewart:

The most successful companies need a vision, and both Apple and Microsoft have one. But Apple’s was more radical and, as it turns out, more farsighted. Microsoft foresaw a computer on every person’s desk, a radical idea when IBM mainframes took up entire rooms. But Apple went a big step further: Its vision was a computer in every pocket. That computer also just happened to be a phone, the most ubiquitous consumer device in the world. Apple ended up disrupting two huge markets.

“Apple has been very visionary in creating and expanding significant new consumer electronics categories,” [Bernstein analyst Toni] Sacconaghi said. “Unique, disruptive innovation is really hard to do. Doing it multiple times, as Apple has, is extremely difficult. It’s the equivalent of Pixar producing one hit after another. You have to give kudos to Apple.”

Walter Isaacson, who interviewed Mr. Jobs for his biography of the Apple co-founder and chief executive, said: “Steve believed the world was going mobile, and he was right. And he believed that beauty matters. He was deeply moved by beautiful design. Objects of great functionality also had to be objects of desire.”•


Demis Hassabis, the Google DeepMind founder recently interviewed by Steven Levy, is also queried by Murad Ahmed in the Financial Times. He argues what I suspect to be true: Machine consciousness isn’t anywhere on the horizon, though it’s not theoretically impossible. An excerpt:

A modern polymath, the 38-year-old’s career has already included spells as a child chess prodigy, master computer programmer, video games designer and neuroscientist. Four years ago, these experiences led him to start DeepMind, an AI company that, he says, has the aim of making “machines smart.”

For some, this is a utopic idea — a world aided by super-smart digital assistants working to solve humanity’s most pressing problems, from disease to climate change. Others warn of a grim Armageddon, with cognisant robots becoming all too aware of human limitations, then moving to crush their dumb creators without emotion.

Hassabis, wearing a figure-hugging black top and dark-rimmed glasses, blends in at Hakkasan, where the decor is mostly black and the lighting minimal. He tells me he knows the place well — it’s where he took executives from Google, during a series of meetings that led to the search giant paying £400m for his fledgling company a year ago. Google is betting Hassabis may be able to unlock the secrets of the mind.

“It’s quite possible there are unique things about humans,” he argues. “But, in terms of intelligence, it doesn’t seem likely. With the brain, there isn’t anything non-computable.” In other words, the brain is a computer like any other and can, therefore, be recreated. Traits previously considered innate to humans — imagination, creativity, even consciousness — may just be the equivalent of software programs. …

Hassabis argues that we’re getting ahead of ourselves. “It’s very, very far in the future from the kinds of things we’re currently dealing with, which is playing Pong on Atari,” he says. “I think the next four, five, 10 years, we’ll have a lot more information about what these systems do, what kind of computations they’re creating, how to specify the right goals. At the moment, these are science fiction stories. Yes, there’s no doubt that AI is going to be a hugely powerful technology. That’s why I work on it. It has the power to provide incredible advances for humanity.”

Too soon then, to be worrying about how to wage war with a sentient robot army? “In our research programme, there isn’t anything that says ‘program consciousness,’ ” he says.•


Got my hands on an early copy of Yuval Noah Harari’s Sapiens: A Brief History of Humankind yesterday, and I haven’t been able to put it down. The ideas are many, rich and often contrarian. You might not anticipate a book with that title being a page-turner, but it definitely is. Its drop date in the U.S. is February 10, and I highly recommend it. One brief passage from the opening section:

One of the most common uses of early stone tools was to crack open bones in order to get to the marrow. Some researchers believe this was our original niche. Just as woodpeckers specialise in extracting insects from the trunks of trees, the first humans specialised in extracting marrow from bones. Why marrow? Well, suppose you observe a pride of lions take down and devour a giraffe. You wait patiently until they’re done. But it’s still not your turn because first the hyenas and jackals – and you don’t dare interfere with them – scavenge the leftovers. Only then would you and your band dare approach the carcass, look cautiously left and right – and dig into the only edible tissue that remained.

This is a key to understanding our history and psychology. The position of humans in the food chain was, until quite recently, solidly in the middle. It was only in the last 100,000 years – with the rise of Homo sapiens – that man jumped to the top of the food chain.

That spectacular leap had enormous consequences. Other animals at the top of the pyramid, such as lions and sharks, evolved into that position very gradually, over millions of years. This enabled the ecosystem to develop checks and balances that prevent lions and sharks from wreaking too much havoc. As lions became deadlier, so gazelles evolved to run faster. In contrast, humankind ascended to the top so quickly that the ecosystem was not given time to adjust. Moreover, humans themselves failed to adjust. Most top predators of the planet are majestic creatures. Millions of years of dominion have filled them with self-confidence. Sapiens by contrast is more like a banana republic dictator. Having so recently been one of the underdogs of the savannah, we are full of fears and anxieties over our position, which makes us doubly cruel and dangerous. Many historical calamities, from deadly wars to ecological catastrophes, have resulted from this over-hasty jump.•

 


I don’t want to be around anyone who’s an asshole, male or female, but it’s clear that it’s mostly men who can get away with such poor behavior, even be celebrated for it. For proof, look no further than the technology sector. On that topic, an excerpt from “The Difference Machine,” some of Molly Lambert’s typically excellent thinking and writing at Grantland:

When computing was considered drudgery, women played a significant role. They were hired to be “human computors” who carried out math problems and solved equations before machines that could do so existed. During World War II, women were drafted into the Electronic Numerical Integrator and Computer program, where they worked as human “computers.” The women of ENIAC — including Betty Jean Jennings, Kay McNulty, Betty Snyder, Ruth Lichterman, Fran Bilas, and Marlyn Wescoff — were drafted into service as programmers. Snyder wrote SORT/MERGE, the first generative programming system. The women of ENIAC did much of the work but received little credit for it; the Army downplayed their involvement. Once programming became seen as a creative art rather than a rote secretarial one, women were not as welcome. (The Innovators also covers the women of ENIAC in detail, and discusses exactly how programming evolved from being seen as rote flip-switching to an intellectual endeavor.)

Women in tech today are taking a more direct approach to confronting issues of gender inequality. Rooting out the exact causes and conspirators who keep women on tech’s sidelines is difficult, because most forms of prejudice are deeply ingrained and subtly enforced. The solution, at least in part, may come from increasing the visibility of the issues. Tracy Chou, a Pinterest programmer and rising star in tech, has begun asking companies to release the data on their own internal makeup so that it can be tracked. The dismal statistics — women making up 17 percent of the workforce in technology- or engineering-related jobs at Google, 15 percent at Facebook, 9 percent at Mozilla — demonstrate that female engineers and programmers who felt alienated and underrepresented were not imagining things. To combat the concept of the tech bro, there must be a tech sisterhood. Tech history is not a chain of command, it’s a crazy quilt — no machine is ever really built by one person alone. It would be a mistake to consider Ada Lovelace and Grace Hopper as just lone geniuses — the same way it is a mistake to think that way of the men.•


A neurophysiological researcher at Yale, Colleen McCullough turned to writing at 37 as a second career and made it her first, producing, with The Thorn Birds, a book about illicit love between a married woman and a priest, a career-defining success. Where did a story of such forbidden passion come from? Well, she was the daughter of a bigamist who had at least three wives at the same time. Listen, as an author she wasn’t Carson McCullers, but she didn’t need to be: Her heart was its own kind of lonely hunter. From her New York Times obituary, penned by the excellent Margalit Fox:

On a typical day, Ms. McCullough said, she might produce 15,000 words; on a very good day, 30,000. Her facility was all the more noteworthy in that she continued to use an electric typewriter well into the computer age.

“I spell perfectly,” she told The Inquirer in the 1996 article. “My grammar’s very good. My sentence construction is excellent. So I don’t have a lot of mistakes.” …

As a girl, Ms. McCullough dreamed of becoming a doctor. She entered medical school at the University of Sydney but was forced to abandon her studies after she developed a severe allergy to the soap widely used in Australian hospitals. She trained instead in neurophysiology, which is concerned with testing for and diagnosing neuromuscular diseases.

In the late 1960s, after working at Great Ormond Street Hospital in London, Ms. McCullough accepted a position as a neurophysiological research assistant at the Yale School of Medicine. Discovering that she was being paid less than her male colleagues there, she cast about for another source of income.

“I loved being a neurophysiologist, but I didn’t want to be a 70-year old spinster in a cold-water walk-up flat with one 60-watt light bulb, which is what I could see as my future,” she told The California Literary Review in 2007.

Interested in writing since girlhood, she took to her typewriter.•


I’ll always remember what that staunch supporter of meritocracy Charles Murray replied when asked by the New York Times Magazine in 2008 about a certain Wasilla-based Republican:

NYT: What do you think of Sarah Palin?

Charles Murray: I’m in love. Truly and deeply in love.

All because she stood up at a convention and read a speech someone else wrote that was full of lies. Can you imagine if the Obamas had behaved like the Palins for the past six years, what odious theories Murray would have espoused? 

At the Daily Beast, Matt Lewis, apparently the last person in America to get the memo that even staunch conservatives have long disdained the adult baby who (briefly) governed Alaska, belatedly announces the love affair is over. You don’t say? It’s a perplexing missive from deep inside an echo chamber. The opening:

Has conservative genuflection at the altar of Sarah Palin finally come to a halt?

In case you missed it, her speech in Iowa this week was not well received on the right. The Washington Examiner’s Byron York called it a “long, rambling, and at times barely coherent speech” and National Review’s Charles C.W. Cooke said she slipped into self-parody. And there’s more. The Examiner’s Eddie Scarry, for example, contacted several conservative bloggers who were once Palin fans, but have since moved on.

But here’s my question… what changed?

Yes, in 2008, Sarah Palin delivered one of the finest convention speeches I’ve ever heard (trust me, I was there), but she hasn’t exactly been channeling Winston Churchill ever since. Remember her big speech at CPAC a couple of years ago? You know, the one where she took a swig out of a Big Gulp and said of her husband Todd: “He’s got the rifle, I got the rack.” Not exactly a great moment in political rhetoric.

So why is anyone surprised when, this weekend, she said: “‘The Man,’ can only ride ya when your back is bent?”

Demosthenes, she is not, but there’s nothing new about Palin’s penchant for populism or lowbrow rhetoric. What does feel new is that she has finally gotten around to roundly losing conservative opinion leaders. (OK, this has been a long time coming. In 2011, Conor Friedersdorf noted that the hard right was skewering Palin, and that Kathleen Parker had been vindicated. And as recently as this past April, I wondered whether it was finally safe for conservatives to criticize her publicly. But it does feel like we have finally reached a tipping point where criticizing Palin isn’t only acceptable for conservative opinion leaders, it’s now almost expected.)•


Will the survival of life as we know it on Earth become affordable before it’s too late? When will extinction avoidance achieve its price point? From Chris Mooney of the Washington Post:

America is a nation of pavement. According to research conducted by the Lawrence Berkeley National Laboratory, most cities’ surfaces are 35 to 50 percent composed of the stuff. And 40 percent of that pavement is parking lots. That has a large effect: Asphalt and concrete absorb the sun’s energy, retaining heat — and contributing to the “urban heat island effect,” in which cities are hotter than the surrounding areas.

So what if there were a way to cut down on that heat, cool down the cars that park in these lots, power up those parked cars that are electric vehicles (like Teslas), and generate a lot of energy to boot? It sounds great, and there is actually a technology that does all of this — solar carports.

It’s just what it sounds like — covering up a parking lot with solar panels, which are elevated above the ground so that cars park in the shade beneath a canopy of photovoltaics. Depending of course on the size of the array, you can generate a lot of power. For instance, one vast solar carport installation at Rutgers University is 28 acres in size and produces 8 megawatts of power, or about enough energy to power 1,000 homes.

Solar carports have many benefits, ranging from aesthetics (yes, the things look very cool) to subtler factors. Like this: Not having to return to a hot car after spending three hours at the mall or a sporting event in the summer. In fact, according to the Environmental Protection Agency and Department of Energy, being able to park in the shade in the summer is actually a substantial contributor to increased vehicle fuel efficiency, because it saves having to cool your car back up by cranking the air conditioner.

So what’s the downside here? And why aren’t solar parking lots to be found pretty much everywhere you turn?

In a word, the problem is cost.•


A thing that worries me about Americans right now–and people all around the world, really–is the surfeit of ego: how dearly people need to be respected or else, how little work we’ve done on ourselves internally, leaving us unable to be happy from within about who we are. If you need mass approval, you need too much. The Internet has opened up the media to all, which is wonderful and egalitarian, but it has simultaneously opened a Pandora’s box. Not to say that the new platforms are responsible for any type of shocking violence, but there exists beneath them a scary undercurrent; a tool can be a weapon depending on how you swing it. In yet another great Aeon essay, “Running Amok,” Joseph Pierre considers the trigger effect behind mass shootings in the U.S., which he believes are provoked by more than mental illness or guns or video games. The opening:

A movie theatre in Aurora, Colorado. Sandy Hook Elementary School in Connecticut. The Washington Navy Yard. The college town of Isla Vista, California. Most of us recognise these as US sites of recent mass murder, loosely defined as the intentional killing of more than four people in a single incident. Unlike the casualties of war or gang-related murders carried out in inner cities, these acts of domestic terrorism strike a particular brand of fear in the hearts of Americans because they seem to be random acts committed in places where such behaviour is unexpected.

Naturally, we respond by trying to pinpoint the cause: bad parenting, mental illness, guns, video games, the media, heavy metal music, or just plain evil. Once some ‘other’ is identified as an offending agent, we set up a kind of quarantine so that it can be banished from society and no longer threaten. Hoping to allay fears and respond to emotionally charged demands for action, politicians jump on this or that bandwagon with proposals for legislation aimed at sequestering and eliminating would-be culprits. Then we go about our lives, until the next mass shooting occurs and the cycle is repeated.

In the short term, this process makes us feel safer than looking inward and thinking: ‘There but for the grace of God go I.’ But what if the reality is that the underlying cause of mass murder lies not in something external to ourselves, but rather something at the root of human instinct and behaviour that’s also interwoven into American popular culture? This possibility suggests that, rather than trying to get rid of some offending external agent, a more meaningful approach might require looking within ourselves and our own communities for a solution.

In support of this idea, James Fox and Monica DeLateur, criminologists at Northeastern University, published a paper last year in Homicide Studies that dispels some myths about mass shootings and calls into question our tendency to blame things outside of ourselves. To begin with, the authors note that ‘mass shootings have not increased in number or death toll, at least not over the past several decades’. They then go on to demythologise a number of common assumptions about mass shootings. Contrary to popular opinion after the Columbine High School massacre in 1999, in which two schoolboys murdered 12 fellow students and a teacher, violent entertainment doesn’t seem to be a significant cause of mass murder. In terms of interventions, neither tighter gun control nor arming our schools are likely to reduce mass shootings. Even expanded efforts at profiling would-be mass murderers or enhancing mental health services might be futile. Needless to say, these conclusions aren’t very encouraging and the authors end by suggesting that we ought to continue ineffective responses in any case because ‘doing something is better than nothing’.•


As a follow-up to the post featuring the David Graeber video about so-called bullshit jobs, here are excerpts from two articles about modern employment: one from Farhad Manjoo of the New York Times, which looks at the Uberization of work, and the other by Joshua Krook of New Intrigue, which focuses on labor in a highly automated world.

_______________________________

From New Intrigue:

The Robotic (Post-Industrial) Revolution:

There is something very curious about politicians constantly obsessing over people getting jobs in the light of the oncoming Robot Revolution.

Now you might think I’m crazy for believing in such things, but then you will have to call the likes of Stephen Hawking crazy too, which is a much, much more difficult task.

There are already articles on the web asking: What will happen when Robots Take our Jobs? The idea is that a robotic revolution is coming whether we like it or not.

And with it, the capacity of robots to do the jobs typically reserved for humans – including high-end, white-collar professional work. The latest robotic innovations out of Japan can play ping pong (“and even decide to take it easy on opponents by missing a few hits”), use sign language to “talk” to humans and “mimic simple greetings.” This is only the beginning.

Despite almost every single instinct of intuition in my body saying that robots will make our lives easier, which is what we’ve been taught (using examples like the washing machine in the 1950s) – by freeing up our time and allowing us to work on things that aren’t menial, boring office jobs – we have to look to history here and realise that that seems like an unlikely outcome. History has a few examples where this is true, but on the whole it has gone the other way, and this time round…

It may even go the other way.•

_______________________________

From the New York Times:

Various companies are now trying to emulate Uber’s business model in other fields, from daily chores like grocery shopping and laundry to more upmarket products like legal services and even medicine.

“I do think we are defining a new category of work that isn’t full-time employment but is not running your own business either,” said Arun Sundararajan, a professor at New York University’s business school who has studied the rise of the so-called on-demand economy, and who is mainly optimistic about its prospects.

Uberization will have its benefits: Technology could make your work life more flexible, allowing you to fit your job, or perhaps multiple jobs, around your schedule, rather than vice versa. Even during a time of renewed job growth, Americans’ wages are stubbornly stagnant, and the on-demand economy may provide novel streams of income.

“We may end up with a future in which a fraction of the work force would do a portfolio of things to generate an income — you could be an Uber driver, an Instacart shopper, an Airbnb host and a Taskrabbit,” Dr. Sundararajan said.

But the rise of such work could also make your income less predictable and your long-term employment less secure. And it may relegate the idea of establishing a lifelong career to a distant memory.

“I think it’s nonsense, utter nonsense,” said Robert B. Reich, an economist at the University of California, Berkeley who was the secretary of labor during the Clinton administration. “This on-demand economy means a work life that is unpredictable, doesn’t pay very well and is terribly insecure.” After interviewing many workers in the on-demand world, Dr. Reich said he has concluded that “most would much rather have good, well-paying, regular jobs.”•


Aloft Hotels is already supplementing its human staff (i.e., reducing it) with robots that deliver sundries, and now a theme-park lodging opening in Nagasaki in July is going a step further in injecting silicon into its system, employing Weak AI to do all the grunt work, from robotic arms in the cloak room to facial recognition “keys” for room doors to android receptionists at the front desk. From the Japan Times:

A hotel with robot staff and face recognition instead of room keys will open this summer in Huis Ten Bosch in Nagasaki Prefecture, the operator of the theme park said Tuesday.

The two-story Henn na Hotel is scheduled to open July 17. It will be promoted with the slogan “A Commitment for Evolution,” Huis Ten Bosch Co. said.

The name reflects how the hotel will “change with cutting-edge technology,” a company official said. This is a play on words: “Henn” is also part of the Japanese word for change.

Robots will provide porter service, room cleaning, front desk and other services to reduce costs and to ensure comfort.

There will be facial recognition technology so guests can enter their rooms without a key.

“We will make the most efficient hotel in the world,” company President Hideo Sawada told a news conference. “In the future, we’d like to have more than 90 percent of hotel services operated by robots.”•

Tags:

The Penguin blog has a Nicholas Carr essay about modern navigation devices and the effect they have on the “maps” in our brains, “Welcome to Nowheresville,” which is adapted from a section of his most recent book, The Glass Cage. Carr is one of those blessed thinkers I always enjoy reading whether I agree with him or not. I don’t necessarily share his concerns about how GPS is redefining what it is to be human (we’ve always been and always will be fluidly defined) or about “skill fade” causing transportation fatalities (the net number of such deaths will likely decline as travel becomes more autonomous), but it’s certainly worth considering the unknown neurological consequences of offloading our piloting skills. Are we unwittingly creating a new mismatch disease? An excerpt:

A loss of navigational acumen can have dire consequences for airline pilots and lorry drivers. Most of us, in our daily routines of driving and walking and otherwise getting around, are unlikely to find ourselves in such perilous spots. Which raises the obvious question: Who cares? As long as we arrive at our destination, does it really matter whether we maintain our navigational sense or offload it to a machine? Those of us living in lands crisscrossed by well marked roads and furnished with gas stations, motels, and 7-Elevens long ago lost both the custom of and the capacity for prodigious feats of wayfinding. Our ability to perceive and interpret topography, especially in its natural state, is already much reduced. Paring it away further, or dispensing with it altogether, doesn’t seem like such a big deal, particularly if in exchange we get an easier go of it.

But while we may no longer have much of a cultural stake in the conservation of our navigational prowess, we still have a personal stake in it. We are, after all, creatures of the earth. We’re not abstract dots proceeding along thin blue lines on computer screens. We’re real beings in real bodies in real places. Getting to know a place takes effort, but it ends in fulfillment and in knowledge. It provides a sense of personal accomplishment and autonomy, and it also provides a sense of belonging, a feeling of being at home in a place rather than passing through it. …

The harder people work at building cognitive maps of space, the stronger their underlying memory circuits seem to become. They can actually grow grey matter in the hippocampus—a phenomenon documented in cab drivers—in a way that’s analogous to the building of muscle mass through physical exertion.

But when they simply follow turn-by-turn instructions in “a robotic fashion,” Bohbot warns, they don’t “stimulate their hippocampus” and as a result may leave themselves more susceptible to memory loss. Bohbot worries that, should the hippocampus begin to atrophy from a lack of use in navigation, the result could be a general loss of memory and a growing risk of dementia. “Society is geared in many ways toward shrinking the hippocampus,” she told an interviewer. “In the next twenty years, I think we’re going to see dementia occurring earlier and earlier.”•

Tags:

Fifty years after Stanley Milgram’s “Obedience” study at Yale shocked the world, there’s dispute as to whether the Milgram experiment actually proved that casual inhumanity is our default mode. Technologists like Jonah Peretti swear by Milgram, whereas others have begun to swear at him. From Cari Romm at the Atlantic:

To mark the 50th anniversary of the experiments’ publication (or, technically, the 51st), the Journal of Social Issues released a themed edition in September 2014 dedicated to all things Milgram. “There is a compelling and timely case for reexamining Milgram’s legacy,” the editors wrote in the introduction, noting that they were in good company: In 1964, the year after the experiments were published, fewer than 10 published studies referenced Milgram’s work; in 2012, that number was more than 60.

It’s a trend that surely would have pleased Milgram, who crafted his work with an audience in mind from the beginning. “Milgram was a fantastic dramaturg. His studies are fantastic little pieces of theater. They’re beautifully scripted,” said Stephen Reicher, a professor of psychology at the University of St. Andrews and a co-editor of the Journal of Social Issues’ special edition. Capitalizing on the fame his 1963 publication earned him, Milgram went on to publish a book on his experiments in 1974 and a documentary, Obedience, with footage from the original experiments.

But for a man determined to leave a lasting legacy, Milgram also made it remarkably easy for people to pick it apart. The Yale University archives contain boxes upon boxes of papers, videos, and audio recordings, an entire career carefully documented for posterity. Though Milgram’s widow Alexandra donated the materials after his death in 1984, they remained largely untouched for years, until Yale’s library staff began to digitize all the materials in the early 2000s. Able to easily access troves of material for the first time, the researchers came flocking.

“There’s a lot of dirty laundry in those archives,” said Arthur Miller, a professor emeritus of psychology at Miami University and another co-editor of the Journal of Social Issues. “Critics of Milgram seem to want to—and do find—material in these archives that makes Milgram look bad or unethical or, in some cases, a liar.”•

_________________________________

“Oh, I’m not going to kill that man.”

Tags: ,

Looking into a post-human future, John G. Messerly of the Institute of Ethics & Emerging Technologies sees no room for religion, though our devotion may just morph. The opening:

History is littered with dead gods. The Greek and Roman gods, and thousands of others, have perished. Yet Allah, Yahweh, Krishna and a few more survive. But will belief in the gods endure? It will not. Our descendants will be too advanced to share such primitive beliefs.

If we survive and science progresses, we will manipulate the genome, rearrange the atom, and augment the mind. And if science defeats suffering and death, religion as we know it will die. Without suffering and death, religion will have lost its raison d’être. For who will pray for heavenly cures, when the cures already exist on earth? Who will die hoping for a reprieve from the gods, when science offers immortality? With the defeat of death, science and technology will have finally triumphed over superstition. Our descendants will know, once and for all, that they are stronger than imaginary gods.

As they continue to evolve, our post-human progeny will become increasingly godlike. They will overcome human physical and psychological limitations and achieve superintelligence, either by modifying their brains or by interfacing with computers. While we can’t know this for sure, what we do know is that the future will not be like the past. From our perspective, if science and technology continue to progress, our offspring will come to resemble us about as much as we do the amino acids from which we sprang.

As our descendants distance themselves from their past, they will lose interest in the gods. Such primitive ideas may even be unthinkable for them. Today the gods are impotent; tomorrow they’ll be irrelevant. You may doubt this. But do you really think that in a thousand or a million years your descendants, travelling through an infinite cosmos with augmented minds, will find their answers in ancient scriptures? Do you really think that powerful superintelligence will cling to the primitive mythologies that once satisfied ape-like brains? Only the credulous can believe such things. In the future gods will exist … only if we become them.

Still the future is unknown. Asteroids, nuclear war, environmental degradation, climate change or deadly viruses and bacteria may destroy us. Perhaps the machine intelligences we create will replace us. Or we might survive but create a dystopia. None of these prospects is inviting, but they all entail the end of religion.•
