Books


James Salter turned out some beautiful pieces for People magazine during that publication’s infancy, usually profiling other great writers of earlier generations who were living in some state of exile. (Earlier I posted a passage from his Graham Greene article.) In 1975, he coerced Vladimir Nabokov, then living in Switzerland two years before his death, into grudgingly sitting for an interview, and recorded the writer’s dislike for many things: fame, hippies, Dostoevsky, etc. It’s a portrait not only of one novelist but of a different time for writers in general, when one could still find pockets of a less-disposable age. An excerpt:

Novelists, like dictators, have long reigns. It is remarkable to think of Nabokov’s first book, a collection of love poems, appearing in his native Russia in 1914. Soon after, he and his family were forced to flee as a result of the Bolshevik uprising and the civil war. He took a degree at Cambridge and then settled in the émigré colony in Berlin. He wrote nine novels in Russian, beginning with Mary, in 1926, and including Glory, The Defense, and Laughter in the Dark. He had a certain reputation and a fully developed gift when he left for America in 1940 to lecture at Stanford. The war burst behind him.

Though his first novel written in English, The Real Life of Sebastian Knight, in 1941, went almost unnoticed, and his next, Bend Sinister, made minor ripples, the stunning Speak, Memory, an autobiography of his lost youth, attracted respectful attention. It was during the last part of 10 years at Cornell that he cruised the American West during the summers in a 1952 Buick, looking for butterflies, his wife driving and Nabokov beside her making notes as they journeyed through Wyoming, Utah, Arizona, the motels, the drugstores, the small towns. The result was Lolita, which at first was rejected everywhere, like many classics, and had to be published by the Olympia Press in Paris (Nabokov later quarreled with and abandoned his publisher, Maurice Girodias). A tremendous success and later a film directed by Stanley Kubrick, the book made the writer famous. Nabokov coquettishly demurs. “I am not a famous writer,” he says, “Lolita was a famous little girl. You know what it is to be a famous writer in Montreux? An American woman comes up on the street and cries out, ‘Mr. Malamud! I’d know you anywhere.’ ”

He is a man of celebrated prejudices. He abhors student activists, hippies, confessions, heart-to-heart talks. He never gives autographs. On his list of detested writers are some of the most brilliant who have ever lived: Cervantes, Dostoevsky, Faulkner and Henry James. His opinions are probably the most conservative, among important writers, of any since Evelyn Waugh’s. “You will die in dreadful pain and complete isolation,” his fellow exile, the Nobel Prize winner Ivan Bunin, told him. Far from pain these days and beyond isolation, Nabokov is frequently mentioned for that same award. “After all, you’re the secret pride of Russia,” he has written of someone unmistakably like himself. He is far from being cold or uncaring. Outraged at the arrest last year of the writer Maramzin, he sent this as yet unpublished cable to the Soviet writers’ union: “Am appalled to learn that yet another writer martyred just for being a writer. Maramzin’s immediate release indispensable to prevent an atrocious new crime.” The answer was silence.

Last year Nabokov published Look at the Harlequins!, his 37th book. It is the chronicle of a Russian émigré writer named Vadim Vadimych whose life, though he had four devastating wives, has many aspects that fascinate by their clear similarity to the life of Vladimir Vladimirovich. The typical Nabokovian fare is here in abundance, clever games of words, sly jokes, lofty knowledge, all as written by a “scornful and austere author, whose homework in Paris had never received its due.” It is probably one of the final steps toward a goal that so many lesser writers have striven to achieve: Nabokov has joined the current of history not by rushing to take part in political actions or appearing in the news but by quietly working for decades, a lifetime, until his voice seems as loud as the detested Stalin’s, almost as loud as the lies. Deprived of his own land, of his language, he has conquered something greater. As his aunt in Harlequins! told young Vadim, “Play! Invent the world! Invent reality!” Nabokov has done that. He has won.

“I get up at 6 o’clock,” he says. He dabs at his eyes. “I work until 9. Then we have breakfast together. Then I take a bath. Perhaps an hour’s work afterward. A walk, and then a delicious siesta for about two-and-a-half hours. And then three hours of work in the afternoon. In the summer we hunt butterflies.” They have a cook who comes to their apartment, or Véra does the cooking. “We do not attach too much importance to food or wine.” His favorite dish is bacon and eggs. They see no movies. They own no TV.

They have very few friends in Montreux, he admits. They prefer it that way. They never entertain. He doesn’t need friends who read books; rather, he likes bright people, “people who understand jokes.” Véra doesn’t laugh, he says resignedly. “She is married to one of the great clowns of all time, but she never laughs.”

The light is fading, there is no one else in the room or the room beyond. The hotel has many mirrors, some of them on doors, so it is like a house of illusion, part vision, part reflection, and rich with dreams.•


At the Forbes site, John Tamny, author of the forthcoming pop-culture-saturated book Popular Economics, argues that robots will be job creators, not killers, and breathlessly asserts that the Digital Revolution will follow the arc of the Industrial one. Perhaps. But the decades during which that potential transition takes place could be very bumpy ones. Still, as I’ve said before, you wouldn’t want to live in a country left behind in the race to greater AI.

But robots or no robots, here’s one job that should be created: someone to design a site for Forbes that isn’t a complete piece of shit. It’s really like Web 1.0 over there. The opening of Tamny’s reasoning:

As robots increasingly adopt human qualities, including those that allow them to replace actual human labor, economists are starting to worry.  As the Wall Street Journal reported last week, some “wonder if automation technology is near a tipping point, when machines finally master traits that have kept human workers irreplaceable.”

The fears of economists, politicians and workers themselves are way overdone.  They should embrace the rise of robots precisely because they love job creation.  As my upcoming book Popular Economics points out with regularity, abundant job creation is always and everywhere the happy result of technological advances that tautologically lead to job destruction.

Robots will ultimately be the biggest job creators simply because aggressive automation will free us up to do new work by virtue of it erasing toil that was once essential.  Lest we forget, there was a time in American history when just about everyone worked whether they wanted to or not — on farms — just to survive.  Thank goodness technology destroyed lots of agricultural work that freed Americans up to pursue a wide range of vocations off the farm.

With their evolution as labor inputs, robots bring the promise of new forms of work that will have us marveling at labor we wasted in the past, and that will make past job destroyers like wind, water, the cotton gin, the car, the internet and the computer seem small by comparison.  All the previously mentioned advances made lots of work redundant, but far from forcing us into breadlines, the destruction of certain forms of work occurred alongside the creation of totally new ways to earn a living.  Robots promise a beautiful multiple of the same.•


Allen Ginsberg was a great poet and performer, if a dubious person in other ways. Here he is in 1965 giving one of his rapturous readings at the Royal Albert Hall. 


Paul Krugman is continually taken to task for predicting in 1998 that the Internet would be no more important economically than the fax machine by 2005. Culturally, of course, this new medium has been a watershed event. But he had a point on some level: the Internet–and computers, more broadly–still disappoint from a productivity perspective. Either that or all conventional measurements are insufficient to gauge this new machine. At his Financial Times blog, Andrew McAfee, co-author with Erik Brynjolfsson of 2014’s wonderful The Second Machine Age, wonders about the confusing state of contemporary economics. An excerpt:

The economy’s behaviour is puzzling these days. No matter what you think is going on, there are some facts — important ones — that don’t fit your theory well at all, and/or some important things left unexplained.

For example, if you believe that technological progress is reshaping the economy (as Erik and I do) then you’ve got to explain why productivity growth is so low. As Larry Summers pointed out on the first panel, strong labour productivity growth is the first thing you’d expect to see if tech progress really were taking off and reshaping the economy, disrupting industries, hollowing out the middle class, and so on. So why has it been so weak for the past 10 years? Is it because of mismeasurement? William Baumol’s “Cost Disease” (the idea that all the job growth has come in manual, low-productivity sectors)? Or is it that recent tech progress is in fact economically unimpressive, as Robert Gordon and others believe?

If you believe that tech progress has not been that significant, however, you’ve got to explain why labor’s share of income is declining around the world.•


In a belated London Review of Books assessment of The Second Machine Age and Average Is Over, John Lanchester doesn’t really break new ground in considering Deep Learning and technological unemployment, but in his customarily lucid and impressive prose he crystallizes how quickly AI may remake our lives and labor in the coming decades. Two passages follow: The opening, in which he charts the course of how the power of a supercomputer ended up inside a child’s toy in a few short years; and a sequence about the way automation obviates workers and exacerbates income inequality.

__________________________________

In 1996, in response to the 1992 Russo-American moratorium on nuclear testing, the US government started a programme called the Accelerated Strategic Computing Initiative. The suspension of testing had created a need to be able to run complex computer simulations of how old weapons were ageing, for safety reasons, and also – it’s a dangerous world out there! – to design new weapons without breaching the terms of the moratorium. To do that, ASCI needed more computing power than could be delivered by any existing machine. Its response was to commission a computer called ASCI Red, designed to be the first supercomputer to process more than one teraflop. A ‘flop’ is a floating point operation, i.e. a calculation involving numbers which include decimal points (these are computationally much more demanding than calculations involving binary ones and zeros). A teraflop is a trillion such calculations per second. Once Red was up and running at full speed, by 1997, it really was a specimen. Its power was such that it could process 1.8 teraflops. That’s 18 followed by 11 zeros. Red continued to be the most powerful supercomputer in the world until about the end of 2000.

I was playing on Red only yesterday – I wasn’t really, but I did have a go on a machine that can process 1.8 teraflops. This Red equivalent is called the PS3: it was launched by Sony in 2005 and went on sale in 2006. Red was only a little smaller than a tennis court, used as much electricity as eight hundred houses, and cost $55 million. The PS3 fits underneath a television, runs off a normal power socket, and you can buy one for under two hundred quid. Within a decade, a computer able to process 1.8 teraflops went from being something that could only be made by the world’s richest government for purposes at the furthest reaches of computational possibility, to something a teenager could reasonably expect to find under the Christmas tree.

The force at work here is a principle known as Moore’s law. This isn’t really a law at all, but rather the extrapolation of an observation made by Gordon Moore, one of the founders of the computer chip company Intel. By 1965, Moore had noticed that silicon chips had for a number of years been getting more powerful, in relation to their price, at a remarkably consistent rate. He published a paper predicting that they would go on doing so ‘for at least ten years’. That might sound mild, but it was, as Erik Brynjolfsson and Andrew McAfee point out in their fascinating book, The Second Machine Age, actually a very bold statement, since it implied that by 1975, computer chips would be five hundred times more powerful for the same price. ‘Integrated circuits,’ Moore said, would ‘lead to such wonders as home computers – or at least terminals connected to a central computer – automatic controls for automobiles and personal portable communications equipment’. Right on all three. If anything he was too cautious.•
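For the arithmetic-minded: the “five hundred times” that Brynjolfsson and McAfee cite falls straight out of compound doubling. A quick sketch (the nine-doubling count is my reading of the 1965–1975 decade, not something the book spells out):

```python
# Moore's 1965 extrapolation: price-performance of silicon chips
# doubles roughly every year. Compounding that doubling over the
# decade he predicted yields the "five hundred times" figure.
doublings = 9  # one per year, 1966 through 1975 (my count, not the book's)
improvement = 2 ** doublings
print(improvement)  # 512 -- roughly five hundred times more powerful
```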

__________________________________

Note that in this future world, productivity will go up sharply. Productivity is the amount produced per worker per hour. It is the single most important number in determining whether a country is getting richer or poorer. GDP gets more attention, but is often misleading, since other things being equal, GDP goes up when the population goes up: you can have rising GDP and falling living standards if the population is growing. Productivity is a more accurate measure of trends in living standards – or at least, it used to be. In recent decades, however, productivity has become disconnected from pay. The typical worker’s income in the US has barely gone up since 1979, and has actually fallen since 1999, while her productivity has gone up in a nice straightish line. The amount of work done per worker has gone up, but pay hasn’t. This means that the proceeds of increased profitability are accruing to capital rather than to labour. The culprit is not clear, but Brynjolfsson and McAfee argue, persuasively, that the force to blame is increased automation.

That is a worrying trend. Imagine an economy in which the 0.1 per cent own the machines, the rest of the 1 per cent manage their operation, and the 99 per cent either do the remaining scraps of unautomatable work, or are unemployed. That is the world implied by developments in productivity and automation. It is Pikettyworld, in which capital is increasingly triumphant over labour. We get a glimpse of it in those quarterly numbers from Apple, about which my robot colleague wrote so evocatively. Apple’s quarter was the most profitable of any company in history: $74.6 billion in turnover, and $18 billion in profit. Tim Cook, the boss of Apple, said that these numbers are ‘hard to comprehend’. He’s right: it’s hard to process the fact that the company sold 34,000 iPhones every hour for three months. Bravo – though we should think about the trends implied in those figures. For the sake of argument, say that Apple’s achievement is annualised, so their whole year is as much of an improvement on the one before as that quarter was. That would give them $89.9 billion in profits. In 1960, the most profitable company in the world’s biggest economy was General Motors. In today’s money, GM made $7.6 billion that year. It also employed 600,000 people. Today’s most profitable company employs 92,600. So where 600,000 workers would once generate $7.6 billion in profit, now 92,600 generate $89.9 billion, an improvement in profitability per worker of 76.65 times. Remember, this is pure profit for the company’s owners, after all workers have been paid. Capital isn’t just winning against labour: there’s no contest. If it were a boxing match, the referee would stop the fight.•
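For what it’s worth, Lanchester’s profit-per-worker multiple checks out. Here’s the arithmetic behind it, using the GM and Apple figures quoted above:

```python
# Profit per worker: GM in 1960 (inflation-adjusted) vs. Apple's
# annualised figure, using the numbers quoted in the excerpt.
gm_profit, gm_workers = 7.6e9, 600_000
apple_profit, apple_workers = 89.9e9, 92_600

gm_per_worker = gm_profit / gm_workers            # about $12,700 per worker
apple_per_worker = apple_profit / apple_workers   # about $970,800 per worker

ratio = apple_per_worker / gm_per_worker
print(round(ratio, 2))  # 76.65 -- the multiple Lanchester cites
```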


I haven’t yet read Naomi Klein’s book, This Changes Everything: Capitalism vs. the Climate, the one that Elizabeth Kolbert took to task for not being bold enough. (Kolbert’s own volume on the topic, The Sixth Extinction, was one of my favorite books of 2014.) In an often-contentious Spiegel interview conducted by Klaus Brinkbäumer, Klein contends that capitalism and ecological sanity are incompatible and calls out supposedly green captains of industry like Michael Bloomberg and Richard Branson. An excerpt:

Spiegel:

The US and China finally agreed on an initial climate deal in 2014.

Naomi Klein:

Which is, of course, a good thing. But anything in the deal that could become painful won’t come into effect until Obama is out of office. Still, what has changed is that Obama said: “Our citizens are marching. We can’t ignore that.” The mass movements are important; they are having an impact. But to push our leaders to where they need to go, they need to grow even stronger.

Spiegel:

What should their goal be?

Naomi Klein:

Over the past 20 years, the extreme right, the complete freedom of oil companies and the freedom of the super wealthy 1 percent of society have become the political standard. We need to shift America’s political center from the right fringe back to where it belongs, the real center.

Spiegel:

Ms. Klein, that’s nonsense, because it’s illusory. You’re thinking far too broadly. If you want to first eliminate capitalism before coming up with a plan to save the climate, you know yourself that this won’t happen.

Naomi Klein:

Look, if you want to get depressed, there are plenty of reasons to do so. But you’re still wrong, because the fact is that focusing on supposedly achievable incremental changes like carbon trading and changing light bulbs has failed miserably. Part of that is because in most countries, the environmental movement remained elite, technocratic and supposedly politically neutral for two-and-a-half decades. We are seeing the result of this today: It has taken us in the wrong direction. Emissions are rising and climate change is here. Second, in the US, all the major legal and social transformations of the last 150 years were a consequence of mass social movements, be they for women, against slavery or for civil rights. We need this strength again, and quickly, because the cause of climate change is the political and economic system itself. The approach that you have is too technocratic and small.

Spiegel:

If you attempt to solve a specific problem by overturning the entire societal order, you won’t solve it. That’s a utopian fantasy.

Naomi Klein:

Not if societal order is the root of the problem. Viewed from another perspective, we’re literally swimming in examples of small solutions: There are green technologies, local laws, bilateral treaties and CO2 taxation. Why don’t we have all that at a global level?

Spiegel:

You’re saying that all the small steps — green technologies and CO2 taxation and the eco-behavior of individuals — are meaningless?

Naomi Klein:

No. We should all do what we can, of course. But we can’t delude ourselves that it’s enough. What I’m saying is that the small steps will remain too small if they don’t become a mass movement. We need an economic and political transformation, one based on stronger communities, sustainable jobs, greater regulation and a departure from this obsession with growth. That’s the good news. We have a real opportunity to solve many problems at once.•


You could argue that Tunisia’s uprising was the match that lit the Middle East, as some struggles reverberate beyond their borders because they speak to a widespread dissatisfaction. The Paris Commune was viewed this way by outsiders during the late 1800s. Via the lovely Delancey Place, a passage from James Green’s Death in the Haymarket about the American interpretation of the French uprising:

When the French army laid siege to Paris and hostilities began, the Chicago Tribune’s reporters covered the fighting much as they had during the American Civil War. Many Americans, notably Republican leaders like Senator Charles Sumner, identified with the citizens of Paris who were fighting to create their own republic against the forces of a corrupt regime whose leaders had surrendered abjectly to the Iron Chancellor and his Prussian forces.

As the crisis deepened, however, American newspapers increasingly portrayed the Parisians as communists who confiscated property and as atheists who closed churches. The brave citizens of Paris, first described as rugged democrats and true republicans, now seemed more akin to the uncivilized elements that threatened America — the ‘savage tribes’ of Indians on the plains and the ‘dangerous classes’ of tramps and criminals in the cities. When the Commune’s defenses broke down on May 21, 1871, the Chicago Tribune hailed the breach of the city walls. Comparing the Communards to the Comanches who raided the Texas frontier, its editors urged the ‘mowing down’ of rebellious Parisians ‘without compunction or hesitation.’

La semaine sanglante — the week of blood — had begun as regular army troops took the city street by street, executing citizen soldiers of the Parisian National Guard as soon as they surrendered. In retaliation, the Communards killed scores of hostages and burned large sections of the city to the ground. By the time the killing ended, at least 25,000 Parisians, including many unarmed citizens, had been slaughtered by French army troops.

These cataclysmic events in France struck Americans as amazing and distressing. The bloody disaster cried out for explanation. In response, a flood of interpretations appeared in the months following the civil war in France. Major illustrated weeklies published lurid drawings of Paris scenes, of buildings gutted by fire, monuments toppled, churches destroyed and citizens executed, including one showing the death of a ‘petroleuse’ — a red-capped, bare-breasted woman accused of incendiary acts. Cartoonist Thomas Nast drew a picture of what the Commune would look like in an American city. Instant histories were produced, along with dime novels, short stories, poems and then, later in the fall, theatricals and artistic representations in the form of panoramas.

News of the Commune seemed exotic to most Americans, but some commentators wondered if a phenomenon like this could appear in one of their great cities, such as New York or Chicago, where vast hordes of poor immigrants held mysterious views of America and harbored subversive elements in their midst.•


So sad to learn of Oliver Sacks’ terminal illness. I read The Man Who Mistook His Wife for a Hat at a young age, and I didn’t know what the hell to make of it, so stunned was I to find out that we’re not necessarily in control of our minds. In this piece of writing and so many others, Sacks examined the brain, that mysterious and scary thing, and because of his work as an essayist as well as a doctor, that organ is today a little less mysterious, a little less scary. It doesn’t mean he was always right, but how could anyone be when sailing in such dark waters? Sacks was accused sometimes of being a modern Barnum who used as diverting curiosities those with the misfortune of having minds that played tricks on them–even stranger tricks than the rest of us experience–and sometimes I cringed at the very personal things he would reveal about his subjects, but I always felt he strived to be ethical. We certainly live in an era when the freak show still thrives, albeit in a slickly produced form, but I don’t think that’s where Sacks’ work has ever lived. His prose and narrative abilities grew markedly during his career as he came to realize–be surprised by?–his own brain’s capabilities. I hope he has a peaceful and productive final chapter. 

A profile of Sacks by Diane Sawyer with good 1969 footage of his work as a young doctor.


Audio of Oriana Fallaci being interviewed in 1972 by Stephen Banker at the time of the publication of Nothing, and So Be It, her account of the dangerous season she spent as a war correspondent in Vietnam.


I’ll be perplexed if Yuval Noah Harari’s great book Sapiens: A Brief History of Humankind, just published in the U.S., doesn’t wind up on many “Best of 2015” lists at the end of the year. It’s such an amazing, audacious, lucid thing. Salon has run a piece from the volume. Here’s an excerpt about the seemingly eternal search for eternity:

The Gilgamesh Project

Of all mankind’s ostensibly insoluble problems, one has remained the most vexing, interesting and important: the problem of death itself. Before the late modern era, most religions and ideologies took it for granted that death was our inevitable fate. Moreover, most faiths turned death into the main source of meaning in life. Try to imagine Islam, Christianity or the ancient Egyptian religion in a world without death. These creeds taught people that they must come to terms with death and pin their hopes on the afterlife, rather than seek to overcome death and live for ever here on earth. The best minds were busy giving meaning to death, not trying to escape it.

That is the theme of the most ancient myth to come down to us – the Gilgamesh myth of ancient Sumer. Its hero is the strongest and most capable man in the world, King Gilgamesh of Uruk, who could defeat anyone in battle. One day, Gilgamesh’s best friend, Enkidu, died. Gilgamesh sat by the body and observed it for many days, until he saw a worm dropping out of his friend’s nostril. At that moment Gilgamesh was gripped by a terrible horror, and he resolved that he himself would never die. He would somehow find a way to defeat death. Gilgamesh then undertook a journey to the end of the universe, killing lions, battling scorpion-men and finding his way into the underworld. There he shattered the mysterious “stone things” of Urshanabi, the ferryman of the river of the dead, and found Utnapishtim, the last survivor of the primordial flood. Yet Gilgamesh failed in his quest. He returned home empty-handed, as mortal as ever, but with one new piece of wisdom. When the gods created man, Gilgamesh had learned, they set death as man’s inevitable destiny, and man must learn to live with it.

Disciples of progress do not share this defeatist attitude. For men of science, death is not an inevitable destiny, but merely a technical problem. People die not because the gods decreed it, but due to various technical failures – a heart attack, cancer, an infection. And every technical problem has a technical solution. If the heart flutters, it can be stimulated by a pacemaker or replaced by a new heart. If cancer rampages, it can be killed with drugs or radiation. If bacteria proliferate, they can be subdued with antibiotics. True, at present we cannot solve all technical problems. But we are working on them. Our best minds are not wasting their time trying to give meaning to death. Instead, they are busy investigating the physiological, hormonal and genetic systems responsible for disease and old age. They are developing new medicines, revolutionary treatments and artificial organs that will lengthen our lives and might one day vanquish the Grim Reaper himself.

Until recently, you would not have heard scientists, or anyone else, speak so bluntly. ‘Defeat death?! What nonsense! We are only trying to cure cancer, tuberculosis and Alzheimer’s disease,’ they insisted. People avoided the issue of death because the goal seemed too elusive. Why create unreasonable expectations? We’re now at a point, however, where we can be frank about it. The leading project of the Scientific Revolution is to give humankind eternal life. Even if killing death seems a distant goal, we have already achieved things that were inconceivable a few centuries ago. In 1199, King Richard the Lionheart was struck by an arrow in his left shoulder. Today we’d say he incurred a minor injury. But in 1199, in the absence of antibiotics and effective sterilisation methods, this minor flesh wound became infected and gangrene set in. The only way to stop the spread of gangrene in twelfth-century Europe was to cut off the infected limb, impossible when the infection was in a shoulder. The gangrene spread through the Lionheart’s body and no one could help the king. He died in great agony two weeks later.

As recently as the nineteenth century, the best doctors still did not know how to prevent infection and stop the putrefaction of tissues. In field hospitals doctors routinely cut off the hands and legs of soldiers who received even minor limb injuries, fearing gangrene. These amputations, as well as all other medical procedures (such as tooth extraction), were done without any anaesthetics. The first anaesthetics – ether, chloroform and morphine – entered regular usage in Western medicine only in the middle of the nineteenth century. Before the advent of chloroform, four soldiers had to hold down a wounded comrade while the doctor sawed off the injured limb. On the morning after the battle of Waterloo (1815), heaps of sawn-off hands and legs could be seen adjacent to the field hospitals. In those days, carpenters and butchers who enlisted to the army were often sent to serve in the medical corps, because surgery required little more than knowing your way with knives and saws.

In the two centuries since Waterloo, things have changed beyond recognition. Pills, injections and sophisticated operations save us from a spate of illnesses and injuries that once dealt an inescapable death sentence. They also protect us against countless daily aches and ailments, which premodern people simply accepted as part of life. The average life expectancy jumped from around twenty-five to forty years, to around sixty-seven in the entire world, and to around eighty years in the developed world.

Death suffered its worst setbacks in the arena of child mortality. Until the twentieth century, between a quarter and a third of the children of agricultural societies never reached adulthood. Most succumbed to childhood diseases such as diphtheria, measles and smallpox. In seventeenth-century England, 150 out of every 1,000 newborns died during their first year, and a third of all children were dead before they reached fifteen. Today, only five out of 1,000 English babies die during their first year, and only seven out of 1,000 die before age fifteen.

We can better grasp the full impact of these figures by setting aside statistics and telling some stories. A good example is the family of King Edward I of England (1237–1307) and his wife, Queen Eleanor (1241–90). Their children enjoyed the best conditions and the most nurturing surroundings that could be provided in medieval Europe. They lived in palaces, ate as much food as they liked, had plenty of warm clothing, well-stocked fireplaces, the cleanest water available, an army of servants and the best doctors. The sources mention sixteen children that Queen Eleanor bore between 1255 and 1284:

1. An anonymous daughter, born in 1255, died at birth.

2. A daughter, Catherine, died either at age one or age three.

3. A daughter, Joan, died at six months.

4. A son, John, died at age five.

5. A son, Henry, died at age six.

6. A daughter, Eleanor, died at age twenty-nine.

7. An anonymous daughter died at five months.

8. A daughter, Joan, died at age thirty-five.

9. A son, Alphonso, died at age ten.

10. A daughter, Margaret, died at age fifty-eight.

11. A daughter, Berengeria, died at age two.

12. An anonymous daughter died shortly after birth.

13. A daughter, Mary, died at age fifty-three.

14. An anonymous son died shortly after birth.

15. A daughter, Elizabeth, died at age thirty-four.

16. A son, Edward.

The youngest, Edward, was the first of the boys to survive the dangerous years of childhood, and at his father’s death he ascended the English throne as King Edward II. In other words, it took Eleanor sixteen tries to carry out the most fundamental mission of an English queen – to provide her husband with a male heir. Edward II’s mother must have been a woman of exceptional patience and fortitude. Not so the woman Edward chose for his wife, Isabella of France. She had him murdered when he was forty-three.

To the best of our knowledge, Eleanor and Edward I were a healthy couple and passed no fatal hereditary illnesses on to their children. Nevertheless, ten out of the sixteen – 62 per cent – died during childhood. Only six managed to live beyond the age of eleven, and only three – just 18 per cent – lived beyond the age of forty. In addition to these births, Eleanor most likely had a number of pregnancies that ended in miscarriage. On average, Edward and Eleanor lost a child every three years, ten children one after another. It’s nearly impossible for a parent today to imagine such loss.

How long will the Gilgamesh Project – the quest for immortality – take to complete? A hundred years? Five hundred years? A thousand years? When we recall how little we knew about the human body in 1900, and how much knowledge we have gained in a single century, there is cause for optimism. Genetic engineers have recently managed to double the average life expectancy of Caenorhabditis elegans worms. Could they do the same for Homo sapiens? Nanotechnology experts are developing a bionic immune system composed of millions of nano-robots, who would inhabit our bodies, open blocked blood vessels, fight viruses and bacteria, eliminate cancerous cells and even reverse ageing processes. A few serious scholars suggest that by 2050, some humans will become a-mortal (not immortal, because they could still die of some accident, but a-mortal, meaning that in the absence of fatal trauma their lives could be extended indefinitely).•

Tags:

Marc Goodman, law-enforcement veteran and author of the forthcoming book Future Crimes, sat for an interview with Jason Dorrier of Singularity Hub about the next wave of nefariousness, Internet-enabled and large-scale. One question addresses the potential for peril writ relatively small with Narrow AI, and on a grand scale should we create Artificial General Intelligence. An excerpt:

Question:

Elon Musk, Stephen Hawking, and Bill Gates have expressed concern about artificial general intelligence. It’s a hotly debated topic. Might AI be our “final invention?” It seems even narrow AI in the wrong hands might be problematic.

Marc Goodman:

I would add Marc Goodman to that list. To be clear, I think AI, narrow AI, and the agents around us have tremendous opportunity to be incredibly useful. We’re using AI every day, whether it’s in our GPS devices, in our Netflix recommendations, what we see on our Facebook status updates and streams—all of that is controlled via AI.

With regard to AGI, however, I put myself firmly in the camp of concern.

Historically, whatever the tool has been, people have tried to use it for their own power. Of course, typically, that doesn’t mean that the tool itself is bad. Fire wasn’t bad. It could cook your meals and keep you warm at night. It comes down to how we use it. But AGI is different. The challenge with AGI is that once we create it, it may be out of our hands entirely, and that could certainly make it our “final invention.”

I’ll also point out that there are concerns about narrow AI too.

We’ve seen examples of criminals using narrow AI in some fascinating ways. In one case, a University of Florida student was accused of killing his college roommate for dating his girlfriend. Now, this 18-year-old freshman had a conundrum. What does he do with the dead body before him? Well, he had never murdered anybody before, and he had no idea how to dispose of the body. So, he asked Siri. The answers Siri returned? Mine, swamp, and open field, among others.

So, Siri answered his question. This 18-year-old kid unknowingly used narrow AI as an accomplice after the fact in his homicide. We’ll see many more examples of this moving forward. In the book, I say we’re leaving the world of Bonnie and Clyde and joining the world of Siri and Clyde.•

Tags: ,

If you read this blog regularly, you know I adored David Carr, someone I never met except through his writing. His success was improbable: he had survived a pitiless drug addiction–a surrender and an onslaught. Almost as unlikely was that he maintained his soulfulness inside a corporate behemoth like the New York Times, appearing unchanged, unreconstructed, unvanquished, perhaps inoculated against the plague of phoniness by that earlier taste of poison. It doesn’t surprise me that in his final column he hoped for a second chance for Brian Williams. Carr himself was one of the best second chances ever. He will be missed. From his book The Night of the Gun, in which he searched for a face that was strange yet his own:

Am I a lunatic? Yes. When am I going to cut this stuff out? Apparently never. Does God see me right now? Yes. God sees everything, including the blind.

Trapped in drug-induced paranoia, I began to think of the police as God’s emissaries, arriving not to seek vengeance but a cease-fire, a truce that would put me up against a wall of well-deserved consequences, and the noncombatants, the children, out of harm’s way.

On this night — it was near the end — every hit sent out an alarm along my vibrating synapses. If the cops were coming — Any. Minute. Now. — I should be sitting out in front of the house. That way I could tell them that yes, there were drugs and paraphernalia in the house, but no guns. And there were four blameless children. They could put the bracelets on me, and, head bowed, I would solemnly lead them to the drugs, to the needles, to the pipes, to what was left of the money. And then some sweet-faced matrons would magically appear and scoop up those babies and take them to that safe, happy place. I had it all planned out.

I took another hit, and Barley and I walked out and sat on the steps. My eyes, my heart, the veins in my forehead, pulsed against the stillness of the night. And then they came. Six unmarked cars riding in formation with lights off, no cherries, just like I pictured. It’s on.

A mix of uniforms and plainclothes got out, and in the weak light of the street, I could see long guns held at 45-degree angles. I was oddly proud that I was on the steps, that I now stood between my children and the dark fruits of the life I had chosen. I had made the right move after endless wrong ones. And then they turned and went to the house across the street.

Much yelling. “Facedown! Hug the carpet! No sudden movements!” A guy dropped out of the second-floor window in just gym shorts, but they were waiting. More yelling and then quiet. I went back inside the house and watched the rest of it play out through the corner of the blind. Their work done, the cops loaded several cuffed people into a van. I let go of the blind and got back down to business. It wasn’t my turn.

Twenty years later, now sober and back for a look at my past, I sat outside that house on Oliver Avenue on a hot summer day in a rental car, staring long and hard to make sense of what had and had not happened there. The neighborhood had turned over from white to black, but it was pretty much the same. Nice lawns, lots of kids, no evidence of the mayhem that had gone on inside. Sitting there in a suit with a nice job in a city far away and those twins on their way to college, I almost would have thought I’d made it up. But I don’t think I did. While I sat there giving my past the once over, someone lifted up the corner of the blind in the living-room window. It was time to go.•

Tags:

Donald Trump, a human oil spill, apparently requested that the Obama Administration make him czar of the BP cleanup effort, according to David Axelrod’s new book. From Amy Chozick in the New York Times:

Question:

Some anecdotes in the book make clear that, as a senior adviser to the president, you dealt with some odd requests. Donald Trump asked you to put him in charge of cleaning up the BP oil spill.

David Axelrod:

You owe it to the president to be polite and to give folks a hearing. But even as I was going through these conversations, I had this sense of surreality. I was watching the scene and thinking, Man, this is really bizarre. I gotta write about this someday. Nobody will believe this.•

Tags: , , ,

I think the defining negative quality of bureaucracy is simple incompetence. Look at the example of relief efforts in New York City in the wake of Hurricane Sandy. When the federal government was in charge, FEMA did quite well. But when $100 million was shifted to the city, the efforts were a fiasco. A program called Build It Back was established by Mayor Bloomberg, and in his final 14 months in office not a single destroyed home was built back. Not one. Local homeowners living in ruins or in shelters were summoned to government offices numerous times to provide information, but while paperwork piled high, no one was helped. (So far during the de Blasio Administration, a little more than 300 homes have been completely repaired out of the more than one thousand where work has begun.)

When David Graeber looks at bureaucracy, he senses something more sinister than ineptitude. He sees the potential for routine violence. In an interview with David Whelan of Vice about his forthcoming book, The Utopia of Rules, the anthropologist looks into our future and believes an Orwellian nightmare may be headed our way. I don’t agree with Graeber’s vision of technological dreams vanishing or of library fines being commonly treated as felonies, but we’re already living in a society where “quality-of-life” policing is often used as racial punishment, and certainly we’re being more tracked and quantified each day. An excerpt:

Vice:

OK, say we’re 50 years from now, this moment. What’s happening?

David Graeber:

Research investment has changed. Flying cars are scrapped. They say to hell with going to Mars. All this space age stuff is done. Money moves elsewhere, such as information technology. And now every intimate aspect of your life is under potential bureaucratic scrutiny, which means fines and violence.

Vice:

What happens if you step out of line?

David Graeber:

Bureaucratic societies rely on the threat of violence. We follow their rules because if we don’t there’s a chance we’ll get killed. A good way to think of this is through libraries.

Vice:

Libraries?

David Graeber:

Say you want to go get a book by Foucault from the library describing why life is all a matter of physical coercion, but you haven’t paid an overdue fine and therefore you don’t have a currently valid personal ID. You walk through the gate illegally. What’s going to happen?

Vice:

A smacked bottom?

David Graeber:

Men with sticks will eventually show up and threaten to hit you.

Vice:

Wait. This actually happens?

David Graeber:

Yeah. Check out the UCLA Taser incident in 2006. They Tasered him, told him to get up, then Tasered him again.

Vice:

What’s the point in that?

David Graeber:

The point is bureaucracy. They don’t care who he is or why he’s there. It doesn’t matter who you are. You just apply the same rules to everybody, because that’s “fair.”

Vice:

But if you’re at the top of the bureaucratic tree, those rules don’t apply.

David Graeber:

Bureaucracy provides an illusion of fairness. Everyone is equal before the law, but the problem is it never works like that. But to advance in a bureaucratic system the one thing you CANNOT do is point out all the ways the system doesn’t work the way it’s supposed to. You have to pretend it’s a meritocracy.•

Tags: ,

Working off futurist Martin Ford’s forthcoming book Rise of the Robots, Zoë Corbyn of the Guardian analyzes the next phase of labor, in which many of the human laborers will be phased out. The opening:

It could be said that the job of bridge toll collector was invented in San Francisco. In 1968, the Golden Gate Bridge became the world’s first major bridge to start employing people to take tolls.

But in 2013 the bridge where it all began went electronic. Of its small band of collectors, 17 people were redeployed or retired and nine found themselves out of work. It was the software that did it – a clear-cut case of what economists call technological unemployment. Licence-plate recognition technology took over. Automating jobs like that might not seem like a big deal. It is easy to see how it might happen, just as how we buy train tickets at machines or book movie tickets online reduces the need for people.

But technology can now do many more things that used to be unique to people. Rethink Robotics’ Baxter, a dexterous factory robot that can be programmed by grabbing its arms and guiding it through the motions, sells for a mere $25,000 (equivalent to about $4 an hour over a lifetime of work, according to a Stanford University study). IPsoft’s Amelia, a virtual service desk employee, is being trialled by oil industry companies, such as Shell and Baker Hughes, to help with employee training and inquiries. Meanwhile, doctors are piloting the use of Watson, IBM’s supercomputer, to assist in diagnosing patients and suggesting treatments. Law firms are using software such as that developed by Blackstone Discovery to automate legal discovery, the process of gathering evidence for a lawsuit, previously an important task of paralegals. Rio Tinto’s “mine of the future” in Western Australia has 53 autonomous trucks moving ore and big visions for expansion. Even the taxi-sharing company Uber is in on the act – it has just announced it will open a robotics research facility to work on building self-driving cars.

The upshot will be many people losing jobs to software and machines, says Silicon Valley-based futurist Martin Ford, whose book The Rise of the Robots comes out this year. He forecasts significant unemployment and rising inequality unless radical changes are made.•

Tags: ,

James Salter’s sad 1967 novel, A Sport and a Pastime, has only grown in stature since its publication, but the book apparently didn’t make the author financially independent. Salter, who will turn 90 in June, picked up some paychecks writing articles for People in the 1970s, including a profile of a septuagenarian Graham Greene, who was then living a rather anonymous life in Paris. Judging from this piece, Philip Roth and China were among Greene’s dislikes. An excerpt:

Greene still reads a lot, three or four books a week, and notes them in his diary, putting down a little tick or cross in judgment. Among the Americans, he likes Kurt Vonnegut. Gore Vidal: “I like his essays.” Alison Lurie. Philip Roth, not much. Bellow, he finds rather difficult. As for his own work, even coming from a long-lived family it is not easy, he admits, to think of starting on a book these days. “The fears,” he says simply, “not knowing whether one will live to see the end of it.”

He has been a published writer since 1929 with his first novel, The Man Within. There have been novels, travel books, thrillers, films, plays, short stories and autobiography as well as essays and reviews. His output has been protean and the breadth of his travel and experience, vast. Many of his settings are foreign. The Honorary Consul, for instance, resulted from a three months’ trip to South America. Though his command of Spanish covers only the present tense, he was visiting in Argentina and saw the town of Corrientes one day while going up the river to Asunción. Corrientes became the scene of the book. He has been in Africa, Mexico, Russia and China (“I found it depressing”), served as an intelligence officer in Sierra Leone during the war, smoked opium in Indochina where he went as a correspondent regularly beginning in 1951 and flew in French bombers between Saigon and Hanoi. He has been an editor in a publishing house, a film reviewer, a critic, a life as varied and glamorous as that of André Malraux, another great literary and political figure. Like Malraux, he asks to be read as a political writer and has set his fiction firmly in that world. The lesson in the books of Graham Greene is the great lesson of the times: one must take sides.•

Tags: , ,

Ed Finn of Slate has a new interview with Margaret Atwood, and in one give-and-take she explains her philosophy on writing about the future. An excerpt:

Question:

Whether you call it science fiction or speculative fiction, much of your work imagines a future that many of us wouldn’t want. Do you see stories as a way to effect change in the world, especially about climate change?

Margaret Atwood:

I think calling it climate change is rather limiting. I would rather call it the everything change because when people think climate change, they think maybe it’s going to rain more or something like that. It’s much more extensive a change than that because when you change patterns of where it rains and how much and where it doesn’t rain, you’re also affecting just about everything. You’re affecting what you can grow in those places. You’re affecting whether you can live there. You’re affecting all of the species that are currently there because we are very water dependent. We’re water dependent and oxygen dependent.

The other thing that we really have to be worried about is killing the oceans, because should we do that there goes our major oxygen supply, and we will wheeze to death.

It’s rather useless to write a gripping narrative with nothing in it but climate change because novels are always about people even if they purport to be about rabbits or robots. They’re still really about people because that’s who we are and that’s what we write stories about.

You have to show people in the midst of change and people coping with change, or else it’s the background. In the MaddAddam books, people hardly mentioned “climate change,” but things have already changed. For instance, in the world of Jimmy who we follow in Oryx and Crake, the first book, as he’s growing up as an adolescent, they’re already getting tornadoes on the East Coast of the United States, the upper East Coast, because I like setting things in and around Boston. It’s nice and flat, and when the sea rises a bunch of it will flood. It’s the background, but it’s not in-your-face a sermon.

When you set things in the future, you’re thinking about all of the same things as the things that you’re thinking about when you’re writing historical fiction. But with the historical fiction, you’ve got more to go on, and you also know that people are going to be checking up on your details. If you put the wrong underpants on Henry VIII, you’re in trouble.•

Tags: ,

The thing about pornographers, those horrible people, is that they were right: their suspicions about us proved true. No matter the moral posture, we did want their wares, and we wanted them to be portable. Before smartphones offering every category you could imagine and some you couldn’t, pulpy paperbacks did the trick. The 1970s were the golden age for such prurient printed matter, until that moment was disrupted by technology, first the VCR and then the Internet. Andrew Offutt (who wrote most often as “John Cleve”) was the lonely and tortured king of the Selectric-produced sex book, making it possible for gentlemen to jerk it to genre art, sordid space odysseys and wankable Westerns. His son Chris, who was deputized with the responsibility of sorting through his late father’s sizable and seamy estate, recalls dad’s uneasy reign in the New York Times Magazine. An excerpt:

The commercial popularity of American erotic novels peaked during the 1970s, coinciding with my father’s most prolific and energetic period. Dad combined porn with all manner of genre fiction. He wrote pirate porn, ghost porn, science-fiction porn, vampire porn, historical porn, time-travel porn, secret-agent porn, thriller porn, zombie porn and Atlantis porn. An unpublished Old West novel opens with sex in a barn, featuring a gunslinger called Quiet Smith, without doubt Dad’s greatest character name. By the end of the decade, Dad claimed to have single-handedly raised the quality of American pornography. He believed future scholars would refer to him as the “king of 20th-century written pornography.” He considered himself the “class operator in the field.”

In the 1980s, John Cleve’s career culminated with a 19-book series for Playboy Press, the magazine’s foray into book publishing. The “Spaceways” series allowed him to blend porn with old-time “space opera,” reminiscent of the 1930s pulps, his favorite kind of science fiction. Dad’s modern twist included aliens who possessed the genitalia of both genders. Galactic crafts welcomed the species as part of their crews, because they were unencumbered with the sexual repression of humans and could service men and women alike. The books were popular, in part, because of their campiness, repeating characters and entwined stories — narrative tropes that later became standard on television. The “Spaceways” series ended in 1985, coinciding with the widespread ownership of VCRs. Men no longer needed “left-handed books” for stimulation when they could watch videotapes in their own homes. The era of written pornography was over.

John Cleve retired. Dad insisted that he himself hadn’t quit, but that John Cleve had. It was more retreat than retirement, a slipping back into the shadows, fading away like an old soldier. Cleve had done his duty — the house was paid off, the kids were grown and the bank held a little savings.

Dad was 52. As Cleve, he published more than 130 books in 18 years. He turned to self-publishing and, using an early pseudonym, Turk Winter, published 260 more titles over the next 25 years.•

Tags: , ,

Literature will be around as long as people are, but the particular literary world which George Plimpton and John Gregory Dunne inhabited has been disrupted, permanently. It wasn’t necessarily greater, but it was great. In a 1996 Paris Review interview, the former queried the latter about writing. The opening:

George Plimpton:

Your work is populated with the most extraordinary grotesqueries—nutty nuns, midgets, whores of the most breathtaking abilities and appetites. Do you know all these characters?

John Gregory Dunne:

Certainly I knew the nuns. You couldn’t go to a parochial school in the 1940s and not know them. They were like concentration-camp guards. They all seemed to have rulers and they hit you across the knuckles with them. The joke at St. Joseph’s Cathedral School in Hartford, Connecticut, where I grew up, was that the nuns would hit you until you bled and then hit you for bleeding. Having said that, I should also say they were great teachers. As a matter of fact, the best of my formal education came from the nuns at St. Joseph’s and from the monks at Portsmouth Priory, a Benedictine boarding school in Rhode Island where I spent my junior and senior years of high school. The nuns taught me basic reading, writing, and arithmetic; the monks taught me how to think, how to question, even to question Catholicism in order to better understand it. The nuns and the monks were far more valuable to me than my four years at Princeton. I’m not a practicing Catholic, but one thing you never lose from a Catholic education is a sense of sin and the conviction that the taint on the human condition is the natural order.

George Plimpton:

What about the whores and midgets?

John Gregory Dunne:

I suppose for that I would have to go to my informal education. I spent two years as an enlisted man in the army in Germany after the Korean War, and those two years were the most important learning experience I really ever had. I was just a tight-assed upper-middle-class kid, the son of a surgeon, and I had this sense of Ivy League entitlement, and all that was knocked out of me in the army. Princeton boys didn’t meet the white and black underclass that you meet as an enlisted draftee. It was a constituency of the dispossessed—high-school dropouts, petty criminals, rednecks, racists, gamblers, you name it—and I fit right in. I grew to hate the officer class that was my natural constituency. A Princeton classmate was an officer on my post and he told me I was to salute him and call him sir, as if I had to be reminded, and also that he would discourage any outward signs that we knew each other. I hate that son of a bitch to this day. I took care of him in Harp. Those two years in Germany gave me a subject I suppose I’ve been mining for the past God-knows-how-many years. It fit nicely with that Catholic sense of sin, the taint on the human condition. And it was in the army that I learned to appreciate whores. You didn’t meet many Vassar girls when you were serving in a gun battery on the Czech border and were in a constant state of alert in case the Red Army came rolling across the frontier. As for midgets, they’re part of that constituency of the dispossessed.

George Plimpton:

You once said you only had one character. Is that true?

John Gregory Dunne:

I’ve always thought a novelist only has one character and that is himself or herself. In my case, me.•

Tags: ,

Speaking of psychedelics enthusiasts, Aldous Huxley, who thought deeply about globalism, consumerism, virtual reality and technocracy before most others did, had a little book of his called Brave New World reviewed in the February 7, 1932 Brooklyn Daily Eagle. It was apparently a ripping good yarn.

Tags: ,

In the U.S., the Right pretends it’s attacking bureaucracy while really angling to subjugate unions and workers; the aim is dismantling safety nets, not improving the situation. But that doesn’t mean mountains of paperwork shouldn’t be a bipartisan scourge. It’s often a maze with no exit. David Graeber’s forthcoming book, The Utopia of Rules, sees something even more sinister than incompetence buried in the files and folders. From Cory Doctorow’s review at Boing Boing:

Bureaucracy is pervasive and metastatic. To watch cop-dramas, you’d think that most of the job of policing was crime-fighting. But it’s not. The police are just “armed bureaucrats.” Most of what police do is administrative enforcement — making sure you follow the rules (threatening to gas you or hit you with a stick if you don’t). Get mugged and chances are, the police will take the report over the phone. Drive down the street without license plates and you’ll be surrounded by armed officers of the law who are prepared to deal you potentially lethal violence to ensure that you’re not diverging from the rules.

This just-below-the-surface violence is the crux of Graeber’s argument. He mocks the academic left who insist that violence is symbolic these days, suggesting that any grad student sitting in a university library reading Foucault and thinking about the symbolic nature of violence should consider the fact that if he’d attempted to enter that same library without a student ID, he’d have been swarmed by armed cops.

Bureaucracy is a utopian project: like all utopians, capitalist bureaucrats (whether in private- or public-sector) believe that humans can be perfected by modifying their behavior according to some ideal, and blame anyone who can’t live up to that ideal for failing to do so. If you can’t hack the paperwork to file your taxes, complete your welfare rules, figure out your 401(k) or register to vote, you’re obviously some kind of fuckup.

Bureaucracy begets bureaucracy. Every effort to do away with bureaucracy ends up with more bureaucracy.•

Tags: ,

William S. Burroughs reading in 1981 from Naked Lunch on Saturday Night Live, a rare pleasing moment during the show’s most arid patch, those years when Tony Rosato could be a cast member and Robert Urich a host. Lauren Hutton intros him.

Tags: ,

William Butler Yeats famously pined for his muse, Maud Gonne, who rejected him. When her daughter, Iseult, turned 22, the now-midlife poet tried for her hand and was likewise turned away. While apparently no one in the family would fuck Yeats, Maud did have sex in the grave of her infant son, who had died at two, believing some mystical hooey which held that the soul of the deceased boy would transmigrate into a new baby if she conceived next to his coffin. Well, okay. From Hugh Schofield at the BBC:

Actress, activist, feminist, mystic, Maud Gonne was also the muse and inspiration for the poet W B Yeats, who immortalised her in some of his most famous verses.

After the Free State was established in 1922, Maud Gonne remained a vocal figure in Irish politics and civil rights. Born in 1866, she died in Dublin in 1953.

But for many years in her youth and early adulthood, Maud Gonne lived in France.

Of this part of her life, much less is known. There is one long-secret and bizarre episode, however, that has now been established as almost certainly true.

This was the attempt in late 1893 to reincarnate her two-year-old son, through an act of sexual intercourse next to the dead infant’s coffin. …

Having inherited a large sum of money on the death of her father, she paid for a memorial chapel – the biggest in the cemetery. In a crypt beneath, the child’s coffin was laid.

In late 1893 Gonne re-contacted Lucien Millevoye, from whom she had separated after Georges’ death.

She asked him to meet her in Samois-sur-Seine. First the couple entered the small chapel, then opened the metal doors leading down to the crypt.

They descended the small metal ladder – just five or six steps. And then – next to the dead baby’s coffin – they had sexual intercourse.•

Tags: , ,

Got my hands on an early copy of Yuval Noah Harari’s Sapiens: A Brief History of Humankind yesterday, and I haven’t been able to put it down. The ideas are many, rich and often contrarian. You might not anticipate a book with that title being a page-turner, but it definitely is. Its drop date in the U.S. is February 10, and I highly recommend it. One brief passage from the opening section:

One of the most common uses of early stone tools was to crack open bones in order to get to the marrow. Some researchers believe this was our original niche. Just as woodpeckers specialise in extracting insects from the trunks of trees, the first humans specialised in extracting marrow from bones. Why marrow? Well, suppose you observe a pride of lions take down and devour a giraffe. You wait patiently until they’re done. But it’s still not your turn because first the hyenas and jackals – and you don’t dare interfere with them – scavenge the leftovers. Only then would you and your band dare approach the carcass, look cautiously left and right – and dig into the only edible tissue that remained.

This is a key to understanding our history and psychology. The position of humans in the food chain was, until quite recently, solidly in the middle. It was only in the last 100,000 years – with the rise of Homo sapiens – that man jumped to the top of the food chain.

That spectacular leap had enormous consequences. Other animals at the top of the pyramid, such as lions and sharks, evolved into that position very gradually, over millions of years. This enabled the ecosystem to develop checks and balances that prevent lions and sharks from wreaking too much havoc. As lions became deadlier, so gazelles evolved to run faster. In contrast, humankind ascended to the top so quickly that the ecosystem was not given time to adjust. Moreover, humans themselves failed to adjust. Most top predators of the planet are majestic creatures. Millions of years of dominion have filled them with self-confidence. Sapiens by contrast is more like a banana republic dictator. Having so recently been one of the underdogs of the savannah, we are full of fears and anxieties over our position, which makes us doubly cruel and dangerous. Many historical calamities, from deadly wars to ecological catastrophes, have resulted from this over-hasty jump.•


Tags:

A neurophysiological researcher at Yale, Colleen McCullough turned to writing at 37 as a second career and made it her first when The Thorn Birds, a book about illicit love between a married woman and a priest, became a career-defining success. Where did a story of such forbidden passion come from? Well, she was the daughter of a bigamist who had at least three wives at the same time. Listen, as an author she wasn’t Carson McCullers, but she didn’t need to be: her heart was its own kind of lonely hunter. From her New York Times obituary, penned by the excellent Margalit Fox:

On a typical day, Ms. McCullough said, she might produce 15,000 words; on a very good day, 30,000. Her facility was all the more noteworthy in that she continued to use an electric typewriter well into the computer age.

“I spell perfectly,” she told The Inquirer in the 1996 article. “My grammar’s very good. My sentence construction is excellent. So I don’t have a lot of mistakes.” …

As a girl, Ms. McCullough dreamed of becoming a doctor. She entered medical school at the University of Sydney but was forced to abandon her studies after she developed a severe allergy to the soap widely used in Australian hospitals. She trained instead in neurophysiology, which is concerned with testing for and diagnosing neuromuscular diseases.

In the late 1960s, after working at Great Ormond Street Hospital in London, Ms. McCullough accepted a position as a neurophysiological research assistant at the Yale School of Medicine. Discovering that she was being paid less than her male colleagues there, she cast about for another source of income.

“I loved being a neurophysiologist, but I didn’t want to be a 70-year old spinster in a cold-water walk-up flat with one 60-watt light bulb, which is what I could see as my future,” she told The California Literary Review in 2007.

Interested in writing since girlhood, she took to her typewriter.•

Tags: ,

« Older entries § Newer entries »