Books


In a newly revised edition of Federico Fellini’s 1980 book, Making a Film, there’s a fresh translation of “A Spectator’s Autobiography,” the wonderful essay by Italo Calvino that begins the volume. It’s been adapted for publication by the NYRB.

In the piece, Calvino notes that the unpunctual habits of Italian moviegoers in the 1930s portended the ultimate widespread fracturing of the traditional narrative structure, an artifice intended to satisfy, if fleetingly, our deep craving for order, to deliver us a simple solution to the complex puzzle of life and its jagged pieces. 

An excerpt:

Italian spectators barbarously made entering after the film already started a widespread habit, and it still applies today. We can say that back then we already anticipated the most sophisticated of modern narrative techniques, interrupting the temporal thread of the story and transforming it into a puzzle to put back together piece by piece or to accept in the form of a fragmentary body. To console us further, I’ll say that attending the beginning of the film after knowing the ending provided additional satisfaction: discovering not the unraveling of mysteries and dramas, but their genesis; and a vague sense of foresight with respect to the characters. Vague: just like soothsayers’ visions must be, because the reconstruction of the broken plot wasn’t always easy, especially if it was a detective movie, where identifying the murderer first and the crime afterward left an even darker area of mystery in between. What’s more, sometimes a part was still missing between the beginning and the end, because suddenly while checking my watch I’d realize I was running late; if I wanted to avoid my family’s wrath I had to leave before the scene that was playing when I entered came back on. Therefore lots of films ended up with holes in the middle, and still today, more than thirty years later—what am I saying?—almost forty, when I happen to see one of those films from back then—on television, for example—I recognize the moment in which I entered the theater, the scenes that I’d watched without understanding them, and I recover the lost pieces, I put the puzzle back together as if I’d left it incomplete the day before.•

  • See also:

Fellini feuds with Oriana Fallaci. (1963)


Sad to hear of the passing of Dr. Oliver Sacks, the neurologist and writer, who made clear in his case studies that the human brain, a friend and a stranger, was as surprising as any terrain we could ever explore. It feels as if we’ve lost not only a great person but a singular one. He became hugely famous with the publication of his 1985 collection, The Man Who Mistook His Wife For A Hat, which built upon the template of A.R. Luria’s work with better writing and a wider array of investigations. Two years prior, he had published an essay in the London Review of Books that became the title piece. An excerpt:

I stilled my disquiet, his perhaps too, in the soothing routine of a neurological exam – muscle strength, co-ordination, reflexes, tone. It was while examining his reflexes – a trifle abnormal on the left side – that the first bizarre experience occurred. I had taken off his left shoe and scratched the sole of his foot with a key – a frivolous-seeming but essential test of a reflex – and then, excusing myself to screw my ophthalmoscope together, left him to put on the shoe himself. To my surprise, a minute later, he had not done this.

‘Can I help?’ I asked.

‘Help what? Help whom?’

‘Help you put on your shoe.’

‘Ach,’ he said, ‘I had forgotten the shoe,’ adding, sotto voce: ‘The shoe! The shoe?’ He seemed baffled.

‘Your shoe,’ I repeated. ‘Perhaps you’d put it on.’

He continued to look downwards, though not at the shoe, with an intense but misplaced concentration. Finally his gaze settled on his foot: ‘That is my shoe, yes?’

Did I mishear? Did he mis-see? ‘My eyes,’ he explained, and put a hand to his foot. ‘This is my shoe, no?’

‘No, it is not. That is your foot. There is your shoe.’

‘Ah! I thought that was my foot.’

Was he joking? Was he mad? Was he blind? If this was one of his ‘strange mistakes’, it was the strangest mistake I had ever come across.

I helped him on with his shoe (his foot), to avoid further complication. Dr P. himself seemed untroubled, indifferent, maybe amused. I resumed my examination. His visual acuity was good: he had no difficulty seeing a pin on the floor, though sometimes he missed it if it was placed to his left.

He saw all right, but what did he see? I opened out a copy of the National Geographic Magazine, and asked him to describe some pictures in it. His eyes darted from one thing to another, picking up tiny features, as he had picked up the pin. A brightness, a colour, a shape would arrest his attention and elicit comment, but it was always details that he saw – never the whole. And these details he ‘spotted’, as one might spot blips on a radar-screen. He had no sense of a landscape or a scene.

I showed him the cover, an unbroken expanse of Sahara dunes.

‘What do you see here?’ I asked.

‘I see a river,’ he said. ‘And a little guesthouse with its terrace on the water. People are dining out on the terrace. I see coloured parasols here and there.’ He was looking, if it was ‘looking’, right off the cover, into mid-air, and confabulating non-existent features, as if the absence of features in the actual picture had driven him to imagine the river and the terrace and the coloured parasols.

I must have looked aghast, but he seemed to think he had done rather well. There was a hint of a smile on his face. He also appeared to have decided the examination was over, and started to look round for his hat. He reached out his hand, and took hold of his wife’s head, tried to lift it off, to put it on. He had apparently mistaken his wife for a hat!•


Impresario is what they used to call someone like Steve Ross of Warner Communications, whose mania for mergers gave him a hand in a large number of media and entertainment ventures, making him boss and handler at different times to the Rolling Stones, Pelé and Dustin Hoffman. One of the businesses the erstwhile funeral-parlor entrepreneur became involved with was Qube, an interactive cable-TV project that was a harbinger, if also a money-loser. That enterprise and many others are covered in a brief 1977 People profile. The opening:

In our times, the courtships and marriages that make the earth tremble are no longer romantic but corporate. The most legendary (or lurid) figures are not the Casanovas today. They are the conglomerateurs, and for sheer seismic impact on the popular culture, none approaches Steven J. Ross, 50, the former slacks salesman who married into a mortuary chain business that he parlayed 17 years later into Warner Communications Inc. (WCI). In founder-chairman Ross’s multitentacled clutch are perhaps the world’s predominant record division (with artists like the Eagles, Fleetwood Mac, the Rolling Stones, Led Zeppelin and Joni Mitchell); one of the Big Three movie studios (its hot fall releases include Oh, God! and The Goodbye Girl); a publishing operation (the paperback version of All the President’s Men, which was also a Warner Bros. film); the Atari line of video games like Pong, which inadvertently competes with Warner’s own TV producing arm, whose credits include Roots, no less. The conglomerate is furthermore not without free-enterprising social consciousness (WCI put up $1 million and owns 25 percent of Ms. magazine) or a redeeming sense of humor (it disseminates Mad).

Warner’s latest venturesome effort is bringing the blue-sky of two-way cable TV down to earth in a limited experiment in Columbus, Ohio. There, subscribers are able to talk back to their TV sets (choosing the movie they want to see or kibitzing the quarterback on his third-down call). An earlier Ross vision—an estimated $4.5 million investment in Pelé by Warner’s New York Cosmos—was, arguably, responsible for soccer’s belated breakthrough in the U.S. this year after decades of spectator indifference. Steve is obviously in a high-rolling business—Warners’ estimated annual gross is approaching a billion—and so the boss is taking his. Financial writer Dan Dorfman pegs Ross’s personal ’77 earnings at up to $5 million. That counts executive bonuses but not corporate indulgences. On a recent official trip to Europe in the Warner jet, Steve brought along his own barber for the ride.

En route to that altitude back in the days of his in-laws’ funeral parlor operation, Ross expanded into auto rentals (because he observed that their limos were unprofitably idle at night) and then into Kinney parking lots. “The funeral business is a great training ground because it teaches you service,” he notes, though adding: “It takes as much time to talk a small deal as a big deal.” So, come the ’70s, Ross dealt away the mortuary for the more glamorous show world. Alas, too, he separated from his wife. •


Grandpa Friedrich Trump was a “whoremaster,” and in retrospect, he was the classy one.

After leaving Germany as a lad and rechristening himself as the more-Americanized “Frederick,” Donald’s pop-pop burnished his bank account by selling liquor, gambling and women to miners in rooming houses he built in boomtowns before they went bust. He always managed to stay just ahead of the law.

In a Politico piece, Gwenda Blair, who wrote a history of the Trump family, argues that Friedrich was the template for Donald, who has been most influenced by the “family culture of doing whatever it takes to come out on top and never giving up.”

The opening:

One hundred and thirty years ago, in 1885, Friedrich Trump stepped off a boat in lower Manhattan with a single suitcase. Only sixteen years old, he had left a note for his widowed mother on the kitchen table back in Kallstadt, a village in southwestern Germany, and slipped off in the middle of the night. He didn’t want to work in the family vineyard or get a job as a barber, the profession for which he’d been trained. He wanted to become rich, and America was the place to do it.

Friedrich wasted no time, and he did it by pushing the behavioral boundaries of his time, much as his grandson Donald would a century later. By the early 1890s, Friedrich had learned English; morphed from a skinny teenager into an adult man with a handlebar moustache; become a naturalized U.S. citizen, an easy matter at a time when there were no immigration quotas (much less debates about “birthright”); changed the spelling of his name to the more American-sounding Frederick; and made his way to Seattle, a wide-open city filled with single, rootless newcomers who’d arrived expecting to make their fortunes but found themselves facing the same uncertain economic prospects they’d wanted to leave behind.

A quick study, Trump headed for a prime location, the city’s red-light district, known as the Lava Beds. There he leased a tiny storefront restaurant named the Poodle Dog, which had a kitchen and a bar and advertised “private rooms for ladies”–code for prostitutes. It would allow the resourceful Trump, who renamed it the Dairy Restaurant, to offer the restless, frustrated public some right-now satisfaction in the form of food, booze and easily available sex.•


Image by Ted Streshinsky.

In his New Yorker piece about Tracy Daugherty’s Joan Didion biography, The Last Love Song, Louis Menand states that “‘Slouching Towards Bethlehem’ was not a very good piece of standard journalism.” Well, no. Nor were the Flying Burrito Brothers very good classical music, but each of those assessments is probably beside the point.

Menand claims Didion poorly contextualized the Hippie movement, but the early stages of his own article suffer from the same flaw. He asserts that the Flower Child craze and the thorny period that followed it were similar to the Beat scene of the previous decade, just weekend faddists lightly experimenting with drugs. But the counterculture of the late 1960s blossomed into a massive anti-war movement, a much larger-scale thing, and the youth culture’s societal impact wasn’t merely a creation of opportunistic, screaming journalism. Menand wants to prove this interpretation wrong, but he doesn’t do so in this piece. He offers a couple of “facts” of indeterminate source about that generation’s drug use, and leaves it at that. Not nearly good enough.

I admire Menand deeply (especially The Metaphysical Club), the way he admires Didion, but I think her source material approaches the truth far more than this part of Menand’s critique does. Later in the piece, he points out that Didion wasn’t emblematic of that epoch but someone unique and outside the mainstream, suggesting her grasp of the era was too idiosyncratic to resemble reality. But detachment doesn’t render someone incapable of understanding the moment. In fact, it’s often the detached who are best positioned to do so.

The final part of the article focuses on how, in the aftermath of her Haight-Ashbury reportage, Didion had a political awakening, moving away from her conservative California upbringing, though not an immediate or conventional one. This long passage is Menand’s strongest argument.

An excerpt:

“Slouching Towards Bethlehem” is not a very good piece of standard journalism, though. Didion did no real interviewing or reporting. The hippies she tried to have conversations with said “Groovy” a lot and recycled flower-power clichés. The cops refused to talk to her. So did the Diggers, who ran a sort of hippie welfare agency in the Haight. The Diggers accused Didion of “media poisoning,” by which they meant coverage in the mainstream press designed to demonize the counterculture.

The Diggers were not wrong. The mainstream press (such as the places Didion wrote for, places like The Saturday Evening Post) was conflicted about the hippie phenomenon. It had journalistic sex appeal. Hippies were photogenic, free love and the psychedelic style made good copy, and the music was uncontroversially great. Around the time Didion was in San Francisco, the Beatles released Sgt. Pepper’s Lonely Hearts Club Band, and soon afterward the Monterey Pop Festival was held. D. A. Pennebaker’s film of the concert came out in 1968 and introduced many people to Janis Joplin, Jimi Hendrix, and Ravi Shankar. Everybody loved Ravi Shankar.

Ravi Shankar did not use drugs, however. The drugs were the sketchy part of the story, LSD especially. People thought that LSD made teen-age girls jump off bridges. By the time Didion’s article came out, Time had run several stories about “the dangerous LSD craze.” And a lot of Didion’s piece is about LSD, people on acid saying “Wow” while their toddlers set fire to the living room. The cover of the Post was a photograph of a slightly sinister man, looking like a dealer, in a top hat and face paint—an evil Pied Piper. That photograph was what the Diggers meant by “media poisoning.”•


Image: Ethel Rosenberg mugshot.

In a time of hysteria, justice is only the first casualty. Human lives often follow.

It’s hard to make sense in retrospect of the 1950s trial of Julius and Ethel Rosenberg, accused Soviet spies, because there was very little sensible about the Communist witch hunt of that era. Charged with a crime that “jeopardizes the lives of every man, woman and child in America,” the couple certainly didn’t get a fair hearing.

I thought of this agonizing piece of our history when E.L. Doctorow, author of The Book of Daniel, a fictionalized take on the topic, died recently. As the novel reminds us, it was an especially painful period for many Americans because the two Rosenberg children, Michael and Robert (later adopted by Abel and Anne Meeropol), were collateral damage. Embedded is a January 4, 1953 Brooklyn Daily Eagle article that covers the boys visiting their parents six months before their execution.


Col. William “Billy” Breakenridge was tossed into the belly of the beast in 1879 when he became Assistant City Marshal of the hell-raising, often-lethal city of Tombstone, Arizona. Somehow he lived to tell the story, which he did quite literally nearly 50 years later, soon before his death, when he published his autobiography, Helldorado. Even this literary effort, far removed from the gun-slinging madness, caused conflict, as Wyatt Earp, portrayed in its pages as a low-down scoundrel, disputed its veracity. An article about the book was published in the June 12, 1929 Brooklyn Daily Eagle.


In 1999, Michael Crichton played what he knew to be a fool’s game and predicted the future. He was not especially successful when it came to culture. Among the things he got wrong: that printed matter would remain unchanged, that movies would soon be dead, that communications would be consolidated into fewer hands. Well, he did foresee YouTube.

Crichton, who was fascinated by science and often accused of being anti-science, commenting in a 1997 Playboy interview on technology creating moral quandaries we’re not prepared for:

I think we’re a long way from cloning people. But I am worried about scientific advances without consideration of their consequences. The history of medicine in my lifetime is one of technological advances that outstrip our ethical systems. We’ve never caught up. When I was in medical school—30-odd years ago—people were struggling to deal with mechanical-respiration systems. They were keeping alive people who a few years earlier would have died of natural causes. Suddenly people weren’t going to die of natural causes. They were either going to get on these machines and never get off or—or what? Were we going to turn the machines off? We had the machines well before we started the debate. Doctors were speaking quietly among themselves with a kind of resentment toward these machines. On the one hand, if somebody had a temporary disability, the machines could help get them over the hump. For accident victims—some of whom were very young—who could be saved if they pulled through the initial crisis, the technology saved lives. You could get them over the hump and then they would recover, and that was terrific.

But on the other hand, there was a category of people who were on their way out but could be kept alive. Before the machine, ‘pulling the plug’ actually meant opening the window too wide one night, and the patient would get pneumonia and die. That wasn’t going to happen now. We were being forced by technology to make decisions about the right to die—whether it’s a legal or religious issue—and many related matters. Some of them contradict longstanding ideas in an ethically protected world; we weren’t being forced to make hard decisions, because those decisions were being made for us—in this case, by the pneumococcus.

This is just one example of an ethical issue raised by technology. Cloning is another. If you’re knowledgeable about biotechnology, it’s possible to think of some terrifying scenarios. I don’t even like to discuss them. I know people doing biotechnology research who have decided not to pursue avenues of research because they think they’re too dangerous. But we go forward without sorting out the issues. I don’t believe that everything new is necessarily better. We go forward with the technology while the ethical issues are still up in the air, whether it’s the genetic variability of crop streams, which is a resource in times of plant plagues, to the assumption that we all have to be connected all the time. The technology is here so you must use it. Do you? Do you have to have your cell phone and your e-mail address and your Internet hookup? I was just on holiday in Scotland without e-mail. I had to notify people that I wouldn’t be checking my e-mail, because there’s an assumption that if I send you an e-mail, you’ll get it. Well, I won’t get it. I’m not plugged in, guys. Some people are horrified: “You’ve gone offline?” People feel so enslaved by technology that they will stop having sex to answer the telephone. What could be so important? Who’s calling, and who cares?•

David Foster Wallace overdid it in many ways, and his journalism seemed to conveniently descend into fiction when need be, but he was genuinely brilliant, certainly far superior to Bret Easton Ellis, who wrote, oy gevalt, Less Than Zero. That ersatz-J.D. Salinger-makes-a-snuff-film literary stain is so deeply ingrained in the sheets that you have to throw away the whole bed.

Ellis uses a new Medium essay to once again disapprove of his late contemporary and to take aim at the new Wallace biopic, The End of the Tour. He is correct that such films almost universally reduce their subjects to hagiography, aiming to make them more likable but instead robbing them of their humanity. An excerpt:

The David in this movie is the voice of reason, a sage, and the movie succumbs to the cult of stressing likability. But the real David scolded people and probably craved fame — what writer isn’t both suspicious of literary fame and yet curious in seeing how that game is played out? It’s not that rare and — hey — it sells books. He was cranky and could be very mean and caustic and opportunistic, but this David Foster Wallace is completely erased and that’s why the movie is so resolutely one-note and earnest. There’s so much handwringing about doing one dumb book tour and being “terrified” by a magazine profile — and this is looked on as a sign of pure integrity in the movie — that at some point you may want to tell the screen: “Just don’t finish the tour, dude, if it hurts so much, and shut up about it. Don’t talk to freakin’ Rolling Stone. Get over it. Chill.”

This is not the David Foster Wallace who voted for Reagan and supported Ross Perot, the David who wrote a scathing and deliciously cruel put-down of late-period Updike, the David who posed for glamour-puss photos in Interview magazine (years before Infinite Jest) and appeared on Charlie Rose numerous times — all of which the movie strongly suggests was probably absolute agony for David who keeps naively fretting about his real self being co-opted by a fake self, as if a man as intelligent as Wallace would really care one way or the other, but the movie insists this was the case which perversely reveals Wallace to be the world-class narcissist so many people (even Jonathan Franzen, a close friend, and Mary Karr, an ex) always assumed he was.•


At least outwardly, William F. Buckley approved in the 1990s of Rush Limbaugh replacing him as the voice of conservatism, believing he was to be succeeded by a more populist talker. Neither pundit, however, was really a driving force in American society. They were just well-positioned observers, about as responsible for political movements as alarm clocks are for the sun’s rise. Both were simply the noise accompanying the moment, as commentators almost always are.

Buckley and Gore Vidal and Norman Mailer and Germaine Greer were figures we used to call “public intellectuals,” although quite often they behaved like adult babies hurling balls of ego at one another. I don’t know that we’re worse for their absence (though I grant that when Mailer wrote of technology, he was quite insightful).

As Garry Wills states in a NYRB piece, a single episode of The Daily Show or The Colbert Report did more to elucidate than every last insult and threat of fisticuffs from these supposed heavyweights.

From Wills:

A more ambitious project is Kevin M. Schultz’s Buckley and Mailer. He argues that the 1950s was a placid time narcotized by Eisenhower. But two radical voices, Buckley from the right and Mailer from the left, called out across the dreary middle ground, shaking things up—deep calling to deep, in Schultz’s telling. When chaos broke out in the 1960s, the two men pulled back from the violence they had created.

But had they created it? The upsetting of the old order was accomplished mainly by the civil rights movement, the feminist movement, and the anti-war movement. Those three things, and the vehement opposition to them, did the real churning of the waters; and Buckley and Mailer were only briefly and peripherally involved in them. The real troublemakers were people like Martin Luther King, Jr., Malcolm X, and James Baldwin, opposed by the likes of Strom Thurmond and George Wallace. Feminists like Gloria Steinem and Kate Millett were opposed to the pious legions of Phyllis Schlafly and Beverly LaHaye. On Vietnam, Benjamin Spock and Tom Hayden faced down Nixon’s hardhats and Edgar Hoover’s COINTELPRO. These deeply committed people with real followings had little time for the filigreed warblings of Buckley or Mailer. Deep to deep? Rather, flamboyant shallow to flamboyant shallow. Buckley and Mailer did not make history. They made good copy.•

 


According to Paul Mason, author of PostCapitalism, technology has rendered the current economic system obsolete, or soon will. While I don’t agree that capitalism is going away, I do believe the modern version of it is headed for a serious revision.

The extent to which technology disrupts capitalism–the biggest disruption of them all–depends to some degree on how quickly the new normal arrives. If driverless cars are perfected in the next few years, tens of millions of positions will vanish in America alone. Even if the future makes itself known more slowly, employment will probably grow more scarce as automation and robotics insinuate themselves. 

The very idea of work is currently undergoing a reinvention. In exchange for the utility of communicating with others, Facebook users don’t pay a small monthly fee but instead do “volunteer” labor for the company, producing mountains of content each day. That would make Mark Zuckerberg’s company something like the biggest sweatshop in history, except even those dodgy outfits pay some minimal wage. It’s a quiet transition.

Gillian Tett of the Financial Times reviews Mason’s new book, which argues that work will become largely voluntary in the manner of Wikipedia and Facebook, and that governments will provide basic income and services. That’s his Utopian vision at least. Tett finds it an imperfect but important volume. An excerpt:

His starting point is an assertion that the current technological revolution has at least three big implications for modern economies. First, “information technology has reduced the need for work” — or, more accurately, for all humans to be workers. For automation is now replacing jobs at a startling speed; indeed, a 2013 report by the Oxford Martin school estimated that half the jobs in the US are at high risk of vanishing within a decade or two.

The second key point about the IT revolution, Mason argues, is that “information goods are corroding the market’s ability to form prices correctly.” For the key point about cyber-information is that it can be replicated endlessly, for free; there is no constraint on how many times we can copy and paste a Wikipedia page. “Until we had shareable information goods, the basic law of economics was that everything is scarce. Supply and demand assumes scarcity. Now certain goods are not scarce, they are abundant.”

But third, “goods, services and organisations are appearing that no longer respond to the dictates of the market and the managerial hierarchy.” More specifically, people are collaborating in a manner that does not always make sense to traditional economists, who are used to assuming that humans act in self-interest and price things according to supply and demand.•


In my teens, I read the books of Fred Exley’s “Fan” fictional-memoir trilogy one after another, and I remember hoping to never be within a million miles of someone as fucked up and frightening as the author’s doppelganger. That Exley is a fascinating character but also the height (and depth) of American male insecurity: violent, self-pitying, alcoholic, furious, a mess. He realizes too intently that he’s on the wrong side of an ugly score. 

The second and third volumes are so-so, but the original, A Fan’s Notes, is a searing, heartbreaking thing, and everyone who’s picked up a copy probably thought of it again this weekend when hearing of Frank Gifford’s death. I know I did.

The USC and Giants great was the “star” to Exley’s “fan,” and the segment about the back running headlong into a Chuck Bednarik guillotine, which put him in the hospital for ten days and on the sidelines for 18 months, is unforgettable. If Gifford, the golden idol, could be felled by life, what chance did Ex have in the bleachers?

At Grantland, Fred Schruers has a beautifully written article about the literary “relationship” which begat a real one. An excerpt:

Ex, as he called himself and answered to among friends, had begun the fixation on Gifford that led to this nostalgia-and-booze-soaked threnody of dysfunction around 1951, when both men were enrolled at USC. Gifford, a converted quarterback and defensive back who became a halfback his senior year and slashed for four touchdowns against Ohio State, was campus royalty; Ex was a legendarily hard-drinking English major. Like Gifford, Exley would head to New York, having been raised upstate in Watertown as the son of a crusty semi-pro footballer. Before he truly discovered his great gift — striving to redeem his own scattered life in long, lapidary sentences touched with wit and pathos — Exley would spend his twenties as the victim of his own deep emotional maladies. He would know a depression that led to electroshock therapy. In his three main works, he would explicate a painful grapple with attempts to capture the love of the kind of unreachable American princesses he longed for.

Though he never saw Frank play in college, Exley would understand the mythic heft of this transformed oil driller’s son who became an All-American. Exley’s Fitzgeraldian tangle of thoughts about Gifford only deepened as no. 16’s NFL career soared. In one mid-novel excursion, Exley explains his own role as a failing writer among the working stiffs around him in the $1 bleacher seats:

It was very simple really. Where I could not, with syntax, give shape to my fantasies, Gifford could, with his superb timing, his great hands, his uncanny faking, give shape to his … he became my alter ego, that part of me had its being in the competitive world of men …


Easily the best article I’ve read about E.L. Doctorow in the wake of his death is Ron Rosenbaum’s expansive Los Angeles Review of Books piece about the late novelist. It glides easily from Charles Darwin to Thomas Nagel to the hard problem of consciousness to the “electrified meat” in our skulls to the “Jor-El warning” in Doctorow’s final fiction, Andrew’s Brain. That clarion call was directed at the Singularity, which the writer feared would end human exceptionalism, and, of course, it would. More a matter of when than if.

An excerpt:

Not to spoil the mood but I feel a kind of responsibility to pass on Doctorow’s Jor-El warning, even if I don’t completely understand it. I would nonetheless contend that — coming from a person as steeped as he is in the contemplation of the Mind and its possibilities, the close reading of consciousness, of that twain of brain and mind and the mysteries of their relationship — attention should be paid. It seemed like a message he wanted me to convey.

I asked him to expand upon the idea voiced in Andrew’s Brain that once a computer was created that could replicate everything in the brain, once machines can think as men, when we’ve achieved true “artificial intelligence” or “the singularity” as it’s sometimes called, it would be “catastrophic.”

“There is an outfit in Switzerland,” he says. “And this is a fact — they’re building a computer to emulate a brain. The theory is, of course, complex. There are billions of things going on in the brain but they take the position that the number of things is finite and that finally you can reach that point. Of course there’s a lot more work to do in terms of the brain chemistry and so on. So Andrew says to Doc ‘the twain will remain.’

“But later he has this revelation because he’s read, as I had, a very responsible scientist saying that it was possible someday for computers to have consciousness. That was said in a piece by a very respected neuroscientist by the name of Gerald Edelman. So the theory is this: If we do ever figure out how the brain becomes what we understand as consciousness, our feelings, our wishes, our desires, dreams — at that point we will know enough to simulate with a computer the human brain — and the computer will achieve consciousness. That is a great scientific achievement if it ever occurs. But if it does, all the old stories are gone. The Bible, everything.”

“Why?”

“Because the idea of the exceptionalism of the human mind is no longer exceptional. And you’re not even dealing with the primary consciousness of animals, of different degrees of understanding. You’re talking about a machine that could now think, and the dominion of the human mind no longer exists. And that’s disastrous because it’s earth-shaking. I mean, imagine.”•


Once Ernest Hemingway was dead and his cult of personality vanished, his stock as a writer fell precipitously, which was justice. It’s difficult to believe now that Hemingway was considered the greatest writer of his age by many while he was alive. He got somewhere with The Sun Also Rises, but the rest of his work was largely overrated, and he’s most interesting now for the era he lived in and for being representative of a particular type of damaged American male, one who marked his pages with symbolism of sexual dysfunction while boasting of a zeal for big-game hunting. What a douche. In an article in the April 25, 1934 Brooklyn Daily Eagle, he told reporter Guy Hickok about Depression-era safaris.


I know it sounds unlikely, but I once asked Snoop Dogg what it was he liked about pimping, that disgraceful thing, when he was a child. He answered in the most consumerist terms: the clothes, the cars, the hair–the style that only money could buy. It was the closest thing to capitalism that young Calvin Broadus could imagine as his own.

I don’t know if the late and infamous pimp Iceberg Slim (born Robert Beck) was what Malcolm X would have been had he never been politicized, but he certainly was the template for Don King, Dr. Dre and other African-American males who wanted into the capitalist system in the worst way–and got there by those means. The way they looked at it, their hands weren’t any dirtier than anyone else’s, just darker. 

A new biography of Slim encourages us to consider him for his literary talent, not just his outsize persona. Dwight Garner of the New York Times, who writes beautifully on any topic, has a review. The opening:

In the late 1960s and early ’70s, if you wanted a book by Iceberg Slim, the best-selling black writer in America, you didn’t go to a bookstore. You went to a black-owned barbershop or liquor store or gas station. Maybe you found a copy on a corner table down the block, or being passed around in prison.

The first and finest of his books was a memoir, Pimp: The Story of My Life, published in 1967. This was street literature, marketed as pulp. The New York Times didn’t merely not review Pimp, Justin Gifford notes in Street Poison: The Biography of Iceberg Slim. Given the title, this newspaper wouldn’t even print an ad for it.

Pimp related stories from Iceberg Slim’s 25 years on the streets of Chicago, Milwaukee, Detroit and other cities. It was dark. The author learned to mistreat women with a chilly élan. It was dirty, so filled with raw language and vividly described sex acts that, nearly 50 years later, the book still makes your eyeballs leap out of your skull, as if you were at the bottom of a bungee jump.

Yet Iceberg Slim’s prose was, and is, as ecstatic and original as a Chuck Berry guitar solo. Mark Twain meets Malcolm X in his sentences. When he was caught with an underage girl by her father, for example, the author didn’t just run. “I vaulted over the back fence,” he wrote, “and torpedoed down the alley.”

Pimp is a different sort of American coming-of-age story, the tale of a determined young man who connived to take what society would not give. It’s a subversive classic.•

_______________________________

A masked Slim meets Joe Pyne in the 1960s.


In a tweet she published in the uproar after Cecil the lion’s killing, Roxane Gay delivered a line as funny and heartbreaking as anything George Carlin or Richard Pryor could have conjured. It was this:

“I’m personally going to start wearing a lion costume when I leave my house so if I get shot, people will care.”

It speaks brilliantly to racial injustice and the unequal way our empathy is aroused. Later on, in a separate tweet, the critic said something much less true while defending that great line:

Speciesism is not a thing.

Oh, it is a thing. We’ve based so much of our world on that very thing, and all of us, even those who’ve tried to be somewhat kinder, have benefited from this arrangement.

In a winding Foreign Affairs piece that traces the history of the long struggle against animal cruelty, Humane Society CEO Wayne Pacelle reveals uneven but significant progress and explains how the movement, which coalesced around a book written 40 years ago by moral philosopher Peter Singer, gained steam after shifting tactics, replacing moralizing with legislative efforts. An excerpt:

There was forward progress, many setbacks through the decades, and a wave of lawmaking in the early 1970s, but the biggest catalyst for change came with the publication of Peter Singer’s book Animal Liberation in 1975.

Singer’s book spurred advocates to form hundreds more local and national animal protection groups, including People for the Ethical Treatment of Animals in 1980. I was part of that wave, forming an animal protection group as an undergraduate at Yale University in 1985, protesting cruelty, screening films, and generally trying to shine a light on large-scale systemic abuses that few wanted to acknowledge. This grass-roots activism pushed established groups to step up their calls for reform and adopt more campaign-oriented tactics. Nevertheless, throughout the 1980s, the animal protection movement was still fundamentally about protest. The issues that animal advocates raised were unfamiliar and challenging, and their demands were mainly for people to reform their lifestyles. Asking someone to stop eating meat or to buy products not tested on animals was a hard sell, because people don’t like to change their routines, and because practical, affordable, and easily available alternatives were scarce.

It was not until the 1990s that the animal protection movement adopted a legislative strategy and became more widely understood and embraced. A few groups had been doing the political spadework needed to secure meaningful legislative reforms, focusing mostly on the rescue and sheltering of animals. Then organizations such as the HSUS and the Fund for Animals started introducing ballot measures to establish protections, raise awareness, and demonstrate popular support for reform. The ball got rolling with a successful initiative in 1990 to outlaw the trophy hunting of mountain lions in California, which was followed by the 1992 vote in Colorado to protect bears. Other states followed suit with measures to outlaw cockfighting, dove hunting, greyhound racing, captive hunts, the use of steel-jawed traps for killing fur-bearing mammals, and the intensive confinement of animals on factory farms. Today, activists are working to address the inhumane slaughter of chickens and turkeys, the captive display of marine mammals, the hunting of captive wildlife, and the finning of sharks for food. The United States alone now counts more than 20,000 animal protection groups, with perhaps half of them formed in just the last decade. The two largest groups, the HSUS and the ASPCA, together raise and spend nearly $400 million a year and have assets approaching $500 million.•


I’ve yet to get my grubby, ink-stained hands on a copy of Stephen Petranek’s How We’ll Live on Mars, which argues that we’ll establish a human presence on our neighboring planet within the next 20 years. It’s not just theory but also reportage, featuring interviews with numerous figures at the heart of the new Space Race, a welter of public and private interests. An excerpt from the book via the TED site:

These first explorers, alone on a seemingly lifeless planet as much as 250 million miles away from home, represent the greatest achievement of human intelligence.

Anyone who watched Neil Armstrong set foot on the moon in 1969 can tell you that, for a moment, the Earth stood still. The wonder and awe of that achievement was so incomprehensible that some people still believe it was staged on a Hollywood set. When astronauts stepped onto the moon, people started saying, “If we can get to the moon, we can do anything.” They meant that we could do anything on or near Earth. Getting to Mars will have an entirely different meaning: If we can get to Mars, we can go anywhere.

The achievement will make dreamy science fiction like Star Wars and Star Trek begin to look real. It will make the moons of Saturn and Jupiter seem like reasonable places to explore. It will, for better or worse, create a wave of fortune seekers to rival those of the California gold rush. Most important, it will expand our vision past the bounds of Earth’s gravity. When the first humans set foot on Mars, the moment will be more significant in terms of technology, philosophy, history, and exploration than any that have come before it. We will no longer be a one-planet species.

These explorers are the beginning of an ambitious plan not just to visit Mars and establish a settlement but to reengineer, or terraform, the planet — to make its thin atmosphere of carbon dioxide rich enough in oxygen for humans to breathe, to raise its temperature from an average of –81 degrees Fahrenheit to a more tolerable 20 degrees, to fill its dry stream beds and empty lakes with water again, and to plant foliage that can flourish in its temperate zone on a diet rich in CO2. These astronauts will set in motion a process that might not be complete for a thousand years but will result in a second home for humans, an outpost on the farthest frontier. Like many frontier outposts before it, this one may eventually rival the home planet in resources, standard of living and desirability.


David McCullough’s latest, The Wright Brothers, details how two bicycle makers with no formal training in aviation became the first to touch the sky. In what might be James Salter’s final piece of journalism, the NYRB has posthumously published the late novelist and journalist’s graceful critique of the new book. Probably best known for his acclaimed fiction, Salter also was a reporter for People magazine in the ’70s, profiling other writers, Vladimir Nabokov and Graham Greene among them. Here he focuses on the recurring theme of the brothers’ distance from the world in everything from their family life to the relative isolation of Kitty Hawk.

An excerpt about the very origins of the Wrights’ fever dream:

Together they opened a bicycle business in 1893, selling and repairing bicycles. It was soon a success, and they were able to move to a corner building where they had two floors, the upper one for the manufacturing of their own line of bicycles. Then late in the summer of 1896 Orville fell seriously ill with typhoid fever. His father was away at the time, and he lay for days in a delirium while Wilbur and Katharine nursed him. During the convalescence Wilbur read aloud to his brother about Otto Lilienthal, a famous German glider enthusiast who had just been killed in an accident.

Lilienthal was a German mining engineer who, starting with only a pair of birdlike wings, designed and flew a series of gliders—eighteen in all—and made more than two thousand flights in them to become the first true aviator. He held on to a connecting bar with his legs dangling free so they could be used in running or jumping and also in the air for balance. He took off by jumping from a building or escarpment or running down a man-made forty-five-foot hill, and he wrote ecstatically of the sensation of flying. Articles and photographs of him in the air were published widely. Icarus-like he fell fifty-five feet and was fatally injured, not when his wings fell off but when a gust of wind tilted him upward so that his glider stalled. Opfer müssen gebracht werden were his final words, “sacrifices must be made.”

Reading about Lilienthal aroused a deep and long-held interest in Wilbur that his brother, when he had recovered, shared. They began to read intensively about birds and flying.•



We’ll likely be richer and healthier in the long run because of the Digital Revolution, but before the abundance, there will probably be turbulence.

A major reorganization of labor, one affecting hundreds of millions of workers, promises to be bumpy, a situation requiring deft political solutions in a time not known for them. It’s great if Weak AI can handle the rote work and free our hands, but what will we do with them then? And how will we balance a free-market society that’s also a highly automated one?

In a Washington Post piece, Matt McFarland wisely assesses the positives and negatives of the new order. Two excerpts follow.

_______________________

Just as the agrarian and industrial revolutions made us more efficient and created more value, it follows that the digital revolution will do the same.

[Geoff] Colvin believes as the digital revolution wipes out jobs, new jobs will place a premium on our most human traits. These should be more satisfying than being a cog on an assembly line.

“For a long period, really dating to the beginning of the Industrial Revolution, our jobs became doing machine-like work, that the machines of the age couldn’t do it. The most obvious example being in factories and assembly-line jobs,” Colvin told me. “We are finally achieving an era in which the machines actually can do the machine-like work. They leave us to do the in-person, face-to-face work.”

_______________________

If self-driving cars and automated drone delivery become a reality, what happens to every delivery driver, truck driver and cab driver? Swaths of the population won’t be able to be retrained with skills needed in the new economy. Inequality will rise.

“One way or another it’s going to be kind of brutal,” [Jerry] Kaplan said. “When you start talking about 30 percent of the U.S. population being on the edge of losing their jobs, it’s not going to be a pleasant life and you’re going to get this enormous disparity between the haves and the have nots.”•

 


From the March 28, 1886 Brooklyn Daily Eagle:


In the latest excellent Sue Halpern NYRB piece, this one about Ashlee Vance’s Elon Musk bio, the critic characterizes the technologist as equal parts Iron Man and Tin Man, a person of otherworldly accomplishment who lacks a heart, his globe-saving goals having seemingly liberated him from a sense of empathy.

As Halpern notes, even Steve Jobs, given to auto-hagiography of stunning proportion, had ambitions dwarfed by those of Musk, who aims not just to save the planet but to take us to a new one, engaging in a Space Race to Mars with NASA (while simultaneously doing business with the agency). The founder of SpaceX, Tesla, etc., may be parasitic on existing technologies, but he’s intent on revitalizing, not damaging, his hosts, doing so by bending giant corporations, entire industries and even governments to his will. An excerpt:

Two years after the creation of SpaceX, President George W. Bush announced an ambitious plan for manned space exploration called the Vision for Space Exploration. Three years later, NASA chief Michael Griffin suggested that the space agency could have a Mars mission off the ground in thirty years. (Just a few weeks ago, six NASA scientists emerged from an eight-month stint in a thirty-six-foot isolation dome on the side of Mauna Loa meant to mimic conditions on Mars.) Musk, ever the competitor, says he will get people to Mars by 2026. The race is on.

How are those Mars colonizers going to communicate with friends and family back on earth? Musk is working on that. He has applied to the Federal Communications Commission for permission to test a satellite-beamed Internet service that, he says, “would be like rebuilding the Internet in space.” The system would consist of four thousand small, low-orbiting satellites that would ring the earth, handing off services as they traveled through space. Though satellite Internet has been tried before, Musk thinks that his system, relying as it does on SpaceX’s own rockets and relatively inexpensive and small satellites, might actually work. Google and Fidelity apparently think so too. They recently invested $1 billion in SpaceX, in part, according to The Washington Post, to support Musk’s satellite Internet project.

While SpaceX’s four thousand circling satellites have the potential to create a whole new meaning for the World Wide Web, since they will beam down the Internet to every corner of the earth, the system holds additional interest for Musk. “Mars is going to need a global communications system, too,” he apparently told a group of engineers he was hoping to recruit at an event last January in Redmond, Washington. “A lot of what we do developing Earth-based communications can be leveraged for Mars as well, as crazy as that may sound.”


Jeez, Jim Holt is a dream to read. If you’ve never picked up his 2012 philosophical and moving inquiry, Why Does the World Exist?: An Existential Detective Story, it’s well worth your time. For a while it was cheekily listed No. 1 at the Strand in the “Best Books to Read While Drunk” category, but I don’t drink, and I adored it.

In a 2003 Slate article, “My Son, the Robot,” Holt wrote of Bill McKibben’s Enough: Staying Human in an Engineered Age, a cautionary nonfiction tale warning that our siblings would soon be silicon sisters, thanks to the progress of genetic engineering, robotics, and nanotechnology. It was only a matter of time.

Holt was unmoved by the clarion call, believing human existence unlikely to be at an inflection point and thinking the author too dour about tomorrow. While Holt’s certainly right that we’re not going to defeat death anytime soon despite what excitable Transhumanists promise, both McKibben and the techno-optimists probably have time on their side.

An excerpt:

Take McKibben’s chief bogy, genetic engineering—specifically, germline engineering, in which an embryo’s DNA would be manipulated in the hopes of producing a “designer baby” with, say, a higher IQ, a knack for music, and no predisposition to obesity. The best reason to ban it (as the European Community has already done) is the physical risk it poses to individuals—namely, to the children whose genes are altered, with unforeseen and possibly horrendous consequences. The next best reason is the risk it poses to society—exacerbating inequality by creating a “GenRich” class of individuals who are smarter, healthier, and handsomer than the underclass of “Naturals.” McKibben cites these reasons, as did Fukuyama (and many others) before him. However, what really animates both authors is a more philosophical point: that genetic engineering would alter “human nature” (Fukuyama) or take away the “meaning” of life (McKibben). As far as I can tell, the argument from human nature and the argument from meaning are mere terminological variants of each other. And both are woolly, especially when contrasted with the libertarian argument that people should be free to do what they wish as long as other parties aren’t harmed.

Finally, McKibben’s reasoning fitfully betrays a vulgar variety of genetic determinism. He approvingly quotes phrases like “genetically engineered thoughts.” Altering an embryo’s DNA to make your child, say, less prone to violence would turn him into an “automaton.” Giving him “genes expressing proteins to boost his memory, to shape his stature” would leave him with “no more choice about how to live his life than a Hindu born untouchable.” Why isn’t the same true with the randomly assigned genes we now have?

Now to the deeper fallacy. McKibben takes it for granted that we are at an inflection point of history, suspended between the prehistoric and the Promethean. He writes, “we just happen to be alive at the brief and interesting moment when [technological] growth starts to really matter—when it spikes.” Everything is about to change.

The extropian visionaries arrayed against him—people like Ray Kurzweil, Hans Moravec, Marvin Minsky, and Lee Silver—agree. In fact, they think we are on the verge of conquering death (which McKibben thinks would be a terrible thing). And they mean RIGHT NOW. When you die, you should have your brain frozen; then, in a couple of decades, it will get thawed out and nanobots will repair the damage; then you can start augmenting it with silicon chips; finally, your entire mental software, and your consciousness along with it (you hope), will get uploaded into a computer; and—with multiple copies as insurance—you will live forever, or at least until the universe falls apart.•

 


If ever two writers’ hearts beat as one despite a generational divide, they were Henry Miller and Hunter S. Thompson. When I tweeted on Saturday about a 1965 Thompson article regarding Big Sur becoming too big for its own good, it reminded me of an earlier piece the Gonzo journalist had written about the community, a 1961 Rogue article centered on Miller’s life there. Big Sur was a place the novelist went for peace and solitude, which worked out well until aspiring orgiasts located it on a map and became his uninvited cult. Despite Miller’s larger-than-life presence, Thompson focuses mostly on the eccentricities of the singular region. I found the piece at Totallygonzo.org. Just click on the pages for a larger, readable version.

 


In one of his typically bright, observant posts, Nicholas Carr wryly tackles Amazon’s new scheme of paying Kindle Unlimited authors based on how many of their pages are read, a system which reduces the written word to a granular level of constant, non-demanding engagement. 

There’s an argument to be made that like systems have worked quite well in the past: Didn’t Charles Dickens publish under similar if not-as-precisely-quantified circumstances when turning out his serial novels? Sort of. Maybe not to the same minute degree, but he was usually only as good as his last paragraph (which, thankfully, was always pretty good).

The difference is that while it worked for Dickens, this process hasn’t been the motor behind most of the great writing in our history. James Joyce would not have survived very well on this nano scale. Neither would have Virginia Woolf, William Faulkner, Marcel Proust, etc. Their books aren’t just individual pages leafed together but a cumulative effect, a treasure that comes only to those who clear obstacles.

Shakespeare may have had to pander to the groundlings to pay the theater’s light bill, but what if the lights had been turned off mid-performance whenever he went more than a page without aiming for the bottom of the audience?

Carr’s opening:

When I first heard that Amazon was going to start paying its Kindle Unlimited authors according to the number of pages in their books that actually get read, I wondered whether there might be an opportunity for an intra-Amazon arbitrage scheme that would allow me to game the system and drain Jeff Bezos’s bank account. I thought I might be able to start publishing long books of computer-generated gibberish and then use Amazon’s Mechanical Turk service to pay Third World readers to scroll through the pages at a pace that would register each page as having been read. If I could pay the Turkers a fraction of a penny less to look at a page than Amazon paid me for the “read” page, I’d be able to get really rich and launch my own space exploration company.

Alas, I couldn’t make the numbers work. Amazon draws the royalties for the program from a fixed pool of funds, which serves to cap the upside for devious scribblers.

So much for my Mars vacation. Still, even in a zero-sum game that pits writer against writer, I figured I might be able to steal a few pennies from the pockets of my fellow authors. (I hate them all, anyway.) I would just need to do a better job of mastering the rules of the game, which Amazon was kind enough to lay out for me:

Under the new payment method, you’ll be paid for each page individual customers read of your book, the first time they read it. … To determine a book’s page count in a way that works across genres and devices, we’ve developed the Kindle Edition Normalized Page Count (KENPC). We calculate KENPC based on standard settings (e.g. font, line height, line spacing, etc.), and we’ll use KENPC to measure the number of pages customers read in your book, starting with the Start Reading Location (SRL) to the end of your book.

The first thing that has to be said is that if you’re a poet, you’re screwed.•

 


David Brooks’ recent op-ed about Ta-Nehisi Coates, the one in which he approached the critic and his skin color cautiously and with some surprise, as if he’d happened upon a strange creature in a forest, was most troubling to me for two reasons, both of which were demonstrated in the same passage.

This one:

You are illustrating the perspective born of the rage “that burned in me then, animates me now, and will likely leave me on fire for the rest of my days.”

I read this all like a slap and a revelation. I suppose the first obligation is to sit with it, to make sure the testimony is respected and sinks in. But I have to ask, Am I displaying my privilege if I disagree? Is my job just to respect your experience and accept your conclusions? Does a white person have standing to respond?

If I do have standing, I find the causation between the legacy of lynching and some guy’s decision to commit a crime inadequate to the complexity of most individual choices.

I think you distort American history.•

  • The line “Does a white person have standing to respond?” is a galling transference of burden from people of actual oppression to people who’ve experienced none, a time-tested trick in this country, and one used repeatedly in discussions about Affirmative Action and other issues. The faux victimhood is appalling.
  • Even worse is this doozy: “I find the causation between the legacy of lynching and some guy’s decision to commit a crime inadequate to the complexity of most individual choices.” Here’s a negation of history and culpability that’s jaw-dropping. Does Brooks believe the inordinate poverty, imprisonment and shorter lifespans of African-Americans stem solely from their decisions? Does he believe Native Americans, the targets of another American holocaust, have such pronounced social problems because of poor choices they made? From Jim Crow to George Zimmerman, American law and justice have often been designed to reduce former slaves and their descendants, not to keep the peace but to maintain power. If Brooks wants to point out the Civil Rights Act as a remedy to Jim Crow, that’s fine, but you don’t get to take a victory lap for offering basic decency, and a sensible person recognizes that improvement in our country, often a slow and grueling and bloody thing, still leaves deep scars.•

