Books


“Carver’s prose does not flinch,” Giles Harvey wrote in a 2010 NYRB piece about the short stories of the late deadpan tragedian Raymond Carver, which were marked by sinewy sentences that spoke truths that could be neither altered nor protested.

For decades there’s been a controversy over the role of Carver’s editor Gordon Lish in perfecting the lean stories, which were pruned and reshaped significantly from manuscript to publication. It’s a contretemps that survived Carver, who died at 50 in 1988, and one that continues to irk his widow, Tess Gallagher, though it does seem Lish’s participation was more meaningful than the average editor’s. And it also feels like the stories were sometimes cut too much, that fragments of bone were removed along with the fat.

More from Harvey’s piece, a passage about Carver receiving the edited copy of What We Talk About When We Talk About Love:

He had just spent the whole night going over Lish’s edited version of the book and was taken aback by the changes. His manuscript had been radically transformed. Lish had cut the total length of the book by over 50 percent; three stories were at least 70 percent shorter; ten stories had new titles and the endings of fourteen had been rewritten.•

Of course, we don’t let questions about Shakespeare’s authorship ruin that canon, so eventually this literary feud will matter little; only the work will remain. But you can understand how it peeves the surviving principals.

The opening of D.T. Max’s 1998 New York Times Magazine piece, “The Carver Chronicles”:

For much of the past 20 years, Gordon Lish, an editor at Esquire and then at Alfred A. Knopf who is now retired, has been quietly telling friends that he played a crucial role in the creation of the early short stories of Raymond Carver. The details varied from telling to telling, but the basic idea was that he had changed some of the stories so much that they were more his than Carver’s. No one quite knew what to make of his statements. Carver, who died 10 years ago this month, never responded in public to them. Basically it was Lish’s word against common sense. Lish had written fiction, too: If he was such a great talent, why did so few people care about his own work? As the years passed, Lish became reluctant to discuss the subject. Maybe he was choosing silence over people’s doubt. Maybe he had rethought what his contribution had been — or simply moved on.

Seven years ago, Lish arranged for the sale of his papers to the Lilly Library at Indiana University. Since then, only a few Carver scholars have examined the Lish manuscripts thoroughly. When one tried to publish his conclusions, Carver’s widow and literary executor, the poet Tess Gallagher, effectively blocked him with copyright cautions and pressure. I’d heard about this scholar’s work (and its failure to be published) through a friend. So I decided to visit the archive myself.

What I found there, when I began looking at the manuscripts of stories like “Fat” and “Tell the Women We’re Going,” were pages full of editorial marks — strikeouts, additions and marginal comments in Lish’s sprawling handwriting. It looked as if a temperamental 7-year-old had somehow got hold of the stories. As I was reading, one of the archivists came over. I thought she was going to reprimand me for some violation of the library rules. But that wasn’t why she was there. She wanted to talk about Carver. “I started reading the folders,” she said, “but then I stopped when I saw what was in there.”

It’s understandable that Lish’s assertions have never been taken seriously. The eccentric editor is up against an American icon. When he died at age 50 from lung cancer, Carver was considered by many to be America’s most important short-story writer. His stories were beautiful and moving. At a New York City memorial service, Robert Gottlieb, then the editor of The New Yorker, said succinctly, “America has just lost the writer it could least afford to lose.” Carver is no longer a writer of the moment, the way David Foster Wallace is today, but many of his stories — “Cathedral,” “Will You Please Be Quiet, Please?” and “Errand” — are firmly established in the literary canon. A vanguard figure in the 1980’s, Carver has become establishment fiction.

That doesn’t capture his claim on us, though. It goes deeper than his work. Born in the rural Northwest, Carver was the child of an alcoholic sawmill worker and a waitress. He first learned to write through a correspondence course. He lived in poverty and suffered multiple bouts of alcoholism throughout his 30’s. He struggled in a difficult marriage with his high-school girlfriend, Maryann Burk. Through it all he remained a generous, determined man — fiction’s comeback kid. By 1980, he had quit drinking and moved in with Tess Gallagher, with whom he spent the rest of his life. “I know better than anyone a fellow is never out of the woods,” he wrote to Lish in one of dozens of letters archived at the Lilly. “But right now it’s aces, and I’m enjoying it.” Carver’s life and work inspired faith, not skepticism.

Still, a quick look through Carver’s books would suggest that what Lish claims might have some merit.•

Tags: ,

Of the two things that could transform the world, Tesla and SpaceX, the former is far more likely to succeed in its goal, which would be to environmentally remake the home and roads, but Elon Musk sees each as equally necessary for the human race to survive. Bloomberg has published an excellent segment from Ashlee Vance’s new book about Musk in which the writer makes clear how close the industrialist/technologist came to losing both the electric-and-solar empire and a shot at colonizing Mars.

SpaceX began with a dream of sending mice to our neighboring planet in a rocket purchased from the Russians, but frustration as a customer forced Musk to build his own mini-NASA start-up, and his ambitions grew exponentially.

An excerpt:

Elon and Justine decided to move south to begin their family and the next chapter of their lives in Los Angeles. Unlike many Southern California transplants, they were drawn by the technology. The mild, consistent weather made it ideal for the aeronautics industry, which had been there since the 1920s, when Lockheed Aircraft set up shop in Hollywood. Howard Hughes, the U.S. Air Force, NASA, Boeing, and a mosaic of support industries followed suit. While Musk’s space plans were vague at the time, he felt confident that he could recruit some of the world’s top aeronautics thinkers and get them to join his next venture.

Musk started by crashing the Mars Society, an eclectic collection of space enthusiasts dedicated to exploring and settling the Red Planet. They were holding a fund-raiser in mid-2001, a $500-per-plate event at the house of one of the well-off Mars Society members. What stunned Robert Zubrin, the head of the group, was the reply from someone named Elon Musk, whom no one could remember inviting. “He gave us a check for $5,000,” Zubrin said. “That made everyone take notice.” Zubrin invited Musk for coffee ahead of the dinner and told him about the research center the society had built in the Arctic to mimic the tough conditions of Mars and the experiments they had been running for something called the Translife Mission, in which there would be a capsule orbiting earth carrying a crew of mice. It would spin to give them one-third gravity—the same as Mars—and they would live there and make babies.

When it was time for dinner, Zubrin placed Musk at the VIP table next to himself, the director and space buff James Cameron, and Carol Stoker, a planetary scientist for NASA. Musk loved it. “He was much more intense than some of the other millionaires,” Zubrin said. “He didn’t know a lot about space, but he had a scientific mind. He wanted to know exactly what was being planned in regards to Mars and what the significance would be.” Musk took to the Mars Society right away and joined its board of directors. He donated an additional $100,000 to fund a research station in the desert.

Musk’s friends were not entirely sure what to make of his mental state at that time. He’d caught malaria while on vacation in Africa and lost a tremendous amount of weight fighting it off. Musk stands 6-foot-1 but usually seems much bigger than that. He’s broad-shouldered, sturdy, and thick. This version of Musk, though, looked emaciated and with little prompting would start expounding on his desire to do something meaningful with his life. “He said, ‘The logical thing to happen next is solar, but I can’t figure out how to make any money out of it,’ ” said George Zachary, an investor and close friend of Musk’s, recalling a lunch date at the time. “He started talking about space, and I thought he meant office space like a real estate play.” Musk had already started thinking beyond the Mars Society’s goals. Rather than send a few mice into earth’s orbit, Musk wanted to send them to Mars.

“He asked if I thought that was crazy,” Zachary said. “I asked, ‘Do the mice come back? Because, if they don’t, yeah, most people will think that’s crazy.’ ” Musk said that the mice were not only meant to go to Mars and come back but they also would come home with the baby mice, too.•

Tags: , , , , ,

I’ve been on the receiving end of puzzling looks more than once for saying Catch-22 isn’t the best novel Joseph Heller wrote. That work is a classic American novel in the sense that it overflows and sprawls, attempting almost too much–it’s brilliant and flawed. On the other hand, Something Happened, Heller’s devastating, little-read 1974 book, is a precise masterpiece without a wasted word. I haven’t re-read either since my college years, so perhaps my feelings would be reversed now, but both deserve at least equal attention, which is certainly not how it’s worked out.

In an essay at the Los Angeles Review of Books, Carmen Petaccio agrees with me, encouraging readers to discover Heller’s forgotten novel. The opening:

THE MOST CRIMINALLY OVERLOOKED great novel of the past half century is a book called Something Happened, which this year celebrates the 40th anniversary of its publication. Joseph Heller spent more than a decade writing the novel and was so convinced of its genius that he stashed manuscripts all over Manhattan, ensuring that Something Happened would survive in the event his apartment burned down. When he finally brought the completed draft to his agent, he forced his daughter to accompany him on the trip — so she could deliver the pages in case he suffered a coronary or got hit by a bus. In 1974, 13 years after Catch-22 began its gradual ascent into the rarefied realm of idiom, Something Happened was released to a collective cultural shrug, delivering the book its first firm nudge down the slippery slope that bottoms at obscurity. Today, the novel is perhaps best remembered for Kurt Vonnegut’s artfully impartial appraisal in The New York Times Book Review, which described it as “one of the unhappiest books ever written.” Vonnegut wasn’t entirely wrong.

Something Happened is, by design, a punishingly bleak novel. It’s dense and overlong, sometimes sadistically so, and it offers a minimum in the way of resolution or plot. If the novel’s worldview were a color, the human eye would likely fail to perceive its darkness. What is surprising, though, is how by virtue of that same bleakness, Something Happened becomes one of the most pleasurable, engrossing, and in retrospect moving American novels ever written. If you’ve read Something Happened, and you get why others haven’t, then you make it your little mission to convince people that they should.•

Tags: ,

The occasion of French astronomer and author Camille Flammarion’s second marriage in 1920 gave the Brooklyn Daily Eagle an opportunity to publish his thoughts on a machine Thomas Edison announced he was working on, which would purportedly allow the living to communicate with the dead. Talk about a long-distance call.

Flammarion, who believed a personality of sorts survived after life had ended, was understandably excited about the deceased being conjured via allegedly scientific means in Menlo Park. In addition to the serious astronomical work he published, Flammarion wrote sci-fi and speculative narratives and is credited with birthing the idea of an alien race superior to Earthlings, something he believed to be true and also utilized as a plot device in his fiction.


Tags: ,

An Economist article about Richard Thaler’s new book, Misbehaving: The Making of Behavioral Economics, looks at how this sub-field of the dismal science still meets resistance when applied beyond the granular level–to the illogic of the broader economy as opposed to the folly of the individual. The opening:

CAB drivers have good days and bad days, depending on the weather or special events such as a convention. If they were rational, they would work hardest on the good days (to maximise their take) but give up early when fares are few and far between. In fact, they do the opposite. It seems they have a mental target for their desired daily income and they work long enough to reach it, even though that means working longer on slow days and going home early when fares are plentiful.

Human beings are not always logical. We treat windfall gains differently from our monthly salary. We value things that we already own more highly than equivalent things we could easily buy. Our responses to questions depend very much on how the issue is framed: we think surcharges on credit-card payments are unfair, but believe a discount for paying with cash is reasonable.

None of these foibles will be a surprise to, well, humans. But they are not allowed for in many macroeconomic models, which tend to assume people actually come from the planet Vulcan, all coolly maximising their utility at every stage. Over the past 30-40 years, in contrast, behavioural economists have explored the way that individuals actually make decisions, and have concluded that we are more Kirk than Spock.•
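
The cabbies’ error is easy to make concrete. Here’s a back-of-the-envelope simulation–mine, not the Economist’s, with invented fare rates–of two drivers who put in identical hours, one quitting at a fixed daily income target and one shifting effort toward the good days:

```python
# Invented numbers: "good" days pay $40/hour, "bad" days pay $20/hour,
# 50 of each, and the income targeter wants $240 every single day.
GOOD_RATE, BAD_RATE, N_DAYS, TARGET = 40, 20, 50, 240

# The targeter quits once the target is hit: 6 hours on good days,
# 12 hours on bad ones.
target_hours = N_DAYS * (TARGET / GOOD_RATE) + N_DAYS * (TARGET / BAD_RATE)
target_income = 2 * N_DAYS * TARGET

# The rational driver works the same 900 total hours, but flips the
# schedule: 12 hours when fares are plentiful, 6 when they are scarce.
rational_hours = N_DAYS * 12 + N_DAYS * 6
rational_income = N_DAYS * (12 * GOOD_RATE) + N_DAYS * (6 * BAD_RATE)

print(f"hours: {target_hours:.0f} vs. {rational_hours}")   # 900 vs. 900
print(f"income: ${target_income} vs. ${rational_income}")  # $24,000 vs. $30,000
```

Same hours, 25 percent more money, just from reversing when the hours get worked.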

Tags:

Like myself, Elon Musk rents sumo wrestlers for his parties. I, however, also employ former astronauts to serve drinks. You can put your cigarette out on Buzz Aldrin’s forehead, and he will accept it with quiet resignation. 

Seriously, Elon Musk is a super-wealthy, highly driven and somewhat odd guy, which we already know, but in Dwight Garner’s NYT review of Ashlee Vance’s new Musk bio, his features are given some definition. An excerpt:

Other eye-popping details, not all of them previously reported, are flecked atop this book like sea salt. His five children don’t merely have nannies but have had a nanny manager. He worries that Google is building a fleet of robots that may accidentally destroy mankind. He rents castles and sumo wrestlers for his parties. At one of them, a knife thrower aimed at a balloon between the blindfolded Mr. Musk’s legs.

The best thing Mr. Vance does in this book, though, is tell Mr. Musk’s story simply and well. It’s the story of an intelligent man, for sure. But more so it is the story of a determined one. Mr. Musk’s work ethic has always been intense. One observer says about him early on, “We all worked 20 hour days, and he worked 23 hours.”

Mr. Musk was born in 1971 and grew up in Pretoria. His father was an engineer; his mother, whose family had roots in the United States and Canada, was a model and dietitian. There are indications his father was brutal, and that Mr. Musk is a tortured soul trying to make up for a wrecked childhood. But no one will speak specifically about any such events.•

Tags: , ,

Following up on the recent Ask Me Anything conducted by Philip Zimbardo, architect of the Stanford Prison Experiment, the psychologist is interviewed by Stuart Jeffries of the Guardian about the new book he’s coauthored, Man (Dis)connected: How Technology Has Sabotaged What It Means To Be Male, a treatment of the boys-in-peril thesis he’s been pushing in recent years.

I’m really skeptical of Zimbardo’s generalizations, his idea that a scary number of guys are essentially receptacles for “porn, video games and Ritalin.” In the article, he proffers the dubious idea that mothers love unconditionally and fathers provisionally, a stereotype that runs afoul of reality. Zimbardo also believes young men are retreating from work and responsibilities for reasons which have nothing to do with the paucity of jobs, which seems equally doubtful. I really don’t recognize any of the men I know in his stereotypes.

There are certain aspects of American unhappiness that can be analyzed along gender or race or class lines, but I think our biggest collective psychological problem is that we’re sold on consumer-culture idealizations that are bound to leave us disappointed. That, too, is a generalization, though I think a far more believable one than Zimbardo’s.

From Jeffries:

The book, by Zimbardo and his co-author Nikita D Coulombe, is about why boys don’t man up as previous generations of males ostensibly did.

They argue that, while girls are increasingly succeeding in the real world, boys are retreating into cyberspace, seeking online the security and validation they can’t get anywhere else. They are bored at school, increasingly have no father figures to motivate them, don’t have the skills to form real romantic relationships, feel entitled to have things done for them (usually by their parents) and seek to avoid a looming adulthood of debt, unfulfilling work and other irksome responsibilities. As a result, they disappear into their bedrooms where, he argues, they risk becoming addicted to porn, video games and Ritalin.

No wonder, Zimbardo argues, popular culture teems with moodles (“man poodles”) or infantilised jerks (think: Jackass, Failure to Launch, Step Brothers, Hall Pass and The Hangover series), devoid of economic purpose, emotional intelligence, temperamentally unable to commit or take responsibility.

Zimbardo claims that a majority of African-American boys have been brought up in female-dominated households for generations. “Sixty, 70% grow up in a female world. I would trace a lot of that poor performance of black kids to not having a father present to make demands and not setting limits. This is now spilling out of the black community to the white community.”•

Tags: ,

I’m nearly done with Martin Ford’s Rise of the Robots, an excellent book I’ll have more to say about soon. The author sat for an episode of the Review the Future podcast and spoke on automation and wealth inequality.

Ford agrees with something I mentioned in response to Thomas Piketty’s suggestion that we counteract our 1% world with an investment in education (and re-education). While that would be great, I think it likely won’t be nearly enough to solve income disparity or technological unemployment.

Two exchanges from the podcast follow.

_____________________________

Jon Perry: 

This topic often is scoffed at by economists, which is something you mentioned in your first book The Lights in the Tunnel, that it was almost unthinkable to a lot of people. I’m curious, as that book came out in 2009 and now six years have gone by, how has the conversation around this issue changed since then?

Martin Ford:

Well, I think it’s definitely changed and it’s become a lot more visible. As you say, when I wrote that book, I mentioned that it was almost unthinkable in terms of the way economists approached it, and that’s less true today. There are definitely some economists out there, at least a few, that are now talking seriously about this. So, it’s really been quite a dramatic change as these technologies and the implications of them have become more visible.

Having said that, I still think this is very much an idea that is outside the mainstream, and a lot of the economists that do talk about it tend to take what I would call a very conservative tack, which is they still tend to believe that the solution to all this is more education and training–all we have to do is train people so that they can climb the skills ladder and keep ahead of the machines. I think that’s an idea that pretty much has run out of steam, and we probably need to look at more radical proposals going forward. But definitely it is a much more visible topic now than it was back in 2009.

_____________________________

Jon Perry: 

Now, we’ve had information technology for not that long, but we’ve had it for a little while now, and there are certainly some troubling economic trends that we see–stagnating wages, rising inequality, for example. To what extent can we say that, say, information technology or even technology in general is at least partially the cause of these economic trends? And if that’s the thesis here, what would be the best way to actually try to tease that apart and measure technology’s impact on the labor force going forward?

Martin Ford:

It’s kind of a challenging problem. I believe obviously very strongly that information technology has been an important part of it. I would not argue that it’s all of it by any means. And in my new book, Rise of the Robots, I point out about seven general trends that you can look at.

That includes the fact that wages have stagnated while productivity has continued to increase, so productivity and incomes have kind of decoupled. It includes the fact that the share of income going to labor as opposed to capital has gone into a pretty precipitous decline, especially since the year 2000. The labor force participation rate, meaning the number of people who are actually actively engaged in work, is falling. We’re seeing wages for college graduates actually going into decline, so it’s not the case anymore that people with higher educations are doing extremely well; a lot of people with college degrees are also being impacted.

So, there are a number of things you could look at there, and the thing is that if you take any one of these and you look at the research that economists have done, there are lots of explanations. Technology is nearly always one of the explanations, but there, of course, are other explanations. There’s globalization, there’s a basic change in our politics, which has become more conservative. In particular, there’s the decimation of unions in the private sector. Depending on who’s doing the analysis and sometimes what their agenda is, they will point to those things as being more important than technology.

But what I believe is that if you take all of that evidence collectively, if you look at all of those things together, it’s really hard to come up with one explanation other than technology that can explain all of those things.•

Tags: ,

One revelation from the Reddit AMA conducted by Philip Zimbardo, still best known for the infamous 1971 Stanford Prison Experiment, a dress rehearsal more or less for Abu Ghraib, was that the psychologist was high school classmates with Stanley Milgram, author of the equally controversial “Obedience to Authority” study. That must have been some high school! Zimbardo was joined by writer Nikita Coulombe to discuss their new book Man (Dis)connected. A few exchanges follow about the notorious test at Stanford.

___________________________

Question:

If you had a chance to do the Stanford Prison Experiment again, what would you do differently?

Philip Zimbardo:

Yes I would, I would have only played the role of researcher and there would be someone above me, who would be the superintendent of the prison and when things got out of hand I would have been in a better position to terminate the study earlier and more appropriately.

___________________________

Question:

In context of the famous prison experiment, when you were first organizing it, what were some of the specific dangers you tried to avoid?

Philip Zimbardo:

We selected young men who were physically healthy and psychologically normal, we had prior arrangements with student health if that was necessary. Each student was given informed consent, so they knew that there would likely be some levels of stress, so they had some sense of what was to come. Physical violence by the guards, especially if there was a revolt, solitary confinement beyond the established one hour limit, but primarily trying to minimise acts of sexual degradation.

___________________________

Question:

Being particularly interested in social psychology, I’m a big fan of what you have accomplished through your research. I was wondering what really got you interested in social psychology, and your research is connected to that of Stanley Milgram, another favourite psychologist of mine – so what I’m asking is what initially got you into this field of psychology, and what did you think of Milgram’s research when you first came across it?

Philip Zimbardo:

Thank you. I was interested in psychology from a young age: I grew up in the Bronx in the 1930s and started wondering why some people would go down certain paths, like joining a gang, while others didn’t. I was also high school classmates with Stanley Milgram; we were both asking the same questions.

___________________________

Question:

If there was a film adaptation dramatizing the events of the Stanford Prison Experiment, who would you want to play you?

Philip Zimbardo:

Glad you asked the question, amazingly there is a new Hollywood movie that just premiered at the Sundance film festival to great reviews winning lots of prizes titled The Stanford Prison Experiment. It will have national showings in America starting in July and hopefully in Europe in the Fall. I was hoping that the actor who would play me would be either Johnny Depp or Andy Garcia but they were not available so instead a wonderful young actor, Billy Crudup is Dr Z. You may be aware of his great acting in Almost Famous and Dr Manhattan in Watchmen.•

___________________________

“Jesus Christ, I’m burning up inside–don’t you know?”:

Tags: , ,

Last year, I posted a 1950 Brooklyn Daily Eagle article in which Norbert Wiener, father of cybernetics, predicted society being crushed by the metal grip of robots, with automation upending our accepted order. The year prior, the New York Times had assigned him to write about “what the ultimate machine age is likely to be”; the piece was never published at the time. That article is referenced in Martin Ford’s provocative new book, Rise of the Robots, so I thought I would present an excerpt (which eventually made it into the NYT two years ago). From the “Mass-Produced Laborers” section:

We have so far spoken of the computing machine as an analogue to the human nervous system rather than to the whole of the human organism. Machines much more closely analogous to the human organism are well understood, and are now on the verge of being built. They will control entire industrial processes and will even make possible the factory substantially without employees.

In these the ultra-rapid digital computing machines will be supplemented by pieces of apparatus which take the readings of gauges, of thermometers, or photo-electric cells, and translate them into the digital input of computing machines. The new assemblages will also contain effectors, by which the numerical output of the central machine will be converted into the rotation of shafts, or the admission of chemicals into a tank, or the heating of a boiler, or some other process of the kind.

Furthermore, the actual performance of these effector organs as well as their desired performance will be read by suitable gauges and taken back into the machine as part of the information on which it works.

The general outline of the processes to be carried out will be determined by what computation engineers call taping, which will state and determine the sequence of the processes to be performed. The possibility of learning may be built in by allowing the taping to be re-established in a new way by the performance of the machine and the external impulses coming into it, rather than having it determined by a closed and rigid setup, to be imposed on the apparatus from the beginning.

The limitations of such a machine are simply those of an understanding of the objects to be attained, and of the potentialities of each stage of the processes by which they are to be attained, and of our power to make logically determinate combinations of those processes to achieve our ends. Roughly speaking, if we can do anything in a clear and intelligible way, we can do it by machine.•
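
What Wiener is describing is a closed feedback loop: gauges translate physical readings into the digital input of the machine, the machine drives the effectors, and the effectors’ actual performance is read back in as information. Here’s a minimal sketch of that loop–the boiler, the setpoint and the proportional rule are my own illustration, not anything in Wiener’s text:

```python
def read_gauge(temperature):
    """Sensor: translate the physical reading into digital input."""
    return round(temperature, 1)

def compute_adjustment(setpoint, measured, gain=0.5):
    """Computing machine: a simple proportional rule."""
    return gain * (setpoint - measured)

def run_effector(temperature, heat):
    """Effector: heat the boiler, which also leaks heat to the room."""
    return temperature + heat - 0.1 * (temperature - 20.0)

temperature, setpoint = 20.0, 80.0
for _ in range(50):
    measured = read_gauge(temperature)             # gauge -> digital input
    heat = compute_adjustment(setpoint, measured)  # computation
    temperature = run_effector(temperature, heat)  # effector acts
    # The loop itself is Wiener's feedback: the actual performance
    # re-enters the machine as part of the information it works on.

print(f"boiler temperature after 50 cycles: {temperature:.1f}")
```

Run it and the temperature settles at 70 degrees, short of the 80-degree setpoint: the classic offset of a purely proportional rule, and just the kind of behavior Wiener’s re-established “taping” would exist to learn its way out of.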

Tags: ,

In a lucid and lively NYRB review of Robert D. Putnam’s Our Kids: The American Dream in Crisis, a new volume which suggests that a myriad of grassroots problems have caused social mobility to founder, Nicholas Lemann astutely questions whether the author has perhaps confused cause and effect. Maybe a fraying social fabric hasn’t led to a decline in upward mobility; maybe instead a flattening of the middle class, and of those who aspire to it, has caused a sense of community to crumble.

Due to globalization, de-unionization, changes in taxation and automation, the piece of the American pie enjoyed by the non-wealthy has been in steep decline in the U.S. since 1974. Between the lines, Putnam, like Lemann and Thomas Piketty, seems to acknowledge that higher education, more than social capital, is a likelier remedy to the problem. I wonder, though, how much longer that will be true. Technological forces that have disrupted other industries will likely soon come for the university, which could lead to an increased polarization in credentials and access (even as more knowledge than ever will be available online). That shift coupled with what might be a decline in opportunity owing to robotization could neutralize even the great equalizer.

Another interesting bit from Lemann: Individual mobility has always been something of a myth, as each person has usually been raised or lowered by the greater sweep of history and politics. Dick, born at an inopportune moment, would likely remain ragged.  

An excerpt:

By the logic of the book, access to social capital ought to be strongly associated with going to college and doing well there—otherwise, why stress it so strongly? The syllogism would be: social capital leads to educational attainment, which leads to mobility. But for his classmates, Putnam reports, academic achievement was the factor most predictive of college attendance, and the link between such achievement and parental encouragement (of the kind he has copiously praised in the main body of the book) was only “modestly important,” and “much weaker” than the link between class rank and college attendance. Not only that:

No other measure of parental affluence or family structure or neighborhood social capital (or indeed anything else we had measured)—none of the factors that this book has shown are so important in producing today’s opportunity gap—had any appreciable effect on college attendance or other educational attainment.

In the methods appendix, Putnam refers readers to his website for more detail on his findings about his classmates. There, he writes:

No measure of parental resources adds any predictive power whatsoever—not parental occupational status, not parental unemployment, not family economic insecurity during high school, not homeownership, not neighborhood characteristics, and not family structure…. Parental education, parental encouragement, and class rank were all modestly predictive of extracurricular participation, but holding constant those variables, extracurricular participation itself was unrelated to college-going.

So is it really the case that Putnam has shown that strong social capital once produced individual opportunity—let alone that the deterioration of social capital has produced what he calls the opportunity gap? The passages I just quoted seem to indicate that the strong association between social capital and opportunity that is Putnam’s core assertion has not been proven. Putnam doesn’t define “social capital” precisely enough to rigorously test its effects, even on as small and unrepresentative a sample as the one in his survey, and he doesn’t attempt to test its effects precisely in the present. It could even be that, rather than social capital generating prosperity, prosperity might generate social capital, which would mean Putnam has been showing us the effects of inequality, not the causes.•

Tags: ,

It’s certainly disingenuous that the UK publication the Register plastered the word “EXCLUSIVE” on Brid-Aine Parnell’s Nick Bostrom interview, since the philosopher, who’s become widely known for writing about existential risks in his book Superintelligence, has granted many interviews in the past. The piece is useful, however, for making it clear that Bostrom is not a confirmed catastrophist, but rather someone posing questions about challenges we may (and probably will) face should our species continue in the longer term. An excerpt:

Even if we come up with a way to control the AI and get it to do “what we mean” and be friendly towards humanity, who then decides what it should do and who is to reap the benefits of the likely wild riches and post-scarcity resources of a superintelligence that can get us out into the stars and using the whole of the (uninhabited) cosmos.

“We’re not coming from a starting point of thinking the modern human condition is terrible, technology is undermining our human dignity,” Bostrom says. “It’s rather starting from a real fascination with all the cool stuff that technology can do and hoping we can get even more from it, but recognising that there are some particular technologies that also could bring risks that we really need to handle very carefully.

“I feel a little bit like humanity is a bit like an infant or a teenager: some fairly immature person who has got their hands on increasingly powerful instruments. And it’s not clear that our wisdom has kept pace with our increasing technological prowess. But the solution to that is to try to turbo-charge the growth of our wisdom and our ability to solve global coordination problems. Technology will not wait for us, so we need to grow up a little bit faster.”

Bostrom believes that humanity will have to collaborate on the creation of an AI and ensure its goal is the greater good of everyone, not just a chosen few, after we have worked hard on solving the control problem. Only then does the advent of artificial intelligence and subsequent superintelligence stand the greatest chance of coming up with utopia instead of paperclipped dystopia.

But it’s not exactly an easy task.•

Tags: ,

I loved the Rem Koolhaas book Delirious New York, but I happened to be in Seattle in 2004 the week the Central Library he designed opened, and I wasn’t really enamored of it the way I am of many of his other works. It has an impressive exterior, but the interior felt like it was meant more to be looked at than utilized, though I guess that is the epitome of the modern library in a portable world, the best-case scenario, even–perhaps people will at least take a glance.

As his Fondazione Prada is set to open in Milan this month in a repurposed, century-old industrial space, the architect has become more focused on revitalization and preservation than on outré original visions. From a Spiegel Q&A with him conducted by Marianne Wellershoff:

Kultur Spiegel:

Does a building need to have a certain age or degree of prominence for us to recognize it as important?

Rem Koolhaas:

The idea of preservation dates back to the beginning of the modern age. During the 19th century, people essentially felt that something had to be at least 2,000 years old to be worthy of preservation. Today, we already decide during the planning stages how long a building should exist. At first, historical monuments were deemed worthy of preservation, then their surroundings, then city districts and finally large expanses of space. In Switzerland the entire Rhaetian Railway has been added to the list of UNESCO World Heritage Sites. The dimensions and repertoire of what is worthy of preserving have expanded dramatically.

Kultur Spiegel:

Were there structures in recent years that you think should have been better preserved?

Rem Koolhaas:

The Berlin Wall, for example. Only a few sections remain, because no one knew at the time how to deal with this monument. I find that regrettable.

Kultur Spiegel:

And what do you think of the concrete architecture of the 1960s, a style known as brutalism? Should it be protected or torn down?

Rem Koolhaas:

We should preserve some of it. It would be madness for an entire period of architectural history — that had a major influence on cities around the world — to disappear simply because we suddenly find the style ugly. This brings up a fundamental question: Are we preserving architecture or history?

Kultur Spiegel:

What is your answer?

Rem Koolhaas:

We have to preserve history.


Tags: ,

It would cost less to offer guaranteed paid work to unemployed Americans than to finance a social safety net, but there’s really no movement on either side of the aisle in Washington to aid the long-term unemployed, those left behind by the 2008 financial collapse and the growth of robotics. The problem has just been permitted to percolate.

In a Financial Times piece, Martin Wolf looks at two new titles about the haves and have-nots, Inequality: What Can be Done? by Anthony Atkinson and The Globalization of Inequality by François Bourguignon. Interesting that the acceleration of inequality is most marked in the U.S. and U.K. and has not been shared by all other industrialized nations. France, in fact, has seen disparity decrease during the same timeframe. An excerpt: 

Both authors agree that something should be done about inequality. Atkinson provides a number of arguments for concern over rising inequality within rich countries. Some argue, for example, that only equality of opportunity matters. To this he responds that successful personal outcomes are often merely a matter of luck, that the structure of rewards is often grossly unfair and that, with sufficient inequality of outcome, equality of opportunity must be a mirage.

Beyond this, argues Atkinson, unequal societies do not function well. The need to protect personal security or to incarcerate ever more people is likely to become a drag on economic performance and inimical to civilised life. If inequality becomes extreme, many will be unable to participate fully in their society. In any case, argues Atkinson, a pound in the hands of someone living on £10,000 a year must be worth more than it is to someone living on £1m. This does not justify complete equality, since the attempt to achieve it will impose costs. But it does mean that high inequality needs to be justified.

Atkinson goes far further, offering a programme of radical reform for the UK. It is not merely radical, but precise and (to the extent such a programme can be) costed. It starts from the argument that rising inequality “is not solely the product of forces outside our control. There are steps that can be taken by governments, acting individually or collectively, by firms, by trade union and consumer organisations, and by us as individuals to reduce the present levels of inequality.”

What about policy? At the global level, both authors recommend improved and more generous aid. Bourguignon adds that properly managed trade has much to offer developing countries. Within countries, both authors call for higher taxes on wealth and incomes, and for better regulation, particularly of finance. Also important, they agree, will be policies directly addressed at improving educational outcomes for the disadvantaged.

Thus policy makers should develop a national pay policy, including a statutory minimum wage set at the “living wage,” and should also offer guaranteed public employment at that rate.•

Tags: , ,

I’ve always traced the War on Drugs in the U.S. to the Nixon Administration, but British journalist Johann Hari, author of the new book Chasing the Scream, dates it to the end of Prohibition, particularly to bureaucrat Harry Anslinger, who later mentored Sheriff Joe Arpaio of Tent City infamy. He also reveals how intertwined the crackdown was (and is) with racism. No shocker there.

The so-called War has been a huge failure tactically and financially and has criminalized citizens for no good reason. All the while, there’s been a tacit understanding that millions of Americans are hooked on Oxy and the like, dousing their pain with a perfectly legal script. These folks are far worse off than pot smokers, who still run afoul of the law in most states. I’m personally completely opposed to recreational drug use, but I feel even more contempt for the War on Drugs. It’s done far more harm than good.

Matthew Harwood of the ACLU interviews Hari at Medium. The opening:

Matthew Harwood:

So Chasing the Scream, what’s with the title?

Johann Hari:

The most influential person who no one has ever heard of is Harry Anslinger, the man who invented the modern War on Drugs — way before Nixon, way before Reagan. He’s the guy who takes over the Federal Bureau of Prohibition just as alcohol prohibition is ending. So, he inherits this big government department with nothing to do, and he basically invents the modern drug war to give his bureaucracy a purpose. For example, he had previously said marijuana was not a problem — he wasn’t worried about it, it wasn’t addictive — but he suddenly announces that marijuana is the most dangerous drug in the world, literally — worse than heroin — and creates this huge hysteria around it. He’s the first person to use the phrase “warfare against drugs.”

But he was driven by more than just trying to keep his large bureaucracy in work. When he was a little boy, he grew up in a place called Altoona in Pennsylvania, and he had this experience that really drove him all his life. He lived near a farmer and his wife, and one day, he goes to the farmhouse, and the farmer’s wife was screaming and asking for something. The farmer sent little Harry Anslinger to the local pharmacy to buy opiates — because of course opiates were legal. Harry Anslinger hurries back and gives the opiates to the farmer’s wife, and the farmer’s wife stops screaming. But he remembered this as this foundational moment where he realized the evils of drugs, and he becomes obsessed with eradicating drugs from the face of the earth. So I think of him as chasing this scream across the world. The tragedy is he created a lot of screams in turn.

It leads him to construct this global drug war infrastructure that we are all living with now. We are all living at end of the barrel of Harry Anslinger’s gun. He didn’t do it alone — I’m not a believer in the “Great Man Theory of History.” He could only do that because he was manipulating the fears of his time. But he played a crucial role.

Matthew Harwood:

We here at the ACLU look at the drug war and see that it has a disproportionate impact on communities of color. You find, however, that this war was pretty racist from the beginning.

Johann Hari:

If you had said to me four years ago, “Why were drugs banned?” I would have assumed it for the reasons people would give today — because you don’t want kids to use them or you don’t want people to become addicted. What’s striking when you look at the archives from the time is that almost never comes up. Overwhelmingly the reason why drugs are banned is race hysteria.•

Tags: , ,

Even someone as lacking in religion as I am can be perplexed by Richard Dawkins’ midlife anti-theology mission to irk people of faith on chat shows and the like. In his proselytizing–and that’s what it is–he has the fervor of a particularly devout and curmudgeonly priest. It’s true that many a horrid act has been committed in the name of the father, but so have many others been committed by those who believe (as Dawkins and I do) that we’re orphans. I don’t want to deny someone on an operating table (or the one doing the operating) the comfort of believing a little in magic at that delicate moment, even if it is rot. Trust in science, and say a prayer if you like.

But I wouldn’t let his noisily running a chariot over the gods make me deny his wonderful intellect and contributions to knowledge, from genes to memes. At Edge, the site’s founder and longtime NYC avant-gardist, John Brockman, has an engrossing talk with the evolutionary biologist about his “vision of life.” The transcript makes for wonderful reading.

Dawkins believes if life exists elsewhere in the universe (and his educated guess is that it does), it’s of the Darwinian, evolutionary kind, that no other biological system besides the one we know would work under the laws of physics. He also notes that we contribute in our own way to the amazing progress of life, even if our time on the playing field can be brutal and brief. As Dawkins puts it, “we are temporary survival machines” coded to be hellbent on seeing our genes persevere, even though life will eventually evolve in ways presently unimaginable to us. It will still be life, and that’s our gift to it. No matter what we personally feel is the main purpose of our existence, it’s actually that.

The opening:

Natural selection is about the differential survival of coded information which has power to influence its probability of being replicated, which pretty much means genes. Coded information, which has the power to make copies of itself—“replicator”—whenever that comes into existence in the universe, it potentially could be the basis for some kind of Darwinian selection. And when that happens, you then have the opportunity for this extraordinary phenomenon which we call “life.”

My conjecture is that if there is life elsewhere in the universe, it will be Darwinian life. I think there’s only one way for this hyper complex phenomenon which we call “life” to arise from the laws of physics. The laws of physics—if you throw a stone up in the air, it describes a parabola, and that’s it. But biology, without ever violating the laws of physics, does the most extraordinary things; it produces machines which can run, and walk, and fly, and dig, and swing through the trees, and think, and produce the whole of human technology, human art, human music. This all comes about because at some point in history, about 4 billion years ago, a replicating entity arose, not a gene as we would now see it, but something functionally equivalent to a gene, which because it had the power to replicate and the power to influence its own probability of replicating, and replicated with slight errors, gave rise to the whole of life. 

If you ask me what my ambition would be, it would be that everybody would understand what an extraordinary, remarkable thing it is that they exist, in a world which would otherwise just be plain physics. The key to the process is self-replication. The key to the process is that … let’s call them “genes” because nowadays they pretty much all are genes. Genes have different probabilities of surviving. The ones that survive, because they have such high fidelity replication, are the ones which we see in the world, the ones which dominate gene pools in the world. So for me, the replicator, the gene, DNA, is absolutely key to the whole process of Darwinian natural selection. So when you ask the question, what about group selection, what about higher levels of selection, what about different levels of selection, everything comes down to gene selection. Gene selection is fundamentally what is really going on. 

Originally these replicating entities would have been floating free and just replicating in the primeval soup, whatever that was. But they “discovered” a technique of ganging together into huge robot vehicles, which we call individual organisms.•
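
The logic in that passage–replication, slight copying errors, differential survival–is mechanical enough to run. This toy model is mine, not Dawkins’s; a “replicator” here is nothing but a number giving its probability of copying itself in a given generation, and the soup’s finite resources do the selecting:

```python
import random

random.seed(42)
soup = [0.5] * 20   # 20 identical replicators; the value is each one's
                    # probability of managing to copy itself this round

for generation in range(200):
    next_soup = []
    for prob in soup:
        if random.random() < prob:           # differential survival
            copy = prob
            if random.random() < 0.05:       # slight errors in copying
                copy = min(1.0, max(0.0, copy + random.uniform(-0.1, 0.1)))
            next_soup += [prob, copy]        # parent persists, copy joins
    soup = next_soup[:200]                   # finite resources cap the soup
    if not soup:                             # guard against early extinction
        break

if soup:
    print(f"mean replication probability: {sum(soup) / len(soup):.2f}")
```

No foresight, no design: the variants that happen to copy themselves more reliably simply come to dominate the soup, which is all “gene selection” amounts to.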


Tags: ,

At the Gawker site Phase Zero, William M. Arkin conducted a very interesting Q&A with Harper’s Washington Editor, Andrew Cockburn, who’s just published a sadly timely book, Kill Chain, which focuses on the U.S. drone program. Although the author doesn’t believe military drones will become autonomous, he feels the bigger-picture machinery of the system already is. Remote war has been a dream pursued since Tesla, and now it’s a global reality. One exchange:

William M. Arkin:

The CIA’s drone program, the President’s drone program, Congressionally approved, not approved, tacitly accepted: almost every description of the drone program makes it sound like it isn’t the United States and its foreign policy. Is that the consequence of something unique to drones?

Andrew Cockburn:

It’s interesting, drones and covert foreign policy seem to go together. In Operation Menu, Nixon’s secret bombing of Cambodia, the B-52 flight paths were directed from the ground, as was the moment of bomb release. In other words, the B-52s were essentially drones. Maybe the drone campaign isn’t described as the foreign policy of the United States because there’s a tinge of embarrassment that we’re murdering people in foreign countries as a matter of routine.

Beyond that, maybe we should call it the drone program’s drone program, because it’s taken on a life of its own, a huge machine that exists to perpetuate itself. Just take a look at the jobs listed almost every day for just one of the Distributed Common Ground System stations at Langley AFB in the Virginia Southern Neck. On April 25, for example, various contractors (some of which you’ve never heard of) were asking for a “USAF Intelligence Resource Management Analyst,” a “Systems Integrator,” a “USAF Senior Intelligence Programs and Systems Support Analyst,” a “USAF ISR Weapons Systems Integration Support Analyst,” a “DPOC Network Engineer,” whatever that is, and a few others. All high paying, all of course requiring Top Secret or higher clearances. Every so often we hear that the CIA drone program is going to be turned over to the military. I say, ‘good luck with that’ – is the CIA really going to obligingly hand over a huge chunk of its raison d’etre, and its budget, its enormous targeting apparatus? There’s a lot of talk about “autonomous drones,” which aren’t going to happen, but I think the whole system is autonomous, one giant robot that has become unstoppable as it grinds along, sucking up money and killing people along the way.•

Tags: ,

Oliver Sacks on a motorcycle in NYC, 1961. (Photo by Douglas White.)

I’ve read most of Lawrence Weschler’s books and gotten so much from them, particularly Seeing Is Forgetting the Name of the Thing One Sees and Vermeer in Bosnia. In a new Vanity Fair article, he uses passages from a long-shelved biography of his friend Oliver Sacks, the terminally ill neurologist, to profile the doctor in a way only a confidant and great writer can, revealing the many lives Sacks has lived, in addition to the public-intellectual one we’re all familiar with. Weschler is convinced that Sacks’ period of excessive experimentation with drugs as a young man led to his later scientific breakthroughs. An excerpt:

I had originally written him a letter, sometime in the late 70s, from my California home. Somehow back in college I had come upon Awakenings, published in 1973, an account of his work with a group of patients who had been warehoused for decades in a home for the incurable—they were “human statues,” locked in trance-like states of near-infinite remove following bouts of a now rare form of encephalitis. Some had been in this condition since the mid-1920s. These people were suddenly brought back to life by Sacks, in 1969, following his administration of the then new “wonder drug” L-dopa, and Sacks described their spring-like awakenings and the harrowing siege of tribulations that followed. In the book, Sacks gave the facility where all this happened the pseudonym “Mount Carmel,” an apparent reference to Saint John of the Cross and his Dark Night of the Soul. But, as I wrote to Sacks in that first letter, his book seemed to me much more Jewish and Kabbalistic than Christian mystical. Was I wrong?

He responded with a hand-pecked typed letter of a good dozen pages, to the effect that, indeed, the old people’s home in question, in the Bronx, was actually named Beth Abraham; that he himself came from a large and teeming London-based Jewish family; that one of his cousins was in fact the eminent Israeli foreign minister Abba Eban (another, as I would later learn, was Al Capp, of Li’l Abner fame); and that his principal intellectual hero and mentor-at-a-distance, whose influence could be sensed on every page of Awakenings, had been the great Soviet neuropsychologist A.R. Luria, who was likely descended from Isaac Luria, the 16th-century Jewish mystic.

Our correspondence proceeded from there, and when, a few years later, I moved from Los Angeles to New York, I began venturing out to Oliver’s haunts on City Island. Or he would join me for far-flung walkabouts in Manhattan. The successive revelations about his life that made up the better part of our conversations grew ever more intriguing: how both his parents had been doctors and his mother one of the first female surgeons in England; how, during the Second World War, with both his parents consumed by medical duties that began with the Battle of Britain, he, at age eight, had been sent with an older brother, Michael, to a hellhole of a boarding school in the countryside, run by “a headmaster who was an obsessive flagellist, with an unholy bitch for a wife and a 16-year-old daughter who was a pathological snitch”; and how—though his brother emerged shattered by the experience, and to that day lived with his father—he, Oliver, had managed to put himself back together through an ardent love of the periodic table, a version of which he had come upon at the Natural History Museum at South Kensington, and by way of marine-biology classes at St. Paul’s School, which he attended alongside such close lifetime friends as the neurologist and director Jonathan Miller and the exuberant polymath Eric Korn. Oliver described how he gradually became aware of his homosexuality, a fact that, to put it mildly, he did not accept with ease; and how, following college and medical school, he had fled censorious England, first to Canada and then to residencies in San Francisco and Los Angeles, where in his spare hours he made a series of sexual breakthroughs, indulged in staggering bouts of pharmacological experimentation, underwent a fierce regimen of bodybuilding at Muscle Beach (for a time he held a California record, after he performed a full squat with 600 pounds across his shoulders), and racked up more than 100,000 leather-clad miles on his motorcycle. And then one day he gave it all up—the drugs, the sex, the motorcycles, the bodybuilding. By the time we started talking, he had been pretty much celibate for almost two decades.•

Tags: ,

The instability of the Argentine banking system (and the expense of dealing with it) has led a growing number of citizens to embark on a bold experiment using Bitcoin to sidestep institutions, a gambit which would probably not be attempted with the same zest in countries with relative financial stability. But if the service proves to be a large-scale success in Argentina, will it influence practices in nations heretofore resistant to cryptocurrency? And will a massive failure doom the decentralized system?

In a New York Times Magazine article adapted from Nathaniel Popper’s forthcoming Digital Gold: Bitcoin and the Inside Story of the Misfits and Millionaires Trying to Reinvent Money, the author writes of this new dynamic in the South American republic, which is enabled by itinerant digital money-changers like Dante Castiglione. An excerpt:

That afternoon, a plump 48-year-old musician was one of several customers to drop by the rented room. A German customer had paid the musician in Bitcoin for some freelance compositions, and the musician needed to turn them into dollars. Castiglione joked about the corruption of Argentine politics as he peeled off five $100 bills, which he was trading for a little more than 1.5 Bitcoins, and gave them to his client. The musician did not hand over anything in return; before showing up, he had transferred the Bitcoins — in essence, digital tokens that exist only as entries in a digital ledger — from his Bitcoin address to Castiglione’s. Had the German client instead sent euros to a bank in Argentina, the musician would have been required to fill out a form to receive payment and, as a result of the country’s currency controls, sacrificed roughly 30 percent of his earnings to change his euros into pesos. Bitcoin makes it easier to move money the other way too. The day before, the owner of a small manufacturing company bought $20,000 worth of Bitcoin from Castiglione in order to get his money to the United States, where he needed to pay a vendor, a transaction far easier and less expensive than moving funds through Argentine banks.

The last client to visit the office that Friday was Alberto Vega, a stout 37-year-old in a neatly cut suit who heads the Argentine offices of the American Bitcoin company BitPay, whose technology enables merchants to accept Bitcoin payments. Like other BitPay employees — there is a staff of six in Buenos Aires — Vega receives his entire salary in Bitcoin and lives outside the traditional financial system. He orders what he can from websites that accept Bitcoin and goes to Castiglione when he needs cash. On this occasion, he needed 10,000 pesos to pay a roofer who was working on his house.

Commerce of this sort has proved useful enough to Argentines that Castiglione has made a living buying and selling Bitcoin for the last year and a half. “We are trying to give a service,” he said.

That mundane service — harnessing Bitcoin’s workaday utility — is what so excites some investors and entrepreneurs about Argentina. Banks everywhere hold money and move it around; they help make it possible for money to function as both a store of value and a medium of exchange. But thanks in large part to their country’s history of financial instability, a small yet growing number of Argentines are now using Bitcoin instead to fill those roles. They keep the currency in their Bitcoin “wallets,” digital accounts they access with a password, and use its network when they need to send or spend money, because even with Castiglione or one of his competitors serving as middlemen between the traditional economy and the Bitcoin marketplace, Bitcoin can be cheaper and more convenient than Argentina’s financial establishment. In effect, Argentines are conducting an ambitious experiment, one that threatens ultimately to spread to the United States and disrupt some of the most basic services its banks have to offer.•
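The arithmetic in Popper’s anecdotes is simple enough to spell out. Here’s a minimal sketch in Python, assuming only the figures the excerpt implies: five $100 bills for a little more than 1.5 Bitcoins puts the rate near $330 per coin, and the bank route costs roughly 30 percent in currency controls. The money-changer’s own commission isn’t given, so it’s left as a zero-by-default parameter.

# Back-of-the-envelope comparison of the two routes described above.
# Every figure is illustrative, inferred from the excerpt; actual rates,
# spreads, and fees varied day to day.

def via_bank(foreign_earnings_usd, control_loss=0.30):
    # Roughly 30 percent is sacrificed to Argentina's currency controls
    # when foreign earnings arrive through a bank.
    return foreign_earnings_usd * (1 - control_loss)

def via_bitcoin(btc, usd_per_btc=330.0, changer_fee=0.0):
    # Selling Bitcoin for cash through a money-changer like Castiglione;
    # his commission is not specified, so it defaults to zero here.
    return btc * usd_per_btc * (1 - changer_fee)

print(via_bank(500.0))     # 350.0: what $500 of earnings becomes via a bank
print(via_bitcoin(1.5))    # 495.0: close to the five $100 bills in the excerpt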

Tags: ,

If you want to stop bubonic plague, killing as many cats and dogs as possible is probably not the most effective gambit. But that’s what the Lord Mayor of London opted to do in 1665, putting down nearly a quarter-million of the animals, natural predators of the rats whose fleas carried the lethal disease.

While we have a far greater understanding of epidemiology than our counterparts in the 17th century, we still probably accept some asinine ideas as gospel. In a Medium essay, Weldon Kennedy questions our faith in ourselves, naming three contemporary beliefs he feels are incorrect.

I’ll propose one: It’s wrong that children, who are wisely banned from frequenting bars and purchasing cigarettes, are allowed to eat at fast-food restaurants, which set them up for a lifetime of unhealthiness. Ronald McDonald and Joe Camel aren’t so different.

Kennedy’s opening:

In 19th century London, everyone was certain that bad air caused disease. From cholera to the plague: if you were sick, everyone thought it was because of bad air. It was called the Miasma Theory.

As chronicled in The Ghost Map, it took the physician John Snow years, and cost thousands of lives, to finally disprove the Miasma Theory. He mapped every cholera death in London and linked it back to the deceased’s source of water, and still it took years for people to believe him. Now miasma stands as a byword for pseudo-scientific beliefs widely held throughout society.

The problem for Snow was that no one could see cholera germs. As a result, he, and everyone else of the time, was forced to measure other observable phenomena. Poor air quality was aggressively apparent, so it’s easy to see how it might take the blame.

Thankfully, our means of scientific measurement have improved vastly since then. We should hope that any such scientific theory lacking a grounding in observable data would now be quickly discarded.

Where our ability to measure still lags, however, it seems probable that we might still have miasmatic theories.•
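Kennedy’s point about measurement is, at bottom, about tabulation: Snow’s breakthrough was grouping deaths by water source rather than by air quality. A toy sketch in Python, with entirely invented records, just to show the shape of that method:

# Toy illustration of Snow's approach: tally cholera deaths by the
# deceased's water source and see which source stands out.
# The records below are invented for demonstration.
from collections import Counter

deaths = [
    {"address": "40 Broad St", "water_source": "Broad Street pump"},
    {"address": "12 Poland St", "water_source": "Broad Street pump"},
    {"address": "3 Warwick St", "water_source": "Warwick Street pump"},
    {"address": "8 Broad St", "water_source": "Broad Street pump"},
]

tally = Counter(d["water_source"] for d in deaths)
for source, count in tally.most_common():
    print(source, count)
# The suspect source prints first:
# Broad Street pump 3
# Warwick Street pump 1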

Tags:

Some of Ian Frazier’s customary whip-smart, wondrous prose is on display in his NYRB piece about a raft of volumes by and about Daniil Kharms, a writer from that self-inflicted wound called Russia, who was diagnosed as schizophrenic, incarcerated for being an “anti-Soviet children’s writer,” and ultimately starved to death at 36. He matured as an artist in the Stalin era, a time bathed needlessly in blood, his dark, absurd sensibilities perfect for the time and place, or perhaps warped into midnight by them. Frazier, though, wisely warns against accepting this narrative as a comprehensive explication of Kharms’ work. The opening:

Russia is the funniest country in the world. Some countries, like America and England, are funny mostly on purpose, while others, like Germany and France, can be funny only unintentionally. (But that counts! Being funny is tricky, so any way you do it counts.) Russia, however, is funny both intentionally (Gogol, Zoshchenko, Bulgakov) and unintentionally (Vladimir Putin singing, as he did at a televised event a few years ago, “I found my thrill on Blueberry Hill”). Given the disaster Russian history has been more or less continuously for the last five centuries, its humor is of the darkest, most extreme kind. Russian humor is to ordinary humor what backwoods fundamentalist poisonous snake handling is to a petting zoo. Russian humor is slapstick, only you actually die.

Surveys that measure such distinctions often rate Russians among the world’s least happy people. To judge from the Russians I know, this information would hold little interest one way or the other. To Russians, happiness is not the big deal it is to us; the Declaration of Independence they don’t have makes no statement about it. On the street or otherwise encountering strangers Russians don’t paste big grins on their faces, the way we tend to do. They look sternly upon reflex smilers. Their humor is powerful without a lot of jollity, and it’s hard to imagine Bulgakov, say, convulsed and weeping with laughter, as I have been when reading certain scenes in his novel Heart of a Dog.

Daniil Kharms, a Russian writer who came of age in the worst of Soviet times, is categorized as an absurdist, partly (I think) because it’s hard to know what else to call him. To me he makes more sense as a religious writer.

He is really funny and completely not ingratiating, simultaneously.•

Tags: ,

The upside to the financial crisis of a medium (magazines, say, their economic model tossed into the crapper by technological progress) is that publications are forced to reinvent themselves, get innovative, and try offbeat things. In that spirit, the resuscitated Newsweek assigned WikiLeaks editor (not “self-styled editor”) Julian Assange to review Luke Harding’s The Snowden Files: The Inside Story of the World’s Most Wanted Man.

And what a gleefully obnoxious pan he delivers, making some salient points along the way, even if it’s not exactly unexpected that he would be bilious toward traditional media in favor of alterna-journalists like himself. Additionally: Assange proves he is a very funny writer. You know, just like Bill Cosby.

An excerpt:

In recent years, we have seen The Guardian consult itself into cinematic history—in the Jason Bourne films and others—as a hip, ultra-modern, intensely British newspaper with a progressive edge, a charmingly befuddled giant of investigative journalism with a cast-iron spine.

The Snowden Files positions The Guardian as central to the Edward Snowden affair, elbowing out more significant players like Glenn Greenwald and Laura Poitras for Guardian stablemates, often with remarkably bad grace.

“Disputatious gay” Glenn Greenwald’s distress at the U.K.’s detention of his husband, David Miranda, is described as “emotional” and “over-the-top.” My WikiLeaks colleague Sarah Harrison—who helped rescue Snowden from Hong Kong—is dismissed as a “would-be journalist.”

I am referred to as the “self-styled editor of WikiLeaks.” In other words, the editor of WikiLeaks. This is about as subtle as Harding’s withering asides get. You could use this kind of thing on anyone.

Flatulent Tributes

The book is full of flatulent tributes to The Guardian and its would-be journalists. “[Guardian journalist Ewen] MacAskill had climbed the Matterhorn, Mont Blanc and the Jungfrau. His calmness now stood him in good stead.” Self-styled Guardian editor Alan Rusbridger is introduced and reintroduced in nearly every chapter, each time quoting the same hagiographic New Yorker profile as testimony to his “steely” composure and “radiant calm.”

That this is Hollywood bait could not be more blatant.•

Tags: , , ,

I mentioned this before when writing of Al Michaels’ obliviousness to the dark side of the NFL, but when you haven’t had to worry about food or shelter in a long time, you have to be on constant guard against the development of moral blind spots. Michaels likely thinks of himself as a solid citizen, but your assessment may vary considering his opinions about the racist Washington franchise name and the prevalence of serious health issues suffered by the league’s players.

At the recent Festival of Books at USC, Malcolm Gladwell (who is not incognizant of the NFL’s concussion issue) spoke to this same point. From Taylor Goldstein at the Los Angeles Times:

“I think it becomes very hard to be a good person after a certain point. Or at least it’s not impossible, it’s just harder to work. Just as, in David and Goliath, I talk about what it means to be, how hard it is, weirdly, to be a wealthy parent, how much more difficult it is to raise a child if you are very wealthy as opposed to middle class.

“It’s not impossible, but it requires more of you. There’s that whole thing I have about the difference between “can’t” and “won’t.” That saying no to a child of the middle class is very easy because you just say, “We can’t.” “You want a pony? Look around you! Where would the pony go? Look in the bedroom; is there room for a stable?”

“Of course, if you’re a billionaire, you can’t use ‘can’t,’ you have to use ‘won’t,’ and ‘won’t’ is really hard. ‘Won’t’ requires you to give an explanation, right? And in the same way, when you get, when you’re living a kind of normal life, being empathetic comes naturally. When you’re successful, you have to work at it.”•

Tags: ,

Francisco Cândido Xavier was a prolific writer, though he had help.

At least, that’s what the Brazilian man affectionately known as Chico Xavier claimed. He fancied himself a ghostwriter for ghosts, a medium who would “receive” the books from the deceased and transcribe them. Psicografía, it is called. The opening of “Dead Man Talking,” Laura Premack’s Boston Review article:

In Brazil, dead people write books. Not only do they write books, they sell them. Many fly off the shelves.

The process is called psicografía or psychography, also known in English as automatic writing: mediums go into trance, channel the spirits of the deceased, and record their words. Sometimes mediums channel the spirits of famous writers and poets such as Victor Hugo and Humberto de Campos, the renowned Brazilian poet and journalist whose family sued the medium-author of several collections of his supposedly posthumous poems and essays—not because they objected on principle but because they wanted a share of the profits. Sometimes mediums channel historical figures, such as nineteenth-century politician Bezerra de Menezes, and sometimes they channel unknowns.

Brazil’s most prolific and beloved medium was Francisco Cândido Xavier. Known fondly as Chico Xavier, he published more than 400 books from 1932 until his death at age ninety-two in 2002. At least 25 million copies of his books have been sold, likely more. They have been translated into many languages, including Greek, Japanese, and Braille. His Nosso Lar, a sort of spiritual memoir first published in 1944, is probably the biggest psychographic hit ever. More than sixty Brazilian editions have been printed and nearly 2 million copies sold.

In addition to publishing books, Xavier used his psychographic ability to record more than ten thousand letters from dead people to their families.•

__________________________

In 1970, Chico Xavier began appearing on the TV show Pingo Fogo.

Tags: , ,

I haven’t yet read Oakley Hall’s McCarthy-era Western, Warlock, but now I must. That’s the book Thomas Pynchon named in 1965 when Holiday magazine asked him to suggest an underappreciated title to its readers. It’s set in a fictionalized 1880s Tombstone, Arizona, a milieu Pynchon believed to be Arthurian in stature. Here’s what he wrote about it:

Tombstone, Arizona, during the 1880’s is, in ways, our national Camelot; a never-never land where American virtues are embodied in the Earps, and the opposite evils in the Clanton gang; where the confrontation at the OK Corral takes on some of the dry purity of the Arthurian joust. Oakley Hall, in his very fine novel Warlock (Viking) has restored to the myth of Tombstone its full, mortal, blooded humanity. Earp is transmogrified into a gunfighter named Blaisdell who, partly because of his blown-up image in the Wild West magazines of the day, believes he is a hero. He is summoned to the embattled town of Warlock by a committee of nervous citizens expressly to be a hero, but finds that he cannot, at last, live up to his image; that there is a flaw not only in him but also, we feel, in the entire set of assumptions that have allowed the image to exist. It is Blaisdell’s private abyss, and not too different from the town’s public one. Before the agonized epic of Warlock is over with—the rebellion of the proto-Wobblies working in the mines, the struggling for political control of the area, the gunfighting, mob violence, the personal crises of those in power—the collective awareness that is Warlock must face its own inescapable Horror: that what is called society, with its law and order, is as frail, as precarious, as flesh and can be snuffed out and assimilated back into the desert as easily as a corpse can. It is the deep sensitivity to abysses that makes Warlock, I think, one of our best American novels. For we are a nation that can, many of us, toss with all aplomb our candy wrapper into the Grand Canyon itself, snap a color shot and drive away; and we need voices like Oakley Hall’s to remind us how far that piece of paper, still fluttering brightly behind us, has to fall.

—Thomas Pynchon•


Tags: ,
