There was something rotten inside Robert Louis Stevenson, as there is in all of us to varying degrees, but he had a name for it: Mr. Hyde. Not to suggest the author’s voluminous and varied output can be reduced to one novella–I’m talking about the Strange Case of Dr Jekyll and Mr Hyde, of course–but it’s rare that something can be written about the human mind, in this case the subconscious, that will be true as long as there are people.
An article in the December 17, 1894, Brooklyn Daily Eagle announced the death of the sickly author, who’d once described himself as “a mere complication of cough and bones.” He’d actually perished two weeks earlier from a cerebral hemorrhage suffered while living on the Samoan Islands. His last words were a question posed to his wife: “Does my face look strange?”
“It started off as a kind of utopian promise,” Andrew O’Hagan writes of the Internet in a new Guardian essay that meditates on the death of privacy and, perhaps, the novel. During Web 1.0, some worried that this new-to-the-masses technology would be co-opted and watered down, losing its anarchic spirit and becoming a tool of corporations and governments. Never let it be tamed, they exhorted. Well, it never has been tamed and still has become a tool of corporations and governments. The anarchy is actually useful to them (see U.S. Presidential election, 2016).
The thing is, we’re still in the prelude of what the Internet will become and of what being connected will mean. Marshall McLuhan feared the Global Village, and we’re going to experience a version of it beyond what the visionary contemplated. That’s what the Internet of Things will effect, with every last object becoming a computer. It will bring great benefits while also being a machine with no OFF switch. We’ll all permanently be inside a contraption that may be antithetical to human nature. It will contain sensors but perhaps not sense.
As for O’Hagan’s fears about the effect of social media on fiction, I addressed a similar subject in a 2015 essay about Charlie Brooker’s outstanding television program Black Mirror:
It’s tough being Paddy Chayefsky these days. Charlie Brooker, the brilliant satirist behind Black Mirror, comes closest. If he doesn’t make it all the way there, it’s not because he’s less talented than the Network visionary; it’s just that the era he’s working in is so different. I’ve read many articles about Brooker’s impressive program and pretty much all of them miss the point I believe he’s making about our brave new world of technology. That includes Jenna Wortham’s New York Times Magazine essay, which referred to Mirror as “functioning as a twisted View-Master of many different future universes where things have strayed horribly off-course.” The Channel 4 show is barely about the future. It’s mostly about the present. And it isn’t about the present in the manner of many sci-fi works, which create outlandish scenarios that could never really be in the service of telling us about what currently is. Brooker’s scenarios aren’t the exaggerations they might seem at first blush. In almost no time, our hyperconnected world delivers something far more disturbing than his narratives.
Chayefsky and Andy Warhol and Marshall McLuhan could name the future and we’d wait 25 or 50 years as their predictions slowly gestated, only becoming fully manifest at long last. None of that trio of seers even lived long enough to experience the full expression of Mad As Hell, 15 Minutes of Fame or the Global Village. Brooker will survive to see all his predictions come to pass, and it won’t require an impressive lifespan.•
O’Hagan, author of The Secret Life: Three True Stories of the Digital Age, believes that since we’ve surrendered an interior life (“everything became fake”), there really isn’t even a way to observe the present, let alone predict the future, and that writers and readers alike are being wrecked by living in public. The novel is a dogged form and may find a way to persist regardless of each of us living in our own Reality TV show while being flattered or fired upon by armies of bots. Perhaps it can serve as an antidote to such an existence? The writer himself believes that could be the ultimate outcome. Regardless, his excellent essay is one that can be meditated on in myriad ways.
The other day I taped over the camera on my computer. Then I went upstairs and disabled the data collection capability on the TV. Because of several stories of mine, I’d suffered a few cyber-attacks recently, and, though a paragon of dullness, I decided to greet the future by making it harder to find me. One of the great fights of the 21st century will be the fight for privacy and self-ownership, which is also, to my mind, the struggle for literature as distinct from the dark babble of social media. Writers thrive on privacy, not on Twitter, and so do readers when the lights are low. Giving your sentences thoughtlessly away, and for nothing, seems a small death to contemplation, and does harm to the profession of writing, where you’re paid because you’re good at it. We are all entertainers now, politicians are theatrical in their every move, but even merely passable writers have something large at stake when it comes to opposing the global stupidity contest. Literature, which includes great journalism, might enhance the public sphere but it more precisely enriches the private one, and we are now at the point where privacy, the whole secret history of a people, might be the only corrective we have to the political forces embezzling our times.
In the interests of “national security”, in the service of “global harmony”, you are now obliged to become your own Winston Smith, both watched and self-watching. The TV downstairs may not be “off” at all – it may be “fake-off”, a condition defined in a joint programme of June 2014 between the CIA and MI5 called “Weeping Angel”. (Certain models of televisions are programmed to stay on, with their cameras operative, and the “data” they collect can be harvested by agencies.) The principle, as with Britain’s Prevent campaign, is to assume that everyone with a private life might have something to hide, which means that nobody, in the future, unless they have sinister motives, should expect the luxury of privacy. Some TVs and all phones operate “as a bug, recording conversations in the room and sending them over the internet to a covert CIA server”, reported WikiLeaks as it released the “Weeping Angel” documents. Being bugged at home or stopped and searched in the street and having your “information” handed to security agencies are now understood to be security measures, and questioning it will make you an enemy of the Daily Mail’s “common sense”. One doesn’t have to be much of a freedom fighter nowadays to be branded a member of the “liberalocracy”: all you have to do is believe in free speech and freedom of movement, and stand up for basic rights of sovereignty over your own thinking. Only recently have these sanctities been taken for the demands of a potential terrorist.•
Here in America–as in many other places in the world–we live in desperate times, barely capable of running our country despite great wealth, so the idea of us engineering new forms of life or even an entire universe seems beyond reason. Have we earned the right to play creator?
Freeman Dyson has written of a revolutionary vision for next-level space colonization, suggesting we design a baseball-sized, biotech Noah’s Ark that can “seed” the Milky Way with millions of species of life. “Sometime in the next few hundred years,” he’s theorized, “biotechnology will have advanced to the point where we can design and breed entire ecologies of living creatures adapted to survive in remote places away from Earth.” Dyson believes this scenario preferable to launching humans (as we know them) into radically unforgiving environments.
That’s mind-blowing enough, but some theoretical physicists take matters a giant leap further, wondering if we can actually create new baby universes in vitro. Zeeya Merali, author of A Big Bang in a Little Room, has a smart Aeon article on the moral implications of “cosmogenesis.” She interviews Anders Sandberg, among others, on the thorny topic. The opening:
Physicists aren’t often reprimanded for using risqué humour in their academic writings, but in 1991 that is exactly what happened to the cosmologist Andrei Linde at Stanford University. He had submitted a draft article entitled ‘Hard Art of the Universe Creation’ to the journal Nuclear Physics B. In it, he outlined the possibility of creating a universe in a laboratory: a whole new cosmos that might one day evolve its own stars, planets and intelligent life. Near the end, Linde made a seemingly flippant suggestion that our Universe itself might have been knocked together by an alien ‘physicist hacker’. The paper’s referees objected to this ‘dirty joke’; religious people might be offended that scientists were aiming to steal the feat of universe-making out of the hands of God, they worried. Linde changed the paper’s title and abstract but held firm over the line that our Universe could have been made by an alien scientist. ‘I am not so sure that this is just a joke,’ he told me.
Fast-forward a quarter of a century, and the notion of universe-making – or ‘cosmogenesis’ as I dub it – seems less comical than ever. I’ve travelled the world talking to physicists who take the concept seriously, and who have even sketched out rough blueprints for how humanity might one day achieve it. Linde’s referees might have been right to be concerned, but they were asking the wrong questions. The issue is not who might be offended by cosmogenesis, but what would happen if it were truly possible. How would we handle the theological implications? What moral responsibilities would come with fallible humans taking on the role of cosmic creators?•
Facts today come in two flavors: original and alternative.
Fox kicked off the Fake News Age in earnest just over two decades ago. The unspoken reason for selling lies and conspiracies and wedge issues rather than reality is that Republican policy had become twisted into something almost unrecognizable and truly deleterious to any non-rich citizen. It’s worked quite well as a strategy, even if it’s often made the popular vote at the national level unattainable.
The most recent Presidential election, with its armies of bots, alt-right trolls and Russian interference, used Big Data to deliver lies at the granular level. It seemed shocking, although our society and technology have been heading in this direction for a long time. It was almost inevitable.
Of course, factual distortions are nothing new nor are they limited to current events. History can also be a funny thing, as the dangerous absurdity of modern North Korea reminds us every day. Suki Kim, author of Without You, There Is No Us, just conducted a Reddit AMA about her experience going undercover as a schoolteacher in the deeply troubled, delusional state to learn more about the culture. In two exchanges, she addresses historical distortions about the country that exist on the inside and also the outside.
What widely held belief among your students surprised you the most?
There were so many things. They just learn totally upside down information about most things. But one thing I think most people do not realize is that they learn that South Korea & US attacked North Korea in 1950, and that North Korea won the war due to the bravery of their Great Leader Kim Il Sung. So they celebrate Victory Day, which is a huge holiday there. So this complete lie about the past then makes everything quite illogical. Because how do you then explain the fact that Korea is divided still, if actually North Korea “won” the war? One would have to question that strange logic, which they do not. So it’s not so much that they get taught lies as education, but that that second step of questioning what does not make sense, in general, does not happen, not because they are stupid but because they are forbidden and also their intelligence is destroyed at young age. There were many many examples of such.
In your experience, what are the biggest misconceptions Americans have about either North or South Korea?
I think the biggest misconception goes back to the basic premise. Most Americans have no idea why there are two Koreas, or why there are 30,000 US soldiers in South Korea and why North Korea hates America so much. That very basic fact has been sort of written out of the American consciousness. By repackaging the Korean War as a civil war, it has now created decades of a total misconception. The fact that the US had actually drawn the 38th Parallel that cut up the Korean peninsula, not in 1950 (the start of the war) but in 1945 at the liberation of Korea from Japan is something that no Korean has forgotten — that was the beginning of the modern Korean tragedy. That the first Great Leader (the grandfather of the current Great Leader) was the creation of the Soviet Union (along with the US participation) is another horrible puzzle piece that Americans have conveniently forgotten.
Anyone know where can I find information regarding how the first Great Leader was a creation of the U.S.A. & Soviets? I’d love to read about it
That would be taking it out of the context to claim that first Great Leader was “created” by US. He was a soldier (protege of the Soviet), while US participated in that set up handpicking the US educated South Korean first president. US had drawn the 38th Parallel, and that division was trumpeted by the Cold War, two separate govts formed by 1948 & war broke out in 1950. That is a very simplified version of the history of the two Koreas which most Americans don’t remember and now wonder why they are in South Korea today and why is North Korea mad at them. If you are genuinely curious, there are many many books on this topic by serious historians.•
Still haven’t written my thoughts on Garry Kasparov’s Deep Thinking. Will do so soon, I promise. For whatever philosophical differences I have with the author on technology, the long centerpiece about his pair of matches with Deep Blue in ’96 and ’97 is riveting. It’s also revealing in surprising ways, about both humans and machines.
In a New Scientist Q&A conducted by Sean O’Neill, the chessman is asked about surveillance, a topic which receives a scant few pages in his book, but I believe the question posed is the wrong one. The reporter wonders about new technologies being hoarded by the “ruling class,” which is silly, because these tools, ever cheaper and more powerful, will snake their way through every inch of society. Artificial Intelligence will be useful in countless ways, but it will just as surely enable the anarchy of the Internet to be visited upon the physical world. The problem we face isn’t that it may be controlled but that it absolutely cannot be. There’s no going back (nor should there be), but this progress will be attended by regress. Constantly trying to separate those realities will be our task–our burden.
What happens if AI, high-tech surveillance, military tech, and communications are sewn up by the ruling class?
Ruling class? Sounds like Soviet propaganda! New tech is always expensive and employed by the wealthy and powerful even as it provides benefits and trickles down into every part of society. But it seems fanciful – or dystopian – to think there will be a harmful monopoly. AI isn’t a nuclear weapon that can or should be under lock and key; it’s a million different things that will be an important part of both new and existing technology. Like the internet, created by the US military, AI won’t be kept in a box. It’s already out.
Will handing off ever more decisions to AI result in intellectual stagnation?
Technology doesn’t cause intellectual stagnation, but it enables new forms of it if we are complacent. Technology empowers intellectual enrichment and our ability to indulge and act on our curiosity. With a smartphone, for example, you have the sum total of human knowledge in your pocket and can reach practically any person on the planet. What will you do with that incredible power? Entertain yourself or change the world?•
I’m given pause when someone compares the Internet to the printing press because the difference of degree between the inventions is astounding. For all the liberty Gutenberg’s contraption brought to the printed word, it was a process that overwhelmingly put power into the hands of disparate professionals. Sure, eventually with Xeroxes, anyone could print anything, but the vast majority of reading material produced was still overseen by professional gatekeepers (publishers, editors, etc.) who, on average, did the bidding of enlightenment.
By 1969, Glenn Gould believed the new technologies would allow for the sampling, remixing and democratization of creativity, that erstwhile members of the audience would ultimately ascend and become creators themselves. He hated the hierarchy of live performance and was sure its dominance would end. “The audiences [will] become the performer to a large extent,” he predicted. He couldn’t have known how right he was.
The Web has indeed brought us a greater degree of egalitarianism than we’ve ever possessed, as the centralization of media dissipated and the “fans” rushed the stage to put on a show of their own. Now here we all are crowded into the spotlight, a turn of events that’s been both blessing and curse. The utter democratization and the filter bubbles that have attended this phenomenon of endless channels have proven paradoxically (thus far) a threat to democracy. It’s acknowledged even by those who’ve been made billionaires by these new tools that “the Internet is the largest experiment involving anarchy in history,” though they never mention when some semblance of order might return.
In Stephen Fry’s excellent recent Hay Festival lecture “The Way Ahead” (h/t The Browser), the writer and actor spoke on these same topics and other aspects of the Digital Age that are approaching with scary velocity. Like a lot of us, he was an instant convert to Web 1.0, charmed by what it delivered and awed by its staggering potential. Older, wiser and sadder for his knowledge of what’s come to pass, Fry tries to foresee what is next in a world in which 140 characters can not only help topple tyrants but create them as well, knowing that the Internet of Things will only further complicate matters. Odds are life will be greater and graver. He offers one word of advice: Prepare.
Gutenberg’s printing revolution, by way of Das Kapital and Mein Kampf, by way of smashed samizdat presses in pre-Revolutionary Russia, by way of The Origin of Species and the Protocols of the Elders of Zion, by way of the rolling offset lithos of Fleet Street, Dickens, Joyce, J. K. Rowling, Mao’s Little Red Book and Hallmark greetings cards brought us to the world into which all of us were born, it brought us, amongst other things – quite literally – here to Hay-on-Wye. I started coming to this great festival before the word Kindle had a technological meaning, when an “e-book” might be a survey of 90s Rave drug Culture, or possibly an Ian McMillan glossary of Yorkshire Dialect.
Printed books haven’t gone away; indeed, we are, most of us I suspect, pleased to learn how much they have come roaring back, in parallel with vinyl records and other instances of analogue refusal to die. But the difference between an ebook and a printed book is as nothing when set beside the influence of digital technology as a whole on the public weal, international polity and the destiny of our species. It has embedded itself in our lives with enormous speed. If you are not at the very least anxious about that, then perhaps you have not quite understood how dependent we are in every aspect of our lives – personal, professional, health, wealth, transport, nutrition, prosperity, mind, body and spirit.
The great Canadian Marshall McLuhan – philosopher, should one call him? – whose prophetic soul seems more and more amazing with each passing year, gave us the phrase the ‘Global Village’ to describe the post-printing age that he already saw coming back in the 1950s. Where the Printing Age had ‘fragmented the psyche’, as he put it, the Global Village – whose internal tensions exist in the paradoxical nature of the phrase itself: both Global and a village – this would tribalise us, he thought, and actually regress us to a second oral age. Writing in 1962, before even ARPANET, the ancestor of the internet, existed, this is how he forecasts the electronic age, which he thinks will change human cognition and behaviour:
“Instead of tending towards a vast Alexandrian library the world will become a computer, an electronic brain, exactly as in an infantile piece of science fiction. And as our senses go outside us, Big Brother goes inside. So, unless aware of this dynamic, we shall at once move into a phase of panic terrors, exactly befitting a small world of tribal drums, total interdependence, and superimposed co-existence. […] Terror is the normal state of any oral society, for in it everything affects everything all the time. […] In our long striving to recover for the Western world a unity of sensibility and of thought and feeling we have no more been prepared to accept the tribal consequences of such unity than we were ready for the fragmentation of the human psyche by print culture.”
Like much of McLuhan’s writing, densely packed with complex ideas as it is, this repays far more study and unpicking than would be appropriate here, but I think we might all agree that we have arrived at that “phase of panic terrors” he foresaw.•
Social mobility as it relates to geography, gender, integration, education and other factors is at the heart of much of the research conducted by Stanford economist Raj Chetty. An erstwhile wunderkind who’s still very young at 37, the academic, an immigrant from New Delhi whose family relocated to Milwaukee when he was a child, has often wondered what allowed his success. Certainly native genius was a key component, and having a father who was an economist and a mother who was a pulmonologist didn’t hurt, but how much did physical location and primary and secondary schools matter?
It’s a topic I consider often not only because the American Dream has been dragging for many for decades, but because I grew up in a lower-income, blue-collar neighborhood that didn’t have a bookstore. It was hard to get from here to there, and part of the problem went beyond money, location and access, though those factors undoubtedly loomed large. The problem was also cultural, as scholarly achievement–even a mere love of reading–was viewed as a “sellout” of sorts. Don’t know if that’s still the situation where I’m from, but I bet it stubbornly persists in other quarters of the country.
Certainly the nativism and scapegoating of the most recent Presidential election was so shockingly acceptable to so many citizens in part because of our ever-widening economic segregation. The terrible outcome of that race will likely only exacerbate the issue.
Tyler Cowen just interviewed Chetty. Three excerpts follow.
It’s a common view, derived from William Baumol and Bowen, that education is subject to a kind of cost disease, that it’s harder and harder to augment productivity, wages rise in other sectors of the economy, education takes a rising share of GDP but doesn’t really get much better. Do you accept that story, or, if not, how would you modify it? Are we doomed to low productivity growth in K–12 education?
I don’t think so because, while in some limited case that might end up being true, at the moment I see so many opportunities within the US K–12 education system to potentially have significantly higher productivity without dramatically higher cost. Let me give you an example. Coming back to the case of teachers, my sense is, if we were to try to keep the most effective teachers in the classroom and either retrain or dismiss the teachers who are less effective, we could substantially increase productivity without significantly increasing cost.
But say we do that. What do we do next?
I think eventually it’s conceivable that you move up the quality ladder, and you’ve got everybody getting a very good primary school education. Then you need to work on secondary education and so forth. But there again, I would say there are lots of bargains to be found.
In our most recent work looking at colleges and upward mobility, we see that there are a number of colleges where kids seem to be doing extremely well that are not all that expensive. Also, I think, here a macroeconomic perspective is useful. If you look at countries that have some of the best educational outcomes, like Scandinavian countries, they’re not actually spending dramatically more than the United States.
At some abstract level, I think that logic has to be right, that eventually, in order to raise the level of education beyond some point, we’re going to have to spend more and more on that, but I don’t think we’re close enough empirically to such a point that that is really a critical consideration at the moment.
If you told the story about molecules impinging on your body and impelling you to action, what’s the best story you can come up with for Iowa, say, or Utah?
Yeah, a few different things. Iowa is known for having very good public schools for a long time.
But that too is arguably just part of the package.
Yes. Where did that come from? Why does Iowa have good public schools?
One of the strong correlates we find is that places that are more integrated across socioeconomic groups, that have lower segregation, tend to have better outcomes for kids. And that kind of thing in a rural area — you can see why that occurs and why it might lead to better outcomes.
If you live in a big city, it’s very easy to self-segregate in various ways. You live in a gated community, you send your kids to a private school. You essentially don’t interact with people from different socioeconomic classes. If you live in a small town in Iowa, pretty much there’s one place your kids are going to go to school. There’s one set of activities that you can all participate in. And that is likely to lead to more integration.
As I’m sure you know, since the 1990s, segregation by income has been rising in this country. And here, Silicon Valley is one of the most extreme cases of that. So seeing that, are you on net a segregation optimist or pessimist? If I may ask.
I think current trends suggest that segregation will continue to grow in the US. Take the case of driverless cars, for example. One way that could go is, if you have access to driverless cars, it makes it all the more easy to go live further away in a secluded place, further reduce interaction, right?
So I think it’s very important to think about social policy in the context of that type of technology. How do you set cities up? How do you do urban planning and architecture in a way such that you don’t actually just facilitate more segregation? Such that you make it attractive to live in a more mixed-income community? That’s a key challenge, I think.•
Overall I enjoyed Garry Kasparov’s Deep Thinking. Have philosophical disagreements with it, for sure, and there is some revisionism in regard to his personal history, but the author’s take on his career developing parallel to the rise of the machines and his Waterloo versus IBM is fascinating. It’s clear that if there had been a different World Chess Champion during Kasparov’s reign, one who lacked his significant understanding of the meaning of computers and his maverick mindset, the game would have been impoverished for it. I’ll try to make time this weekend to write a long review.
The 20-year retrospective on Deep Blue’s 1997 victory would be incomplete without reflection by Steven Levy, who penned the famous Newsweek cover story “The Brain’s Last Stand” as a preface to the titanic match in which humanity sank. (It turns out Levy himself composed that perfectly provocative cover line that no EIC could refuse.)
The writer focuses in part on the psychological games that Deep Blue was programmed to play, an essential point to remember as computers are integrated into every aspect of life–when nearly every object becomes “smart.” Levy points out that no such manipulations were required for DeepMind to conquer Go, but those machinations might be revisited when states and corporations desire to nudge our behaviors.
The turning point of the match came in Game Two. Kasparov had won the first game and was feeling pretty good. In the second, the match was close and hard fought. But on the 36th move, the computer did something that shook Kasparov to his bones. In a situation where virtually every top-level chess program would have attacked Kasparov’s exposed queen, Deep Blue made a much subtler and ultimately more effective move that shattered Kasparov’s image of what a computer was capable of doing. It seemed to Kasparov — and frankly, to a lot of observers as well — that Deep Blue had suddenly stopped playing like a computer (by resisting the catnip of the queen attack) and instead adopted a strategy that only the wisest human master might attempt. By underplaying Deep Blue’s capabilities to Kasparov, IBM had tricked the human into underestimating it. A few days later, he described it this way: “Suddenly [Deep Blue] played like a god for one moment.” From that moment Kasparov had no idea what — or who — he was playing against. In what he described as “a fatalistic depression,” he played on, and wound up resigning the game.
After Game Two, Kasparov was not only agitated by his loss but also suspicious at how the computer had made a move that was so…un-computer like. “It made me question everything,” he now writes. Getting the printouts that explained what the computer did — and proving that there was no human intervention — became an obsession for him. Before Game Five, in fact, he implied that he would not show up to play unless IBM submitted printouts, at least to a neutral party who could check that everything was kosher. IBM gave a small piece to a third party, but never shared the complete file.
Kasparov was not the same player after Game Two.•
“It was very easy, all the machines are only cables and bulbs.”
Inconvenient it is for any state when an erstwhile national hero turns into an embarrassment. In America, for instance, we have Bobby Fischer, whose mind proved a buggy machine, and earlier, Charles Lindbergh, who crashed and burned after soaring to unprecedented heights.
Norway knew its own shocking albatross in 1940 when Knut Hamsun, the Nobel Prize-winning author, embraced Adolf Hitler as a liberator, even arranging a meeting with the German madman. It’s been some years since I read Hunger, with its nameless Raskolnikovian protagonist, a down-and-out intellectual, though I feel pretty confident saying that it was better than a Canetti but not as good as a Dostoyevsky.
Trump is certainly not Nixonian in intellect or policy, but he shares with his predecessor an utter disregard for truth, a deep paranoia that mints enemies like pennies and a nefariousness that will probably lead to disgrace if not tragedy. His sense of being cheated–a rich man who feels deeply impoverished–has its origins in a Rosebud-ian psychological wound and perhaps some mental illness, and it has rendered him extremely immoral and deeply disturbed. In the country’s future–should there be one–it will be possible to have a worse President if that person retains all his terrible qualities but is basically competent. We should be glad of his ineptitude, provided it doesn’t get us all killed.
On the day when the Washington Post delivered what appears to be a bombshell about a terrible breach by Trump in the company of his Russian comrades, a misstep to be added to his litany of lies, acts of kleptocracy and attacks on American democracy, here’s an excerpt from Garry Wills’s 1974 New York Review of Books piece about Woodward and Bernstein’s All the President’s Men:
Nixon was always Wronged; so, since the score could never be settled entirely, he felt no qualms about getting back what slight advantage he could when no one was looking. Even at the height of his power, he feels he must steal one extra vote, tell the marginal little lie. He is like a man who had to steal as a child, in order to eat, and acquired a sacred license—even a duty—to steal thenceforth; it would punish the evil that had first deprived him. Thus he took as his intimate into the Oval Office the very man who helped him try to cheat his way into the office of governor of California. Those who say Nixon did not know what kind of thing his lieutenants were up to forget that the judge who decreed in favor of plaintiffs in the fake postcard-poll case of 1962 did so on the grounds that both Haldeman and Nixon knew about the illegal tactic. Watergate is the story of a man who has just pulled off a million-dollar heist and gets caught when he hesitates to steal an apple off a passing vendor’s cart.
Nixon engages in a kind of antipolitics; a punishment of politics for what it has done to him. That is why he could never understand “the other side” in the Washington Post’s coverage of the Watergate investigation. Jeb Magruder has written that his staff was pleased when two unknown local reporters, Bernstein and Woodward, were given the break-in as their assignment. When the story did not lapse after a decent interval, Nixon conceived it as an ideological vendetta directed by Katharine Graham for the benefit of George McGovern—something to be countered by high-level threats, intimidation, and “stonewalling.” Even Henry Kissinger tried to intervene with Mrs. Graham.
Actually, if the coverage had been political, it might have failed. Very few columns or editorials played up Watergate in the election period, even at the Post. Those wanting high political sources and theoretical patterns would not have found the sneaky little paths under out-of-the-way bushes, as Woodward and Bernstein did. They thought, from the outset, they were dealing with robbers, not politicians. When their tips kept leading them toward the White House, they balked repeatedly, out of awe and fear and common sense; but the evidence kept tugging them against the pull of expectation. The editors kept them at it, but gave them little help. They must pursue their modest leads even after they wanted to be switched to “the big story” at the Ervin hearings. Others would theorize, editorialize, do the White House circuit. Theirs was the leg work, the endless doors knocked on, wrong numbers called, the days of thirty leads checked out and nothing to show for it. A leitmotiv of the book is “back to square one.”
They advanced, as it were, backward—always back to the same sources; would they talk this time? No. Then put them on the list of people to go back to. Back and back. Which became up and up. Up, scarily at the last, “to the very top” (as the Justice Department man had put it). Their sources—originally secretaries and minor functionaries—were added to when parts of the gang like Dean started dealing to get out; but there had always been people who talked because they were sincerely shaken by what was going on—not only Hugh Sloan at the outset, but the mysterious White House cooperator called “Deep Throat.” It is good to know the gang could not entirely succeed in imposing its code of omertà.•
Intelligent doesn’t necessarily mean good, in humans or machines.
I doubt I’ve come across any public figure who’s read more books than Tyler Cowen, yet in the country’s darkest hour, he’s pulled his punches with his fellow Libertarian Peter Thiel, who’s behaved abysmally, dangerously, in his ardent Trump support. The Administration, a gutter-level racist group, has apparently allowed Russian espionage to snake its way into the U.S. and is working in earnest to undo American democracy, to put itself beyond the reach of the law. Those who’ve gone easy on its enablers are complicit.
Maybe the machines will behave more morally than us when they’ve turned away from our lessons to teach themselves? Maybe less so?
· · ·
The pro-seasteading economist just interviewed Garry Kasparov, whose new book, Deep Thinking, I’m currently reading. Likely history’s greatest chess player, the Russian was turned deep blue by IBM during the interval between Cold Wars, when he could conjure no defense for the brute force of his algorithmically advantaged opponent.
Initially, Kasparov was too skeptical, too weighed down by human ego, to fully appreciate the powers of computers, but sometimes those who’ve most fiercely resisted religion become the most ardent believers, redirecting their fervent denial into a passionate embrace. That’s where Kasparov seems to be now in his unbridled appreciation for what machines will soon do for us, though I can comment more once I’ve completed his book.
He’s certainly right that much of what will happen with AI over the course of this century is inevitable given the way technologies evolve and the nature of human psychology. With those developments, we’ll enjoy many benefits, but with all progress comes regress, a situation heightened as the tools become more powerful. It’s clear to me that we’re not merely building machines to aid us but permanently placing ourselves inside of one with no OFF switch.
A lot of humans don’t play chess, but we’re looking at a future where AI will make decisions about who gets a monetary loan, who is diagnosed as being schizophrenic or bipolar. How cars drive on the road increasingly is controlled by software.
The fact that the decisions of the software are not so transparent — and you see this also in computer chess — how will ordinary human beings respond to the fact that more and more of their lives will be “controlled” by these nontransparent processes that are too smart for them to understand? Because in your book, you have emotional conflict with Deep Blue, right?
Exactly. I’m telling you that it’s inevitable. There are certain things that are happening, and it’s called progress. This is the history of human civilization. The whole history is a steady process of replacing all forms of labor by machines. It started with machines replacing farm animals and then manual laborers, and it kept growing and growing and growing.
There was a time, I mentioned in the book, people didn’t trust elevators without operators. They thought it would be too dangerous. It took a major strike in the city of New York that was equal to a major disaster. You had to climb the Empire State Building with paralyzed elevators.
I understand that today, people are concerned about self-driving cars, absolutely. But now let us imagine that there was a time, I’m sure, people were really concerned, they were scared stiff of autopilots. Now, I think if you tell them that autopilot’s not working in the plane, they will not fly because they understand that, in the big numbers, these decisions are still more qualitative.
While I understand also the fear of people who might be losing jobs, and they could see that machines are threatening their traditional livelihood, but at the same time, even these people whose jobs are on the chopping block of automation, they also depend on the new wave of technology to generate economic growth and to create sustainable new jobs.
This is a cycle. The only difference with what we have been seeing throughout human history is that now, machines are coming after people with college degrees, political influence, and Twitter accounts.•
He was, no doubt, a brilliant visionary who knew decades early that the Reality Age was approaching, even if he calibrated the time span during which we’d all be famous far too cautiously. The Pop Artist and keen media philosopher, however, was careless about those troubled souls he assembled in his Factory, his role that of the foreman unconcerned about the safety of the ones working on the floor. It was somehow glamorized, though it had all the charm of a heroin souk on Halloween. The scene in Midnight Cowboy when Joe Buck and Ratso Rizzo wander, shocked, through a decadent party inside a Warholian vomitorium seems apt.
Warhol wasn’t responsible for those in his constellation, but he didn’t need to be so irresponsible. He didn’t have to be a father, but he should have been a better friend.
In Gatsby terms, he curated a “rotten crowd” in the Sixties, and into their spin waltzed New England patrician purity in the slight form of Edie Sedgwick, who was destined to be a star of the shooting variety. An aristocrat descending into hades, how amazing! Except that it wasn’t. Within a few years she was worn out, used up and dead of a drug overdose. Like Zelda Fitzgerald, she’d been burned alive.
A decade after her death, Jean Stein, a restless type of Hollywood royalty, created a great oral history of Sedgwick that also spoke to the era. Not that Stein’s book fully captured the 1960s any more than did Joan Didion’s Slouching Towards Bethlehem, both volumes laser-focused on the dark side of the decade. But you also couldn’t tell nearly as well the story of that tumultuous time without their reporting.
Stein just died in a fall from her 15th-floor Manhattan apartment, likely a suicide, after sliding into a depression. Lee Smith of the Weekly Standard, a former employee and confidant of the author and editor, wrote the best obituary about her, an uncommonly deep dive into her psyche and milieu. An excerpt from the obit is followed by one from Michiko Kakutani’s 1982 review of Stein’s Sedgwick book and a 1965 video of Andy and Edie in an appropriately odd appearance on Merv Griffin’s talk show.
Most people speak because they like to hear themselves speak, and the trick for a journalist is to respect, and then profit from, human frailty long enough to keep your own mouth shut. But other people, usually more interesting people, don’t want to speak. Jean’s genius was in getting those people to talk by speaking herself. She understood that social space wants to be filled. Everyone fears certain types of silence, so they fill it with talk, the question then is about the quality of the talk. By exposing parts of her own pain, Jean made her subjects not only willing to reveal some of their own, but also, and more importantly, keen to protect her and join her at the place of her pain so she wouldn’t be left alone.
Here’s a practical example: Next time you attend a party and are called on to introduce two people but have forgotten the name of one or both, stutter. At least one, most likely both, will quickly volunteer their names in order to rescue you from your awkwardness. Why? Arguably, it’s because people are good. In any case, Jean’s aesthetic was premised on the idea that people are basically good and don’t want others to hurt, especially not in public. And that was perhaps Jean’s great theme—public hurt, American pain.
Her first book, also edited by Plimpton, was American Journey: The Times of Robert Kennedy, an oral biography centered around the funeral train that took Kennedy’s body from New York City to Washington, D.C. But Edie was Jean’s masterpiece, also an oral biography, a book that I think is generally misunderstood as a love song to the Warhol gang and the groovy 1960s underground.
Generations of young women, up to the present, have gone to New York with the legend of young Edie Sedgwick, the beautiful and doomed socialite celebrity, on their minds, steered by half-formed dreams of becoming the next “It” girl. One of those young women, a friend of mine, visited the Grand Street office when Jean was there and gushed to her about how much she loved the book, the scene it portrayed, the ethos of the moment. Jean’s face became very serious. She shook her head emphatically. “It was not glamorous,” she told my friend. And then I started to imagine how Jean must have seen it—like a vision of the underworld with generations of beautiful and naïve young women on the arm of some painter, or writer, or actor, eventually to be discarded and left alone in hell. That’s who Edie was, a kid who didn’t learn quickly enough the cost of not leaving a parade of death.
The space Jean Stein occupied was unique, moral, ambiguously optimistic in the American style, and is filled now by her books, a central part of the historiography of 20th-century America.•
Beautiful and charming, she had an ability to conjure up a magical world of grace and fun, and when she came to New York in 1964, she almost immediately became the leading lady of the fashionable demimonde. Her arrival happened to coincide with that period when all the old rules were suddenly breaking down – her gift for the outrageous seemed, to many, to personify the times – and she quickly replaced Baby Jane Holzer as Andy Warhol’s newest star. Mr. Warhol, with his gift for exploiting image and personality, escorted her to parties and featured her in his films, and Vogue magazine was soon dubbing her a “Youthquaker,” “22, going whither, God knows, but at a great rate!”
A friend who knew Edie as a teen-ager recalls in the book that she always “liked walking very close to extinction,” and the world of Warhol’s Factory – with its drugs and sexual experimentation – fueled her fatal predilections. There were shoplifting sprees at department stores, injections of LSD and speed, and increasingly frequent stays at hospitals and clinics. Although Edie finally left New York, returning to California, where she got married, she never seemed to get the hang of ordinary life. Happiness and the order that her grandparents had once predicated their lives on remained elusive, and on Nov. 16, 1971, she died from “acute barbiturate intoxication.” She was 28 years old.•
Warhol refuses to speak during a 1965 appearance on Merv Griffin’s talk show, allowing a still-healthy-looking Sedgwick to handle the conversation. Not even the Pop Artist himself could have realized how correct he was in believing that soon just being would be enough to warrant stardom, that it wouldn’t matter what you said or if you said a thing, that traditional content would lose much of its value.
In this odd moment of food fetishization and Fitbit (at least in the West), we’re gorging and gauging as Rome burns and seas rise. We’re urged to live healthier and happier lives by corporations and governments, not an unreasonable request, but it’s an impossible mission sans social safety nets and a habitable planet. Ballooning wealth inequality is detrimental to democracies and their citizenry alike, and there’s only so much individuals can do to steel themselves from the chaos it brings.
It makes sense that in America the culmination of this medicine show is a President whose family literally worshiped, when he was a child, at the church of Norman Vincent Peale. The power of positive thinking, however, won’t remove lead from the Flint water supply, cancel climate change or prevent factories from falling into the grip of robot hands and low-paid contractors. It’s really a false doctrine, a Depression Era dance marathon reimagined for the Digital Age. The last one to hit the floor wins.
Still, Laurie Penny manages to find some value in yoga and self-care despite the gathering storm.
Late capitalism is like your love life: it looks a lot less bleak through an Instagram filter. The slow collapse of the social contract is the backdrop for a modern mania for clean eating, healthy living, personal productivity, and “radical self-love”—the insistence that, in spite of all evidence to the contrary, we can achieve a meaningful existence by maintaining a positive outlook, following our bliss, and doing a few hamstring stretches as the planet burns. The more frightening the economic outlook and the more floodwaters rise, the more the public conversation is turning toward individual fulfillment as if in a desperate attempt to make us feel like we still have some control over our lives.
Coca-Cola encourages us to “choose happiness.” Politicians take time out from building careers in the debris of democracy to remind us of the importance of regular exercise. Lifestyle bloggers insist to hundreds of thousands of followers that freedom looks like a white woman practicing yoga alone on a beach. One such image (on the @selflovemantras Instagram) informs us that “the deeper the self love, the richer you are.” That’s a charming sentiment, but landlords are not currently collecting rent in self-love.
Can all this positive thinking be actively harmful? Carl Cederström and André Spicer, authors of The Wellness Syndrome, certainly think so, arguing that obsessive ritualization of self-care comes at the expense of collective engagement, collapsing every social problem into a personal quest for the good life. “Wellness,” they declare, “has become an ideology.”•
David Grann was asked to name a quintet of great “True Crime” titles for a Five Books interview, and among the volumes about brutal and strange murders, he made a non-obvious and timely choice with All the President’s Men.
Woodward’s been a perplexing figure for decades and Bernstein has since the 1970s had to wrestle personal demons that sometimes sidetracked his brilliant career, but there’s no denying their book’s greatness or their impact on American liberty.
Of course, dogged journalism alone can’t protect democracy. If enough citizens and congresspeople don’t care that the President is a crook–a traitor, even–any ink spilled will merely document a society in steep decline.
Grann also explained why he didn’t include on his list Truman Capote’s In Cold Blood, which is written as immaculately as any work can be. He’s most troubled by the issue of veracity, which is certainly valid, though I’m also bothered by how the author suspensefully builds to the Clutter family murders, as if he were penning a thriller about fictional characters.
I wonder if Grann considered Hiroshima by John Hersey.
Your fourth book—All The President’s Men by Carl Bernstein and Bob Woodward—comes as something of a relief after all the maniacal murdering. But it’s still a pretty frightening tale, and no less so for being so well-known. Talk us through it.
It’s probably the most iconic book of reporting in the United States to this day. It’s written by Woodward and Bernstein, and about their investigation, when they were young reporters at the Washington Post, into the shocking crimes committed by President Richard Nixon. When I first read that book, it gave me a sense that reporting could have a nobility and a moral purpose behind it. Of course, much reporting is not quite like that but…
And, to be clear, the crimes here are moral and political ones. It was articles of the US Constitution that were being butchered, rather than individuals.
Yes. The crimes include everything from breaking into the Democratic National Committee’s offices to bugging political opponents to covering up evidence. I think the book is particularly relevant today which is partly why I picked it. In a day and age when public officials are trying to subvert and muddy the truth, the need for deep reporting to hold these people accountable is as important as ever. This book is a seminal case of that—a case where investigative reporting was essential to revealing the corruption at the highest levels of the United States and to preserving our democracy.
Too often when we think of crime stories, we think of them in one dimensional ways—we think of a bank robbery, or a holdup, or someone breaking into a house—but some of the crimes that are just as important, in some ways maybe even more important, are those that are political in nature. They don’t need to involve murders. This one almost provoked a constitutional crisis.
And the victim count is much higher. It’s a whole nation.
Precisely, and this was a case where the system was driven to the brink but ultimately functioned: Nixon eventually stepped down. Woodward and Bernstein’s reporting played an essential role in protecting the country. This book, and all the books on this list, have left a mark on me, often in different ways, and what I remember most about this one is the doggedness of the reporting. All the President’s Men is a book where there is no fanciful writing—Bernstein and Woodward are not Mailer or Capote. They are journalists writing perfectly clean, decent prose and they have a story to tell, and they tell it in such a way that it has enormous power.
It’s certainly a case in which the symbiosis of the writer and the detective is as clear as can be. The writer in this sense is like a vigilante—he has charged himself with finding the truth that no one else, through lack of will or ability, has.
Yes. I think what makes important true crime books is not simply the stories they relate but the authors that investigate them. You can have investigative historians like Larson. Or you can have investigative reporters like Woodward and Bernstein. In both cases they are trying to unearth some deeper truth. In many true crime books, the author-investigator is not unlike the detectives he or she is writing about. The skills are very similar, I think, in terms of unearthing evidence and trying to create some kind of structure, plot, or narrative that helps to make sense of the chaos, and piece things back together.•
One of the least-true popular sayings ever is F. Scott Fitzgerald’s saw that “there are no second acts in American lives.” Unfortunately for Maxwell Bodenheim, he was the rare case where the line rang true.
A successful Jazz Age poet and novelist whose erotically charged works positioned him as a scandalous if fashionable figure, Bodenheim became something of a pre-Beat character in later decades, before eventually slipping from Greenwich Village prominence into skid row obscurity, undone by alcoholism, mental illness and other symptoms of the human condition.
The end was even worse than the decline: In 1954, the writer and his third wife, Ruth Fagin, a sometimes prostitute, were murdered by a dishwasher in a Bowery flophouse. It was a scene only Weegee could have truly appreciated, and it’s no shock that the above photograph of Fagin’s body being loaded into an ambulance was taken by the world’s most celebrated tabloid photographer.
Bodenheim was known in his decline phase for trading poems for drinks, getting tossed from saloons where he’d once held court, and panhandling for money on the street while pretending to be blind. It’s understandable if he didn’t want to see what had become of him.
Two articles about the double murder ran in the February 8, 1954, Brooklyn Daily Eagle.
The Player was about the movie business in the same way that The Godfather was about the Mafia: ostensibly.
Both were actually studying a larger idea, American capitalism itself, and the way that money and power can awaken a ruthlessness in those looking to make the grade. The savagery of Michael Tolkin’s 1988 Hollywood Babylon seems almost quaint in retrospect, a stunning turn of events that shows how far we’ve fallen in the decades since. That’s not an isolated event: Dick Cheney, a war criminal, now seems a cooler head by comparison in 2017, the year that the U.S.A. went full apeshit. In this sickening moment, with the White House occupied by a Berlusconi who dreams of being a Mussolini, the dystopia fits into the shrunken screens of our smartphones. The pictures got small, yes, but so have we.
In a smart Los Angeles Review of Books Q&A conducted by David Breithaupt, Tolkin considers culture, government and climate change in the time of Trump, and discusses his futuristic new novel about biological disaster, NK3. An excerpt:
We have suffered catastrophes throughout history. Do you think our current one can be corrected?
So the story goes that Max Brod, Kafka’s friend and biographer, asked Kafka, “Franz, is there hope?” And Kafka answered, “Oh yes, Max, there’s plenty of hope, an infinity of hope — but not for us.” We’re an omnivorous, territorial, and essentially lazy ape that gathers in bands to steal from others, or force them to work for us, and then sing about it and sometimes even feel bad about how bad we are, but still, you know, go on more with the bad than the good. We’re wired for apprehension and hoarding, and we follow the leader. We have religion to mitigate and excuse. We have art for who the fuck knows, really? We’re funny, no question about our sense of humor, especially our gallows humor. We leave loopholes in all our contracts. This is the dystopia now and has been for a long time. The essence of climate denial is to make a bet that the scientists are wrong so there’s no necessity for prudence, just in case the scientists are right. To be prudent might cost money, and if the scientists are wrong, then that money would be wasted. The denial argument is an equation: better to risk the life of the planet than lose money. And we go along with this because it’s too hard to fight peacefully over a long period. The arc of history may bend toward justice, but not in our lifetimes. There’s going to be a massive die off, but in the long run … Consider the animal videos on YouTube, all the little movies showing animal intelligence, animal capacity for love, and animal capacity for joy. This is a new thing — they are evolving ahead of us, they are rejoicing. That dog and goose chasing each other around the rock, that Russian crow sledding on a pitched roof, that cat rescuing the puppy from the ditch, that elephant sitting on the car. They know something. They know we’re on the way out, even if a million more species are killed, in the very long run, soulful life will return to dominion, finding niches and making a shared ecology, without us. And that’s just the way it’s going to be. In the short run, the fuckers are going to have their celebration of blood. In the long run, intelligent bacteria will eat their flesh.
That’s a nice image, wildlife taking over the Earth after we are gone, perhaps the only comforting thought about our dilemma. Spalding Gray said Mother Earth needs a good long break from us. Is it time to pack it in? One of your characters at the end of NK3 says that every civilization is crushed by its own stupidity. Kurt Vonnegut thought we have passed the point of no return. Where do we go from here?
Get out the vote. That’s where we go. Otherwise, it’s pitchforks and torches, and that’s what we’re being goaded toward.•
Futurists often speak of an approaching if indeterminate time when we’ll enjoy radical abundance, with 3D printers spitting out cars and homes affordable to all and on-demand EVs charging pennies to ferry us around. That could happen.
Of course, we already have radical abundance, more than enough resources to feed, clothe, house and educate every person on the planet. Distribution, however, has been a problem. And the greater the bounty, the worse it seems to be divided.
A second point: it suggests we’ll get to a much higher plane of technology, a time when, for better or worse, we can control evolution, which would mean we haven’t yet destroyed ourselves with more rudimentary tools or the emergent ones.
A third point: if Harari is right, even if his prophecy is remarkably aggressive in its timeline of six or so decades, it will change wealth inequality in a fundamental way. The popular belief in recent decades about new tools has been best articulated by the economist Hal Varian: “A simple way to forecast the future is to look at what rich people have today.” Would that still be true if, as the historian puts it, we’re talking about biotech remaking us into gods? The pattern that delivered computers and cell phones in short order from early adopters to the masses might not hold, the disparity never remedied.
From Jeremy Olsham at MarketWatch:
“The greatest industry of the 21st century will probably be to upgrade human beings,” historian Yuval Harari, author of the fascinating new book Homo Deus, told MarketWatch. …
As new technologies yield humans with much longer battery lives, killer apps and godlike superpowers, within the next six decades, if Harari is right, even the finest human specimens of 2017 will in hindsight seem like flip phones.
There is, of course, a catch. Many of us will remain flip phones, as the technology to upgrade humans to iPhones is likely to be costly, and regulated differently around the world. These advances will likely “lead to greater income inequality than ever before,” Harari said. “For the first time in history it will be possible to translate economic inequality into biological inequality.”
Such a divide could give rise to a new version of “old racist ideologies that some races are naturally superior to others,” Harari said. “Except this time the biological differences will be real, something that is engineered and manufactured.”•
Behavioral science, which I just mentioned, is usually sold as a modern means of guiding us to healthier decisions about food and finances, among other areas, nudging us to do right rather than forcing us to. It’s billed as being avuncular rather than autocratic, paternalistic instead of despotic.
Even if that’s so, the field’s application is still often fairly creepy, marked by manipulation. Its real noble contribution would be to teach us about the biases we unwittingly possess and the flaws in our thought processes, so we could analyze them and overcome these failings in time through the development of better critical thinking. Perhaps we’re only in the Proterozoic period of the discipline, and that’s what the branch actually contributes in the long run.
Until that more enlightened age arrives, capitalism all but guarantees that the discipline will be abused by enough players hoping to pad their bank accounts through “priming” and other predatory practices. Even if the efficacy of these methods is overstated, there’s still plenty of money to be made on the margins, prodding the more prone among us to purchase or politick in a particular way.
In 2007, and again in 2008, Kahneman gave a masterclass in “Thinking About Thinking” to, among others, Jeff Bezos (the founder of Amazon), Larry Page (Google), Sergey Brin (Google), Nathan Myhrvold (Microsoft), Sean Parker (Facebook), Elon Musk (SpaceX, Tesla), Evan Williams (Twitter), and Jimmy Wales (Wikipedia). At the 2008 meeting, Richard Thaler also spoke about nudges, and in the clips we can view online he describes choice architectures that guide people toward specific behaviors but that can be reversed with one click if the subject doesn’t like the outcome. In Kahneman’s talk, however, he tells his assembled audience of Silicon Valley entrepreneurs that “priming”—picking a suitable atmosphere—is one of the most important areas of psychological research, a technique that involves offering people cues unconsciously (for instance flashing smiley faces on a screen at a speed that makes them undetectable) in order to influence their mood and behavior. He insists that there are predictable and coherent associations that can be exploited by this sort of priming. If subjects are unaware of this unconscious influence, the freedom to resist it begins to look more theoretical than real.
The Silicon Valley executives clearly saw the commercial potential in these behavioral techniques, since they have now become integral to that sector. When Thaler and Sunstein last updated their nudges.org website in 2011, it contained an interview with John Kenny, of the Institute of Decision Making, in which he says:
You can’t understand the success of digital platforms like Amazon, Facebook, Farmville, Nike Plus, and Groupon if you don’t understand behavioral economic principles…. Behavioral economics will increasingly be providing the behavioral insight that drives digital strategy.
And Jeff Bezos of Amazon, in a letter to shareholders in April 2015, declared that Amazon sellers have a significant business advantage because “through our Selling Coach program, we generate a steady stream of automated machine-learned ‘nudges’ (more than 70 million in a typical week).” It is hard to imagine that these 70 million nudges leave Amazon customers with the full freedom to reverse, after conscious reflection, the direction in which they are being nudged.
Facebook, too, has embraced the behavioral insights described by Kahneman and Thaler, having received wide and unwanted publicity for researching priming. In 2012 its Core Data Science Team, along with researchers at Cornell University and the University of California at San Francisco, experimented with emotional priming on Facebook, without the awareness of the approximately 700,000 users involved, to see whether manipulation of their news feeds would affect the positivity or negativity of their own posts. When this came to light in 2014 it was generally seen as an unacceptable form of psychological manipulation. But Facebook defended the research on the grounds that its users’ consent to their terms of service was sufficient to imply consent to such experiments.•
If Moby-Dick were the only Herman Melville book I’d ever read, I would have assumed that he was a mediocre writer with great ideas. Having gone through all of his shorter works, however, I know he could be a precise and cogent talent. He seemed to have reached for everything with his most famous novel–aiming to fashion a sort of Shakespearean Old Testament story of good and evil–and buckled under the weight of his ambitions.
The far better Moby-Dick is Cormac McCarthy’s 1992 Blood Meridian: Or the Evening Redness in the West, a horse opera of Biblical proportions, a medicine show peddling poison, which takes an unsparing look at our black hearts and leaves the reader with a purple bruise. Twenty-five years on, it remains as profound and disturbing as any American novel.
The writer David Vann reveals he’s similarly admiring of this McCarthy work in a “Twenty Questions” interview in the Times Literary Supplement. He’s also despairing of what he believes is the bleak future of literature. I believe as long as humans are largely human, we’ll always be enamored of narratives. My fear is mainly that sometimes we choose the wrong ones.
Is there any book, written by someone else, that you wish you’d written?
There are hundreds, but the foremost from this time is Cormac McCarthy’s Blood Meridian, which I think is the greatest novel ever written in English. He’s not a dramatist, and I write Greek tragedy, so I never could have imagined skipping the dramatic plane and going straight to vision. I do write in the same American landscape tradition, extending literal landscapes into figurative ones, but I’ll never do it as powerfully as he does.
What will your field look like twenty-five years from now?
Less money for sure. We’ve already lost so much to piracy and shrinking readerships and economic downturns. Publishers will be less brave, editors will edit less, more books will be published online for nothing, we’ll continue to lose experts and have to put up with even more reviews from unqualified idiots, and as entire generations learn to read without subtext about what someone had for lunch, we can expect literature to look more like an account of what someone had for lunch. There is absolutely no way in which the technology or literary theory of the past decades will enrich literature. We should be honest about what is crap. …
What author or book do you think is most overrated? And why?
I should never answer this kind of question, because I’m only shooting myself in the foot, but when Jonathan Franzen appeared on the cover of Time as the Great American Novelist, who could not have thought of McCarthy, Proulx, Robinson, Morrison, Oates, Roth, DeLillo and at least a hundred others far better than Franzen? And to call The Corrections the best book in ten years? Really?•
Even at his most outlandish, Saunders never seems to be writing about the future but instead providing social critiques of contemporary life. What is the quiet nightmare “The Semplica-Girl Diaries” about if not the growing divide between the haves and have-nots as we shift from the Industrial Age to the Digital one, the way technocracy removes the friction from our lives and disappears the “downsized” from our minds?
Closing thought: I like the audacity of this book. I like less the places where it feels like I went into Auto-Quirky Mode. Ah youth! Some issues: Life amid limitations; paucity. Various tonalities of defense. Pain; humiliation inflicted on hapless workers – some of us turn on one another. Early on, this read, could really feel this young writer’s aversion to anything mild or typical or bland. Feeling, at first, like a tic. But then it started to grow on me — around “400 Pound CEO.” This performative thing then starts to feel essential; organic somehow – a way to get to the moral outrage. I kept thinking of the word “immoderation.” Like the yelp of someone who’s just been burned.
Sadly, those sick feelings have fully metastasized in the intervening 20 years, and now CivilWarLand isn’t the only thing in bad decline.
In a Guardian essay, Saunders does a brilliant job explaining the process of creative writers, though I think he actually explains creativity more broadly. It’s largely about noticing small details and extemporaneously making connections between them.
A guy (Stan) constructs a model railroad town in his basement. Stan acquires a small hobo, places him under a plastic railroad bridge, near that fake campfire, then notices he’s arranged his hobo into a certain posture – the hobo seems to be gazing back at the town. Why is he looking over there? At that little blue Victorian house? Stan notes a plastic woman in the window, then turns her a little, so she’s gazing out. Over at the railroad bridge, actually. Huh. Suddenly, Stan has made a love story. Oh, why can’t they be together? If only “Little Jack” would just go home. To his wife. To Linda.
What did Stan (the artist) just do? Well, first, surveying his little domain, he noticed which way his hobo was looking. Then he chose to change that little universe, by turning the plastic woman. Now, Stan didn’t exactly decide to turn her. It might be more accurate to say that it occurred to him to do so; in a split-second, with no accompanying language, except maybe a very quiet internal “Yes.”
He just liked it better that way, for reasons he couldn’t articulate, and before he’d had the time or inclination to articulate them.
An artist works outside the realm of strict logic. Simply knowing one’s intention and then executing it does not make good art. Artists know this. According to Donald Barthelme: “The writer is that person who, embarking upon her task, does not know what to do.” Gerald Stern put it this way: “If you start out to write a poem about two dogs fucking, and you write a poem about two dogs fucking – then you wrote a poem about two dogs fucking.” Einstein, always the smarty-pants, outdid them both: “No worthy problem is ever solved in the plane of its original conception.”•
Margaret Atwood is in an odd position: As our politics get worse, her stature grows. Right now, sadly (for us), she’s never towered higher.
Appropriate that on International Women’s Day, the day of the A Day Without a Woman protests, The Handmaid’s Tale novelist conducted a Reddit Ask Me Anything to coincide with the soon-to-premiere Hulu version of her most famous work. Dystopia, feminism and literature are, of course, among the discussion topics. A few exchanges follow.
Thank you so much for writing The Handmaid’s Tale. It was the book that got me hooked on dystopian novels. What was your inspiration for the story?
Ooo, three main things: 1) What some people said they would do re: women if they had the power (they have it now and they are); 2) 17th C Puritan New England, plus history through the ages — nothing in the book that didn’t happen, somewhere and 3) the dystopian spec fics of my youth, such as 1984, Ray Bradbury’s Fahrenheit 451, etc. I wanted to see if I could write one of those, too.
What is a book you keep going back to read and why?
This is going to sound corny but Shakespeare is my return read. He knew so much about human nature (+ and minus) and also was an amazing experimenter with language. But there are many other favourites. Wuthering Heights recently. In moments of crisis I go back to (don’t laugh) Lord of the Rings, b/c despite the EVIL EYE OF MORDOR it comes out all right in the end. Whew.
How, if at all, has your feminism changed over the last decade or so? Can you see these changes taking place throughout your literature? Lastly, can you offer any advice for feminists of the millennial generation? What mistakes are we making/repeating? What are our priorities in this political climate?
Hello: I am so shrieking old that my formative years (the 40s and 50s) took place before 2nd wave late-60’s feminist/women’s movement. But since I grew up largely in the backwoods and had strong female relatives and parents who read a lot and never told me I couldn’t do such and such because of being a girl, I avoided the agit-prop of the 50s that said women should be in bungalows with washing machines to make room for men coming back from the war. So I was always just very puzzled by some of the stuff said and done by/around women. I was probably a danger to myself and others! (joke) My interest was in women of all kinds — and they are of all kinds. They are interesting in and of themselves, and they do not always behave well. But then I learned more about things like laws and other parts of the world, and history… try Marilyn French’s From Eve to Dawn, pretty massive. We are now in what is being called the 3rd wave — seeing a lot of pushback against women, and also a lot of women pushing back in their turn. I’d say in general: be informed, be aware. The priorities in the US are roughly trying to prevent the roll-back that is taking place especially in the area of women’s health. Who knew that this would ever have to be defended? Childbirth care, pre-natal care, early childhood care — many people will not even be able to afford any of it. Dead bodies on the floor will result. It is frightful. Then there is the whole issue of sexual violence being used as control — it is such an old motif. For a theory of why now, see Eve’s Seed. It’s an unsettled time. If I were a younger woman I’d be taking a self-defense course. I did once take Judo, in the days of the Boston Strangler, but it was very lady-like then and I don’t think it would have availed. There’s something called Wen-Do. It’s good, I am told.
The Handmaid’s Tale gets thrown out as your current worst-case scenario right now but I read The Heart Goes Last a few months ago and I was surprised how possible it felt. Was there a specific news story or event that compelled you to write that particular story?
The Heart Goes Last — yes, came from my interest in what happens when a region’s economy collapses and people are really up against it, and the only “business” in which people can have jobs is a prison. It pushes the envelope (will there really be some Elvis robots?) but again, much of what was only speculation then is increasingly possible.
How did your experience with the 2017 version differ from the 1990 version of The Handmaid’s Tale?
Different times (that world is closer now!) and a 90 minute film is a different proposition from a 10 part 1st season series, which can build out and deep dive because it has more time. The advent of high-quality streamed or televised series has opened up a whole new set of possibilities for longer novels. We launched the 1990 film in West and then East Berlin just as the Wall was coming down… and I started writing book when the Wall was still there… Framed it in people’s minds in a different way. Also, then, many people were saying “It can’t happen here.” Now, not so much….•
Dennett believes that our conception of conscious creatures with subjective inner lives—which are not describable merely in physical terms—is a useful fiction that allows us to predict how those creatures will behave and to interact with them.
Nagel draws an analogy between Dennett’s ideas and the Behaviorism of B.F. Skinner and other mid-century psychologists, a theory that never was truly satisfactory in explaining the human mind. Dennett’s belief that we’re more machine-like than we want to believe is probably accurate, though his assertion that all consciousness is illusory–if that’s what he’s arguing–seems off.
Dennett’s lifelong work on consciousness and evolution has certainly crested at the right moment, as we’re beginning to wonder in earnest about AI and non-human consciousness, which seems possible at some point if not on the immediate horizon. In a Financial Times interview conducted by John Thornhill, Dennett speaks to the nature and future of robotics.
AI experts tend to draw a sharp distinction between machine intelligence and human consciousness. Dennett is not so sure. Where many worry that robots are becoming too human, he argues humans have always been largely robotic. Our consciousness is the product of the interactions of billions of neurons that are all, as he puts it, “sorta robots”.
“I’ve been arguing for years that, yes, in principle it’s possible for human consciousness to be realised in a machine. After all, that’s what we are,” he says. “We’re robots made of robots made of robots. We’re incredibly complex, trillions of moving parts. But they’re all non-miraculous robotic parts.” …
Dennett has long been a follower of the latest research in AI. The final chapter of his book focuses on the subject. There has been much talk recently about the dangers posed by the emergence of a superintelligence, when a computer might one day outstrip human intelligence and assume agency. Although Dennett accepts that such a superintelligence is logically possible, he argues that it is a “pernicious fantasy” that is distracting us from far more pressing technological problems. In particular, he worries about our “deeply embedded and generous” tendency to attribute far more understanding to intelligent systems than they possess. Giving digital assistants names and cutesy personas worsens the confusion.
“All we’re going to see in our own lifetimes are intelligent tools, not colleagues. Don’t think of them as colleagues, don’t try to make them colleagues and, above all, don’t kid yourself that they’re colleagues,” he says.
Dennett adds that if he could lay down the law he would insist that the users of such AI systems were licensed and bonded, forcing them to assume liability for their actions. Insurance companies would then ensure that manufacturers divulged all of their products’ known weaknesses, just as pharmaceutical companies reel off all their drugs’ suspected side-effects. “We want to ensure that anything we build is going to be a systemological wonderbox, not an agency. It’s not responsible. You can unplug it any time you want. And we should keep it that way,” he says.•
Ezra Klein of Vox has an excellent interview about meditation and much more with Yuval Noah Harari, though I don’t know that I’m buying the main premise, which is that the Israeli historian can so ably communicate such cogent ideas because of his adherence to this “mind-clearing” practice.
If that’s so, then I would have to suppose Harari was meditating far less while writing Homo Deus than when composing Sapiens, because the follow-up, while still worth reading, is not nearly as incisive or effective as his first book. (Jennifer Senior had a very good review of the sophomore effort in the New York Times.)
What separates Harari from other historians trying to communicate with a lay audience is his ability to brilliantly synthesize ideas in a very organic way. Even when I’m not sure if I’m totally buying one of these combinations (e.g., Alan Turing created a test in which a computer could pass for a human because he spent his brief, tragic life trying to pass for heterosexual), it still provokes me to think deeply on the subject.
I would assume this talent is more a quirk of his own brain chemistry and diligent development of natural gifts than anything else. Of course, meditation could be aiding in the process, or, perhaps, the practice of Vipassana is more correlation than causation. I doubt even Harari truly knows for sure.
The two opening exchanges:
You told the Guardian that without meditation, you’d still be researching medieval military history — but not the Neanderthals or cyborgs. What changes has meditation brought to your work as a historian?
Yuval Noah Harari:
Two things, mainly. First of all, it’s the ability to focus. When you train the mind to focus on something like the breath, it also gives you the discipline to focus on much bigger things and to really tell the difference between what’s important and everything else. This is a discipline that I have brought to my scientific career as well. It’s so difficult, especially when you deal with long-term history, to get bogged down in the small details or to be distracted by a million different tiny stories and concerns. It’s so difficult to keep reminding yourself what is really the most important thing that has happened in history or what is the most important thing that is happening now in the world. The discipline to have this focus I really got from the meditation.
The other major contribution, I think, is that the entire exercise of Vipassana meditation is to learn the difference between fiction and reality, what is real and what is just stories that we invent and construct in our own minds. Almost 99 percent you realize is just stories in our minds. This is also true of history. Most people, they just get overwhelmed by the religious stories, by the nationalist stories, by the economic stories of the day, and they take these stories to be the reality.
My main ambition as a historian is to be able to tell the difference between what’s really happening in the world and what are the fictions that humans have been creating for thousands of years in order to explain or in order to control what’s happening in the world.
One of the ideas that is central to your book Sapiens is that the central quality of Homo sapiens, what has allowed us to dominate the earth, is the ability to tell stories and create fictions that permit widespread cooperation in a way other species can’t. And what you count as fiction ranges all the way from early mythology to the Constitution of the United States of America.
I wouldn’t have connected that to the way meditation changes what you see as real, but it makes sense that if you’re observing the way your mind creates imaginary stories, maybe much more ends up falling into that category than you originally thought.
Yuval Noah Harari:
Yes, exactly. We seldom realize it, but all large-scale human cooperation is based on fiction. This is most clear in the case of religion, especially other people’s religion. You can easily understand that, yes, millions of people come together to cooperate in a crusade or a jihad or to build the cathedral or a synagogue because all of them believe some fictional story about God and heaven and hell.
What is much more difficult to realize is that exactly the same dynamic operates in all other kinds of human cooperation. If you think about human rights, human rights are a fictional story just like God and heaven. They are not a biological reality. Biologically speaking, humans don’t have rights. If you take Homo sapiens and look inside, you find the heart and the kidneys and the DNA. You don’t find any rights. The only place rights exist is in the stories that people have been inventing.
Another very good example is money. Money is probably the most successful story ever told. It has no objective value. It’s not like a banana or a coconut. If you take a dollar bill and look at it, you can’t eat it. You can’t drink it. You can’t wear it. It’s absolutely worthless. We think it’s worth something because we believe a story. We have these master storytellers of our society, our shamans — they are the bankers and the financiers and the chairperson of the Federal Reserve, and they come to us with this amazing story that, “You see this green piece of paper? We tell you that it is worth one banana.”
If I believe it and you believe it and everybody believes it, it works. It actually works. I can take this worthless piece of paper, go to a complete stranger who I never met before, give him this piece of paper, and he in exchange will give me a real banana that I can eat.
This is really amazing, and no other animal can do it. Other animals sometimes trade. Chimpanzees, for example, they trade. You give me a coconut. I’ll give you a banana. That can work with a chimpanzee, but you give me a worthless piece of paper and you expect me to give you a banana? That will never work with a chimpanzee.
This is why we control the world, and not the chimpanzees.•
New Texas Monthly EIC Tim Taliaferro was quoted in the Columbia Journalism Review as saying that “Texans don’t care about politics,” so the publication was moving in a new and softer direction. Oil itself couldn’t be more Texan than politics, so the editor quickly found himself sliding around in a slick and backpedaled as hastily as possible. Who knows what the future holds for the legendary long-form magazine, but it doesn’t sound hopeful.
One of the linchpins of Texas Monthly reportage–and, more broadly, of the New Journalism of the ’60s and ’70s–was Gary Cartwright, a Dallas-Fort Worth gonzo who just passed away. The hard-drinking, freewheeling writer was a producer of exquisite prose and a confidant of Jack Ruby, who spent many of the days leading up to his murder of Lee Harvey Oswald planted, along with his stripper girlfriend Jada, on Cartwright’s couch. The scribe wrote of the two-bit assassin who dreamed of global fame: “He didn’t make history; he only stepped in front of it. When he emerged from obscurity into that inextricable freeze-frame that joins all of our minds to Dallas, Jack Ruby, a baldheaded little man who wanted above all else to make it big, had his back to the camera.”
In the 1960s and ’70s Mr. Cartwright belonged to a group of writers — including Mr. Shrake, Dan Jenkins, Billy Lee Brammer and Larry L. King, one of the writers of the hit Broadway musical “The Best Little Whorehouse in Texas” — whose hard, boozy living and freewheeling prose captured and exemplified the era.
“It seemed like they were living lives of joy and engagement and with a sense of recklessness that was beyond the reach of most of us,” Joe Holley, a columnist and editorial writer for The Houston Chronicle, said in an interview. “They lived hard. They wrote well, and they seemed to be intensely alive.
“What we didn’t realize until later, when the heart attacks began and when they started writing confessional memoirs, was that hard living exacted a price.”
Mr. Cartwright published another memoir, The Best I Recall, in 2015. He also wrote screenplays and novels.
He was born in Dallas in 1934 and grew up in the tiny West Texas oil town of Royalty in the late 1930s. With defense plants in the Dallas-Fort Worth area hiring after the start of World War II, the family moved to Arlington, the Dallas suburb, where his mother worked in a dress shop. His father worked at a defense plant in Fort Worth.
After high school Mr. Cartwright attended Arlington State College and the University of Texas, enlisted in the Army for a two-year stateside stint and earned his bachelor’s degree afterward at Texas Christian University.
He got his start in journalism in the mid-1950s, covering the police and sports for newspapers in Fort Worth and Dallas. He became the anchor of Texas Monthly and mentored a generation of young journalists, including Nicholas Lemann, the author and former dean of the Columbia University Graduate School of Journalism.
“Gary was a Texas news guy to the core — somebody who grew up in old-school, smoke-filled, blue-collar newsrooms and went on to become one of the first Texas journalists to make a national reputation in long-form journalism,” Mr. Lemann said.
The opening of Cartwright’s great 1976 Texas Monthly profile of notorious Dallas stripper Juanita Dale Slusher, aka Candy Barr:
On the road home to Brownwood in her green ’74 Cadillac with the custom upholstery and the CB radio, clutching a pawn ticket for her $3000 mink, Candy Barr thought about biscuits. Biscuits made her think of fried chicken, which in turn suggested potato salad and corn. For as long as she could remember, in times of crisis and stress, Candy Barr always thought of groceries. It was a miracle she didn’t look like a platinum pumpkin, but she didn’t: even at 41, she still looked like a movie star.
For once, the crisis was not her own. It was something she had read a few days earlier about how the omnipotent, totalitarian they were about to jackboot the remnants of the once happy and prosperous life of a 76-year-old Dallas electrician named O. E. Cole. Candy had never heard of O. E. Cole until she spotted his pitiful tale in the Brownwood newspaper. She didn’t know if Cole was black or white, mean or generous, judgmental or forgiving. She only knew he was in trouble. For nearly fifty years Cole had been an upright, hardworking citizen of a city Candy Barr had every reason to hate; then his wife Nettie suffered a stroke and lingered in a coma for eighteen months while their savings were sucked away. According to the newspaper account, Cole spent $500 for Nettie’s headstone, which left him a balance of $157. Before he could use that money to cover mortgage payments on his home and the electrician’s shop at the back, a gunman shot and robbed him. Now, when he was too old to apply for additional credit, they were prepared to foreclose.
“This is a goddamn crime!” Candy raged, throwing her suitcase on the bed and barking a string of orders to her houseguests: Scott, her 22-year-old boyfriend of the moment, and Susan Slusher, her 17-year-old niece who had recently come to stay with “Aunt Nita” from a broken home in Philadelphia.
Scott and Susan had been around just long enough to know that when Candy blew—as often as she did without warning—to look not for explanations but for something sturdy to hang onto. Try to imagine a hurricane in a Dixie Cup. The laughing tropical green eyes boiled, and the innocence that had made that perfect teardrop face a landmark in the sexual liberation of an entire generation of milquetoasts became the wrath of Zeus. They say she once sat waiting in a rocking chair talking to sweet Jesus and when her ex-husband kicked down the door she threw down on him with a pistol that was resting conveniently in her lap. She shot him in the stomach, but she was aiming for the groin. When she caught mobster Mickey Cohen talking to another woman, she slugged him in the teeth. She carved her mark on a dyke in the prison workshop: this was not a lovers’ quarrel, as an assistant warden indicated on her record, but a disagreement stemming from Candy’s hard-line belief that a worker should take pride in her job.
Candy had a cosmic way of connecting things, which to the more prosaic mind might appear coincidental. So it was that the ill-fated placement of a Citizens National Bank of Brownwood ad next to the article outlining the plight of O. E. Cole ignited her fuse. The bank ad suggested that had it not been for a Revolutionary War banker named Robert Morris, we might all be sipping tea with crumpets and begging God to save our Queen. What the average eye might take as harmless Bicentennial puffery hit Candy’s heart dead center.
“I watched the bastards do the same thing to my daddy,” Candy fumed, removing her mink from the cedar chest and raking bottles and jars of cosmetics into her overnight bag. “I sold my hunting rifle three times to help my daddy. It’s a crime what they can do to people, a goddamn crime. Don’t call me a criminal if you’re gonna be one.”
With the skillful employment of her CB radio, “The Godmother” and her two young companions made the 160 miles to Dallas in less than two hours. Candy hocked her mink for $250, then called on dancer Chastity Fox and other friends to help raise another $150. Then Candy painted her face with soft missionary shades of tan and gold and called on O. E. Cole, introducing herself as Juanita Dale Phillips of Brownwood and presenting the goggle-eyed electrician with $400 and a copy of her book of prison poems, A Gentle Mind … Confused. Cole couldn’t have been more confused if he had found Fidel Castro in his refrigerator. When I spoke with Cole two weeks later, there were still some blank spaces behind his eyes, but the crisis had passed.
“I didn’t know who she was till I saw her name on that little book,” he told me. “Oh, yes, I knew the name Candy Barr. You couldn’t live in Dallas long as I had and not know that name. But it wasn’t for me to judge her. What is past is past. It’s what a person is now I go on, and she was awful nice. We sat around and talked for hours. In fact, we talked all night long.”•
Rebuilding the world has to rank at the top of the economic low-hanging fruit of the last century. America, its forces marshaled, played a leading role in piecing together the shattered globe in the wake of WWII. Yes, four decades of unfortunate tax rates, globalization, automation and the demise of unions have all abetted the decline of the U.S. middle class, but just as true is that the good times simply ended, the job completed (more or less), the outlier running headlong into entropy. The contents of this half-empty glass finally spilled all over the world in 2016, provoking outrageously regressive political shifts, with perhaps more states becoming submerged this year.
As An Extraordinary Time author Marc Levinson wrote in 2016 in the Wall Street Journal: “The quarter-century from 1948 to 1973 was the most striking stretch of economic advance in human history. In the span of a single generation, hundreds of millions of people were lifted from penury to unimagined riches.” In “End of a Golden Age,” an Aeon essay, the economist and journalist further argues the global circumstances of the postwar era were a one-time-only opportunity for runaway productivity, a fortunate arrangement of stars likely to never align again.
Well, never is an extremely long stretch (we hope), but the economic-growth-rate promises brought to the trail by Sanders and Trump, which have made it to the White House with the unfortunate election of the latter candidate, were at best fanciful, though delusional might also be a fair assessment. If I had to guess, I would say someday we’ll see tremendous growth again, but when that happens and what precipitates it, I don’t know. Nobody really does.
When it comes to influencing innovation, governments have power. Grants for scientific research and education, and policies that make it easy for new firms to grow, can speed the development of new ideas. But what matters for productivity is not the number of innovations, but the rate at which innovations affect the economy – something almost totally beyond the ability of governments to control. Turning innovative ideas into economically valuable products and services can involve years of trial and error. Many of the basic technologies behind mobile telephones were developed in the 1960s and ’70s, but mobile phones came into widespread use only in the 1990s. Often, a new technology is phased in only over time as old buildings and equipment are phased out. Moreover, for reasons no one fully understands, productivity growth and innovation seem to move in long cycles. In the US, for example, between the 1920s and 1973, innovation brought strong productivity growth. Between 1973 and 1995, it brought much less. The years between 1995 and 2003 saw high productivity gains, and then again considerably less thereafter.
When the surge in productivity following the Second World War tailed off, people around the globe felt the pain. At the time, it appeared that a few countries – France and Italy for a few years in the late 1970s, Japan in the second half of the ’80s – had discovered formulas allowing them to defy the downward global productivity trend. But their economies revived only briefly before productivity growth waned. Jobs soon became scarce again, and improvements in living standards came more slowly. The poor productivity growth of the late 1990s was not due to taxes, regulations or other government policies in any particular country, but to global trends. No country escaped them.
Unlike the innovations of the 1950s and ’60s, which were welcomed widely, those of the late 20th century had costly side effects. While information technology, communications and freight transportation became cheaper and more reliable, giant industrial complexes became dinosaurs as work could be distributed widely to take advantage of labour supplies, transportation facilities or government subsidies. Workers whose jobs were relocated found that their years of experience and training were of little value in other industries, and communities that lost major employers fell into decay. Meanwhile, the welfare state on which they had come to rely began to deteriorate, its financial underpinnings stressed due to the slow growth of tax revenue in economies that were no longer buoyant. The widespread sharing in the mid-century boom was not repeated in the productivity gains at the end of the century, which accumulated at the top of the income scale.
For much of the world, the Golden Age brought extraordinary prosperity. But it also brought unrealistic expectations about what governments can do to assure full employment, steady economic growth and rising living standards. These expectations still shape political life today.•