The new Space Race is upon us, this one as much a contest between private and public as among nations. Norman Mailer, above all others, correctly read the subtext of the 1960s iteration, realizing the Apollo mission portended a permanent subjugation of humanity–“space travel proposed a future world of brains attached to wires.” Hemingway’s bullfights and other macho challenges were hopelessly diminished in a time of space odyssey. Now we’ll return to space with greater desperation, hoping to safeguard the species from existential risks. Of course, we will simultaneously mutate and end the species as we know it when we stretch across the sky. We’ll become them.
In a Vantage essay, Doug Bierend writes of Abandoned in Place, Roland Miller’s glorious collection of photos which captures the gentle decline of decommissioned launch sites and NASA structures of yore. An excerpt:
Shot with a reverent eye, NASA’s sprawling launch sites and structures, gleaming test facilities, and rusting machinery come together as a visual document and testament to 21st-century humanity’s ever-extending reach into the cosmos. Mute monuments to what were once our most lofty ideals.
“I’m a child of the ‘60s, and for anybody that was growing up during that time it was so exciting, it was like science fiction come to life — they were going to try to land on the moon, and they did,” says Miller. “And here we are almost 50 years later and we couldn’t land it on the moon — I doubt we could make the same nine-year window if we started now.”
With the scrapping of the Space Shuttle, public excitement over space exploration seemed to reach an all-time low, along with NASA’s budget. But whether due to a combination of private innovation by the likes of SpaceX, the effective popularizing of science by figures like Neil deGrasse Tyson, or a renewed schedule of NASA programs (including missions to Mars), a second golden age of space exploration may be dawning.
Some of Miller’s photos come across as essentially documentary, showing the current state of a once gleaming endeavor. Others are more abstract, revealing textures and colors and forms that allude to something ineffable. An aesthetic that’s as much part of science fiction as science fact, conjuring notions of space and worlds beyond our own, and how we might get there.•
Some of the things contemporary consumers most desire to possess are tangible (smartphones) and others not at all (Facebook, Instagram, etc.). In fact, many want the former mainly to get the latter. A social media “purchase” requires no money but is a trade of information for attention, a dynamic that’s been widely acknowledged, but one that still stuns me. Our need to share ourselves–to write our names Kilroy-like on a wall, as Hunter S. Thompson once said–is etched so deeply in our brains. Manufacturers have used psychology to sell for at least a century, but the transaction has never been purer, never required us to not only act on impulse but to publish that instinct as well. Judging by the mood of America, this new thing, while it may provide some satisfaction, also promotes an increased hunger in the way sugar does. And while the Internet seems to encourage individuality, its mass use and many memes suggest something else.
On a somewhat related topic: Rebecca Spang’s Financial Times article analyzes a new book which argues that the consumerist shift is more a political phenomenon than we’d like to believe, often the culmination of large-scale state decisions rather than of personal choice. The passage below refers to material goods, but I think the implications for the immaterial are the same. The excerpt:
In Empire of Things, Frank Trentmann brings history to bear on all these questions. His is not a new subject, per se, but his thick volume is both an impressive work of synthesis and, in its emphasis on politics and the state, a timely corrective to much existing scholarship on consumption. Based on specialist studies that range across five centuries, six continents and at least as many languages, the book is encyclopedic in the best sense. In his final pages, Trentmann intentionally or otherwise echoes Diderot’s statement (in his own famous Encyclopédie) that the purpose of an encyclopedia is to collect and transmit knowledge “so that the work of preceding centuries will not become useless to the centuries to come”. Empire of Things uses the evidence of the past to show that “the rise of consumption entailed greater choice but it also involved new habits and conventions . . . these were social and political outcomes, not the result of individual preferences”. The implications for our current moment are significant: sustainable consumption habits are as likely to result from social movements and political action as they are from self-imposed shopping fasts and wardrobe purges.
When historians in the 1980s-1990s first shifted from studying production to consumption, our picture of the past became decidedly more individualist. In their letters and diaries, Georgian and Victorian consumers revealed passionate attachments to things — those they had and those they craved. Personal tastes and preferences hence came to rival, then to outweigh, abstract processes (industrialisation, commodification, etc) as explanations for historical change. The world looked so different! Studied from the vantage point of production, the late 18th and 19th centuries had appeared uniformly dark and dusty with soot; imagined from the consumer’s perspective, those same years glowed bright with an entire spectrum of strange, distinct colours (pigeon’s breast, carmelite, eminence, trocadero, isabella, Metternich green, Niagra [sic] blue, heliotrope). At the exact moment when Soviet power seemed to have collapsed chiefly from the weight of repressed consumer desire, consumption emerged as a largely positive, almost liberating, historical force. “Material culture” became a common buzzword; “thing theory” — yes, it really is a thing — was born.•
Asking if innovation is over is no less narcissistic than suggesting that evolution is done. It flatters us to think that we’ve already had all the good ideas, that we’re the living end. More likely, we’re always closer to the beginning.
Of course, when looking at relatively short periods of time, there are ebbs and flows in invention that have serious ramifications for the standard of living. In Robert Gordon’s The Rise and Fall of American Growth, the economist argues that the 1870-1970 period was a golden age of productivity and development unknown previously and unmatched since.
In an excellent Foreign Affairs review, Tyler Cowen, who himself has worried that we’ve already picked all the low-hanging fruit, lavishly praises the volume–“likely to be the most interesting and important economics book of the year.” But in addition to acknowledging a technological slowdown in the last few decades, Cowen also wisely counters the book’s downbeat tone while recognizing the obstacles to forecasting, writing that “predicting future productivity rates is always difficult; at any moment, new technologies could transform the U.S. economy, upending old forecasts. Even scholars as accomplished as Gordon have limited foresight.” In fact, he points out that the author, before his current pessimism, predicted very healthy growth rates earlier this century.
My best guess is that there will always be transformational opportunities, ripe and within arm’s length, waiting for us to pluck them.
In the first part of his new book, Gordon argues that the period from 1870 to 1970 was a “special century,” when the foundations of the modern world were laid. Electricity, flush toilets, central heating, cars, planes, radio, vaccines, clean water, antibiotics, and much, much more transformed living and working conditions in the United States and much of the West. No other 100-year period in world history has brought comparable progress. A person’s chance of finishing high school soared from six percent in 1900 to almost 70 percent, and many Americans left their farms and moved to increasingly comfortable cities and suburbs. Electric light illuminated dark homes. Running water eliminated water-borne diseases. Modern conveniences allowed most people in the United States to abandon hard physical labor for good.
In highlighting the specialness of these years, Gordon challenges the standard view, held by many economists, that the U.S. economy should grow by around 2.2 percent every year, at least once the ups and downs of the business cycle are taken into account. And Gordon’s history also shows that not all GDP gains are created equal. Some sources of growth, such as antibiotics, vaccines, and clean water, transform society beyond the size of their share of GDP. But others do not, such as many of the luxury goods developed since the 1980s. GDP calculations do not always reflect such differences. Gordon’s analysis here is mostly correct, extremely important, and at times brilliant—the book is worth buying and reading for this part alone.
Gordon goes on to argue that today’s technological advances, impressive as they may be, don’t really compare to the ones that transformed the U.S. economy in his “special century.” Although computers and the Internet have led to some significant breakthroughs, such as allowing almost instantaneous communication over great distances, most new technologies today generate only marginal improvements in well-being. The car, for instance, represented a big advance over the horse, but recent automotive improvements have provided diminishing returns. Today’s cars are safer, suffer fewer flat tires, and have better sound systems, but those are marginal, rather than fundamental, changes. That shift—from significant transformations to minor advances—is reflected in today’s lower rates of productivity.•
There’s never been greater access to books than there is right now, but all progress comes with a price. If print fiction and histories and such should disappear or become merely a luxury item, digital media would change the act of reading in unexpected ways over time.
Some see screen reading as promoting a decline in analytical skills, but the human brain sure seems able to adapt to new forms once it becomes acclimated. Even as someone raised on paper books, I’m not worried that what’s lost in translation will be greater than what’s gained. Of course, I say that while still primarily using dead-tree volumes.
In a smart BBC Future article, Rachel Nuwer traces the fuzzy history of e-books and considers the future of reading. Some experts she interviews hope for a “bi-literate” society that values both the paperback and the Kindle. That would be a great outcome, but I don’t know how realistic a scenario it is. The opening:
When Peter James published his novel Host on two floppy disks in 1993, he was ill-prepared for the “venomous backlash” that would follow. Journalists and fellow writers berated and condemned him; one reporter even dragged a PC and a generator out to the beach to demonstrate the ridiculousness of this new form of reading. “I was front-page news of many newspapers around the world, accused of killing the novel,” James told pop.edit.lit. “[But] I pointed out that the novel was already dying at an alarming rate without my assistance.”
Shortly after Host’s debut, James also issued a prediction: that e-books would spike in popularity once they became as easy and enjoyable to read as printed books. What was a novelty in the 90s, in other words, would eventually mature to the point that it threatened traditional books with extinction. Two decades later, James’ vision is well on its way to being realised.
That e-books have surged in popularity in recent years is not news, but where they are headed – and what effect this will ultimately have on the printed word – is unknown. Are printed books destined to eventually join the ranks of clay tablets, scrolls and typewritten pages, to be displayed in collectors’ glass cases with other curious items of the distant past?
And if all of this is so, should we be concerned?•
Her latest salvo tries to locate the real legacy of Steve Jobs, who was mourned equally in office parks and Zuccotti Park. In doing so she draws on the two recent films about the Apple architect, Alex Gibney’s and Danny Boyle’s, and the new volume about him by Brent Schlender and Rick Tetzeli. Ultimately, the key truth may be that Jobs used a Barnum-esque “magic” and marketing myths not only to sell his new machines but to plug them into consumers’ souls.
So why, Gibney wonders as his film opens—with thousands of people all over the world leaving flowers and notes “to Steve” outside Apple Stores the day he died, and fans recording weepy, impassioned webcam eulogies, and mourners holding up images of flickering candles on their iPads as they congregate around makeshift shrines—did Jobs’s death engender such planetary regret?
The simple answer is voiced by one of the bereaved, a young boy who looks to be nine or ten, swiveling back and forth in a desk chair in front of his computer: “The thing I’m using now, an iMac, he made,” the boy says. “He made the iMac. He made the Macbook. He made the Macbook Pro. He made the Macbook Air. He made the iPhone. He made the iPod. He’s made the iPod Touch. He’s made everything.”
Yet if the making of popular consumer goods was driving this outpouring of grief, then why hadn’t it happened before? Why didn’t people sob in the streets when George Eastman or Thomas Edison or Alexander Graham Bell died—especially since these men, unlike Steve Jobs, actually invented the cameras, electric lights, and telephones that became the ubiquitous and essential artifacts of modern life?* The difference, suggests the MIT sociologist Sherry Turkle, is that people’s feelings about Steve Jobs had less to do with the man, and less to do with the products themselves, and everything to do with the relationship between those products and their owners, a relationship so immediate and elemental that it elided the boundaries between them. “Jobs was making the computer an extension of yourself,” Turkle tells Gibney. “It wasn’t just for you, it was you.”•
Curtis is, in many ways, the USGS gatekeeper, the public affairs officer who serves as a frontline liaison with the community and the press. Her office sits directly across the hall from the conference room, and if you call the Survey, chances are it will be her low-key drawl you’ll hear on the line. In her late forties, dark-haired and good-humored, Curtis has been at the USGS since 1979, and in that time, she’s staked out her own odd territory as a collector of earthquake predictions, which come across the transom at sporadic but steady intervals, like small seismic jolts themselves.
“I’ve been collecting almost since day one,” she tells me on a warm July afternoon in her office, adding that it’s useful for USGS to keep records, if only to mollify the predictors, many of whom view the scientific establishment with frustration, paranoia even, at least as far as their theories are concerned.
“Basically,” she says, “we are just trying to protect our reputation. We don’t want to throw these predictions in the wastebasket, and then a week later…” She chuckles softly, a rolling R sound as thick and throaty as a purr. “Say somebody predicted a seven in downtown L.A., and we ignored it. Can you imagine the reaction if it actually happened? So this is sort of a little bit of insurance. If you send us a prediction, we put it in the file.”•
Audio only of Ray Bradbury lecturing in 1964 at UCLA during the “most exciting age in the history of mankind,” labeling the Space Age as greater than the Renaissance. He was right that we would soon be on the moon, but he didn’t foresee the forestalling of space exploration post-Apollo and the geographical barriers to global unity. Bradbury also speaks adoringly of Mad magazine and discusses a Life short story to be published at the beginning of 1965.
If I could have dinner with any three living Americans, Ricky Jay would definitely be one, even though I can’t say I care much for magic. Jay, of course, practices magic in the same sense that Benjamin Franklin flew kites. It’s the invisible stuff being conducted that makes all the difference.
It’s always amazed me that Jay’s enjoyed so much success despite having a brilliance driven so far from the mainstream by manias about marginalia, things barely perceptible to most. In that vein, he’s written a book about Matthias Buchinger, an eighteenth-century German magician whose unlikely success even outdoes Jay’s.
Buchinger was a 29-inch-tall phocomelic who lacked properly formed limbs yet managed to gain acclaim in a variety of fields: marksmanship, bowling, illustration, music, dance and micrography. The latter gift–the ability to write in incredibly small letters–is the basis of the book and part of an exhibit at the Met.
The magician Ricky Jay, considered by many the greatest sleight-of-hand artist alive, is also a scholar, a historian, a collector of curiosities. Master of a prose style that qualifies him as perhaps the last of the great 19th-century authors, he has written about oddities like cannonball catchers, poker-playing pigs, performing fleas and people who tame bees. But probably his most enduring interest is a fellow polymath, an 18th-century German named Matthias Buchinger.
Buchinger (1674-1739) was a magician and musician, a dancer, champion bowler and trick-shot artist and, most famously, a calligrapher specializing in micrography — handwriting so small it’s barely legible to the naked eye. His signature effect was to render locks of hair that, when examined closely, spelled out entire Psalms or books from the Bible. What made his feats even more remarkable is that Buchinger was born without hands or feet and was only 29 inches tall. Portraits show him standing on a cushion and wearing a sort of lampshade-like robe. Yet he married four times and had 14 children. Some people have suggested that he also had up to 70 mistresses, but Mr. Jay says that’s nonsense.
Nothing’s so useful in politics as boogeymen. Fixing an actual large-scale problem is hard, sometimes impossible, so attention is often diverted to a relatively minuscule one. There’s an added bonus: Frightened people are paralyzed, easy to manipulate.
During the second half of the 1960s, when the American social fabric began to fray in a cultural revolution that no one could contain, motorcycle gangs became useful stooges as symbols of barbarians at the gates. In 1966, when a shocking report of a California crime made the Hell’s Angels Public Enemy No. 1, Hunter S. Thompson elucidated the disproportionate attention the unholy rollers were receiving in an article in the Nation. Of course, the following year he fed the myth himself with a book about his travels–and travails–with the club. An excerpt:
The California climate is perfect for motorcycles, as well as surfboards, swimming pools and convertibles. Most of the cyclists are harmless weekend types, members of the American Motorcycle Association, and no more dangerous than skiers or skin divers. But a few belong to what the others call “outlaw clubs,” and these are the ones who–especially on weekends and holidays–are likely to turn up almost anywhere in the state, looking for action. Despite everything the psychiatrists and Freudian casuists have to say about them, they are tough, mean and potentially as dangerous as a pack of wild boar. When push comes to shove, any leather fetishes or inadequacy feelings that may be involved are entirely beside the point, as anyone who has ever tangled with these boys will sadly testify. When you get in an argument with a group of outlaw motorcyclists, you can generally count your chances of emerging unmaimed by the number of heavy-handed allies you can muster in the time it takes to smash a beer bottle. In this league, sportsmanship is for old liberals and young fools. “I smashed his face,” one of them said to me of a man he’d never seen until the swinging started. “He got wise. He called me a punk. He must have been stupid.”
The most notorious of these outlaw groups is the Hell’s Angels, supposedly headquartered in San Bernardino, just east of Los Angeles, and with branches all over the state. As a result of the infamous “Labor Day gang rape,” the Attorney General of California has recently issued an official report on the Hell’s Angels. According to the report, they are easily identified:
The emblem of the Hell’s Angels, termed “colors,” consists of an embroidered patch of a winged skull wearing a motorcycle helmet. Just below the wing of the emblem are the letters “MC.” Over this is a band bearing the words “Hell’s Angels.” Below the emblem is another patch bearing the local chapter name, which is usually an abbreviation for the city or locality. These patches are sewn on the back of a usually sleeveless denim jacket. In addition, members have been observed wearing various types of Luftwaffe insignia and reproductions of German iron crosses.* (*Purely for decorative and shock effect. The Hell’s Angels are apolitical and no more racist than other ignorant young thugs.) Many affect beards and their hair is usually long and unkempt. Some wear a single earring in a pierced ear lobe. Frequently they have been observed to wear metal belts made of a length of polished motorcycle drive chain which can be unhooked and used as a flexible bludgeon… Probably the most universal common denominator in identification of Hell’s Angels is generally their filthy condition. Investigating officers consistently report these people, both club members and their female associates, seem badly in need of a bath. Fingerprints are a very effective means of identification because a high percentage of Hell’s Angels have criminal records.
In addition to the patches on the back of Hell’s Angel’s jackets, the “One Percenters” wear a patch reading “1%-er.” Another badge worn by some members bears the number “13.” It is reported to represent the 13th letter of the alphabet, “M,” which in turn stands for marijuana and indicates the wearer thereof is a user of the drug.
The Attorney General’s report was colorful, interesting, heavily biased and consistently alarming–just the sort of thing, in fact, to make a clanging good article for a national news magazine. Which it did; in both barrels. Newsweek led with a left hook titled “The Wild Ones,” Time crossed right, inevitably titled “The Wilder Ones.” The Hell’s Angels, cursing the implications of this new attack, retreated to the bar of the DePau Hotel near the San Francisco waterfront and planned a weekend beach party. I showed them the articles. Hell’s Angels do not normally read the news magazines. “I’d go nuts if I read that stuff all the time,” said one. “It’s all bullshit.”
Newsweek was relatively circumspect. It offered local color, flashy quotes and “evidence” carefully attributed to the official report but unaccountably said the report accused the Hell’s Angels of homosexuality, whereas the report said just the opposite. Time leaped into the fray with a flurry of blood, booze and semen-flecked wordage that amounted, in the end, to a classic of supercharged hokum: “Drug-induced stupors… no act is too degrading… swap girls, drugs and motorcycles with equal abandon… stealing forays… then ride off again to seek some new nadir in sordid behavior…”
Where does all this leave the Hell’s Angels and the thousands of shuddering Californians (according to Time) who are worried sick about them? Are these outlaws really going to be busted, routed and cooled, as the news magazines implied? Are California highways any safer as a result of this published uproar? Can honest merchants once again walk the streets in peace? The answer is that nothing has changed except that a few people calling themselves the Hell’s Angels have a new sense of identity and importance.
After two weeks of intensive dealings with the Hell’s Angels phenomenon, both in print and in person, I’m convinced the net result of the general howl and publicity has been to obscure and avoid the real issues by invoking a savage conspiracy of bogeymen and conning the public into thinking all will be “business as usual” once this fearsome snake is scotched, as it surely will be by hard and ready minions of the Establishment.•
Northwestern economist Robert Gordon may be too bearish on the transformative powers of the Internet, but he does make a good case that the technological innovations of a century ago dwarf the impact of the information revolution.
A well-written and sadly un-bylined Economist review of the academic’s new book, The Rise and Fall of American Growth, looks at how the wheels came off the U.S. locomotive in the 1970s, courtesy of the rise of global competition and OPEC along with increasing inequality on the homefront. Gordon is dour about the prospects of a new American century, believing technologists are offering thin gruel and that Moore’s Law is running aground. The reviewer thinks the economist is ultimately too dismissive of Silicon Valley.
The technological revolutions of the late 19th century transformed the world. The life that Americans led before that is unrecognisable. Their idea of speed was defined by horses. The rhythm of their days was dictated by the rise and fall of the sun. The most basic daily tasks—getting water for a bath or washing clothes—were back-breaking chores. As Mr Gordon shows, a succession of revolutions transformed every aspect of life. The invention of electricity brought light in the evenings. The invention of the telephone killed distance. The invention of what General Electric called “electric servants” liberated women from domestic slavery. The speed of change was also remarkable. In the 30 years from 1870 to 1900 railway companies added 20 miles of track each day. By the turn of the century, Sears Roebuck, a mail-order company that was founded in 1893, was fulfilling 100,000 orders a day from a catalogue of 1,162 pages. The price of cars plummeted by 63% between 1912 and 1930, while the proportion of American households that had access to a car increased from just over 2% to 89.8%.
America quickly pulled ahead of the rest of the world in almost every new technology—a locomotive to Europe’s snail, as Andrew Carnegie put it. In 1900 Americans had four times as many telephones per person as the British, six times as many as the Germans and 20 times as many as the French. Almost one-sixth of the world’s railway traffic passed through a single American city, Chicago. Thirty years later Americans owned more than 78% of the world’s motor cars. It took the French until 1948 to have the same access to cars and electricity that America had in 1912.
The Great Depression did a little to slow America’s momentum. But the private sector continued to innovate. By some measures, the 1930s were the most productive decade in terms of the numbers of inventions and patents granted relative to the size of the economy. Franklin Roosevelt’s government invested in productive capacity with the Tennessee Valley Authority and the Hoover Dam.
The second world war demonstrated the astonishing power of America’s production machine. After 1945 America consolidated its global pre-eminence by constructing a new global order, with the Marshall Plan and the Bretton Woods institutions, and by pouring money into higher education. The 1950s and 1960s were a golden age of prosperity in which even people with no more than a high-school education could enjoy a steady job, a house in the suburbs and a safe retirement.
But Mr Gordon’s tone grows gloomy when he turns to the 1970s.•
From Socrates to Snapchat, technology has been feared as a threat to human intelligence and memory, though it usually ends up making us better. In a smart Nautilus essay, Paul La Farge argues the Internet and e-books will not be the ruination of us, particularly our ability to read.
For someone like myself who was raised on printed matter, there’s a special joy in devouring paper books, but I don’t think a complete transition from that medium would be a disaster. I’m also not overly concerned about the absolute flood of information now available to all of us. I do think the brain can rewire to accommodate such a challenge, even if memory isn’t particularly elastic.
The medium is the message and our tools shape us after we shape them, sure, but I don’t think human learning is that simple, either. We seem awfully adept at choosing the information we want, regardless of the vehicle that delivers it. That process appears more internal than anything, for better or worse. It’s not that we’re bad now where we used to be good. We’ve always been a mix of those things and probably always will be.
La Farge’s opening:
In A History of Reading, the Canadian novelist and essayist Alberto Manguel describes a remarkable transformation of human consciousness, which took place around the 10th century A.D.: the advent of silent reading. Human beings have been reading for thousands of years, but in antiquity, the normal thing was to read aloud. When Augustine (the future St. Augustine) went to see his teacher, Ambrose, in Milan, in 384 A.D., he was stunned to see him looking at a book and not saying anything. With the advent of silent reading, Manguel writes,
… the reader was at last able to establish an unrestricted relationship with the book and the words. The words no longer needed to occupy the time required to pronounce them. They could exist in interior space, rushing on or barely begun, fully deciphered or only half-said, while the reader’s thoughts inspected them at leisure, drawing new notions from them, allowing comparisons from memory or from other books left open for simultaneous perusal.
To read silently is to free your mind to reflect, to remember, to question and compare. The cognitive scientist Maryanne Wolf calls this freedom “the secret gift of time to think”: When the reading brain becomes able to process written symbols automatically, the thinking brain, the I, has time to go beyond those symbols, to develop itself and the culture in which it lives.
A thousand years later, critics fear that digital technology has put this gift in peril.•
To many Americans, the sons of Bundy and their militia mates evoke one question: why? Why would a group of well-fed, so-called patriots think their own government, for whatever flaws it possesses, is the devil? It seems madness, alien to any rational interpretation. That could be because the terroristic behavior isn’t driven by facts but by faith, and one given to particularly violent tendencies. You don’t need religion to do something rash and scary, of course, but it can be a very potent ingredient in a toxic mix.
There has always been faith-fueled madness in the country, as best demonstrated in Gilbert Seldes’ book The Stammering Century, and Jon Krakauer believes the Oregon occupation is powered by the same spiritual madness that abetted the murders committed by Dan and Ron Lafferty, which he investigated in Under the Banner of Heaven.
In a Medium article, the author excerpts two passages from his 2004 book that are particularly pertinent to current events. An excerpt:
After Dan Lafferty read The Peace Maker in the early 1980s and resolved to start living the principle of plural marriage, he announced to his wife, Matilda, that he intended to wed her oldest daughter — his stepdaughter. At the last minute, however, he abandoned that plan and instead married a Romanian immigrant named Ann Randak, who took care of the horses on one of Robert Redford’s ranches up Spanish Fork Canyon, in the mountains east of the Dream Mine. Ann and Dan met when he borrowed a horse from her to ride in a local parade. She wasn’t LDS, says Dan, “but she was open to new experiences. Becoming my plural wife was her idea.” Ann, he adds, “was a lovely girl. I called her my gypsy bride.”
Living according to the strictures laid down in The Peace Maker felt good to Dan — it felt right, as though this really were the way God intended men and women to live. Inspired, Dan sought out other texts published by a well-known fundamentalist and Dream Mine backer, Ogden Kraut, about Mormonism as it was practiced in the early years of the church.
It didn’t take him long to discover that polygamy wasn’t the only divine principle the modern LDS Church had abandoned in its eagerness to be accepted by American society. Dan learned that in the 19th century, both Joseph Smith and Brigham Young had preached about the righteousness of a sacred doctrine known as “blood atonement:” Certain grievous acts committed against Mormons, as Brigham explained it, could only be rectified if the “sinners have their blood spilt upon the ground.” And Dan learned that Joseph had taught that the laws of God take precedence over the laws of men.
Legal theory was a subject of particular interest to Dan. His curiosity had first been aroused when he was training to be a chiropractor in California, following a run-in he had with state and county authorities. At the time, he supported his family primarily by running a small sandwich business out of their home. Dan, Matilda, and the oldest kids would get out of bed before dawn every morning in order to make and wrap stacks of “all natural” vegetarian sandwiches, which Dan would then sell to other chiropractic students during the lunch hour.
“It was a very profitable little hustle,” Dan says proudly. “Or it was until the Board of Health closed me down for not following regulations. They claimed I needed a license, and that I wasn’t paying the required taxes.” Just before he was put out of business, Matilda had given birth to a baby boy. Money was tight. Losing their main source of income was problematic. It also proved to be a pivotal event in Dan’s passage to fundamentalism.
“After they shut me down,” Dan recalls, “I didn’t know quite what to do. It didn’t seem right to me, that the government would penalize me just for being ambitious and trying to support my family — that they would actually force me to go on welfare instead of simply letting me run my little business. It seemed so stupid — the worst kind of government intrusion. In The Book of Mormon, Moroni talks about how all of us have an obligation to make sure we have a good and just government, and when I read that, it really got me going. It made me realize that I needed to start getting involved in political issues. And I saw that when it comes right down to it, you can’t really separate political issues from religious issues. They’re all tied up together.”•
Donald Trump doesn’t want to force menstruating women to wear burqas, but what else can he do? I mean, he’s a businessman.
It’s amusing to listen to the hideous hotelier try to torpedo Bernie Sanders with cheap insults the way he does his fellow Republicans, the Vermont Senator impervious to taunts like “wacko” because of his gravitas, common sense and sheer likability. It’s difficult to envision Sanders faring well in Southern primaries, but perhaps he’s awakening a positive populist energy that won’t readily go away, just as Trump has awakened an enduring hatred. As the Occupy movement framed the 2012 election season, maybe Sanders will frame this one. Is he part of an elongated prelude to something significant?
In 2003 I wrote in my book The New Ruthless Economy that one of the great imponderables of the twenty-first century was how long it would take for the deteriorating economic circumstances of most Americans to become a dominant political issue. It has taken over ten years but it is now happening, and its most dramatic manifestation to date is the rise of Bernie Sanders. While many political commentators seem to have concluded that Hillary Clinton is the presumptive Democratic nominee, polls taken as recently as the third week of December show Sanders to be ahead by more than ten points in New Hampshire and within single-figure striking distance of her in Iowa, the other early primary state.
Though he continues to receive far less attention in the national media than Hillary Clinton or Donald Trump, Sanders is posing a powerful challenge not only to the Democratic establishment aligned with Hillary Clinton, but also to the school of thought that assumes the Democrats need an establishment candidate like Clinton to run a viable campaign for president. Why this should be happening right now is a mystery for historians to unravel. It could be the delayed effect of the Great Recession of 2007-2008, or of economists Thomas Piketty and Emmanuel Saez’s unmasking of the vast concentration of wealth among the top 1 percent and the 0.1 percent of Americans, or just the cumulative effect of years of disappointment for many American workers.
Such mass progressive awakenings have happened before. I remember taking part in antiwar demonstrations on the East and West coasts in the Fall and Winter of 1967–1968. I noticed that significant numbers of solid middle-class citizens were joining in, sometimes with strollers, children, and dogs in tow. I felt at the time that this was the writing on the wall for Lyndon Johnson, as indeed it turned out to be. We may yet see such a shift away from Hillary Clinton, despite her strong performance in the recent debates and her recent recovery in the polls.
If it happens, it will be due in large part to Sanders’s unusual, if not unique, political identity.•
Vanderbilt historian Michael Bess, author of Our Grandchildren Redesigned, believes–fears, really–that we’re on the brink of a slew of technological and bioengineering breakthroughs which in the next few decades will do much good and be attended by many problems.
In the long run–if there is one–he’s right, but while these “games” will begin being played within his timeframe, I don’t really feel most of them will play out by then. For instance: Bess wonders what lifespans of 160 years or more will mean for marriage and family. That’s not likely to be a concern this century, and if life is eventually radically extended, family will have changed greatly numerous times by then.
Over the coming decades – probably a lot sooner than most people realize – the next great wave of technological change will wash over our lives. Its impact will be similar in sweep and rapidity to the advent of computers, cell phones, and the web; but this time around, it is not our gadgets that will be transformed – it is we ourselves, our bodies, our minds. This will be a shift that cuts even more deeply than the great industrial revolutions of the past. It will not only alter how we make a living, communicate, and interact with each other, but will offer direct and precise control over our own physical and mental states.
Through the use of pharmaceuticals, we are learning how to modulate our moods, boost our physical and mental performance, increase our longevity and vitality. Through the application of prostheses, implants, and other bioelectronic devices, we are not only healing the blind and the paralyzed, but beginning to reconfigure our bodies, enhance our memories, and generate entirely new ways of interacting with machines. Through genetic interventions, we are not only neutralizing certain diseases long thought incurable, but opening up the very real possibility of taking evolution into our own hands – redesigning the human “platform” of body and mind in a thoroughgoing way.
If you talk to the authors of this revolution – the scientists, doctors, and engineers who labor tirelessly at the vanguard of biotechnology – most of them will deny that this is what they have in mind. They are not seeking to bring about the transmogrification of the human species, they insist: they are simply doing their best to heal the sick, to repair the injured. But once you stand back and look at the big picture, sizing up the cumulative impact of all their brilliant efforts, a different conclusion emerges. Whether they intend it or not, they are giving our species the instruments with which to radically redesign itself. Those instruments are already becoming available in crude form today, and they will fully come into their own over the next few decades. By the time our grandchildren have grown to adulthood, this wave of change will have passed through our civilization.
Historian Yuval Harari thinks techno-religions are the future, with older testaments no longer relevant in a time of bioengineering and the like. Astronomy also poses a challenge for traditionalists: Will the discovery of life beyond Earth collapse the foundations of familiar faiths? I don’t know that any of it would matter to hardcore religionists somehow able to deny evolution in 2015, but more reasonable believers might rethink their beliefs if contact is made.
It seems there are two competing narratives between religion and astronomy. Religion is the story of how every single person is special, while astronomy is the long reveal of how our planet is not all that special. Is there room for coexistence?
David A. Weintraub:
It depends on what astronomers find, and then how different religions deal with that. The existence of alien life does not, in and of itself, threaten religion. A lot of religions are quite compatible with, even happy with, the idea that extraterrestrial life exists. There are only some religions that seem worried.
So, if aliens land in Times Square tomorrow, which ones are in trouble?
David A. Weintraub:
Let me step back for a moment and say that what I was writing about was not aliens in flying saucers making contact. What astronomers are doing is detecting chemical signatures in the atmosphere that say life is out there, which is very different from aliens climbing out of flying saucers and saying, “We’re here.” But the ones that would have problems are the most conservative forms of Christianity.
They put the most literal weight on the creation of humanity through God creating Adam and Eve—that the Garden of Eden was a literal place on the physical Earth, and that’s how intelligent beings were created. If there are intelligent beings from another place, that would threaten the idea that evolution doesn’t occur. Because either life somehow gets started in other places and evolves to become intelligent, or God made a decision to create intelligent life in some other place, and that would seem puzzling if we’re supposed to be the favored creatures.
What religions would be cool with it?
David A. Weintraub:
Judaism could care less. That has nothing to do with other intelligent beings. If God wants to create other beings, why should we care? Mormons seem to believe quite strongly there are intelligent beings elsewhere. Within the scriptural writings of Islam, there seem to be strong assertions of intelligent beings elsewhere. The same goes for Hindus and Buddhists. There doesn’t seem to be any contradiction for religions that believe in reincarnation. Reincarnation can happen anywhere in the universe, so why wouldn’t there be life elsewhere? There might be something special about being reincarnated in human form on Earth, a special opportunity for shedding bad karma or generating good karma, but in terms of simply the opportunity, reincarnation doesn’t preclude it from happening anywhere else in the universe.•
I learned to read on old copies of Mad and still have a special place in my heart for Harvey Kurtzman and Will Elder’s Melvin Mole (which introduced me to Existentialism long before I knew the term existed) and Elder’s haunting adaptation of Poe’s The Raven. These magazines were the gateway drug for the first two novels I read, Animal Farm and Wuthering Heights. William Gaines and the “usual gang of idiots” had prepared me well.
I never cared much for superhero comic books as a kid, but at the late, great Coliseum Books on Manhattan’s West Side, I came across copies of Raw and Art Spiegelman’s Maus, which wrecked my brain (in the best sense) the way Mad had. In my early adulthood, I would alternate reading these titles with books by Nathanael West, Dostoyevsky and Kafka. It was tremendous.
Raw editor Françoise Mouly, who has gone on to do stupendous work turning out New Yorker covers, is trying to enable future generations of readers with her own panel-centric imprint, Toon Books. Jeff MacGregor of Smithsonian interviewed her recently. An excerpt about literacy in the Digital Age:
Do you think it’s more useful to have these in school or to have them in the home?
You cannot, in this day and age, get them in the home. Everybody [used to] read newspapers, everybody read magazines, everybody read books. There were books in the home. Not media for the elite, [but] mass media. Books and magazines were as prevalent then as Facebook is, as Twitter is. That’s not the case anymore. Most kids at the age of 5 or 6 don’t see their parents picking up a newspaper or a magazine or a pulp novel or literary novel. So you know, [it becomes] “You must learn to read.” It’s completely abstract.
The libraries are playing an essential role. The librarians and the teachers were the ones removing comics from the hands of kids back in the ’60s and ’70s. Now it’s actually almost the other way around. Most kids discover books and comics, if they haven’t had them for the first five years of their lives, when they enter school. Because when they enter school, they are taken to the library. And librarians, once they open the floodgates, they realize, “Oh my God, the kids are actually asking to go to the library because they can sit on the floor and read comics.” You don’t have to force them — it’s their favorite time. So then what we try to do, when we do programs with schools, is try to do it in such a way that a kid can bring a book home because you want them to teach their parents.
Is there an electronic future for these?
One of my colleagues was saying e-books replaced cheap paperbacks and maybe that’s good. A lot of this disposable print can be replaced by stuff you didn’t want to keep. But when I read a book, I still want to have a copy of the book. I want it to actually not be pristine anymore, I want to see the stains from the coffee – not that I’m trying to damage my book, but I want it to have lived with me for that period of time. And similarly, I think that the kids need to have the book. It’s something they will hold in their hand, and they will feel the care we put into it. The moment I was so happy was when a little girl was holding one of the Toon Books, and she was petting it and closing her eyes and going, “I love this book, I love this book.” The sensuality of her appreciation for the book, I mean, that’s love.•
When the ground shifts beneath our feet, we tend to hold on for dear life. We try to retreat into the world we knew, even if it’s no longer viable.
Our communications have been reconfigured by the Second Machine Age, and it appears the same is happening to our economic model. That’s scary. It causes fingers to be pointed, blame assigned. Such transitions may be the natural order of things, but to mere mortals they can feel very unnatural. Is the realization of the Global Village, the prospect that so frightened Marshall McLuhan, what’s behind the ripples of nativism and violent expressions of fundamentalism across the globe?
Appearing recently at the Royal Geographical Society in London, Israeli historian Yuval Noah Harari, author of the wondrous Sapiens, argues that the ferocious reactions to hierarchical transformation at the outset of the Industrial Revolution in the 19th century are being replayed now in the early Digital Age. Harari feels certain that withdrawals into old orthodoxies will fail today as they did then, with ancient scriptures having no answers to new questions, leaving techno-ideologies to own the future.
If he hadn’t been in his prime in the 1960s, Terry Southern couldn’t have quite been Terry Southern as we know him. The era allowed him to stretch and bend, and he did what he could to warp it in return. The cultural explosion of those years and his own personality (perceptive, not protean) made it possible for the author to co-write with Kubrick and cover a political convention with Genet and Burroughs. Southern’s literary fantasia continued for decades, never betraying the unique time when his personal narrative began to be writ large.
It must have been a gas, to borrow one of his favorite terms, to get a letter from Terry Southern. Each was its own little acid trip, streaked with innuendo and poached in a satirical kind of intellectual flop sweat. He used thin, expensive paper and sealed some of his letters with wax. People were said to read them aloud to whoever was in the room.
It must further have been a groove, to use another of his favorite terms, to get a letter from Southern (1924-95) because he seemed to know everyone, from George Plimpton and Lenny Bruce to Ringo Starr and Dennis Hopper and had stories to tell.
It’s hard to sum up how brightly Southern’s star burned in the mid-1960s. A countercultural Zelig, he was nowhere and everywhere. Tom Wolfe credited Southern’s article “Twirling at Ole Miss,” published in Esquire in 1963, with jump-starting the New Journalism. Southern helped write the screenplay for Stanley Kubrick’s Dr. Strangelove (1964), injecting the software (wit) into the hardware (dread).•
Haven’t yet read Inventing the Future: Postcapitalism and a World Without Work by Nick Srnicek and Alex Williams, which I blogged about last month, though it’s on my list, my fucking list. Bookforum has published an excerpt. The authors are hopeful that a technological future–“Marxism basically dressed up with robotics,” as they’ve termed it–will free us from drudgery if we can ever unloose ourselves from the Puritan work ethic. I think regardless of work hours or mindset, the menial, physical or otherwise, will always be part of the human experience. There’s just something small about us.
THE RIGHT TO BE LAZY
One of the most difficult problems in implementing a universal basic income (UBI) and building a post-work society will be overcoming the pervasive pressure to submit to the work ethic. Indeed, the failure of the United States’ earlier attempt to implement a basic income was primarily because it challenged accepted notions about the work ethic of the poor and unemployed. Rather than seeing unemployment as the result of a deficient individual work ethic, the UBI proposal recognized it as a structural problem. Yet the language that framed the proposal maintained strict divisions between those who were working and those who were on welfare, despite the plan effacing such a distinction. The working poor ended up rejecting the plan out of a fear of being stigmatized as a welfare recipient. Racial biases reinforced this resistance, since welfare was seen as a black issue, and whites were loath to be associated with it. And the lack of a class identification between the working poor and unemployed—the surplus population—meant there was no social basis for a meaningful movement in favor of a basic income. Overcoming the work ethic will be equally central to any future attempts at building a post-work world. Neoliberalism has established a set of incentives that compel us to act and identify ourselves as competitive subjects. Orbiting around this subject is a constellation of images related to self-reliance and independence that necessarily conflict with the program of a post-work society. Our lives have become increasingly structured around competitive self-realization, and work has become the primary avenue for achieving this. Work, no matter how degrading or low-paid or inconvenient, is deemed an ultimate good. 
This is the mantra of both mainstream political parties and most trade unions, associated with rhetoric about getting people back into work, the importance of working families, and cutting welfare so that “it always pays to work.” This is matched by a parallel cultural effort demonizing those without jobs. Newspapers blare headlines about the worthlessness of welfare recipients, TV shows sensationalize and mock the poor, and the ever looming figure of the welfare cheat is continually evoked. Work has become central to our very self-conception—so much so that when presented with the idea of doing less work, many people ask, “But what would I do?” The fact that so many people find it impossible to imagine a meaningful life outside of work demonstrates the extent to which the work ethic has infected our minds.
While typically associated with the protestant work ethic, the submission to work is in fact implicit in many religions. These ethics demand dedication to one’s work regardless of the nature of the job, instilling a moral imperative that drudgery should be valued. While originating in religious ideas about ensuring a better afterlife, the goal of the work ethic was eventually replaced with a secular devotion to improvement in this life. More contemporary forms of this imperative have taken on a liberal-humanist character, portraying work as the central means of self-expression. Work has come to be driven into our identity, portrayed as the only means for true self-fulfilment. In a job interview, for instance, everyone knows the worst answer to “Why do you want this job?” is to say “Money,” even as it remains the repressed truth. Contemporary service work heightens this phenomenon. In the absence of clear metrics for productivity, workers instead put on performances of productivity—pretending to enjoy their job or smiling while being yelled at by a customer. Working long hours has become a sign of devotion to the job, even as it perpetuates the gender pay gap. With work tied so tightly into our identities, overcoming the work ethic will require us overcoming ourselves.
The central ideological support for the work ethic is that remuneration be tied to suffering. Everywhere one looks, there is a drive to make people suffer before they can receive a reward.•
Writer Lucian Truscott IV is one of the figures featured in the latest 3 Videos, and here’s a little more about him from a 1979 People piece penned by Cheryl McCall at the outset of his very abbreviated marriage to writer-photographer Carol Troy. In an age when people cared at least somewhat about print journalists, the couple was apparently, fleetingly, an F. Scott and Zelda, which is a mixed blessing, of course. An excerpt:
Lucian Truscott IV and Carol Troy both write. His current book is the best-selling novel Dress Gray; hers is Cheap Chic Update. But literary achievement isn’t the only reason the New York Times compared them, a little waspishly, to Scott and Zelda Fitzgerald. Take Truscott and Troy’s enthusiasm for disagreement.
When they met in 1975 at a party in his New York loft, she found him “awfully gruff.” The following year they were fixed up by a mutual friend. It started off disastrously. “Vassar girls and West Point guys hated each other,” ex-cadet Truscott recalls. “We wouldn’t dance with them at mixers,” Troy (Vassar ’66) explains. They went to dinner at a Japanese restaurant—”Dutch,” Troy says dryly, “and got into a huge fight.” Truscott agrees: “Sparks were flying,” and then adds, “We didn’t know they were sparks of love.”
As befits New York’s literary darlings, they were married in a Roman Catholic church in the artsy SoHo district this past St. Patrick’s Day. Then 250 guests, including Norman Mailer, were bused uptown with champagne aboard to the swank Lotos Club for the reception. (“Our only salvation is in extravagance,” Fitzgerald once wrote.)
Bride and bridegroom are not only handsome and well-thought-of; they’re rich. Dress Gray, a thriller about homosexuality and murder at the military academy, earned $1.4 million before a copy was sold—thanks to subsidiary rights negotiated by the author without an agent. Paramount bought the movie option and Gore Vidal is writing the screenplay.
“I wanted to go to West Point my whole life,” says Truscott, 32. His grandfather, Gen. Lucian K. Truscott Jr., was a World War II hero who commanded the Allied landing at Anzio Beach in Italy. Lucian III was West Point ’45, retiring as a colonel to become a watchmaker in 1971. Lucian’s mother, Anne, is a medical secretary; he’s the eldest of five children. The family lived in eight states, Germany and Japan, and Lucian recalls: “I grew up liking Army officers. I bagged their groceries, I washed their cars, I mowed their lawns.”
At West Point he was, however, less than a model cadet. In his sophomore year he began a letter-writing campaign to New York’s Village Voice. One epistle, he remembers, contained the line: “Jerry Rubin is palpably full of sh**.” On campus he challenged compulsory chapel attendance (it was found unconstitutional three years after he graduated).
But Truscott’s most serious transgression was getting caught—with three other cadets—using a telephone credit card number that reportedly belonged to the left-wing Students for a Democratic Society. “Hell, I wasn’t calling a subversive,” Truscott claims. “I was calling my grandmother.” Nevertheless, West Point slapped him with 30 demerits for “gross lack of judgment.” Truscott barely graduated—658th in a class of 800.
He began serving his five-year Army commitment in 1969 as an infantry lieutenant at Fort Carson, Colo. There he wrote an article on heroin addiction among enlisted men for the Voice, in which he admitted he had smoked marijuana. That, plus a refusal to serve on courts-martial because “they were patently unfair and ridiculous,” led to his resignation and a general discharge under “other than honorable conditions” in 1970. Conservative military columnist Col. Robert Heinl wrote that Truscott had “disparaged and derogated” West Point’s creed: “Duty, honor, country.”
Truscott settled on a barge in New Jersey and joined the Voice staff, freelancing on the side. Five years later he met Troy. The daughter of Francis Troy, a Borden executive, and his wife, Bernice, she grew up living American Graffiti in the suburbs of San Francisco. Dolled up in tight skirts, sweater sets and Weejuns, she liked to cruise in her parents’ hot-pink Mercury with black interior (she still owns it). After Vassar and studying film at Stanford, she turned journalist, working for Newsday, Oui and a pre-publication issue of People, among others.
During this time she made a virtue of scrimping, developing the skills she later wrote about in Cheap Chic. (It was a hit even though Troy recalls Barbara Walters describing the book on the Today show as “written only for skinny young girls who didn’t have jobs.”) Now she and Truscott visit flea markets and garage sales to furnish their New York loft and a $100,000 Victorian carriage house in Sag Harbor.
Lucian, purposely avoiding military subjects, has begun a novel about a businesswoman. “Writing doesn’t have to be a painful, gut-wrenching experience, the 3 a.m. of the soul that Fitzgerald talked about,” he says cheerfully. “I like the experience of writing.” Troy, 34, is doing a screenplay about the fashion industry and pondering a magazine editing job.
Though Fiat heiress Delfina Rattazzi has thrown a party for them and they rate a table at Manhattan’s celebrity feeding trough, Elaine’s, Truscott and Troy have an unpretentious side. Evenings they may show slides or reminisce about souvenir matchbooks and place mats. They hang out in unsung places like the Spring Street Bar in SoHo. Carol takes modern dance classes and when in Sag Harbor Lucian body-surfs. He gave up tennis, which he learned at West Point from Lt. Arthur Ashe, and skiing because “that stuff has become so chichi.”
They expect to have children within the decade, though Troy isn’t quivering with anticipation. “I don’t know anything about kids,” she says, “because I was an only child. But I’m sure Lucian will be a good father. I don’t know how many we’ll have. They make so much noise. One sounds like a lot.”•
Lucian Truscott IV, the great, great, great, great grandson of Thomas Jefferson and graduate of the United States Military Academy, began his writing career penning pieces on hippies and heroin addiction, eventually making his mark at the Village Voice and Rolling Stone. In 1972, he was assigned by the former to review Hunter S. Thompson’s genius, drug-fuelled phantasmagoria Fear and Loathing in Las Vegas. An excerpt:
Hunter Thompson lived in Aspen then, and his ranch, located outside town about 10 miles, tucked away up a valley with National Forest land on every side, was the first place I stopped. It was late afternoon and Thompson was just getting up, bleary-eyed and beaten, shaded from the sun by a tennis hat, sipping a beer on the front porch.
I got to know him while I was still in the Army in the spring of 1970, when he and a few other local crazies were gearing up for what would become the Aspen Freak Power Uprising, a spectacular which featured Thompson as candidate for sheriff, with his neighbor Billy for coroner. They ran on a platform which promised, among other things, public punishment for drug dealers who burned their customers, and a campaign guaranteed to rid the valley of real estate developers and ‘nazi greedheads’ of every persuasion. In a compromise move toward the end of the campaign, Thompson promised to “eat mescaline only during off-duty hours.” The non-freak segment of the voting public was unmoved and he was eventually defeated by a narrow margin.
In the days before the Freak Power spirit, Thompson’s ranch served as a war room and R&R camp for the Aspen political insurgents. Needless to say there was rarely a dull moment. When I arrived last summer, however, things had changed. Thompson was in the midst of writing a magnum opus, and it was being cranked out at an unnerving rate. I was barely across the threshold when I was informed that he worked (worked?) Monday through Friday and saved the weekends for messing around. As usual, he worked from around midnight until 7 or 8 in the morning and slept all day. There was an edge to his voice that said he meant business. This was it. This was a venture that had no beginning or end, that even Thompson himself was having difficulty controlling.
“I’m sending it off to Random House in 20,000-word bursts,” he said, drawing slowly on his ever-present cigarette holder. “I don’t have any idea what they think of it. Hell, I don’t have any idea what it is.”
“What’s it about?” I asked.
“Searching for The American Dream in Las Vegas,” replied Thompson coolly.•
In 1974, Truscott, again representing the Voice, tagged along with another gonzo character, Evel Knievel, at the time of his Snake River Canyon spacecycle jump, a spectacle promoted (in part) by professional wrestling strongman Vince McMahon Jr. Truscott shows up in this awesome video at 6:22, giving the event all the respect it deserved while simultaneously summing up his reporting career. (Because of privacy settings, you have to click through and watch it on the Vimeo site.)
Oriana Fallaci did as much serious journalism as anyone during her era, but she wasn’t above the lurid if the story was good and the check likely to clear. Case in point: Her 1967 Look magazine article “The Dead Body and the Living Brain,” about pioneering head-transplant experimentation. In the piece, Fallaci reports on the sci-fi-ish experiments that Prof. Robert White was conducting with rhesus monkeys at a time when consciousness about animal rights was on the rise. The opening:
Libby had eaten her last meal the night before: orange, banana, monkey chow. While eating she had observed us with curiosity. Her hands resembled the hands of a newly born child, her face seemed almost human. Perhaps because of her eyes. They were so sad, so defenseless. We had called her Libby because Dr. Maurice Albin, the anesthetist, had told us she had no name, we could give her the name we liked best, and because she accepted it immediately. You said “Libby!” and she jumped, then she leaned her head on her shoulder. Dr. Albin had also told us that Libby had been born in India and was almost three years, an age comparable to that of a seven-year-old girl. The rhesuses live 30 years and she was a rhesus. Prof. Robert White uses the rhesus because they are not expensive; they cost between $80 and $100. Chimpanzees, larger and easier to experiment with, cost up to $2,000 each. After the meal, a veterinarian had come, and with as much ceremony as they use for the condemned, he had checked to be sure Libby was in good health. It would be a difficult operation and her body should function as perfectly as a rocket going to the moon. A hundred times before, the experiment had ended in failure, and though Professor White became the first man in the entire history of medicine to succeed, the undertaking still bordered on science fiction. Libby was about to die in order to demonstrate that her brain could live isolated from her body and that, so isolated, it could still think.•
Fallaci wasn’t always insightful when assessing her subjects, missing out entirely on Indira Gandhi’s dictatorial leanings and Alfred Hitchcock’s deep seediness, but she was accurate in her judgment of Muammar el-Qaddafi when conversing with that shock jock Charlie Rose in 2003.
In Fran Lebowitz’s 1993 Paris Review Q&A, the writer’s maternal nature, or something like it, came to the fore. An excerpt:
Young people are often a target for you.
I wouldn’t say that I dislike the young. I’m simply not a fan of naïveté. I mean, unless you have an erotic interest in them, what other interest could you have? What are they going to possibly say that’s of interest? People ask me, Aren’t you interested in what they’re thinking? What could they be thinking? This is not a middle-aged curmudgeonly attitude; I didn’t like people that age even when I was that age.
Well, what age do you prefer?
I always liked people who are older. Of course, every year it gets harder to find them. I like people older than me and children, really little children.
Out of the mouths of babes comes wisdom?
No, I’m just intrigued by them, because, to me, they’re like talking animals. Their consciousness is so different from ours that they constitute a different species. They don’t have to be particularly interesting children; just the fact that they are children is sufficient. They don’t know what anything is, so they have to make it up. No matter how dull they are, they still have to figure things out for themselves. They have a fresh approach.•
In this 1977 Canadian talk show, Lebowitz, selling her book Metropolitan Life, was concerned that digital watches and calculators and other new technologies entitled kids (and adults, too) to a sense of power they should not have. She must be pleased with smartphones today.
Simon Winchester told the New York Times about the reading that’s important to him and the kind that isn’t. An excerpt:
Which writers — novelists, essayists, critics, journalists, poets — working today do you admire most?
Billy Collins; Paul Muldoon; Ian Buruma; William Boyd; Simon Schama; Paul Theroux; Pico Iyer; Salman Rushdie.
What genres do you especially enjoy reading?
I’m unashamedly drawn to tales of the remote, the lonely and the hard — like Willa Cather on Nebraska or Ivan Doig on Montana. The Icelandic Nobelist Halldor Laxness, with his “Independent People,” still is, for me, the supreme example. But I also like railway murder stories and timetable mysteries, especially those involving Inspector French and his Dublin-born creator, Freeman Wills Crofts.
And which do you avoid?
Frankly, anything that has the name Derrida in it.
What kinds of stories are you drawn to?
I enjoy the bizarre and the fantastic — Georges Perec’s Life: A User’s Manual, or Borges and his “Tlön, Uqbar, Orbis Tertius,” which I still think one of the cleverest things I’ve read. I also want to revive the reputation of the detective writer John Franklin Bardin, whose books are so richly insane that you feel your own sanity slipping away as you read, The Deadly Percheron being a fine instance.
And which do you avoid?
Sensible people tell me I should like stories with zombies, but try as I might, I don’t.•
For a 1975 “Talk of the Town” piece in the New Yorker, Anthony Hiss toured Los Angeles, that strange and fascinating turf, enjoying near journey’s end an audience with Philip K. Dick, whose visions weren’t fully appreciated during his abbreviated lifetime and were even sort of undersold in this article. An excerpt:
In the afternoon, we drove over to Fullerton to see Philip K. Dick, my favorite science-fiction writer, author of 33 novels and 170 short stories. Past the House of Egg Roll, past Moy’s Coffee Shop (Chop Suey, Hot Cakes), past Bowser Beautiful, through Bel Air. We drove to the end of Sunset Boulevard, where we saw seagulls, 18 surfers in wet suits, a blue suggestion of Catalina to the southwest, and an Indian girl in a green-and-gold sari on the beach. Then south, past a concrete wall painted ‘TOMMY SURKO SAYS FOR MY KIND OF GIRL THERE’S ONLY ONE! TOMMY SURKO!’ Behind the tall palms on Venice we could see snow on the mountains. Kids were skateboarding down a hill on Lincoln. Past Woody’s Smorgasburger, onto a freeway to Fullerton.
Philip K. Dick lives in an apartment full of books and records and photographs with his wife, Tessa; his small son, Christopher; and two cats, Harvey Wallbanger and Sasha. He is jolly and tubby and bearded. His books, which are hilarious, are popular in France, because the French think they are about how grim everything is. Dick showed us a French newspaper piece about him—the subtitles were ‘Le Chaos,’ ‘L’Acide,’ ‘Le Suicide,’ ‘Les Machines,’ ‘La Société Totalitaire,’ ‘La Paranoïa.’ Dick has just finished a book about Tim Leary and the LSD crowd, and what happened to them.
We had stopped in to make a short call of homage, and wound up talking along for hours, drinking wine, and Tessa going out for some Chinese food, and then talking about cosmologies until it was almost time for our plane back to N.Y. The apartment also contains a two-foot-high metal rocket ship on a wooden base—this is his Hugo Award, the highest award in science fiction. The plaque is missing, though, because Dick once used the award to break up a fight. ‘It grabs good,’ he says. As for the cosmologies, this is what emerged from our discussions: cosmologies all seem to be based on repetition—you know, first the universe expands, then it contracts, then it expands again, etc.—but maybe that’s not so. Maybe this whole expansion business that the universe is currently embarked upon is going to happen only once. That would mean that every day really is a new day, right? Also, maybe it’s not true that Einstein was smarter than Newton. Maybe Newton’s laws accurately described the universe as it then existed. But since then it’s expanded and got more complicated, and can be accurately described only by Einstein physics. Which will eventually become outdated, maybe.•
In 1977, PKD described tangential, alternative worlds he felt might have existed in reality–or perhaps just in his mind.
One who sees the curtain coming down sooner rather than later is the Christian evangelist and dapper apocalypse salesman Hal Lindsey, co-author with Carole C. Carlson of the meshuganah 1970 bestseller, The Late Great Planet Earth, which pegged 1988 as Judgment Day. Missed by that much. Lindsey, who’s still alive as are many of us, spends his dotage accusing President Obama of being “the Antichrist.” Whatever.
In 1979, when the batshit book had been made into a film–with Orson Welles picking up late-life wine-and-bullfight money for handling the narration–Lindsey was profiled in a People piece by Lucretia Marmon. The opening:
In 1938 Orson Welles terrified radio listeners with War of the Worlds, an imaginative report of a Martian invasion. Now Welles, as gloomy-voiced narrator of a film, The Late Great Planet Earth, out this fall, tells another frightening tale. This time it is a movie version of the end of the world, based on a scenario by evangelist-author Hal Lindsey. The script, claims Lindsey, really isn’t his. It’s all in the Scriptures.
Lindsey’s book Earth, published in 1970, has been translated into 31 languages and 10 million copies have been sold. The public also snapped up five subsequent Lindsey books on the same subject, running his sales total to over 14 million.
Thus Lindsey, 47, may now be the foremost modern-day Jeremiah. ‘If I had been writing 15 years ago I wouldn’t have had an audience,’ he concedes. ‘But a tremendous number of people are worried about the future. I’m just part of that phenomenon.’
Lindsey splices Bible prophecies of doom with contemporary signs. For instance, he says the Bible pinpoints Israel’s rebirth as a nation as the catalyst to Judgment Day, which will probably occur by 1988. The intervening years will see the emergence of a 10-nation confederacy (prophet Daniel’s dreadful 10-horned beast) or, as Lindsey sees it, the European Common Market. Eventually Russia (biblical Magog) will attack Israel and precipitate a global nuclear war. Only Jesus’ followers will be spared. Hence, Lindsey advises, “the only thing you need to understand is that God offers you in Jesus Christ a full pardon.”
Meanwhile, is Lindsey cowering in his fallout shelter? Not at all. Sporting a gold Star of David around his neck and another on his pinky (‘After all, Jesus was a Jew’), Lindsey zips around Southern California in a Mercedes 450 SL. He conducts services on the beach and indulges in his hobbies of photography and surfing.•
There’s nothing quite like the IBT columns of antisocial antivirus expert John McAfee, pieces that read like PKD-esque fever dreams propelled by acute paranoia, actual knowledge and perhaps pharmaceuticals. In a recent article, he warned that electromagnetic pulse (EMP) generators could be used to destroy an American city at any moment. An excerpt:
EMPs can be generated in many ways. Much has been said about nuclear EMPs, but that threat concerns me far less than other, more specific means of generating EMPs. The US recently announced our own EMP weapon, which can be carried aboard a missile. Using a technology based on hydraulically compressing and decompressing rods made of specific elements, the device is able to create multiple EMPs very quickly.
The weapon can be focused to take out individual buildings within a city and can take out dozens of individual buildings in a single pass of the missile. I will admit that such technology is beyond the reach of the average individual. But what if the individual is not concerned with precision strikes and merely wants to take out an entire city block or the entire city? Well, that technology is readily available, cheap, and simple to construct.
I am not going to give a course on constructing EMP weapons. I am only trying to raise the awareness of the world to a real and imminent threat.
I also received many questions about how an EMP could kill people. The answer is easy. A large-scale localized attack that involved all of our power stations would leave us all permanently without power. An attack that included our water processing plants would leave us without potable water, except that which we could purchase at the supermarket.
Localized attacks on food processing plants, attacks on mass transportation and attacks on centralized communication organizations would leave us without food and communications. Attacks on oil processing plants would ultimately leave us without individual transportation. What percentage of the population do you think would survive such a catastrophe? And all of this without a single nuclear explosion.•
In our facacta political season, McAfee is, of course, running for President, decrying the cyber illiteracy of the average Washington representative. Despite being an erstwhile murder suspect, he’s not even close to the most deplorable candidate. Here he is in September announcing his campaign to Greta Van Susteren, a Scientologist with an unsustainable face.
Racing legend Jackie Stewart was king of a sport in which his competitors–his friends–kept dying, one after another, on the dangerous-as-can-be courses of the ’60s and early ’70s. The opening of Robert F. Jones’s 1973 Sports Illustrated article “There Are Two Kinds of Death”:
Contrasted with the current woes of the real world—the new Arab-Israeli war, the old Watergate maunderings—it might have seemed a week of minor tragedy on the Grand Prix circuit. But for John Young Stewart, 34, the finest road racer in the game, it was perhaps the most agonizing week of his life. A month earlier, at the Italian Grand Prix at Monza, Stewart had captured his third world driving championship in five years. During the course of this racing season he had become the most successful Formula I driver ever, with 27 Grand Prix victories to his credit (compared with 25 for his late Scottish countryman, Jim Clark, and 24 for his idol, Argentina’s Juan Manuel Fangio). And certainly Stewart had outdone both of them in the main chance of racing: money.
Jackie Stewart is the canniest man ever to don a fireproof balaclava—and certainly the gutsiest ever to con a sponsor. Earning close to $1 million a season in prize money and other emoluments, Stewart seemed to have turned motor racing into some kind of a private treasure trove—and survived to enjoy it. Then why not retire?
That was the first source of his agony last weekend. At Watkins Glen for the 15th running of the U.S. Grand Prix, Stewart played coy with the question. Indeed, even his business agent claimed that the wee Scot was hung on the horns of that old sportsman’s dilemma: quit on a peak of success, or press on to try for even greater rewards? The business agent also was well aware that the timing of a retirement statement by a figure so prominent as Stewart could bring in lots of bucks, and perhaps the coyness was merely a question of timing to suck up more cash. “If Jackie were single,” said his lovely wife Helen, “there would be no question. He would continue to race. I would like to see him retire, but I cannot press him. No, there is nothing that could fill the role of racing for him if he were to quit.”
Stewart himself was brusque on the question. He sidestepped it with every slick word at his command—and they are as many and as evasive as the black grouse of Scotland’s moors. But still it all seemed a game.
Then, on qualifying day before the race, Stewart’s good friend and teammate, Francois Cevert, was killed in a smashup during practice. Stewart had already lost three close friends to the sport: Clark in 1968, Piers Courage and Jochen Rindt in 1970. In his poignant account of that last tragic season in his recent book, Faster! A Racer’s Diary, Stewart had likened Grand Prix racing to a disease and wondered in painful print if he himself were not a victim. With Cevert’s death last Saturday, it seemed to many that Stewart must at last accept the prognosis. He must—finally—retire and let sad enough alone.•
A 1973 documentary about Formula One racing, known at various times as One by One, Quick and the Dead, and Champions Forever, this interesting period piece with a funked-up score focuses on Stewart, Peter Revson and their peers. Stacy Keach is the cool-as-can-be narrator, but Cevert sums it up simply and best, admitting, “steering is hard.”
My favorite book published in the U.S. in 2015 is Sapiens, a brilliant work about our past (and future) by Israeli historian Yuval Noah Harari. In a New Statesman essay, the author argues that if we’re on the precipice of a grand human revolution–in which we commandeer evolutionary forces and create a post-scarcity world–it’s being driven by private-sector technocracy, not politics, that attenuated, polarized thing. The next Lenins, the new visionaries focused on large-scale societal reorganization, Harari argues, live in Silicon Valley, and even if they don’t succeed, their efforts may significantly impact our lives. An excerpt:
Whatever their disagreements about long-term visions, communists, fascists and liberals all combined forces to create a new state-run leviathan. Within a surprisingly short time, they engineered all-encompassing systems of mass education, mass health and mass welfare, which were supposed to realise the utopian aspirations of the ruling party. These mass systems became the main employers in the job market and the main regulators of human life. In this sense, at least, the grand political visions of the past century have succeeded in creating an entirely new world. The society of 1800 was completely destroyed and we are living in a new reality altogether.
In 1900 or 1950 politicians of all hues thought big, talked big and acted even bigger. Today it seems that politicians have a chance to pursue even grander visions than those of Lenin, Hitler or Mao. While the latter tried to create a new society and a new human being with the help of steam engines and typewriters, today’s prophets could rely on biotechnology and supercomputers. In the coming decades, technological breakthroughs are likely to change human society, human bodies and human minds in far more drastic ways than ever before.
Whereas the Nazis sought to create superhumans through selective breeding, we now have an increasing arsenal of bioengineering tools at our disposal. These could be used to redesign the shapes, abilities and even desires of human beings, so as to fulfil this or that political ideal. Bioengineering starts with the understanding that we are far from realising the full potential of organic bodies. For four billion years natural selection has been tinkering and tweaking with these bodies, so that we have gone from amoebae to reptiles to mammals to Homo sapiens. Yet there is no reason to think that sapiens is the last station. Relatively small changes in the genome, the neural system and the skeleton were enough to upgrade Homo erectus – who could produce nothing more impressive than flint knives – to Homo sapiens, who produces spaceships and computers. Who knows what the outcome of a few more changes to our genome, neural system and skeleton might be? Bioengineering is not going to wait patiently for natural selection to work its magic. Instead, bioengineers will take the old sapiens body and intentionally rewrite its genetic code, rewire its brain circuits, alter its biochemical balance and grow entirely new body parts.
On top of that, we are also developing the ability to create cyborgs.•
In a London TED Talk from earlier this year, Harari details why Homo sapiens came to rule the world, and why that development wasn’t always such a sure bet.