
In America, quantity matters.

Peter Thiel’s billions have impressed all manner of serious people, economists and social scientists and politicos, the dollar signs making them somehow ignore that this “genius” was sure that there were WMDs in Iraq and certain that an unqualified sociopath should lead the nation. He may be smart about business, but he’s stupid about life, and it’s the kind of stupid that can get people killed. Thiel’s a rich man and a poor one.

His fellow Silicon Valley billionaire Mark Zuckerberg is also sanctified for large numbers. Not only does he have oodles of money, but reportedly close to two billion people are active on Facebook, that quasi-nation, which is part surveillance state and also the world’s largest sweatshop. 

Despite years of dubious business moves, comments and “social experiments,” Zuckerberg may not be a bad person, but he also isn’t necessarily a wise one, despite what the shallowest of scoreboards may say. His recent religious revival and 50-state “listening tour” provoked speculation that he’d watched another (perhaps) billionaire celebrity snake his way into the Oval Office and decided he wanted to get into the game. He certainly would be better than Trump, but maybe we should actually elect someone who’s qualified?

But why settle for a petty bureaucrat’s position like President when you can lord over a multi-national empire?

In Zuckerberg’s recent 5,700-word position paper, “Building Global Community,” he asserts that his company must lead the way in building an Earth-sized social fabric, something that doesn’t take into consideration that a) many of us want no part of Facebook, b) many of its users possess bigoted and anti-social views, c) having for-profit companies play such a role invites enormous potential for abuse and d) there’s no substitute for good government or actual (rather than virtual) political movements. Moreover, social media is as much a bane to democracy as a boon–and that may be a hopeful reading of its effects–so such an initiative may be akin to treating a poisoning victim with more poison.

The opening of Nicholas Carr’s outstanding Rough Type post about the Facebook founder’s massive missive, which eviscerates its “self-serving fantasy about social relations”:

The word “community” appears, by my rough count, 98 times in Mark Zuckerberg’s latest message to the masses. In a post-fact world, truth is approached through repetition. The message that is transmitted most often is the fittest message, the message that wins. Verification becomes a matter of pattern recognition. It’s the epistemology of the meme, the sword by which Facebook lives and dies.

Today I want to focus on the most important question of all: are we building the world we all want?

It’s a good question, though I’m not sure there is any world that we all want, and if there is one, I’m not sure Mark Zuckerberg is the guy I’d appoint to define it. And yet, from his virtual pulpit, surrounded by his 86 million followers, the young Facebook CEO hesitates not a bit to speak for everyone, in the first person plural. There is no opt-out to his “we.” It’s the default setting and, in Zuckerberg’s totalizing utopian vision, the setting is hardwired, universal, and nonnegotiable.

Our greatest opportunities are now global — like spreading prosperity and freedom, promoting peace and understanding, lifting people out of poverty, and accelerating science. Our greatest challenges also need global responses — like ending terrorism, fighting climate change, and preventing pandemics. Progress now requires humanity coming together not just as cities or nations, but also as a global community.  …

Facebook stands for bringing us closer together and building a global community. When we began, this idea was not controversial.

The reason the idea  — that community-building on a planetary scale is practicable, necessary, and altogether good — did not seem controversial in the beginning was that Zuckerberg, like Silicon Valley in general, operated in a technological bubble, outside of politics, outside of history. Now that history has broken through the bubble and upset the algorithms, history must be put back in its place. Technological determinism must again be made synonymous with historical determinism.•


I read Sean Penn’s “El Chapo Speaks” at the beginning of 2016, and spent the rest of the year trying to absorb as many great articles as I could to erase from my mind the awful reporting and prose. “Espinoza is the owl who flies among falcons,” wrote the actor-director-poetaster. Yes, Sean, okay, but go fuck yourself.

The following 50 articles made me feel pretty good again. In time, I myself once more began to fly among the falcons.

Congratulations to all the wonderful writers who made the list. My apologies for not reading more small journals and sites, but the time and money of any one person, myself included, is limited.


1) “Latina Hotel Workers Harness Force of Labor and of Politics in Las Vegas” and 2) “A Fighter’s Hour of Need” (Dan Barry, New York Times)

As good as any newspaper writer–or whatever you call such people now–Barry reports and composes like a dream. The first piece has as good a kicker as anyone could come up with–even if life subsequently kicked back in a shocking way–and the second is a heartbreaker about the immediate aftermath of a 2013 boxing match in which Magomed Abdusalamov suffered severe brain damage.

Even when Barry shares a byline, I still feel sure I can pick out his sentences, so flawless and inviting are they. One example of that would be…

3) “An Alt-Right Makeover Shrouds the Swastikas” by Barry, Serge F. Kovaleski, Julie Turkewitz and Joseph Goldstein.

An angle used to dismiss the idea that the Make America Great White Again message resonated with a surprising, depressing number of citizens has been to point out that some Trump supporters also voted for Obama. That argument seems simplistic. Some bigots aren’t so far gone that they can’t vote for a person of a race they dislike if they feel it’s in their best interests financially or otherwise. That is to say, some racially prejudiced whites voted for President Obama. Trump appealed to them to find their worst selves. Many did.

Likewise the Trump campaign emboldened far worse elements, including white nationalists and separatists and anti-Semites. Thinking they’d been perhaps permanently marginalized, these hate groups are now updating their “brand,” hiding yesterday’s swastikas and burning crosses and other “bad optics,” and referring to themselves not as neo-Nazis but by more vaguely appealing monikers like “European-American advocates.” It’s the same monster wrapped in a different robe, the mainstreaming of malevolence, and they won’t again be easily relegated to the fringe regardless of Trump’s fate.

This group of NYT journalists explores a beast awakened and energized by Trump’s ugly campaign. It’s a great piece, though we should all probably stop calling these groups by their preferred KKK 2.0 alias of “alt-right.”

4) “No, Trump, We Can’t Just Get Along” (Charles Blow, New York Times)

In the hours after America elected, if barely, a Ku Klux Kardashian, most pundits and talk-show hosts encouraged all to support this demagogue, as if we could readily forget that he was a racist troll who demanded the first African-American President show his birth certificate, a deadbeat billionaire who didn’t pay taxes or many of his contracted workers, a draft-dodger who mocked our POWs while praising Putin, a sexual predator who boasted about his assaults, a xenophobe who blamed Mexicans and Muslims, a bigot who had a long history of targeting African-Americans with the zeal of a one-man lynching bee. In a most passionate and lucid shot across the bow, Blow said “no way,” penning an instant classic, speaking for many among the disenfranchised majority. 

5) “Lunch with the FT: Burning Man’s Larry Harvey” (Tim Bradshaw, Financial Times)

If self-appointed Libertarian overlord Grover Norquist, a Harvard graduate with a 13-year-old’s understanding of government and economics, ever had his policy preferences enacted fully, it would lead to worse lifestyles and shorter lifespans for the majority of Americans. In fact, we now get to see many of his idiotic ideas played out in real-life experiments. He’s so eager to Brownback the whole country he’s convinced himself, despite being married to a Muslim woman, there’s conservative bona fides in Trump’s Mussolini-esque stylings and suspicious math.

In 2014, Norquist made his way to the government-less wonderland known as Burning Man, free finally from those bullying U.S. regulations, the absence of which allows Chinese business titans to breathe more freely, if not literally. Norquist’s belief that the short-term settlement in the Nevada desert is representative of what the nation could be every day is no less silly than considering Spring Break a template for successful marriage. He was quoted as saying: “Burning Man is a refutation of the argument that the state has a place in nature.” Holy fuck, who passed him the peyote?

In his interview piece, Bradshaw broke bread in San Francisco with Harvey, co-founder of Burning Man and its current “Chief Philosophic Officer,” who speaks fondly of rent control and the Bernie-led leftward shift of the Democratic Party. Norquist would not approve, even if Harvey is a contradictory character, insisting he has a “conservative sensibility” and lamenting the way many involved in social justice fixate on self esteem.

6) “The World Wide Cage” and 7) “Humans Have Long Wished to Fly Like Birds: Maybe We Shall” (Nicholas Carr, Aeon)

One of the best critics of our technological society keeps getting better.

The former piece is the introduction to Carr’s essay collection Utopia Is Creepy. The writer argues (powerfully) that we’ve defined “progress as essentially technological,” even though the Digital Age quickly became corrupted by commercial interests, and the initial thrill of the Internet faded as it became “civilized” in the most derogatory, Twain-ish use of that word. To Carr, the something gained (access to an avalanche of information) is overwhelmed by what’s lost (withdrawal from reality). The critic applies John Kenneth Galbraith’s term “innocent fraud” to the Silicon Valley marketing of techno-utopianism. 

You could extrapolate this thinking to much of our contemporary culture: binge-watching endless content, Pokémon Go, Comic-Con, fake Reality TV shows, reality-altering cable news, etc. Carr suggests we use the tools of Silicon Valley while refusing the ethos. Perhaps that’s possible, but I doubt you can separate such things.

The latter is a passage about biotechnology which wonders if science will soon move too fast not only for legislation but for ethics as well. The “philosophy is dead” assertion that’s persistently batted around in scientific circles drives me bonkers because we dearly need consideration about our likely commandeering of evolution. Carr doesn’t make that argument but instead rightly wonders if ethics is likely to be more than a “sideshow” when garages aren’t used to just hatch computer hardware or search engines but greatly altered or even new life forms. The tools will be cheap, the “creativity” decentralized, the “products” attractive. As Freeman Dyson wrote nearly a decade ago: “These games will be messy and possibly dangerous.”

8) “Calum Chace: Ask Me Anything” (Chace, Reddit)

The writer, an all-around interesting thinker, conducted an AMA based on his book, The Economic Singularity, which envisions a future–and not such a far-flung one–when human labor is a thing of the past. It’s certainly possible since constantly improving technology could make fleets of cars driverless and factories workerless. In fact, there’s no reason why they can’t also be ownerless. 

What happens then? How do we reconcile a free-market society with an automated one? In the long run, it could be a great victory for humanity, but getting from here to there will be bumpy.

9) “England’s Post-Imperial Stress Disorder(Andrew Brown, Boston Globe)

Not being intimately familiar with the nuances of the U.K.’s politics and culture, I’m wary of assigning support for Brexit to ugly nativist tendencies, but it does seem a self-harming act provoked by the growing pains of globalism. It’s not nearly as dumb a move as a President Trump, for instance, but some of the same forces are at play, particularly when it comes to the pro-Brexit, anti-immigration UKIP party.

It’s not shocking that Britain and the U.S. are trying to dodge the arrival of a new day and greater competition, a time when empires can’t merely strike back at will. We’re richer now, we have better things, but the distribution is very uneven and we feel poor inside. For some, maybe a surprising number, blame must be assigned to the “others.” As Randy Newman sang: “The end of an empire is messy at best.”

10) “My President Was Black” (Ta-Nehisi Coates, The Atlantic)

It wasn’t the color of President Obama’s suit that so bothered his critics but the color of his skin. Sure, Bill Clinton was impeached and John Kerry swiftboated, but there was something so deeply disqualifying about the antagonism that faced 44, something beyond mere partisanship, which boiled over into Birtherism, interruptions during the State of the Union, denial of his Christian faith and vicious insults hurled at his gorgeous wife.

The old adage that black people have to be twice as good at a job as white people proved to be mathematically refutable: The Obamas were a million times better, and it wasn’t nearly enough for their detractors. When Obama even mildly suggested that institutional racism still existed, something he rarely did, he was labeled a “jerk” by prominent Republicans. Worse yet, his most overtly bigoted tormentor will succeed him in the White House. 

That raises an obvious question: If the perfect son isn’t good enough, then what kind of chance do his siblings have?

In a towering essay, Coates reflects on Obama’s history and the “fitful spasmodic years” of his White House tenure, which had pluses and minuses but were a gravity-defying time of true accomplishment which will never happen the same way again. In addition to macro ideas about race and identity, Coates’ writing on the Justice Department under this Administration is of particular importance.

11) “The Problem With Obama’s Faith in White America” (Tressie McMillan Cottom, The Atlantic)

Hope is usually audacious but sometimes misplaced.

Without that feeling of expectation in a country founded on white supremacy that has never erased institutional racism, Barack Hussein Obama would certainly never have been elected President of the United States, not once, let alone twice. But his hope has also served as an escape hatch for white Americans who wanted to ignore not only the past but also the present. By stressing the best in us, Obama overlooked the worst of us, and that worst has never gone away.

It’s doubtful he behaved this way merely due to political opportunism: Obama seems a true believer in America and the ideals it espouses but has never lived up to. I love him and Michelle and think they’re wonderful people, but the nation has never been as good as they are, and even on a good day I’m unsure we even aspire to be. A painfully true Atlantic essay by Cottom meditates on these ideas.

12) “We’re Coming Close to the Point Where We Can Create People Who Are Superior to Others” (Hannah Devlin, The Guardian)

Devlin interviews novelist Kazuo Ishiguro, who wonders if liberal democracy will be doomed by a new type of wealth inequality, the biological kind, in which gene editing and other tools make enhancement and improved health available only to the haves. Ishiguro isn’t a fatalist on the topic, encouraging more public engagement.

Some believe exorbitantly priced technologies created for the moneyed few will rapidly decrease in price and make their way inside everyone’s pockets (and bodies and brains), the same distribution path blazed by consumer electronics. That’s possible but certainly not definite. Of course, as the chilling political winds of 2016 have demonstrated, liberal democracy may be too fragile to even survive to that point.

13) “The Privacy Wars Are About to Get a Whole Lot Worse” (Cory Doctorow, Locus Magazine)

Read the fine print. That’s always been good advice, but it’s never been taken seriously when it comes to the Internet, a fast-moving, seemingly ephemeral medium that doesn’t invite slowing down to contemplate. So companies attach a consent form to their sites and apps about cookies. No one reads it, and clicking “agree” quietly removes any legal recourse when your laptop or smartphone is plundered for all your personal info. It’s the legal fig leaf of surveillance capitalism.

In an excellent piece, Doctorow explains how this oversight, which has already had serious consequences, will snake its way into every corner of our lives once the Internet of Things turns every item into a computer, cars and lamps and soda machines and TV screens. “Notice and consent is an absurd legal fiction,” he writes, acknowledging that it persists despite its ridiculous premise and invasive nature.

14) “The Green Universe: A Vision” (Freeman Dyson, New York Review of Books)

I’ve probably enjoyed Dyson’s “pure speculation” writings as much as anything I’ve read during my life, particularly the Imagined Worlds lecture and his NYRB essays and reviews. In this piece, the physicist goes far beyond his decades-old vision of an “Astrochicken” (a spacecraft that’s partly biological), conjuring a baseball-sized, biotech Noah’s Ark that can “seed” the Universe with millions of species of life. “Sometime in the next few hundred years, biotechnology will have advanced to the point where we can design and breed entire ecologies of living creatures adapted to survive in remote places away from Earth,” he writes. It’s a spectacular dream, though we may bury ourselves beneath water or ash long before it can come to fruition, especially with the threat of climate change.

15) “The Augmented Human Being: A Conversation With George Church” (Edge)

CRISPR’s surprising success has swept us into an age when it all seems possible: the manipulation of humans, animals and plants, even perhaps of extinct species. Which way forward?

The geneticist Church, who has long had visions of rejuvenated woolly mammoths and augmented humans, realizes some bristle at manipulation of the Homo sapiens germline because it calls into question all we are, but apart from metaphors, there are also very real practical concerns over the games getting messy and possibly dangerous. The good (diseases being edited out of existence, organs being tailored to transplantees, etc.) shouldn’t be dreams permanently deferred, but it is difficult to understand how bad applications will be contained. Of course, the negative will probably unfold regardless, so we owe it to ourselves to pursue the positive, if carefully. Church himself is on board with a cautious approach but not one that’s unduly so.

16) “The Empty Brain” (Robert Epstein, Aeon)

Since the 16th century, the human brain has often been compared to a machine of one sort or another; today it is likened to a computer. The idea that the brain is a machine seems true, though the notion that gray matter operates in a way similar to the gadgets that currently sit atop our laps or in our palms is likely false.

In a wonderfully argumentative and provocative essay, psychologist Epstein says this reflexive labeling of human brains as information processors is a “story we tell to make sense of something we don’t actually understand.” He doesn’t think the brain is tabula rasa but asserts that it doesn’t store memories the way an Apple device would.

It’s a rich piece of writing full of ideas and examples, though I wish Epstein would have relied less on the word “never” (e.g., “we will never have to worry about a human mind going amok in cyberspace”), because while he’s almost certainly correct about the foreseeable future, given enough time no one knows how the machines in our heads and pockets will change.

17) “North Korea’s One-Percenters Savor Life in ‘Pyonghattan’” (Anna Fifield, The Washington Post)

Even in Kim Jong-un’s totalitarian state there are haves and have-nots who experience wildly different lifestyles. In the midst of the politically driven arrests and murders, military parades and nuclear threats, there exists a class of super rich kids familiar with squash courts, high-end shopping and fine dining. “Pyonghattan,” it’s called, this sphere of Western-ish consumerist living, which is, of course, just a drop in the bucket when compared to the irresponsible splurges of the Rodman-wrangling “Outstanding Leader.” Still weird, though.

18) “Being Leonard Cohen’s Rabbi” (Rabbi Mordecai Finley, Jewish Journal)

The poet of despair, who lived for a time in a monastery, spent some of his last decade discussing spirituality and more earthly matters with the Los Angeles-based rabbi, who explains how the Jewish tradition informed Cohen’s work. “We shared a common language, a common nightmare,” he writes. One remark the prophet of doom made to Finley hits especially hard with the demons awakened during this election season: “You won’t like what comes next after America.”

19) “Five Books Interview: Ellen Wayland-Smith Discusses Utopias” (Five Books)

In a smart Q&A, Wayland-Smith, author of Oneida, talks about a group of titles on the topic of Utopia. She surmises that attempts at such communities aren’t prevalent like they were in the 1840s or even the 1960s because most of us realize they don’t normally end well, whether we’re talking about the bitter financial and organizational failures of Fruitlands and Brook Farm or the utter madness of Jonestown. That’s true on a micro-community level, though I would argue that there have never been more people dreaming of large-scale Utopias–and corresponding dystopias–than there are right now. The visions have just grown significantly in scope.

In macro visions, Silicon Valley technologists speak today of an approaching post-scarcity society, an automated, quantified, work-free world in which all basic needs are met and drudgery has disappeared into a string of zeros and ones. These thoughts were once the talking points of those on the fringe, say, a teenage guru who believed he could levitate the Houston Astrodome, but now they (and Mars settlements, a-mortality and the computerization of every object) are on the tongues of the most important business people of our day, billionaires who hope to shape the Earth and beyond into a Shangri-La. 

Perhaps much good will come from these goals, and maybe a few disasters will be enabled as well. 

20) “Sam Altman’s Manifest Destiny” (Tad Friend, New Yorker)

Friend’s “Letter from California” articles in the New Yorker are probably the long-form journalism I most anticipate, because he’s so good at understanding distinct milieus and those who make them what they are, revealing the micro and macro of any situation or subject and sorting through psychological motivations that drive the behavior of individuals or groups. To put it concisely: He gets ecosystems.

The writer’s latest effort, a profile of Y Combinator President Sam Altman, a stripling yet a strongman, reveals someone who has almost no patience for or interest in most people yet wants to save the world–or something.

It’s not a hit job, as Altman really has no intent to offend or injure, but it vivisects Silicon Valley’s Venture Capital culture and the outrageous hubris of those insulated inside its wealth and privilege, the ones who nod approvingly while watching Steve Jobs use Mahatma Gandhi’s image to sell wildly marked-up electronics made by sweatshop labor, and believe they also can think different.

When envisioning the future, Altman sees perhaps a post-scarcity, automated future where a few grand a year of Universal Basic Income can buy the jobless a bare existence (certainly not the big patch of Big Sur he owns), or maybe there’ll be complete societal collapse. Either or. More or less. If the latter occurs, the VC wunderkind plans to flee the carnage by jetting to the safety of his New Zealand spread with Peter Thiel, who has a moral blind spot reminiscent of Hitler’s secretary. A grisly death seems preferable. 

21) “The Secret Shame of Middle-Class Americans” (Neal Gabler, The Atlantic)

The term “middle class” was not always a nebulous one in America. It meant that you had arrived on solid ground and only the worst luck or behavior was likely to shake the earth beneath your feet. That’s become less and less true over the past four decades, as a number of factors (technology, globalization, tax codes, the decline of unions, the 2008 economic collapse, etc.) have conspired to hollow out this hallowed ground. You can’t arrive someplace that barely exists.

Middle class is now what you think you would be if you had any money. George Carlin’s great line that “the reason they call it the American Dream is because you have to be asleep to believe it” seems truer every day. It’s not so much a fear of falling anymore, but the fear of never getting up, at least not within the current financial arrangement. Those hardworking, decent people you see every day? They’re just as afraid as you are. They are you.

In the spirit of the great 1977 Atlantic article “The Gentle Art of Poverty” and William McPherson’s recent Hedgehog Review piece “Falling,” the excellent writer and film critic Gabler has penned an essay about his “secret shame” of being far poorer than appearances would indicate.

22) “Nate Parker and the Limits of Empathy” (Roxane Gay, The New York Times)

We have to separate the art and the artist or we’ll end up without a culture, but it’s not always so easy to do. There was likely no more creative person who ever walked the Earth than David Bowie, whose death kicked off an awful 2016, yet the guy did have sex with children. And Pablo Picasso beat women, Louis-Ferdinand Céline was an anti-Semite, Anne Sexton molested her daughter and so on. In Gay’s smart, humane op-ed, she looks at the controversy surrounding Birth of a Nation writer-director Parker, realizing she can’t compartmentalize her feelings about creators and creations. Agree with her or not, but it’s certainly a far more suitable response than Stephen Galloway’s shockingly amoral Hollywood Reporter piece on the firestorm.

23) “The Case Against Reality” (Amanda Gefter, The Atlantic)

A world in which Virtual Reality is in wide use would present a different way to see things, but what if reality is already not what we think it is? It’s usually accepted that we don’t all see things exactly the same way–not just metaphorically–and that our individual interpretation of stimuli is more a rough cut than an exact science. It’s a guesstimate. But things may be even murkier than we believe. Gefter interviews cognitive scientist Donald D. Hoffman, who thinks our perception isn’t even a reliable simulacrum, that what we take in is nothing like what actually is. It requires just a few minutes to read and will provoke hours of thought.

24) “Autocracy: Rules for Survival” (Masha Gessen, New York Review of Books)

For many of us the idea of a tyrant in the White House is unthinkable, but for some that’s all they can think about. These aren’t genuinely struggling folks in the Rust Belt whose dreams have been foreclosed on by the death rattle of the Industrial Age and who made a terrible decision that will only deepen their wounds, but a large number of citizens with fairly secure lifestyles who want to unleash their fury on a world not entirely their own anymore.

I’ve often wondered how Nazi Germany was possible, and I think this election has finally provided me with the answer. There has to be pervasive prejudice, sure, and it helps if there is a financially desperate populace, but I also think it’s the large-scale revenge of mediocrity, of people wanting to establish an order where might, not merit, will rule.

Gessen addresses the spooky parallels between Russia and this new U.S. as we begin what looks to be a Trump-Putin bromance. Her advice to those wondering if they’re being too paranoid about what may now occur: “Believe the autocrat.”

25) “The Future of Privacy” (William Gibson, New York Times)

What surprises me most about the new abnormal isn’t that surveillance has entered our lives but that we’ve invited it in.

For a coupon code or a “friend,” we’re willing to surrender privacy to a corporate state that wants to engage us, know us, follow us, all to better commodify us. In fact, we feel sort of left out if no one is watching.

It may be that in a scary world we want a brother looking after us even if it’s Big Brother, so we’ve entered into an era of likes and leaks, one that will only grow more profoundly challenging when the Internet of Things becomes the thing.

In a wonderful essay, Gibson considers privacy, history and encryption, those thorny, interrelated topics.

26) “Why You Should Believe in the Digital Afterlife” (Michael Graziano, The Atlantic)

When Russian oligarch Dmitry Itskov vows that by 2045 we’ll be able to upload our consciousness into a computer and achieve a sort of immortality, I’m perplexed. Think about the unlikelihood: It’s not a promise to just create a general, computational brain–difficult enough–but to precisely simulate particular human minds. That ups the ante by a whole lot. While it seems theoretically possible, this process may take a while.

The Princeton neuroscientist Graziano plots the steps required to encase human consciousness, to create a second life that sounds a bit like Second Life. He acknowledges opinions will differ over whether we’ve generated “another you” or some unsatisfactory simulacrum, a mere copy of an original. Graziano’s clearly excited, though, by the possibility that “biological life [may become] more like a larval stage.”

27) “Big Data, Google and the End of Free Will” (Yuval Noah Harari, The Financial Times)

First we slide machines into our pockets, and then we slide into theirs.

As long as humans have roamed the Earth, we’ve been part of a biological organism larger than ourselves. At first, we were barely connected parts, but gradually we became a Global Village. In order for that connectivity to become possible, the bio-organism gave way to a technological machine. As we stand now, we’re moving ourselves deeper and deeper into a computer, one with no OFF switch. We’ll be counted, whether we like it or not. Some of that will be great, and some not.

The Israeli historian examines this new normal, one that’s occurred without close study of what it will mean for the cogs in the machine–us. As he writes, “humanism is now facing an existential challenge and the idea of ‘free will’ is under threat.”

28) “How Howard Stern Owned Donald Trump” (Virginia Heffernan, Politico Magazine)

Whether it’s Howard Stern or that other shock jock Vladimir Putin, Donald Trump’s deep-seated need for praise has made him a mark for those who know how to push his buttons. In the 1990s, when the hideous hotelier was at a career nadir, he was a veritable Wack Packer, dropping by the Stern show to cruelly evaluate women and engage in all sorts of locker-room banter. Trump has dismissed these un-Presidential comments as “entertainment,” but his vulgarity off-air is likewise well-documented. He wasn’t out of his element when with the King of All Media but squarely in it. And it wasn’t just two decades ago. Up until 2014, Trump was still playing right along, allowing himself to be flattered into conversation he must have realized on some level was best avoided.

For Stern, who’s become somewhat less of an asshole as Trump has become far more of one, the joke was always that ugly men were sitting in judgment of attractive women. The future GOP nominee, however, was seemingly not aware he was a punchline. He’s a self-described teetotaler who somehow has beer goggles for himself. During this Baba Booey of an election season, Heffernan wrote knowingly of the dynamic between the two men.

29) “I’m Andrew Hessel: Ask Me Anything” (Hessel, Reddit)

If you like your human beings to come with fingers and toes, you may be disquieted by this undeniably heady AMA conducted by Hessel, a futurist and “biotechnology catalyst” at Autodesk. The researcher fields questions about a variety of flaws and illnesses plaguing people that biotech may be able to address, even eliminate. Of course, depending on your perspective, humanness itself can be seen as a failing, something to be “cured.”

30) “What If the Aliens We Are Looking For Are AI?” (Richard Hollingham, BBC Future)

If there are aliens out there, Sir Martin Rees feels fairly certain they’re conscious machines, not oxygen-hoarding humans. It’s just too inhospitable for carbon beings to travel beyond our solar system. He allows that perhaps cyborgs, a form of semi-organic post-humans, could possibly make a go of it. But that’s as close a reflection of ourselves as we may be able to see in space. Hollingham explores this theory, wondering if a lack of contact can be explained by the limits we put on our search by expecting a familiar face in the final frontier.

31) “We Are Nowhere Close to the Limits of Athletic Performance” (Stephen Hsu, Nautilus)

If performance-enhancing drugs weren’t at all dangerous to the athletes using them, should they be banned?

I bet plenty of people would say they should, bowing before some notion of competitive purity which has never existed. It’s also a nod to “god-given ability,” a curious concept in an increasingly agnostic world. Why should those born with the best legs and lungs be the fastest? Why should the ones lucky enough to have the greatest gray matter at birth be our best thinkers? Why should those fortunate to initially get the healthiest organs live the longest? It doesn’t make much sense to hold back the rest of the world out of respect for a few winners of the genetics lottery.

Hsu relates how genetic engineering will supercharge athletes and the rest of us, making widely available the gifts of Usain Bolt, who gained his from hard work, sure, but also a twist of fate. In fact, extrapolating much further, he believes “speciation seems a definite possibility.”

32) “How Democracies Fall Apart” (Andrea Kendall-Taylor and Erica Frantz, Foreign Affairs)

If we are hollow men (and women), American liberty, that admittedly unevenly distributed thing, may be over after 240 years. And it could very well end not with a bang but a whimper.

Those waiting for the moment when autocracy topples the normal order of things are too late. Election Day was that time. It’s not guaranteed that the nation transforms into 1930s Europe or that we definitely descend into tyranny, but the conditions have never been more favorable in modern times for the U.S. to capitulate to autocracy. The creeps are in office, and the creeping will be a gradual process. Don’t wait for an explosion; we’re living in its wake.

Kendall-Taylor and Frantz analyze how quietly freedom can abandon us.

33) Khizr Khan’s Speech to the 2016 Democratic National Convention (Khan, DNC)

Ever since Apple’s Think Different ad in 1997, the one in which Steve Jobs used Gandhi’s image to sell marked-up consumer electronics made by sweatshop labor, Silicon Valley business titans have been celebrated the way astronauts used to be. Jobs, who took credit for that advertising campaign which someone else created, specifically wondered why we put on a pedestal those who voyage into space when he and his clever friends were changing the world–or something–with their gadgets. He believed technologists were the best and brightest Americans. He was wrong.

Some of the Valley’s biggest names filed dourly into Trump Tower recently in a sort of reverse perp walk. It was the same, sad spectacle of Al Gore’s pilgrimage, which was answered with Scott Pruitt, climate-change denier, being chosen EPA Chief. Perhaps they made the trek on some sort of utilitarian impulse, but I would guess there was also some element of self-preservation, not an unheard-of sense of compromise for those who see their corporations as if they were countries, not only because of their elephantine “GDPs,” but also because of how they view themselves. I don’t think they’re all Peter Thiel, an emotional leper and intellectual fraud who now gets to play out his remarkably stupid theories in a large-scale manner. I’ve joked that Thiel has a moral blind spot reminiscent of Hitler’s secretary, but the truth is probably far darker.

What would have been far more impressive would have been if Musk, Cook, Page, Sandberg, Bezos and the rest stopped downstairs in front of the building and read a statement saying that while they would love to aid any U.S. President, they could not in this case because the President-Elect has displayed vicious xenophobia, misogyny and callous disregard for non-white people throughout the campaign and in the election’s aftermath. He’s shown totalitarian impulses and has disdain for the checks and balances that make the U.S. a free country. In fact, with his bullying nastiness he continues to double down on his prejudices, which has been made very clear not only by his words but by his cabinet appointments. They could have stated their dream for the future doesn’t involve using Big Data to spy on Muslims and Mexicans or programming 3D printers to build internment camps on Mars. They might have noted that Steve Bannon, whom Trump chose as his Chief Strategist, just recently said that there were too many Asian CEOs in Silicon Valley, revealing his white-nationalistic ugliness yet again. They could have refused to normalize Trump’s odious vision. They could have taken a stand.

They didn’t because they’re not our absolute finest citizens. Khizr and Ghazala Khan, who understand the essence of the nation in a way the tech billionaires do not, more truly represent us at our most excellent. They possess a wisdom and moral courage that’s as necessary as the Constitution itself. The Silicon Valley folks lack these essential qualities, and without them, you can’t be called our best and brightest.

And maybe Khan’s DNC speech is our ultimate Cassandra moment, when we didn’t listen, or maybe we did but when we looked deep inside for our better angels we came up empty. Regardless, he told the truth beautifully and passionately. When we went low, he went high.

34) “The Perfect Weapon: How Russian Cyberpower Invaded the U.S.” (Eric Lipton, David E. Sanger and Scott Shane, The New York Times)

It was thought that the Russian hacking of the U.S. Presidential election wasn’t met with an immediate response because no one thought Trump really had a chance to win, but the truth is the gravity of this virtual Watergate initially took even many veteran Washington insiders by surprise. This great piece of reportage provides deep and fascinating insight into one of the jaw-dropping scandals of an outrageous election season, which has its origins in the 1990s.

35) “Goodbye to Barack Obama’s World” (Edward Luce, The Financial Times)

“He must be taken seriously,” Luce wrote in the Financial Times in December 2015 of Donald Trump, as the anti-politician trolled the whole of America with his Penthouse-Apartment Pinochet routine, which seems to have been more genuine than many realized.

Like most, the columnist believed several months earlier that the Reality TV Torquemada was headed for a crash, though he rightly surmised the demons Trump had so gleefully and opportunistically awakened, the vengeful pangs of those who longed to Make America Great White Again, were not likely to dissipate.

But the dice were kind to the casino killer, and a string of accidents and incidents enabled Trump and the mob he riled to score enough Electoral College votes to turn the country, and world, upside down. It’s such an unforced error, one which makes Brexit seem a mere trifle, that it feels like we’ve permanently surrendered something essential about the U.S., that more than an era has ended.

In this post-election analysis, Luce looks forward for America and the whole globe and sees possibilities that are downright ugly.

36) “The Writer Who Was Too Strong To Live” (Dave McKenna, Deadspin)

A postmortem about Jennifer Frey, a journalistic prodigy of the 1990s who burned brilliantly before burning out. A Harvard grad who was filing pieces for newspapers before she was even allowed to drink–legally, that is–Frey was a full-time sportswriter for the New York Times by 24, out-thinking, out-hustling and out-filing even veteran scribes at a clip that was all but impossible. Frey seemed to have it all and was positioned to only get more.

Part of what she had, though, that nobody knew about, was bipolar disorder, which she self-medicated with a sea of alcohol. Career, family and friends gradually floated away, and she died painfully and miserably at age 47. The problem with formidable talent as much as with outrageous wealth is that it can be forceful enough to insulate a troubled soul from treatment. Then, when the fall finally occurs, as it must, it’s too late to rise once more.

37) “United States of Paranoia: They See Gangs of Stalkers” (Mike McPhate, The New York Times)

Sometimes mental illness wears the trappings of the era in which it’s experienced. Mike Jay has written beautifully in the last couple of years about such occurrences attending the burial of Napoleon Bonaparte and the current rise of surveillance and Reality TV. The latter is something of a Truman Show syndrome, in which sick people believe they’re being observed, that they’re being followed. To a degree, they’re right, we all are under much greater technological scrutiny now, though these folks have a paranoia which can drive such concerns into crippling obsessions.

Because we’re all connected now, the “besieged” have found one another online, banding together as “targeted individuals” who’ve been marked by the government (or some other group entity) for observation, harassment and mind control. McPhate’s troubling article demonstrates that the dream of endless information offering lucidity has been dashed for a surprising number of people, that the inundation of data has served to confuse rather than clarify. These shaky citizens resemble those with alien abduction stories, except they seem to have been “shanghaied” by the sweep of history.

38) “The Long-Term Jobs Killer Is Not China. It’s Automation.” (Claire Cain Miller, The New York Times)

Many people nowadays wonder what will replace capitalism, but I believe capitalism will be just fine.

You and me, however, we’re fucked.

The problem is that an uber technologized version of capitalism may not require as many of us or value as highly those who’ve yet to be relieved of their duties. Perhaps a thin crust at the very top will thrive, but without sound policy the rest may be Joads with smartphones. In this scenario, we’d be tracked and commodified, given virtual trinkets rather than be paid. Our privacy, like many of our jobs, will disappear into the zeros and ones.

While the orange supremacist was waving his penis in America’s face during the campaign, the thorny question of what to do should widespread automation be established was left unexplored. That’s terrifying, since more and more outsourcing won’t refer to work moved beyond borders but beyond species. Certainly great investment in education is required, but that won’t likely be enough. Not every freshly unemployed taxi driver can be upskilled into a driverless car software engineer. There’s not enough room on that road.

Miller, a reporter who understands both numbers and people in a way few do, analyzes how outsourcing will increasingly refer to work not moved beyond borders but beyond species.

39) “Nothing To Fear But Fear Itself” (Sasha Von Oldershausen, Texas Monthly)

Surveillance is a murky thing almost always attended by a self-censorship, quietly encouraging citizens to abridge their communication because perhaps someone is watching or listening. It’s a chilling of civil rights that happens in a creeping manner. Nothing can be trusted, not even the mundane, not even your own judgment. That’s the goal, really, of such a system–that everyone should feel endlessly observed.

The West Texas border reporter finds similarities between her stretch of America, which feverishly focuses on security from intruders, and her time spent living under theocracy in Iran.

40) “Madness” (Eyal Press, The New Yorker)

“By the nineties, prisons had become America’s dominant mental-health institutions,” writes Press in this infuriating study of a Florida correctional facility in which guards tortured, brutalized, even allegedly murdered, inmates–and employed retaliatory measures against mental health workers who complained. Prison reform is supposedly one of those issues that has bipartisan support, but very little seems to get done in rehabilitating a system that warehouses many nonviolent offenders and mentally ill people among those who truly need to be incarcerated. It seems a breakdown of the institution but is more likely a perpetuation of business as it was intended to be. Either way, the situation needs all the scrutiny and investigation journalists can muster.

41) “It May Not Feel Like Anything To Be an Alien” (Susan Schneider, Nautilus)

Until deep into the twentieth century, most popular dreams of ETs usually centered on biology. We wanted new friends that reminded us of ourselves or were even cuter. When we accepted we had no Martian doppelgangers, a dejected resignation set in. Perhaps some sort of simple cellular life existed somewhere, but what thin gruel to digest.

Then a new reality took hold: Maybe advanced intelligence exists in space as silicon, not carbon. It’s postbiological.

If there are aliens out there, maybe they’re conscious machines, not oxygen-hoarding humans, since it’s just too inhospitable for beings like us to travel beyond our solar system. Perhaps cyborgs, a form of semi-organic post-humans, could make a go of it, but that’s as close a reflection of ourselves as we may be able to see in space.

Soon enough, that may be true as well on Earth, a relatively young planet on which intelligence may be in the process of shedding its mortal coil. Another possibility: Perhaps intelligence is also discarding consciousness.

Schneider’s smart article asserts that “soon, humans will no longer be the measure of intelligence on Earth” and tries to surmise what that transition will mean.

42) “Schadenfreude with Bite” (Richard Seymour, London Review of Books)

The problem with anarchy is that it has a tendency to get out of control.

In 2013, Eric Schmidt, the most perplexing of Googlers, wrote (along with Jared Cohen) the truest thing about our newly connected age: “The Internet is the largest experiment involving anarchy in history.”

Yes, indeed.

California was once a wild, untamed plot of land, and when people initially flooded the zone, it was exciting if harsh. But then, soon enough: the crowds, the pollution, the Adam Sandler films. The Golden State became civilized with laws and regulations and taxes, which was a trade-off but one that established order and security. The Web has been commodified but never been truly domesticated, so while the rules don’t apply it still contains all the smog and noise of the developed world. Like Los Angeles without the traffic lights.

Our new abnormal has played out for both better and worse. The fan triumphed over the professional, a mixed development that, yes, spread greater democracy on a surface level, but also left truth attenuated. Into this unfiltered, post-fact, indecent swamp slithered the troll, that witless, cowardly insult comic.

The biggest troll of them all, Donald Trump, the racist opportunist who stalked our first African-American President demanding his birth certificate, is succeeding Obama in the Oval Office, which is terrible for the country if perfectly logical for the age. His Lampanelli-Mussolini campaign also emboldened all manner of KKK 2.0, manosphere and neo-Nazi detritus in their own trolling, as they used social media to spread a discombobulating disinformation meant to confuse and distract so hate could take root and grow. No water needed; bile would do.

In this wonderfully written essay, Seymour analyzes the discomfiting age of the troll.

43) “An American Tragedy” (David Remnick, The New Yorker)

It happened here, and Remnick, who spent years covering the Kremlin and many more thinking about the White House, was perfectly prepared to respond to a moment he hoped would never arrive. As the unthinkable was still unfolding and most felt paralyzed by the American embrace of a demagogue, the New Yorker EIC urgently warned of the coming normalization of the incoming Administration, instantly drawing a line that allowed for myriad voices to demand decency and insist on truth and facts, which is our best safeguard against the total deterioration of liberal governance.

44) “This Is New York in the Not-So-Distant Future” (Andrew Rice, New York)

Some sort of survival mechanism allows us to forget the full horror of a tragedy, and that’s a good thing. That fading of facts makes it possible for us to go on. But it’s dangerous to be completely amnesiac about disaster.

Case in point: In 2014, Barry Diller announced plans to build a lavish park off Manhattan at the pier where Titanic survivors came to shore. Dial back just over two years before that, to another waterlogged disaster, when Hurricane Sandy struck the city, and imagine such an island scheme even being suggested then. The wonder at that point was whether Manhattan was long for this world. Diller’s designs don’t sound much different from the captain of a supposedly unsinkable ship ordering a swimming pool built on the deck just after the ship hit an iceberg.

Rice provides an excellent profile of scientist Klaus Jacob, who believes NYC, as we know it, has no future. The academic could be wrong, but if he isn’t, his words about the effects of Irene and Sandy are chilling: “God forbid what’s next.”

45) “The Newer Testament” (Robyn Ross, Texas Monthly)

A Lone Star State millennial using apps and gadgets to disrupt Big Church doesn’t really seem odder than anything else in this hyperconnected and tech-happy entrepreneurial age, when the way things have always been done is threatened at every turn. At Experience Life in Lubbock, Soylent has yet to replace wine and there are no Virtual Reality confessionals, but self-described “computer nerd” Chris Galanos has done his best to take the “Old” out of the Old Testament with his buzzing, whirring House of God 2.0. Is nothing sacred anymore?

46) “The New Nationalism Of Brexit And Trump Is A Product Of The Digital Age” (Douglas Rushkoff, Fast Company)

“We are flummoxed by today’s nationalist, regressively anti-global sentiments only because we are interpreting politics through that now-obsolete television screen,” writes Rushkoff in this excellent piece about the factious nature of the Digital Age. The post-TV landscape is a narrowcasted one littered with an infinite number of granular choices and niches. It’s empowering in a sense, an opportunity to vote “Leave” to everything, even a future that’s arriving regardless of popular consensus. It’s a far cry from not that long ago when an entire world sat transfixed by Neil Armstrong’s giant leap. Now everyone is trying to land on the moon at the same time–and no one can agree where it is. It’s more democratic this way, but maybe to an untenable degree, perhaps to the point where it’s a new form of anarchy.

47) “The Incredible Fulk” (Alexandra Suich, The Economist 1843)

The insanity of our increasingly scary wealth inequality is chronicled expertly in this richly descriptive article, even though it seems in no way intended as a hit piece. The title refers to Ken Fulk, Silicon Valley’s go-to “lifestyle designer,” who charges billionaires millions to create loud interiors, rooms stuffed with antique doors from shuttered mental institutions and musk-ox taxidermy, intended to “evoke feelings” or some such shit.

As the article says: “His spaces, when completed, have a theatrical quality to them, which Fulk plays up. Once he’s finished a project he often brings clients to their homes to show them the final product, a ceremony which he calls the ‘big reveal.’ For the Birches’ home in San Francisco, he hired men dressed as beefeaters to stand outside the entrance and musicians to play indoors. For another set of clients in Palm Springs, he hired synchronized swimmers, a camel and an impersonator to dress up and sing like Dean Martin.” It’s all good, provided a bloody revolution never occurs.

Fulk acknowledges a “tension between high and low” in his work. Know what else has tension? Nooses.

48) “Truth Is a Lost Game in Turkey. Don’t Let the Same Thing Happen to You.” (Ece Temelkuran, The Guardian)

Nihilism is sometimes an end but more often a means.

Truth can be fuzzy and facts imprecise, but an honest pursuit of these precious goods allows for a basic decency, a sense of order. Bombard such efforts for an adequate length of time, convince enough people that veracity and reality are fully amorphous, and opportunities for mischief abound.

Break down the normal rules (written and unwritten ones), create an air of confusion with shocking behaviors and statements, blast an opening where anything is possible–even “unspeakable things”–and a democracy can fall and tyranny rise. The timing has to be right, but sooner or later that time will arrive.

Has such a moment come for America? The conditions haven’t been this ripe for at least 60 years, and nothing can now be taken for granted.

Temelkuran explains how Turkey became a post-truth state, a nation-sized mirage, and how the same fate may befall Europe and the U.S. She certainly shares my concerns about the almost non-stop use of the word “elites” to neutralize the righteous into paralysis.

49) “Prepping for Doomsday: Bunkers, Panic Rooms, and Going Off the Grid” (Clare Trapasso, Realtor.com)

Utter societal collapse in the United States may not occur in the immediate future, but it’s certainly an understandable time for a case of the willies. In advance of the November elections, the bunker business boomed, as some among us thought things would soon fall apart and busied themselves counting their gold coins and covering their asses. In a shocking twist, the result of the Presidential election has calmed many of the previously most panicked among us and activated the fears of the formerly hopeful.

50) “The 100-Year-Old Man Who Lives in the Future” (Caroline Winter, Bloomberg Businessweek)

Jacque Fresco, one of those fascinating people who walks through life building a world inside his head, hoping it eventually influences the wider one, is now into his second century of life. A futurist and designer who’s focused much of his work on sustainable living, technology and automation, Fresco is the brains behind the Venus Project, which encourages a post-money, post-scarcity, post-politician utopia. He’s clearly a template for many of today’s Silicon Valley aspiring game-changers.

Winter traveled to Middle-of-Nowhere, Florida (pop: Fresco + girlfriend and collaborator Roxanne Meadows), to write this smart portrait of the visionary after ten decades of reimagining the world according to his own specifications. He doesn’t think the road to a computer-governed utopia will be smooth, however. As Winter writes: “Once modern life gets truly hard, Fresco believes there will be a revolution that will clear the way for the Venus Project to be built. ‘There will be a lot of people getting shot, including me,’ he says wryly.” Well, he’s had a good run.•

Jeff Jarvis, theorist or something, was one of the most gleeful of public figures in celebrating the demise of traditional media. Having made his bones in the business, he wanted the new tools to feast on the flesh of print publications and network TV, believing there would emerge a democratic revolution. In ways he couldn’t anticipate, he was correct.

Jarvis grew apoplectic as a Trump Presidency seemed increasingly possible, spending great personal time volunteering for Hillary Clinton in Pennsylvania and making desperate appeals to traditional media personalities like Howard Stern, hoping, belatedly, that the new abnormal could somehow be tamed by phone banks and talk radio. Not possible. The ethical standards and common decency that had washed away more easily than ink helped make sure of that. What seemed an evolution to him turned out to be a devolution.

From “Meet the New Gatekeeper, Worse Than the Old Gatekeeper,” Nicholas Carr’s astute Rough Type post:

We celebrated our emancipation from filters, and we praised the democratization brought about by “new media.” The “people formerly known as the audience” had taken charge, proclaimed one herald of the new order, as he wagged his finger at the disempowered journalistic elites. “You were once (exclusively) the editors of the news, choosing what ran on the front page. Now we can edit the news, and our choices send items to our own front pages.”

“The means of media are now in the hands of the people,” declared another triumphalist:

So now anyone can control, create, market, distribute, find, and interact with anything they want. The barrier to entry to media is demolished. Media, always a one-way pipe, now becomes an open pool. . . . Whenever citizens can exercise control, they will. Today they are challenging and changing media — where bloggers now fact-check Dan Rather’s ass — but tomorrow they will challenge and change politics, government, marketing, and education as well. This isn’t just a media revolution, though that’s where we are seeing the impact first. This is a chain-reaction of revolutions. It has just begun.

And the pundits were right — the old media filters dissolved, and “we” took control — though the great disruption has not played out in quite the way they anticipated.•

Nobody shops in brick-and-mortar stores anymore, if you don’t count about 90% of purchases.

Because so many of the physical businesses we connected to on an emotional level were killed by the Internet (book and video stores, record shops, newsstands, etc.), it can seem as if online shopping is predominant in retail. But that’s not nearly true, at least not yet. In order to keep expanding market share, Silicon Valley powers like Amazon are venturing off into the real world, a phenomenon that may increase exponentially. I doubt it will work very well with Amazon Books stores and their shallow selections, but perhaps the planned convenience store chain will make a go of it? Tough to say: Corporations great at one type of platform often flounder in others.

In a Technology Review piece, Nicholas Carr visits a new Amazon Books and explains the key role of the smartphone in this surprising turn of events. An excerpt:

Amazon Books may be just the vanguard of a much broader push into brick-and-mortar retailing by the company. In October, the Wall Street Journal revealed that Amazon is planning to open a chain of convenience stores, mainly for groceries, along with drive-in depots where consumers will be able to pick up merchandise ordered online. It has also begun rolling out small “pop-up” stores to hawk its electronic devices. It already has more than two dozen such kiosks in malls around the country, and dozens more are said to be in the works.

Even after 20 years of rapid growth, e-commerce still accounts for less than 10 percent of total retail sales. And now the rise of mobile computing places new constraints on Web stores. They can’t display or promote as many products as they could when their wares were spread across desktop or laptop monitors. That limits the stores’ cross-selling and upselling opportunities and blunts other merchandising tactics.

At the same time, the smartphone, with its apps, its messaging platforms, and its constant connectivity, gives retailers more ways to communicate with and influence customers, even when they’re shopping in stores. This is why the big trend in retailing today is toward “omnichannel” strategies, which blend physical stores, Web stores, and mobile apps in a way that makes the most of the convenience of smartphones and overcomes their limitations. Some omnichannel pioneers, like Sephora and Nordstrom, come from the brick-and-mortar world. But others, like Warby Parker and Bonobos, come from the Web world. Now, with its physical stores, Amazon is following in their tracks. “Pure-play Web retailing is not sustainable,” New York University marketing professor Scott Galloway told me. He points out that the deep discounting and high delivery costs that characterize Web sales have made it hard for Amazon to turn a profit. If Amazon were to remain an online-only merchant, he says, its future success would be in jeopardy. He believes the company will end up opening “hundreds and then thousands of stores.”•

The photo at top is from 2005, which might as well be a million years ago. Commuters on the NYC subway were that recently digesting every kind of printed matter, with newspapers especially prominent. We will never witness that scene again, as we’ve transitioned into the age of smartphones, a medium that has disappeared the broadsheet and tabloid and paperback. These tools are wonderfully portable and can hold far more information, though some things have been lost in the changeover. That’s not to say America was wonderful in 2005 and isn’t now–both times were rather grim–but not much good can come of making words shrink, eliminating them, even.

To paraphrase Norma Desmond: News *is* big. It’s the *gadgets* that got small. Reading on smartphones isn’t easy, so skimming headlines about current events is about the best anyone can do now. It’s not just the size of the characters that’s daunting but also the speed with which they travel, as they ping, prompt and interrupt us nonstop. News is always breaking until it feels broken.

Nicholas Carr, one of our time’s preeminent critics (cultural, social and media), has penned a really wonderful Nieman Reports piece on the “nowness” of the news, the concept of fast and first run amok. As he writes, “for 500 years the medium of print has been training us to pay attention.” Not any longer. The opening:

“Thought will spread across the world with the rapidity of light, instantly conceived, instantly written, instantly understood. It will blanket the earth from one pole to the other—sudden, instantaneous, burning with the fervor of the soul from which it burst forth.”

Those opening words would seem to describe, with the zeal typical of the modern techno-utopian, the arrival of our new online media environment with its feeds, streams, texts and tweets. What is the Web if not sudden, instantaneous and burning with fervor? But French poet and politician Alphonse de Lamartine wrote these words in 1831 to describe the emergence of the daily newspaper. Journalism, he proclaimed, would soon become “the whole of human thought.” Books, incapable of competing with the immediacy of morning and evening papers, were doomed: “Thought will not have time to ripen, to accumulate into the form of a book—the book will arrive too late. The only book possible from today is a newspaper.”

Lamartine’s prediction of the imminent demise of books didn’t pan out. Newspapers did not take their place. But he was a prophet nonetheless. The story of media, particularly the news media, has for the last two centuries been a story of the pursuit of ever greater immediacy. From broadsheet to telegram, radio broadcast to TV bulletin, blog to Twitter, we’ve relentlessly ratcheted up the velocity of information flow.

To Shakespeare, ripeness was all. Today, ripeness doesn’t seem to count for much. Nowness is all.•



Aeon, which already presented a piece from Nicholas Carr’s new book, Utopia Is Creepy, has another, a passage about biotechnology which wonders if science will soon move too fast not only for legislation but for ethics as well.

The “philosophy is dead” assertion that’s persistently batted around in scientific circles drives me bonkers, because we dearly need philosophical consideration of our likely commandeering of evolution. Carr doesn’t make that argument but instead rightly wonders whether ethics is likely to be more than a “sideshow” when garages aren’t used just to hatch computer hardware or search engines but greatly altered or even new life forms. The tools will be cheap, the “creativity” decentralized, the “products” attractive. As Freeman Dyson wrote nearly a decade ago: “These games will be messy and possibly dangerous.”

From Carr:

If to be transhuman is to use technology to change one’s body from its natural state, then we are all already transhuman. But the ability of human beings to alter and augment themselves might expand enormously in the decades ahead, thanks to a convergence of scientific and technical advances in such areas as robotics, bioelectronics, genetic engineering and pharmacology. Progress in the field broadly known as biotechnology promises to make us stronger, smarter and fitter, with sharper senses and more capable minds and bodies. And scientists can already use the much discussed gene-editing tool CRISPR, derived from bacterial immune systems, to rewrite genetic code with far greater speed and precision, and at far lower cost, than was possible before. In simple terms, CRISPR pinpoints a target sequence of DNA on a gene, uses a bacterial enzyme to snip out the sequence, and then splices a new sequence in its place. The inserted genetic material doesn’t have to come from the same species. Scientists can mix and match bits of DNA from different species, creating real-life chimeras.

As long ago as 1923, the English biologist J B S Haldane gave a lecture before the Heretics Society in Cambridge on how science would shape humanity in the future. ‘We can already alter animal species to an enormous extent,’ he observed, ‘and it seems only a question of time before we shall be able to apply the same principles to our own.’ Society would, Haldane felt sure, defer to the scientist and the technologist in defining the boundaries of the human species. ‘The scientific worker of the future,’ he concluded, ‘will more and more resemble the lonely figure of Daedalus as he becomes conscious of his ghastly mission, and proud of it.’

The ultimate benefit of transhumanism, argues Nick Bostrom, professor of philosophy at the University of Oxford, and one of the foremost proponents of radical human enhancement, is that it expands human potential, giving individuals greater freedom ‘to shape themselves and their lives according to their informed wishes’. Transhumanism unchains us from our nature. Critics take a darker view, suggesting that biological and genetic tinkering is more likely to demean or even destroy the human race than elevate it.

The ethical debate is profound, but it seems fated to be a sideshow.•



The introduction to Nicholas Carr’s soon-to-be published essay collection, Utopia Is Creepy, has been excerpted at Aeon, and it’s a beauty. The writer argues (powerfully) that we’ve defined “progress as essentially technological,” even though the Digital Age quickly became corrupted by commercial interests, and the initial thrill of the Internet faded as it became “civilized” in the most derogatory, Twain-ish use of that word. To Carr, the something gained (access to an avalanche of information) is overwhelmed by what’s lost (withdrawal from reality). The critic applies John Kenneth Galbraith’s term “innocent fraud” to the Silicon Valley marketing of techno-utopianism. 

You could extrapolate this thinking to much of our contemporary culture: binge-watching endless content, Pokémon Go, Comic-Con, fake Reality TV shows, reality-altering cable news, etc. Carr suggests we use the tools of Silicon Valley while refusing the ethos. Perhaps that’s possible, but I doubt you can separate such things.

An excerpt:

The greatest of the United States’ homegrown religions – greater than Jehovah’s Witnesses, greater than the Church of Jesus Christ of Latter-Day Saints, greater even than Scientology – is the religion of technology. John Adolphus Etzler, a Pittsburgher, sounded the trumpet in his testament The Paradise Within the Reach of All Men (1833). By fulfilling its ‘mechanical purposes’, he wrote, the US would turn itself into a new Eden, a ‘state of superabundance’ where ‘there will be a continual feast, parties of pleasures, novelties, delights and instructive occupations’, not to mention ‘vegetables of infinite variety and appearance’.

Similar predictions proliferated throughout the 19th and 20th centuries, and in their visions of ‘technological majesty’, as the critic and historian Perry Miller wrote, we find the true American sublime. We might blow kisses to agrarians such as Jefferson and tree-huggers such as Thoreau, but we put our faith in Edison and Ford, Gates and Zuckerberg. It is the technologists who shall lead us.

Cyberspace, with its disembodied voices and ethereal avatars, seemed mystical from the start, its unearthly vastness a receptacle for the spiritual yearnings and tropes of the US. ‘What better way,’ wrote the philosopher Michael Heim in ‘The Erotic Ontology of Cyberspace’ (1991), ‘to emulate God’s knowledge than to generate a virtual world constituted by bits of information?’ In 1999, the year Google moved from a Menlo Park garage to a Palo Alto office, the Yale computer scientist David Gelernter wrote a manifesto predicting ‘the second coming of the computer’, replete with gauzy images of ‘cyberbodies drift[ing] in the computational cosmos’ and ‘beautifully laid-out collections of information, like immaculate giant gardens’.

The millenarian rhetoric swelled with the arrival of Web 2.0. ‘Behold,’ proclaimed Wired in an August 2005 cover story: we are entering a ‘new world’, powered not by God’s grace but by the web’s ‘electricity of participation’. It would be a paradise of our own making, ‘manufactured by users’. History’s databases would be erased, humankind rebooted. ‘You and I are alive at this moment.’

The revelation continues to this day, the technological paradise forever glittering on the horizon. Even money men have taken sidelines in starry-eyed futurism. In 2014, the venture capitalist Marc Andreessen sent out a rhapsodic series of tweets – he called it a ‘tweetstorm’ – announcing that computers and robots were about to liberate us all from ‘physical need constraints’. Echoing Etzler (and Karl Marx), he declared that ‘for the first time in history’ humankind would be able to express its full and true nature: ‘we will be whoever we want to be.’ And: ‘The main fields of human endeavour will be culture, arts, sciences, creativity, philosophy, experimentation, exploration, adventure.’ The only thing he left out was the vegetables.•



The messenger is supposed to bring the truth, not his or her wishes. It was more than 50 years ago that Marshall McLuhan predicted a Global Village, and those who believed the theorist was happy about this development were listening, at best, with one ear. The prospect frightened him.

McLuhan feared the whole world being connected, thought it an invitation for mayhem, rightly believing local skirmishes would be played out on a gigantic stage. Believing a flatter world will be a more peaceful one assumes that everyone is driven by money, not ideology, not madness. 

Everything seems to arrive with more speed and regularity now, social justice and sorties alike. The whole world is in your pocket now, and it’s exploding.

Excerpts from 1) Mathieu von Rohr’s Spiegel essay “Apocalypse Now,” and 2) Nicholas Carr’s Rough Type post “The Global Village of Violence.”


From von Rohr:

We are living in an age of shocks and crises that could well be traumatizing in their rapid succession and concentration, since it’s not yet clear whether they’re only a temporary jolt or the beginning of a trend with no end in sight. Of course, the sheer number of conflicts has remained constant in recent years. But there is much indication that we find ourselves in a new era of global instability. The biggest geopolitical stories of our time are the destabilization in the Middle East, the European security order and the European Union. In addition, there has been a societal shift in many Western countries: Many citizens are angry at the elites, because they see themselves as victims of globalization, free trade and migration. This anger has enabled the rise of political movements from the fringe to the mainstream in only a few years: Donald Trump, the Brexit movement, Front National and the Alternative for Germany, or AfD. The classic political camps are dissolving as the battle between the political left and the right is replaced by one between Isolationists and Internationalists.

Every now and then, there are phases in international politics during which more happens in the span of a few weeks than would otherwise happen in decades. Do 2014 and 2016 fall into that category? They’re not comparable to the most dramatic phases of the past century, when both World Wars broke out; nor are they anything like 1989, when the Cold War ended and the world order was rearranged. It’s also unclear whether this year will end with the same chaotic violence it started with.

But it is rather likely that global insecurity will become the new status quo.•


From Carr:

We assume that communication and harmony go hand in hand, like a pair of flower children on a garden path. If only we all could share our thoughts and feelings with everyone else all the time, we’d overcome our distrust and fear and live together peaceably. We’d see that we are all one. Facebook and other social media disabuse us of this notion. To be “all one” is to be dissolved — and for many people that is a threat that requires a reaction.

Eamonn Fitzgerald points to a recently uploaded video of a Canadian TV interview with Marshall McLuhan that aired in 1977. By the mid-seventies, a decade after his allotted minutes of fame, McLuhan had come to be dismissed as a mumbo-jumbo-spewing charlatan by the intelligentsia. What the intelligentsia found particularly irritating was that the mumbo jumbo McLuhan spewed fit no piety and often hit uncomfortably close to the mark.

Early on in the clip, the interviewer notes that McLuhan had long ago predicted that electronic communication systems would turn the world into a global village. Most of McLuhan’s early readers had taken this as a utopian prophecy. “But it seems,” the interviewer says, with surprise, “that this tribal world is not very friendly.”•


Attempting to narrow the wealth gap by having corporations make micropayments to citizens for their information seems to me a morally bankrupt system even if it achieves the unlikely and saves some from actual bankruptcy. There has to be a better way, though whether we’re unwittingly working for Facebook and Google for free or accepting bits of coins for our efforts, it’s hard to see how we avoid this privacy-obliterating system we’ve built. We live in a very anti-government time, but corporations are far more pervasive and invasive and will only grow more so as the Internet of Things becomes the thing. We may eventually miss Big Brother.

I’m looking forward to reading Nicholas Carr’s forthcoming book, Utopia Is Creepy, which has the best title ever, and I credit him with pointing me toward Shoshana Zuboff’s Frankfurter Allgemeine essay “The Secrets of Surveillance Capitalism.” As she writes, “the very idea of a functional, effective, affordable product as a sufficient basis for economic exchange is dying,” and what is replacing it is spooky as hell. The Harvard professor’s article is devastating not for imagining a dark future that might be if things go horribly wrong but for laying out where we’re headed if we just incrementally build on the status quo.

The opening:

Google surpassed Apple as the world’s most highly valued company in January for the first time since 2010.  (Back then each company was worth less than 200 billion. Now each is valued at well over 500 billion.)  While Google’s new lead lasted only a few days, the company’s success has implications for everyone who lives within the reach of the Internet. Why? Because Google is ground zero for a wholly new subspecies of capitalism in which profits derive from the unilateral surveillance and modification of human behavior.  This is a new surveillance capitalism that is unimaginable outside the inscrutable high velocity circuits of Google’s digital universe, whose signature feature is the Internet and its successors.  While the world is riveted by the showdown between Apple and the FBI, the real truth is that the surveillance capabilities being developed by surveillance capitalists are the envy of every state security agency.  What are the secrets of this new capitalism, how do they produce such staggering wealth, and how can we protect ourselves from its invasive power?

“Most Americans realize that there are two groups of people who are monitored regularly as they move about the country.  The first group is monitored involuntarily by a court order requiring that a tracking device be attached to their ankle. The second group includes everyone else…”

Some will think that this statement is certainly true. Others will worry that it could become true. Perhaps some think it’s ridiculous.  It’s not a quote from a dystopian novel, a Silicon Valley executive, or even an NSA official. These are the words of an auto insurance industry consultant intended as a defense of  “automotive telematics” and the astonishingly intrusive surveillance capabilities of the allegedly benign systems that are already in use or under development. It’s an industry that has been notoriously exploitative toward customers and has had obvious cause to be anxious about the implications of self-driving cars for its business model. Now, data about where we are, where we’re going, how we’re feeling, what we’re saying, the details of our driving, and the conditions of our vehicle are turning into beacons of revenue that illuminate a new commercial prospect.•



Here are 50 ungated pieces of wonderful journalism from 2015, alphabetized by author name, which made me consider something new or reconsider old beliefs or just delighted me. (Some selections are from gated publications that allow a number of free articles per month.) If your excellent work isn’t on the list, that’s more my fault than yours.

  • “Who Runs the Streets of New Orleans?” (David Amsden, The New York Times Magazine) As private and public sector missions increasingly overlap, here’s an engaging look at the privatization of some policing in the French Quarter.
  • “In the Beginning” (Ross Andersen, Aeon) A bold and epic essay about the elusive search for the origins of the universe.
  • Ask Me Anything (Anonymous, Reddit) A 92-year-old German woman who was born into Nazism (and participated in it) sadly absolves herself of all blame while answering questions about that horrible time.
  • “Rethinking Extinction” (Stewart Brand, Aeon) The Whole Earth Catalog founder thinks the chance of climate-change catastrophe overrated, arguing we should utilize biotech to repopulate dwindling species.
  • “Anchorman: The Legend of Don Lemon” (Taffy Brodesser-Akner, GQ) A deeply entertaining look into the perplexing facehole of Jeff Zucker’s most gormless word-sayer and, by extension, the larger cable-news zeitgeist.
  • “How Social Media Is Ruining Politics” (Nicholas Carr, Politico) A lament that our shiny new tools have provided provocative trolls far more credibility than a centralized media ever allowed for.
  • “Clans of the Cathode” (Tom Carson, The Baffler) One of our best culture critics looks at the meaning of various American sitcom families through the medium’s history.
  • “The Black Family in the Age of Mass Incarceration” (Ta-Nehisi Coates, The Atlantic) The author examines the tragedy of the African-American community being turned into a penal colony, explaining the origins of the catastrophic policy failure.
  • “Perfect Genetic Knowledge” (Dawn Field, Aeon) The essayist thinks about a future in which we’ve achieved “perfect knowledge” of whole-planet genetics.
  • “A Strangely Funny Russian Genius” (Ian Frazier, The New York Review of Books) Daniil Kharms was a very funny writer, if you appreciate slapstick that ends in a body count.
  • “Tomorrow’s Advance Man” (Tad Friend, The New Yorker) Profile of Silicon Valley strongman Marc Andreessen and his milieu, an enchanted land in which adults dream of riding unicorns.
  • “Build-a-Brain” (Michael Graziano, Aeon) The neuroscientist’s ambitious thought experiment about machine intelligence is a piece I thought about continuously throughout the year.
  • Ask Me Anything (Stephen Hawking, Reddit) Among other things, the physicist warns that the real threat of superintelligent machines isn’t malice but relentless competence.
  • “Engineering Humans for War” (Annie Jacobsen, The Atlantic) War is inhuman, it’s been said, and the Pentagon wants to make it more so by employing bleeding-edge biology and technology to create super soldiers.
  • “The Wrong Head” (Mike Jay, London Review of Books) A look at insanity in 1840s France, which demonstrates that mental illness is often expressed in terms of the era in which it’s experienced.
  • “Death Is Optional” (Daniel Kahneman and Noah Yuval Harari, Edge) Two of my favorite big thinkers discuss the road ahead, a highly automated tomorrow in which medicine, even mortality, may not be an egalitarian affair.
  • “Where the Bodies Are Buried” (Patrick Radden Keefe, The New Yorker) Ceasefires, even treaties, don’t completely conclude wars, as evidenced by this haunting revisitation of the heartbreaking IRA era.
  • “Porntopia” (Molly Lambert, Grantland) The annual Adult Video News Awards in Las Vegas, the Oscars of oral, allows the writer to look into a funhouse-mirror reflection of America.
  • “The Robots Are Coming” (John Lanchester, London Review of Books) A remarkably lucid explanation of how quickly AI may remake our lives and labor in the coming decades.
  • “Last Girl in Larchmont” (Emily Nussbaum, The New Yorker) The great TV critic provides a postmortem of Joan Rivers and her singular (and sometimes disquieting) brand of feminism.
  • “President Obama & Marilynne Robinson: A Conversation, Part 1 & Part 2” (Barack Obama and Marilynne Robinson, New York Review of Books) Two monumental Americans discuss the state of the novel and the state of the union.
  • Ask Me Anything (Elizabeth Parrish, Reddit) The CEO of BioViva announces she’s patient zero for the company’s experimental age-reversing gene therapies. Strangest thing I read all year.
  • “Why Alien Life Will Be Robotic” (Sir Martin Rees, Nautilus) The astronomer argues that ETs in our inhospitable universe have likely already transitioned into conscious machines.
  • Ask Me Anything (Anders Sandberg, Reddit) Heady conversation about existential risks, Transhumanism, economics, space travel and future technologies conducted by the Oxford researcher.
  • “Alien Rights” (Lizzie Wade, Aeon) Manifest Destiny will, sooner or later, become a space odyssey. What ethics should govern exploration of the final frontier?
  • “Peeling Back the Layers of a Born Salesman’s Life” (Michael Wilson, The New York Times) The paper’s gifted crime writer pens a posthumous profile of a protean con man, a Zelig on the make who crossed paths with Abbie Hoffman, Otto Preminger and Annie Leibovitz, among others.
  • “The Pop Star and the Prophet” (Sam York, BBC Magazine) Philosopher Jacques Attali, who predicted, back in the ’70s, the downfall of the music business, tells the writer he now foresees similar turbulence for manufacturing.


The trouble with everyone being connected and quantified isn’t only that we’re sharing, intentionally or otherwise, so much personal information, but also what that data can further reveal once algorithms have had their way with it. It’s like an inverse game of telephone in the Smartphone Age, the information becoming more precise as it travels.

From “You Are Your Phone,” a sharp Rough Type post by Nicholas Carr:

The Wall Street Journal reports today that Silicon Valley lending startups are looking to base personal loan decisions on analyses of data from individuals’ phones. The apps running on a person’s device, entrepreneurs have found, “generate huge amounts of data — texts, emails, GPS coordinates, social-media posts, retail receipts, and so on — indicating thousands of subtle patterns of behavior that correlate with repayment or default.” How you use your phone reveals more than you think:

Even obscure variables such as how frequently a user recharges the phone’s battery, how many incoming text messages they receive, how many miles they travel in a given day or how they enter contacts into their phone — the decision to add last name correlates with creditworthiness — can bear on a decision to extend credit.

Meanwhile, the New York Times today reports on a new study published in Science that reveals how a person’s economic status can be determined through a fairly simple analysis of phone use. The researchers, working in Africa, collected details “about when calls were made and received and the length of the calls” as well as “when text messages were sent, and which cellphone towers the texts and calls were routed through.” They analyzed this metadata to “build an algorithm that predicts how wealthy or impoverished a given cellphone user is. Using the same model, the researchers were able to answer even more specific questions, like whether a household had electricity.”

I am not a number, you declare. I am more than a credit score. You may well be. But the tell-tale phone reveals more than one’s financial standing and trustworthiness.•
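The metadata-to-wealth inference Carr describes can be sketched as a toy scoring model. Everything below is invented for illustration — the feature names, the weights, the two hypothetical users — whereas the actual researchers fit their model to phone records matched against survey data:

```python
# Toy illustration of inferring economic status from phone metadata,
# in the spirit of the study Carr cites. Features and weights are
# entirely made up; a real model would be fit to ground-truth data.

def wealth_score(metadata):
    """Higher score = the toy model guesses higher economic status."""
    weights = {
        "outgoing_calls_per_day": 0.4,   # more outgoing activity
        "mean_call_seconds": 0.2,        # longer conversations
        "distinct_towers_per_week": 0.3, # wider travel radius
        "texts_sent_per_day": 0.1,
    }
    # Missing features default to zero rather than raising an error.
    return sum(w * metadata.get(k, 0.0) for k, w in weights.items())

user_a = {"outgoing_calls_per_day": 8, "mean_call_seconds": 120,
          "distinct_towers_per_week": 15, "texts_sent_per_day": 20}
user_b = {"outgoing_calls_per_day": 1, "mean_call_seconds": 30,
          "distinct_towers_per_week": 2, "texts_sent_per_day": 3}

print(wealth_score(user_a) > wealth_score(user_b))  # prints True
```

The unsettling part, as Carr notes, is how little is required: a handful of behavioral traces, a weighted sum, and suddenly the phone is vouching for (or against) its owner.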


Nicholas Carr’s The Glass Cage, a must-read if you want to understand all sides of this new machine age, is now out in paperback. I like Carr’s thinking when I agree with him, and I like it when I don’t. He always makes me see things in a fresh way, and he’s a miraculously graceful writer. Carr put an excerpt from the book, one about the history of automation, on his blog. Here’s a smaller section from that:

Automated machines existed before World War II. James Watt’s steam engine, the original prime mover of the Industrial Revolution, incorporated an ingenious feedback device — the fly-ball governor — that enabled it to regulate its own operation. The Jacquard loom, invented in France around 1800, used steel punch cards to control the movements of spools of different-colored threads, allowing intricate patterns to be woven automatically. In 1866, a British engineer named J. Macfarlane Gray patented a steamship steering mechanism that was able to register the movement of a boat’s helm and, through a gear-operated feedback system, adjust the angle of the rudder to maintain a set course.

But the development of fast computers, along with other sensitive electronic controls, opened a new chapter in the history of machines. It vastly expanded the possibilities of automation. As the mathematician Norbert Wiener, who helped write the prediction algorithms for the Allies’ automated antiaircraft gun, explained in his 1950 book The Human Use of Human Beings, the advances of the 1940s enabled inventors and engineers to go beyond “the sporadic design of individual automatic mechanisms.” The new technologies, while designed with weaponry in mind, gave rise to “a general policy for the construction of automatic mechanisms of the most varied type.” They opened the way for “the new automatic age.”

Beyond the pursuit of progress and productivity lay another impetus for the automatic age: politics.•



If Donald Trump grew a small, square mustache above his lip, would his poll numbers increase yet again? For a candidate running almost purely on attention, can any shock really be deleterious?

Howard Dean was the first Internet candidate and Barack Obama the initial one to ride those new rules to success. But things have already markedly changed: That was a time of bulky machines on your lap, and the new political reality rests lightly in your pocket. A smartphone’s messages are brief and light on details, and its buzzing is more important than anything it delivers.

The diffusion of media was supposed to make it impossible for a likable incompetent like George W. Bush to rise. How could such a person survive the scrutiny of millions of “citizen journalists” like us? If anything, it’s made it easier, even for someone who’s unlikable and incompetent. For a celeb with a Reality TV willingness to be ALL CAPS all the time, facts get lost in the noise, at least for a while.

That doesn’t mean Donald Trump, an adult baby with an attention span that falls somewhere far south of 15 months, will be our next President, but it does indicate that someone ridiculously unqualified and hugely bigoted gets to be on the national stage and inform our political discourse. The same way Jenny McCarthy used her platform to play doctor and spearhead the anti-vaccination movement, Trump gets to be a make-believe Commander-in-Chief for a time.

Unsurprisingly, Nicholas Carr has written the best piece on the dubious democracy the new tools have delivered, a Politico Magazine article that analyzes election season in a time that favors a provocative troll, a “snapchat personality,” as he terms it. The opening:

Our political discourse is shrinking to fit our smartphone screens. The latest evidence came on Monday night, when Barack Obama turned himself into the country’s Instagrammer-in-Chief. While en route to Alaska to promote his climate agenda, the president took a photograph of a mountain range from a window on Air Force One and posted the shot on the popular picture-sharing network. “Hey everyone, it’s Barack,” the caption read. “I’ll be spending the next few days touring this beautiful state and meeting with Alaskans about what’s going on in their lives. Looking forward to sharing it with you.” The photo quickly racked up thousands of likes.

Ever since the so-called Facebook election of 2008, Obama has been a pacesetter in using social media to connect with the public. But he has nothing on this year’s field of candidates. Ted Cruz live-streams his appearances on Periscope. Marco Rubio broadcasts “Snapchat Stories” at stops along the trail. Hillary Clinton and Jeb Bush spar over student debt on Twitter. Rand Paul and Lindsey Graham produce goofy YouTube videos. Even grumpy old Bernie Sanders has attracted nearly two million likers on Facebook, leading the New York Times to dub him “a king of social media.”

And then there’s Donald Trump. If Sanders is a king, Trump is a god. A natural-born troll, adept at issuing inflammatory bulletins at opportune moments, he’s the first candidate optimized for the Google News algorithm.•


In one of his typically bright, observant posts, Nicholas Carr wryly tackles Amazon’s new scheme of paying Kindle Unlimited authors based on how many of their pages are read, a system which reduces the written word to a granular level of constant, non-demanding engagement. 

There’s an argument to be made that similar systems have worked quite well in the past: Didn’t Charles Dickens publish under comparable, if less precisely quantified, circumstances when turning out his serial novels? Sort of. He was usually only as good as his last paragraph (which, thankfully, was always pretty good).

The difference is while it worked for Dickens, this process hasn’t been the motor behind most of the great writing in our history. James Joyce would not have survived very well on this nano scale. Neither would have Virginia Woolf, William Faulkner, Marcel Proust, etc. Their books aren’t just individual pages leafed together but a cumulative effect, a treasure that comes only to those who clear obstacles.

Shakespeare may have had to pander to the groundlings to pay the theater’s light bill, but what if the lights had been turned off mid-performance if he went more than a page without aiming for the bottom of the audience?

Carr’s opening:

When I first heard that Amazon was going to start paying its Kindle Unlimited authors according to the number of pages in their books that actually get read, I wondered whether there might be an opportunity for an intra-Amazon arbitrage scheme that would allow me to game the system and drain Jeff Bezos’s bank account. I thought I might be able to start publishing long books of computer-generated gibberish and then use Amazon’s Mechanical Turk service to pay Third World readers to scroll through the pages at a pace that would register each page as having been read. If I could pay the Turkers a fraction of a penny less to look at a page than Amazon paid me for the “read” page, I’d be able to get really rich and launch my own space exploration company.

Alas, I couldn’t make the numbers work. Amazon draws the royalties for the program from a fixed pool of funds, which serves to cap the upside for devious scribblers.

So much for my Mars vacation. Still, even in a zero-sum game that pits writer against writer, I figured I might be able to steal a few pennies from the pockets of my fellow authors. (I hate them all, anyway.) I would just need to do a better job of mastering the rules of the game, which Amazon was kind enough to lay out for me:

Under the new payment method, you’ll be paid for each page individual customers read of your book, the first time they read it. … To determine a book’s page count in a way that works across genres and devices, we’ve developed the Kindle Edition Normalized Page Count (KENPC). We calculate KENPC based on standard settings (e.g. font, line height, line spacing, etc.), and we’ll use KENPC to measure the number of pages customers read in your book, starting with the Start Reading Location (SRL) to the end of your book.

The first thing that has to be said is that if you’re a poet, you’re screwed.•
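Carr’s aside about the fixed pool — that it “caps the upside for devious scribblers” — is easy to verify with a little arithmetic. In this toy model (all the numbers are invented), each author’s royalty is a pro-rata share of total pages read, so flooding the system with fake reads yields diminishing returns and can never extract more than the pool itself:

```python
# Toy model of a fixed royalty pool split pro rata by pages read.
# Pool size and page counts are hypothetical, not Amazon's figures.

def royalty(pool, your_pages_read, everyone_elses_pages_read):
    """Author's payout: their share of all pages read, times the pool."""
    total = your_pages_read + everyone_elses_pages_read
    return pool * your_pages_read / total

POOL = 10_000_000          # hypothetical monthly pool, in dollars
OTHERS = 2_000_000_000     # hypothetical pages read by all other authors

# Scaling up fake "reads" shows the cap: the payout creeps toward
# the pool but never exceeds it, while each added page earns less
# than the one before — so paying humans a flat per-page rate to
# scroll eventually costs more than it returns.
for fake_pages in (10**6, 10**9, 10**12):
    print(fake_pages, round(royalty(POOL, fake_pages, OTHERS), 2))
```

That zero-sum structure is why Carr pivots from draining Bezos to merely picking his fellow authors’ pockets.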

 


Wow, this is wonderful: Nicholas Carr posted a great piece from a recent lecture in which he addressed Marshall McLuhan’s idea of automation as media. In this excerpt, he tells a history of how cartography, likely the first medium, went from passive to active player as we transitioned from paper to software:

I’m going to tell the story through the example of the map, which happens to be my all-time favorite medium. The map was, so far as I can judge, the first medium invented by the human race, and in the map we find a microcosm of media in general. The map originated as a simple tool. A person with knowledge of a particular place drew a map, probably in the dirt with a stick, as a way to communicate his knowledge to another person who wanted to get somewhere in that place. The medium of the map was just a means to transfer useful knowledge efficiently between a knower and a doer at a particular moment in time.

Then, at some point, the map and the mapmaker parted company. Maps started to be inscribed on pieces of hide or stone tablets or other objects more durable and transportable than a patch of dirt, and when that happened the knower’s presence was no longer necessary. The map subsumed the knower. The medium became the knowledge. And when a means of mechanical reproduction came along — the printing press, say — the map became a mass medium, shared by a large audience of doers who wanted to get from one place to another.

For most of recent history, this has been the form of the map we’ve all been familiar with. You arrive in some new place, you go into a gas station and you buy a map, and then you examine the map to figure out where you are and to plot a route to get to wherever you want to be. You don’t give much thought to the knower, or knowers, whose knowledge went into the map. As far as you’re concerned, the medium is the knowledge.

Something very interesting has happened to the map recently, during the course of our own lives. When the medium of the map was transferred from paper to software, the map gained the ability to speak to us, to give us commands. With Google Maps or an in-dash GPS system, we no longer have to look at a map and plot out a route for ourselves; the map assumes that work. We become the actuators of the map’s instructions: the assistants who, on the software’s command, turn the wheel. You might even say that our role becomes that of a robotic apparatus controlled by the medium.

So, having earlier subsumed the knower, the map now begins to subsume the doer. The medium becomes the actor.

In the next and ultimate stage of this story, the map becomes the vehicle. The map does the driving.•

Tags:

It’s perplexing that the American school system (and, as far as I know, every other one) doesn’t employ video games as teaching tools, since they’re both satisfying and edifying and can allow students to pursue knowledge at a personalized pace. It’s a real lost opportunity to think learning can’t be vibrant and fun.

Beyond the classroom, Nicholas Carr wonders why software is created to pose no obstacles to us, not to challenge us but to replace us. He addresses this point, among others, in an excellent discussion with Tom Chatfield of BBC Future. An excerpt:

Should life be more like a video game?

Tom Chatfield:

I was glad to see that you use video games in the book as an example of human-machine interactions where the difficulty is the point rather than the problem. Successful games are like a form of rewarding work, and can offer the kind of complex, constant, meaningful feedback that we have evolved to find deeply satisfying. Yet there is also a bitter irony, for me, in the fact that the work some people do on a daily basis is far less skilled and enjoyable and rewarding.

Nicholas Carr:

Video games are very interesting because in their design they go against all of the prevailing assumptions about how you design software. They’re not about getting rid of friction, they’re not about making sure that the person using them doesn’t have to put in much effort or think that much. The reason we enjoy them is because they don’t make it easy for us. They constantly push us up against friction – not friction that simply frustrates us, but friction that leads to ever-higher levels of talent.

If you look at that and compare it to what we know about how people gain expertise, how we build talent, it’s very, very similar. We know that in order to gain talent you have to come up against hard challenges in which you exercise your skills to the utmost, over and over again, and slowly you gain a new level of skill, and then you are challenged again. 

And also I think, going even further, that the reason people enjoy videogames is the same reason that people enjoy building expertise and overcoming challenges. It’s really fundamentally enjoyable to be struggling with a hard challenge that we then ultimately overcome, and that gives us the talent necessary to tackle an even harder challenge.

One of the fundamental concerns of the book is the fear that we are creating a world based on the assumption that the less we have to engage in challenging tasks, the better. It seems to me that that is antithetical to everything we know about what makes us satisfied and fulfilled and happy.•

Tags: ,

I wish everyone writing about technology could turn out prose as sparkling and lucid as Nicholas Carr’s. In a New York Times opinion piece, he stresses that while people are flawed, so are computers, and our silicon counterparts thus far lack the dexterity we possess to react to the unforeseen. He suggests humans and machines permanently remain a team, allowing us to benefit from the best of both.

I think that’s the immediate future, but I still believe market forces will ultimately cede to robots anything they can do as well (or nearly as well) as humans. And I’m curious as to the effects of Deep Learning on the impromptu responses of machinery.

From Carr:

While our flaws loom large in our thoughts, we view computers as infallible. Their scripted consistency presents an ideal of perfection far removed from our own clumsiness. What we forget is that our machines are built by our own hands. When we transfer work to a machine, we don’t eliminate human agency and its potential for error. We transfer that agency into the machine’s workings, where it lies concealed until something goes awry.
 
Computers break down. They have bugs. They get hacked. And when let loose in the world, they face situations that their programmers didn’t prepare them for. They work perfectly until they don’t.
 
Many disasters blamed on human error actually involve chains of events that are initiated or aggravated by technological failures. Consider the 2009 crash of Air France Flight 447 as it flew from Rio de Janeiro to Paris. The plane’s airspeed sensors iced over. Without the velocity data, the autopilot couldn’t perform its calculations. It shut down, abruptly shifting control to the pilots. Investigators later found that the aviators appeared to be taken by surprise in a stressful situation and made mistakes. The plane, with 228 passengers, plunged into the Atlantic.

The crash was a tragic example of what scholars call the automation paradox. Software designed to eliminate human error sometimes makes human error more likely. When a computer takes over a job, the workers are left with little to do. Their attention drifts. Their skills, lacking exercise, atrophy. Then, when the computer fails, the humans flounder.

Tags:

In her NYRB piece on Nicholas Carr’s The Glass Cage, Sue Halpern runs through periods of the twentieth century when fears of technological unemployment were raised before receding, mentioning a 1980 Time cover story about the labor-destabilizing force of machines. These projections seemed to have been proved false as job creation increased considerably during the Reagan Administration, but as Halpern goes on to note, that feature article may have been prescient in ways we didn’t then understand. Income inequality began to boom during the last two decades of the previous century, a worrying trajectory that’s only been exacerbated as we’ve moved deeper into the Digital Revolution. Certainly there are other causes, but automation is likely among them, with the new wealth in the hands of fewer people, algorithms and robots managing a good portion of the windfall-creating toil. And if you happen to be working in one of the fields likely to soon be automated (hotels, restaurants, warehouses, etc.), you might want to ask some former travel agents and record-store owners for resume tips.

Halpern zeroes in on a Carr topic often elided by economists debating whether the next few decades will be boon or bane for the non-wealthy: the hole left in our hearts when we’re “freed” of work. Is that something common to us because we were born on the other side of the transformation, or are humans marked indelibly with the need to produce beyond tweets and likes? Maybe it’s the work, not the play, that’s the thing. From Halpern:

Here is what that future—which is to say now—looks like: banking, logistics, surgery, and medical recordkeeping are just a few of the occupations that have already been given over to machines. Manufacturing, which has long been hospitable to mechanization and automation, is becoming more so as the cost of industrial robots drops, especially in relation to the cost of human labor. According to a new study by the Boston Consulting Group, currently the expectation is that machines, which now account for 10 percent of all manufacturing tasks, are likely to perform about 25 percent of them by 2025. (To understand the economics of this transition, one need only consider the American automotive industry, where a human spot welder costs about $25 an hour and a robotic one costs $8. The robot is faster and more accurate, too.) The Boston group expects most of the growth in automation to be concentrated in transportation equipment, computer and electronic products, electrical equipment, and machinery.

Meanwhile, algorithms are writing most corporate reports, analyzing intelligence data for the NSA and CIA, reading mammograms, grading tests, and sniffing out plagiarism. Computers fly planes—Nicholas Carr points out that the average airline pilot is now at the helm of an airplane for about three minutes per flight—and they compose music and pick which pop songs should be recorded based on which chord progressions and riffs were hits in the past. Computers pursue drug development—a robot in the UK named Eve may have just found a new compound to treat malaria—and fill pharmacy vials.

Xerox uses computers—not people—to select which applicants to hire for its call centers. The retail giant Amazon “employs” 15,000 warehouse robots to pull items off the shelf and pack boxes. The self-driving car is being road-tested. A number of hotels are staffed by robotic desk clerks and cleaned by robotic chambermaids. Airports are instituting robotic valet parking. Cynthia Breazeal, the director of MIT’s personal robots group, raised $1 million in six days on the crowd-funding site Indiegogo, and then $25 million in venture capital funding, to bring Jibo, “the world’s first social robot,” to market. …

There is a certain school of thought, championed primarily by those such as Google’s Larry Page, who stand to make a lot of money from the ongoing digitization and automation of just about everything, that the elimination of jobs concurrent with a rise in productivity will lead to a leisure class freed from work. Leaving aside questions about how these lucky folks will house and feed themselves, the belief that most people would like nothing more than to be able to spend all day in their pajamas watching TV—which turns out to be what many “nonemployed” men do—sorely misconstrues the value of work, even work that might appear to an outsider to be less than fulfilling. Stated simply: work confers identity. When Dublin City University professor Michael Doherty surveyed Irish workers, including those who stocked grocery shelves and drove city buses, to find out if work continues to be “a significant locus of personal identity,” even at a time when employment itself is less secure, he concluded that “the findings of this research can be summed up in the succinct phrase: ‘work matters.’”

How much it matters may not be quantifiable, but in an essay in The New York Times, Dean Baker, the codirector of the Center for Economic and Policy Research, noted that there was

a 50 to 100 percent increase in death rates for older male workers in the years immediately following a job loss, if they previously had been consistently employed.

One reason was suggested in a study by Mihaly Csikszentmihalyi, the author of Flow: The Psychology of Optimal Experience (1990), who found, Carr reports, that “people were happier, felt more fulfilled by what they were doing, while they were at work than during their leisure hours.”
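The welder economics quoted in the excerpt can be sketched in a few lines. The $25 and $8 hourly rates come from the Boston Consulting Group figures cited above; the annual hours are my own hypothetical assumption.

```python
# Back-of-the-envelope comparison of human vs. robotic spot welders.
# Hourly rates are from the quoted BCG figures; hours/year is assumed.

HUMAN_RATE = 25.0  # dollars per hour
ROBOT_RATE = 8.0   # dollars per hour

def annual_savings(hours_per_year: float) -> float:
    """Savings from substituting one robotic welder for one human."""
    return (HUMAN_RATE - ROBOT_RATE) * hours_per_year

# Assuming a two-shift operation of roughly 4,000 hours per year:
print(annual_savings(4_000))  # → 68000.0, i.e. $68,000 per station per year
```

Even before counting the robot’s speed and accuracy advantages, the wage gap alone makes the substitution hard for a manufacturer to resist.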

Tags: , , , ,

Because of computerized autopilot systems and a greater understanding of wind shears, flying has never been safer than it is right now. Boarding a domestic carrier in the United States is a particularly low-risk means of travel. But increasingly automated aviation can cause human pilots to experience skill fade, something which has alarmed Nicholas Carr, and now Steve Casner of Slate is concerned about two-pilot cockpits being halved. My assumption is that if accidents remain the rare exception, the automation process will continue apace. An excerpt:

Now that we’ve gone from four pilots to two, and with more automation on the way, you don’t need to be a mind reader to know what the industry is thinking next. The aircraft manufacturer Embraer has already revealed plans for a single-pilot regional jet, and Cessna has produced several small single-pilot jets. (I’m rated to fly this one.) And as my colleagues at NASA are busy studying the feasibility of large single-pilot airliners, a Delta Air Lines pilot made it look easy a few weeks ago when the other pilot was accidentally locked out of the cockpit. But should we be a little nervous about the idea of having just one pilot up there in the front office? The research says maybe so.

Studies show that pilots make plenty of errors. That’s why we have two pilots in the airline cockpit—to construct a sort of human safety net. While one pilot operates the aircraft’s controls, the other pilot keeps watch for occasional errors and tries to point them out before they cause any harm. NASA engineer Everett Palmer likes to sum up the idea with a quip: “To err is human, to be error-tolerant is divine.” Keeping the error-maker and getting rid of the error-catcher may not prove to be very error-tolerant.

Besides, automation doesn’t eliminate human error—it just relocates it. The engineers and programmers who design automation are humans, too. They write complex software that contains bugs and nuances. Pilots often speak of automation surprises in which the computers do something unexpected, occasionally resulting in accidents. Having only one pilot in the cockpit might compromise our ability to make sense of these technological noodle-scratchers when they pop up.•

Tags: ,

The Penguin blog has a Nicholas Carr essay about modern navigation devices and the effect they have on the “maps” in our brains, “Welcome to Nowheresville,” which is adapted from his most recent book, The Glass Cage. Carr is one of those blessed thinkers I always enjoy reading whether I agree with him or not. I don’t necessarily share his concerns about how GPS is redefining what it is to be human (we’ve always been and always will be fluidly defined) or “skill fade” causing transportation fatalities (the net number of such deaths will likely decline as travel becomes more autonomous), but it’s certainly worth considering the unknown neurological consequences of offloading our piloting skills. Are we unwittingly creating a new mismatch disease? An excerpt:

A loss of navigational acumen can have dire consequences for airline pilots and lorry drivers. Most of us, in our daily routines of driving and walking and otherwise getting around, are unlikely to find ourselves in such perilous spots. Which raises the obvious question: Who cares? As long as we arrive at our destination, does it really matter whether we maintain our navigational sense or offload it to a machine? Those of us living in lands crisscrossed by well marked roads and furnished with gas stations, motels, and 7-Elevens long ago lost both the custom of and the capacity for prodigious feats of wayfinding. Our ability to perceive and interpret topography, especially in its natural state, is already much reduced. Paring it away further, or dispensing with it altogether, doesn’t seem like such a big deal, particularly if in exchange we get an easier go of it.

But while we may no longer have much of a cultural stake in the conservation of our navigational prowess, we still have a personal stake in it. We are, after all, creatures of the earth. We’re not abstract dots proceeding along thin blue lines on computer screens. We’re real beings in real bodies in real places. Getting to know a place takes effort, but it ends in fulfillment and in knowledge. It provides a sense of personal accomplishment and autonomy, and it also provides a sense of belonging, a feeling of being at home in a place rather than passing through it. …

The harder people work at building cognitive maps of space, the stronger their underlying memory circuits seem to become. They can actually grow grey matter in the hippocampus—a phenomenon documented in cab drivers—in a way that’s analogous to the building of muscle mass through physical exertion.

But when they simply follow turn-by-turn instructions in “a robotic fashion,” Bohbot warns, they don’t “stimulate their hippocampus” and as a result may leave themselves more susceptible to memory loss. Bohbot worries that, should the hippocampus begin to atrophy from a lack of use in navigation, the result could be a general loss of memory and a growing risk of dementia. “Society is geared in many ways toward shrinking the hippocampus,” she told an interviewer. “In the next twenty years, I think we’re going to see dementia occurring earlier and earlier.”•

Tags:

A debate (by proxy) between Nicholas Carr and Andrew McAfee, two leading thinkers about the spread of automation, takes place in Zoë Corbyn’s Guardian article about Carr’s most recent book, The Glass Cage. I doubt the proliferation of Weak AI will ultimately be contained much beyond niches, despite any good dissenting arguments. An excerpt:

As doctors increasingly follow automated diagnostic templates and architects use computer programs to generate their building plans, their jobs become duller. “At some point you turn people into computer operators – and that’s not a very interesting job,” Carr says. We now cede even moral choices to our technology, he says. The Roomba vacuum cleaning robot, for example, will unthinkingly hoover up a spider that we may have saved.

Not everyone buys Carr’s gloomy argument. People have always lamented the loss of skills due to technology: think about the calculator displacing the slide rule, says Andrew McAfee, a researcher at the MIT Sloan School of Management. But on balance, he says, the world is better off because of automation. There is the occasional high-profile crash – but greater automation, not less, is the answer to avoiding that.

Carr counters that we must start resisting the urge to increase automation unquestioningly. Reserving some tasks for humans will mean less efficiency, he acknowledges, but it will be worth it in the long run.•

Tags: , ,

Via Nicholas Carr’s blog, Rough Type, I came across “HAL, Mother, and Father,” Jason Z. Resnikoff’s Paris Review post about his father’s generation, who, in 1968, viewed Stanley Kubrick’s sci-fi future, even his rogue computer, with techno-optimism, a feeling that short-circuited within a decade. An excerpt:

2001 is the brainchild of Stanley Kubrick and Arthur C. Clarke, who intended the film as a vision of things that seemed destined to come. In large part this fact has been lost on more recent generations of viewers who regard the movie as almost entirely metaphorical. Not so. The film was supposed to describe events that were really about to happen—that’s why Kubrick and Clarke went to such lengths to make it realistic, dedicating months to researching the ins and outs of manned spaceflight. They were so successful that a report written in 2005 from NASA’s Scientific and Technical Information Program Office argues that 2001 is today still “perhaps the most thoroughly and accurately researched film in screen history with respect to aerospace engineering.” Kubrick shows the audience exactly how artificial gravity could be maintained in the endless free-fall of outer space; how long a message would take to reach Jupiter; how people would eat pureed carrots through a straw; how people would poop in zero G. Curious about extraterrestrial life, Kubrick consulted Carl Sagan (evidently an expert) and made changes to the script accordingly.

It’s especially ironic because anyone who sees the film today will be taken aback by how unrealistic it is. The U.S. is not waging the Cold War in outer space. We have no moon colonies, and our supercomputers are not nearly as super as the murderous HAL. Pan Am does not offer commercial flights into high-Earth orbit, not least because Pan-Am is no more. Based on the rate of inflation, a video-payphone call to a space station should, in theory, cost far more than $1.70, but that wouldn’t apply when the payphone is a thing of the past. More important, everything in 2001 looks new. From heavy capital to form-fitting turtlenecks—thank goodness, not the mass fashion phenomenon the film anticipated—it all looks like it was made yesterday. But despite all of that, when you see the movie today you see how 1968 wasn’t just about social and political reform; people thought they were about to evolve, to become something wholly new, a revolution at the deepest level of a person’s essence.•

Tags: ,

I love reading Nicholas Carr, so bright is he and such a blessedly lucid writer, though I don’t always find myself agreeing with him. I won’t blame him for the headline of his latest WSJ piece, “Automation Makes Us Dumb,” but I do take issue with his idea that we should be alarmed that AI is causing “skill fade” in airline pilots, making it dangerous to fly. It’s no less scary for a plane to crash by human hand than because of a computer failure (or some combined failure of the two). It’s bad regardless. But accidents on domestic airlines in America have become almost non-existent as the crafts have become more computerized and we’ve learned to navigate wind shears. That wouldn’t be the case without machines aiding planes, which are, you know, machines. I think Carr’s enthusiasm for “adaptive automation” makes sense, at least in the short and medium terms, though ultimately I favor whatever most often prevents plane noses from touching earth. From Carr:

“In the 1950s, a Harvard Business School professor named James Bright went into the field to study automation’s actual effects on a variety of industries, from heavy manufacturing to oil refining to bread baking. Factory conditions, he discovered, were anything but uplifting. More often than not, the new machines were leaving workers with drabber, less demanding jobs. An automated milling machine, for example, didn’t transform the metalworker into a more creative artisan; it turned him into a pusher of buttons.

Bright concluded that the overriding effect of automation was (in the jargon of labor economists) to ‘de-skill’ workers rather than to ‘up-skill’ them. ‘The lesson should be increasingly clear,’ he wrote in 1966. ‘Highly complex equipment’ did not require ‘skilled operators. The “skill” can be built into the machine.’

We are learning that lesson again today on a much broader scale. As software has become capable of analysis and decision-making, automation has leapt out of the factory and into the white-collar world. Computers are taking over the kinds of knowledge work long considered the preserve of well-educated, well-trained professionals: Pilots rely on computers to fly planes; doctors consult them in diagnosing ailments; architects use them to design buildings. Automation’s new wave is hitting just about everyone.”

Tags: ,

With computers so small they all but disappear, the infrastructure silently becoming more and more automated, what else will vanish from our lives and ourselves? I’m someone who loves the new normal of decentralized, free-flowing media, who thinks the gains are far greater than the losses, but it’s a question worth asking. Via Longreads, an excerpt from The Glass Cage, a new book by that Information Age designated mourner Nicholas Carr:

“There’s a big difference between a set of tools and an infrastructure. The Industrial Revolution gained its full force only after its operational assumptions were built into expansive systems and networks. The construction of the railroads in the middle of the nineteenth century enlarged the markets companies could serve, providing the impetus for mechanized mass production. The creation of the electric grid a few decades later opened the way for factory assembly lines and made all sorts of home appliances feasible and affordable. These new networks of transport and power, together with the telegraph, telephone, and broadcasting systems that arose alongside them, gave society a different character. They altered the way people thought about work, entertainment, travel, education, even the organization of communities and families. They transformed the pace and texture of life in ways that went well beyond what steam-powered factory machines had done.

The historian Thomas Hughes, in reviewing the arrival of the electric grid in his book Networks of Power, described how first the engineering culture, then the business culture, and finally the general culture shaped themselves to the new system. ‘Men and institutions developed characteristics that suited them to the characteristics of the technology,’ he wrote. ‘And the systematic interaction of men, ideas, and institutions, both technical and nontechnical, led to the development of a supersystem—a sociotechnical one—with mass movement and direction.’ It was at this point that what Hughes termed ‘technological momentum’ took hold, both for the power industry and for the modes of production and living it supported. ‘The universal system gathered a conservative momentum. Its growth generally was steady, and change became a diversification of function.’ Progress had found its groove.

We’ve reached a similar juncture in the history of automation. Society is adapting to the universal computing infrastructure—more quickly than it adapted to the electric grid—and a new status quo is taking shape. …

The science-fiction writer Arthur C. Clarke once asked, ‘Can the synthesis of man and machine ever be stable, or will the purely organic component become such a hindrance that it has to be discarded?’ In the business world at least, no stability in the division of work between human and computer seems in the offing. The prevailing methods of computerized communication and coordination pretty much ensure that the role of people will go on shrinking. We’ve designed a system that discards us. If unemployment worsens in the years ahead, it may be more a result of our new, subterranean infrastructure of automation than of any particular installation of robots in factories or software applications in offices. The robots and applications are the visible flora of automation’s deep, extensive, and invasive root system.”

Tags: ,

The robot-aided piloting of airplanes has been around longer than many people may realize. And soon it will be in cars as well. For the most part, that’s a great thing. Plane crashes in U.S. commercial airliners aren’t exactly a thing of the past, but almost, as the autonomous function combined with knowledge of wind shears has reduced dangers markedly. Roboplanes also wrested the controls from often-autocratic lead pilots, whose refusal to listen to dissent led to many air crashes.

But there’s a new peril attendant to autonomous steering: As Nicholas Carr outlined last year, pilots are no longer as practiced should a technological glitch happen (and they will occur, if rarely).

The question is: Since overall safety in aviation has greatly improved, how concerned should we be about technology causing some human pilot skills to atrophy? The same question can be applied to robocars and drivers going forward.

From “The Human Factor,” William Langewiesche’s Vanity Fair article about the safety measures that can occasionally make flying unsafe:

“These are generally known as ‘fourth generation’ airplanes; they now constitute nearly half the global fleet. Since their introduction, the accident rate has plummeted to such a degree that some investigators at the National Transportation Safety Board have recently retired early for lack of activity in the field. There is simply no arguing with the success of the automation. The designers behind it are among the greatest unheralded heroes of our time. Still, accidents continue to happen, and many of them are now caused by confusion in the interface between the pilot and a semi-robotic machine. Specialists have sounded the warnings about this for years: automation complexity comes with side effects that are often unintended. One of the cautionary voices was that of a beloved engineer named Earl Wiener, recently deceased, who taught at the University of Miami. Wiener is known for ‘Wiener’s Laws,’ a short list that he wrote in the 1980s. Among them:

  • Every device creates its own opportunity for human error.
  • Exotic devices create exotic problems.
  • Digital devices tune out small errors while creating opportunities for large errors.
  • Invention is the mother of necessity.
  • Some problems have no solution.
  • It takes an airplane to bring out the worst in a pilot.
  • Whenever you solve a problem, you usually create one. You can only hope that the one you created is less critical than the one you eliminated.
  • You can never be too rich or too thin (Duchess of Windsor) or too careful about what you put into a digital flight-guidance system (Wiener).

Wiener pointed out that the effect of automation is to reduce the cockpit workload when the workload is low and to increase it when the workload is high. Nadine Sarter, an industrial engineer at the University of Michigan, and one of the pre-eminent researchers in the field, made the same point to me in a different way: ‘Look, as automation level goes up, the help provided goes up, workload is lowered, and all the expected benefits are achieved. But then if the automation in some way fails, there is a significant price to pay. We need to think about whether there is a level where you get considerable benefits from the automation but if something goes wrong the pilot can still handle it.’

Sarter has been questioning this for years and recently participated in a major F.A.A. study of automation usage, released in the fall of 2013, that came to similar conclusions. The problem is that beneath the surface simplicity of glass cockpits, and the ease of fly-by-wire control, the designs are in fact bewilderingly baroque—all the more so because most functions lie beyond view. Pilots can get confused to an extent they never would have in more basic airplanes. When I mentioned the inherent complexity to Delmar Fadden, a former chief of cockpit technology at Boeing, he emphatically denied that it posed a problem, as did the engineers I spoke to at Airbus. Airplane manufacturers cannot admit to serious issues with their machines, because of the liability involved, but I did not doubt their sincerity. Fadden did say that once capabilities are added to an aircraft system, particularly to the flight-management computer, because of certification requirements they become impossibly expensive to remove. And yes, if neither removed nor used, they lurk in the depths unseen. But that was as far as he would go.”

Tags: , , , ,

« Older entries