Ah, to be a fly on the wall in the White House in the aftermath of 9/11, once President Bush finally set down his copy of The Pet Goat and returned to the business at hand. If Al-Qaeda’s destruction of the World Trade Center was merely Step 1 of a plan to damage America, the scheme was ultimately realized on a grand scale: our responses to the attacks, for the better part of a decade, did more harm to us than the attacks themselves. Of course, in retrospect, there were potential reactions with even more far-reaching implications that went unrealized.

In a Spiegel Q&A, René Pfister and Gordon Repinski ask longtime German diplomat Michael Steiner about an alternative history that might have unfolded in the wake of September 11. An excerpt:

Spiegel:

The attacks in the United States on Sept. 11, 2001 came during your stint as Chancellor Gerhard Schröder’s foreign policy advisor. Do you remember that day?

Michael Steiner:

Of course, just like everybody, probably. Schröder was actually supposed to deliver a speech that day at the German Council on Foreign Relations in Berlin. My people had prepared a nice text for him, but when he was supposed to head out, he — like all of us — couldn’t wrest himself away from the TV images of the burning Twin Towers. Schröder said: “Michael, you go there and explain to the people that I can’t come today.”

Spiegel:

What was it like in the days following the attacks?

Michael Steiner:

Condoleezza Rice was George W. Bush’s security advisor at the time. I actually had quite a good relationship with her. But after Sept. 11, the entire administration positively dug in. We no longer had access to Rice, much less to the president. It wasn’t just our experience, but also that of the French and British as well. Of course that made us enormously worried.

Spiegel:

Why?

Michael Steiner:

Because we thought that the Americans would overreact in response to the initial shock. For the US, it was a shocking experience to be attacked on its own soil.

Spiegel:

What do you mean, overreact? Were you afraid that Bush would attack Afghanistan with nuclear weapons?

Michael Steiner:

The Americans said at the time that all options were on the table. When I visited Condoleezza Rice in the White House a few days later, I realized that it was more than just a figure of speech.

Spiegel:

The Americans had developed concrete plans for the use of nuclear weapons in Afghanistan?

Michael Steiner:

They really had thought through all scenarios. The papers had been written.•


The future seldom arrives in a hurry, which is usually a good thing from a practical standpoint. Today and tomorrow don’t always mix so well.

In an opinion piece at The Conversation, David Glance of the University of Western Australia argues that fears of near-term technological unemployment are overstated. He may be right in the big picture, but if just one significant area is realized in short order, defying business-as-usual stasis (driverless cars are the most obvious example), a large swath of Labor will be blown sideways.

From Glance:

The trouble with predicting the future is that the more dramatic the prediction the more likely the media will pick it up and amplify it in the social media-fed echo chamber. What is far less likely to be reported are the predictions that emphasise that it is unlikely that things will change that radically because of the massive inertia that is built into industry, governments and the general workers’ appetite for change.

Economists at the OECD may have another explanation for why it is unwise to equate the fact that something “could” be done with the fact that it “will” be done. In a report on the future of productivity, the authors detail how only a small number of “frontier companies” have managed to implement changes to achieve high levels of productivity growth. The companies that haven’t achieved anywhere near the same productivity growth are the “non-frontier companies” or simply “laggards.” The reasons for this are probably many but lack of leadership, vision, skills or ability may factor into it.

The point is that since 2000 many companies didn’t adopt technology and change their business processes to see improvements in productivity even though they clearly “could” have done.•


From the July 8, 1889 Brooklyn Daily Eagle:


Some people don’t know how to accept a gift. America has many such people in its government, as apparently do numerous other developed nations.

One of the few upsides to the colossal downside of the 2008 economic collapse is the rock-bottom interest rates that offer countries the opportunity to rebuild their infrastructure at virtually no added cost. It’s a tremendous immediate stimulus that also pays long-term dividends. But deficit hawks have made it impossible for President Obama to take advantage of this rare and relatively short-lived opportunity. While some of it is certainly partisanship, it does seem that a large number of elected officials have pretty much no grasp of basic economics.

From the Economist:

IT IS hard to exaggerate the decrepitude of infrastructure in much of the rich world. One in three railway bridges in Germany is over 100 years old, as are half of London’s water mains. In America the average bridge is 42 years old and the average dam 52. The American Society of Civil Engineers rates around 14,000 of the country’s dams as “high hazard” and 151,238 of its bridges as “deficient”. This crumbling infrastructure is both dangerous and expensive: traffic jams on urban highways cost America over $100 billion in wasted time and fuel each year; congestion at airports costs $22 billion and another $150 billion is lost to power outages.

The B20, the business arm of the G20, a club of big economies, estimates that the global backlog of spending needed to bring infrastructure up to scratch will reach $15 trillion-20 trillion by 2030. McKinsey, a consultancy, reckons that in 2007-12 investment in infrastructure in rich countries was about 2.5% of GDP a year when it should have been 3.5%. If anything, the problem is becoming more acute as some governments whose finances have been racked by the crisis cut back. In 2013 in the euro zone, general government investment—of which infrastructure constitutes a large part—was around 15% below its pre-crisis peak of €3 trillion ($4 trillion), according to the European Commission, with drops as high as 25% in Italy, 39% in Ireland and 64% in Greece. In the same year government spending on infrastructure in America, at 1.7% of GDP, was at a 20-year low.

This is a missed opportunity. Over the past six years, the cost of repairing old infrastructure or building new projects has been much cheaper than normal, thanks both to rock-bottom interest rates and ample spare capacity in the construction industry.•

Sad to hear of the passing of Dr. Oliver Sacks, the neurologist and writer, who made clear in his case studies that the human brain, a friend and a stranger, was as surprising as any terrain we could ever explore. It feels like we’ve not only lost a great person, but one who was uniquely so. He became hugely famous with the publication of his 1985 collection, The Man Who Mistook His Wife For A Hat, which built upon the template of A.R. Luria’s work with better writing and a wider array of investigations. Two years prior, he published an essay in the London Review of Books that became the title piece. An excerpt:

I stilled my disquiet, his perhaps too, in the soothing routine of a neurological exam – muscle strength, co-ordination, reflexes, tone. It was while examining his reflexes – a trifle abnormal on the left side – that the first bizarre experience occurred. I had taken off his left shoe and scratched the sole of his foot with a key – a frivolous-seeming but essential test of a reflex – and then, excusing myself to screw my ophthalmoscope together, left him to put on the shoe himself. To my surprise, a minute later, he had not done this.

‘Can I help?’ I asked.

‘Help what? Help whom?’

‘Help you put on your shoe.’

‘Ach,’ he said, ‘I had forgotten the shoe,’ adding, sotto voce: ‘The shoe! The shoe?’ He seemed baffled.

‘Your shoe,’ I repeated. ‘Perhaps you’d put it on.’

He continued to look downwards, though not at the shoe, with an intense but misplaced concentration. Finally his gaze settled on his foot: ‘That is my shoe, yes?’

Did I mishear? Did he mis-see? ‘My eyes,’ he explained, and put a hand to his foot. ‘This is my shoe, no?’

‘No, it is not. That is your foot. There is your shoe.’

‘Ah! I thought that was my foot.’

Was he joking? Was he mad? Was he blind? If this was one of his ‘strange mistakes’, it was the strangest mistake I had ever come across.

I helped him on with his shoe (his foot), to avoid further complication. Dr P. himself seemed untroubled, indifferent, maybe amused. I resumed my examination. His visual acuity was good: he had no difficulty seeing a pin on the floor, though sometimes he missed it if it was placed to his left.

He saw all right, but what did he see? I opened out a copy of the National Geographic Magazine, and asked him to describe some pictures in it. His eyes darted from one thing to another, picking up tiny features, as he had picked up the pin. A brightness, a colour, a shape would arrest his attention and elicit comment, but it was always details that he saw – never the whole. And these details he ‘spotted’, as one might spot blips on a radar-screen. He had no sense of a landscape or a scene.

I showed him the cover, an unbroken expanse of Sahara dunes.

‘What do you see here?’ I asked.

‘I see a river,’ he said. ‘And a little guesthouse with its terrace on the water. People are dining out on the terrace. I see coloured parasols here and there.’ He was looking, if it was ‘looking’, right off the cover, into mid-air, and confabulating non-existent features, as if the absence of features in the actual picture had driven him to imagine the river and the terrace and the coloured parasols.

I must have looked aghast, but he seemed to think he had done rather well. There was a hint of a smile on his face. He also appeared to have decided the examination was over, and started to look round for his hat. He reached out his hand, and took hold of his wife’s head, tried to lift it off, to put it on. He had apparently mistaken his wife for a hat!•


Long before Caitlyn Jenner, there was Christine Jorgensen, a Bronx military veteran who traveled to Denmark in the early 1950s to transition surgically into a woman. It was, as you might expect, a huge sensation at the time, but Jorgensen was always above the fray, whether guesting on ur-shock jock Joe Pyne’s gleefully tasteless talk show in 1966, or visiting with Tom Snyder in 1982, as she revived her cabaret act.

Life is full of inconvenient truths, and one of them is that Theodor Geisel, better known as Dr. Seuss, the wonderful storyteller who continues to teach children to read and think, was responsible for some shockingly racist drawings and ad campaigns early in his career. In 1958, he appeared on To Tell the Truth, just as The Cat in the Hat, his most popular work, was becoming a huge bestseller.

Col. Harland Sanders was 62 when, as the story goes, he used his first Social Security check to found his bird-slaughter enterprise, Kentucky Fried Chicken. He was 74 in 1964 when he sold the business for $2 million. Sanders appeared directly after the sale on I’ve Got a Secret.


Before Jerry Bruckheimer was one of the world’s most successful film and TV producers, he and his partner Don Simpson were 1980s Hollywood wunderkinds, matching high energy to pop music in a handful of brash blockbuster vehicles. The most successful of them was probably Top Gun, a muscular ode to Reagan Era militarization.

But even by the lax standards of Hollywood, Simpson was a huge mess, addicted to drugs, plastic surgery, prostitutes and S&M. Bruckheimer dragged his feet on dissolving the partnership, but he knew he needed to distance himself from his toxic collaborator. As a 1996 Wall Street Journal report by Thomas R. King and John Lippman put it in the wake of Simpson’s death, with 21 different drugs found in his system, the end came like this: “For Jerry Bruckheimer, the last straw was the dead doctor in the pool house.”

Another excerpt from that same WSJ piece:

Surgery and Diets

The hits, however, seemed to dry up. The producers, close friends say, were still reeling from the disappointment of Days of Thunder and were struggling to figure out how their formula went wrong.

Friends noticed that Mr. Simpson, who had a weight problem and a penchant for yo-yo dieting, seemed increasingly determined to reinvent himself. He underwent a series of plastic-surgery operations; one friend says that among the procedures he had were a chin implant, several face lifts, and placenta injections. He began disappearing for months at a time, telling friends he was at Canyon Ranch, where most visitors stay only a few days. And he began talking about finding new projects in which he could appear as an actor.

At night, he led a life that a number of people close to him thought was growing increasingly dangerous. He had always been known for his appetite for prostitutes; he was close friends with Hollywood madam Heidi Fleiss.

But Mr. Simpson was going beyond sex, sinking deeper into increasingly sadomasochistic and destructive behavior, say people who know him. His reputation was such that he is the subject of an entire chapter — titled “Don Simpson: An Education in Pain” — in a salacious new book penned by four Hollywood prostitutes. The book, You’ll Never Make Love in This Town Again, says his “serious bondage games were like something out of Marquis de Sade.”

A Prostitute and Kierkegaard

James Toback, a screenwriter who may have been the last person to talk with Mr. Simpson before his death, says his friend would frequently regale him with stories of his exploits with women.

“I know that he was obsessed with women, but it was not just sexual — it was psychological,” says Mr. Toback, a screenwriter of movies such as Bugsy. “He was never just interested in [having sex] with a girl. Even if it was a call girl, it was to get into some kind of serious philosophical discussion with her. He wanted to know what she read, what her parents were like, why she did what she did.” Mr. Toback tells of one conversation with Mr. Simpson: “He said he had met this girl, that she was fascinating and that her favorite philosopher was Kierkegaard.”

Mr. Toback says that he never saw Mr. Simpson take drugs. “But I had the feeling in many of our conversations, the last one included, that he was hyper and speeded up at the beginning,” Mr. Toback says. “But in the last hour, he’d been drinking a lot of red wine and he would wind down.”•


Wernher von Braun wasn’t worried about helping to murder millions of people, but he was concerned about the solitude of astronauts during space travel. Odd priorities.

The philosophical spelunker Michel Siffre went so far as to embed himself in caves and icebergs for months at a time in the 1960s and 1970s to understand prolonged isolation. Time stopped having meaning for him. The pristine terrain he ultimately explored was inside his own head.

It’s perplexing in this age of robotics that extended space trips to Mars and the like need to include humans at all. Robot-only missions are far cheaper and can collect the same information. While colonization is the ultimate goal, it needn’t be the immediate one.

But we’re likely going up sooner rather than later, since peopled space flights are an easier sell. They flatter us, remind us of ourselves. Therefore, the loneliness of the long-distance “runner” is a complicated problem for NASA and private programs. The longest such experiment testing human endurance in seclusion has just begun.

From the BBC:

A team of NASA recruits has begun living in a dome near a barren volcano in Hawaii to simulate what life would be like on Mars.

The isolation experience, which will last a year starting on Friday, will be the longest of its type attempted.

Experts estimate that a human mission to the Red Planet could take between one and three years.

The six-strong team will live in close quarters under the dome, without fresh air, fresh food or privacy.

They closed themselves away at 15:00 local time on Friday (01:00 GMT Saturday).

A journey outside the dome – which measures only 36ft (11m) in diameter and is 20ft (6m) tall – will require a spacesuit.

A French astrobiologist, a German physicist and four Americans – a pilot, an architect, a journalist and a soil scientist – make up the NASA team.

The men and women will each have a small sleeping cot and a desk inside their rooms. Provisions include powdered cheese and canned tuna.•


Back when people were impressed by those who possessed lots of fairly useless facts, I was always good at trivia, and it never once made me feel smart or satisfied. Because it was just a parlor trick, really. Read a lot and in an irregular pattern and you too can be crammed with minutiae. Now that everyone can look up every last thing on their phones in just seconds, all of life has become an open-book test. Trivial knowledge is (thankfully) no longer valued.

From Douglas Coupland’s FT column about his participation in a Trivia Night contest:

The larger question for me during the trivia contest evening was, “Wait — we used to have all of this stuff stored in our heads but now, it would appear, we don’t. What happened?” The answer is that all of this crap is still inside our heads — in fact, there’s probably more crap than ever inside our heads — it’s just that we view it differently now. It’s been reclassified. It’s not trivia any more: it’s called the internet and it lives, at least for the foreseeable future, outside of us. The other thing that happened during the trivia contest is the realisation that we once had a thing called a-larger-attention-span-than-the-one-we-now-have. Combine these two factors together and we have a reasonably good reason to explain why a game of trivia in 2015 almost feels like torture. I sat there with four other reasonably bright people, not necessarily knowing the answers to all of the questions, but knowing that the answers, no matter how obtuse, could be had in a few seconds without judgment on my iPhone 6 Plus. But then I decided the evening was also a good reminder of how far things have come since the early 1980s heyday of the board game Trivial Pursuit.

Q: What country is north, east, south and west of Finland?

A: Norway.

Q: Clean, Jerk and Snatch are terms used in which sport?

A: Weightlifting.

Q: Why was trivia such a big thing in the late 20th century?

A: Because society was generating far more information than it was generating systems with which to access that information. People were left with constellations of disconnected, randomly stored facts that could leave one feeling overwhelmed. Trivia games flattered 20th-century trivia players by making them feel that there was both value to having billions of facts in one’s head, and that they were actually easily retrieved. But here in 2015 we know that facts are simply facts. We know where they’re stored and we know how to access them. If anything, we’re a bit ungrateful, given that we know the answer to just about everything.•


 

10 search-engine keyphrases bringing traffic to Afflictor this week:

  1. howard carter finding king tut
  2. edward o. thorp on gambling
  3. diarrhea in a spaghetti pot
  4. woman swallows lizard
  5. fran lebowitz recent comments los angeles
  6. larry flynt and terry southern
  7. america’s very first freak show
  8. hugh hefner paul snider
  9. frank gifford fred exley
  10. what would aleksandr solzhenitsyn have thought of putin?

This week, President George W. Bush, who watched indifferently as New Orleans sank, returned to finish the job with a rain dance.

 

  • Evan Osnos explores the meaning of Trump’s early support.
  • Joseph Stiglitz offers a straightforward prescription for wealth inequality.
  • Biomimetics has progressed remarkably in the last decade.
  • Forrester Reports offers a relatively sanguine take on automation.
  • Julian Baggini explains why ISIS attacks on antiquities are so troubling.
  • Steve Ross rose from the funeral biz to the head of Warner Communications.
  • A brief note from 1891 about show biz.

It’s logical if not desirable that war becomes more automated, since it only takes one nation pursuing the dream of a robot army to detonate a new arms race. I’ve thought more about weapons systems discrete from human beings than I have about enhanced soldiers, but the U.S. Army Research Laboratory has already given great consideration to the latter. The recent report “Visualizing the Tactical Ground Battlefield in the Year 2050” imagines fewer of us going into battle, but those who do being “super humans” augmented by exoskeletons, implants and internal sensors. It certainly ranges into what currently would be considered sci-fi territory.

From Patrick Tucker at Defense One:

People, too, will be getting a technological upgrade. “The battlefield of the future will be populated by fewer humans, but these humans would be physically and mentally augmented with enhanced capabilities that improve their ability to sense their environment, make sense of their environment, and interact with one another, as well as with ‘unenhanced humans,’ automated processes, and machines of various kinds,” says the report.

What exactly constitutes an enhanced human is a matter of technical dispute. After all, night-vision goggles represent a type of enhancement, as does armor. The military has no problem discussing future plans in those areas, but what the workshop participants anticipate goes well beyond flak jackets and gear. …

The report envisions enhancement taking several robotic steps forward. “To enable humans to partner effectively with robots, human team members will be enhanced in a variety of ways. These super humans will feature exoskeletons, possess a variety of implants, and have seamless access to sensing and cognitive enhancements. They may also be the result of genetic engineering. The net result is that they will have enhanced physical capabilities, senses, and cognitive powers. The presence of super humans on the battlefield in the 2050 timeframe is highly likely because the various components needed to enable this development already exist and are undergoing rapid evolution,” says the report.•


Attempting to reverse aging–even defeat death–seems like science-fiction to most, but it’s just science to big-picture gerontologist Aubrey de Grey, who considers himself a practical person. Given enough time it certainly makes sense that radical life-extension will be realized, but the researcher is betting the march toward a-mortality will begin much sooner than expected. It frustrates him to no end that governments and individuals alike usually don’t accept death as a sickness to be treated. Some of those feelings boiled over when he was interviewed by The Insight. An excerpt:

The Insight:

I’m interested in the psychology of people, I guess you can put them into two camps: one doesn’t have an inherent understanding of what you’re doing or saying, and the other camp willingly resign themselves to living a relatively short life.

You’ve talked to a whole wealth of people and come across many counter-opinions, have any of them had any merit to you, have any of them made you take a step back and question your approach?

Aubrey de Grey:

Really, no. It’s quite depressing. At first, really, I was my own only effective critic for the feasibility – certainly never a case or example of an opinion that amounted to a good argument against the desirability of any of this work; that was always 100% clear to me, that it would be crazy to consider this to be a bad idea. It was just a question of how to go about it. All of the stupid things that people say, like, “Where would we put all the people?” or, “How would we pay the pensions?” or, “Is it only for the rich?” or, “Won’t dictators live forever?” and so on, all of these things… it’s just painful. Especially since most of these things have been perfectly well answered by other people well before I even came along. So, it’s extraordinarily frustrating that people are so wedded to the process of putting this out of their minds, by however embarrassing their means; coming up with the most pathetic arguments, immediately switching their brains off before realising their arguments might indeed be pathetic.

The Insight:

It might be a very obvious question, but it just sprung to mind – maybe you’ve been asked this before, it’s extremely philosophical and speculative – what do you think happens when you die?

Aubrey de Grey:

Oh, fuck off. I don’t give a damn. I’m a practical kind of guy – I’m not intending to be that experiment.•


The main difference between rich people and poor people is that rich people have more money. 

That’s it, really. Those with wealth are just as likely to form addictions, get divorces and engage in behaviors we deem responsible for poverty. They simply have more resources to fall back on. People without that cushion often land violently, land on the streets. Perhaps they should be extra careful since they’re in a more precarious position, but human beings are human beings: flawed. 

In the same ridiculously simple sense, homeless people are in that condition because they don’t have homes. A lot of actions and circumstances may have contributed to that situation, but the home part is the piece of the equation we can actually change. The Housing First initiative has proven thus far that it’s good policy to simply provide homes to people who have none. It makes sense in both human and economic terms. But it’s unpopular in the U.S. because it falls under the “free lunch” rubric, despite having its roots in the second Bush Administration. Further complicating matters is the shortage of urban housing in general.

In a smart Aeon essay, Susie Cagle looks at the movement, which has notably taken root in the conservative bastion of Utah, a state that has reduced homelessness by more than 90% in just ten years. An excerpt:

A new optimistic ideology has taken hold in a few US cities – a philosophy that seeks not just to directly address homelessness, but to solve it. During the past quarter-century, the so-called Housing First doctrine has trickled up from social workers to academics and finally to government. And it is working. On the whole, homelessness is finally trending down.

The Housing First philosophy was first piloted in Los Angeles in 1988 by the social worker Tanya Tull, and later tested and codified by the psychiatrist Sam Tsemberis of New York University. It is predicated on a radical and deeply un-American notion that housing is a right. Instead of first demanding that they get jobs and enroll in treatment programmes, or that they live in a shelter before they can apply for their own apartments, government and aid groups simply give the homeless homes.

Homelessness has always been more a crisis of empathy and imagination than one of sheer economics. Governments spend millions each year on shelters, health care and other forms of triage for the homeless, but simply giving people homes turns out to be far cheaper, according to research from the University of Washington in 2009. Preventing a fire always requires less water than extinguishing it once it’s burning.

By all accounts, Housing First is an unusually good policy. It is economical and achievable.•


The square-jawed hero astronauts of 1960s NASA went through marriages at a pretty ferocious clip, as you might expect from careerist monomaniacs, but none has had a more colorful, complicated life than Buzz Aldrin, who successfully walked on the moon but failed at selling used cars after he fell to Earth with a thud. Dr. Aldrin, as he prefers to be called, is now spearheading a plan to build Mars colonies. 

From Marcia Dunn at the AP:

MELBOURNE, Fla. (AP) — Buzz Aldrin is teaming up with Florida Institute of Technology to develop “a master plan” for colonizing Mars within 25 years.

The second man to walk on the moon took part in a signing ceremony Thursday at the university, less than an hour’s drive from NASA’s Kennedy Space Center. The Buzz Aldrin Space Institute is set to open this fall.

The 85-year-old Aldrin, who followed Neil Armstrong onto the moon’s surface on July 20, 1969, will serve as a research professor of aeronautics as well as a senior faculty adviser for the institute.

He said he hopes his “master plan” is accepted by NASA and the country, with international input. NASA already is working on the spacecraft and rockets to get astronauts to Mars by the mid-2030s.

Aldrin is pushing for a Mars settlement by approximately 2040. More specifically, he’s shooting for 2039, the 70th anniversary of his own Apollo 11 moon landing, although he admits the schedule is “adjustable.”

He envisions using Mars’ moons, Phobos and Deimos, as preliminary stepping stones for astronauts. He said he dislikes the label “one-way” and imagines tours of duty lasting 10 years.•


From the February 10, 1910 Brooklyn Daily Eagle:


There are many reasons, some more valid than others, that people are wary of so-called free Internet services like Facebook and Google, those companies of great utility which make money not through direct fees but by collecting our information and encouraging us to create content we’re not paid for.

Foremost, there are fears about surveillance, which I think are very valid. Hacks have already demonstrated how porous the world is now and what’s to come. More worrisome, beyond the work of rogue agents it’s clear the companies themselves cooperated in myriad ways with the NSA in handing over intel, some of which may have been necessary and most of which is troubling. Larry Page has said that we should trust the “good companies” with our information, but we shouldn’t trust any of them. Of course, there’s almost no alternative but to allow them into our lives and play by their rules.

Of course, the government isn’t alone in desiring to learn more about us. Advertisers certainly want to and long have, but there’s never before been this level of accessibility, this collective brain to be picked. These companies aren’t just looking to peddle information but also procure it. Their very existence depends on coming up with better and subtler ways of quantifying us.

I think another reason these businesses give pause isn’t because of something they actually do but what they remind us of: our anxieties about the near-term future of Labor. By getting us to “work” for nothing and create content, they tell us that even information positions have been reduced, discounted. The dwindling of good-paying jobs, the Gig Economy and the fall of the middle class all seem to be encapsulated in this new arrangement. When a non-profit like Wikipedia does it, it can be destabilizing but doesn’t seem sinister. The same can’t be said for Silicon Valley giants.

In his latest Financial Times blog post, Andrew McAfee makes an argument in favor of these zero-cost services, which no doubt offer value, though I believe he gives short shrift to privacy concerns.

An excerpt:

Web ads are much more precisely targeted at me because Google and Facebook have a lot of information about me. This thrills advertisers, and it’s also OK with me; once in a while I actually see something interesting. Yes, we are “the product” in ad-supported businesses. Only the smallest children are unaware of this.

The hypothetical version of the we’re-being-scammed argument is that the giant tech companies are doing or planning something opaque and sinister with all that data that we’re giving them. As law professor Tim Wu wrote recently about Facebook: “[T]he data is also an asset. The two-hundred-and-seventy-billion-dollar valuation of Facebook, which made a profit of three billion dollars last year, is based on some faith that piling up all of that data has value in and of itself… One reason Mark Zuckerberg is so rich is that the stock market assumes that, at some point, he’ll figure out a new way to extract profit from all the data he’s accumulated about us.”

It’s true that all the information about me and my social network that these companies have could be used to help insurers and credit-card companies pick customers and price discriminate among them. But they already do that, and do it within the confines of a lot of regulation and consumer protection. I’m just not sure how much “worse” it would get if Google, Facebook and others started piping them our data.•

 


Impresario is what they used to call those like Steve Ross of Warner Communications, whose mania for mergers allowed him a hand in a large number of media and entertainment ventures, making him boss and handler at different times to the Rolling Stones, Pele and Dustin Hoffman. One of those businesses the erstwhile funeral-parlor entrepreneur became involved with was Qube, an interactive cable-TV project that was a harbinger if a money-loser. That enterprise and many others are covered in a brief 1977 People profile. The opening:

In our times, the courtships and marriages that make the earth tremble are no longer romantic but corporate. The most legendary (or lurid) figures are not the Casanovas today. They are the conglomerateurs, and for sheer seismic impact on the popular culture, none approaches Steven J. Ross, 50, the former slacks salesman who married into a mortuary chain business that he parlayed 17 years later into Warner Communications Inc. (WCI). In founder-chairman Ross’s multitentacled clutch are perhaps the world’s predominant record division (with artists like the Eagles, Fleetwood Mac, the Rolling Stones, Led Zeppelin and Joni Mitchell); one of the Big Three movie studios (its hot fall releases include Oh, God! and The Goodbye Girl); a publishing operation (the paperback version of All the President’s Men, which was also a Warner Bros. film); the Atari line of video games like Pong, which inadvertently competes with Warner’s own TV producing arm, whose credits include Roots, no less. The conglomerate is furthermore not without free-enterprising social consciousness (WCI put up $1 million and owns 25 percent of Ms. magazine) or a redeeming sense of humor (it disseminates Mad).

Warner’s latest venturesome effort is bringing the blue-sky of two-way cable TV down to earth in a limited experiment in Columbus, Ohio. There, subscribers are able to talk back to their TV sets (choosing the movie they want to see or kibitzing the quarterback on his third-down call). An earlier Ross vision—an estimated $4.5 million investment in Pelé by Warner’s New York Cosmos—was, arguably, responsible for soccer’s belated breakthrough in the U.S. this year after decades of spectator indifference. Steve is obviously in a high-rolling business—Warners’ estimated annual gross is approaching a billion—and so the boss is taking his. Financial writer Dan Dorfman pegs Ross’s personal ’77 earnings at up to $5 million. That counts executive bonuses but not corporate indulgences. On a recent official trip to Europe in the Warner jet, Steve brought along his own barber for the ride.

En route to that altitude back in the days of his in-laws’ funeral parlor operation, Ross expanded into auto rentals (because he observed that their limos were unprofitably idle at night) and then into Kinney parking lots. “The funeral business is a great training ground because it teaches you service,” he notes, though adding: “It takes as much time to talk a small deal as a big deal.” So, come the ’70s, Ross dealt away the mortuary for the more glamorous show world. Alas, too, he separated from his wife.•


“Mission Accomplished,” the phrase that hung like a noose behind our most unwitting President, George W. Bush, in 2003, as he prematurely announced “victory” in Iraq after invading the wrong country for no good reason, may be the one most associated with the gross incompetence of his Administration, but “Brownie, you’re doing a heckuva job” is just as damning and telling.

That’s what Dubya uttered in support of FEMA administrator Michael D. Brown as New Orleans sank not only in the waters of Hurricane Katrina but also in federal fecklessness and neglect. All while the President fiddled–or, more accurately, strummed.

Historian Douglas Brinkley believes it was the latter tragedy that was Bush’s greatest undoing. An excerpt from his writing in Vanity Fair:

What a weird moment in U.S. presidential history.

Hurricane Katrina, a Category 3 storm, had smashed into the Gulf South. People were drowning. And the president of the United States played guitar in San Diego, egged on by country singer Mark Wills.

Even George W. Bush’s most stalwart supporters cringed at his disconnect from reality. Bush, like Michael Jackson in his days at Neverland Ranch, was living in a bubble. By contrast, when Hurricane Betsy had struck the Louisiana coast in 1965, President Lyndon B. Johnson had immediately flown to New Orleans to see the flood zone firsthand. The difference was glaring. Bush was, quite simply—as Coast Guard first-responder Jimmy Duckworth phrased it—“out of the game.”

On the 10th anniversary of Katrina, with the advantage of hindsight, it’s clear that Bush’s lack of leadership in late summer of 2005 cost his presidency mightily. Unlike Ronald Reagan, after the Challenger explosion, or Bill Clinton, after the Oklahoma City bombing, Bush had failed to feel the profound implications of the moment as his predecessors had. He didn’t scramble into action. He didn’t touch the nation’s heartstrings by using epic oratory to inform the disaster. What we got, instead, were guitar chords and terse speeches void of human pathos. No matter how the Bush library in Dallas tries to spin Bush’s Katrina performance, we all know he deserved an F in crisis management.•


If driverless cars improve markedly and all vehicles become autonomous, accidents and fatalities will likely decline steeply. But the shift to robocars will be gradual, and highways and streets will long carry a mix of humans and computers at the wheel. How will those two forces learn to share the road? It’ll take time and research.

From Aviva Rutkin at New Scientist:

IN THE near future, you may have to share the road with a robot. Or perhaps we should say that a robot will have to share the road with you.

At the University of California, Berkeley, engineers are preparing autonomous cars to predict what we impulsive, unreliable humans might do next. A team led by Katherine Driggs-Campbell has developed an algorithm that can guess with up to 92 per cent accuracy whether a human driver will make a lane change. She is due to present the work next month at the Intelligent Transportation Systems conference in Las Palmas de Gran Canaria, Spain.

Enthusiasts are excited that self-driving vehicles could lead to fewer crashes and less traffic. But people aren’t accustomed to driving alongside machines, says Driggs-Campbell. When we drive, we watch for little signs from other cars to indicate whether they might turn or change lanes or slow down. A robot might not have any of the same tics, and that could throw us off.

“There’s going to be a transition phase,” she says.•
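
The article doesn’t describe how Driggs-Campbell’s predictor works under the hood, but the general shape of such systems is a classifier over observable driver signals. A minimal sketch in Python, with every feature, weight and threshold invented for illustration (nothing below is from the Berkeley work):

    import math

    def lane_change_probability(signals):
        # Toy logistic model mapping driver signals to the probability
        # that a lane change is imminent. All weights are made up.
        weights = {
            "turn_signal_on": 2.5,    # blinker is the most obvious cue
            "head_check": 1.8,        # glance at mirror or blind spot
            "lateral_drift_m": 3.0,   # drift toward the lane line, in metres
            "speed_delta_mps": 0.4,   # closing speed on the car ahead, m/s
        }
        bias = -3.0                   # lane changes are rare by default
        z = bias + sum(w * signals.get(k, 0.0) for k, w in weights.items())
        return 1.0 / (1.0 + math.exp(-z))  # sigmoid squashes score to [0, 1]

    # A driver who signals, checks a mirror and drifts left scores ~0.90:
    print(lane_change_probability(
        {"turn_signal_on": 1, "head_check": 1, "lateral_drift_m": 0.3}))

The real system presumably learns its weights from driving data rather than hand-tuning them; the point of the sketch is only that “predicting” a lane change reduces to scoring a handful of behavioral cues.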


Steve Fainaru and Mark Fainaru-Wada, who’ve done brilliant work on the NFL’s existential concussion problem and yet still somehow are passionate Niners fans, have written an excellent ESPN The Magazine piece about Chris Borland, the football player who retired after his rookie season to safeguard his health.

The former San Francisco linebacker’s preemptive attempt at self-preservation was a shot across the bow, a shocking move the league hadn’t experienced since the 1960s, when so-called Hippie players voluntarily left the game and its militaristic nature during the Vietnam Era. Borland’s decision made news, as you might expect, and he became something of a reluctant political football.

The ESPN article reveals the NFL’s response to Borland’s decision was, well, NFL-like: tone-deaf, corporate and petty. Although he’d left the game, the former player was asked to take a “random” drug test almost immediately. It seems the league wanted to deflate his stance by proving he had retired to avoid testing positive for banned substances. Unless it was a remarkable coincidence, the NFL hoped to paint Borland as a fraud and thereby negate his very valid concerns.

There’s no way humans can safely play football. No helmet can protect a head from whiplash; in fact, the modern helmet is a weapon that increases the incidence of injury. People who profit from the sport can make believe otherwise, but there’s no way out but down. Borland and others who’ve made a quick exit, and the stalwarts who’ve awakened to the game’s toll, have underlined that reality.

An excerpt:

Borland has consistently described his retirement as a pre-emptive strike to (hopefully) preserve his mental health. “If there were no possibility of brain damage, I’d still be playing,” he says. But buried deeper in his message are ideas perhaps even more threatening to the NFL and our embattled national sport. It’s not just that Borland won’t play football anymore. He’s reluctant to even watch it, he now says, so disturbed is he by its inherent violence, the extreme measures that are required to stay on the field at the highest levels and the physical destruction he has witnessed to people he loves and admires — especially to their brains.

Borland has complicated, even tortured, feelings about football that grow deeper the more removed he is from the game. He still sees it as an exhilarating sport that cultivates discipline and teamwork and brings communities and families together. “I don’t dislike football,” he insists. “I love football.” At the same time, he has come to view it as a dehumanizing spectacle that debases both the people who play it and the people who watch it.

“Dehumanizing sounds so extreme, but when you’re fighting for a football at the bottom of the pile, it is kind of dehumanizing,” he said during a series of conversations over the spring and summer. “It’s like a spectacle of violence, for entertainment, and you’re the actors in it. You’re complicit in that: You put on the uniform. And it’s a trivial thing at its core. It’s make-believe, really. That’s the truth about it.”

How one person can reconcile such opposing views of football — as both cherished American tradition and trivial activity so violent that it strips away our humanity — is hard to see. Borland, 24, is still working it out. He wants to be respectful to friends who are still playing and former teammates and coaches, but he knows that, in many ways, he is the embodiment of the growing conflict over football, a role that he is improvising, sometimes painfully, as he goes along.•

 


If you thought the public mourning over Steve Jobs’ death seemed outsize, just imagine what went on when Thomas Edison, whose contributions were much more foundational, was at life’s end.

While Edison didn’t create the first incandescent lamp (that was Sir Joseph Wilson Swan, with whom he eventually partnered), his 1879 invention and business acumen enabled the brightness of modernity. It was this accomplishment among his many that was celebrated with “Light’s Golden Jubilee” in 1929, a live tribute to the Edison bulb broadcast on radio. President Hoover was there in person, and Albert Einstein, Madame Curie, Orville Wright and Will Rogers were a few of the guests patched in remotely. Edison reenacted his eureka moment and entire cities put on blinding light shows. The merriment beat the stock market crash that begat the Great Depression by just four days.

In 1931, when the inventor died, many American schools were closed and everything from lightbulbs to trains was turned off for a moment in Edison’s honor. A pair of Brooklyn Daily Eagle articles embedded below recall the elaborate expressions of gratitude.

____________________

From October 20, 1929:

From October 21, 1931:


Capitalism run amok is a scary thing. It creates the horrid air quality and high cancer rates in China just as readily as it can create opportunities for the previously underprivileged.

In visiting a couple of the modern Gold Rush towns in fracking-friendly North Dakota, Henrietta Norton and Dan Dennison of the Guardian witnessed many of the social costs of the rapid transformation of former farmland, but they also found a more complicated story in the wake of the oil price collapse, one in which striving Americans try to remake their fortunes in a tumultuous landscape during a time of economic uncertainty.

The opening:

“You’re going to see it all there – gang banging, sex trafficking, gambling, drugs, all the dark stuff, they’ve ruined the place,” says the man on the front desk at the Super 8 Motel where we stop for the night en route from Fargo.

The Bakken region has been at the heart of the latest oil boom since the early 2000s, when new technology enabled horizontal drilling and hydraulic fracturing to access minerals found in the layers of rock beneath the ground. It occupies 200,000 square miles, and stretches from Montana and North Dakota across into Saskatchewan and Manitoba in Canada.

In the past five years, there has been a dramatic influx of people in North Dakota. Many towns have become synonymous with the term “man camp”, as tens of thousands of men have arrived in search of work in the rigs, or to lay the pipeline. There’s a joke amongst them: “There’s a woman behind every tree in North Dakota … it’s just that there aren’t any trees.”•


The prelude to layoffs in the media industry (and all others) is an influx of efficiency experts and consultants. Conde Nast employees are preparing for just such a plague of analysts, with each employee having to account for every hour of the day. Scary for them that they’ll be quantified, even if in such a quaint, old-fashioned way.

The new normal is, of course, to let algorithms measure us at work and, ultimately, at home. Human management at all levels of business is so godawful, plagued by pettiness, bias and incompetence, that it’s valid to ask whether algorithms could really do a worse job. Maybe not. But you know when a supervisor is messing with you, and you can appeal to a sense of fairness, even if that’s sometimes futile. It’s really difficult to argue with computer code, which can certainly contain biases of its own. In fact, it almost certainly does. Further, there’s no way current AI can truly judge the dynamics of an office, the little things that go into making a company successful or even just a pleasant place to be, something important to us if not to our silicon brothers and sisters.

In a smart Aeon essay, Frank Pasquale wonders about the quiet insinuation into our lives of this next-level judge, jury and executioner. He has more hope than I do that these new tools of accountability will themselves be held accountable. An excerpt:

The infancy of the internet is over. As online spaces mature, Facebook, Google, Apple, Amazon, and other powerful corporations are setting the rules that govern competition among journalists, writers, coders, and e-commerce firms. Uber and Postmates and other platforms are adding a code layer to occupations like driving and service work. Cyberspace is no longer an escape from the ‘real world’. It is now a force governing it via algorithms: recipe-like sets of instructions to solve problems. From Google search to OkCupid matchmaking, software orders and weights hundreds of variables into clean, simple interfaces, taking us from query to solution. Complex mathematics govern such answers, but it is hidden from plain view, thanks either to secrecy imposed by law, or to complexity outsiders cannot unravel.

Algorithms are increasingly important because businesses rarely thought of as high tech have learned the lessons of the internet giants’ successes. Following the advice of Jeff Jarvis’s What Would Google Do, they are collecting data from both workers and customers, using algorithmic tools to make decisions, to sort the desirable from the disposable. Companies may be parsing your voice and credit record when you call them, to determine whether you match up to ‘ideal customer’ status, or are simply ‘waste’ who can be treated with disdain. Epagogix advises movie studios on what scripts to buy, based on how closely they match past, successful scripts. Even winemakers make algorithmic judgments, based on statistical analyses of the weather and other characteristics of good and bad vintage years.

For wines or films, the stakes are not terribly high. But when algorithms start affecting critical opportunities for employment, career advancement, health, credit and education, they deserve more scrutiny.•
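
Pasquale’s “recipe-like sets of instructions” are easy to make concrete. Below is a deliberately crude sketch of the hidden customer-scoring he describes; every variable, weight and threshold is invented, but the shape is the point: many weighted inputs collapsed into one opaque verdict.

    def score_customer(profile):
        # Opaque scoring: callers see only the verdict, never the weights.
        weights = {
            "credit_score": 0.2,     # per credit-score point
            "annual_spend": 0.01,    # per dollar spent with the company
            "support_calls": -15.0,  # each call makes you more 'expensive'
            "returns": -10.0,        # so does each returned item
        }
        score = sum(w * profile.get(k, 0) for k, w in weights.items())
        return "ideal" if score > 100 else "waste"  # Pasquale's two bins

    # A high-spending caller with decent credit still clears the bar:
    print(score_customer({"credit_score": 720, "annual_spend": 4000,
                          "support_calls": 2}))  # -> ideal

Nothing in the interface betrays why one caller is flagged and another isn’t, which is exactly the opacity the essay worries about.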


When the sanguine view is that only 7% of American jobs will disappear in the next ten years, we probably need to brace ourselves. Forrester Research reports that figure, saying some employment loss will be offset by the creation of new positions. Probably true enough, but there’s no guarantee low-skilled workers will be able to be retrained for them, and it’s not like 2025 is some important end date. In the longer run, the truth may end up somewhere between the Forrester number and the more troubling Oxford one of 47% of jobs being susceptible to automation.

From Elizabeth Dwoskin at WSJ:

Before a robot takes your job, you’re likely to be working with one side-by-side.

That’s the takeaway from a new report by Forrester Research, Inc.

The report wades into a heady and long-running debate over whether, how, and to what extent will robots take over human jobs – a hotly discussed topic amid recent progress in robotics and artificial intelligence. Most experts agree that machines will depress the job market in coming decades, possibly by as much as 47%, according to a widely reported 2013 Oxford paper.

Forrester takes a less dire view. Examining workforces at large companies across industries, including Delta Airlines Inc., Whole Foods Market Inc., and Lowe’s Companies Inc. as well as many startups, analyst J.P. Gownder estimated that automation would erase 22.7 million US jobs by 2025 — 16% of today’s total. However, that decline would be offset somewhat by new jobs created, making for a net loss of 7%, or 9.1 million jobs.

Ultimately, robots would drive a social revolution, Mr. Gownder found, but not the one people fear.•
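
Forrester’s numbers do hang together arithmetically. A quick back-of-the-envelope check in Python (my reconstruction, not anything published in the report):

    jobs_erased = 22.7e6      # jobs Forrester expects automation to erase by 2025
    share_of_total = 0.16     # which it calls 16% of today's total
    workforce = jobs_erased / share_of_total  # implies ~142 million jobs today

    net_loss = 9.1e6                          # the report's net figure
    jobs_created = jobs_erased - net_loss     # ~13.6 million new positions
    net_share = net_loss / workforce          # ~6.4%, close to the reported 7%

    print(f"workforce ~{workforce / 1e6:.0f}M, created ~{jobs_created / 1e6:.1f}M, "
          f"net loss {net_share:.1%}")

The slight gap between 6.4 per cent and the reported 7 per cent presumably comes down to rounding in the source figures.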

