Neil Irwin of the “Upshot” blog at the New York Times suggests that American wage stagnation and the lag in hiring are being driven not by market conditions but by a mentality. An excerpt: 

“So any employer with a job opening should have no problem hiring. If anything, the ratio of openings to hiring should be lower than it was in the mid-2000s, not higher.

Here’s a theory to try to make sense of the disconnect: During the recession, employers got spoiled. When unemployment was near 10 percent, talented workers were lined up outside their door. The workers they did have were terrified of losing their jobs. If you put out word that you had an opening, you could fill the job almost instantly. That’s why the ratio of job openings to hires fell so low in 2009.

As the economy has gotten better the last five years, employers have had more and more job openings, but have been sorely reluctant to accept that it’s not 2009 anymore in terms of what workers they can hire and at what wage.

Yes, unemployment is still elevated, but workers aren’t in nearly as desperate a position as they were then. So to get the kind of talented people they want, employers are going to have to pay more (or offer better benefits or working conditions) than they would have not that long ago.”


Long before Silicon Valley, Victorians gave the future a name, recognizing electricity and, more broadly, technology, as transformative, disruptive and decentralizing. We’re still borrowing from their lexicon and ideas, though we need to be writing the next tomorrow’s narratives today. From “Future Perfect,” Iwan Rhys Morus’ excellent Aeon essay:

“For the Victorians, the future, as terra incognita, was ripe for exploration (and colonisation). For someone like me – who grew up reading the science fiction of Robert Heinlein and watching Star Trek – this makes looking at how the Victorians imagined us today just as interesting as looking at the way our imagined futures work now. Just as they invented the future, the Victorians also invented the way we continue to talk about the future. Their prophets created stories about the world to come that blended technoscientific fact with fiction. When we listen to Elon Musk describing his hyperloop high-speed transportation system, or his plans to colonise Mars, we’re listening to a view of the future put together according to a Victorian rulebook. Built into this ‘futurism’ is the Victorian discovery that societies and their technologies evolve together: from this perspective, technology just is social progress.

The assumption was plainly shared by everyone around the table when, in November 1889, the Marquess of Salisbury, the Conservative prime minister of Great Britain, stood up at the Institution of Electrical Engineers’ annual dinner to deliver a speech. He set out a blueprint for an electrical future that pictured technological and social transformation hand in hand. He reminded his fellow banqueteers how the telegraph had already changed the world by working on ‘the moral and intellectual nature and action of mankind’. By making global communication immediate, the telegraph had made everyone part of the global power game. It had ‘assembled all mankind upon one great plane, where they can see everything that is done, and hear everything that is said, and judge of every policy that is pursued at the very moment those events take place’. Styling the telegraph as the great leveller was quite common among the Victorians, though it’s particularly interesting to see it echoed by a Tory prime minister.

Salisbury’s electrical future went further than that, though. He argued that the spread of electrical power systems would profoundly transform the way people lived and worked, just as massive urbanisation was the result of steam technology.”


Mark Twain, America’s second greatest comic ever in my estimation (after George Carlin), died of a heart attack 104 years ago. He lived a life writ large, won fame and lost fortunes, and, most importantly, reminded us what we could be if we chose to live as one, traveling as he did from Confederate sympathizer to a place of enlightenment. I think of Twain what I thought of Pete Seeger and Odetta when they died: You can’t really replace such people because they have the history and promise of the nation coursing through their veins. He was eulogized in the April 22, 1910 Brooklyn Daily Eagle; the opening sections excerpted below follow him from birth to his emergence as a “stand-up” and his shift to writing books.

The future usually arrives wearing the clothes of the past, but occasionally we truly and seriously experience the shock of the new. On that topic: The 1965 Life magazine piece “Will Man Direct His Own Evolution?” is a fun but extremely overwrought essay by Albert Rosenfeld about the nature of identity in a time when humans would be made by design, composed of temporary parts. Like a lot of things written in the ’60s about science and society, it’s informed by an undercurrent of anxiety about the changes beginning to affect the nuclear family. An excerpt:

Even you and I–in 1965, already here and beyond the reach of potential modification–could live to face curious and unfamiliar problems in identity as a result of man’s increasing ability to control his own mortality after birth. As organ transplants and artificial body parts become even more available it is not totally absurd to envision any one of us walking around one day with, say, a plastic cornea, a few metal bones and Dacron arteries, with donated glands, kidney and liver from some other person, from an animal, from an organ bank, or even an assembly line, with an artificial heart, and computerized electronic devices to substitute for muscular, neural or metabolic functions that may have gone wrong. It has been suggested–though it will almost certainly not happen in our lifetime–that brains, too, might be replaceable, either by a brain transplanted from someone else, by a new one grown in tissue culture, or an electronic or mechanical one of some sort. ‘What,’ asks Dr. Lederberg, ‘is the moral, legal or psychiatric identity of an artificial chimera?’

Dr. Seymour Kety, an outstanding psychiatric authority now with the National Institute of Health, points out that fairly radical personality changes already have been wrought by existing techniques like brainwashing, electroshock therapy and prefrontal lobotomy, without raising serious questions of identity. But would it be the same if alien parts and substances were substituted for the person’s own, resulting in a new biochemistry and a new personality with new tastes, new talents, new political views–perhaps even a different memory of different experiences? Might such a man’s wife decide she no longer recognized him as her husband and that he was, in fact, not? Or might he decide that his old home, job and family situation were not to his liking and feel free to chuck the whole setup that had been quite congenial to the old person?

Not that acute problems of identity need await the day when wholesale replacement of vital organs is a reality. Very small changes in the brain could result in astounding metamorphoses. Scientists who specialize in the electrical probing of the human brain have, in the past few years, been exploring a small segment of the brain’s limbic system called the amygdala–and discovering that it is the seat of many of our basic passions and drives, including the drives that lead to uncontrollable sexual extremes such as satyriasis and nymphomania. 

Suppose, at a time that may be surprisingly near at hand, the police were to trap Mr. X, a vicious rapist whose crimes had terrorized the women of a neighborhood for months. Instead of packing him off to jail, they send him in for brain surgery. The surgeon delicately readjusts the distorted amygdala, and the patient turns into a gentle soul with a sweet, loving disposition. He is clearly a stranger to the man who was wheeled into the operating room. Is he the same man, really? Is he responsible for the crimes that he–or that other person–committed? Can he be punished? Should he go free?

As time goes on, it may be necessary to declare, without the occurrence of death, that Mr. X has ceased to exist and that Mr. Y has begun to be. This would be a metaphorical kind of death and rebirth, but quite real psychologically–and thus, perhaps, legally.•

Secret Addiction — Beer & Benzos

Every night drink several beers or shots and during the day chew benzo type medications- clonozepam & alphazolam (generics of Klonopin & Xanax)… I lose count how much I am taking or drinking.. also on several anti depressants — Lexapro, Wellbrutin & gabapentin… my credit card debt is frighteningly high, maybe went into manic state and spent like that but no longer have $6,000 a month to spend on just necessities.. I am in my late 30’s and nothing to show for it… No one really knows about this twin tolerance or addiction I have either.

Like much of the pre-Internet recording-industry infrastructure, the Columbia House Music Club, an erstwhile popular method of bulk-purchasing songs through snail mail, no longer exists, having departed this world before iTunes’ unfeeling gaze, as blank and pitiless as the sun. Your penny or paper dollar will no longer secure you a dozen records or tapes, nor do you have to experience the buyer’s remorse of one who reflexively purchases media without heeding the fine print which reveals that the relationship, as the Carpenters would say, had only just begun.

Of course, music pilfering didn’t start in our digital times with Napster, and Columbia was a prime target for those who loved a system that could be gamed. Via the excellent Longreads, here’s the opening of “The Rise and Fall of the Columbia House Record Club — and How We Learned to Steal Music,” a 2011 Phoenix article by Daniel Brockman and Jason W. Smith:

On June 29, 2011, the last remnant of what was once Columbia House — the mightiest mail-order record club company that ever existed — quietly shuttered for good. Other defunct facets of the 20th-century music business have been properly eulogized, but it seems that nary a tear was shed for the record club. Perhaps no one noticed its demise. After all, by the end, Columbia House was no longer Columbia House; it had folded into its main competitor and become an online-only entity years before.

A more likely explanation, though, is that a new generation of music fans who had never known a world without the Internet couldn’t grasp the marvel that was the record club in its heyday. From roughly 1955 until 2000, getting music for free meant taping a penny to a paper card and mailing it off for 12 free records — along with membership and the promise of future purchasing.

The allure of the record club was simple: you put almost nothing down, signed a simple piece of paper, picked out some records, and voila! — a stack of vinyl arrived at your doorstep. By 1963, Columbia House was the flagship of the record-club armada, with 24 million records shipped. By 1994, they had shipped more than a billion records, accounted for 15 percent of all CD sales, and had become a $500-million-a-year behemoth that employed thousands at its Terre Haute, Indiana, manufacturing and shipping facility.

Of course, most of the record clubs’ two million customers failed to read the fine print, obligating them to purchase a certain number of monthly selections at exorbitant prices and even more exorbitant shipping costs. At the same time, consumers plotted to sign up multiple accounts under assumed names, in order to keep getting those 12-for-a-penny deals as often as possible. Record clubs may have introduced several generations of America’s youth to the concept of collection agencies — and the concept of stealing music, decades before the advent of the Internet.•

 


Even though I love the hysterical contempt of Nathanael West above almost all other things, I can equally enjoy Tom Carson’s wonderfully worded Paul Thomas Anderson consideration at Grantland, which makes quick work of The Day of the Locust characterizations of Californians in the service of exalting the great filmmaker as a Wellesian master of the region. There are gorgeous, knowing passages like this one: “[There Will Be Blood] is the unofficial prequel to Roman Polanski’s Chinatown, almost the only other movie to remind us that Southern California was a paradise won, not lost, by capitalism’s version of original sin — the destruction of the natural order. The same is true of America itself, of course, but California’s role in our culture is to incarnate the New World’s own, hyperbolized promised land.” The opening: 

Is it going too far to say that Southern California is to Paul Thomas Anderson what North Mississippi was to William Faulkner? Possibly. So maybe we’re better off playing it safe and going with Flannery O’Connor’s home turf instead.

Not to worry, people. As novelistic as PTA can be, which is plenty, this isn’t about giving him some kind of misbegotten upgrade by proclaiming his movies are Just Like Literature. The point is that he’s a regional artist in a way that doesn’t have many screen equivalents. If East Coast critics often overlook this in spite of loving him to death, no wonder: Not many Americans outside the zip codes in question think of SoCal as a real place to begin with.

Neither do most of the transplants, for that matter. Reality wasn’t the attraction when they moved, after all; liberation was. One of Anderson’s great strengths is that his understanding of Los Angeles as a teeming vat of self-actualization projects doesn’t make him feel obliged to depict the volunteer lab rats as bizarre or foolish, in the hysterically contemptuous way that we’ve been used to since The Day of the Locust. Good old American transcendentalism just got all modern and DIY in SoCal, and the results are a travesty only if you mistake different methods for changed goals.

Being the local boy that most of his fellow filmmakers aren’t — he was born in Studio City, pretty much the definition of deglamorized glamour — has the effect of turning everybody else’s Oz into Anderson’s evocatively vivid Kansas.•


In 1958, Disney played large-scale urban planner, imagining the world as an interconnected mototopia. Cantilevered skyways and transcontinental motorways and highway escalators, anyone? Nothing so fantastical was necessary, but we should have retrofitted highways and roads to be smarter, cleaner and safer long before driverless cars were even in the conversation; we never had the ingenuity or political will to do so.

The reason why white people, no matter how noble, can’t speak for people of other races is that human experiences vary based on color, class, gender and other categorizations. It’s just a fact. So while all manner of well-heeled New York and Beltway journalists (almost all white) decried what’s happened recently at the New Republic–and there’s certainly been something important lost in the tumult–Ta-Nehisi Coates of the Atlantic saw things somewhat differently. He recognized a publication staffed almost exclusively by Caucasians that has been horribly racist toward African-Americans, pointing fingers at them and blaming them, especially during the often-odious Marty Peretz reign, marred as it was by bigotry and warmongering. And while many scoffed that the venerable periodical might now become Buzzfeed or Gawker, Coates points out that such online publications are more enlightened about race than TNR has been. An excerpt:

TNR made a habit of ‘reflecting briefly’ on matters that were life and death to black people but were mostly abstract thought experiments to the magazine’s editors. Before, during, and after Sullivan’s tenure, the magazine seemed to believe that the kind of racism that mattered most was best evidenced in the evils of Afrocentrism, the excesses of multiculturalism, and the machinations of Jesse Jackson. It’s true that TNR’s staff roundly objected to excerpting The Bell Curve, but I was never quite sure why. Sullivan was simply exposing the dark premise that lay beneath much of the magazine’s coverage of America’s ancient dilemma.

What else to make of the article that made Stephen Glass’s career possible, ‘Taxi Cabs and the Meaning of Work’? The piece asserted that black people in D.C. were distinctly lacking in the work ethic best evidenced by immigrant cab drivers. A surrealist comedy, Glass’s piece revels in the alleged exploits of a mythical Asian-American avenger—Kae Bang—who wreaks havoc on black criminals who’d rather rob taxi drivers than work. The article concludes with Glass, in the cab, while its driver is robbed by a black man. It was all lies.

What else to make of TNR sending Ruth Shalit to evaluate affirmative action at The Washington Post in 1995? ‘She cast Post writer Kevin Merida as some kind of poster boy for affirmative action when in fact he had risen in the business for reasons far more legitimate than her own,’ David Carr wrote in 1999. Shalit’s piece wasn’t all lies. But it wasn’t all true either. Shortly after the article was published, she was revealed to be a serial plagiarist.

TNR might have been helped by having more—or merely any—black people on its staff.”


From the May 8, 1925 Brooklyn Daily Eagle:

“Mrs. Mary Hannigan of 83 Division Ave. today mourned the death of her daughter, Julia, the little girl who wanted to be a boy. And she is filled with bitterness because, she said, her daughter died of a broken heart, not the pneumonia listed on the death certificate at St. Catherine’s Hospital.

Too much notoriety killed the little girl, the mother says. The notoriety was gained by what the mother describes as an ‘innocent prank.’ Julia, who was buried on Saturday, decided last October that she wanted to be a boy. She disappeared from her home. A week later she was found. She had cut her hair, donned boy’s clothing and earned her living caddying.

But the little girl brooded over what she thought was the disgrace she had brought on her family. Her resistance was weakened. She caught a cold a short time ago which developed into pneumonia.”


That last 5% of perfecting autonomous vehicles may be more difficult than the first 95%, and driverless options will likely continue to be introduced incrementally rather than all at once, but if such a system is 100% realized, there will be all manner of ramifications. In a post on his blog, Google driverless sector consultant Brad Templeton looks at the possible outcomes in such a brave new world. An excerpt:

“When I talk about robocars, I often get quite opposite reactions:

  • Americans, in particular, will never give up car ownership! You can pry the bent steering wheel from my cold, dead hands.
  • I can’t see why anybody would own a car if there were fast robotaxi service!
  • Surely human drivers will be banned from the roads before too long.

I predict neither extreme will be true. I predict the market will offer all options to the public, and several options will be very popular. I am not even sure which will be the most popular.

  1. Many people will stick to buying and driving classic, manually driven cars. The newer versions of these cars will have fancy ADAS systems that make them much harder to crash, and their accident levels will be lower.
  2. Many will buy a robocar for their near-exclusive use. It will park near where it drops them off and always be ready. It will keep their stuff in the trunk.
  3. People who live and work in an area with robotaxi service will give up car ownership, and hire for all their needs, using a wide variety of vehicles.
  4. Some people will purchase a robocar mostly for their use, but will hire it out when they know they are not likely to use it, allowing them to own a better car. They will make rarer use of robotaxi services to cover specialty trips or those times when they hired it out and ended up needing it. Their stuff will stay in a special locker in the car.”


An impediment to automation may be “robotic” humans willing to work for wages so low that it’s not cost-effective to replace them. From a peek inside a sprawling distribution center by Matt King of the Atlantic:

“Susan and her co-workers appeared in good spirits as the manager introduced them by name and told us how long they had been working at the company. About half of the workers had a mental or physical disability, a result of the company’s ‘inclusion’ program which mirrored similar efforts at other major retailers. In a news segment about a DC in South Carolina, one disabled worker said hers was ‘the coolest job in the world.’

These programs are viewed as leading examples of combined corporate and social success, but that success may be short-sighted. Pickers and low-skill jobs of the sort represent a pain point for DCs and the e-commerce executives who are managing their evolution. The jobs appear simple (one Amazon executive referred to the workers as like ‘robots in human form’), but the tasks are difficult to automate at scale: ‘Because products vary so much in size and shape and because of the way they sit on shelves, robotic manipulators still can’t beat real arms and hands,’ explains Erico Guizzo on Spectrum, the blog for the Institute of Electrical and Electronics Engineers (IEEE).

Unlike Susan and her co-workers, who were salaried and long-time employees of the company, a growing number of ‘pickers’ at DCs across the country are hired through staffing agencies and classified as ‘non-permanent’ or ‘temporary.’ This means no health care coverage or benefits, pay that’s usually barely above the minimum wage, and employment that can be voided at a whim when the workers are no longer needed.

This tenuous labor arrangement is partly the result of an honest fluctuation in the demand for these jobs: The biggest influx of DC workers occurs just before the holiday season, when online retailers conduct a majority of their annual business. But like retail jobs, the arrangement is also an acknowledgement of the underlying economic reality: The jobs are utterly low-skill, and there exists a large supply of unemployed Americans willing to do the work.

‘In a way, because low-wage jobs are so cheap, we haven’t seen as much automation as you could,’ Joseph Foudy, a professor of economics at NYU’s Stern School of Business, told me.”


We’ve been able to feed millions of images into social networks for “free,” armies of servers our seeming supplicants, but with facial-recognition software coming of age, the bill is nearly due. Will the surprising acceptance of surveillance online translate to the physical world? From Paul Rubens at the BBC:

“Imagine walking into a shop you’ve never been in before, to be greeted by name by a sales assistant you don’t know.

If you’re David Beckham or Lily Allen you may be used to this type of VIP treatment, but if your fame is more limited, or even non-existent then you might find this attention rather disconcerting.

Despite this, thanks to facial recognition software you don’t need to be a celebrity for sales assistants to know your name the moment you enter a shop.

That’s because companies such as Japanese technology giant NEC and FaceFirst, a California-based company, offer systems that use cameras placed at the entrances to shops to identify people as they come in.

If your face fits

When important existing or potential customers are spotted, a text message can be sent to appropriate sales staff to ensure they provide personal attention.

‘Someone could approach you and give you a cappuccino when you arrive, and then show you the things they think you will be interested in buying,’ says Joel Rosenkrantz, FaceFirst’s chief executive.

Before a system such as FaceFirst’s can be put into operation, it has to be loaded up with photos. So an obvious question to ask is where would they come from?”


 

Here are 25 pieces of journalism from this year, alphabetized by author name, which made me consider something new or reconsider old beliefs or just delighted me.

  • “Exodus” (Ross Andersen, Aeon) A brilliant longform piece that lifts off with Elon Musk’s mission to Mars and veers in deep and mysterious directions.
  • “Barack Obama, Ferguson, and the Evidence of Things Unsaid” (Ta-Nehisi Coates, The Atlantic) Nobody speaks truth to race in America quite like Coates, and the outrage of Ferguson was the impetus for this spot-on piece about the deeply institutionalized prejudice of government, national and local, in the U.S.
  • “The Golden Age of Journalism?” (Tom Engelhardt, TomDispatch) The landscape has never been more brutal for news nor more promising. The author luxuriates in the richness destabilization has wrought.
  • “Amazon Must Be Stopped” (Franklin Foer, The New Republic) Before things went completely haywire at the company, Foer returned some sanity to the publication in the post-Peretz period. This lucid article argues that Amazon isn’t becoming a monopoly but already qualifies as one.
  • “America in Decay” (Francis Fukuyama, Foreign Affairs) Strong argument that the U.S. public sector is so dysfunctional because of a betrayal of meritocracy in favor of special interests and lobbyists. The writer’s idea of what constitutes a merit-based system seems flawed, but he offers many powerful ideas.
  • “What’s the Matter With Russia?” (Keith Gessen, Foreign Affairs) An insightful meditation about Putin’s people, who opt to live in a fairy tale despite knowing such a thing can never have a happy ending.
  • “The Dying Russians” (Masha Gessen, New York Review of Books) Analysis of Russia’s high mortality rate suggests that the root cause is not alcohol, guns or politics, but simply hopelessness.
  • “Soak the Rich” (David Graeber, Thomas Piketty) Great in-depth exchange between two thinkers who believe capitalism has run amok, but only one of whom thinks it’s run its course.
  • “The First Smile” (Michael Graziano, Aeon) The Princeton psychology and neuroscience professor attempts to explain why facial expressions appear to be natural and universal.
  • “The Creepy New Wave of the Internet” (Sue Halpern, New York Review of Books) The author meditates on the Internet of Things, which may make the world much better and much worse, quantifying us like never before.
  • “Super-Intelligent Humans Are Coming” (Stephen Hsu, Nautilus) A brisk walk through the process of genetic modification, which would lead to heretofore unknown brain power.
  • “All Dressed Up For Mars and Nowhere to Go” (Elmo Keep, Matter) A sprawling look at the seeming futility of the Mars One project ultimately gets at a more profound pointlessness–pursuing escape in a dying universe.
  • “The Myth of AI” (Jaron Lanier, Edge) Among other things, this entry draws a neat comparison between the religionist’s End of Days and the technologist’s Singularity, the Four Horsemen supposedly arriving in driverless cars.
  • “The Disruption Machine” (Jill Lepore, The New Yorker) The “D” word, its chief promulgator, Clayton M. Christensen, and its circuitous narratives, receive some disruption of their own.
  • “The Longevity Gap” (Linda Marsa, Aeon) A severely dystopian thought experiment: Will the parallels of widening income disparity and innovations in medicine lead to two very different lifespans for the haves and have-nots?
  • “The Genetics Epidemic” (Jamie F. Metzl, Foreign Affairs) Genetic modification studied from an uncommon angle, that of national-security concerns.
  • “My Captivity” (Theo Padnos, The New York Times Magazine) A harrowing autobiographical account of an American journalist’s hostage ordeal in the belly of the beast in Syria.
  • “We Are a Camera” (Nick Paumgarten, The New Yorker) In a time of cheap, ubiquitous cameras, the image, merely an imitation, is ascendant, and any event unrecorded seemingly has less currency. The writer examines the strangeness of life in the GoPro flow.
  • “A Goddamn Death Dedication” (Alex Pappademas, Grantland) A knowing postmortem about Casey Kasem, America’s deejay when the world was hi-fi but before it became sci-fi.
  • “In Conversation: Chris Rock” (Frank Rich, New York) The exchange about “black progress” is an example of what comedy does at its best: It points out an obvious truth that so many have missed.
  • “The Mammoth Cometh” (Nathaniel Rich, The New York Times Magazine) A piece which points out that de-extinct animals won’t be exactly like their forebears, nor will augmented humans of the future be just like us. It’s progress, probably.
  • “Hello, My Name Is Stephen Glass, and I’m Sorry” (Hanna Rosin, The New Republic) Before the implosion of the publication, the writer wondered what it would mean to forgive her former coworker, an inveterate fabulist and liar, and what it would mean if she could not forgive.
  • “Gilbert Gottfried: New York Punk” (Jay Ruttenberg, The Lowbrow Reader) Written by the only person on the list whom I know personally, but no cronyism is necessary for the inclusion of this excellent analysis of the polarizing comic, who’s likely more comfortable when at his most alienating.


If you put a gun to my head and asked what I thought was the best novel ever written in English, I would think you were crazy. Why are you pointing a gun at my head?!? Why not just ask me without the threat of murder?!? Do you want me to call the police?!?

After you were disarmed and arrested, I would think about the question again and just as likely choose Lolita, Vladimir Nabokov’s tale of monstrous love, as anything else. The language is impeccable, amazingly weighty and nimble all at once, and the book overall is both profoundly funny and sad.

Art is one thing, however, and life another. The book’s main inspiration may have been Heinz von Lichberg’s earlier tale, or it may have been a very real horror, a 1940s NYC child abduction perpetrated by a felon in a fedora named Frank La Salle. (Or perhaps it was a combination of the two.) Via Longreads, a passage from “The Real Lolita,” an historical inquiry by Sarah Weinman published at the Penguin Random House blog:

Nabokov said he conjured up the germ of the novel—a cultured European gentleman’s pedophilic passion for a 12-year-old girl resulting in a madcap, satiric cross-country excursion—’late in 1939 or early in 1940, in Paris, at a time when I was laid up with a severe attack of intercostal neuralgia.’ At that point it was a short story set in Europe, written in his first language, Russian. Not pleased with the story, however, he destroyed it. By 1949, Nabokov had emigrated to America, the neuralgia raged anew, and the story shifted shape and nagged at him further, now as a longer tale, written in English, the cross-country excursion transplanted to America.

Lolita is a nested series of tricks. Humbert Humbert, the confessing pervert, tries so hard to obfuscate his monstrosities that he seems unaware when he truly gives himself away, despite alleging the treatise is a full accounting of his crimes. Nabokov, however, gives the reader a number of clues to the literary disconnect, the most important being the parenthetical. It works brilliantly early on in Lolita, when Humbert describes the death of his mother—’My very photogenic mother died in a freak accident (picnic, lightning) when I was three’—or when he sights Dolores Haze in the company of her own mother, Charlotte, for the first time: ‘And, as if I were the fairy-tale nurse of some little princess (lost, kidnaped, discovered in gypsy rags through which her nakedness smiled at the king and his hounds), I recognized the tiny dark-brown mole on her side.’ The unbracketed narrative is what Humbert wants us to see; the asides reveal what is really inside his mind.

Late in Lolita, one of these digressions gives away the critical inspiration. Humbert, once more in Lolita’s hometown after five years away, sees Mrs. Chatfield, the “stout, short woman in pearl-gray,” in his hotel lobby, eager to pounce upon him with a “fake smile, all aglow with evil curiosity.” But before she can, the parenthetical appears like a pop-up thought balloon for the bewildered Humbert: “Had I done to Dolly, perhaps, what Frank Lasalle [sic], a fifty-year-old mechanic, had done to eleven-year-old Sally Horner in 1948?”•

_______________________________

“I think the book is shocking…I’m glad that it’s shocking.”



It wasn’t a commercial triumph like the organ named for him, but Laurens Hammond’s “Teleview” projection system was a critical triumph in early 3D films. The set-up was installed in Manhattan’s Selwyn Theater in the early 1920s, and moviegoers were treated to screenings of The Man From Mars, a stereoscopic film made especially for Teleview, which was shown on a large screen and on individual viewing devices attached at each seat. It apparently looked pretty great. Alas, the equipment and installation were costly, and no other cinemas adopted the technology. An article about the apparatus, from the December 17, 1922 Brooklyn Daily Eagle, follows.


I’ll use the graph below, from a post by Andrew Sullivan at the Dish, as possible proof of my contention that although police body-cameras may not instantly bring about a higher degree of justice, the images will affect public consciousness, which may in turn be brought to bear on race and policing.


Andrew McAfee, co-author with Erik Brynjolfsson of The Second Machine Age, believes that Weak AI will destabilize employment for decades, but he doesn’t think species-threatening Artificial Intelligence is just around the bend. From his most recent Financial Times blog post:

“AI does appear to be taking off: after decades of achingly slow progress, computers have in the past few years demonstrated superhuman ability, from recognising street signs in pictures and diagnosing cancer to discerning human emotions and playing video games. So how far off is the demon?

In all probability, a long, long way away; so long, in fact, that the current alarmism is at best needless and at worst counterproductive. To see why this is, an analogy to biology is helpful.

It was clear for a long time that important characteristics of living things (everything from the colour of pea plant flowers to the speed of racehorses) were passed down from parents to their offspring, and that selective breeding could shape these characteristics. Biologists hypothesised that units labelled ‘genes’ were the agents of this inheritance, but no one knew what genes looked like or how they operated. This mystery was solved in 1953 when James Watson and Francis Crick published their paper describing the double helix structure of the DNA molecule. This discovery shifted biology, giving scientists almost infinitely greater clarity about which questions to ask and which lines of inquiry to pursue.

The field of AI is at least one ‘Watson and Crick moment’ away from being able to create a full artificial mind (in other words, an entity that does everything our brain does). As the neuroscientist Gary Marcus explains: ‘We know that there must be some lawful relation between assemblies of neurons and the elements of thought, but we are currently at a loss to describe those laws.’ We also do not have any clear idea how a human child is able to know so much about the world — that is a cat, that is a chair — after being exposed to so few examples. We do not know exactly what common sense is, and it is fiendishly hard to reduce to a set of rules or logical statements. The list goes on and on, to the point that it feels like we are many Watson and Crick moments away from anything we need to worry about.”


China has quietly surpassed the U.S. this year as the world’s largest economic power, and that’s not a situation likely to reverse itself anytime soon, even if that nation should suffer a large-scale financial downturn. But what is the significance of America being number two? From Joseph Stiglitz at Vanity Fair:

Now China is the world’s No. 1 economic power. Why should we care? On one level, we actually shouldn’t. The world economy is not a zero-sum game, where China’s growth must necessarily come at the expense of ours. In fact, its growth is complementary to ours. If it grows faster, it will buy more of our goods, and we will prosper. There has always, to be sure, been a little hype in such claims—just ask workers who have lost their manufacturing jobs to China. But that reality has as much to do with our own economic policies at home as it does with the rise of some other country.

On another level, the emergence of China into the top spot matters a great deal, and we need to be aware of the implications.

First, as noted, America’s real strength lies in its soft power—the example it provides to others and the influence of its ideas, including ideas about economic and political life. The rise of China to No. 1 brings new prominence to that country’s political and economic model—and to its own forms of soft power. The rise of China also shines a harsh spotlight on the American model. That model has not been delivering for large portions of its own population. The typical American family is worse off than it was a quarter-century ago, adjusted for inflation; the proportion of people in poverty has increased. China, too, is marked by high levels of inequality, but its economy has been doing some good for most of its citizens. China moved some 500 million people out of poverty during the same period that saw America’s middle class enter a period of stagnation. An economic model that doesn’t serve a majority of its citizens is not going to provide a role model for others to emulate. America should see the rise of China as a wake-up call to put our own house in order.

Second, if we ponder the rise of China and then take actions based on the idea that the world economy is indeed a zero-sum game—and that we therefore need to boost our share and reduce China’s—we will erode our soft power even further. This would be exactly the wrong kind of wake-up call. If we see China’s gains as coming at our expense, we will strive for “containment,” taking steps designed to limit China’s influence. These actions will ultimately prove futile, but will nonetheless undermine confidence in the U.S. and its position of leadership. U.S. foreign policy has repeatedly fallen into this trap.•


The whole world is a city, or becoming one, we’ve been told repeatedly, but a new Economist report pushes back at the idea, arguing that China, India and Brazil, three ascendant powers, are embracing the sprawl. Measures must be taken to ensure that the environmental costs of non-density are minimized. The opening:

“IN THE West, suburbs could hardly be less fashionable. Singers and film-makers lampoon them as the haunts of bored teenagers and desperate housewives. Ferguson, Missouri, torched by its residents following the police shooting of an unarmed black teenager, epitomises the failure of many American suburbs. Mayors like boasting about their downtown trams or metrosexual loft dwellers, not their suburbs.

But the planet as a whole is fast becoming suburban. In the emerging world almost every metropolis is growing in size faster than in population. Having bought their Gucci handbags and Volkswagens, the new Asian middle class is buying living space, resulting in colossal sprawl. Many of the new suburbs are high-rise, though still car-oriented; others are straight clones of American suburbs (take a look at Orange County, outside Beijing). What should governments do about it?

The space race

Until a decade or two ago, the centres of many Western cities were emptying while their edges were spreading. This was not for the reasons normally cited. Neither the car nor the motorway caused suburban sprawl, although they sped it up: cities were spreading before either came along. Nor was the flight to the suburbs caused by racism. Whites fled inner-city neighbourhoods that were becoming black, but they also fled ones that were not. Planning and zoning rules encouraged sprawl, as did tax breaks for home ownership—but cities spread regardless of these. The real cause was mass affluence. As people grew richer, they demanded more privacy and space. Only a few could afford that in city centres; the rest moved out.

The same process is now occurring in the developing world, but much more quickly.”

The standing desk, a truly bad idea, is not likely to be the furniture of tomorrow’s office. The Dutch firm RAAAF has come up with an alternative proposal that’s even battier. It’s ergonomics run amok. From “The Weirdest Proposal Yet for the ‘Office of the Future,’” a Wired piece by Margaret Rhodes:

“The designers are especially interested in supported standing, which standing desks don’t offer. Supported standing, like upright leaning, can engage the muscles—hopefully enough to prevent the drop in fat-burning enzymes that occurs during long periods of sitting—without tiring out the employee’s legs and lower back quite so much. The maze-like series of angled and tapered frames create an infinite number of leaning spots, for workers of any height. There are no fixed desks, so employees might find it natural to roam around and be active.

That feature is also one of the obvious impracticalities of ‘The End of Sitting.’ Without desks, how do staffers keep track of supplies, notes, or work documents? Without offices or conference rooms, how can people have meetings that don’t disrupt everyone else’s concentration? ‘The End of Sitting’ is both an art installation and an experiment, so it’s not actually concerned with answering those questions. Instead, Rietveld says this is ‘about showing a different way of thinking.’”

______________________________

“Sitting kills”:


From Kit Buchan at the Guardian, a little more about the Lowe’s robotic shopping assistant, OSHbot, one realized idea from the chain store’s Innovation Labs, and one which won’t be replacing human workers, not yet at least:

“According to [Innovation Lab’s Executive Director Kyle] Nel, OSHbot is the product of an extraordinary innovation scheme in which Lowe’s Innovation Labs ask published science-fiction writers to produce stories predicting futuristic scenarios for the store. Lowe’s then seek out what Nel calls ‘uncommon partners’ to help make the stories reality; in OSHbot’s case, the trendy Silicon Valley learning hub Singularity University and the startup robotics firm Fellow Robots.

OSHbot is a 4ft-something, pear-shaped character; limbless, with nothing but a vague green glow for a face, and a screen slanted in front like a starched pinny. ‘It’s basically a roving kiosk; we definitely didn’t want it to have arms or anything like that,’ says Nel. ‘But there’s still lots to figure out, for instance: what voice should the robot have? Should it be male, should it be female? There are so many things we can’t know until we try it.’

Nel is quick to clarify that OSHbot is not a replacement for human beings – rather it is there to ‘augment [the] store associates.'”


Santa: poll taxes, McMansions, corporate welfare.

 

You probably had a hectic Black Friday and so did Santa Claus. He was busy overseeing his new automated workshop in the North Pole. The elves can go fuck themselves. Robots work for free and they’re not a bunch of sassy little bitches. Good luck in the world of fetish porn, you tiny ingrates.

But don’t think Santa is giving your asshole children those toys for free. You’ll pay retail. You see, Santa has been working with Goldman Sachs to prepare Christmas Inc. for an IPO. He wants Bezos money. In fact, the big guy has gone right-wing like David Mamet and his interests now include making wealth inequality worse and spending like a Koch brother to prevent Obama from winning a third term. (Yeah, I know, but don’t tell him.)

Santa’s actually feeling pretty good these days. Thanks to those occasionally useful Tea Party dipshits, conservatives now run the House and Senate and are only a Mitt Romney Presidency away from making America a corporatocracy. Santa is very happy that Romney is considering running again in 2016, though he thinks ol’ Mitt should probably keep his creepy-eyed kid in the attic until all the votes have been counted.

You don’t like Santa’s vision? Well, he thinks you should go scratch your ass with a broken eggnog bottle. Santa’s a pimp and you’re a ho ho ho!

When Father becomes Emperor, all the squirrels shall be my minions.

From the November 10, 1925 Brooklyn Daily Eagle:

Ralph H. Baer, who just passed away, began dreaming of designing games for TV sets in 1951, but it wasn’t until 15 years later that he started to fully flesh out the idea, eventually creating the first home video-game system, the Odyssey. From his New York Times obituary by Douglas Martin:

Flash back to the sultry late summer of 1966: Mr. Baer is sitting on a step outside the Port Authority Bus Terminal in Manhattan waiting for a colleague. By profession, he is an engineer overseeing 500 employees at a military contractor. Today, a vision has gripped him, and he begins scribbling furiously on a yellow legal pad with a No. 2 pencil.

The result was a detailed four-page outline for a “game box” that would allow people to play board, action, sports and other games on almost any American television set. An intrigued boss gave him $2,000 for research and $500 for materials and assigned two men to work with him. For all three, as they plowed through prototype after prototype in a secret workshop, the project became an obsession.

In March 1971, Mr. Baer and his employer, Sanders Associates in Nashua, N.H., filed for the first video game patent, which was granted in April 1973 as Patent No. 3,728,480. It made an extraordinarily large claim to a legal monopoly for any product that included a domestic television with circuits capable of producing and controlling dots on a screen.

Sanders Associates licensed its system to Magnavox, which began selling it as Odyssey in the summer of 1972 as the first home video game console. It sold 130,000 units the first year.

Odyssey consisted of a master control unit containing all the electronic gear, two player control units that directed players on the TV screen, and a set of electronic program cards, each of which supported a different game. Plastic overlays that clung to the screen to supply color were included. To supplement the electronic action, a deck of playing cards, poker chips and a pair of dice were included.

But the guts of the device were what mattered: 40 transistors and 40 diodes. That hardware ran everything. Odyssey, often called the first home computing device, had no software.

Several months after Odyssey hit the market, Atari came out with the first arcade video game, Pong. Though Pong became better known than Odyssey and was in some ways more agile, Sanders and Magnavox immediately saw it as an infringement on their patent.•

