Excerpts


The Economist has a piece about the so-called “Obesity Penalty,” supported by a new Swedish study arguing that overweight people earn less than their weed-like co-workers. Probably a good idea to be circumspect about the whole thing–or at least about the causes, if the effect is real. An excerpt:

“BEING obese is the same as not having an undergraduate degree. That’s the bizarre message from a new paper that looks at the economic fortunes of Swedish men who enlisted in compulsory military service in the 1980s and 1990s. They show that men who are obese aged 18 grow up to earn 16% less than their peers of a normal weight. Even people who were overweight at 18—that is, with a body-mass index from 25 to 30—see significantly lower wages as an adult.

At first glance, a sceptic might be unconvinced by the results. After all, within countries the poorest people tend to be the fattest. One study found that Americans who live in the most poverty-dense counties are those most prone to obesity. If obese people tend to come from impoverished backgrounds, then we might expect them to have lower earnings as an adult.

But the authors get around this problem by mainly focusing on brothers. Every person included in their final sample—which is 150,000 people strong—has at least one male sibling also in that sample. That allows the economists to use ‘fixed-effects,’ a statistical technique that accounts for family characteristics (such as poverty). They also include important family characteristics like the parents’ income. All this statistical trickery allows the economists to isolate the effect of obesity on earnings.

So what does explain the ‘obesity penalty’?”
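The “fixed effects” technique mentioned above is easy to illustrate. Here is a minimal sketch in Python, with invented data and an assumed 16-log-point penalty baked in (not the study’s actual model), showing how demeaning within families strips out the shared background that would otherwise bias a naive regression:

```python
import numpy as np
import pandas as pd

# Hypothetical sample: pairs of brothers who share an unobserved
# family effect (e.g., childhood poverty) that both depresses wages
# and raises obesity risk, confounding a naive comparison.
rng = np.random.default_rng(0)
rows = []
for fam in range(5000):
    eff = rng.normal(0, 0.5)  # shared family background
    for _ in range(2):        # two brothers per family
        obese = rng.binomial(1, 0.25 if eff < 0 else 0.15)
        # The "true" penalty is assumed to be -0.16 log points.
        log_wage = 10 + eff - 0.16 * obese + rng.normal(0, 0.3)
        rows.append((fam, obese, log_wage))

df = pd.DataFrame(rows, columns=["family", "obese", "log_wage"])

# Naive pooled estimate: the family effect contaminates the slope.
naive = np.polyfit(df["obese"], df["log_wage"], 1)[0]

# Fixed effects: subtract each family's mean, which differences
# away everything brothers share, then fit on what remains.
g = df.groupby("family")
obese_w = df["obese"] - g["obese"].transform("mean")
wage_w = df["log_wage"] - g["log_wage"].transform("mean")
fe = np.polyfit(obese_w, wage_w, 1)[0]

print(f"naive: {naive:.3f}  fixed effects: {fe:.3f}  (truth: -0.160)")
```

In this toy setup the naive slope comes out well below -0.16, because obesity is more common in the low-wage families, while the within-family estimate lands near the assumed -0.16.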

My guess is that even if we have cars that are 90% autonomous (at least on highways) by 2015 and fully robotic in a half-dozen years as Elon Musk promises, it will take substantially longer than that to modify infrastructure to meet the demand. If no retrofitting is required, then that’s a whole different conversation. From Mike Ramsey at WSJ:

“Tesla Motors Inc. plans to unveil features that enable more computer-controlled driving of its Model S electric sedan on Thursday, following up on tweets sent by the company’s founder last week, according to a person familiar with the matter.

At an event scheduled for Thursday in Hawthorne, Calif., the Silicon Valley auto maker will announce the latest upgrades, about a week after Chief Executive Elon Musk posted a pair of tweets suggesting the auto maker soon would announce a product he referred to as ‘D.’

A Tesla spokeswoman declined to comment on the specifics of this week’s announcement.

Tesla’s foray into features that allow autonomous driving reflects a wider push among auto makers to produce cars that can handle more driving functions on their own. Mr. Musk recently said Tesla will have a fully autonomous car ready in five to six years.”

Do people still consider Marshall McLuhan to be so many mumbles the way they did when he fell from grace, without cause, by 1980 or so? He wasn’t always right, but the theorist was no Nostradamus, whose writing needs to be spun like an angel on the head of a pin to appear to be right. McLuhan was more correct about the looming Information Age than anyone. From Paul Herbert’s Pacific-Standard piece, “The Medium Is the Message: 50 Years Later”:

“TWENTY YEARS AGO, IN the introduction to a re-print of Understanding Media, renowned editor Lewis H. Lapham wrote that much of what McLuhan had to say made a lot more sense in 1994 than it did in 1964, what with two terms of Reagan and the creation of MTV. Twenty years after that, the banality of McLuhan’s ideas has solidified their merit. When Yahoo! CEO Marissa Mayer, for example, compared the expansion of big data to the planet developing a central nervous system, that’s McLuhan. When Chief Justice John Roberts opined that an alien from Mars might mistake the smartphone as an integral feature of human anatomy, that’s McLuhan, too. In 2014, it’s hard to overstate McLuhan’s prescience.

‘People who don’t like McLuhan in the academic world are either lazy, stupid, jealous, or some combination,’ says Paul Levinson, a professor of communication and media studies at Fordham University, where McLuhan taught for a year in the late ’60s. ‘McLuhan wasn’t into commonsense, reasonable propositions. He liked looking at things in a poetic, metaphoric way.’

And it’s true: McLuhan had a penchant for speaking in riddles and rhymes that might baffle at first, but grow into epiphany if given the chance. His rhetorical style was hyperbole. He didn’t shy away from playing the holy fool, as Wired would later call him, and on a number of occasions claimed his mission was simply to probe the new terrain, not come back to camp with answers.”

__________________________

McLuhan with Tom Wolfe, one of his champions, in 1970:


Aubrey de Grey operates at the extreme edges of gerontology, believing we won’t just extend life a little but essentially defeat death. But if we could live hundreds of years–or forever–what would this endless summer mean for global population? From Factor-Tech:

In a world where getting old is no longer an issue, concerns will arise about population levels and resources that the planet can provide.

De Grey admits that the world will change dramatically and that the transformation will not necessarily be a smooth one. “There may be some turbulence and obviously the more we can forward plan to minimise that turbulence the better,” he adds.

One UN report, from 2003, predicts that the world’s population could increase to more than 36bn people by 2300 – and that forecast is based on regular life expectancy. If everyone is living for hundreds of years then the resources needed to sustain them would drastically increase.

But this view does not give credit to other technologies that are developing at a faster implementation rate than anti-ageing, and people can have a blinkered view about this.

“They just don’t look at the problem properly so for example, one thing that people hardly ever acknowledge is that the other new technology is going to be around a great deal sooner than this technology, or at least sooner than this technology will have any demographic impact,” de Grey says.

“For example we will have much less carbon footprint because we will have things like better renewable energy and nuclear fusion and so on, so that it will actually be increasing the carrying capacity of the planet far faster than the defeat of ageing could increase the number of people on the planet.”•


Those who still believe privacy can be preserved by legislation either haven’t thought through the realities or are deceiving themselves. Get ready for your close-up because it’s not the pictures that are getting smaller, but the cameras. Tinier and tinier. Soon you won’t even notice them. And they can fly.

I have no doubt the makers of the Nixie, the wristwatch-drone camera, have nothing but good intentions, but not everyone using it will share them. From Joseph Flaherty at Wired:

“Being able to wear the drone is a cute gimmick, but its powerful software packed into a tiny shell could set Nixie apart from bargain Brookstone quadcopters. Expertise in motion-prediction algorithms and sensor fusion will give the wrist-worn whirlybirds an impressive range of functionality. A ‘Boomerang mode’ allows Nixie to travel a fixed distance from its owner, take a photo, then return. ‘Panorama mode’ takes aerial photos in a 360° arc. ‘Follow me’ mode makes Nixie trail its owner and would capture amateur athletes in a perspective typically reserved for Madden all-stars. ‘Hover mode’ gives any filmmaker easy access to impromptu jib shots. Other drones promise similar functionality, but none promise the same level of portability or user friendliness.

‘We’re not trying to build a quadcopter, we’re trying to build a personal photographer,’ says Jovanovic.

A Changing Perspective on Photography

[Jelena] Jovanovic and her partner Christoph Kohstall, a Stanford postdoc who holds a Ph.D. in quantum physics and a first-author credit in the journal Nature, believe photography is at a tipping point.

Early cameras were bulky, expensive, and difficult to operate. The last hundred years have produced consistently smaller, cheaper, and easier-to-use cameras, but future developments are forking. Google Glass provides the ultimate in portability, but leaves wearers with a fixed perspective. Surveillance drones offer unique vantage points, but are difficult to operate. Nixie attempts to offer the best of both worlds.”•


The director and artist Steve McQueen is a dizzying, demanding, daring talent, doing a rare thing in these times: making films from an adult perspective. Two excerpts follow from a new Financial Times profile by Peter Aspden, one about his allegedly irritable personality, and the other about his depiction of male sexuality in Shame.

______________________

The best way to describe the relationship between the two means of expression, he says, in a comparison he has made before, is that “the movie is the novel, and art is poetry. Not a lot of people appreciate poetry, and it is the same with art. It is a more specialised form. That’s the difference.”

But the two impulses are forever “expanding and contracting” in his mind, he says. I ask if it is difficult to shift between genres. It is the rarest of things for a video artist to convert to Hollywood film-making. “Not at all. It is not as if I am jumping into different states of mind. It is all about finding what you want to say, and then how you want to say it.” Is that very clear to him straightaway? “Oh yes. But these things are incubating in my mind for a long time. I am in 2007 right now.” I look for a hint of a smile as he says this but he appears deadly serious.

McQueen, who turns 45 this week, is routinely described as a prickly man who doesn’t suffer fools gladly, but I wonder if that is confusing his seriousness and unrelenting intensity for a kind of social awkwardness. He gives every impression to me of enjoying the interview process, watchful and concentrated while he is listening to the question, like a batsman steadying himself during a bowler’s run-up. When Kirsty Young brought up the same subject in a recent edition of the BBC’s Desert Island Discs, asking why he was so unfairly portrayed, he replied simply: “I am a black man. I’m used to that. If I walk into a room people make a judgment. I don’t care.”

______________________

He seems to relish plunging into controversial subjects, I say. Shame, his second feature film, was an extraordinarily candid view of the unheralded extremes of male sexuality. “That’s still not sorted,” he says quickly. “That is unfinished business. I really want to come back to that.” Why was that? “It is an extremely fascinating subject. But no one talks about it. Let’s get real! So many important decisions in the world are connected with the sexual appetites of important men. Whether it is JFK, or Clinton, or Martin Luther King. That is what we are. That is part of us. But sometimes people are embarrassed by their pleasures.

“It is a huge subject. So many people came out after that film and sent me anonymous letters, a lot of thank-yous, and some crazy stuff too.” What did women think of it, I ask? “I don’t know how much women know, or want to know, about men’s sexual appetites. A friend of mine went to see it with his wife, and she asked him, ‘Do those things really happen?’ And he was, like, ‘No, no, it is just a fantasy, it is just the movies.’” McQueen’s laugh suggests otherwise.•



I previously posted some stuff about driverless-car testing in a mock cityscape in Ann Arbor, Michigan, which might seem unnecessary given Google’s regular runs on actual streets and highways. But here’s an update on the progress from “Town Built for Driverless Cars,” by Will Knight at Technology Review:

“A mocked-up set of busy streets in Ann Arbor, Michigan, will provide the sternest test yet for self-driving cars. Complex intersections, confusing lane markings, and busy construction crews will be used to gauge the aptitude of the latest automotive sensors and driving algorithms; mechanical pedestrians will even leap into the road from between parked cars so researchers can see if they trip up onboard safety systems.

The urban setting will be used to create situations that automated driving systems have struggled with, such as subtle driver-pedestrian interactions, unusual road surfaces, tunnels, and tree canopies, which can confuse sensors and obscure GPS signals.

‘If you go out on the public streets you come up against rare events that are very challenging for sensors,’ says Peter Sweatman, director of the University of Michigan’s Mobility Transformation Center, which is overseeing the project. ‘Having identified challenging scenarios, we need to re-create them in a highly repeatable way. We don’t want to be just driving around the public roads.'”


The Mudd Club was a cabaret institution in New York for a few years in the late ’70s and early ’80s, the edgier little cousin to Studio 54, which wasn’t exactly Disneyland. An excerpt from a 1979 People article which includes a holy shit! quote from Andy Warhol:

“Ever on the prowl for outrageous novelty, New York’s fly-by-night crowd of punks, posers and the ultra hip has discovered new turf on which to flaunt its manic chic. It is the Mudd Club, a dingy disco lost among the warehouses of lower Manhattan. By day the winos skid by without a second glance. But come midnight (the opening time), the decked-out decadents amass 13 deep. For sheer kinkiness, there has been nothing like it since the cabaret scene in 1920s Berlin.

In just six months the Mudd has made its uptown precursor, Studio 54, seem almost passé and has had to post a sentry on the sidewalk. The difference is that the Mudd doesn’t have a velvet rope but a steel chain. Such recognizable fun-lovers as David Bowie, Mariel Hemingway, Diane von Furstenberg and Dan Aykroyd are automatically waved inside. For the rest, the club picks its own like some sort of perverse trash compactor. The kind of simple solution employed by U.S. gas stations is out of the question: At the Mudd, every night is odd. Proprietor Steve Mass, 35, admits that ‘making a fashion statement’ is the criterion. That means a depraved version of the audience of Let’s Make a Deal. One man gained entrance simply by flashing the stump of his amputated arm.

The action inside varies from irreverent to raunch. Andy Warhol is happy to have found a place, he says, ‘where people will go to bed with anyone—man, woman or child.’ Some patrons couldn’t wait for bedtime, and the management has tried to curtail sex in the bathrooms.”


For productivity to increase, labor costs must shrink. That’s fine provided new industries emerge to accommodate workers, but that really isn’t what we’ve seen so far in the Technological Revolution, the next great bend in the curve, as production and wages haven’t boomed. It’s been the trade of a taxi medallion for a large pink mustache. More convenient, if not cheaper (yet), for the consumer, but bad for the drivers.

Perhaps that’s because we’re at the outset of a slow-starting boom, the Second Machine Age, or perhaps what we’re going through refuses to follow the form of the Industrial Revolution. Maybe it’s the New Abnormal. The opening of the Economist feature, “Technology Isn’t Working”:

“IF THERE IS a technological revolution in progress, rich economies could be forgiven for wishing it would go away. Workers in America, Europe and Japan have been through a difficult few decades. In the 1970s the blistering growth after the second world war vanished in both Europe and America. In the early 1990s Japan joined the slump, entering a prolonged period of economic stagnation. Brief spells of faster growth in intervening years quickly petered out. The rich world is still trying to shake off the effects of the 2008 financial crisis. And now the digital economy, far from pushing up wages across the board in response to higher productivity, is keeping them flat for the mass of workers while extravagantly rewarding the most talented ones.

Between 1991 and 2012 the average annual increase in real wages in Britain was 1.5% and in America 1%, according to the Organisation for Economic Co-operation and Development, a club of mostly rich countries. That was less than the rate of economic growth over the period and far less than in earlier decades. Other countries fared even worse. Real wage growth in Germany from 1992 to 2012 was just 0.6%; Italy and Japan saw hardly any increase at all. And, critically, those averages conceal plenty of variation. Real pay for most workers remained flat or even fell, whereas for the highest earners it soared.

It seems difficult to square this unhappy experience with the extraordinary technological progress during that period, but the same thing has happened before. Most economic historians reckon there was very little improvement in living standards in Britain in the century after the first Industrial Revolution. And in the early 20th century, as Victorian inventions such as electric lighting came into their own, productivity growth was every bit as slow as it has been in recent decades.

In July 1987 Robert Solow, an economist who went on to win the Nobel prize for economics just a few months later, wrote a book review for the New York Times. The book in question, The Myth of the Post-Industrial Economy by Stephen Cohen and John Zysman, lamented the shift of the American workforce into the service sector and explored the reasons why American manufacturing seemed to be losing out to competition from abroad. One problem, the authors reckoned, was that America was failing to take full advantage of the magnificent new technologies of the computing age, such as increasingly sophisticated automation and much-improved robots. Mr Solow commented that the authors, ‘like everyone else, are somewhat embarrassed by the fact that what everyone feels to have been a technological revolution…has been accompanied everywhere…by a slowdown in productivity growth.'”

Anne Frank famously wrote, “Despite everything, I believe that people are really good at heart,” and if she could be hopeful, how can we justify our despondency? A sneakily cheerful side to the growing cadre of scientists and philosophers focused on existential risks that could drive human extinction is that if we make it through the next century, we may have unbridled abundance. From Aaron Labaree at Salon:

“‘When we think about general intelligence,’ says Luke Muehlhauser, Executive Director at MIRI, ‘that’s a meta-technology that gives you everything else that you want — including really radical things that are even weird to talk about, like having our consciousness survive for thousands of years. Physics doesn’t outlaw those things, it’s just that we don’t have enough intelligence and haven’t put enough work into the problem … If we can get artificial intelligence right, I think it would be the best thing that ever happened in the universe, basically.’

A surprising number of conversations with experts in human extinction end like this: with great hope. You’d think that contemplating robot extermination would make you gloomy, but it’s just the opposite. As [Martin] Rees explains, ‘What science does is makes one aware of the marvelous potential of life ahead. And being aware of that, one is more concerned that it should not be foreclosed by screwing up during this century.’ Concern over humanity’s extermination at the hands of nanobots or computers, it turns out, often conceals optimism of the kind you just don’t find in liberal arts majors. It implies a belief in a common human destiny and the transformative power of technology.

‘The stakes are very large,’ [Nick] Bostrom told me. ‘There is this long-term future that could be so enormous. If our descendants colonized the universe, we could have these intergalactic civilizations with planetary-sized minds thinking and feeling things that are beyond our ken, living for billions of years. There’s an enormous amount of value that’s on the line.’

It’s all pretty hypothetical for now.”•


I don’t know if it will happen within ten years–though that’s not an outrageous time frame–but 3-D printing will automate much of the restaurant process, making jobs vanish, and will also be common in homes as prices fall. The opening of “Is 3-D Printing the Next Microwave?” by Jane Dornbusch at the Boston Globe:

“Picture the dinner hour in a decade: As you leave work, you pull up an app (assuming we still use apps) on your phone (or your watch!) that will direct a printer in your kitchen to whip up a batch of freshly made ravioli, some homemade chicken nuggets for the kids, and maybe a batch of cookies, each biscuit customized to meet the nutritional needs of different family members. 

It sounds like science fiction, but scientists and engineers are working on 3-D printing, and the food version of the 3-D printer is taking shape. Don’t expect it to spin out fully cooked meals anytime soon. For now, the most popular application in 3-D food printing seems to be in the decidedly low-tech area of cake decoration.

Well, not just cake decoration, but sugary creations of all kinds. The Sugar Lab is the confectionery arm of 3-D printing pioneer 3D Systems, and it expects to have the ChefJet, a 3-D food printer, available commercially later this year. Though tinkerers have been exploring the possibilities of 3-D food printing for a few years, and another food printer, Natural Machines’ Foodini, is slated to appear by year’s end, 3D Systems says the ChefJet is the world’s first 3-D food printer.

Like so many great inventions, the ChefJet came about as something of an accident, this one performed by husband-and-wife architecture grad students Liz and Kyle von Hasseln a couple of years ago. At the Southern California Institute of Architecture, the von Hasselns used a 3-D printer to create models. Intrigued by the process, Liz von Hasseln says, ‘We bought a used printer and played around with different materials to see how to push the technology. One thing we tried was sugar. We thought if we altered the machine a bit and made it all food safe and edible, we could push into a new space.’ More tweaking ensued, and the ChefJet was born.”

___________________________

Walter Cronkite presents the kitchen of 2001 in 1967:


I haven’t yet read Walter Isaacson’s new Silicon Valley history, The Innovators, but I would be indebted if it answers the question of how instrumental Gary Kildall’s software was to Microsoft’s rise. Was Bill Gates and Paul Allen’s immense success built on intellectual thievery? Has the story been mythologized beyond realistic proportion? An excerpt from Brendan Koerner’s New York Times review of the book:

“The digital revolution germinated not only at button-down Silicon Valley firms like Fairchild, but also in the hippie enclaves up the road in San Francisco. The intellectually curious denizens of these communities ‘shared a resistance to power elites and a desire to control their own access to information.’ Their freewheeling culture would give rise to the personal computer, the laptop and the concept of the Internet as a tool for the Everyman rather than scientists. Though Isaacson is clearly fond of these unconventional souls, his description of their world suffers from a certain anthropological detachment. Perhaps because he’s accustomed to writing biographies of men who operated inside the corridors of power — Benjamin Franklin, Henry Kissinger, Jobs — Isaacson seems a bit baffled by committed outsiders like Stewart Brand, an LSD-inspired futurist who predicted the democratization of computing. He also does himself no favors by frequently citing the work of John Markoff and Tom Wolfe, two writers who have produced far more intimate portraits of ’60s counterculture.

Yet this minor shortcoming is quickly forgiven when The Innovators segues into its rollicking last act, in which hardware becomes commoditized and software goes on the ascent. The star here is Bill Gates, whom Isaacson depicts as something close to a punk — a spoiled brat and compulsive gambler who ‘was rebellious just for the hell of it.’ Like Paul Baran before him, Gates encountered an appalling lack of vision in the corporate realm — in his case at IBM, which failed to realize that its flagship personal computer would be cloned into oblivion if the company permitted Microsoft to license the machine’s MS-DOS operating system at will. Gates pounced on this mistake with a feral zeal that belies his current image as a sweater-clad humanitarian.”


Selfies, the derided yet immensely popular modern portraiture, draw ire because of narcissism and exhibitionism, of course, but also because anyone can take them and do so ad nauseam. It’s too easy and available, with no expertise or gatekeeper necessary. The act is megalomania, sure, but it’s also democracy, that scary, wonderful rabble. From, ultimately, a defense of the self-directed shot by Douglas Coupland in the Financial Times:

“Selfies are the second cousin of the air guitar.

Selfies are the proud parents of the dick pic.

Selfies are, in some complex way, responsible for the word ‘frenemy.’

I sometimes wonder what selfies would look like in North Korea.

Selfies are theoretically about control – or, if you’re theoretically minded, they’re about the illusion of self-control. With a selfie some people believe you’re buying into a collective unspoken notion that everybody needs to look fresh and flirty and young for ever. You’re turning yourself into a product. You’re abdicating power of your sexuality. Or maybe you’re overthinking it – maybe you’re just in love with yourself.

I believe that it’s the unanticipated side effects of technology that directly or indirectly define the textures and flavours of our eras. Look at what Google has already done to the 21st century. So when smartphones entered the world in 2002, I think that if you gathered a group of smart media-savvy people in a room with coffee and good sandwiches, before the end of the day, the selfie could easily have been forecast as an inevitable smartphone side effect. There’s actually nothing about selfies that feels like a surprise in any way. The only thing that is surprising is the number of years it took us to isolate and name the phenomenon. I do note, however, that once the selfie phenomenon was named and shamed, selfies exploded even further, possibly occupying all of those optical-fibre lanes of the internet that were once occupied by Nigerian princes and ads for penis enlargement procedures.”



The end is near, roughly speaking. A number of scientists and philosophers, most notably Martin Rees and Nick Bostrom, agonize over the existential risks to humanity that might obliterate us long before the sun dies. A school of thought has arisen over the End of Days. From Sophie McBain in the New Statesman (via 3 Quarks Daily):

“Predictions of the end of history are as old as history itself, but the 21st century poses new threats. The development of nuclear weapons marked the first time that we had the technology to end all human life. Since then, advances in synthetic biology and nanotechnology have increased the potential for human beings to do catastrophic harm by accident or through deliberate, criminal intent.

In July this year, long-forgotten vials of smallpox – a virus believed to be ‘dead’ – were discovered at a research centre near Washington, DC. Now imagine some similar incident in the future, but involving an artificially generated killer virus or nanoweapons. Some of these dangers are closer than we might care to imagine. When Syrian hackers sent a message from the Associated Press Twitter account that there had been an attack on the White House, the Standard & Poor’s 500 stock market briefly fell by $136bn. What unthinkable chaos would be unleashed if someone found a way to empty people’s bank accounts?

While previous doomsayers have relied on religion or superstition, the researchers at the Future of Humanity Institute want to apply scientific rigour to understanding apocalyptic outcomes. How likely are they? Can the risks be mitigated? And how should we weigh up the needs of future generations against our own?

The FHI was founded nine years ago by Nick Bostrom, a Swedish philosopher, when he was 32. Bostrom is one of the leading figures in this small but expanding field of study.”


Sure, we have phones that are way nicer now, but the Technological Revolution has largely been injurious to anyone in the labor market, and things are going to get worse, at least in the near and mid term. A free-market society that is highly automated isn’t really very free. Drive for Uber until autonomous cars can take over the wheel, you’re told, or rent a spare room on Airbnb–make space for yourself on the margins through the Sharing Economy. But there’s less to share for most people. From an Economist report:

“Before the horseless carriage, drivers presided over horse-drawn vehicles. When cars became cheap enough, the horses and carriages had to go, which eliminated jobs such as breeding and tending horses and making carriages. But cars raised the productivity of the drivers, for whom the shift in technology was what economists call ‘labour-augmenting.’ They were able to serve more customers, faster and over greater distances. The economic gains from the car were broadly shared by workers, consumers and owners of capital. Yet the economy no longer seems to work that way. The big losers have been workers without highly specialised skills.

The squeeze on workers has come from several directions, as the car industry clearly shows. Its territory is increasingly encroached upon by machines, including computers, which are getting cheaper and more versatile all the time. If cars and lorries do not need drivers, then both personal transport and shipping are likely to become more efficient. Motor vehicles can spend more time in use, with less human error, but there will be no human operator to share in the gains.

At the same time labour markets are hollowing out, polarising into high- and low-skill occupations, with very little employment in the middle. The engineers who design and test new vehicles are benefiting from technological change, but they are highly skilled and it takes remarkably few of them to do the job. At Volvo much of the development work is done virtually, from the design of the cars to the layout of the production line. Other workers, like the large numbers of modestly skilled labourers that might once have worked on the factory floor, are being squeezed out of such work and are now having to compete for low-skill and low-wage jobs.

Labour has been on the losing end of technological change for several decades.”

It’s not a shocker that the late psychologist and computer scientist Dr. Christopher Evans, who presented the great 1979 TV series The Mighty Micro, combined the two disciplines he was devoted to when trying to explain why he believed people dream. From Daniel Goleman’s 1984 New York Times article about the possible causes of rapid eye movement during sleep:

Dr. Evans, a psychologist and computer scientist, proposes that dreams are the brain’s equivalent of a computer’s inspection of its programs, allowing a chance to integrate the experiences of the day with the memories already stored in the brain. His theory is based in part on evidence that dreaming consolidates learning and memory.

The contents of a dream, according to Dr. Evans, are fragments of events and experiences during the day which are being patched into related previous memories. “Dreaming,” he writes, “might be our biological equivalent to the computer’s process of program inspection.”•



From the Overcoming Bias post in which economist Robin Hanson comments on Debora MacKenzie’s New Scientist article “The End of Nations,” a piece which wonders about, among other things, whether states in the modern sense predated the Industrial Revolution:

“An interesting claim: the nation-state didn’t exist before, and was caused by, the industrial revolution. Oh there were empires before, but most people didn’t identify much with empires, or see empires as much influencing their lives. In contrast people identify with nation-states, which they see as greatly influencing their lives. More:

Before the late 18th century there were no real nation states. … If you travelled across Europe, no one asked for your passport at borders; neither passports nor borders as we know them existed. People had ethnic and cultural identities, but these didn’t really define the political entity they lived in. …

Agrarian societies required little actual governing. Nine people in 10 were peasants who had to farm or starve, so were largely self-organising. Government intervened to take its cut, enforce basic criminal law and keep the peace within its undisputed territories. Otherwise its main role was to fight to keep those territories, or acquire more. … Many eastern European immigrants arriving in the US in the 19th century could say what village they came from, but not what country: it didn’t matter to them. … Ancient empires are coloured on modern maps as if they had firm borders, but they didn’t. Moreover, people and territories often came under different jurisdictions for different purposes.

Such loose control, says Bar-Yam, meant pre-modern political units were only capable of scaling up a few simple actions such as growing food, fighting battles, collecting tribute and keeping order. …

The industrial revolution … demanded a different kind of government. … ‘In 1800 almost nobody in France thought of themselves as French. By 1900 they all did.’ … Unlike farming, industry needs steel, coal and other resources which are not uniformly distributed, so many micro-states were no longer viable. Meanwhile, empires became unwieldy as they industrialised and needed more actual governing. So in 19th-century Europe, micro-states fused and empires split.

These new nation states were justified not merely as economically efficient, but as the fulfilment of their inhabitants’ national destiny. A succession of historians has nonetheless concluded that it was the states that defined their respective nations, and not the other way around.”


It’s a heartbreaker watching what’s happening to the New York Times these days, and the latest layoffs are just the most recent horrible headline. The Magazine is currently a green shoot, with its bright new editor, Jake Silverstein, and a boost to staffing, but that section is the outlier. The business can’t continue to suffer without being joined by the journalism. You just hope the company is ultimately sold to someone great.

At the Washington Post, not much has changed dramatically since Jeff Bezos bought the Graham family jewel, despite some executive shuffles and new hires. Does Bezos have a long-term plan? Does he have any plan? Does it really matter in the intervening period, since he can afford to wait for everyone else to fall and position his publication as the inheritor? From Dylan Byers at Politico:

“Despite expectations, Bezos himself had never promised a reinvention. ‘There is no map, and charting a path ahead will not be easy,’ he wrote in his first memo to Post staff in August of last year. Still, his reputation preceded him: With Amazon, he had revolutionized not just the book-selling business but the very means and standards of online shopping. He was planning ambitious new initiatives like drone delivery. Surely, this man had the silver bullet to save the Washington Post, and perhaps the newspaper industry.

Bezos, who declined to be interviewed for this story, is holding his cards close to his chest. He has no influence on the editorial side, according to [Executive Editor Martin] Baron, but is focused on ‘broader strategic efforts.’

If Bezos has any innovative digital initiatives in the works, they’re being formed not in Washington but in New York. In March, the Post launched a Manhattan-based design and development office called WPNYC, which is focused on improving the digital experience and developing new advertising products.

‘Jeff’s preoccupation isn’t editorial, it’s delivery,’ one Post staffer said of WPNYC. ‘He wants to change the way people receive, read and experience the news. The only problem is we still don’t know what that looks like.'”


Aiming to make surgery even less invasive, researchers are devising robots that can slide into small openings and perform currently messy operations. From Sarah Laskow at the Atlantic:

“In the past few years, surgeons have been pushing to make these less invasive surgeries almost entirely invisible. Instead of cutting a tiny window in the outside of the body, they thought, why not cut one inside? Surgeons would first enter a person’s body through a ‘natural orifice’ and make one small incision, through which to access internal organs. The end result of this idea was that, in 2009, a surgeon removed a woman’s kidney through her vagina.

Few surgeons were convinced this was actually an improvement though. Instead, they have focused on minimizing the number of tiny incisions needed to perform surgery. Single-site surgery requires just one ‘port’ into a body.

A team of surgeons at Columbia, for instance, is working on a small robotic arm—minuscule, when compared to the da Vinci system—that can sneak into one 15 millimeter incision. And NASA is working on a robot that can enter the abdominal cavity through a person’s belly button, Matrix-like, to perform simple surgeries. It’s meant to be used in emergencies, but we know how this story goes: Soon enough, it’ll be routine for a robot to slide into a person’s body and pull her appendix back out.”


Elon Musk has said that Tesla will produce fully autonomous vehicles within six years, which doesn’t make complete sense to me because I think infrastructure would have to be modified before that’s possible. But he is now promising that 2015 models will reach the 90% autopilot threshold. From Chris Ziegler at The Verge:

“In an excerpt from a CNNMoney interview, Tesla boss Elon Musk says that the self-driving car — or ‘autopilot,’ the term he prefers — is basically just months away from retail. Here’s the language:

‘Autonomous cars will definitely be a reality. A Tesla car next year will probably be 90 percent capable of autopilot. Like, so 90 percent of your miles can be on auto. For sure highway travel.

How’s that going to happen?

With a combination of various sensors. You combine cameras with image recognition with radar and long-range ultrasonics, that’ll do it. Other car companies will follow.

But you guys are going to be the leader?

Of course. I mean, Tesla’s a Silicon Valley company. If we’re not the leader, shame on us.'”


Sand seems limitless, something we can almost disregard. But like water, its supply is currently under stress, owing in part to a growing world population requiring basic resources. A global building boom is stripping beaches bare, their sand hauled off to make cement. From Laura Höflinger at Spiegel:

“The phenomenon of disappearing beaches is not unique to Cape Verde. With demand for sand greater than ever, it can be seen in most parts of the world, including Kenya, New Zealand, Jamaica and Morocco. In short, our beaches are disappearing. ‘It’s the craziest thing I’ve seen in the past 25 years,’ says Robert Young, a coastal researcher at Western Carolina University. ‘We’re talking about ugly, miles-long moonscapes where nothing can live anymore.’

The sand on our ocean shores, once a symbol of inexhaustibility, has suddenly become scarce. So scarce that stealing it has become attractive.

Never before has Earth been graced with the prosperity we are seeing today, with countries like China, India and Brazil booming. But that also means that demand for sand has never been so great. It is used in the production of computer chips, plates and mobile phones. More than anything, though, it is used to make cement. You can find it in the skyscrapers in Shanghai, the artificial islands of Dubai and in Germany’s autobahns.”


A follow-up on yesterday’s post about films being released on all screens, not just theater ones: Netflix is continuing to transform itself into a studio that streams, inking a four-picture deal with the inexplicably popular Adam Sandler. (The first three are rumored to be a trilogy about a golfer who has violent diarrhea competing against another golfer who has violent diarrhea.) From Pamela McClintock at the Hollywood Reporter:

“Netflix has signed a deal to make four feature films with Adam Sandler as the streaming service continues its empire-building and moves into producing original movies that bypass the usual theatrical release.

Sandler’s Happy Madison Productions will work alongside Netflix in developing the yet-to-be announced titles, which will premiere exclusively in the nearly 50 countries where Netflix operates. It’s a significant move for Sandler, a longtime denizen of the Hollywood studio system — a system wedded to playing films first in theaters, not in the home. He’ll both star in and produce the Netflix projects.

‘When these fine people came to me with an offer to make four movies for them, I immediately said yes for one reason and one reason only … Netflix rhymes with Wet Chicks,’ Sandler said in a statement. ‘Let the streaming begin!!!!'”


Thanks to the excellent 3 Quarks Daily for pointing me to Maia Szalavitz’s Substance.com essay, “Most People With Addiction Simply Grow Out Of It.” “Hopelessly addicted” is a phrase we’re all familiar with, but it describes an extreme outlier, not the rule, as most people kick after a few years. We likely believe addiction is terminal because we conjure the most extreme and dramatic examples to represent it; call it the Availability Heuristic of heroin and the like. Unfortunately, that misunderstanding influences laws and treatment. An excerpt:

“Why do so many people still see addiction as hopeless? One reason is a phenomenon known as ‘the clinician’s error,’ which could also be known as the ‘journalist’s error’ because it is so frequently replicated in reporting on drugs. That is, journalists and rehabs tend to see the extremes: Given the expensive and often harsh nature of treatment, if you can quit on your own you probably will. And it will be hard for journalists or treatment providers to find you.

Similarly, if your only knowledge of alcohol came from working in an ER on Saturday nights, you might start thinking that prohibition is a good idea. All you would see are overdoses, DTs, or car crash, rape or assault victims. You wouldn’t be aware of the patients whose alcohol use wasn’t causing problems. And so, although the overwhelming majority of alcohol users drink responsibly, your ‘clinical’ picture of what the drug does would be distorted by the source of your sample of drinkers.

Treatment providers get a similarly skewed view of addicts: The people who keep coming back aren’t typical—they’re simply the ones who need the most help. Basing your concept of addiction only on people who chronically relapse creates an overly pessimistic picture.

This is one of many reasons why I prefer to see addiction as a learning or developmental disorder, rather than taking the classical disease view.”
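The “clinician’s error” Szalavitz describes is a form of length-biased sampling, and a toy simulation makes the distortion vivid. The numbers below are invented for illustration; the point is only that when each year spent in treatment is another chance to be counted, chronic cases swamp the sample:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 100_000

# Hypothetical population: 80% remit within a few years,
# 20% relapse chronically for a decade or more.
chronic = rng.random(n) < 0.20
years_sick = np.where(chronic,
                      rng.integers(10, 30, n),   # chronic course
                      rng.integers(1, 5, n))     # short course

print("true share who stay addicted:", chronic.mean())  # ~0.20

# A clinic observes people in proportion to how long they cycle
# through treatment, so long courses are heavily overrepresented.
p = years_sick / years_sick.sum()
seen = rng.choice(n, size=5_000, p=p)
print("share chronic among those a clinic sees:",
      chronic[seen].mean())  # ~0.66
```

From the clinic’s chair, roughly two-thirds of patients look hopeless even though four-fifths of the underlying population recover.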


I just want an apology from the geniuses who mocked me for predicting at the start of the aughts (in a published article that’s no longer online) that films would eventually be released on all screens simultaneously, large theater ones as well as on TVs and computers. It hasn’t happened yet, but it may. Actually, it will, almost definitely. The opening of “Is Netflix Trying to Kill the Theater Once and for All?” by Grantland’s John Lopez:

“Next time you’re in Los Angeles, check out the historic Broadway theater district downtown: At the turn of the century, before the studios and theater chains were split apart, the stretch of Broadway between Third and Olympic boasted the highest concentration of cinemas in the world, the jeweled crown of L.A.’s burgeoning film industry. On any given night, studios premiered their latest films at sumptuous movie palaces like the Orpheum and the Million Dollar Theater. More recently, these temples of cinema, which wouldn’t look out of place at Versailles, have hosted Sunday revival churches and Spanish-language swap meets. Now they’re mostly ghosts of a bygone era when the theatrical experience was the undisputed king of American mass culture. It’s that ghost that streaked through modern-day multiplex owners’ nightmares Monday when Netflix (aided by the prince of darkness himself, Harvey Weinstein) announced that for the first time it would stream a major feature film, Crouching Tiger, Hidden Dragon: The Green Legend, simultaneous to its IMAX theater release next summer.

Predictably, by Tuesday morning a film business already battered by the worst box office summer since 1997 went apeshit. In fact, that nightmare freaked out theater owners so bad that Regal, Cinemark, and AMC — the nation’s three biggest theater chains — dropped the popcorn gauntlet Tuesday and announced they would refuse to carry the Crouching Tiger sequel at their theaters. In other words, as Netflix was announcing a historic new era when on-demand truly means on-demand, the nation’s theaters collectively said, “Over our swap-meet-hosting dead bodies.” Obviously, your local cineplex isn’t going to shut down after next summer, but let’s answer some questions about what’s going on here before the revolution arrives.

Could this truly be the beginning of the much-foretold end of the moviegoing experience? And should you even care?

Yes. And yes.”


An excerpt from “Future of Rail 2050,” an ARUP study which predicts that the demands of sprawling megacities will completely overhaul the nature of railway stations and that the typical person will be named “Nuno”:

“Hugo Dupont, 31 • Smart City Engineer

Hugo is rushing to catch the Metro train to work. Earlier, as he reached Rue Daval, he remembered that he had left a parcel on his kitchen counter and had to turn back to get it. Now, running a little late but parcel in hand, he pauses as a fleet of driverless pods pass by and then crosses the road at the signal, disappearing into the Metro station. 

He needs the package to be delivered that evening, as today is his friend Nuno’s birthday. At the entrance to the Metro, he drops the parcel into the International Express box next to the interactive tourist information wall. As he selects to receive freight alerts to track the progress of his package and pays for the shipping with a tap of a button, a message notifies him that his meeting with colleagues in Hong Kong via holographic software will start in 15 minutes. He hurries to the platform to catch his train.

The platform screen doors slide shut just before Hugo can board the Metro. However, he isn’t too worried as he knows the next train will arrive in under a minute. The driverless metro trains can travel in close succession as they constantly communicate with each other and with rail infrastructure and automatically respond to the movements of the other trains on the track, making the metro extremely safe and efficient. 

As he waits, Hugo notices other commuters buying groceries from the virtual shopping wall. As his fridge hasn’t sent him any alerts, he thinks he is stocked up well enough at home for the time being. He also glances at some of the artwork on platform screen doors — he enjoys seeing the changing digital exhibitions every day.

Meanwhile, at 08:46, Hugo’s parcel drops onto a conveyor belt and is transported to a pod on the underground freight pipeline. The routing code is scanned as it is loaded onto the pod, and the package is whisked away to Gare Centrale. The electric pod travels uninhibited at a steady pace, independent of traffic and weather conditions, and at 09:16 the package is loaded onto the mail carriage at the back of the waiting high-speed EuroTrain that carries both passengers and small express freight. At 10:35, the train leaves the station and runs directly to Berlin.

In his office, Hugo is testing a new system for analysing how much electricity from braking trains is fed back into the grid, when he receives a notification informing him that his package is on the train and is running on schedule. Hugo lives alone in an apartment in a large European megacity. Having studied abroad, he has returned to his home city and works as a Smart City Engineer for the City Authority, maintaining a network of sensors tracking electricity, traffic and people flows to create efficiencies across city systems. He likes gadgets and his wearable computers perform a variety of functions from wayfinding, to holographic communication, to the real-time monitoring of his health.”
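One detail in this scenario, the metro trains running “in close succession” because they constantly broadcast their state, is essentially moving-block signalling, and the headway gain is simple arithmetic. A back-of-the-envelope sketch, with all figures invented for illustration:

```python
# Toy headway comparison: moving-block vs. fixed-block control.
# Under moving block, a follower needs only its own braking
# distance plus a margin behind the train ahead; under fixed
# block it must trail by whole signalling blocks.

v = 22.0        # cruise speed in m/s (~80 km/h), assumed
a_brake = 1.0   # guaranteed braking rate in m/s^2, assumed
margin = 50.0   # safety buffer in metres, assumed
block = 800.0   # length of a traditional fixed block, assumed

braking = v ** 2 / (2 * a_brake)       # ~242 m to stop
moving_gap = braking + margin          # ~292 m between trains
fixed_gap = 2 * block                  # roughly two clear blocks

print(f"moving-block headway: {moving_gap / v:.0f} s")  # ~13 s
print(f"fixed-block headway:  {fixed_gap / v:.0f} s")   # ~73 s
```

Under these made-up but plausible figures, continuous train-to-train communication is what turns “the next train will arrive in under a minute” from marketing into physics.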
