Urban Studies


In a review of Martin Wolf’s The Shifts and the Shocks in the New York Review of Books, Paul Krugman argues that the financial bubble may not have caused the 2008 crash but merely, and briefly, masked an economy mired in long-term stagnation. An excerpt:

“Emphasizing the need to reduce financial fragility makes sense if you believe that the legacy of past financial excess is the reason we’re in so much trouble now. But are we sure about that? Let me offer two reasons to be skeptical.

First, while the depression that overtook the Western world in 2008 clearly came after the collapse of a vast financial bubble, that doesn’t mean that the bubble caused the depression. Late in The Shifts and the Shocks Wolf mentions the reemergence of the ‘secular stagnation’ hypothesis, most famously in the speeches and writing of Lawrence Summers (Lord Adair Turner independently made similar points, as did I). But I’m not sure whether readers will grasp the full implications. If the secular stagnationists are right, advanced economies now suffer from persistently inadequate demand, so that depression is their normal state, except when spending is supported by bubbles. If that’s true, bubbles aren’t the root of the problem; they’re actually a good thing while they last, because they prop up demand. Unfortunately, they’re not sustainable—so what we need urgently are policies to support demand on a continuing basis, which is an issue very different from questions of financial regulation.

Wolf actually does address this issue briefly, suggesting that the answer might lie in deficit spending financed by the government’s printing press. But this radical suggestion is, as I said, overshadowed by his calls for more financial regulation. It’s the morality play aspect again: the idea that we need to don a hairshirt and repent our sins resonates with many people, while the idea that we may need to abandon conventional notions of fiscal and monetary virtue has few takers.”


Oh, it’s fun designing a city on paper, or even redesigning one. At io9, Annalee Newitz has a thought experiment for making over New York: imagine it suddenly becomes a megacity with triple the population, and figure out how to make that sustainable. Probably good practice, since the number of New Yorkers will likely head toward that stratosphere over the coming decades, if flooding doesn’t become a recurrent problem. An excerpt from the “Disappearing Streets” section:

“New York City is already one of the most densely-packed urban spaces in the world, with 10,724 people on average per square kilometer. To triple the living spaces here, we’ll need to build up — but we’ll also need to build between. The city could no longer afford to devote so much street space to the products of an already-shaky auto industry, and the city’s grid would change immeasurably. So would the laws that govern it.

For efficiency’s sake, Manhattan would have to retain a couple of the major avenues like Fifth, which cuts through the center of the island. But it would be reserved for trucks delivering food — or taking garbage out. Other streets would be for licensed taxis and services like Uber, while cars belonging to individuals might be routed to the edges of the island, or to other boroughs entirely. Getting around in Manhattan would mean taking public transit, or paying dearly to get an Uber.

At the same time, there would be a flowering of pedestrian walkways like Sixth and a Half Avenue, which tunnels through the skyscrapers of midtown in between Sixth and Seventh Aves. As more skyscrapers grew, walkways would also take to the skies in bridges between buildings. To keep the ground-level streets less congested, pedestrians would be invited to walk Broadway from the air, hustling from building to building via a growing network of architectural tissues that would nourish a new sidewalk culture fifteen stories off the ground.

Some of these elevated sidewalks would be classic New York, complete with tar-gummed concrete and jagged nubs of rusted rebar poking out at odd angles. But others would look like high-tech works of art.”


The Economist has a piece about the so-called “Obesity Penalty,” supported by a new Swedish study arguing that overweight people earn less than their thinner co-workers. Probably a good idea to be circumspect about the whole thing–or at least about its causes, if the effect is real. An excerpt:

“BEING obese is the same as not having an undergraduate degree. That’s the bizarre message from a new paper that looks at the economic fortunes of Swedish men who enlisted in compulsory military service in the 1980s and 1990s. They show that men who are obese aged 18 grow up to earn 16% less than their peers of a normal weight. Even people who were overweight at 18—that is, with a body-mass index from 25 to 30—see significantly lower wages as an adult.

At first glance, a sceptic might be unconvinced by the results. After all, within countries the poorest people tend to be the fattest. One study found that Americans who live in the most poverty-dense counties are those most prone to obesity. If obese people tend to come from impoverished backgrounds, then we might expect them to have lower earnings as an adult.

But the authors get around this problem by mainly focusing on brothers. Every person included in their final sample—which is 150,000 people strong—has at least one male sibling also in that sample. That allows the economists to use ‘fixed-effects,’ a statistical technique that accounts for family characteristics (such as poverty). They also include important family characteristics like the parents’ income. All this statistical trickery allows the economists to isolate the effect of obesity on earnings.

So what does explain the ‘obesity penalty’?”
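The sibling “fixed effects” design the excerpt describes can be sketched in a few lines. Everything below is simulated for illustration — the 16% penalty, the sample sizes, and the noise levels are assumptions, not the Swedish study’s actual data. The point is just that comparing brothers within the same family strips out shared background (like parental poverty) that would otherwise contaminate a naive comparison:

```python
import numpy as np

# Toy sketch of a sibling fixed-effects regression. All numbers are made up.
rng = np.random.default_rng(42)

n_fam, sibs = 5000, 2
fam = np.repeat(np.arange(n_fam), sibs)        # family id for each brother
family_bg = rng.normal(0.0, 5.0, n_fam)        # shared background, e.g. poverty

# Make obesity more likely in worse-off families, so a naive comparison is biased.
obese = (rng.normal(0.0, 1.0, fam.size) - 0.2 * family_bg[fam] > 0.8).astype(float)

TRUE_PENALTY = -16.0                           # built-in wage penalty (%)
wage = 100.0 + family_bg[fam] + TRUE_PENALTY * obese + rng.normal(0.0, 3.0, fam.size)

def within_transform(x, g):
    """Subtract each family's mean from its members (removes family effects)."""
    return x - (np.bincount(g, weights=x) / np.bincount(g))[g]

# Naive OLS slope: contaminated by family background.
beta_naive = np.cov(obese, wage)[0, 1] / np.var(obese, ddof=1)

# Fixed-effects slope: compares brothers within the same family.
o_d, w_d = within_transform(obese, fam), within_transform(wage, fam)
beta_fe = (o_d @ w_d) / (o_d @ o_d)

print(f"naive: {beta_naive:.1f}, fixed effects: {beta_fe:.1f}")  # naive overstates the penalty; FE lands near -16
```

The within-transform is the whole trick: anything constant within a family (income, neighborhood, genes the brothers share) is subtracted away, so only within-family differences identify the obesity coefficient.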

My guess is that even if we have cars that are 90% autonomous (at least on highways) by 2015 and fully robotic in a half-dozen years as Elon Musk promises, it will take substantially longer than that to modify infrastructure to meet the demand. If no retrofitting is required, then that’s a whole different conversation. From Mike Ramsey at WSJ:

“Tesla Motors Inc. plans to unveil features that enable more computer-controlled driving of its Model S electric sedan on Thursday, following up on tweets sent by the company’s founder last week, according to a person familiar with the matter.

At an event scheduled for Thursday in Hawthorne, Calif., the Silicon Valley auto maker will announce the latest upgrades, about a week after Chief Executive Elon Musk posted a pair of tweets suggesting the auto maker soon would announce a product he referred to as ‘D.’

A Tesla spokeswoman declined to comment on the specifics of this week’s announcement.

Tesla’s foray into features that allow autonomous driving reflects a wider push among auto makers to produce cars that can handle more driving functions on their own. Mr. Musk recently said Tesla will have a fully autonomous car ready in five to six years.”

Do people still consider Marshall McLuhan to be so many mumbles the way they did when he fell from grace, without cause, by 1980 or so? He wasn’t always right, but the theorist was no Nostradamus, whose writing needs to be spun like an angel on the head of a pin to appear to be right. McLuhan was more correct about the looming Information Age than anyone. From Paul Herbert’s Pacific Standard piece, “The Medium Is the Message: 50 Years Later“:

“TWENTY YEARS AGO, IN the introduction to a reprint of Understanding Media, renowned editor Lewis H. Lapham wrote that much of what McLuhan had to say made a lot more sense in 1994 than it did in 1964, what with two terms of Reagan and the creation of MTV. Twenty years after that, the banality of McLuhan’s ideas has solidified their merit. When Yahoo! CEO Marissa Mayer, for example, compared the expansion of big data to the planet developing a central nervous system, that’s McLuhan. When Chief Justice John Roberts opined that an alien from Mars might mistake the smartphone as an integral feature of human anatomy, that’s McLuhan, too. In 2014, it’s hard to overstate McLuhan’s prescience.

‘People who don’t like McLuhan in the academic world are either lazy, stupid, jealous, or some combination,’ says Paul Levinson, a professor of communication and media studies at Fordham University, where McLuhan taught for a year in the late ’60s. ‘McLuhan wasn’t into commonsense, reasonable propositions. He liked looking at things in a poetic, metaphoric way.’

And it’s true: McLuhan had a penchant for speaking in riddles and rhymes that might baffle at first, but grow into epiphany if given the chance. His rhetorical style was hyperbole. He didn’t shy away from playing the holy fool, as Wired would later call him, and on a number of occasions claimed his mission was simply to probe the new terrain, not come back to camp with answers.”


McLuhan with Tom Wolfe, one of his champions, in 1970:


From the November 8, 1895 Brooklyn Daily Eagle:

“Complaints have reached Flatbush that the residents of the northeastern section of the Twenty-ninth ward are annoyed by a man named Thomas McCormick, who is well known to the police. He served two years and a half in state’s prison for highway robbery and two more years for house breaking, beside having been arrested and convicted for minor offenses a dozen times during the last fifteen years. He is 35 years old, powerfully built and as strong as three ordinary men. No particular charge has been brought against him this time for the reason, the police say, that people in his neighborhood are unwilling to appear against him in a police court because they fear his vengeance. Sergeant Zimmerman told an Eagle reporter last night that a few days ago McCormick went into a Flatbush barber shop, the owner of which is a bird fancier, and ate two live canary birds, feathers and all.”



I have an authentic mummified monkey that is 75 years old available for sale. It looks just as it did when it passed. It is in a big jar but the glass is a bit cloudy. Rare piece.

Those who still believe privacy can be preserved by legislation either haven’t thought through the realities or are deceiving themselves. Get ready for your close-up, because it’s not the pictures that are getting smaller, but the cameras. Tinier and tinier. Soon you won’t even notice them. And they can fly.

I have no doubt the makers of the Nixie, the wristwatch-drone camera, have nothing but good intentions, but not everyone using it will. From Joseph Flaherty at Wired:

“Being able to wear the drone is a cute gimmick, but its powerful software packed into a tiny shell could set Nixie apart from bargain Brookstone quadcopters. Expertise in motion-prediction algorithms and sensor fusion will give the wrist-worn whirlybirds an impressive range of functionality. A ‘Boomerang mode’ allows Nixie to travel a fixed distance from its owner, take a photo, then return. ‘Panorama mode’ takes aerial photos in a 360° arc. ‘Follow me’ mode makes Nixie trail its owner and would capture amateur athletes in a perspective typically reserved for Madden all-stars. ‘Hover mode’ gives any filmmaker easy access to impromptu jib shots. Other drones promise similar functionality, but none promise the same level of portability or user friendliness.

‘We’re not trying to build a quadcopter, we’re trying to build a personal photographer,’ says Jovanovic.

A Changing Perspective on Photography

[Jelena] Jovanovic and her partner Christoph Kohstall, a Stanford postdoc who holds a Ph.D. in quantum physics and a first-author credit in the journal Nature, believe photography is at a tipping point.

Early cameras were bulky, expensive, and difficult to operate. The last hundred years have produced consistently smaller, cheaper, and easier-to-use cameras, but future developments are forking. Google Glass provides the ultimate in portability, but leaves wearers with a fixed perspective. Surveillance drones offer unique vantage points, but are difficult to operate. Nixie attempts to offer the best of both worlds.”•


I previously posted some stuff about driverless-car testing in a mock cityscape in Ann Arbor, Michigan, which might seem unnecessary given Google’s regular runs on actual streets and highways. But here’s an update on the progress from “Town Built for Driverless Cars,” by Will Knight at Technology Review:

“A mocked-up set of busy streets in Ann Arbor, Michigan, will provide the sternest test yet for self-driving cars. Complex intersections, confusing lane markings, and busy construction crews will be used to gauge the aptitude of the latest automotive sensors and driving algorithms; mechanical pedestrians will even leap into the road from between parked cars so researchers can see if they trip up onboard safety systems.

The urban setting will be used to create situations that automated driving systems have struggled with, such as subtle driver-pedestrian interactions, unusual road surfaces, tunnels, and tree canopies, which can confuse sensors and obscure GPS signals.

‘If you go out on the public streets you come up against rare events that are very challenging for sensors,’ says Peter Sweatman, director of the University of Michigan’s Mobility Transformation Center, which is overseeing the project. ‘Having identified challenging scenarios, we need to re-create them in a highly repeatable way. We don’t want to be just driving around the public roads.'”


The Mudd Club was a cabaret institution in New York for a few years in the late-’70s and early ’80s, the edgier little cousin to Studio 54, which wasn’t exactly Disneyland. An excerpt from a 1979 People article which includes a holy shit! quote from Andy Warhol:

“Ever on the prowl for outrageous novelty, New York’s fly-by-night crowd of punks, posers and the ultra hip has discovered new turf on which to flaunt its manic chic. It is the Mudd Club, a dingy disco lost among the warehouses of lower Manhattan. By day the winos skid by without a second glance. But come midnight (the opening time), the decked-out decadents amass 13 deep. For sheer kinkiness, there has been nothing like it since the cabaret scene in 1920s Berlin.

In just six months the Mudd has made its uptown precursor, Studio 54, seem almost passé and has had to post a sentry on the sidewalk. The difference is that the Mudd doesn’t have a velvet rope but a steel chain. Such recognizable fun-lovers as David Bowie, Mariel Hemingway, Diane von Furstenberg and Dan Aykroyd are automatically waved inside. For the rest, the club picks its own like some sort of perverse trash compactor. The kind of simple solution employed by U.S. gas stations is out of the question: At the Mudd, every night is odd. Proprietor Steve Mass, 35, admits that ‘making a fashion statement’ is the criterion. That means a depraved version of the audience of Let’s Make a Deal. One man gained entrance simply by flashing the stump of his amputated arm.

The action inside varies from irreverent to raunch. Andy Warhol is happy to have found a place, he says, ‘where people will go to bed with anyone—man, woman or child.’ Some patrons couldn’t wait for bedtime, and the management has tried to curtail sex in the bathrooms.”


Rust never sleeps, and Walt Disney, even with all his great success and grand imagination, wasn’t immune to the quiet terrors of life any more than the rest of us. Almost two decades before he built his first safe and secure family theme park in California, the Hollywood house he’d purchased for his parents was invaded by a silent killer. Two articles follow from the Brooklyn Daily Eagle.


From the November 27, 1938 edition:


From the May 16, 1954 edition:



For productivity to increase, labor costs must shrink. That’s fine provided new industries emerge to accommodate workers, but that really isn’t what we’ve seen so far in the Technological Revolution, the next great bend in the curve, as production and wages haven’t boomed. It’s been the trade of a taxi medallion for a large pink mustache. More convenient, if not cheaper (yet), for the consumer, but bad for the drivers.

Perhaps that’s because we’re at the outset of a slow-starting boom, the Second Machine Age, or perhaps what we’re going through refuses to follow the form of the Industrial Revolution. Maybe it’s the New Abnormal. The opening of the Economist feature, “Technology Isn’t Working“: 

“IF THERE IS a technological revolution in progress, rich economies could be forgiven for wishing it would go away. Workers in America, Europe and Japan have been through a difficult few decades. In the 1970s the blistering growth after the second world war vanished in both Europe and America. In the early 1990s Japan joined the slump, entering a prolonged period of economic stagnation. Brief spells of faster growth in intervening years quickly petered out. The rich world is still trying to shake off the effects of the 2008 financial crisis. And now the digital economy, far from pushing up wages across the board in response to higher productivity, is keeping them flat for the mass of workers while extravagantly rewarding the most talented ones.

Between 1991 and 2012 the average annual increase in real wages in Britain was 1.5% and in America 1%, according to the Organisation for Economic Co-operation and Development, a club of mostly rich countries. That was less than the rate of economic growth over the period and far less than in earlier decades. Other countries fared even worse. Real wage growth in Germany from 1992 to 2012 was just 0.6%; Italy and Japan saw hardly any increase at all. And, critically, those averages conceal plenty of variation. Real pay for most workers remained flat or even fell, whereas for the highest earners it soared.

It seems difficult to square this unhappy experience with the extraordinary technological progress during that period, but the same thing has happened before. Most economic historians reckon there was very little improvement in living standards in Britain in the century after the first Industrial Revolution. And in the early 20th century, as Victorian inventions such as electric lighting came into their own, productivity growth was every bit as slow as it has been in recent decades.

In July 1987 Robert Solow, an economist who went on to win the Nobel prize for economics just a few months later, wrote a book review for the New York Times. The book in question, The Myth of the Post-Industrial Economy by Stephen Cohen and John Zysman, lamented the shift of the American workforce into the service sector and explored the reasons why American manufacturing seemed to be losing out to competition from abroad. One problem, the authors reckoned, was that America was failing to take full advantage of the magnificent new technologies of the computing age, such as increasingly sophisticated automation and much-improved robots. Mr Solow commented that the authors, ‘like everyone else, are somewhat embarrassed by the fact that what everyone feels to have been a technological revolution…has been accompanied everywhere…by a slowdown in productivity growth.'”

Anne Frank famously wrote, “Despite everything, I believe that people are really good at heart,” and if she could be hopeful, how can we surrender to despondency? A sneakily cheerful side of the growing cadre of scientists and philosophers focused on existential risks that could drive human extinction: if we make it through the next century, we may enjoy unbridled abundance. From Aaron Labaree at Salon:

“‘When we think about general intelligence,’ says Luke Muehlhauser, Executive Director at MIRI, ‘that’s a meta-technology that gives you everything else that you want — including really radical things that are even weird to talk about, like having our consciousness survive for thousands of years. Physics doesn’t outlaw those things, it’s just that we don’t have enough intelligence and haven’t put enough work into the problem … If we can get artificial intelligence right, I think it would be the best thing that ever happened in the universe, basically.’

A surprising number of conversations with experts in human extinction end like this: with great hope. You’d think that contemplating robot extermination would make you gloomy, but it’s just the opposite. As [Martin] Rees explains, ‘What science does is makes one aware of the marvelous potential of life ahead. And being aware of that, one is more concerned that it should not be foreclosed by screwing up during this century.’ Concern over humanity’s extermination at the hands of nanobots or computers, it turns out, often conceals optimism of the kind you just don’t find in liberal arts majors. It implies a belief in a common human destiny and the transformative power of technology.

‘The stakes are very large,’ [Nick] Bostrom told me. ‘There is this long-term future that could be so enormous. If our descendants colonized the universe, we could have these intergalactic civilizations with planetary-sized minds thinking and feeling things that are beyond our ken, living for billions of years. There’s an enormous amount of value that’s on the line.’

It’s all pretty hypothetical for now.”•


I don’t know if it will happen within ten years–though that’s not an outrageous time frame–but 3-D printing will automate much of the restaurant process, making jobs vanish, and will also be common in homes as prices fall. The opening of “Is 3-D Printing the Next Microwave?” by Jane Dornbusch at the Boston Globe:

“Picture the dinner hour in a decade: As you leave work, you pull up an app (assuming we still use apps) on your phone (or your watch!) that will direct a printer in your kitchen to whip up a batch of freshly made ravioli, some homemade chicken nuggets for the kids, and maybe a batch of cookies, each biscuit customized to meet the nutritional needs of different family members. 

It sounds like science fiction, but scientists and engineers are working on 3-D printing, and the food version of the 3-D printer is taking shape. Don’t expect it to spin out fully cooked meals anytime soon. For now, the most popular application in 3-D food printing seems to be in the decidedly low-tech area of cake decoration.

Well, not just cake decoration, but sugary creations of all kinds. The Sugar Lab is the confectionery arm of 3-D printing pioneer 3D Systems, and it expects to have the ChefJet, a 3-D food printer, available commercially later this year. Though tinkerers have been exploring the possibilities of 3-D food printing for a few years, and another food printer, Natural Machines’ Foodini, is slated to appear by year’s end, 3D Systems says the ChefJet is the world’s first 3-D food printer.

Like so many great inventions, the ChefJet came about as something of an accident, this one performed by husband-and-wife architecture grad students Liz and Kyle von Hasseln a couple of years ago. At the Southern California Institute of Architecture, the von Hasselns used a 3-D printer to create models. Intrigued by the process, Liz von Hasseln says, ‘We bought a used printer and played around with different materials to see how to push the technology. One thing we tried was sugar. We thought if we altered the machine a bit and made it all food safe and edible, we could push into a new space.’ More tweaking ensued, and the ChefJet was born.”


Walter Cronkite presents the kitchen of 2001 in 1967:


I haven’t yet read Walter Isaacson’s new Silicon Valley history, The Innovators, but I would be indebted if it answered the question of how instrumental Gary Kildall’s software was to Microsoft’s rise. Was Bill Gates and Paul Allen’s immense success built on intellectual thievery? Has the story been mythologized beyond realistic proportion? An excerpt from Brendan Koerner’s New York Times review of the book:

“The digital revolution germinated not only at button-down Silicon Valley firms like Fairchild, but also in the hippie enclaves up the road in San Francisco. The intellectually curious denizens of these communities ‘shared a resistance to power elites and a desire to control their own access to information.’ Their freewheeling culture would give rise to the personal computer, the laptop and the concept of the Internet as a tool for the Everyman rather than scientists. Though Isaacson is clearly fond of these unconventional souls, his description of their world suffers from a certain anthropological detachment. Perhaps because he’s accustomed to writing biographies of men who operated inside the corridors of power — Benjamin Franklin, Henry Kissinger, Jobs — Isaacson seems a bit baffled by committed outsiders like Stewart Brand, an LSD-inspired futurist who predicted the democratization of computing. He also does himself no favors by frequently citing the work of John Markoff and Tom Wolfe, two writers who have produced far more intimate portraits of ’60s counterculture.

Yet this minor shortcoming is quickly forgiven when The Innovators segues into its rollicking last act, in which hardware becomes commoditized and software goes on the ascent. The star here is Bill Gates, whom Isaacson depicts as something close to a punk — a spoiled brat and compulsive gambler who ‘was rebellious just for the hell of it.’ Like Paul Baran before him, Gates encountered an appalling lack of vision in the corporate realm — in his case at IBM, which failed to realize that its flagship personal computer would be cloned into oblivion if the company permitted Microsoft to license the machine’s MS-DOS operating system at will. Gates pounced on this mistake with a feral zeal that belies his current image as a sweater-clad humanitarian.”


Selfies, the derided yet immensely popular modern portraiture, draw ire because of narcissism and exhibitionism, of course, but also because anyone can take them, and do so ad nauseam. It’s too easy and available, with no expertise or gatekeeper necessary. The act is megalomania, sure, but it’s also democracy, that scary, wonderful rabble. From, ultimately, a defense of the self-directed shot by Douglas Coupland in the Financial Times:

“Selfies are the second cousin of the air guitar.

Selfies are the proud parents of the dick pic.

Selfies are, in some complex way, responsible for the word ‘frenemy.’

I sometimes wonder what selfies would look like in North Korea.

Selfies are theoretically about control – or, if you’re theoretically minded, they’re about the illusion of self-control. With a selfie some people believe you’re buying into a collective unspoken notion that everybody needs to look fresh and flirty and young for ever. You’re turning yourself into a product. You’re abdicating power of your sexuality. Or maybe you’re overthinking it – maybe you’re just in love with yourself.

I believe that it’s the unanticipated side effects of technology that directly or indirectly define the textures and flavours of our eras. Look at what Google has already done to the 21st century. So when smartphones entered the world in 2002, I think that if you gathered a group of smart media-savvy people in a room with coffee and good sandwiches, before the end of the day, the selfie could easily have been forecast as an inevitable smartphone side effect. There’s actually nothing about selfies that feels like a surprise in any way. The only thing that is surprising is the number of years it took us to isolate and name the phenomenon. I do note, however, that once the selfie phenomenon was named and shamed, selfies exploded even further, possibly occupying all of those optical-fibre lanes of the internet that were once occupied by Nigerian princes and ads for penis enlargement procedures.”



From the August 18, 1889 Brooklyn Daily Eagle:

“‘Gloves which are sold as kid are often made of human skin,’ said Dr. Mark L. Nardyz, the Greek physician, of 716 Pine Street, yesterday. ‘The skin on the breast,’ continued the physician, ‘is soft and pliable and may be used for the making of gloves. When people buy gloves they never stop to question about the material of which they are made. The shopkeeper himself may be in ignorance, and the purchaser has no means of ascertaining whether the material is human skin or not. The fact is the tanning of human skin is extensively carried on in France and Switzerland. The product is manufactured into gloves, and these are imported into this country. Thus you see a person may be wearing part of a distant relative’s body and not know it.'”


The end is near, roughly speaking. A number of scientists and philosophers, most notably Martin Rees and Nick Bostrom, agonize over the existential risks to humanity that might obliterate us long before the sun dies. A school of thought has arisen over the End of Days. From Sophie McBain in the New Statesman (via 3 Quarks Daily):

“Predictions of the end of history are as old as history itself, but the 21st century poses new threats. The development of nuclear weapons marked the first time that we had the technology to end all human life. Since then, advances in synthetic biology and nanotechnology have increased the potential for human beings to do catastrophic harm by accident or through deliberate, criminal intent.

In July this year, long-forgotten vials of smallpox – a virus believed to be ‘dead’ – were discovered at a research centre near Washington, DC. Now imagine some similar incident in the future, but involving an artificially generated killer virus or nanoweapons. Some of these dangers are closer than we might care to imagine. When Syrian hackers sent a message from the Associated Press Twitter account that there had been an attack on the White House, the Standard & Poor’s 500 stock market briefly fell by $136bn. What unthinkable chaos would be unleashed if someone found a way to empty people’s bank accounts?

While previous doomsayers have relied on religion or superstition, the researchers at the Future of Humanity Institute want to apply scientific rigour to understanding apocalyptic outcomes. How likely are they? Can the risks be mitigated? And how should we weigh up the needs of future generations against our own?

The FHI was founded nine years ago by Nick Bostrom, a Swedish philosopher, when he was 32. Bostrom is one of the leading figures in this small but expanding field of study.”


Speaking of human laborers being squeezed: Open Source with Christopher Lydon has an episode called “The End of Work,” with two guests, futurist Ray Kurzweil and MIT economist Andrew McAfee. A few notes.

  • McAfee sees the Technological Revolution as doing for gray matter what the Industrial Revolution did for muscle fiber, but on the way to a world of wealth without toil–a Digital Athens–the bad news is the strong chance of greater income inequality and decreased opportunities for many. Kodak employed 150,000; Instagram a small fraction of that. With the new technologies, destruction (of jobs) outpaces creation. Consumers win, but Labor loses.
  • Kurzweil is more hopeful in the shorter term than McAfee. He says we have more jobs and more gratifying ones today than 100 years ago and they pay better. We accomplish more. Technology will improve us, make us smarter, to meet the demands of a world without drudgery. It won’t be us versus the machines, but the two working together. The majority of jobs always go away, most of the jobs today didn’t exist so long ago. New industries will be invented to provide work. He doesn’t acknowledge a painful period of adjustment in distribution before abundance can reach all.


Sure, we have phones that are way nicer now, but the Technological Revolution has largely been injurious to anyone in the Labor market, and things are going to get worse, at least in the near and mid term. A free-market society that is highly automated isn’t really very free. Drive for Uber until autonomous cars can take over the wheel, you’re told, or rent a spare room on Airbnb–make space for yourself on the margins through the Sharing Economy. But there’s less to share for most people. From an Economist report:

“Before the horseless carriage, drivers presided over horse-drawn vehicles. When cars became cheap enough, the horses and carriages had to go, which eliminated jobs such as breeding and tending horses and making carriages. But cars raised the productivity of the drivers, for whom the shift in technology was what economists call ‘labour-augmenting.’ They were able to serve more customers, faster and over greater distances. The economic gains from the car were broadly shared by workers, consumers and owners of capital. Yet the economy no longer seems to work that way. The big losers have been workers without highly specialised skills.

The squeeze on workers has come from several directions, as the car industry clearly shows. Its territory is increasingly encroached upon by machines, including computers, which are getting cheaper and more versatile all the time. If cars and lorries do not need drivers, then both personal transport and shipping are likely to become more efficient. Motor vehicles can spend more time in use, with less human error, but there will be no human operator to share in the gains.

At the same time labour markets are hollowing out, polarising into high- and low-skill occupations, with very little employment in the middle. The engineers who design and test new vehicles are benefiting from technological change, but they are highly skilled and it takes remarkably few of them to do the job. At Volvo much of the development work is done virtually, from the design of the cars to the layout of the production line. Other workers, like the large numbers of modestly skilled labourers that might once have worked on the factory floor, are being squeezed out of such work and are now having to compete for low-skill and low-wage jobs.

Labour has been on the losing end of technological change for several decades.”

"My cat urinated on it."


Free couch (Linden)

I have a full size couch I’m giving away because my cat urinated on it. I know there are cleaning solutions available to eliminate the odor at the pet stores but I’m moving and don’t want to take it. Free if interested. A truck will be needed to take it out. It’s solid and comfortable.

"My cat urinated on it."

Space-travel enthusiast and labor organizer David Lasser was one of the first Americans to champion a mission to the moon, and one of the most influential. His 1931 book, The Conquest of Space, suggested such a rocket voyage was possible, not fanciful. In Lasser’s 1996 New York Times obituary, Arthur C. Clarke said of the then-65-year-old volume: “[It was] the first book in the English language to explain that space travel wasn’t just fiction…[it was] one of the turning points in my life — and I suspect not only of mine.”

While an article about the book’s publication in the October 6, 1931 Brooklyn Daily Eagle took seriously Lasser’s vision of rocket-powered airline travel–from New York to Paris in one hour!–it gave less credence to his moonshot scenario.



From the Overcoming Bias post in which economist Robin Hanson comments on Debora MacKenzie’s New Scientist article “The End of Nations,” a piece which wonders about, among other things, whether states in the modern sense predated the Industrial Revolution:

“An interesting claim: the nation-state didn’t exist before, and was caused by, the industrial revolution. Oh there were empires before, but most people didn’t identify much with empires, or see empires as much influencing their lives. In contrast people identify with nation-states, which they see as greatly influencing their lives. More:

Before the late 18th century there were no real nation states. … If you travelled across Europe, no one asked for your passport at borders; neither passports nor borders as we know them existed. People had ethnic and cultural identities, but these didn’t really define the political entity they lived in. …

Agrarian societies required little actual governing. Nine people in 10 were peasants who had to farm or starve, so were largely self-organising. Government intervened to take its cut, enforce basic criminal law and keep the peace within its undisputed territories. Otherwise its main role was to fight to keep those territories, or acquire more. … Many eastern European immigrants arriving in the US in the 19th century could say what village they came from, but not what country: it didn’t matter to them. … Ancient empires are coloured on modern maps as if they had firm borders, but they didn’t. Moreover, people and territories often came under different jurisdictions for different purposes.

Such loose control, says Bar-Yam, meant pre-modern political units were only capable of scaling up a few simple actions such as growing food, fighting battles, collecting tribute and keeping order. …

The industrial revolution … demanded a different kind of government. … ‘In 1800 almost nobody in France thought of themselves as French. By 1900 they all did.’ … Unlike farming, industry needs steel, coal and other resources which are not uniformly distributed, so many micro-states were no longer viable. Meanwhile, empires became unwieldy as they industrialised and needed more actual governing. So in 19th-century Europe, micro-states fused and empires split.

These new nation states were justified not merely as economically efficient, but as the fulfilment of their inhabitants’ national destiny. A succession of historians has nonetheless concluded that it was the states that defined their respective nations, and not the other way around.”


It’s a heartbreaker watching what’s happening to the New York Times these days, and the latest layoffs are just the most recent horrible headline. The Magazine is currently a green shoot, with its bright new editor, Jake Silverstein, and a boost to staffing, but that section is the outlier. The business can’t continue to suffer without being joined by the journalism. You just hope the company is ultimately sold to someone great.

At the Washington Post, not much has changed dramatically since Jeff Bezos bought the Graham family jewel, despite some executive shuffles and new hires. Does Bezos have a long-term plan? Does he have any plan? Does it really matter in the interim, since he can afford to wait for everyone else to fall and position his publication as the inheritor? From Dylan Byers at Politico:

“Despite expectations, Bezos himself had never promised a reinvention. ‘There is no map, and charting a path ahead will not be easy,’ he wrote in his first memo to Post staff in August of last year. Still, his reputation preceded him: With Amazon, he had revolutionized not just the book-selling business but the very means and standards of online shopping. He was planning ambitious new initiatives like drone delivery. Surely, this man had the silver bullet to save the Washington Post, and perhaps the newspaper industry.

Bezos, who declined to be interviewed for this story, is holding his cards close to his chest. He has no influence on the editorial side, according to [Executive Editor Martin] Baron, but is focused on ‘broader strategic efforts.’

If Bezos has any innovative digital initiatives in the works, they’re being formed not in Washington but in New York. In March, the Post launched a Manhattan-based design and development office called WPNYC, which is focused on improving the digital experience and developing new advertising products.

‘Jeff’s preoccupation isn’t editorial, it’s delivery,’ one Post staffer said of WPNYC. ‘He wants to change the way people receive, read and experience the news. The only problem is we still don’t know what that looks like.'”


Aiming to make surgery even less invasive, researchers are devising robots that can slide into small openings and perform currently messy operations. From Sarah Laskow at the Atlantic:

“In the past few years, surgeons have been pushing to make these less invasive surgeries almost entirely invisible. Instead of cutting a tiny window in the outside of the body, they thought, why not cut one inside? Surgeons would first enter a person’s body through a ‘natural orifice’ and make one small incision, through which to access internal organs. The end result of this idea was that, in 2009, a surgeon removed a woman’s kidney through her vagina.

Few surgeons were convinced this was actually an improvement though. Instead, they have focused on minimizing the number of tiny incisions needed to perform surgery. Single-site surgery requires just one ‘port’ into a body.

A team of surgeons at Columbia, for instance, is working on a small robotic arm—minuscule, when compared to the da Vinci system—that can sneak into one 15 millimeter incision. And NASA is working on a robot that can enter the abdominal cavity through a person’s belly button, Matrix-like, to perform simple surgeries. It’s meant to be used in emergencies, but we know how this story goes: Soon enough, it’ll be routine for a robot to slide into a person’s body and pull her appendix back out.”

