Politics

Google does many great things, but its corporate leaders want you to trust them with your private information–because they are the good guys–and you should never trust any corporation with such material. The thing is, it’s increasingly difficult to opt out of the modern arrangement, algorithms snaking their way into all corners of our lives. The excellent documentarian Eugene Jarecki has penned a Time essay about Google and Wikileaks and what the two say about the future. An excerpt follows.

_________________________

I interviewed notorious Wikileaks founder Julian Assange by hologram, beamed in from his place of asylum in the Ecuadorian Embassy in London. News coverage the next day focused in one way or another on the spectacular and mischievous angle that Assange had, in effect, managed to escape his quarantine and laugh in the face of those who wish to extradite him by appearing full-bodied in Nantucket before a packed house of exhilarated conference attendees.

Beyond the spectacle, though, what got less attention was what the interview was actually about, namely the future of our civilization in an increasingly digital world. What does it mean for us as people to see the traditional town square go digital, with online banking displacing bricks and mortar, just as email did snail mail, Wikipedia did the local library, and eBay the mom and pop shop? The subject of our ever-digitizing lives is one that has been gaining currency over the past year, fueled by news stories about Google Glasses, self-driving cars, sky-rocketing rates of online addiction and, most recently, the scandal of NSA abuse. But the need to better understand the implications of our digital transformation was further underscored in the days preceding the event with the publication of two books: one by Assange and the other by Google Executive Chairman, Eric Schmidt.

Assange’s book, When Google Met Wikileaks, is the transcript (with commentary by Assange) of a secret meeting between the two that took place on June 23, 2011, when Schmidt visited Assange in England. In his commentary, Assange explores the troubling implications of Google’s vast reach, including its relationships with international authorities, particularly in the U.S., of which the public is largely unaware. Schmidt’s book, How Google Works, is a broader, sunnier look at how technology has presumably shifted the balance of power from companies to people. It tells the story of how Google rose from a nerdy young tech startup to become a nerdy behemoth astride the globe. Read together, the two books offer an unsettling portrait both of our unpreparedness for what lies ahead and of the utopian spin with which Google (and others in the digital world) package tomorrow. While Assange’s book accuses Google of operating as a kind of “‘Don’t Be Evil’ empire,” Schmidt’s book fulfills Assange’s worst fears, presenting pseudo-irreverent business maxims in an “aw shucks” tone that seems willfully ignorant of the inevitable implications of any company coming to so sweepingly dominate our lives in unprecedented and often legally uncharted ways.•

If you’re wondering what Aleksandr Solzhenitsyn would have thought of Putin’s engagement in Ukraine, he pretty much answered the question during a 1994 interview, and he was fully in favor of the state reclaiming its scattered compatriots. From Paul Klebnikov’s discussion with the former dissident, which was reprinted in Forbes at the time of Solzhenitsyn’s death in 2008:

Forbes:

Tension is mounting between Russia and the now independent Ukraine, with the West strongly backing Ukrainian territorial integrity. Henry Kissinger argues that Russia will always threaten the interests of the West, no matter what kind of government it has.

Aleksandr Solzhenitsyn:

Henry Kissinger, Zbigniew Brzezinski, [historian] Richard Pipes and many other American politicians and publicists are frozen in a mode of thought they developed a long time ago. With unchanging blindness and stubbornness they keep repeating and repeating this theory about the supposed age-old aggressiveness of Russia, without taking into consideration today’s reality.

Forbes:

Well, what about Ukraine? Hasn’t Russia made threats toward several of the former U.S.S.R. member states?

Aleksandr Solzhenitsyn:

Imagine that one not very fine day two or three of your states in the Southwest, in the space of 24 hours, declare themselves independent of the U.S. They declare themselves a fully sovereign nation, decreeing that Spanish will be the only language. All English-speaking residents, even if their ancestors have lived there for 200 years, have to take a test in the Spanish language within one or two years and swear allegiance to the new nation. Otherwise they will not receive citizenship and will be deprived of civic, property and employment rights.

Forbes:

What would be the reaction of the United States? I have no doubt that it would be immediate military intervention.

Aleksandr Solzhenitsyn:

But today Russia faces precisely this scenario. In 24 hours she lost eight to 10 purely Russian provinces, 25 million ethnic Russians who have ended up in this very way–as ‘undesirable aliens.’ In places where their fathers, grandfathers, great-grandfathers have lived since way back–even from the 17th century–they face persecution in their jobs and the suppression of their culture, education and language.

Meanwhile, in Central Asia, those wishing to leave are not permitted to take even their personal property. The authorities tell them, ‘There is no such concept as ‘personal property’!’

And in this situation ‘imperialist Russia’ has not made a single forceful move to rectify this monstrous mess. Without a murmur she has given away 25 million of her compatriots–the largest diaspora in the world!

Forbes:

You see Russia as the victim of aggression, not as the aggressor.

Aleksandr Solzhenitsyn:

Who can find in world history another such example of peaceful conduct? And if Russia keeps the peace in the single most vital question that concerns her, why should one expect her to be aggressive in secondary issues?

Forbes:

With Russia in chaos, it does sound a bit far-fetched to see her as an aggressor.

Aleksandr Solzhenitsyn:

Russia today is terribly sick. Her people are sick to the point of total exhaustion. But even so, have a conscience and don’t demand that–just to please America–Russia throw away the last vestiges of her concern for her security and her unprecedented collapse. After all, this concern in no way threatens the United States.”

At the time of his death in 1945, Italian dictator Benito Mussolini’s approval ratings were not at their highest.

Il Duce was executed by a firing squad and hung upside down from the roof of an Esso gas station for bringing ruin to the nation during World War II, his corpse later lowered into a pile with 16 other dead Fascists, where it could be further brutalized by an outraged citizenry. In an article in the May 30, 1945 Brooklyn Daily Eagle, Mussolini’s astounding end was described in gory detail.

With computers so small they all but disappear and the infrastructure silently becoming more and more automated, what else will vanish from our lives and from ourselves? I’m someone who loves the new normal of decentralized, free-flowing media, who thinks the gains are far greater than the losses, but it’s a question worth asking. Via Longreads, an excerpt from The Glass Cage, a new book by Nicholas Carr, that designated mourner of the Information Age:

“There’s a big difference between a set of tools and an infrastructure. The Industrial Revolution gained its full force only after its operational assumptions were built into expansive systems and networks. The construction of the railroads in the middle of the nineteenth century enlarged the markets companies could serve, providing the impetus for mechanized mass production. The creation of the electric grid a few decades later opened the way for factory assembly lines and made all sorts of home appliances feasible and affordable. These new networks of transport and power, together with the telegraph, telephone, and broadcasting systems that arose alongside them, gave society a different character. They altered the way people thought about work, entertainment, travel, education, even the organization of communities and families. They transformed the pace and texture of life in ways that went well beyond what steam-powered factory machines had done.

The historian Thomas Hughes, in reviewing the arrival of the electric grid in his book Networks of Power, described how first the engineering culture, then the business culture, and finally the general culture shaped themselves to the new system. ‘Men and institutions developed characteristics that suited them to the characteristics of the technology,’ he wrote. ‘And the systematic interaction of men, ideas, and institutions, both technical and nontechnical, led to the development of a supersystem—a sociotechnical one—with mass movement and direction.’ It was at this point that what Hughes termed ‘technological momentum’ took hold, both for the power industry and for the modes of production and living it supported. ‘The universal system gathered a conservative momentum. Its growth generally was steady, and change became a diversification of function.’ Progress had found its groove.

We’ve reached a similar juncture in the history of automation. Society is adapting to the universal computing infrastructure—more quickly than it adapted to the electric grid—and a new status quo is taking shape. …

The science-fiction writer Arthur C. Clarke once asked, ‘Can the synthesis of man and machine ever be stable, or will the purely organic component become such a hindrance that it has to be discarded?’ In the business world at least, no stability in the division of work between human and computer seems in the offing. The prevailing methods of computerized communication and coordination pretty much ensure that the role of people will go on shrinking. We’ve designed a system that discards us. If unemployment worsens in the years ahead, it may be more a result of our new, subterranean infrastructure of automation than of any particular installation of robots in factories or software applications in offices. The robots and applications are the visible flora of automation’s deep, extensive, and invasive root system.”

In the Financial Times, Edward Luce wonders about the deepening of divisions in America between African-Americans and whites during the two terms of our first black President. The Great Recession, I think, is largely to blame. Those most vulnerable got most hosed by that debacle. It was actually a great investment opportunity for others who had the available funds to buy cheap. Without Obama’s maneuverings to save large banks and industries, imperfect though they were, the cratering would have been far deeper. Meanwhile, the Affordable Care Act has been a great boon to lower-income Americans of all races. And in the longer term, I would think having our appellate courts stocked with moderates and progressives will be a help to those who have less. From Luce:

“Mr Obama shot to prominence in 2004 when he said there was no black or white America, just the United States of America. Yet as the continuing backlash to the police shooting of an unarmed young black man in Ferguson has reminded us, Mr Obama will leave the US at least as segregated as he found it. How could that be? The fair answer is that he is not to blame. The poor suffered the brunt of the Great Recession and blacks are far likelier to be poor. By any yardstick – the share of those with subprime mortgages, for example, or those working in casualised jobs – African-Americans were more directly in the line of fire.

Without Mr Obama’s efforts, African-American suffering would have been even greater. He has fought Congress to preserve food stamps and long-term unemployment insurance – both of which help blacks disproportionately. The number of Americans without health insurance has fallen by 8m since the Affordable Care Act came into effect. Likewise, no president has done as much as Mr Obama – to depressingly little effect – to try to correct the racial bias in US federal sentencing. Bill Clinton was once termed ‘America’s first black president.’ But it was under Mr Clinton that incarceration rates rose to their towering levels.”

 

Many of the new corporations of the Information Age have been ostensibly good for consumers, with costs neatly hidden. For instance: Google and Facebook are completely free products, until you consider that you are the product. Amazon’s deep discounts have put all manner of cheap goods in consumers’ hands, great tools like books and tablets and smartphones, but competitors and producers have felt an increasing pinch. Eventually the earth is scorched and prices are largely in the hands of one company and the pipeline seriously shortened. Do the benefits outweigh the costs or vice versa?

In a New Republic article, Franklin Foer makes a convincing case that Amazon is already a clear monopoly, which has brought a virtual Walmartization to America. Those cheap items–Sam Walton’s or Jeff Bezos’–come at a dear price, he argues, favoring the purchaser in the short run but obliterating competitors and suppliers all the while. (And that doesn’t even begin to mention the treatment of workers who are made small so that prices can be likewise tiny.) An excerpt in which Foer looks at the disconnect between Industrial Age laws which govern monopolies and the megacorporations of the Information Age:

“Shopping on Amazon has so ingrained itself in modern American life that it has become something close to our unthinking habit, and the company has achieved a level of dominance that merits the application of a very old label: monopoly. 

That term doesn’t get tossed around much these days, but it should. Amazon is the shining representative of a new golden age of monopoly that also includes Google and Walmart. Unlike U.S. Steel, the new behemoths don’t use their barely challenged power to hike up prices. They are, in fact, self-styled servants of the consumer and have ushered in an era of low prices for everything from flat-screen TVs to paper napkins to smart phones. 

In other words, we’re all enjoying the benefits of these corporations far too much to think hard about distant dangers. Besides, the ideology of Silicon Valley suggests that we have nothing much to fear: If these firms no longer engineer breathtaking technologies, they will be creatively destroyed. That’s why Peter Thiel, the creator of PayPal, has argued that the term ‘monopoly’ should be stripped of its negative connotation. A monopoly, he argues, is really nothing more than a synonym for a highly successful company. Insulation from the brutish spirit of competition even makes them superior organizations—more beneficent employers, better able to both daydream and think clearly. In Thiel’s phrasing: ‘Creative monopolies aren’t just good for the rest of society; they’re powerful engines for making it better.’

Thiel makes an important point: The Internet-age monopolies are a different species; they flummox our conventional ways of thinking about corporate concentration and have proved especially elusive to those who ponder questions of antitrust, the discipline of law that aims to curb threats to the competitive marketplace. Part of the issue is the laws themselves, which were conceived to manage an industrial economy—and have, over time, evolved to focus on a specific set of narrow questions that have little to do with the core problem at hand.”

I’m likely in the minority as someone who voted twice for President Obama and is very pleased with the results. He’s been bolder in certain ways than I anticipated, and I think he’s avoided a lot of twentieth-century pitfalls and set us up well for the twenty-first-century landscape. Most of his most ardent critics seem to me frivolous or hypocritical or dangerous. We’re a brighter, better-positioned nation for his leadership. Paul Krugman feels similarly about 44, as he demonstrates in his Rolling Stone essay, “In Defense of Obama.” The opening:

“When it comes to Barack Obama, I’ve always been out of sync. Back in 2008, when many liberals were wildly enthusiastic about his candidacy and his press was strongly favorable, I was skeptical. I worried that he was naive, that his talk about transcending the political divide was a dangerous illusion given the unyielding extremism of the modern American right. Furthermore, it seemed clear to me that, far from being the transformational figure his supporters imagined, he was rather conventional-minded: Even before taking office, he showed signs of paying far too much attention to what some of us would later take to calling Very Serious People, people who regarded cutting budget deficits and a willingness to slash Social Security as the very essence of political virtue.

And I wasn’t wrong. Obama was indeed naive: He faced scorched-earth Republican opposition from Day One, and it took him years to start dealing with that opposition realistically. Furthermore, he came perilously close to doing terrible things to the U.S. safety net in pursuit of a budget Grand Bargain; we were saved from significant cuts to Social Security and a rise in the Medicare age only by Republican greed, the GOP’s unwillingness to make even token concessions.

But now the shoe is on the other foot: Obama faces trash talk left, right and center – literally – and doesn’t deserve it. Despite bitter opposition, despite having come close to self-inflicted disaster, Obama has emerged as one of the most consequential and, yes, successful presidents in American history. His health reform is imperfect but still a huge step forward – and it’s working better than anyone expected. Financial reform fell far short of what should have happened, but it’s much more effective than you’d think. Economic management has been half-crippled by Republican obstruction, but has nonetheless been much better than in other advanced countries. And environmental policy is starting to look like it could be a major legacy.”

Matt Bai penned one of 2014’s best articles, a New York Times Magazine piece about the Gary Hart scandal and what it tells us about modern politics and media. It’s an excerpt from his new book, All the Truth Is Out: The Week That Politics Went Tabloid, which delves far deeper into the subject. He’s doing an AMA at Reddit, and it breaks my heart a little that Ray Liotta currently has 70 times more questions. I understand it, but I don’t like it. Three exchanges follow.

______________________

Question:

Do you think that a candidate’s moral fortitude should play a role in their political success?

Matt Bai:

Yes. I want moral leaders, don’t you? The question is how we measure moral fortitude. Do we measure it by a single unflattering moment, a single gaffe or stupid decision? Or do we measure it in the full context of a life and political career? If a candidate has lied to his wife, but hasn’t lied to his constituents, or ducked tough votes, or been accused of corruption, then isn’t that worth something? Moral people do immoral things. Truthful people tell lies. I think we all want to be judged with some context (when we want to be judged at all), and so should our politicians.

______________________

Question:

I grew up in NH and witnessed many presidential candidates up close over several election cycles. Gary Hart was easily the most impressive, decades ahead on a whole range of issues. He also gets little credit for the strength of character it took for him to buck the party crowd in Washington that was mired in the past. How do you think the world would be different if Hart had been elected President?

Matt Bai:

He would appreciate hearing that, and you would love my book. Hart himself told me that had he been elected, George W. Bush would never have become governor of Texas, and we wouldn’t have invaded Iraq. This haunts him. I can’t say how the world would have been different, but I know Hart had thought an awful lot about governing during a time of economic transformation and after the Cold War, and I think there’s a good chance he would have helped us navigate those changes sooner and better. That’s just my own gut feeling.

______________________

Question:

What institutional source of congressional gridlock do you think has been underreported on?

Matt Bai:

Always willing to look at new ideas, thanks. As for gridlock, I really feel like we don’t talk enough about the primary system and how antiquated it is in an age where fewer people want to join anything local, much less a political party. You have a system where fewer and fewer people — the hardcore ideologues — are making choices for the rest of us about who can be on the ballot, and I don’t think that’s sustainable. What happens in Congress is that the representatives are more worried about that small number of activists than they are about the policy or their broader constituency.

I disagree with that holy fool Slavoj Žižek on some issues, but I agree with him that philosophy is far from dead, our technological and economic situations requiring ethical speculation more than ever. From his Guardian AMA:

Question:

What is the future of philosophy – both within academia and in the so-called ‘collective consciousness’?

Slavoj Žižek:

I think philosophy will become more important than ever, even for so-called ‘ordinary people’. Why? The incredible social dynamics of today’s capitalism, as well as scientific and technological breakthroughs, changed our situation so much that old ethical and religious systems no longer function. Think about biogenetic interventions, which may even change your character, how your psyche works. This was not even a possibility considered in traditional ethical systems, which means that we all in a way have to think. We have to make decisions. We cannot rely on old religious and ethical formulas. Like: are you for or against biogenetic interventions? In order to decide, to take a stance, you have somehow implicitly to address questions like: do I have a free will? Am I really responsible for my acts? And so on. So I think that the 21st century will be the century of philosophy.”

In a review of Martin Wolf’s The Shifts and the Shocks in the New York Review of Books, Paul Krugman argues that the financial bubble may not have caused the post-2008 slump so much as briefly masked an economy mired in long-term stagnation. An excerpt:

“Emphasizing the need to reduce financial fragility makes sense if you believe that the legacy of past financial excess is the reason we’re in so much trouble now. But are we sure about that? Let me offer two reasons to be skeptical.

First, while the depression that overtook the Western world in 2008 clearly came after the collapse of a vast financial bubble, that doesn’t mean that the bubble caused the depression. Late in The Shifts and the Shocks Wolf mentions the reemergence of the ‘secular stagnation’ hypothesis, most famously in the speeches and writing of Lawrence Summers (Lord Adair Turner independently made similar points, as did I). But I’m not sure whether readers will grasp the full implications. If the secular stagnationists are right, advanced economies now suffer from persistently inadequate demand, so that depression is their normal state, except when spending is supported by bubbles. If that’s true, bubbles aren’t the root of the problem; they’re actually a good thing while they last, because they prop up demand. Unfortunately, they’re not sustainable—so what we need urgently are policies to support demand on a continuing basis, which is an issue very different from questions of financial regulation.

Wolf actually does address this issue briefly, suggesting that the answer might lie in deficit spending financed by the government’s printing press. But this radical suggestion is, as I said, overshadowed by his calls for more financial regulation. It’s the morality play aspect again: the idea that we need to don a hairshirt and repent our sins resonates with many people, while the idea that we may need to abandon conventional notions of fiscal and monetary virtue has few takers.”

Oh, it’s fun designing a city on paper, or even redesigning one. At io9, Annalee Newitz offers a thought experiment for making over New York: imagine it all of a sudden becomes a megacity with triple the population, then figure out how to make that sustainable. Probably good practice, since the number of New Yorkers will likely climb toward that stratosphere over the coming decades, if flooding doesn’t become a recurrent issue. An excerpt from the “Disappearing Streets” section:

“New York City is already one of the most densely-packed urban spaces in the world, with 10,724 people on average per square kilometer. To triple the living spaces here, we’ll need to build up — but we’ll also need to build between. The city could no longer afford to devote so much street space to the products of an already-shaky auto industry, and the city’s grid would change immeasurably. So would the laws that govern it.

For efficiency’s sake, Manhattan would have to retain a couple of the major avenues like Fifth, which cuts through the center of the island. But it would be reserved for trucks delivering food — or taking garbage out. Other streets would be for licensed taxis and services like Uber, while cars belonging to individuals might be routed to the edges of the island, or to other boroughs entirely. Getting around in Manhattan would mean taking public transit, or paying dearly to get an Uber.

At the same time, there would be a flowering of pedestrian walkways like Sixth and a Half Avenue, which tunnels through the skyscrapers of midtown in between Sixth and Seventh Aves. As more skyscrapers grew, walkways would also take to the skies in bridges between buildings. To keep the ground-level streets less congested, pedestrians would be invited to walk Broadway from the air, hustling from building to building via a growing network of architectural tissues that would nourish a new sidewalk culture fifteen stories off the ground.

Some of these elevated sidewalks would be classic New York, complete with tar-gummed concrete and jagged nubs of rusted rebar poking out at odd angles. But others would look like high-tech works of art.”

For productivity to increase, labor costs must shrink. That’s fine provided new industries emerge to accommodate workers, but that really isn’t what we’ve seen so far in the Technological Revolution, the next great bend in the curve, as production and wages haven’t boomed. It’s been the trade of a taxi medallion for a large pink mustache. More convenient, if not cheaper (yet), for the consumer, but bad for the drivers.

Perhaps that’s because we’re at the outset of a slow-starting boom, the Second Machine Age, or perhaps what we’re going through refuses to follow the form of the Industrial Revolution. Maybe it’s the New Abnormal. The opening of the Economist feature, “Technology Isn’t Working”:

“IF THERE IS a technological revolution in progress, rich economies could be forgiven for wishing it would go away. Workers in America, Europe and Japan have been through a difficult few decades. In the 1970s the blistering growth after the second world war vanished in both Europe and America. In the early 1990s Japan joined the slump, entering a prolonged period of economic stagnation. Brief spells of faster growth in intervening years quickly petered out. The rich world is still trying to shake off the effects of the 2008 financial crisis. And now the digital economy, far from pushing up wages across the board in response to higher productivity, is keeping them flat for the mass of workers while extravagantly rewarding the most talented ones.

Between 1991 and 2012 the average annual increase in real wages in Britain was 1.5% and in America 1%, according to the Organisation for Economic Co-operation and Development, a club of mostly rich countries. That was less than the rate of economic growth over the period and far less than in earlier decades. Other countries fared even worse. Real wage growth in Germany from 1992 to 2012 was just 0.6%; Italy and Japan saw hardly any increase at all. And, critically, those averages conceal plenty of variation. Real pay for most workers remained flat or even fell, whereas for the highest earners it soared.

It seems difficult to square this unhappy experience with the extraordinary technological progress during that period, but the same thing has happened before. Most economic historians reckon there was very little improvement in living standards in Britain in the century after the first Industrial Revolution. And in the early 20th century, as Victorian inventions such as electric lighting came into their own, productivity growth was every bit as slow as it has been in recent decades.

In July 1987 Robert Solow, an economist who went on to win the Nobel prize for economics just a few months later, wrote a book review for the New York Times. The book in question, The Myth of the Post-Industrial Economy by Stephen Cohen and John Zysman, lamented the shift of the American workforce into the service sector and explored the reasons why American manufacturing seemed to be losing out to competition from abroad. One problem, the authors reckoned, was that America was failing to take full advantage of the magnificent new technologies of the computing age, such as increasingly sophisticated automation and much-improved robots. Mr Solow commented that the authors, ‘like everyone else, are somewhat embarrassed by the fact that what everyone feels to have been a technological revolution…has been accompanied everywhere…by a slowdown in productivity growth.'”

Speaking of human laborers being squeezed: Open Source with Christopher Lydon has an episode called “The End of Work,” with two guests, futurist Ray Kurzweil and MIT economist Andrew McAfee. A few notes.

  • McAfee sees the Technological Revolution as doing for gray matter what the Industrial Revolution did for muscle fiber, but on the way to a world of wealth without toil–a Digital Athens–the bad news is the strong chance of greater income inequality and decreased opportunities for many. Kodak employed 150,000; Instagram a small fraction of that. With the new technologies, destruction (of jobs) outpaces creation. Consumers win, but Labor loses.
  • Kurzweil is more hopeful in the shorter term than McAfee. He says we have more jobs, and more gratifying ones, than we did 100 years ago, and they pay better. We accomplish more. Technology will improve us, make us smarter, to meet the demands of a world without drudgery. It won’t be us versus the machines, but the two working together. The majority of jobs eventually go away; most of today’s jobs didn’t exist so long ago. New industries will be invented to provide work. He doesn’t acknowledge a painful period of adjustment in distribution before abundance can reach all.

Sure, we have phones that are way nicer now, but the Technological Revolution has largely been injurious to anyone in the Labor market, and things are going to get worse, at least in the near and mid term. A free-market society that is highly automated isn’t really very free. Drive for Uber until autonomous cars can take over the wheel, you’re told, or rent a spare room on Airbnb–make space for yourself on the margins through the Sharing Economy. But there’s less to share for most people. From an Economist report:

“Before the horseless carriage, drivers presided over horse-drawn vehicles. When cars became cheap enough, the horses and carriages had to go, which eliminated jobs such as breeding and tending horses and making carriages. But cars raised the productivity of the drivers, for whom the shift in technology was what economists call ‘labour-augmenting.’ They were able to serve more customers, faster and over greater distances. The economic gains from the car were broadly shared by workers, consumers and owners of capital. Yet the economy no longer seems to work that way. The big losers have been workers without highly specialised skills.

The squeeze on workers has come from several directions, as the car industry clearly shows. Its territory is increasingly encroached upon by machines, including computers, which are getting cheaper and more versatile all the time. If cars and lorries do not need drivers, then both personal transport and shipping are likely to become more efficient. Motor vehicles can spend more time in use, with less human error, but there will be no human operator to share in the gains.

At the same time labour markets are hollowing out, polarising into high- and low-skill occupations, with very little employment in the middle. The engineers who design and test new vehicles are benefiting from technological change, but they are highly skilled and it takes remarkably few of them to do the job. At Volvo much of the development work is done virtually, from the design of the cars to the layout of the production line. Other workers, like the large numbers of modestly skilled labourers that might once have worked on the factory floor, are being squeezed out of such work and are now having to compete for low-skill and low-wage jobs.

Labour has been on the losing end of technological change for several decades.”

From the Overcoming Bias post in which economist Robin Hanson comments on Debora MacKenzie’s New Scientist article “The End of Nations,” a piece that wonders about, among other things, whether states in the modern sense predated the Industrial Revolution:

“An interesting claim: the nation-state didn’t exist before, and was caused by, the industrial revolution. Oh there were empires before, but most people didn’t identify much with empires, or see empires as much influencing their lives. In contrast people identify with nation-states, which they see as greatly influencing their lives. More:

Before the late 18th century there were no real nation states. … If you travelled across Europe, no one asked for your passport at borders; neither passports nor borders as we know them existed. People had ethnic and cultural identities, but these didn’t really define the political entity they lived in. …

Agrarian societies required little actual governing. Nine people in 10 were peasants who had to farm or starve, so were largely self-organising. Government intervened to take its cut, enforce basic criminal law and keep the peace within its undisputed territories. Otherwise its main role was to fight to keep those territories, or acquire more. … Many eastern European immigrants arriving in the US in the 19th century could say what village they came from, but not what country: it didn’t matter to them. … Ancient empires are coloured on modern maps as if they had firm borders, but they didn’t. Moreover, people and territories often came under different jurisdictions for different purposes.

Such loose control, says Bar-Yam, meant pre-modern political units were only capable of scaling up a few simple actions such as growing food, fighting battles, collecting tribute and keeping order. …

The industrial revolution … demanded a different kind of government. … ‘In 1800 almost nobody in France thought of themselves as French. By 1900 they all did.’ … Unlike farming, industry needs steel, coal and other resources which are not uniformly distributed, so many micro-states were no longer viable. Meanwhile, empires became unwieldy as they industrialised and needed more actual governing. So in 19th-century Europe, micro-states fused and empires split.

These new nation states were justified not merely as economically efficient, but as the fulfilment of their inhabitants’ national destiny. A succession of historians has nonetheless concluded that it was the states that defined their respective nations, and not the other way around.”

I previously posted the audio of the “Turn On, Tune In, Drop Out” speech Timothy Leary delivered at UCLA in 1967, and here’s the video of the spirited LSD debate he participated in with Dr. Jerry Lettvin at MIT a few months later. In his remarks, Leary lambastes scientists and technologists devoted to manufacturing entertaining diversions. (Thanks to Open Culture.)

Robotics will increase productivity, no doubt, but that doesn’t mean wages will rise in step. Automation on the scale that will soon exist is uncharted territory, and no one can predict the exact fallout. From Brad DeLong at Project Syndicate:

“The wages and salaries of low- and high-skill workers in the robot-computer economy of the future will not be determined by the (very high) productivity of the one lower-skill worker ensuring that all of the robots are in their places or the one high-skill worker reprogramming the software. Instead, compensation will reflect what workers outside the highly productive computer-robot economy are creating and earning.

The newly industrialized city of Manchester, which horrified Friedrich Engels when he worked there in the 1840s, had the highest level of labor productivity the world had ever seen. But the factory workers’ wages were set not by their extraordinary productivity, but by what they would earn if they returned to the potato fields of pre-famine Ireland.

So the question is not whether robots and computers will make human labor in the goods, high-tech services, and information-producing sectors infinitely more productive. They will. What really matters is whether the jobs outside of the robot-computer economy – jobs involving people’s mouths, smiles, and minds – remain valuable and in high demand.”

Libertarian billionaire Peter Thiel, who refuses to do interviews unless someone asks, just sounded off to the Wall Street Journal about the technophobia he feels is pervasive in America and Europe. More likely, people enjoy technology’s benefits but have concerns about the downsides (privacy issues, environmental concerns, unemployment, etc.), although there certainly is tension between the old Dream Factory (Hollywood) and the new one (Silicon Valley). An excerpt:

“Forget all the buzz over driverless cars; the days spent waiting in line for the latest iPhone; the drones delivering medicine. Tech investor Peter Thiel says that, fundamentally, our society hates tech.

‘We live in a financial and capitalistic age,’ he said. ‘We do not live in a scientific or technological age. We live in an age that’s dominated by hostility and unfriendliness towards all things technological.’ …

Silicon Valley, he said, has people who believe in technology and scientific innovation, while much of the rest of the U.S. doesn’t.

‘The easiest way to see this is you just look at all the movies Hollywood makes,’ he said. ‘They all show technology that doesn’t work; that kills people; that’s destroying the world, and you can choose between Avatar, or The Matrix, or Terminator films.’ (Mr. Thiel has previously lashed out at Hollywood, including criticizing how Silicon Valley was portrayed in the movie, The Social Network–which documents Facebook’s creation and Mr. Thiel’s part in it.) “

One question from Jeffrey Rosen’s New Republic interview with Supreme Court Justice Ruth Bader Ginsburg, who has quietly and gradually become a towering figure in American life:

Question:

What’s the worst ruling the current Court has produced?

Ruth Bader Ginsburg:

If there was one decision I would overrule, it would be Citizens United. I think the notion that we have all the democracy that money can buy strays so far from what our democracy is supposed to be. So that’s number one on my list. Number two would be the part of the health care decision that concerns the commerce clause. Since 1937, the Court has allowed Congress a very free hand in enacting social and economic legislation. I thought that the attempt of the Court to intrude on Congress’s domain in that area had stopped by the end of the 1930s. Of course health care involves commerce. Perhaps number three would be Shelby County, involving essentially the destruction of the Voting Rights Act. That act had a voluminous legislative history. The bill extending the Voting Rights Act was passed overwhelmingly by both houses, Republicans and Democrats, everyone was on board. The Court’s interference with that decision of the political branches seemed to me out of order. The Court should have respected the legislative judgment. Legislators know much more about elections than the Court does. And the same was true of Citizens United. I think members of the legislature, people who have to run for office, know the connection between money and influence on what laws get passed.”•

At the Financial Times, David Runciman has an article about Political Order and Political Decay, Francis Fukuyama’s follow-up to 2011’s The Origins of Political Order. The political scientist thinks America made sequential mistakes, establishing democracy ahead of a strong central government, and is paying for these and other sins. Few would argue the country isn’t currently in a political quagmire, but I think historical sequence may not be the most important factor. America by design is a disparate nation, an experiment in multitudes, and there will always be divisions. We ebb and flow, but the flows are pretty spectacular. And while the writer is correct to assert that too much of America’s power has fallen under the control of too few, you could say the same of the late 19th century, and that was remedied for a long spell. A passage about Fukuyama’s prescription for proper political order:

“Fukuyama’s analysis provides a neat checklist for assessing the political health of the world’s rising powers. India, for instance, thanks to its colonial history, has the rule of law (albeit bureaucratic and inefficient) and democratic accountability (albeit chaotic and cumbersome) but the authority of its central state is relatively weak (something Narendra Modi is trying to change). Two out of three isn’t bad, but it’s far from being a done deal. China, by contrast, thanks to its own history as an imperial power, has a strong central state (dating back thousands of years) but relatively weak legal and democratic accountability. Its score is more like one and a half out of three, though it has the advantage that the sequence is the right way round were it to choose to democratise. Fukuyama doesn’t say if it will or it won’t – the present signs are not encouraging – but the possibility remains open.

The really interesting case study, however, is the US. America’s success over the past 200 years bucks the trend of Fukuyama’s story because the sequence was wrong: the country was a democracy long before it had a central state with any real authority. It took a civil war to change that, plus decades of hard-fought reform. Among the heroes of Fukuyama’s book are the late-19th and early-20th-century American progressives who dragged the US into the modern age by giving it a workable bureaucracy, tax system and federal infrastructure. On this account, Teddy Roosevelt is as much the father of his nation as Washington or Lincoln.

But even this story doesn’t have a happy ending. Just as it can take a major shock to achieve political order, so in the absence of shocks a well-ordered political society can get stuck. That is what has happened to the US. In the long peace since the end of the second world war (and the shorter but deeper peace since the end of the cold war), American society has drifted back towards a condition of relative ungovernability. Its historic faults have come back to haunt it. American politics is what Fukuyama calls a system of ‘courts and parties': legal and democratic redress are valued more than administrative competence. Without some external trigger to reinvigorate state power (war with China?), partisanship and legalistic wrangling will continue to corrode it. Meanwhile, the US is also suffering the curse of all stable societies: capture by elites. Fukuyama’s ugly word for this is ‘repatrimonialisation.’ It means that small groups and networks – families, corporations, select universities – use their inside knowledge of how power works to work it to their own advantage. It might sound like social science jargon, but it’s all too real: if the next presidential election is Clinton v Bush again we’ll see it happening right before our eyes.”

 

I don’t think there’s any question that Uber is good for consumers and bad for workers, but even if America’s newest set of wheels goes bust like Napster, the bigger picture is that the war has been quietly lost regardless of what happens in that one loud battle. The music industry wasn’t brought down just by Sean Parker, but by the wave he represented. From Avi Asher-Schapiro’s Jacobin article about labor’s share getting smaller:

“Uber claims there’s no need for a union; it instead asks drivers to trust that the company acts in their best interest. Uber refused to show me complete data detailing average hourly compensation for drivers. It does claim, however, that UberX drivers are making more money now than before this summer’s price cuts.

‘The average fares per hour for a Los Angeles UberX driver-partner in the last four weeks were 21.4% higher than the December 2013 weekly average,’ Uber spokesperson Eva Behrend told me. ‘And drivers on average have seen fares per hour increase 28% from where they were in May of this year.’

I couldn’t find a single driver who is making more money with the lower rates.

What’s clear is that for Uber drivers to get by, they’re going to have to take on more rides per shift. Uber implicitly concedes as much: ‘With price cuts, trips per hour for partner-drivers have increased with higher demand,’ Behrend said.

So while drivers make less per fare, Uber suggests they recoup losses by just driving more miles. That may make sense for an Uber analyst crunching the numbers in Silicon Valley, but for drivers, more miles means hustling to cram as many runs into a shift as possible to make the small margins worthwhile.”

Ebola isn’t yet threatening to become a pandemic and probably won’t, but the rampant regional spread of terror and death has reached a scale it never had to reach. Mobilizing against a known disease should be relatively easy, but politics seldom is. From a Spiegel interview by Rafaela von Bredow and Veronika Hackenbroch with Peter Piot, one of the scientists who first discovered the virus in 1976:

Spiegel:

There is actually a well-established procedure for curtailing Ebola outbreaks: isolating those infected and closely monitoring those who had contact with them. How could a catastrophe such as the one we are now seeing even happen? 

Peter Piot:

I think it is what people call a perfect storm: when every individual circumstance is a bit worse than normal and they then combine to create a disaster. And with this epidemic, there were many factors that were disadvantageous from the very beginning. Some of the countries involved were just emerging from terrible civil wars, many of their doctors had fled and their healthcare systems had collapsed. In all of Liberia, for example, there were only 51 doctors in 2010, and many of them have since died of Ebola.

Spiegel:

The fact that the outbreak began in the densely populated border region between Guinea, Sierra Leone and Liberia …

Peter Piot:

… also contributed to the catastrophe. Because the people there are extremely mobile, it was much more difficult than usual to track down those who had had contact with the infected people. Because the dead in this region are traditionally buried in the towns and villages they were born in, there were highly contagious Ebola corpses traveling back and forth across the borders in pick-ups and taxis. The result was that the epidemic kept flaring up in different places.

Spiegel:

For the first time in its history, the virus also reached metropolises like Monrovia and Freetown. Is that the worst thing that can happen?

Peter Piot:

In large cities — particularly in chaotic slums — it is virtually impossible to find those who had contact with patients, no matter how great the effort. That is why I am so worried about Nigeria as well. The country is home to mega-cities like Lagos and Port Harcourt and if the Ebola virus lodges there and begins to spread, it would be an unimaginable catastrophe.

Spiegel:

Have we completely lost control of the epidemic?

Peter Piot:

I have always been an optimist and I think that we now have no other choice than to try everything, really everything. It’s good that the United States and some other countries are finally beginning to help. But Germany or even Belgium, for example, must do a lot more. And it should be clear to all of us: This isn’t just an epidemic anymore. This is a humanitarian catastrophe. We don’t just need care personnel, but also logistics experts, trucks, jeeps and foodstuffs. Such an epidemic can destabilize entire regions. I can only hope that we will be able to get it under control. I really never thought that it could get this bad.”

As robots proliferate, we’re going to require far more than three laws to govern their actions. The questions are seemingly endless, and the answers will likely have to be very elastic. The opening of an Economist report about the RoboLaw group’s recently released findings:

“WHEN the autonomous cars in Isaac Asimov’s 1953 short story ‘Sally’ encourage a robotic bus to dole out some rough justice to an unscrupulous businessman, the reader is to believe that the bus has contravened Asimov’s first law of robotics, which states that ‘a robot may not injure a human being or, through inaction, allow a human being to come to harm.’

Asimov’s ‘three laws’ are a bit of science-fiction firmament that have escaped into the wider consciousness, often taken to be a serious basis for robot governance. But robots of the classic sort, and bionic technologies that enhance or become part of humans, raise many thorny legal, ethical and regulatory questions. If an assistive exoskeleton is implicated in a death, who is at fault? If a brain-computer interface is used to communicate with someone in a vegetative state, are those messages legally binding? Can someone opt to replace their healthy limbs with robotic prostheses?

Questions such as these are difficult to anticipate. The concern for policymakers is creating a regulatory and legal environment that is broad enough to maintain legal and ethical norms but is not so proscriptive as to hamper innovation.”

Jeremy Waldron of the New York Review of Books has an article about legal scholar Cass Sunstein, who believes a nudge is often better than a “no,” though it’s not always easy to define the distinction.

Of the examples of nudge-ish paternalism provided in the excerpt below, the one that most bothers me is the TV that’s programmed to always turn on first to PBS. It feels like a violation of personal space and an oppression of cultural tastes. Plenty of gatekeepers have been wrong over the years, while the masses have been right. Cheap comics weren’t a plague and rock music wasn’t just a bunch of noise.

It’s true, though, that the absence of paternalism doesn’t mean we have unobstructed free will. Corporations don’t just nudge but shove us toward their products, often unhealthy ones, and some pushback is warranted. An excerpt:

“Cass Sunstein is a Harvard law professor and the author of dozens of books on the principles of public policy. He knew Barack Obama from Harvard Law School and in 2009, he was appointed administrator of the White House Office of Information and Regulatory Affairs. Sunstein’s thought about nudging is evidently the fruit of his determination to consider alternatives to the old command-and-control models of regulation. Now, with his government service behind him (for the time being), he has given us another book, called Why Nudge?, in which he provides an accessible defense of what he calls ‘libertarian paternalism’—a good-natured paternalism that is supposed to leave individual choosing intact.

‘Paternalism’ is usually a dirty word in political philosophy: the nanny state passing regulations that restrict us for our own good, banning smoking and skateboarding because they’re unsafe, or former New York City Mayor Michael Bloomberg trying to limit the size of sugary sodas sold in New York City—’the Big Gulp Ban.’ Now, a nudger wouldn’t try anything so crass. If you ordered a soda in nudge-world, you would get a medium cup, no questions asked; you’d have to go out of your way to insist on a large one. Not only that, but diet beverages would probably be the ones displayed most prominently in nudge-world and served without question unless the customer insisted on getting the classic version from under the counter.

You could order a supersized sugary beverage if you wanted it badly enough, but it wouldn’t be so convenient to carry it to your table because Thaler and Sunstein are in favor of abolishing trays. It is all too easy to load up a tray with food that will never be eaten and napkins that go unused. You could insist on a tray if you wanted to hold up the line, but a tray-free policy has been proved to lower food and beverage waste by up to 50 percent in certain environments. Nudge and Why Nudge? are replete with examples like this.

Nudging is paternalistic, but it is surely a very mild version of paternalism. It’s about means, not ends: we don’t try to nudge people toward a better view of the good life, with compulsory library cards, for example, or PBS always coming up when you turn on your TV. And it is mild too because you can always opt out of a nudge. Not that Sunstein is opposed to more stringent regulations. Sometimes a straightforward requirement—like the rule about seat belts—might be a better form of paternalism. These options are left open for the regulator.”

There are many areas where billionaire investor Peter Thiel and I disagree–he seems unconcerned about that–but we do concur that AI is more the solution than the problem. While I accept that smart machines could possibly be the ruination of humans, it seems very likely to me that we’ll become extinct without their continued development. (There are some other low-tech, out-of-the-box measures which might be useful in staving off our species’ collapse, but it’s highly improbable they’ll be implemented.) 

In his new Financial Times piece, “Robots Are Our Saviors, Not the Enemy,” Thiel examines the more mundane economic costs of silicon brain drain. He extols the value of a man-machine hybrid, saying that robots will complement rather than replace human labor in many instances, though former employees of Blockbuster, Borders and Fotomat might disagree. And if driverless cars arrive anywhere near on schedule, there will be a whole new class of unemployed workers who weren’t a useful complement. An excerpt:

“Unlike fellow humans of different nationalities, computers are not substitutes for American labour. Men and machines are good at different things. People form plans and make decisions in complicated situations. We are less good at making sense of enormous amounts of data. Computers are exactly the opposite: they excel at efficient data processing but struggle to make basic judgments that would be simple for any human.

I came to understand this from my experience as chief executive of PayPal. In mid-2000 we had survived the dotcom crash and we were growing fast but we faced one huge problem: we were losing upwards of $10m a month to credit card fraud. Since we were processing hundreds or even thousands of transactions each minute, we could not possibly review each one. No human quality control team could work that fast.

We tried to solve the problem by writing software that would automatically identify bogus transactions and cancel them in real time. But it quickly became clear that this approach would not work: after an hour or two, the thieves would catch on and change their tactics to fool our algorithms.

Human analysts, however, were not easily fooled by criminals’ adaptive strategies. So we rewrote the software to take a hybrid approach: the computer would flag the most suspicious transactions, and human operators would make the final judgment.

This kind of man-machine symbiosis enabled PayPal to stay in business, which in turn enabled hundreds of thousands of small businesses to accept the payments they needed to thrive on the internet.”
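Purely as an illustration of the hybrid approach Thiel describes (this is not PayPal’s actual system; the scoring rules, threshold, and transaction fields below are invented for the sketch), the division of labor looks roughly like this in code: software scores every transaction, and only the suspicious minority is routed to a human review queue.

```python
from dataclasses import dataclass
from typing import List, Tuple

@dataclass
class Transaction:
    txn_id: str
    amount: float
    country_mismatch: bool  # billing and shipping countries differ
    rapid_retries: int      # card attempts in the past hour

def risk_score(txn: Transaction) -> float:
    """Toy heuristic; a production system would use a trained model."""
    score = 0.0
    if txn.amount > 1000:
        score += 0.4
    if txn.country_mismatch:
        score += 0.3
    score += min(txn.rapid_retries * 0.1, 0.3)
    return score

REVIEW_THRESHOLD = 0.5  # invented cutoff: at or above this, a human decides

def route(txns: List[Transaction]) -> Tuple[List[Transaction], List[Transaction]]:
    """Split incoming transactions into auto-approved and human-review queues."""
    auto_approved, needs_review = [], []
    for t in txns:
        if risk_score(t) >= REVIEW_THRESHOLD:
            needs_review.append(t)
        else:
            auto_approved.append(t)
    return auto_approved, needs_review

if __name__ == "__main__":
    sample = [
        Transaction("t1", 25.00, False, 0),
        Transaction("t2", 1800.00, True, 4),
    ]
    ok, review = route(sample)
    print([t.txn_id for t in ok], [t.txn_id for t in review])  # ['t1'] ['t2']
```

The point of the split is volume: the machine clears the bulk of the traffic, while the ambiguous cases that fool simple rules land in front of a person.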
