“The Tasks That Have Proved Most Vexing To Automate Are Those Demanding Flexibility, Judgment, And Common Sense”

The MIT economist David Autor doesn’t believe it’s different this time: he doesn’t think automation will lead to widespread technological unemployment any more than it did during the Industrial Revolution or the AI scares of the 1960s and 1970s. Autor grants that robots may come for some of our jobs, but he argues there will still be enough old and new ones to busy human hands, because our machine brethren will probably never be our equal in common sense, adaptability, and creativity. Technology’s new tools may be fueling wealth inequality, he acknowledges, but the fear of AI soon eliminating labor is unfounded.

Well, perhaps. But if you’re a truck or bus or taxi or limo or delivery driver, a hotel clerk or bellhop, a lawyer or paralegal, a waiter or fast-casual food preparer, or one of the many other workers whose gigs will probably disappear, you may be in for some serious economic pain before abundance emerges on the other side of the new arrangement.

Autor is certainly right to argue that the main economic problem caused by mass automation would be “one of distribution, not of scarcity.” But that’s an issue requiring some political consensus to solve, and reaching consensus isn’t easy these days in our polarized society.

From Autor’s smart article in the Journal of Economic Perspectives, “Why Are There Still So Many Jobs?”:

Polanyi’s Paradox: Will It Be Overcome?

Automation, complemented in recent decades by the exponentially increasing power of information technology, has driven changes in productivity that have disrupted labor markets. This essay has emphasized that jobs are made up of many tasks and that while automation and computerization can substitute for some of them, understanding the interaction between technology and employment requires thinking about more than just substitution. It requires thinking about the range of tasks involved in jobs, and how human labor can often complement new technology. It also requires thinking about price and income elasticities for different kinds of output, and about labor supply responses.

The tasks that have proved most vexing to automate are those demanding flexibility, judgment, and common sense—skills that we understand only tacitly. I referred to this constraint above as Polanyi’s paradox. In the past decade, computerization and robotics have progressed into spheres of human activity that were considered off limits only a few years earlier—driving vehicles, parsing legal documents, even performing agricultural field labor. Is Polanyi’s paradox soon to be at least mostly overcome, in the sense that the vast majority of tasks will soon be automated?

My reading of the evidence suggests otherwise. Indeed, Polanyi’s paradox helps to explain what has not yet been accomplished, and further illuminates the paths by which more will ultimately be accomplished. Specifically, I see two distinct paths that engineering and computer science can seek to traverse to automate tasks for which we “do not know the rules”: environmental control and machine learning. The first path circumvents Polanyi’s paradox by regularizing the environment, so that comparatively inflexible machines can function semi-autonomously. The second approach inverts Polanyi’s paradox: rather than teach machines rules that we do not understand, engineers develop machines that attempt to infer tacit rules from context, abundant data, and applied statistics.•
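To make that “inversion” concrete, here’s a minimal sketch of my own (not from Autor’s paper) of a machine inferring a tacit rule from labeled examples rather than being handed the rule. The toy fruit data and the nearest-neighbor method are illustrative assumptions, chosen only because they fit in a few lines:

```python
# A minimal sketch of the "inversion" Autor describes: instead of writing
# down a rule we can't articulate, a simple statistical learner infers it
# from labeled examples. Toy data and 1-nearest-neighbor are illustrative
# assumptions, not anything from the paper.

import math

# Labeled examples: (weight in grams, surface smoothness 0-1) -> label.
# Note that no explicit apple-vs-orange rule is ever spelled out.
examples = [
    ((150, 0.90), "apple"),
    ((140, 0.95), "apple"),
    ((155, 0.85), "apple"),
    ((190, 0.30), "orange"),
    ((200, 0.25), "orange"),
    ((185, 0.35), "orange"),
]

def classify(features):
    """Predict by copying the label of the nearest labeled example (1-NN)."""
    _, label = min(examples, key=lambda ex: math.dist(ex[0], features))
    return label

print(classify((195, 0.20)))  # -> "orange"
print(classify((145, 0.88)))  # -> "apple"
```

The point of the sketch is that the classification rule lives only in the examples and the statistics; nobody ever wrote it down, which is exactly how learning systems try to sidestep Polanyi’s paradox.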
