“Computers Break Down. They Have Bugs. They Get Hacked”

I wish everyone writing about technology could turn out prose as sparkling and lucid as Nicholas Carr. In a New York Times opinion piece, he stresses that people are flawed, but so are computers, and our silicon counterparts thus far lack the dexterity we possess to react to the unforeseen. He suggests that humans and machines should permanently remain a team, allowing us to benefit from the best of both.

I think that’s the immediate future, but I still believe market forces will ultimately cede to robots anything they can do as well (or nearly as well) as humans. And I’m curious about how deep learning will affect machines’ ability to improvise.

From Carr:

While our flaws loom large in our thoughts, we view computers as infallible. Their scripted consistency presents an ideal of perfection far removed from our own clumsiness. What we forget is that our machines are built by our own hands. When we transfer work to a machine, we don’t eliminate human agency and its potential for error. We transfer that agency into the machine’s workings, where it lies concealed until something goes awry.
 
Computers break down. They have bugs. They get hacked. And when let loose in the world, they face situations that their programmers didn’t prepare them for. They work perfectly until they don’t.
 
Many disasters blamed on human error actually involve chains of events that are initiated or aggravated by technological failures. Consider the 2009 crash of Air France Flight 447 as it flew from Rio de Janeiro to Paris. The plane’s airspeed sensors iced over. Without the velocity data, the autopilot couldn’t perform its calculations. It shut down, abruptly shifting control to the pilots. Investigators later found that the aviators appeared to be taken by surprise in a stressful situation and made mistakes. The plane, with 228 passengers, plunged into the Atlantic.

The crash was a tragic example of what scholars call the automation paradox. Software designed to eliminate human error sometimes makes human error more likely. When a computer takes over a job, the workers are left with little to do. Their attention drifts. Their skills, lacking exercise, atrophy. Then, when the computer fails, the humans flounder.