“Software Is Not Free Of Human Influence”

I’ve written many times that I’d like algorithms to rid us of gerrymandering: if we want a Congress that actually has to answer for its seven-percent approval rating, we need to take the drawing of districts out of partisan hands.

But formulas can also have embedded biases if we’re not careful (or honest). Claire Cain Miller, one of the brightest thinkers at the New York Times’ Upshot section, makes this clear in her latest post, “When Algorithms Discriminate.” Regularly auditing these processes, by simulating otherwise-identical users and measuring whether outcomes diverge, is of utmost importance; a rough sketch of one such test follows the passage below. An excerpt:

“There is a widespread belief that software and algorithms that rely on data are objective. But software is not free of human influence. Algorithms are written and maintained by people, and machine learning algorithms adjust what they do based on people’s behavior. As a result, say researchers in computer science, ethics and law, algorithms can reinforce human prejudices.

Google’s online advertising system, for instance, showed an ad for high-income jobs to men much more often than it showed the ad to women, a new study by Carnegie Mellon University researchers found.

Research from Harvard University found that ads for arrest records were significantly more likely to show up on searches for distinctively black names or a historically black fraternity. The Federal Trade Commission said advertisers are able to target people who live in low-income neighborhoods with high-interest loans.”•
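Testing for this kind of disparity doesn’t require access to an algorithm’s internals. Here is a minimal sketch of the auditing idea behind studies like Carnegie Mellon’s, assuming a hypothetical `serve_ad` function standing in for whatever system is being probed: generate paired profiles that differ only in gender, tally how often each group is shown the high-income-job ad, and check the gap with a two-proportion z-test. The `serve_ad` stub and all the rates in it are invented for illustration, not drawn from the study.

```python
# Sketch of an external bias audit: paired synthetic profiles, identical
# except for the audited attribute, queried against the system under test.
import random
from statistics import NormalDist

random.seed(42)  # reproducible illustration

def serve_ad(profile):
    """Hypothetical stand-in for the system being audited. We *simulate*
    a biased server so the test has something to detect: men see the
    high-income-job ad 18% of the time, women 9% (invented numbers)."""
    rate = 0.18 if profile["gender"] == "male" else 0.09
    return random.random() < rate

def audit(n_pairs=5000):
    shown = {"male": 0, "female": 0}
    for _ in range(n_pairs):
        for gender in ("male", "female"):
            # Profiles match on everything except the audited attribute.
            if serve_ad({"gender": gender, "age": 35, "interests": ["careers"]}):
                shown[gender] += 1
    p_m = shown["male"] / n_pairs
    p_f = shown["female"] / n_pairs
    # Two-proportion z-test (normal approximation, pooled standard error).
    p_pool = (shown["male"] + shown["female"]) / (2 * n_pairs)
    se = (2 * p_pool * (1 - p_pool) / n_pairs) ** 0.5
    z = (p_m - p_f) / se
    p_value = 2 * (1 - NormalDist().cdf(abs(z)))
    print(f"shown to men: {p_m:.1%}, to women: {p_f:.1%}, "
          f"z = {z:.2f}, p = {p_value:.4g}")

if __name__ == "__main__":
    audit()
```

Because the paired profiles differ only in the audited attribute, a statistically significant gap can be attributed to that attribute rather than to confounding differences between the groups, which is what makes this kind of black-box audit persuasive.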
