“Once You Create These Databases It Is Very Easy To Fall Into Function Creep”

In a discussion of why facial-recognition software failed to identify the Boston Marathon bombers but will likely be able to verify identity in such cases in the near future, Andrew Leonard of Salon asks Carnegie Mellon computer scientist Alessandro Acquisti about the potential downsides of improving this technology. The first part of his answer doesn’t bother me much, since witnesses, reporters, and juries are deeply flawed anyway, but the second part does. An excerpt:

Question:

Looking forward, are there reasons why improved facial recognition should worry us?

Alessandro Acquisti:

I am concerned by the possibility for error. We may start to rely on these technologies and start making decisions based on them, but the accuracy they can give us will always be merely statistical: a probability that these two images are images of the same person. Maybe that is considered enough by someone on the Internet who will go after a person who turns out to be innocent. There’s also the problem of secondary usage of data. Once you create these databases it is very easy to fall into function creep — this data should be used only in very limited circumstances, but people will hold on to it because it may be useful later on for some secondary purpose.
