Philosophically, there’s an argument to be made that we don’t want AI to resemble us too closely or be too seamless. Maybe we should always know it’s different and present. Given time, though, those distinctions will blur no matter what we consciously decide, because that’s the nature of people and their machines.
There are vital practical reasons, however, for AI to be able to recognize our body language, and one MIT experiment is allowing its algorithm to learn about its carbon neighbors by binge-watching TV programs. Isn’t that how a lot of human newcomers to a language absorb the details of a culture, by viewing soaps and sitcoms?
From Tim Moynihan at Wired:
THE NEXT TIME you catch your robot watching sitcoms, don’t assume it’s slacking off. It may be hard at work.
TV shows and video clips can help artificially intelligent systems learn about and anticipate human interactions, according to MIT’s Computer Science and Artificial Intelligence Laboratory. Researchers created an algorithm that analyzes video, then uses what it learns to predict how humans will behave.
Six hundred hours of clips from shows like The Office and The Big Bang Theory let the AI learn to identify high-fives, handshakes, hugs, and kisses. Then it learned what the moments leading up to those interactions looked like.
After the AI devoured all that video to train itself, the researchers fed the algorithm a single frame from a video it had not seen and tasked it with predicting what would happen next. The algorithm got it right about 43 percent of the time.•
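The setup the excerpt describes — learn from labeled lead-up moments, then guess the coming interaction from a single new frame — can be sketched in miniature. Everything below (the two made-up "features," the toy nearest-centroid classifier, the training numbers) is an illustrative assumption, not CSAIL's actual model, which was far more sophisticated:

```python
# Toy sketch of action anticipation, assuming each frame is reduced to a
# small feature vector. The features, data, and nearest-centroid classifier
# are illustrative stand-ins, not MIT's method.
import math

def centroid(vectors):
    """Average the feature vectors seen before one kind of interaction."""
    n = len(vectors)
    return [sum(v[i] for v in vectors) / n for i in range(len(vectors[0]))]

def train(examples):
    """examples maps each label to feature vectors from its lead-up frames."""
    return {label: centroid(vecs) for label, vecs in examples.items()}

def predict(model, frame):
    """Predict the label whose lead-up centroid is nearest to this frame."""
    return min(model, key=lambda label: math.dist(model[label], frame))

# Fake 2-D "features" (say, arm height and body distance) per interaction.
training = {
    "high-five": [[0.9, 0.5], [0.8, 0.6]],
    "handshake": [[0.4, 0.5], [0.5, 0.4]],
    "hug":       [[0.3, 0.1], [0.2, 0.2]],
    "kiss":      [[0.6, 0.05], [0.7, 0.1]],
}
model = train(training)
print(predict(model, [0.85, 0.55]))  # a frame resembling the high-five lead-ups
```

The 43 percent figure sounds low until you remember the baseline: with several possible interactions and all the ambiguity of a single frame, chance performance would be far worse.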