“It Could Make People Feel More Positive Or Negative Through A Process Of ‘Emotional Contagion'”

Trusting Facebook or Google or any large data-mining tech corporation with our information, let alone our emotions, is a mistake. And in many cases, it’s not a matter of a poor choice on our part: there’s no choice at all. We’re inside that matrix now. The opening of Robert Booth’s Guardian report on Mark Zuckerberg’s behemoth executing large-scale psychological experiments on unsuspecting users:

“It already knows whether you are single or dating, the first school you went to and whether you like or loathe Justin Bieber. But now Facebook, the world’s biggest social networking site, is facing a storm of protest after it revealed it had discovered how to make users feel happier or sadder with a few computer key strokes.

It has published details of a vast experiment in which it manipulated information posted on 689,000 users’ home pages and found it could make people feel more positive or negative through a process of ‘emotional contagion.’

In a study with academics from Cornell and the University of California, Facebook filtered users’ news feeds – the flow of comments, videos, pictures and web links posted by other people in their social network. One test reduced users’ exposure to their friends’ ‘positive emotional content,’ resulting in fewer positive posts of their own. Another test reduced exposure to ‘negative emotional content’ and the opposite happened.

The study concluded: ‘Emotions expressed by friends, via online social networks, influence our own moods, constituting, to our knowledge, the first experimental evidence for massive-scale emotional contagion via social networks.’

Lawyers, internet activists and politicians said this weekend that the mass experiment in emotional manipulation was ‘scandalous,’ ‘spooky’ and ‘disturbing.'”
