“Before The Filter Bubble, There Was The So-Called Echo Chamber”


Trending Topics at Afflictor: Monkeys, Astronauts, Monkey Astronauts. Oh, and President Trump’s tiny monkey hands. Take it or leave it.

Facebook got into trouble last week with the GOP because a report claimed the human editors choosing the social network’s Trending Topics were exercising bias against conservatives. It had been widely thought these keywords rose and fell purely based on popularity, in an automated way.

What’s most amusing about this is that Republicans have been greatly aided in recent congressional races by gerrymandering, in which humans draw up districts in a willfully biased way. Just suggest to them that unbiased algorithms decide the nation’s districts based purely on population statistics. The line will go silent.

There’s no doubt that Facebook’s possible emergence as our chief news source is problematic, because the social network isn’t mainly in the journalism business and news will never be its top priority. But I wonder if what it delivers is really any more biased than what we get from traditional outlets.

In a Wall Street Journal column, Christopher Mims looks more deeply at the issue and asks whether narrowcasting in Facebook feeds is actually a problem. An excerpt:

Claiming that Facebook is contributing to our age of hyper-partisanship by only showing us things that fit our own personal slant is, ironically, an example of confirmation bias, because the evidence around it is mixed.

After an exhaustive search of the literature around filter bubbles, five co-authors and Frederik J. Zuiderveen Borgesius, a researcher at the Personalised Communication project at the University of Amsterdam, concluded concerns might be overblown. “In spite of the serious concerns voiced, at present there is no empirical evidence that warrants any strong worries about filter bubbles,” Mr. Zuiderveen Borgesius wrote in an email.

The authors examined not only Facebook but other online services, including Google search. Mr. Zuiderveen Borgesius’s conclusion: We don’t have enough data to say whether Facebook is biasing the news its readers see, or—and this is even more important—whether it affects their views and behavior.

Facebook’s opacity aside, where does the hand-wringing come from? Two places, I think: the first is that everyone in the media is terrified of Facebook’s power to determine whether individual stories and even entire news organizations succeed or fail. The second is an ancient fear that, by associating only with people like ourselves, and being selective in what we read, we are biasing ourselves unduly.

Before the filter bubble, there was the so-called echo chamber.•
