“Our Goal Is To Build The Perfect Personalized Newspaper For Every Person In The World”

When the idea of online personalization was first presented to me by a Web 1.0 entrepreneur, I was unimpressed and uninterested, saying I preferred newspapers and magazines that introduced me to ideas I hadn’t been seeking. You could have your very own tailor-made newspaper or magazine, I was told, but I insisted that wasn’t what I would subscribe to. I naively believed others would feel the same way.

When the very young version of Mark Zuckerberg (who is still young) infamously said “a squirrel dying in your front yard may be more relevant to your interests right now than people dying in Africa,” he revealed himself as stunningly callous at that point in his life, but he also demonstrated the myopia of personalization. Our new tools can enable us to live in a bubble, but that doesn’t mean they ennoble us.

Targeted information helps maximize online advertising revenue, but it’s lousy for democracy, paradoxically offering us greater freedom while simultaneously undermining it. Whether the Internet is inherently at odds with liberty is a valid question. And considering Fox News has been fake news for more than 20 years, the decentralized media in general has to be similarly analyzed.


The opening of Tim Harford’s latest Financial Times column:

“Our goal is to build the perfect personalised newspaper for every person in the world,” said Facebook’s Mark Zuckerberg in 2014. This newspaper would “show you the stuff that’s going to be most interesting to you.”

To many, that statement explains perfectly why Facebook is such a terrible source of news.

A “fake news” story proclaiming that Pope Francis had endorsed Donald Trump was, according to an analysis from BuzzFeed, the single most successful item of news on Facebook in the three months before the US election. If that’s what the site’s algorithms decide is interesting, it’s far from being a “perfect newspaper.”

It’s no wonder that Zuckerberg found himself on the back foot after Trump’s election. Shortly after Trump’s victory, he declared: “I think the idea that fake news on Facebook, which is a very small amount of the content, influenced the election in any way . . . is a pretty crazy idea.” His comment was greeted with a scornful response.

I should confess my own biases here. I despise Facebook for all the reasons people usually despise Facebook (privacy, market power, distraction, fake-smile social interactions and the rest). And, as a loyal FT columnist, I need hardly point out that the perfect newspaper is the one you’re reading right now.

But, despite this, I’m going to stand up for Zuckerberg, who recently posted a 5,700-word essay defending social media. What he says in the essay feels like it must be wrong. But the data suggest that he’s right. Fake news can stoke isolated incidents of hatred and violence. But neither fake news nor the algorithmically driven “filter bubble” is a major force in the overall media landscape. Not yet.•


From Eli Pariser in the New York Times in 2011:

Like the old gatekeepers, the engineers who write the new gatekeeping code have enormous power to determine what we know about the world. But unlike the best of the old gatekeepers, they don’t see themselves as keepers of the public trust. There is no algorithmic equivalent to journalistic ethics.

Mark Zuckerberg, Facebook’s chief executive, once told colleagues that “a squirrel dying in your front yard may be more relevant to your interests right now than people dying in Africa.” At Facebook, “relevance” is virtually the sole criterion that determines what users see. Focusing on the most personally relevant news — the squirrel — is a great business strategy. But it leaves us staring at our front yard instead of reading about suffering, genocide and revolution.

There’s no going back to the old system of gatekeepers, nor should there be. But if algorithms are taking over the editing function and determining what we see, we need to make sure they weigh variables beyond a narrow “relevance.” They need to show us Afghanistan and Libya as well as Apple and Kanye.

Companies that make use of these algorithms must take this curatorial responsibility far more seriously than they have to date. They need to give us control over what we see — making it clear when they are personalizing, and allowing us to shape and adjust our own filters. We citizens need to uphold our end, too — developing the “filter literacy” needed to use these tools well and demanding content that broadens our horizons even when it’s uncomfortable.

It is in our collective interest to ensure that the Internet lives up to its potential as a revolutionary connective medium. This won’t happen if we’re all sealed off in our own personalized online worlds.•
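Pariser’s suggestion that algorithms “weigh variables beyond a narrow ‘relevance’” is easy to make concrete. Below is a minimal, hypothetical sketch, in the spirit of his argument rather than anything Facebook actually runs: the fields, weights, and scoring function are all my own inventions for illustration. It contrasts a feed ranked purely by predicted personal relevance with one that also rewards unfamiliar topics.

```python
# Toy illustration of Pariser's point, not any real platform's code.
# "relevance" = predicted personal interest; "familiarity" = how close
# the topic is to what the user already consumes. Both range 0..1.

from dataclasses import dataclass

@dataclass
class Story:
    title: str
    relevance: float
    familiarity: float

def rank_by_relevance(stories):
    """The narrow filter: surface whatever the user is predicted to click."""
    return sorted(stories, key=lambda s: s.relevance, reverse=True)

def rank_with_diversity(stories, diversity_weight=0.4):
    """Blend relevance with a bonus for stories outside the user's bubble."""
    def score(s):
        return ((1 - diversity_weight) * s.relevance
                + diversity_weight * (1 - s.familiarity))
    return sorted(stories, key=score, reverse=True)

feed = [
    Story("Squirrel dies in your front yard", relevance=0.9, familiarity=0.95),
    Story("Famine worsens in East Africa",    relevance=0.4, familiarity=0.10),
    Story("New Kanye album announced",        relevance=0.8, familiarity=0.90),
]

print([s.title for s in rank_by_relevance(feed)])   # squirrel first
print([s.title for s in rank_with_diversity(feed)]) # famine story rises to the top
```

With relevance alone, the squirrel wins; with even a modest diversity weight, the famine story leads the feed. The dying squirrel and the distant suffering are exactly one parameter apart, which is the whole debate in miniature.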
