For several election seasons, Facebook tried to deny its outsize role in creating a filter bubble, though that balloon burst for good after the fake-news onslaught of 2016. Those who stubbornly denied that the Arab Spring was enabled to a significant degree by our powerful new tools were crushed when the other shoe dropped last November. At the right moment, social media can have a deep impact on governments, for better or worse.
· · ·
In all fairness, Mark Zuckerberg’s empire isn’t alone in the fake-news business. The 21 years of Fox News have overlapped almost exactly with a steep decline in trust in mass media among Republicans and Independents. Many of the senior citizens who voted for Trump have never looked at News Feed, instead absorbing misinformation from Sean Hannity and similar talkers. For all the influence of those outlets, however, their reach is dwarfed by Facebook, which counts its users not in millions but in billions. And social media would appear to be a better way to reach undecideds, a surer approach when trying to tip an election.
· · ·
In an excellent New York Times Magazine feature, Farhad Manjoo investigates News Feed, a “global news distributor that is run by machines, rather than by humans,” as Zuckerberg now tries to control the chaos of his invention, a ginormous piece of the largest experiment in anarchy in history. He seems to have good intentions, but it may not be possible to maximize both profits and ethics, and his invention might just be irrevocably bad for individuals and worse for democracy. The system itself may be a fatal error.
As Manjoo notes, “The people who work on News Feed aren’t making decisions that turn on fuzzy human ideas like ethics, judgment, intuition or seniority.” Zuckerberg has likely grown from the person who said seven or so years ago that “a squirrel dying in your front yard may be more relevant to your interests right now than people dying in Africa,” but he’s still deeply dedicated to personalization and selectively oblivious to the monster he’s created.
The writer stresses that Facebook is trying to fix its fake-news problem via engineering rather than editing. The latter is sometimes far from perfect (the New York Times itself wasn’t without culpability in the recent election), but the answer to every technological problem isn’t necessarily more technology.
An excerpt:
After studying how people shared 1.25 million stories during the campaign, a team of researchers at M.I.T. and Harvard implicated Facebook and Twitter in the larger failure of media in 2016. The researchers found that social media created a right-wing echo chamber: a “media network anchored around Breitbart developed as a distinct and insulated media system, using social media as a backbone to transmit a hyperpartisan perspective to the world.” The findings partially echoed a long-held worry about social news: that people would use sites like Facebook to cocoon themselves into self-reinforcing bubbles of confirmatory ideas, to the detriment of civility and a shared factual basis from which to make collective, democratic decisions. A week and a half after the election, President Obama bemoaned “an age where there’s so much active misinformation and it’s packaged very well and it looks the same when you see it on a Facebook page or you turn on your television.”
After the election, Zuckerberg offered a few pat defenses of Facebook’s role. “I’m actually quite proud of the impact that we were able to have on civic discourse over all,” he said when we spoke in January. Misinformation on Facebook was not as big a problem as some believed it was, but Facebook nevertheless would do more to battle it, he pledged. Echo chambers were a concern, but if the source was people’s own confirmation bias, was it really Facebook’s problem to solve?
It was hard to tell how seriously Zuckerberg took the criticisms of his service and its increasingly paradoxical role in the world. He had spent much of his life building a magnificent machine to bring people together. By the most literal measures, he’d succeeded spectacularly, but what had that connection wrought? Across the globe, Facebook now seems to benefit actors who want to undermine the global vision at its foundation. Supporters of Trump and the European right-wing nationalists who aim to turn their nations inward and dissolve alliances, trolls sowing cross-border paranoia, even ISIS with its skillful social-media recruiting and propagandizing — all of them have sought in their own ways to split the Zuckerbergian world apart. And they are using his own machine to do it.
In Silicon Valley, current events tend to fade into the background. The Sept. 11 attacks, the Iraq war, the financial crisis and every recent presidential election occurred, for the tech industry, on some parallel but distant timeline divorced from the everyday business of digitizing the world. Then Donald Trump won. In the 17 years I’ve spent covering Silicon Valley, I’ve never seen anything shake the place like his victory. In the span of a few months, the Valley has been transformed from a politically disengaged company town into a center of anti-Trump resistance and fear. A week after the election, one start-up founder sent me a private message on Twitter: “I think it’s worse than I thought,” he wrote. “Originally I thought 18 months. I’ve cut that in half.” Until what? “Apocalypse. End of the world.”•
Tags: Farhad Manjoo