Zuckerberg realises the dangers of the social-media revolution he helped start

May 5th, 2017

In early January, I went to see Mark Zuckerberg at MPK20, a concrete-and-steel building on the campus of Facebook's headquarters in Menlo Park, California. The Frank Gehry-designed building has a pristine 3.6-hectare rooftop garden, yet much of the interior appears unfinished. Many of the internal walls are unpainted plywood. The space looks less like the headquarters of one of the world's wealthiest companies and more like a Chipotle restaurant with standing desks. It's an aesthetic meant to reflect one of Facebook's founding ideologies: that things are never quite finished, that nothing is permanent, that you should always look for a chance to take an axe to your surroundings.

The mood in overwhelmingly liberal Silicon Valley at the time, days before US President Donald Trump's inauguration, was grim. But Zuckerberg is preternaturally unable to look anything other than excited about the future. "Hey, guys!" he beamed, greeting Mike Isaac, a New York Times colleague who covers Facebook, and me.

"2016 was an interesting year for us," he said as the three of us, plus a public relations executive, sat in a glass-walled conference room. No one, not even Zuckerberg, has a private office. It was an understatement and a nod to the obvious: Facebook had become a global political and cultural force, and the full implications of that transformation had begun to come into view last year.

"If you look at the history of Facebook, when we started off, there really wasn't news as part of it," Zuckerberg went on. But as Facebook grew and became a bigger part of how people learn about the world, the company had been slow to adjust to its new place in people's lives. The events of 2016, he said, "set off a number of conversations that we're still in the middle of".

Nearly 2 billion people use Facebook every month, about 1.2 billion of them daily. The company, which Zuckerberg co-founded in his Harvard dormitory room 13 years ago, has become the largest and most influential entity in the news business, commanding an audience greater than that of any American or European television news network, any newspaper in the Western world and any online news outlet. It is also the most powerful mobilising force in politics, and it is fast replacing television as the most consequential entertainment medium. Just five years after its initial public offering, Facebook is one of the world's 10 most valuable public companies by market capitalisation.

But over the course of 2016, Facebook's gargantuan influence became its biggest liability. During the US election, propagandists - some working for money, others for potentially state-sponsored lulz [mischief] - used the service to turn fake stories into viral sensations, such as the one about Pope Francis endorsing Trump (he hadn't). With its huge reach, Facebook has begun to act as the great disseminator of misinformation and half-truths swirling about the rest of media. It sucks up lies from cable news and Twitter, then precisely targets each lie to the partisan bubble most receptive to it.

After studying how people shared 1.25 million stories during the campaign, a team of researchers at Massachusetts Institute of Technology and Harvard implicated Facebook and Twitter in the larger failure of media in 2016, finding that social media created a right-wing echo chamber: a "media network anchored around Breitbart developed as a distinct and insulated media system, using social media as a backbone to transmit a hyperpartisan perspective to the world". After the election, former president Barack Obama bemoaned "an age where there's so much active misinformation and it's packaged very well and it looks the same when you see it on a Facebook page or you turn on your television."

Zuckerberg offered a few pat defences of Facebook's role. "I'm actually quite proud of the impact that we were able to have on civic discourse over all," he said in January. Misinformation on Facebook was not as big a problem as some believed it was, but Facebook nevertheless would do more to battle it, he pledged.

It was hard to tell how seriously Zuckerberg took the criticisms of his service and its increasingly paradoxical role in the world. Across the globe, Facebook now seems to benefit actors who want to undermine the global vision at its foundation. Supporters of Trump and of the European right-wing nationalists who aim to turn their nations inward and dissolve alliances - even ISIS, with its skillful social-media recruiting and propagandising - have sought to split the Zuckerbergian world apart. And they are using his own machine to do it.

Since election day Silicon Valley has been consumed with a feeling of complicity. Trump had benefited from a media environment that is now shaped by Facebook - and, more to the point, shaped by a single Facebook feature, the same one to which the company owes its remarkable ascent to social-media hegemony: the computationally determined list of updates you see every time you open the app. The list has a formal name, News Feed. But most users are apt to think of it as Facebook itself.

If it's an exaggeration to say that News Feed has become the most influential source of information in the history of civilisation, it is only slightly so. Facebook created News Feed in 2006 to solve a problem: In the social-media age, people suddenly had too many friends to keep up with. To figure out what any of your connections were up to, you had to visit each of their profiles to see if anything had changed. News Feed fixed that. Every time you open Facebook, it hunts through the network, collecting every post from every connection. Then it weighs the merits of each post before presenting you with a feed sorted in order of importance: a hyperpersonalised front page designed just for you.
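The mechanics are easy to sketch in code. What follows is a purely illustrative toy in Python - the Post fields and scoring weights are invented for this example and bear no relation to Facebook's actual, far more elaborate ranking models - but it captures the basic loop the paragraph describes: gather every candidate post, score each one for this particular viewer, then sort.

```python
from dataclasses import dataclass

@dataclass
class Post:
    author: str
    text: str
    likes: int
    comments: int
    affinity: float    # how often this viewer interacts with the author (0-1); an invented signal
    age_hours: float   # hours since the post was published

def relevance(post: Post) -> float:
    """Toy per-viewer score: raw engagement, weighted by affinity, decayed by age.
    The weights are made up for illustration, not Facebook's."""
    engagement = post.likes + 2 * post.comments
    return post.affinity * engagement / (1.0 + post.age_hours)

def build_feed(candidates: list[Post]) -> list[Post]:
    """Rank every candidate post for one viewer, most 'relevant' first."""
    return sorted(candidates, key=relevance, reverse=True)

feed = build_feed([
    Post("alice", "New job!", likes=120, comments=30, affinity=0.9, age_hours=2.0),
    Post("bob", "Lunch photo", likes=5, comments=1, affinity=0.2, age_hours=1.0),
])
for p in feed:
    print(f"{p.author}: {p.text} (score={relevance(p):.1f})")
```

Two viewers with different affinity signals see the same candidate posts in entirely different orders - which is the whole point, and, as the critics below argue, the whole problem.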

Scholars and critics have been warning of the solipsistic irresistibility of algorithmic news at least since 2001, when the constitutional law professor Cass Sunstein warned, in his book Republic.com, of the urgent risks posed to democracy "by any situation in which thousands or perhaps millions or even tens of millions of people are mainly listening to louder echoes of their own voices". (In 2008, I piled on with my own book, True Enough: Learning to Live in a Post-Fact Society.) In 2011, the digital activist and entrepreneur Eli Pariser gave this phenomenon a memorable name in the title of his own book: The Filter Bubble.

Facebook says its own researchers have been studying the filter bubble since 2010. In 2015, they published an in-house study, which was criticised by independent researchers, concluding that Facebook's effect on the diversity of people's information diet was minimal. When News Feed did show people views contrary to their own, they tended not to click on the stories. For Zuckerberg, the finding let Facebook off the hook.

Then, last year, Facebook's domination of the news became a story itself. In May, Gizmodo reported that some editors who had worked on Facebook's Trending Topics section had been suppressing conservative points of view. To smooth things over, Zuckerberg convened a meeting of conservative media figures and eventually significantly reduced the role of human editors. Then in September, Facebook deleted a post that included the photojournalist Nick Ut's iconic photo of a naked nine-year-old girl, Phan Thi Kim Phuc, running in terror after a napalm attack during the Vietnam War, on the grounds that it fell foul of Facebook's prohibition of child nudity.

Facebook, under criticism, reinstated the picture, but the photo incident highlighted the difficulty of building a policy framework for what Facebook was trying to do. Zuckerberg wanted Facebook to become a global news distributor run by machines, rather than by humans who would try to look at every last bit of content and exercise considered judgment. "It's something I think we're still figuring out," he told me in January. "There's a lot more to do here than what we've done. And I think we're starting to realise this now as well."

It struck me as an unsatisfying answer, and it soon became apparent that Zuckerberg felt the same way. A month after the first meeting, he wanted to chat again.

The Zuckerberg who greeted us was less certain in his pronouncements, more questioning. Earlier, Zuckerberg's staff had sent me a draft of a 5700-word manifesto that, I was told, he spent weeks writing. The document, "Building Global Community", argued that until now, Facebook's corporate goal had merely been to connect people. According to the manifesto, Facebook's next focus will be developing "the social infrastructure for community - for supporting us, for keeping us safe, for informing us, for civic engagement, and for inclusion of all". If it was a nebulous crusade, it was also vast in its ambition.

"There are questions about whether we can make a global community that works for everyone," Zuckerberg writes, "and whether the path ahead is to connect more or reverse course." He also confesses misgivings about Facebook's role in the news. "Giving everyone a voice has historically been a very positive force for public discourse because it increases the diversity of ideas shared," he writes. "But the past year has also shown it may fragment our shared sense of reality."

At the time, the manifesto was still only a draft. When I suggested that it might be perceived as an attack on Trump, he looked dismayed. A few weeks earlier, there was media speculation, fuelled by a post-election tour of America by Zuckerberg and his wife, that he was laying the groundwork to run against Trump in 2020, and he took pains to shoot down the rumours.

If the company pursues the aims outlined in "Building Global Community", the changes will echo across media and politics, and some are bound to be considered partisan. The risks are especially clear for changes aimed at adding layers of journalistic ethics across News Feed, which could transform the public's perception of Facebook, not to mention shake the foundations of its business.

The solution to the broader misinformation dilemma - the pervasive climate of rumour, propaganda and conspiracy theories that Facebook has inadvertently incubated - may require something that Facebook has never done: ignoring the likes and dislikes of its users. Facebook's entire project, when it comes to news, rests on the assumption that people's individual preferences ultimately coincide with the public good, and that, if it doesn't appear that way at first, you're not delving deeply enough into the data. By contrast, decades of social-science research show that most of us simply prefer stuff that feels true to our world view even if it isn't true at all, and that the mining of all those preference signals is likely to lead us deeper into bubbles rather than out of them.

After the election, Margaret Sullivan, a Washington Post columnist and a former public editor of the Times, called on Facebook to hire an executive editor who would monitor News Feed with an eye to fact-checking, balance and editorial integrity. Jonah Peretti, the founder of BuzzFeed, told me that he wanted Facebook to use its data to create a kind of reputational score for online news.

Late last year, Facebook outlined a modest effort to curb misinformation. News Feed would now carry warning labels: If a friend shares a viral story that has been shot down by one of Facebook's fact-checking partners (including Snopes and PolitiFact), you'll be cautioned that the piece has been "disputed". But even that slight change has been met with fury on the right, with Breitbart and The Daily Caller fuming that Facebook had teamed up with liberal hacks motivated by partisanship. If Facebook were to take more significant action, such as hiring human editors or paying journalists, the company would instantly become something it has long resisted: a media company rather than a neutral tech platform.
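In outline, the labelling step itself is simple - here is a hypothetical sketch in Python, with an invented lookup table standing in for Facebook's real fact-checking pipeline: before a shared link is rendered, check it against the claims the partners have flagged.

```python
# Hypothetical sketch of a "disputed" warning label; the table and URL are invented.
DISPUTED_BY = {
    "example.com/pope-endorses-trump": ["Snopes", "PolitiFact"],
}

def render_share(url: str) -> str:
    """Attach a caution to a shared link if a fact-checking partner has flagged it."""
    checkers = DISPUTED_BY.get(url)
    if checkers:
        return f"{url}\n[Disputed by {' and '.join(checkers)}]"
    return url

print(render_share("example.com/pope-endorses-trump"))
```

The hard part, of course, is everything the lookup table hides: deciding what counts as disputed, and doing so at the scale of nearly 2 billion users.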

In many ways, the worry over how Facebook changes the news is really a manifestation of a grander problem with News Feed, which is simply dominance itself. News Feed's aggressive personalisation wouldn't be much of an issue if it weren't crowding out every other source.

By my second meeting with Zuckerberg, Facebook had announced plans for the Facebook Journalism Project, in which the company would collaborate with news companies on new products. Facebook also created a project to promote "news literacy" among its users, and it hired the former CNN news anchor Campbell Brown to manage its partnerships with news companies. Zuckerberg's tone towards critics of Facebook's approach to news had grown far more conciliatory.

"I think it's really important to get to the core of the actual problem," he said. "I also really think that the core social thing that needs to happen is that a common understanding needs to exist. And misinformation I view as one of the things that can possibly erode common understanding. But sensationalism and polarisation and other things, I actually think, are probably even stronger and more prolific effects. And we have to work on all these things. I think we need to listen to all the feedback on this."

Still, Zuckerberg remained preoccupied with the kind of problems that could be solved by the kind of hyperconnectivity he believed in, not the ones caused by it. "There's a social infrastructure that needs to get built for modern problems in order for humanity to get to the next level," he said. "Having more people oriented not just towards short-term things but towards building the long-term social infrastructure that needs to get built across all these things in order to enable people to come together is going to be a really important thing over the next decades."

Zuckerberg continued, "We're getting to a point where the biggest opportunities I think in the world ... problems like preventing pandemics from spreading or ending terrorism, all these things, they require a level of co-ordination and connection that I don't think can only be solved by the current systems that we have." What's needed, he suggested, is some global superstructure to advance humanity.

Zuckerberg is arguing for a kind of digital-era version of the global institution-building that the Western world engaged in after World War II. But because he is a chief executive and not an elected president, there is something frightening about his project. He is positioning Facebook - and, considering that he commands absolute voting control of the company, himself - as a critical enabler of the next generation of human society. His mission drips with megalomania, albeit of a particularly sincere sort.

Building new "social infrastructure" usually involves tearing older infrastructure down. If you manage the demolition poorly, you might undermine what comes next. In the case of the shattering media landscape, Zuckerberg may yet come up with fixes for it. But in the meantime, Facebook rushes headlong into murky new areas, uncovering new dystopian possibilities at every turn.

A few months after we spoke, Facebook held its annual developer conference in San Jose, California. At last year's show, Zuckerberg introduced an expanded version of Facebook's live streaming service, which he promised would revolutionise how we communicate. Live had generated iconic scenes of protest, but was also used to broadcast a terrorist attack in Munich and at least one suicide. Hours before Zuckerberg's appearance, a Cleveland man who had killed a stranger and posted a video on Facebook had shot himself after a manhunt.

But as he took the stage in San Jose, Zuckerberg was ebullient. For a brief moment, there was a shift in tone: Statesman Zuck. "In all seriousness, this is an important time to work on building community," he said. He offered Facebook's condolences to the family and friends of the victim in Cleveland; the incident, he said, reminded Facebook that "we have a lot more to do".

Zuckerberg then pivoted to Facebook's next marvel, a system for digitally augmenting your pictures and videos. The technical term for this is "augmented reality". The technology bursts with dystopian possibilities - fake news on video rather than just text - but Zuckerberg never mentioned them. The statesman had left the stage; before us stood an engineer.

The New York Times Magazine
