Mark Zuckerberg's crusade to fix Facebook this year is beginning with a startling retreat. The social network, its chief executive said, would step back from its role in choosing the news that 2 billion users see on its site every month.
The company is too "uncomfortable" to make such decisions in a world that has become so divided, Zuckerberg explained recently.
The move was one result of a tumultuous, 18-month struggle by Facebook to come to grips with its dark side, interviews with 11 current and former executives show.
As outsiders criticised the social network's harmful side effects, such as the spread of disinformation and violent imagery, vigorous internal debates played out over whether to denounce Donald Trump directly, how forthcoming to be about Russian meddling on its platform in the 2016 election, and how to fight the perception that Facebook is politically biased.
Whether Zuckerberg's proposed changes can address these issues will soon be tested with another major election only 10 months away. Right now, the company isn't confident that it can prevent the problems that roiled Facebook during the 2016 presidential election, a top executive acknowledged.
"I can't make a final assessment, other than we are substantially better today in light of the experience than we were a year ago," Elliot Schrage, Facebook's vice president for communications, policy and marketing, said in an interview. "We will be dramatically better even still a year from now."
Some current and former executives think Facebook has not fully owned up to the negative consequences of its tremendous power. At the heart of the dilemma is the very technology that makes the social network work, they said.
"The problem with Facebook's whole position is that the algorithm exists to maximise attention, and the best way to do that is to make people angry and afraid," said Roger McNamee, an investor and mentor to Zuckerberg in Facebook's early days.
He and others - including the company's first president, its former head of growth and Zuckerberg's former speechwriter - have been criticising the company in increasingly harsh terms.
Altering the formula may diminish what made Facebook successful in the first place - a risk Zuckerberg and his team have said they are willing to take.
"Until the last few years, and certainly until the last year, the focus of our investment has been on building our service and offering new tools and experiences," Schrage said.
"One of the discoveries that the election in particular really demonstrated was, at the same time that we were making investments in positive new experiences, we were underinvesting in the tools and oversights to protect against inappropriate, abusive and exploitative experiences."
That insight led to changes this month to revamp the news feed, the scrolling page that pops up when Facebook users sign in.
Posts shared by family and close friends will now rank above content from news organisations and brands, Facebook said. On Friday, the company said it is also going to let users vote on which news organisations are the most trustworthy and should get the biggest play on Facebook, diminishing its own role in the distribution of news.
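Facebook has not published the mechanics of the new ranking, but the change it describes can be pictured as a re-weighting of the feed's scoring function: personal posts get a boost, and publisher content is scaled by a survey-derived trust score. The sketch below is purely illustrative - every field name, weight and input is an assumption for the sake of the example, not Facebook's actual formula:

```python
# Hypothetical sketch of the re-ranking Facebook described: posts from
# friends and family outrank publisher content, and publishers are
# weighted by a survey-derived trust score. All names and weights here
# are invented for illustration.
from dataclasses import dataclass

@dataclass
class Post:
    author: str
    is_friend_or_family: bool   # personal connection vs. page/brand
    engagement_score: float     # predicted likes/comments, 0..1
    publisher_trust: float      # survey-derived trust, 0..1

def rank_feed(posts: list[Post]) -> list[Post]:
    """Order a feed so personal posts rank above publisher content,
    with trusted publishers ranked above untrusted ones."""
    FRIEND_BOOST = 2.0  # illustrative weight, not Facebook's real value

    def score(p: Post) -> float:
        if p.is_friend_or_family:
            return p.engagement_score + FRIEND_BOOST  # floats to the top
        return p.engagement_score * p.publisher_trust  # demoted if untrusted

    return sorted(posts, key=score, reverse=True)

feed = [
    Post("BrandX", False, 0.9, 0.3),
    Post("Aunt Mei", True, 0.2, 1.0),
    Post("Daily Gazette", False, 0.6, 0.8),
]
print([p.author for p in rank_feed(feed)])
# ['Aunt Mei', 'Daily Gazette', 'BrandX']
```

Under a scheme like this, even a highly engaging brand post cannot outrank a friend's post, which matches the stated intent of the change.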
On Monday, Samidh Chakrabarti, Facebook's product manager for civic engagement, made another surprising admission in a company blog post called "Hard Questions": Social media can "spread misinformation and corrode democracy."
Much is at stake for Facebook as it seeks to fix itself. The company is now the fifth most valuable in the United States and one of the world's biggest distributors of news.
The chorus of criticism - especially from within its own ranks - threatens to demoralise its workforce and spur regulators to use a stronger hand. The focus on quality control could come at the cost of growth and disappoint investors - who already signalled their frustration by sending Facebook's stock down 4.5 per cent the day after Zuckerberg began laying out changes this year.
But perhaps the most worrisome issue for the company is that it will spend a year trying to clean up its act - and the abuses will continue.
"They want to avoid making a judgment, but they are in a situation where you can't avoid making a judgment," said Jay Rosen, a journalism professor at New York University. "They are looking for a safe approach. But sometimes you can be in a situation where there is no safe route out."
Before Facebook became an online advertising behemoth, and before an array of humbling events in 2017, the company was far more willing to tout its ability to influence society.
In 2011, managers published a white paper bragging that an ad campaign on the social network was able to tilt the results of a local election in Florida. In 2014, Facebook rolled out an "I voted" button that aimed to increase voter turnout. Facebook researchers found that it increased turnout in a California election by more than 300,000 votes.
That same year, Facebook openly endorsed social engineering. Its prestigious data science research division published an emotion manipulation study on 700,000 Facebook users, which showed that the company could make people feel happier or more depressed by tweaking the content of news feeds.
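The mechanism the study described was simple in outline: classify posts by emotional tone, probabilistically withhold one valence from a user's feed, and then measure the tone of what that user subsequently wrote. Below is a toy sketch of that exposure-tweaking step, with an invented word-list classifier standing in for the study's actual tool (the researchers used word-count software, not the logic shown here):

```python
import random

# Toy sketch of the feed manipulation the 2014 study described:
# probabilistically withhold posts of one emotional valence.
# The sentiment() classifier below is a crude stand-in for
# illustration only, not the study's method.
POSITIVE = {"happy", "great", "love", "wonderful"}
NEGATIVE = {"sad", "awful", "hate", "terrible"}

def sentiment(text: str) -> str:
    words = set(text.lower().split())
    if words & POSITIVE:
        return "positive"
    if words & NEGATIVE:
        return "negative"
    return "neutral"

def tweak_feed(posts: list[str], suppress: str, rate: float = 0.5) -> list[str]:
    """Drop posts of the given valence with probability `rate`."""
    return [p for p in posts
            if sentiment(p) != suppress or random.random() > rate]
```

The study's finding was that users shown fewer positive posts went on to write slightly more negative ones, and vice versa - evidence that the composition of the feed itself shifts mood.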
The internal impression of Facebook's influence partly shaped how Zuckerberg waded into political issues during the campaign, some Facebook executives said.
In the spring of 2016, when then-candidate Trump began calling for a border wall between the United States and Mexico, several leaders in the tech industry criticised the proposal.
Zuckerberg, a major backer of immigration-related causes, considered drafting a post for his Facebook page condemning the wall proposal, according to Dex Torricke-Barton, one of Zuckerberg's former speechwriters, and another executive who had direct knowledge of the matter.
But several advisers argued against doing so. They were concerned that calling out a US presidential candidate by name would push the company too deeply into politics.
Ultimately, the post did not include Trump's name, the two people said. The result was generic language condemning "those who try to build walls".
Torricke-Barton said he fumed about the decision at the time. But when he complained to a colleague, he received an answer that was even more troubling to him. The colleague made the point that Facebook was more powerful than any nation-state, and therefore it made sense for Zuckerberg to stay above the fray.
"On every major public policy issue, there are different voices who call for Facebook to take a stronger public stance or to play it safe," Torricke-Barton said in an interview. "In practice, Facebook has become very good at walking right up to the line to take a stand on critical issues but not actually crossing it."
A Facebook spokesman denied that Zuckerberg had originally wanted to include Trump's name in the post, adding that the boss had intended to express broader concerns about the rise of nativism around the world.
A few months later, the company faced accusations of political partisanship. A story on the tech news site Gizmodo - citing unnamed Facebook contractors - accused Facebook of routinely excluding conservative media outlets and themes from the Trending Topics list of top stories.
As the story spread, panicked Facebook executives conducted a study of what went wrong and discovered that the decisions about which outlets to include were made by low-level contractors and went unchecked by higher-ups.
In May of 2016, Zuckerberg and his aides invited leading conservative media figures to the Facebook campus in Menlo Park, California, to discuss the matter. They also took the unusual step of inviting right-leaning Facebook board member Peter Thiel to help plan the meeting, people familiar with the matter said, speaking on the condition of anonymity to discuss the events freely.
Facebook employees didn't know at the time - and were surprised to find out soon after - that Thiel was bankrolling a lawsuit against the media outlet Gawker; that summer, he also declared his support for Trump. Some executives thought he was using his position on the board to court political allies and influence the media, contrary to Facebook's goal of stepping away from any perception of political partisanship. Thiel did not respond to requests for comment.
Regardless, executives began to recognise the Trending Topics incident as a turning point in the perception of Facebook's power, and they debated whether they should be more transparent about how the news algorithm worked, current and former executives said. Some openly worried that emerging views of the company's influence could invite regulation.
Schrage said he was surprised initially by the central position the public and the media attributed to Facebook in shaping the political environment.
"The issue of Trending Topics was an important moment," he said, "for appreciating the vital role that we played in the dynamic of political discourse and in the concerns that people had that we were not being an honest broker."
Ultimately, executives decided not to reveal much about the software. Instead, they fired the human contractors whose judgment - and leaks - executives thought had fuelled the crisis.
As the election neared, concerns about appearing biased continued to weigh on Facebook executives. In addition, by October of 2016, Facebook's security team had identified and purged 5.8 million fake accounts, some of which had spread phony news reports - including one about Pope Francis endorsing Trump.
The day after the election, Facebook employees, including several top executives, came to work "sick to their stomachs", according to a former executive, who said that the internal discussions turned immediately to the role of false stories on the platform.
Still, the following week, Zuckerberg dismissed the idea that fake news on Facebook had an impact on the election, describing that possibility as "crazy".
Even after receiving a personal warning from President Barack Obama, who cautioned Zuckerberg to take the subject more seriously, executives worried that taking down too many accounts would make the company appear biased against conservatives, causing a repeat of the Trending Topics debacle, according to a former executive. A December 2016 blog post said the company would focus on blocking "the worst of the worst".
Over the following months, Facebook's security teams began to unearth more evidence pointing to the role of Russian operatives.
Executives debated what to reveal or whether it was appropriate for Facebook to single out a foreign nation. Ultimately, an April white paper included only a general description of this effort and the assertion that Facebook's data "does not contradict" the conclusions of US intelligence officials, who had found that Russian operatives tried to meddle in the election to help Trump.
But the compromise led to grumbling among members of the security team, some of whom complained that most of their groundbreaking work was kept from reaching the public, according to several people who heard such complaints and spoke on the condition of anonymity to discuss the matter freely.
As the company was discovering evidence of Russian meddling, Senator Mark Warner visited Facebook's headquarters, where he pressured the firm to disclose what it knew.
In the months after Warner's visit, the company found evidence of 3,000 ads bought by the Internet Research Agency, a troll farm with Kremlin ties.
Initially, the company planned to only disclose the ads to Congress - and declined to share information on Russian strategies and their reach, such as the content of the ads, which types of Americans were targeted, and how many were exposed to Russian content that was not advertising and looked like regular Facebook posts.
After independent researchers claimed that the reach of Russian operatives was much greater than the company had disclosed, Facebook conducted an investigation and discovered that 126 million Americans were exposed to Russian disinformation through posts on Facebook, in addition to an estimated 10 million users who saw the ads created by the Russian troll farm.
At the same time, the company blocked access to metrics and posts about Russian accounts that had enabled outside researchers to estimate the reach of the Russian ads.
"They only act in response to public pressure," said Tristan Harris, Google's former design ethicist and the founder of TimeWellSpent, an advocacy group pressuring technology companies to make less addictive products.
Facebook ultimately promised to publish all political ads going forward and built a feature, released on the Friday before Christmas, that allows individual Facebook users to see whether they were targets of Russian disinformation.
The company has also vowed to hire more than 10,000 workers, including academics, subject matter experts, and content moderators, to step up its efforts to ensure election integrity, boost its understanding of propaganda, and evaluate violent videos and hate speech.
But some current and former executives questioned whether the company could hire enough people to clean up the social network.
The changes the company is making are just tweaks when, in fact, the problems are a core feature of the Facebook product, said Sandy Parakilas, a former Facebook privacy operations manager.
"If they demote stories that get a lot of likes, but drive people toward posts that generate conversation, they may be driving people toward conversation that isn't positive. 'The Pope endorses Trump' was an [incorrect] article that drove a lot of conversation."
Zuckerberg has since apologised for dismissing the impact of fake news on the election. And Schrage says that, after the growing pains of 2017, the company is learning to be more forthcoming - and to make painful choices.
But Schrage emphasised Zuckerberg's willingness to hurt the company's bottom line if it meant making decisions that would improve user safety. He added that Facebook had nonetheless been more transparent than other tech giants, pointing out that it was the first internet company to admit Russian interference.
The events of 2017 "reinforced to us the tremendous responsibility we have, and made clear the important disconnect between our assessment of our responsibility - and the steps we were taking to exercise it - and other people's assessments of our power and responsibility and how well we were exercising it," Schrage said.
Last year, Schrage launched the Hard Questions blog, where executives discuss their decision-making on challenging issues such as censorship of posts, hate speech and Russian meddling.
"I do think people have broad questions about how the internet and the technologies that the internet enables are perceived," he said. "There is tremendous anxiety about this, and so what we've decided - and this has been a conscious decision - is that best way to alleviate the anxiety is to help people understand what we're doing."