Mark Zuckerberg, Facebook's chief executive, resisted pressure to change the company's political advertising policies. Photo / Eric Thayer, The New York Times
By opting not to change the company's political advertising rules, Mark Zuckerberg has ensured another election shaped by the social network.
If you were hoping to hear less about Facebook this year, you're out of luck.
The social platform announced on Thursday — after months of hemming and hawing — that it would not change its basic rules for political advertising ahead of the 2020 election. Unlike Google, which restricted the targeting of political ads last year, or Twitter, which barred political ads entirely, Facebook and its chief executive, Mark Zuckerberg, decided to preserve the status quo.
Politicians will still be exempt from Facebook's fact-checking program, and will still be allowed to break many of the rules that apply to other users. Campaigns will still be allowed to spend millions of dollars on ads targeted to narrow slices of the electorate, upload their voter files to build custom audiences, and use all the other tools of Facebook tradecraft.
The social network has spent much of the past three years apologising for its inaction during the 2016 election, when its platform was overrun with hyperpartisan misinformation, some of it Russian, that was amplified by its own algorithms. And ahead of 2020, some people wondered if Zuckerberg — who is, by his own admission, uncomfortable with Facebook's power — would do everything he could to step out of the political crossfire.
Instead, Zuckerberg has embraced Facebook's central role in elections — not only by giving politicians a pass on truth, but also by preserving the elements of its advertising platforms that proved to be a decisive force in 2016.
"It was a mistake," Alex Stamos, Facebook's former chief security officer, said about Facebook's decision. Stamos, who left the company after the 2016 election, said that political considerations had likely factored into the decision to leave its existing ad targeting options in place.
"They're clearly afraid of political pushback," he said.
Stamos, like some Facebook employees and outside agitators, had advocated for small but meaningful changes to Facebook's policies, such as raising the minimum audience size a political advertiser is allowed to target and disallowing easily disprovable claims made about a political candidate by his or her rivals. These proposed changes were intended to discourage bad behavior by campaigns, while still letting them use Facebook's powerful ad tools to raise money and turn out supporters.
But in the end, those arguments lost out to the case — made by Andrew Bosworth, a Facebook executive, in an internal memo, as well as by President Donald Trump's campaign and several Democratic groups — that changing the platform's rules, even in an ostensibly neutral way, would amount to tipping the scales. Bosworth, who oversaw Facebook's ad platform in 2016, argued that Trump was elected simply because "he ran the single best digital ad campaign I've ever seen from any advertiser."
In other words, the system worked as designed.
Don't get me wrong: Facebook has made strides since 2016 to deter certain kinds of election interference. It has spent billions of dollars beefing up its security teams to prevent another Russian troll debacle, and it has added transparency tools that shine more light on the dark arts of digital campaigning, such as a political ad library and a verification process that requires political advertisers to register with an American address. These moves have forced would-be election meddlers to be stealthier in their tactics, and have made a 2016-style foreign influence operation much less likely this time around.
But despite these changes, the basic architecture of Facebook is largely the same as it was in 2016, and vulnerable in many of the same ways. The platform still operates on the principle that what is popular is good. It still takes a truth-agnostic view of political speech — telling politicians that, as long as their posts don't contain certain types of misinformation (like telling voters the wrong voting day, or misleading them about the census), they can say whatever they want. And it is still reluctant to take any actions that could be construed as partisan — even if those actions would lead to a healthier political debate or a fairer election.
Facebook has argued that it shouldn't be an arbiter of truth, and that it has a responsibility to remain politically neutral. But the company's existing policies are anything but neutral. They give an advantage to candidates whose campaigns are good at cranking out emotionally charged, hyperpartisan content, regardless of its factual accuracy. Today, that describes Trump's strategy, as well as those used successfully by other conservative populists, including President Jair Bolsonaro of Brazil and Prime Minister Viktor Orban of Hungary. But it could just as well describe the strategy of a successful Democratic challenger to Trump. Facebook's most glaring bias is not a partisan one — it is a bias toward candidates whose strategies most closely resemble that of a meme page.
On one level, Zuckerberg's decision on ads, which came after months of passionate lobbying by both Republican and Democratic campaigns, as well as civil rights groups and an angry cohort of Facebook employees, is a bipartisan compromise. Both sides, after all, rely on these tools, and there is an argument to be made that Democrats need them in order to close the gap with Trump's sophisticated digital operation.
Ultimately, though, Zuckerberg's decision to leave Facebook's platform architecture intact amounts to a powerful endorsement — not of any 2020 candidate, but of Facebook's role in global democracy. It's a vote for the idea that Facebook is a fairly designed playing field that is conducive to healthy political debate, and that whatever problems it has simply reflect the problems that exist in society as a whole.
Ellen L. Weintraub, a commissioner on the Federal Election Commission who has been an outspoken opponent of Facebook's existing policies, told me on Thursday that she, too, was disappointed in the company's choice.
"They have a real responsibility here, and they're just shirking it," Weintraub said. "They don't want to acknowledge that something they've created is contributing to the decline of our democracy, but it is."
In Facebook's partial defense, safeguarding elections is not a single company's responsibility, nor are tech companies the sole determinants of who gets elected. Income inequality, economic populism, immigration policy — these issues still matter, as do the media organizations that shape perception of them. I also don't believe, as some Facebook critics do, that Zuckerberg is doing this for the money. Facebook's political advertising revenue is a tiny portion of its overall revenue, and even a decision to bar political ads entirely wouldn't materially change the company's financial health.
Instead, I take Zuckerberg at his word that he genuinely believes that an election with Facebook at its core is better than one without it — that, as he said last year, "political ads are an important part of voice."
There are reasons to quibble with Zuckerberg's definition of "voice," and to ask why a platform that fact-checked politicians' ads or limited their ability to microtarget voters would have less of it. But it barely matters, because the terms for the 2020 election are now set. This election, like the 2016 election, will be determined in large part by who can best exploit Facebook's reluctance to appear to be refereeing our politics, even while holding the whistle.
"They've laid out what the rules are going to be — and now everyone has to line up behind these rules," said Stamos, the former Facebook security chief. "Which are effectively no rules."