Facebook Chief Executive Mark Zuckerberg said in an interview he worries "about an erosion of truth" online but defended the policy that allows politicians to peddle ads containing misrepresentations and lies on his social network, a stance that has sparked an outcry during the 2020 presidential campaign.
"People worry, and I worry deeply, too, about an erosion of truth," Zuckerberg told The Washington Post ahead of a speech Thursday at Georgetown University. "At the same time, I don't think people want to live in a world where you can only say things that tech companies decide are 100 per cent true. And I think that those tensions are something we have to live with."
Zuckerberg's approach to political speech has come under fire in recent weeks. Democrats have taken particular issue with Facebook's decision to allow an ad from President Donald Trump's 2020 campaign that included falsehoods about former Vice President Joe Biden and his son, Hunter. Sen. Elizabeth Warren, D-Mass., responded to Facebook's decision by running her own campaign ad, satirically stating that Zuckerberg supports Trump for re-election.
Zuckerberg framed the issue as part of a broader debate over free expression, warning about the dangers of social networks, including Facebook, "potentially cracking down too much." He called on the U.S. to set an example for tailored regulation in contrast to other countries, including China, that censor political speech online. And Zuckerberg stressed Facebook must stand strong against governments that seek to "pull back" on free speech in the face of heightened social and political tensions.
Zuckerberg's appearance in Washington marks his most forceful attempt to articulate his vision for how governments and tech giants should approach the Web's most intractable problems. The scale of Facebook and its affiliated apps, Instagram and WhatsApp, which make up a virtual community of billions of users, poses challenges for Zuckerberg and regulators around the world as they struggle to contain hate speech, falsehoods, violent imagery and terrorist propaganda on social media.
Next week, Zuckerberg is set to testify at a congressional hearing that's likely to serve as a wide-ranging review of the company's business practices. Facebook's size, meanwhile, has become a primary object of derision from some Democrats seeking the White House in 2020, who contend that Facebook is too big, powerful and problematic and should be regulated or broken apart.
The election lends urgency to Facebook's issues. The social network became a major platform for disinformation during the 2016 race, and experts say that the forms of manipulation and deception have evolved since then, including the arrival of deepfakes, or videos that convincingly distort what a subject is doing or saying using artificial intelligence. A fake clip of House Speaker Nancy Pelosi, doctored to make her appear drunk, that went viral on Facebook in May drew attention to the problem.
The tech giant declined to take down the video, though it did later append a notation that it was false, drawing sharp criticism from the House speaker and others that the company had failed to address a blatant lie.
Zuckerberg acknowledged Facebook has work to do to combat such digital ills. He revealed Facebook has been "working through what our policy should be" on deepfake videos. "I think we're getting pretty close to at least rolling out the first version of it." He declined to provide additional details.
Asked if the Pelosi incident illustrated a serious gap at Facebook, Zuckerberg agreed. "If anything becomes a big issue, and we haven't already prepared for it, then that means we were too slow in preparing for it," he said. "And I think figuring out which types of deepfakes are actually a threat today, versus are a theoretical future threat once the technology advances, is one of the things that we need to make sure we get right."
But Zuckerberg stood behind Facebook's handling of political ads, which the company has long declined to fact-check. "I think we're in the right place on this," he said. "In general, in a democracy, I think that people should be able to hear for themselves what politicians are saying."
The Trump campaign ad about the Bidens made claims about their connections to Ukraine, a critical element in the congressional impeachment inquiry. Biden's campaign asked Facebook to remove the ad, describing it as false, but the social-networking site declined, pointing to a policy against fact-checking such political speech. The company's response drew widespread rebukes from Biden and other 2020 Democratic candidates, including Warren, many of whom have charged that Facebook essentially is profiting from misinformation.
Speaking at Georgetown later Thursday, Zuckerberg acknowledged the company once considered prohibiting political ads but decided against it, believing it "favours incumbents and whoever the media covers."
Facebook has faced a barrage of criticism from both sides of the aisle about what content it censors. Republicans, for example, have contended that the company censors conservative users and news websites, a charge the company has long denied.
"Often, the people who call the most for us to remove content are often the first to complain when it's their content that falls on the wrong side of a policy," Zuckerberg said. "These are very complex issues, and in general, unless it is absolutely clear what to do, I think you want to err on the side of greater expression."
For many Democrats, however, Trump's ads expose vast vulnerabilities at Facebook that malicious actors could exploit to spread misinformation targeting the 2020 presidential election, just as Russian agents used Facebook and other social-media sites to spread falsehoods, sow social and political unrest and undermine Democratic candidate Hillary Clinton in 2016.
Zuckerberg stressed that he believes Facebook is in a "much better place now" to stop such disinformation campaigns, citing the company's investments in staff and artificial intelligence. But he also cautioned that misinformation is a persistent threat: one "could never say that it's going to go away, because they're going to keep on pushing." In recent months, Facebook has reported disinformation campaigns from countries including Iran and China.
He blamed the U.S. government's lack of initial response as part of the reason the problem has worsened since the last presidential election. "Unfortunately, the U.S. did not have a particularly strong response to Russia after 2016," he continued, "so it sent the signal to other countries that they could get in on this too."
Zuckerberg's speech comes seven months after he issued his initial call for governments to adopt "rules for the Internet," and tech giants, including Facebook, to set up systems so that no one executive or company determines what is or isn't appropriate online. Facebook is actively setting up a "supreme court" of sorts so that users can appeal the company's decisions about the content it leaves up or takes down.
On Thursday, though, Zuckerberg's message served more as a warning that overreaction could stifle the very sort of speech that many regulators seek to protect.
He said that a common perception these days is that "more people having a voice is leading to division, and not bringing people together. In times of social tension we pull back on expression, and we always end up believing that it was the wrong thing to do."