Facebook has hit back at the creators of a new Netflix documentary-drama about the negative impacts of social media, accusing the producers of The Social Dilemma of burying "the substance in sensationalism".
The movie, released on Netflix last month, argues that tech and social media platforms have been deliberately designed to addict us and profit from our attention, and digs into the algorithms used to drive the content we see.
There was an immediate reaction, with some viewers saying it was enough to make them delete their social media accounts and throw away their smartphones.
Now Facebook has hit back in a document headlined "What 'The Social Dilemma' Gets Wrong".
"Rather than offer a nuanced look at technology, it gives a distorted view of how social media platforms work to create a convenient scapegoat for what are difficult and complex societal problems," the social media giant said.
Facebook argues the film's creators failed to include insights from people currently at the companies or from experts whose views dissent from the film's narrative.
It said the film also did not acknowledge all the hard work Facebook and others are doing to address the issues, instead relying on sources "who haven't been on the inside for many years". Several people who appear in the film actually quit their jobs over ethical quandaries posed by working at those companies.
Facebook said it does not deliberately design its platform to be addictive, and rejected claims that its recommendation algorithm drives people toward content that provokes outrage.
"Facebook's algorithm is not 'mad.' It keeps the platform relevant and useful," the company argued.
The company also said it conducts its own research and funds independent academics to do the same, to "better understand how our products might contribute to polarisation so that we can continue to manage this responsibly".
In May, the Wall Street Journal revealed high-level executives at Facebook knew its algorithms "exploit the human brain's attraction to divisiveness" thanks to internal research the company commissioned in 2018, but had done little to nothing about it since.
In the dramatisation woven through The Social Dilemma, a teenage boy becomes addicted to "extreme centrist" content fed to him by algorithms, culminating in him and his wise older sister being arrested at a fiery political rally.
While the film presents his transition to "extreme centrism" as the result of constant stimuli fed to him on the internet, Facebook argues that "the overwhelming majority" of content on its platform "is not polarising or even political".
"This content is a tiny percentage of what most people see on Facebook," the company argued.
"We've acknowledged that we made mistakes in 2016 (in the lead-up to the US election). Yet the film leaves out what we have done since 2016 to build strong defences to stop people from using Facebook to interfere in elections."
Facebook also rejected as "wrong" "the idea that we allow misinformation to fester on our platform, or that somehow we benefit from this content".
The company said it has fact checkers who look for false and misleading content.
It also said misinformation that could lead to imminent violence, physical harm or voter suppression is removed outright and not just given less priority in people's news feeds.
In August, Facebook boss Mark Zuckerberg revealed why the page for militia group Kenosha Guard - reported more than 400 times in the space of a day over fears it could lead to imminent violence - wasn't removed until after two people had died in Kenosha.
"The contractors and the reviewers who the initial complaints were funnelled to basically didn't pick this up," Zuckerberg said, calling it an "operational mistake".
Facebook said that "despite what the film says… we don't want hate speech on our platform and work to remove it".
"We know our systems aren't perfect and there are things that we miss. But we are not idly standing by and allowing misinformation or hate speech to spread on Facebook."