Anti-vaccine protesters gather in front of the Pfizer headquarters in Paris, France. Photo / AP
Troubled by the number of unvaccinated Covid-19 patients showing up at his hospital, the French doctor logged on to Facebook and uploaded a video urging people to get vaccinated.
He was soon swarmed by dozens, then hundreds, then more than 1,000 hateful messages from an anti-vaccine extremist group known as V_V. The group, active in France and Italy, has harassed doctors and public health officials, vandalised government offices and tried to disrupt vaccine clinics.
Alarmed by the abuse of its platform, Facebook kicked off several accounts tied to the group last December. But it didn't stop V_V, which continues to use Facebook and other platforms and, like many anti-vaccine groups around the world, has expanded its portfolio to include climate change denialism and anti-democratic messaging.
"Let's go and get them at home, they don't have to sleep anymore," reads one post from the group. "Fight with us!" reads another.
The largely unchecked attacks on the well-documented health benefits of the vaccine highlight the limits of a social media company's ability to thwart even the most destructive kind of disinformation, particularly without a sustained, aggressive effort.
Researchers at Reset, a UK-based nonprofit, identified more than 15,000 abusive or misinformation-laden Facebook posts from V_V — activity that peaked around April 2022, months after the platform announced its actions against the organisation. In a report on V_V's activities, Reset's researchers concluded that its continued presence on Facebook raises "questions about the effectiveness and consistency of Meta's self-reported intervention".
Meta, Facebook's parent company, noted in response that its 2021 actions were never meant to eliminate all V_V content, but to take down accounts found to be participating in coordinated harassment. After The Associated Press notified Facebook of the group's continued activities on its platform, it said it removed an additional 100 accounts this week.
Meta said it's trying to balance removing content from groups like V_V that clearly violates its rules against harassment or dangerous misinformation with the risk of silencing innocent users. That can be particularly difficult when it comes to the contentious issue of vaccines.
"This is a highly adversarial space and our efforts are ongoing: since our initial takedown, we've taken numerous actions against this network's attempts to come back," a Meta spokesman told the AP.
V_V is also active on Twitter, where Reset researchers found hundreds of accounts and thousands of posts from the group. Many of the accounts were created shortly after Facebook took action against the group last winter, Reset found.
In response to Reset's report, Twitter said it took enforcement actions against several accounts linked to V_V but did not detail those actions.
V_V has proved especially resilient to efforts to stop it. Named for the movie "V for Vendetta," in which a lone, masked man seeks revenge on an authoritarian government, the group uses fake accounts to evade detection, and often coordinates its messaging and activities on platforms such as Telegram that lack Facebook's more aggressive moderation policies.
That adaptability is one reason why it's been hard to stop the group, according to Jack Stubbs, a researcher at Graphika, a data analysis firm that has tracked V_V's activities.
"They understand how the internet works," Stubbs said.
Graphika estimated the group's membership at 20,000 in late 2021, with a smaller core of members involved in its online harassment efforts. In addition to Italy and France, Graphika's team found evidence that V_V is trying to create chapters in Spain, the United Kingdom, Ireland, Brazil and Germany, where a similar anti-government movement known as Querdenken is active.
Groups and movements such as V_V and Querdenken have increasingly alarmed law enforcement and extremism researchers who say there's evidence that far-right groups are using scepticism about Covid-19 and vaccines to expand their reach.
Increasingly, such groups are moving from online harassment to real-world action.
For instance, in April, V_V used Telegram to announce plans to pay a €10,000 bounty to vandals who spray-painted the group's symbol (two red Vs in a circle) on public buildings or vaccine clinics. The group then used Telegram to disseminate photos of the vandalism.
A month before Facebook took action against V_V, Italian police raided the homes of 17 anti-vaccine activists who had used Telegram to make threats against government, medical and media figures for their perceived support of Covid-19 restrictions.
Social media companies have struggled to respond to a wave of misinformation about vaccines since the beginning of the Covid-19 pandemic. Earlier this week, Facebook and Instagram suspended Children's Health Defence, an influential anti-vaccine organisation led by Robert F. Kennedy Jr.
One reason is the tricky balancing act between moderating harmful content and protecting free expression, according to Joshua Tucker of New York University, who co-directs NYU's Centre for Social Media and Politics and is a senior advisor at Kroll, a tech, government and economic consulting firm.
Striking the right balance is especially important because social media has emerged as a key source of news and information around the world. Leave up too much bad content and users may be misinformed. Take down too much and users will begin to distrust the platform.
"It is dangerous for society for us to be moving in a direction in which nobody feels they can trust information," Tucker said.