In the case of anti-vaccination messaging, social media algorithms reward a practice known as "outbidding", in which misinformation gains traction because, on the unlikely chance it is true, the consequences would be horrific. "More extreme propaganda of negative effects is incentivised, thus leading to a spiral of threat matched by public fear," wrote Wilson and Wiysonge.
That "incentivising" has also been exposed in disclosures made to the US Securities and Exchange Commission this week by the legal counsel of former Facebook employee turned whistleblower Frances Haugen.
The Associated Press reports the documents reveal that, in the midst of the Covid-19 pandemic, Facebook carefully investigated how its platforms spread misinformation about life-saving vaccines. They also reveal Facebook employees regularly suggested solutions for countering anti-vaccine content on the site, to no avail.
Facebook works by ranking posts by engagement — the number of likes, dislikes, comments, and reshares. Facebook's own documents disclosed to Congress show how engagement-based ranking emphasises polarisation, disagreement, and doubt on issues such as vaccines.
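The mechanism those documents describe can be illustrated with a toy sketch. The weights, field names, and numbers below are entirely hypothetical, not Facebook's actual formula; the point is only that a score built from raw reaction counts cannot distinguish approval from outrage:

```python
# Hypothetical sketch of engagement-based ranking. Weights and field
# names are illustrative assumptions, not any platform's real formula.
from dataclasses import dataclass

@dataclass
class Post:
    text: str
    likes: int = 0
    comments: int = 0
    reshares: int = 0

def engagement_score(post: Post) -> int:
    # Every reaction counts toward the score, whether it signals
    # agreement or outrage; the metric cannot tell them apart.
    return post.likes + 2 * post.comments + 3 * post.reshares

def rank_feed(posts: list[Post]) -> list[Post]:
    # Highest engagement first: divisive posts that provoke comments
    # and reshares rise to the top of the feed.
    return sorted(posts, key=engagement_score, reverse=True)

feed = rank_feed([
    Post("Measured public-health update", likes=120, comments=10, reshares=5),
    Post("Outrageous vaccine scare claim", likes=80, comments=90, reshares=60),
])
# The scare claim scores 80 + 180 + 180 = 440, beating 120 + 20 + 15 = 155,
# so it is shown first.
```

Under this kind of scoring, a post that angers people into commenting and resharing outranks one that merely informs, which is the dynamic the internal documents flagged.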
Effectively, the more horrid and outrageous the lie, the more popular it becomes and the more it is recommended to others. Social media has been a godsend to family and friends separated by Covid-induced closures, but the flipside is that its built-in mechanics promote the biggest and most terrible porkies.
In a statement, Facebook spokeswoman Dani Lever has said the internal documents "don't represent the considerable progress we have made since that time in promoting reliable information about Covid-19 and expanding our policies to remove more harmful Covid and vaccine misinformation".
However, the physical and emotional curses have already escaped Pandora's box and continue to circulate malevolently.
Facebook CEO Mark Zuckerberg announced on March 15 that the company would label posts about vaccines to point out they are safe. That move has only allowed Facebook to continue chasing high engagement and ultimately profit from anti-vaccine comments, says Imran Ahmed of the Centre for Countering Digital Hate.
The US Senate is considering legislation that would require social media platforms to give users the option of turning off the algorithms that organise individual newsfeeds. This might stem the tide of misinformation for those aware enough to use the setting. But they are not the ones so willingly drinking the Kool-Aid.
"Facebook has taken decisions which have led to people receiving misinformation which caused them to die," Ahmed said. "At this point, there should be a murder investigation."
We can only hope it doesn't come to that here.