Step 2: That conspiracy is voluntarily shared and propagated by individuals who agree with the narrative - largely within the first two hours, but again at the 20-hour mark.
Step 3: The conspiracy gradually branches throughout the network over a period of days, its speed slowing but its audience growing continuously. Within two weeks or so, the theory has been adopted by large portions of the community - and once adopted, it is "highly resistant to correction".
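To make that timeline concrete, here is a minimal branching-process sketch in Python. It is not the model from the paper, and none of its numbers come from the study: the reproduction number, the lognormal delay distribution and the two-hour/20-hour/two-week checkpoints are assumptions chosen only to mimic the shape described above - a burst of early sharing followed by a long, slow tail.

```python
import math
import random

# A purely illustrative branching-process sketch of the spreading pattern
# described above. This is NOT the study's model: the reproduction number,
# the delay distribution, and the checkpoints are assumed values chosen to
# mimic "a burst of early sharing, then a long slow tail".

random.seed(1)

R = 0.95            # assumed average number of reshares each share triggers
HORIZON = 24 * 14   # follow the cascade for two weeks (in hours)

def poisson(lam):
    """Draw a Poisson-distributed count (Knuth's method, fine for small lam)."""
    threshold, k, p = math.exp(-lam), 0, 1.0
    while True:
        p *= random.random()
        if p <= threshold:
            return k
        k += 1

def simulate_cascade():
    """Return the sorted share times (in hours) of one toy cascade."""
    share_times = [0.0]     # the original post at t = 0
    frontier = [0.0]
    while frontier:
        t = frontier.pop()
        # Each share spawns a random number of further shares, each arriving
        # after a heavy-tailed delay: most within hours, a few days later.
        for _ in range(poisson(R)):
            child = t + random.lognormvariate(1.0, 1.5)
            if child <= HORIZON:
                share_times.append(child)
                frontier.append(child)
    return sorted(share_times)

times = simulate_cascade()
for label, cutoff in [("2 hours", 2), ("20 hours", 20), ("two weeks", HORIZON)]:
    print(f"shares within {label}: {sum(t <= cutoff for t in times)}")
```

With these toy parameters, most reshares land within the first day or so while a thin tail keeps trickling in for days - the qualitative shape, not the actual data, of the cascades the researchers describe.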
In fact, as this group of researchers has found before, attempts to correct conspiracy theories often have the opposite effect: They make conspiracists cling to their beliefs all the more tightly.
And while this particular study looked specifically at conspiracy theories, its findings also apply to other kinds of misinformation: fake news, hoaxes, that sort of thing.
In an email to the Washington Post, Walter Quattrociocchi - the head of the Laboratory of Computational Social Science at IMT Lucca and a co-author of the paper - said that on Facebook, "attempts to correct information (not only conspiracy theories) end up producing contents that are used only in the echo chamber that produced the content" - that is, the debunking - to begin with.
There are two very interesting things going on here.
First off, these theories are not circulating willy-nilly around Facebook as a whole. They spread within specific, defined, ideologically homogeneous communities, or echo chambers, which might not be visible to the naked eye - but may as well be walled off.
As the researchers put it: "Users tend to aggregate in communities of interest, which causes reinforcement and fosters confirmation bias, segregation, and polarisation. This comes at the expense of the quality of the information and leads to proliferation of biased narratives fomented by unsubstantiated rumours, mistrust, and paranoia."
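The dynamic in that passage - engage only with the like-minded, and the group drifts apart into camps - can be illustrated with a standard toy model from the opinion-dynamics literature. The bounded-confidence sketch below, in Python, is not taken from the paper; the tolerance, the number of agents and the interaction rule are all assumptions, chosen only to show how selective attention by itself produces segregation and polarisation.

```python
import random

# A minimal bounded-confidence (Deffuant-style) opinion model - an
# illustrative stand-in, not the researchers' method. It shows how the
# mechanism described above (only engaging with views close to your own)
# is enough to split a population into separate, internally homogeneous
# camps. The tolerance and step counts below are assumed values.

random.seed(1)

N = 200              # number of agents
TOLERANCE = 0.2      # only "listen" to opinions within this distance
PULL = 0.5           # how far two agreeing agents move toward each other
STEPS = 50_000

opinions = [random.random() for _ in range(N)]   # opinions on a 0-to-1 scale

for _ in range(STEPS):
    i, j = random.sample(range(N), 2)
    gap = opinions[j] - opinions[i]
    # Confirmation bias in miniature: disagreements beyond the tolerance
    # are simply ignored, so distant views never influence each other.
    if abs(gap) < TOLERANCE:
        opinions[i] += PULL * gap
        opinions[j] -= PULL * gap

# Count the surviving opinion clusters (neighbouring opinions within 0.05).
clusters = []
for x in sorted(opinions):
    if clusters and x - clusters[-1][-1] < 0.05:
        clusters[-1].append(x)
    else:
        clusters.append([x])

print(f"{len(clusters)} clusters, sizes: {[len(c) for c in clusters]}")
```

With a tolerance this narrow, the initially uniform spread of opinions typically collapses into two or three non-interacting clusters - segregation and polarisation emerging from nothing more than who is willing to listen to whom.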
Second, because users create these walled communities themselves - choosing to read only news that agrees with their biases, or unfriending people who challenge their socio-political views - there's not much Facebook can do to remedy the situation.
Facebook itself came to that conclusion last year, when the company's data scientists found that echo chambers were born less of algorithmic bias than of our own intolerance and illiberality.
How do you solve a problem like human nature, though?
It would be more comforting, frankly, if there were a technological solution at hand: some algorithmic measure of truth, perhaps, or some way to tag hoaxes.
The researchers conclude that those options probably won't work, but that doesn't mean they've given up.
Their next steps will involve studying messages that improbably make it over the walls of different echo chambers, the better to determine how social and cognitive biases can be overcome on a network-wide scale.
For the curious, here's a first step: If you have conspiracy theorists, fabulists or extreme ideologues in your social network, you actually shouldn't unfriend them.