• Respond in groups: You're more persuasive to the person you're arguing with if other people are arguing your side, too.
• Have a few back-and-forth exchanges with your opponent, but never go past three or four. Up to that point, your chance of persuading them is pretty good. But Tan says that "when the back-and-forth goes on for too long, your chances at persuasion become very low."
• Link to outside evidence.
• Don't quote the person you're arguing with. They'll usually interpret that as "nit-picking with their wording," Tan says, and thus what you say is unlikely to sway their opinion.
• Don't act too intense - that scares people off. Stick to calm, even-keeled language.
• Write a longer response if you're actually trying to change someone's opinion. A one-liner probably won't do it.
• Last but not least, try to base your arguments on points that your opponent didn't initially raise. For example: If your weird uncle posts that he's voting for Trump because Trump will improve the economy, argue that he shouldn't vote for Trump because of his views on Muslims. The researchers found that arguments whose "content words" differed from those of the original poster were more likely to persuade them.
There are a few caveats here, of course - I don't want to give you the false impression that you're now equipped to smite all the world's trolls. For one thing, 70 per cent of the people this study looked at were unpersuadable. For another, the links between these techniques and opinion changes are correlational, not causal.
On top of that, the social and interpersonal dynamics of opinion change are messy and complicated, beholden to more forces than these findings may suggest. I might have a better time convincing you of something if I'm a Twitter power-user, for instance, vs. some random egg with an Internet connection. (Incidentally, a 2014 paper found that Twitter is "not an idealised space" for "rational" disagreement. Who'd've guessed!)
All that said, the dynamics of opinion-change on social media are critically important, both because there's so much misinformation online and because this is increasingly the venue where people get their news. After I ended my long-running column, "What Was Fake on the Internet This Week," I got a lot of questions about the best way to correct misinformation and misguided views - including whether, in this current environment, that was even possible. This research seems to suggest it is, though not in every instance, and only with some difficulty.
"Even though the internet is mostly a self-organized bottom-up system, it is - by no means - democratic nor horizontal," Taha Yasseri, a researcher at the Oxford Internet Institute, explained by email. "There are lots of social group formation, hierarchies, and dynamic structures that can have considerable effects on how things move and evolve both online and consequently in the off-line world."
Yasseri recently completed his own study of internet disagreements: It looked at patterns of edit reversion on Wikipedia. But he can't be sure which of the reversions were changed back again and which actually "stuck," because, at some point, the edit flows become too splintered. There's so much noise in the data, Yasseri lamented, that it becomes hard to navigate.
Ironically, that sounds a whole lot like most online debates.