CHEN: Hello, Kevin! You've been down the, ahem, misinformation rabbit hole for quite some time. There's been a lot of chitchat among journalists and scholars about the mass migration from public social networks like Facebook and Twitter toward private messaging services. In general, there's concern that misinformation could become even more difficult to fight in private channels.
So come up to the surface for a moment. Can you explain what's going on?
ROOSE: I can try! So in the world of extremists and conspiracy theorists that I follow, there's been a kind of frantic mass migration from big platforms like Facebook, Twitter and YouTube, as those platforms crack down on misinformation and hate speech. A lot of the biggest figures in that world — including groups like the Proud Boys and QAnon conspiracy theorists — have moved onto more private platforms, where there's less danger of getting deplatformed.
So there's now this debate about whether it's good that all these unsavoury characters from the dregs of the internet are disappearing from big social platforms or whether it's dangerous to have them congregating in spaces where researchers, journalists and law enforcement can't keep tabs on them as easily.
Succinct enough?
CHEN: Perfect. So the migration is heading toward Signal and Telegram. The apps offer "end-to-end encryption," which is a jargony way of saying that messages are scrambled so that no one except the sender and the recipient can read them.
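To make "scrambled" a little more concrete, here's a rough sketch of the underlying idea in Python, using the open-source PyNaCl library. Consider it an illustration of public-key encryption between two people, not the far more elaborate protocol that Signal actually runs:

```python
# A toy sketch of end-to-end encryption with PyNaCl (pip install pynacl).
from nacl.public import PrivateKey, Box

# Each person generates a key pair; the private half never leaves their device.
alice_private = PrivateKey.generate()
bob_private = PrivateKey.generate()

# Alice encrypts with her private key plus Bob's public key.
sending_box = Box(alice_private, bob_private.public_key)
ciphertext = sending_box.encrypt(b"meet at noon")

# Any server relaying `ciphertext` sees only scrambled bytes.
# Only Bob, holding his private key, can unscramble the message.
receiving_box = Box(bob_private, alice_private.public_key)
print(receiving_box.decrypt(ciphertext))  # b'meet at noon'
```

The thing to notice is that whoever relays the message in the middle only ever handles the scrambled bytes; without one of the two private keys, there's nothing readable to inspect or hand over.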
The obvious benefit is that people are ensured privacy. The possible downside is that it's tougher for the companies and law enforcement to hold misinformation spreaders and criminals accountable because their messages won't be accessible.
So what's your take? Are you concerned?
ROOSE: Honestly, not really?
It's obviously not great for public safety that neo-Nazis, far-right militias and other dangerous groups are finding ways to communicate and organise and that those ways increasingly involve end-to-end encryption. We've seen this happen for years, going all the way back to ISIS, and it definitely makes things harder for law enforcement agencies and counterterrorism officials.
At the same time, there's a real benefit to getting these extremists off mainstream platforms, where they can find new sympathisers and take advantage of the broadcast mechanics of those platforms to spread their messages to millions of potential extremists.
The way I've been thinking about this is in terms of a kind of epidemiological model. If someone is sick and at risk of infecting others, you ideally want to get them out of the general population and into quarantine, even if it means putting them somewhere like a hospital, where there are a lot of other sick people.
It's a pretty bad metaphor, but you see what I mean. We know that when they're on big, mainstream platforms like Facebook, Twitter and YouTube, extremists don't just talk among themselves. They recruit. They join totally unrelated groups and try to seed conspiracy theories there. In some ways, I'd rather have 1,000 hardened neo-Nazis doing bad stuff together on an encrypted chat app than have them infiltrating 1,000 different local Dogspotting groups or whatever.
CHEN: I see where you're going with this!
When you open Facebook or Twitter, the first thing you see is your timeline, a general feed that includes posts by your friends. But you could also see posts from strangers if your friends reshared them or "liked" them.
When you open Signal or Telegram, you see a list of the conversations you're having with individuals or groups of people. To get a message from someone you don't know, that person typically needs to know your phone number to reach you.
So to complete our analogy, Facebook and Twitter are essentially billions of people packed into an enormous auditorium. Encrypted messaging apps like Signal and Telegram are like big buildings with millions of people, but each person is living inside a private room. People have to knock on one another's doors to send messages, so spreading misinformation would take more effort. In contrast, on Facebook and Twitter, a piece of misinformation can go viral in seconds because the people in this auditorium can all hear what everyone else is shouting.
ROOSE: Right. Facebook and Twitter are the big, germ-filled auditoriums, and Signal and Telegram are the college dorms. You can definitely get your roommate sick, but spreading it to your entire floor is going to require some effort.
CHEN: I confess that I am worried about Telegram. Beyond private messaging, people love to use Telegram for group chats — up to 200,000 people can meet inside a single Telegram chat room. That seems problematic.
ROOSE: I do think the crackdowns by the big platforms will make it harder for these groups to congregate out in the open. But I share your worry about the encrypted apps becoming, essentially, huge shadow social networks. These apps are designed for one-to-one messaging, but the addition of features like forwarding, combined with the big caps on maximum chat sizes, makes them vulnerable to the same kinds of one-to-many contagion effects as the big broadcast platforms.
It's interesting to note that WhatsApp has restricted message forwarding for exactly this reason. People were using it to spread misinformation to thousands of people at a time, and it was creating a ton of havoc in places like India. I'm not sure why Telegram hasn't done something similar, but it seems like something they'll have to address, along with maybe rethinking their current room size limits.
Are you worried about Signal at all?
CHEN: I'm not as worried about Signal. Similar to WhatsApp, Signal set a limit so that you can forward messages to only five people at a time. So it would be time-consuming for misinformation spreaders to make a message go viral. Also, Signal limits group chats to up to 1,000 people. That's large, but not as huge as a Telegram group chat.
I reached out to Signal and Telegram, by the way.
Moxie Marlinspike, Signal's founder, said there was minimal risk of misinformation becoming a big problem on the app, because people inside it are not exposed to algorithms like Facebook's, which surface other people's posts and stoke the spread of misinformation.
Telegram did not respond to multiple requests for comment. The company's website doesn't contain language about limits on message forwarding. This makes me nervous.
While I am concerned about Telegram in general, it's important to note that group chats there are not end-to-end encrypted. Neither are forwarded messages. So if Telegram or law enforcement authorities wanted to investigate the contents of a big group chat, they could do so, in theory. If Telegram does become the next misinformation hot spot, we won't be helpless. I may be murdering our analogy, but there will be methods for contact tracing!
ROOSE: Right. And the rest of us nonextremists can rest a little easier knowing that our feeds won't be overrun by Proud Boys and neo-Nazis, because at least Facebook and Twitter and YouTube are doing a little more filtering of the bad stuff? Maybe not a perfectly fitted N95 mask, but at least a neck gaiter.
OK, now I am officially retiring this metaphor.
CHEN: I want to end on a note of optimism, which is rare for me. Private messaging apps are a net positive. Every app and object that connects to the internet has the potential to spy on us, so we desperately need tools that keep our online conversations private. We aren't going to let the bad guys ruin this for us.
Written by: Brian X. Chen and Kevin Roose
Photographs by: Glenn Harvey
© 2021 THE NEW YORK TIMES