Haugen said that Facebook stretched itself too thin to effectively confront harms like ethnic violence and human trafficking that had been tied to activity on its apps. She dissected the ways that Facebook's fixation on getting us to spend more time online aggravated our worst impulses. And she hammered the message that the public should not be kept in the dark about what Facebook knew about its influence on us and our world.
The picture that emerged from recent Wall Street Journal reporting and Haugen's media interviews was not of Facebook as a cartoonish James Bond villain. It was of a company that cannot control the machines that it built but refuses to accept that reality.
"Facebook is stuck in a feedback loop that they can't get out of," Haugen told senators.
Some of what Haugen and Facebook critics have said about the company is probably overstated. And a lot of what Haugen said was not new. But she is a laser-focused messenger at a time when people in power are ready to stop bickering and ask: What now? What should be done to maximize the good of Facebook and minimize the harm?
There are no magic fixes, but Haugen and many others have offered sound suggestions on what to try.
The most compelling idea from Haugen was that "engagement-based ranking" is an original sin of Facebook, YouTube, TikTok, Pinterest and other popular apps. When computers prioritize what we see online based on what is likely to captivate us and keep us around longer, they tend to fan the most salacious or extreme views and subtly nudge people to post more of the same.
Haugen suggested, essentially, turning off the computer algorithms and making more of the internet gravitate toward designs like those of iMessage or past versions of Facebook and Instagram that showed posts in chronological order.
Kate Klonick, who has researched policies on online expression at internet companies, wrote in The New York Times that Facebook could redesign its websites to optimize holistic measures of the good things that it offers. Rather than focusing on metrics such as which posts are likely to get a ton of shares or likes, it could look at what is likely to lead you to attend a protest or give to a charitable cause.
Haugen and others have recommended changing U.S. law to hold Facebook responsible for real-world harms, including terrorist acts, resulting from posts that the company's computer systems distributed to people's feeds.
In a recent interview, Haugen also mentioned the idea of public representatives to oversee Facebook from the inside, similar to Federal Reserve examiners for large banks. She also backed the idea of regulations to force Facebook to work with researchers who want to study the company's effects on users.
And Haugen suggested that many of Facebook's worst moments, including its social network being used to fan ethnic violence, may be the result of having too few people to manage its ambitions. Should Facebook be forced to do less, like quitting countries unless the company devotes more resources to them and establishes cultural competence?
There are plenty of reasons to feel pessimistic. Facebook essentially told Congress, "You tell us what to do." Yet U.S. lawmakers and regulators have done little to tell Facebook how to better govern apps used by billions of people.
Facebook has said, correctly, that it strives to continually improve its apps and that doing so is a tricky exercise in trade-offs. Mark Zuckerberg on Tuesday rejected the (oversimplified) notion that his company chooses profits over people's lives and well-being and that the company ignores ideas for improvement.
Maybe none of the ideas tossed around to fix Facebook will be better than the status quo. But what felt fresh from Haugen was a message of hope: We need the best of Facebook, and we must work together to make it better.
This article originally appeared in The New York Times.
Written by: Shira Ovide
Photographs by: Burton Booz
© 2021 THE NEW YORK TIMES