Facebook governs more people than any country on Earth. But who makes its rules?
It's around 9.30am at Facebook's headquarters in Menlo Park, California, and things are already getting heated. Or at least as heated as they ever seem to get in this grown-up playground of free snacks and bright posters.
The subject is nudity, and specifically female nipples: who should be allowed to show them, and who might object to them?
One employee, dialling in from Dublin, proposes making an exception to Facebook's no-nipples policy for indigenous peoples whose traditional clothing leaves their chests uncovered. But colleagues immediately raise doubts, objecting that this could single out non-white people and questioning the wisdom of Facebook deciding whose nudity counts as "traditional" and whose is just obscene.
It is in this conference room that Facebook decides what 2.3 billion people around the world can and cannot say on its service. Every two weeks, representatives from its policy, research, software design and content moderation teams meet to consider proposed changes to its speech rules.
Facebook has long insisted that these rules are the product of extensive consultation and robust internal debate, but until recently it has rarely let outsiders into that process. Now it is opening up, and The Telegraph was allowed to sit in on two such meetings.
Once, before Facebook grew so big, it was far more relaxed about policing speech. Its first rulebook was written entirely by one man, Dave Willner, and was one page long, with the basic principle that moderators should listen to their gut.
That changed within a year, and kept changing as Facebook battled ethical crises, advertiser revolts and regulatory scrutiny on its breakneck journey to governing more people than any nation on Earth. Today its public rules run to around 25 pages, but its internal moderation guidelines, at the last leak, ran to 1,400.
Any changes to those rules must come before the fortnightly meeting, which is officially called the Content Standards Forum (CSF). Facebook claims the CSF has existed in some form for about six years, but it has only recently become so formal.
Teams dial in from offices across the globe, from other conference rooms with whimsical names such as "A Stew to a Kill" and "Iain M Banks". Often it is chaired by Monika Bickert, a former Chicago prosecutor and one of Facebook's top policy decision-makers, who combines an old-fashioned faith in free speech with a pragmatic acceptance that the company can never please everyone.
Sometimes Mark Zuckerberg, Facebook's chief executive, and Sheryl Sandberg, its chief operating officer, take a personal interest - but they have been overruled before.
The range of issues that come before the CSF is kaleidoscopic. In one session attendees hear how Facebook has a procedure for designating groups as terrorist organisations, but no procedure for overturning that status. When does a terrorist become an ex-terrorist? A working group forms to study past examples.
Another session considers Facebook's "newsworthiness" exception, which allows content that would otherwise count as hate speech to stay online if it is "important to the public interest".
One employee worries that this creates perverse incentives for politicians to push the boundaries, and is promptly press-ganged into a working group.
This is one of Facebook's biggest dilemmas, which never goes away: where should the line be drawn between protecting users and censoring political debate? But it's also just another Tuesday, so the CSF moves swiftly on.
Policing such a wide range of content requires truly baroque rules, sometimes detailed to the point of absurdity. Different rules interlock and support each other like elements of a computer program, and like a computer program they sometimes have glitches.
Until 2017, Facebook censored invective against "white men" but not "black children" because of an arcane quirk in how it prioritised different elements of people's identities.
That same year, when the MeToo movement first erupted, Facebook provoked protests by censoring statements such as "men are scum" and "men are pigs" as equivalent to misogynist abuse: gender is a protected characteristic, and both genders count.
At Zuckerberg's direction that policy was debated at the CSF, but in the end it remained in place.
Any new rules must also be enforceable by an army of 15,000 moderators working in more than 40 languages. Many of them are contractors rather than employees, and there have been persistent concerns about their welfare, their pay and the trauma they can suffer from trawling through the internet's worst effluents.
A proposal to maintain lists of banned curse words in every country in which Facebook operates, so that expletive-happy Glaswegians and Australians are not subject to American puritanism, is rejected in part because it would be too operationally complex.
The CSF opts instead to ban all female-gendered curse words everywhere, starting in May. That recommendation passes without dissent: there is no voting at the CSF.
Others are more contentious. For years, Facebook has banned images of nipples belonging to anyone who identifies as female (a transgender man who retains his breasts would be exhibiting male nudity; discussion is ongoing as to how the rule should apply to people who identify as neither gender).
After criticism from breastfeeding mothers and breast cancer survivors, including a campaign to "free the nipple", it carved out exceptions for pregnancy, breastfeeding, protests and medical matters. But it is increasingly having to make case-by-case allowances for "cultural" nudity, such as the kind displayed by indigenous Brazilians.
Experts, the nudity working group explains, "overwhelmingly" agree that Facebook should "allow more nudity on the platform". The most radical option is to ban female nipples only if the image is "sexually suggestive", and let all others go free, albeit with an "age gate" to hide them from young users.
This would be a new world, with men and women treated equally and no more awkward carve-outs and exceptions.
It would also risk flooding Facebook with borderline pornography, and the age gate might be taken to imply a negative judgment.
So instead, the working group proposes formalising the exception for "cultural/indigenous nudity".
In parts of Nairobi, they explain, female toplessness is not even considered nudity, to the degree that people were confused by the idea of making an "exception" at all.
Quickly this proposal meets respectful but vigorous objections. An Africa expert notes that all the examples feature brown people, and says that plenty of white people have "cultural" nudity too. Will this policy cover Norwegian skinny-dippers, topless sunbathers in Florida or celebrants at Mardi Gras? If not, why not?
A senior policy person in DC says that this line will be hard for Facebook to defend in public, and fears the exception may become a loophole for malicious users to spread non-consensual pictures of indigenous people.
The original speaker defends the policy, but there is no consensus. It's back to the drawing board.
If these discussions sometimes seem surreal in their exactitude, there is a reason. For years Facebook has been accused of taking decisions in secret without properly thinking them through. Now it is inviting reporters to watch them being made and publishing regular minutes.
The CSF, once obscure, is part of a new push for transparency, alongside a planned "supreme court" to hear moderation appeals. That push is limited, because the CSF is still technically a closed meeting and journalists attend at Facebook's discretion. But then its members are not actually elected politicians or officials.
They are private employees of a private company that just happens to be acting like a government.