The volunteer group provided expertise and guidance on how Twitter could better combat hate, harassment and other harms but didn’t have any decision-making authority and didn’t review specific content disputes. Shortly after buying Twitter for US$44 billion in late October, Musk said he would form a new “content moderation council” to help make major decisions but later changed his mind.
“Twitter’s Trust and Safety Council was a group of volunteers who over many years gave up their time when consulted by Twitter staff to offer advice on a wide range of online harms and safety issues,” tweeted council member Alex Holmes. “At no point was it a governing body or decision making.”
Twitter, which is based in San Francisco, had confirmed Thursday’s meeting with the council in an email promising an “open conversation and Q&A” with Twitter staff, including the new head of trust and safety, Ella Irwin.
That came the same day three council members announced their resignation in a public statement posted on Twitter, which said that “contrary to claims by Elon Musk, the safety and wellbeing of Twitter’s users are on the decline.”
Those former council members soon became the target of online attacks after Musk amplified criticism of them and Twitter’s past leadership for allegedly not doing enough to stop child sexual exploitation on the platform.
“It is a crime that they refused to take action on child exploitation for years!” Musk tweeted.
The growing attacks on the council alarmed some remaining members, who sent an email to Twitter earlier Monday demanding that the company stop misrepresenting the council’s role.
Those false accusations by Twitter leaders were “endangering current and former Council members,” the email said.
The Trust and Safety Council did, in fact, include an advisory group focused on child exploitation, whose members included the National Center for Missing & Exploited Children, the Rati Foundation and YAKIN, or Youth Adult Survivors & Kin in Need.
Former Twitter employee Patricia Cartes, who was responsible for forming the council in 2016, said Monday that its dissolution “means there’s no more checks and balances”. Cartes said the company sought to give the council a global outlook, with experts from around the world who could relay concerns about how new Twitter policies or products might affect their communities.
She contrasted that with Musk’s current practice of surveying his Twitter followers before making a policy change affecting how content gets moderated. “He doesn’t really care as much about what experts think,” she said.