Just days after being removed from Facebook for bestiality posts, gender-exclusive group Bad Girls Advice resurfaced under a new name.
It had been less than 24 hours since the closure of the group when founder Amy Louise boasted it would return to the social media platform.
The freshly created group, known as B.G. Army, already had 31,000 members, with admins promising to replicate the 200,000-plus membership of the original Bad Girls Advice page.
As many former members hope to be added to the new hidden group, admins have reportedly been more stringent with their screening process to ensure they are not shut down again.
One anonymous source told news.com.au she had to provide proof she was part of the old BGA and add one of the admins as a friend before she could be accepted into B.G. Army.
But it appeared this wasn't necessary, with the group founder claiming the original Bad Girls Advice has since been restored by Facebook.
The group has, however, been renamed B.G. Army OG.
When asked about its decision to backflip on the removal of the group, Facebook provided the following statement:
"All content shared on Facebook must comply with our Community Standards, including in Groups, and we will remove any content reported to us that does not comply with these standards," a spokesman told news.com.au.
"Our community operations team reviews millions of pieces of content per week and from time-to-time we make mistakes. In that instance, we restore the content and apologise to the people impacted by our error."
The apology came despite the group's most recent post condoning bestiality and previous posts on the same topic, plus evidence of members sharing unsolicited naked photos, condoning violence against men, and publishing memes mocking the Manchester bombing attack.
"Facebook has no interest in trying to police the platform or see content removed because the more active the group, the more people seeing ads," he told news.com.au.
Even though Facebook has made progress in controlling content in recent years, removing the group could be viewed as running counter to its lucrative business model.
Associate Professor Marjorie Kibby shared similar sentiments regarding Facebook putting business ahead of providing a safe online community.
"Facebook's community standards were always a set of guidelines that it was hoped would be followed to provide content that was acceptable for the majority of its users," she said.
"The guidelines worked while Facebook users were a fairly cohesive group, but the current diversity of users, their motivations and standards have seen an increasing demand for it to apply these guidelines as rules and to police contributions for adherence to the rules.
"It is fairly obvious though that they do not have the mechanisms in place to undertake this role, and that the role is not necessarily part of their core business - which is providing users with the content they want, so more of them will visit, more often and stay longer."
Prof. Leaver said Facebook only polices enough content to avoid too much public backlash, and he expected the platform to keep turning a blind eye to the hidden group.
"In a group where most of the members are happy to engage with the infringing content, it's going to take longer for Facebook to take action because it will only come to their attention when someone flags it and even then, it will only focus on the infringing post," he said.
"If a line is constantly being crossed, Facebook could call out the admin publicly or block the personal account, but I don't believe it would."
Prof. Kibby said the chances of finding a solution to the problem were slim to none.
"A person has to report a violation of the guidelines, and other people then determine whether the content does or does not meet 'community standards', and even what community standards are. With billions of bits of content added daily, this is always going to be a losing game," she said.