On Monday last week, Facebook reversed its initial removal and re-allowed the video, characterising the site in a statement to CNET as: "A place where people turn to share their experiences, particularly when they're connected to controversial events on the ground, such as human rights abuses, acts of terrorism and other violent events."
The company saw the video as clearly condemning the violent acts depicted and therefore deemed it permissible.
After further outcries and complaints, Facebook backpedalled the following day and took the video down again.
In an update on graphic content Facebook stated: "We have re-examined recent reports of graphic content and have concluded that this content improperly and irresponsibly glorifies violence.
"For this reason, we have removed it."
The incident highlights several dilemmas Facebook faces.
Dilemma 1: Freedom of expression or duty to protect?
How does Facebook navigate between its desired position as a platform for people to express themselves freely and its perceived responsibility to protect users from inappropriate, disturbing or offensive material?
Providing opportunities for people to consider and debate ideas in the public sphere is an important part of deliberative democracy, yet not everyone will always like what is said.
It seems naive of Facebook to assume that all "people share videos of these events on Facebook to condemn them".
There will always be those who thrive on, endorse and promote such material.
Then again, is this perhaps part of deliberative democracy?
And is it important to allow controversies like this to unfold to engage publics in such deliberative processes?
Dilemma 2: Who's to judge?
Dilemma 1 leads to dilemma 2: who gets to judge what constitutes inappropriate, disturbing or offensive material? Facebook's terms and conditions also prohibit content that contains nudity.
This led to controversy earlier this year, as Facebook-using mothers were outraged when they found that the site censored images of breast-feeding babies that exposed the mothers' nipples.
The issue resurfaced when Facebook decided to allow the violent beheading video last week, causing people to question Facebook's ways of deciding what content is in or out.
Dilemma 3: What is Facebook's purpose?
Is Facebook a news medium? We know it's a private company with commercial interests - but is it perceived more as a public, democratic, neutral platform?
As people increasingly use social media as news sources, publishing contentious material on these sites and opening it up for public debate may be in the public interest.
But then, should all content be allowed equally?
Newspapers have editors and reporters who make decisions about what makes the news. And, like private news corporations, Facebook has commercial interests in keeping users happy and thus loyal to the service.
Programmer and activist Aaron Swartz commented on the power private companies have over people's freedoms and the dangers of this "corporate tyranny", as private companies have not been elected and are not subject to a constitution.
Of course, this brings us back to dilemma 1: what are the roles and responsibilities of such a private company to the public?
Censorship and the internet remain ongoing points of friction. Freedom of speech, the roles of the media, government, corporations and the public, and the relations between them, have been contested for a long time and will likely never be resolved to everyone's satisfaction.
Facebook is clearly trying to do the right thing by its users and, along with everyone else, is trying to figure out how best to manage the explosion of content and the access to and spread of information on the internet.
While Facebook should not be granted the authority to act as arbiter of what is right and wrong, appropriate or inappropriate, censored or allowed, it is a service provider accountable for putting rules and regulations in place to guide and protect its users.
Importantly, Facebook's commercial interests need to be closely watched and prevented from ever compromising what is in the best public interest.
Theresa Sauter is a research associate in the ARC Centre of Excellence for Creative Industries and Innovation at Queensland University of Technology.
theconversation.edu.au