Facebook Security

In May, Facebook introduced a temporary ban on a particularly graphic video showing the beheading of a woman in Mexico. The ban was implemented in response to complaints from users and from the Family Online Safety Institute, a member of Facebook’s Safety Advisory Board.

In October the ban was lifted and the video again became accessible on Facebook under the title: “Challenge: Anybody can watch this video?”

Stephen Balkam, the head of the Family Online Safety Institute, told the BBC: “I’m very unhappy that these have gone back up and that they have gone up without any warning.”

David Cameron tweeted: “It’s irresponsible of Facebook to post beheading videos, especially without a warning. They must explain their actions to worried parents.” Facebook was faced with mounting pressure from charities, Facebook users and advertisers.

A spokeswoman from Facebook set out its position:

Facebook has long been a place where people turn to share their experiences, particularly when they’re connected to controversial events on the ground, such as human rights abuses, acts of terrorism and other violent events. People are sharing this video on Facebook to condemn it. If the video were being celebrated, or the actions in it encouraged, our approach would be different.

This is a position that is somewhat disingenuous, and certainly not the whole picture.

Indeed, Facebook’s justification was not enough to stem the growing complaints and criticism and on 22 October 2013 it removed the video. David Cameron responded, again by tweet, saying: “I’m pleased Facebook has changed its approach on beheading videos. The test is now to ensure their policy is robust in protecting children.”

But what is Facebook’s policy on censoring content uploaded onto its site? Newspapers have been quick to pick up on the seemingly double standards and warped moral priorities of a site which allows videos of graphic violence and death but bans images of a mother breastfeeding in which her nipple is visible. Facebook’s ‘Community Standards’ page sets out that “graphic images shared for sadistic effect or to celebrate or glorify violence have no place on our site.” But if the motivation behind a post is the deciding factor, that motivation may not always be clear. And even if we are to believe that Facebook is sincere about judging a video by the purpose for which it was uploaded, this does not lessen the harm done to viewers.

Those who argued that the video should not be removed say that if we are to discuss atrocities which are beyond the scope of our imagination, then we should first see them. This has to be a weak argument given that 13-year-olds can join Facebook and there is no real bar to younger users. Others say that publishing such videos exposes the wicked. Whilst a superficially attractive argument, it has little to commend it.

Far more important, especially in relation to Facebook, is the psychological damage such videos can do. The video shows a complete disregard for the dignity of the deceased and her family, and risks winning notoriety for the killers. More sinister still, it reveals our gory fascination with, and desire to be horrified by, death and violence.

Facebook claims to hold a libertarian stance, allowing its users to post what they like on the grounds that it is a facilitator, rather like a notice board or a pub landlord, and is not responsible for moderating content; it will only intervene when a user violates its terms and conditions. Facebook has been forced to back down from this position. Whilst the video, and others like it, can still be viewed on other websites, Facebook has said in a blog post that it is “strengthening the enforcement of our policies”. It is wishful thinking to suggest that this might mark a turning point for Facebook in taking greater responsibility for the content on its site, but it does show that Facebook and sites like it are being forced to take greater care in their attitude to the content posted on them. It is perhaps telling that Google+, Facebook’s rival social networking site operated by Google, has adopted a far stronger position: “Do not distribute depictions of graphic or gratuitous violence.”

Rhory Robertson is a Partner and Sophie Pugh a Trainee Solicitor working in the Collyer Bristow Cyber Investigations Unit.