Censoring the terrors of war

Who decides whether or not an image is fit for human consumption, and on what grounds?

Ian Mahoney
19 September 2016
Credit: Flickr/Tommy Japan. Creative Commons 2.0. Some rights reserved.

On 8 September 2016, Aftenposten, Norway's largest newspaper, published a front-page rebuttal, written by its editor-in-chief and CEO Espen Egil Hansen, of Facebook's demand that it remove an image from the paper's Facebook page. The rebuttal responded to the censorship of Nick Ut's Pulitzer Prize-winning photograph 'The Terror of War,' which depicts children fleeing a napalm strike in Vietnam and was posted on Facebook as part of Norwegian author Tom Egeland's entry about seven images that changed the history of warfare.

Hansen was right to draw attention to the fact that “The napalm-girl is by far the most iconic documentary photography from the Vietnam war. The media played a decisive role in reporting different stories about the war than the men in charge wanted them to publish. They brought about a change of attitude which played a role in ending the war. They contributed to a more open, more critical debate. This is how a democracy must function.”

The reason given for the censorship was that nine-year-old 'napalm girl' Kim Phuc was naked, and Facebook stated that "Any photographs of people displaying fully nude genitalia or buttocks, or fully nude female breast, will be removed." The fact that the image clearly depicts the pain and suffering of a young girl severely burned by the indiscriminate use of the weapons of war appears to have been glossed over.

Apparently there was no understanding or consideration of the role and importance of this image, or of the message which Ut and subsequently Egeland sought to convey: only a generic, blanket response. Facebook later bowed to pressure from media outlets and politicians, stating that it had "listened to the community" and acknowledging "the global importance of the photograph." By then, however, significant questions had been raised regarding the corporation's role as an increasingly influential gatekeeper and censor.

As Hansen identified, Facebook occupies an incredibly powerful position when it comes to the dissemination of information and news, from the way its 'trending' algorithm determines what does or does not show up at any point in time, to the direct censorship of imagery. However, it is also apparent that Facebook's policies and practices are applied inconsistently, and they undermine the 'responsible guardian' status it seeks to convey.

In contrast to this latest row over censorship, in November 2015 I used Facebook's reporting tool to flag up a video. The clip in question was hosted on an external site called "menscomedy.com" and was entitled "Isis Terrorist Gets Blown Up by French Missile As He Records A Video." Thankfully, both the clip and the site have since been taken down.

Sure enough, the footage appeared to show someone dressed in black, speaking in Arabic and recording a video, when he suddenly disappeared in a fireball from a missile. I lodged my report on the grounds that the video contained, and glorified, graphic violence, which I assumed would be apparent from the text and contents of the link. Instead I received the following response from Facebook:

“Thank you for taking the time to report something that you feel may violate our Community Standards. Reports like yours are an important part of making Facebook a safe and welcoming environment. We reviewed the post you reported for containing graphic violence and found it doesn't violate our Community Standards.”

As a result of this exchange, I decided to investigate these ‘Community Standards’ and what they say in relation to the sharing of violent and graphic content. Here’s what I found:

“Facebook has long been a place where people share their experiences and raise awareness about important issues. Sometimes, those experiences and issues involve violence and graphic images of public interest or concern, such as human rights abuses or acts of terrorism. In many instances, when people share this type of content, they are condemning it or raising awareness about it. We remove graphic images when they are shared for sadistic pleasure or to celebrate or glorify violence. 

When people share anything on Facebook, we expect that they will share it responsibly, including carefully choosing who will see that content. We also ask that people warn their audience about what they are about to see if it includes graphic violence.”

Herein lies one of the greatest contradictions in Facebook's censorship policies. On the one hand, the company has actively sought the removal of some of the world's most important documentary evidence of the ravages of war and of the profound impact war can have, not just on those serving on the front lines but also on the people, families and communities torn apart by years of conflict. Facebook removed the photograph of Kim Phuc not on the grounds of the trauma it depicts or the graphic nature of the experiences of war it shows, but because of fears of child sexual abuse and exploitation. Such abuse should never be trivialised, but the socio-political context in which an image is created must always be understood.

On the other hand, despite claiming to actively remove images which "are shared for sadistic pleasure or to celebrate or glorify violence", the link I reported, which contained far more graphic content than anything shown in 'The Terror of War' and which actively glorified and trivialised the deaths of others, was deemed acceptable and left up for a considerable time until the external site was taken down.

What emerges from this story is that Facebook's one-size-fits-all policies fail to consider the impact that images and videos can have. Indeed, in this example there was a clear lack of consideration of the implications of celebrating the deaths of apparent jihadis, particularly given that the video was posted in the aftermath of the Paris terror attacks.

As Egeland and Hansen's responses to Facebook have shown, the corporation carries a great deal of responsibility on its shoulders. Whilst it has bowed to pressure this time around, deciding what content should or should not be available to the billions of users worldwide who log on at least once a month makes it an incredibly influential gatekeeper. The debate around this particular example, like the earlier controversy over Facebook's censorship of images of breastfeeding mothers and whether such a natural interaction should be allowed to be shared publicly, helps to reveal the complex nature of the politics of imagery, and draws our attention to the often-hidden nature of censorship.

This is hardly a new phenomenon. Throughout history, and particularly during times of conflict, national governments have repeatedly censored text and images for fear of undermining the war effort. What has changed, however, is the ever-increasing involvement of private companies which enjoy a level of autonomy over and above the state. As Nik Williams has pointed out, the self-censorship of journalists, academics and editors has also grown considerably.

It is increasingly apparent that a far more nuanced and more human system should be employed when considering whether or not an image is 'fit for human consumption.' The moral, philosophical and political questions of who has the right to decide what people can or cannot see are too important to be left to an algorithm. Instead, we need a system which celebrates artistic endeavour however harrowing the story, one which challenges ingrained cultural assumptions about the 'us and them' nature of war and raises awareness of the damage it causes.

Mediatised violence, particularly when that violence is glorified, can desensitise viewers, normalise such phenomena, and deepen the disconnect between suffering in one place and our own experiences elsewhere. A more nuanced approach to violent images might help to bring us back to the roots of our shared humanity. But such an approach requires that global corporations like Facebook accept the responsibility to tackle hate speech and the glorification of graphic material around death and violence, while eschewing a mechanical system of censorship which excises necessarily powerful images from the public imagination.
