Celeste Barber’s parody of Candice Swanepoel’s photograph that Instagram stopped users from sharing. Photograph: Instagram

Instagram censored one of these photos but not the other. We must ask why


Celeste Barber’s latest parody was flagged by the platform, but its algorithm’s prejudices aren’t a new problem

Last week brought an issue to the attention of millions of Instagram users – one that we in marginalised communities have been aware of for years: the Instagram algorithm favours thin, white, cisgendered people and effectively censors the rest of us.

On Friday, Australian comedic juggernaut Celeste Barber posted the latest in her #CelesteChallengeAccepted series of parody images to her audience of 7.3 million: a side-by-side photo of her imitating a post from former Victoria’s Secret model Candice Swanepoel, clutching her bare breast and exposing side boob.


But while both photos revealed the exact same parts of each body (Barber, in fact, added a string bikini), Instagram wouldn’t let fans share Barber’s post, notifying some users that it “goes against our community guidelines on nudity or sexual activity”. Swanepoel’s post, meanwhile, went unreported.

“Hey Instagram, sort out your body-shaming standards, guys,” Barber wrote, sharing one fan’s screenshot showing Instagram’s repost rejection. “It’s 2020. Catch up.”

The Instagram algorithm is a beast that we may never understand completely, but what we do know is that images in breach of Instagram’s community guidelines are flagged through a mix of manual reporting and AI technology. Instagram also has more than 15,000 employees working around the world to review posts and look for banned material. With all of this technology and so many employees, it is hard to understand how prejudices still haunt the algorithm – and yet here we are.

This is not an isolated incident. In June 2020, plus-size model Nyome Nicholas-Williams posted an artistic topless photo of herself, in which her breasts were covered by her arms – and Instagram promptly removed it. Censoring a black woman in the context of the renewed Black Lives Matter movement was a particularly bold move, and one that did not go unnoticed.


Nicholas-Williams’s followers – now 60,000 – rallied behind her, using the hashtag #IWantToSeeNyome. There were reposts of her original photo, plus artistic tributes and posts from women of colour posing in their own versions of it; many had their own posts removed.

The incident resulted in a Change.org petition, which has now amassed more than 20,000 signatures, protesting what it refers to as “Instagram’s prejudicial and clearly racially motivated censoring” with the aim to “showcase all people of all sizes and ethnicities”.

Since then, Nicholas-Williams has spoken about the double standards that people of colour and fat people face that thin, white models are not subjected to. In a recent interview with Freeda, Nicholas-Williams explained that “censorship happens everywhere, but on social media it happens more, especially to black fat women like myself”.

That month, the Instagram CEO, Adam Mosseri, publicly acknowledged the need for Instagram to look at “algorithmic bias”, as well as harassment, verification and content distribution. “Our focus will start with Black community, but we’re also going to look at how we can better serve other underrepresented groups,” he said.

But by October, not much had changed. Which is why Kayla Logan, an American plus-size blogger and influencer, started #DontDeleteMyBody: an effort to not only prove to society how fatphobic and racist the Instagram algorithm is, but also to beat it.

On 1 October, at 6pm US Pacific time, 50 fat influencers from around the world – of diverse racial backgrounds and with millions of followers collectively – posted an image to their grid. Some were artistic nudes, others fun fashion pics, but all contained a search bar with the words: “Why does IG sensor [sic] my body but not thin bodies?”

Predictably, many of the influencers reported the removal of their posts for breaching Instagram’s community guidelines. And – as with Barber’s post last week – many of the images were banned from being reshared. One participant, Kalae Nouveau, suffered more than others: her original and subsequent posts were deleted, resharing her posts was banned, and her account was shadow-banned, meaning its visibility was restricted. The only #DontDeleteMyBody post that Nouveau was able to keep in her feed is a grey tile that reads: “I’m tired. I’ll continue to tell y’all not to delete my body tomorrow. #DontDeleteMyBody”

Is it a coincidence that the only post to remain did not feature her body?


There are three different issues at play here: the censorship of black people and people of colour; the censorship of fat people and people in marginalised bodies; and the censorship of women and women’s bodies. Yet all three bring us to the same conclusion: it’s time for all social media giants to update their guidelines to make room for everyone.

Barber and Nicholas-Williams both say Instagram has apologised and acknowledged that their images were mistakenly censored. Both women are now working with the platform to help update its guidelines for the future.

Until that happens, you can help make a difference: consciously follow people in marginalised communities and engage with their content; save posts, comment, like and share the content that resonates with you, and that you think people need to see. Show the social media giants and big corporations that diversity and inclusion matter in the world, and that you want to see it.
