Instagram policies leaning towards censorship and social bias
Instagram’s murky ‘shadow bans’ just serve to censor marginalised communities
Chanté Joseph
Images of queer and plus-sized bodies are not ‘sexually suggestive’ content. So why is Instagram blocking them?
Vulnerable and marginalised communities on Instagram have been calling for a wider conversation to address what they say is the platform’s censoring of queer and plus-sized bodies.
Instagram has a confusing policy of hiding certain content from its users – and in the past few weeks, Pxssy Palace (a popular queer arts collective and London club night) has called out the photo-sharing platform for “shadow banning” some of their content. Shadow banning refers to when images aren’t outright removed from the platform, but instead strategically hidden from users. This prevents users from searching shadow-banned hashtags, and removes affected content from Instagram’s Explore page.
Despite having over 28,000 followers, Pxssy Palace claims it has recently seen sharp dips in likes, comments and engagement as a result of shadow banning. Bernice Mulenga, a member of the collective, told me that some users can’t comment, like, or even search for their posts, while other posts of partygoers dancing in see-through bodysuits have been taken down altogether. This censorship of queer online communities taking pride in their bodies and sexuality is a huge step backwards: the implication is that people who fall outside traditional body and gender norms should be ashamed of who they are and what they look like.
Although it implements these shadow bans quietly (Instagram has only vaguely addressed criticisms, saying it can deprioritise “sexually suggestive” images), the practice has become an increasingly regular part of the app’s broader crackdown on what it deems “sexual content”.
When it comes to nudity online, Instagram purports to have a seemingly straightforward policy. Its community guidelines state: “Photos of post-mastectomy scarring and women actively breastfeeding” are allowed, however, “photos, videos, and some digitally-created content that show sexual intercourse, genitals, and close-ups of fully-nude buttocks” are not. But what many users don’t know is that beyond the straightforward banning of nude images, Instagram utilises this far murkier form of censorship for content that lies in a grey area it describes as being “sexually suggestive”.
The vagueness of Instagram’s shadow-banning policy leaves users confused as to what is and isn’t appropriate
Salty, an online newsletter and platform for women, trans and non-binary people, released new research last month that looked at the ways that algorithms affect marginalised groups, concluding that plus-sized profiles were often flagged for “excessive nudity” and “sexual solicitation”, and that queer people and women of colour are policed far more than their white, straight, cis counterparts.
When images of fully-clothed plus-sized or black women are removed for being “inappropriate”, the platform’s AI learns to adopt biases that reinforce misogyny and racism, creating barriers for certain groups in the digital realm. It is ironic to think that social media, which is frequently framed as an equalising force, serves to suppress communities who are most often discriminated against offline.
Non-sexual pictures of queer people, women of colour and plus-sized women, even relatively mundane ones, have been shadow banned – implying these bodies, clothed or not, are inherently sexual, or something to be hidden from view. In a world that already intensely scrutinises marginalised people’s bodies, pushing back against body-shaming and censorship is important.
Shadow bans can also present practical problems for marginalised communities. In the run-up to a recent event, for example, Pxssy Palace noticed their posts were not receiving much engagement and people were unsure if they were even still putting on parties. The shadow ban meant people weren’t seeing their event promotion and were unaware of future events.
As someone involved in pole fitness, I have seen Instagram place limits on my own community’s work, too: pole dance videos and images are shadow banned, or taken down altogether. Pole teachers have told me how this censorship hinders their ability to promote their work, and thus has a tangible impact on their earning potential.
The vagueness of Instagram’s shadow-banning policy is perhaps the most frustrating part. It leaves users confused as to what is and isn’t appropriate, and punishes them in a way that seems indiscriminate, but is inevitably discriminatory. The reluctance to properly define what it means to be “sexually suggestive” and a refusal to acknowledge the nuances around it are unfair.
Users who want to utilise their accounts to push cultural boundaries and provide safe, accepting online communities for marginalised groups are being punished without proper consideration. In the process, we give people the illusion of freedom of speech and expression, but only on terms set by predominantly white, straight, male tech guys.
How we protect people online, and how we distinguish pornographic content from protest, is still up for debate. But I do know that in the age of fake news floods and far-right forums, body-positive women on social media are not the greatest threat to our safety online.
What is Instagram really protecting users from when it censors people’s bodies and sexualities? I know that had women’s bodies in all their glory, shape and size been normalised for me earlier on, I would not have spent so long being too ashamed to live confidently in my own.
Chanté Joseph, The Guardian