Women Are Pretending To Be Men On Instagram To Avoid Sexist Censorship

The app's shadow ban on vaguely "inappropriate" content has disproportionately affected female users.

Female pole dancers, fitness instructors and sex workers who use Instagram have started changing their gender to male on the app. The widespread deception is in response to a sexist policy the tech giant introduced earlier this year.

In April, Instagram began hiding photos and videos that it considers to be vaguely “inappropriate” without explaining what specific kind of content that includes or alerting affected users. Such posts are algorithmically blocked from being featured in the Facebook-owned website’s public Explore and hashtag pages, which help grow people’s accounts by giving them broader exposure.

This kind of covert censorship, known as “shadow banning,” has disproportionately affected women and members of marginalized communities, including those whose livelihoods depend on Instagram — leaving many urgently seeking ways to restore their visibility on the platform.

“Many of us within the pole dancing community rely on Instagram to thrive,” said Michelle, an Australian pole dance performer, teacher and studio owner who, like other women quoted in this story, asked to be identified by her first name only for privacy reasons. “We use [Instagram] to share training videos, connect with new people and, for lots of us, to grow our businesses.”

In late October, having already watched her content’s engagement steadily decline for months, Michelle decided to change the gender listed on her profile to male. She’d seen research suggesting Instagram’s algorithm is biased against women, and felt she had nothing to lose.

Within three days of switching, she said, things went back to normal: Instagram’s analytics tool showed her posts getting far more likes and views, an indication that Instagram was once again displaying them to a wider audience.

“It’s ridiculous that we have to resort to trying this kind of thing,” she said.


Though purely experimental, the gender-swapping tactic has started to take off among shadow-banned women after recent promotion by anti-censorship activism pages such as @everybodyvisible. Like Michelle, several other women have reported positive changes to their content’s performance since pretending to be men — a change many have made reluctantly.

“It’s really upsetting and ridiculous that women are having to change their gender [on Instagram] to avoid being censored,” said Carolina, a founding member of @everybodyvisible who researches online content moderation as part of her doctoral studies in London.

“The supportive community I found through Instagram is what gives me and so many others confidence,” added Carolina, who is also a pole dancer. “But now, with Instagram choosing who’s ‘appropriate’ and who’s not, it’s hard to feel welcome there.”

In a statement to HuffPost, a Facebook spokesperson denied that Instagram is biased against women.

“Gender information from profiles has no impact on content we filter from hashtags or the Explore page,” the spokesperson said. “We want to make sure the content we recommend to people on Instagram is safe and appropriate for everyone. Ensuring women feel heard is an essential part of that effort.”

But the platform has previously admitted to restricting content from pole dancers in particular.

Over the summer, pole dancers around the world noticed that posts containing popular hashtags such as #PoleFitness, #PoleTrick and #FemaleFitness (but notably, not #MaleFitness) seemed to be shadow banned on Instagram. At first, Instagram reportedly denied that this was happening, but after a petition addressing the matter went viral, the company acknowledged that it had in fact been hiding pole dancers’ content and apologized for doing so.


Instagram users attempting to play by the rules and simply understand what they’re allowed to post on the platform without being shadow banned won’t find many answers — so perhaps it’s not surprising that they are trying to game the system.

Instagram’s nudity policy spells out what is subject to removal: depictions of sexual intercourse, genitals, “close-ups of fully-nude buttocks” and female nipples. By contrast, its policy for borderline content that is subject to demotion is nebulous and vaguely worded. Instagram has refused to define what it means by “inappropriate” imagery; the sole example included in its guidelines is “sexually suggestive” material.

The only public indication of what Instagram might consider to be “sexually suggestive” is tucked into its parent company’s advertising policy pages, which prohibit “adult” content but go into greater detail about what that covers. There, Facebook features several photos to illustrate to advertisers what it means by the terms “sexually suggestive,” “sexually provocative,” “implied nudity” and “sexual in nature.”

Facebook's ad policies may offer some insight into what Instagram considers to be "sexually suggestive" content.

Nearly all of the photos feature women, including one model who’s leaning forward in a low-cut shirt and another who’s eating a banana.

Instagram also offers a bit more detail when rejecting advertisers. When it turned down an ad from Michelle’s company featuring pole dancing students in shorts and crop tops, the platform sent her a notification explaining that the ad was unacceptable because it showed “excessive skin” — despite the fact that the sport requires skin-on-pole contact for grip.

That Instagram has the power to arbitrarily decide whose content can be visible on its massive platform should be concerning to everyone — not just the women who are currently being shadow banned, said Carolina from @everybodyvisible.

“Social media giants including Instagram have a monopoly over our data and online interactions,” she said. “Freedom of expression is at stake here. Users really do not have a voice — we have to cope with their policies, and unfortunately for us, everything that even slightly involves sex scares the shit out of Instagram.”

Sex workers who spoke to HuffPost described a crackdown on their Instagram posts following the passage of FOSTA-SESTA in 2018. The law makes it illegal to assist, facilitate or support sex trafficking, and removes platforms’ immunity from liability under the Communications Decency Act for user content that does any of those things. In its wake, big tech has made sweeping changes to how it polices sexual content — including changes to algorithms.

“The patriarchy is written into the algorithms.”

- Salty spokesperson

Last November, months after FOSTA-SESTA had been signed into law, Facebook CEO Mark Zuckerberg noted that his company’s artificial intelligence systems proactively flag 96% of posts containing nudity that get removed. He was applauding the systems’ efficiency, but experts have concerns about over-reliance on algorithms for content moderation due to the human bias that’s often coded into them.

Earlier this year, the feminist publication Salty crowdsourced data from Instagram users to understand how different groups are policed on the platform. The findings, which represent some of the limited research into this issue, suggested that Instagram is more likely to reject ads from women than from men.

“The patriarchy is written into the algorithms,” a Salty spokesperson said. Instagram “needs to be actively working to see and hear [women and marginalized groups]. ... Unless they’re inviting us to have a seat at the table, then we’re going to be written out of the code.”
