
Meta's Oversight Board Calls for Changes to the Rules on Nudity Displays

Meta’s independent Oversight Board has called on the company to amend its nudity policies, particularly as they apply to transgender and non-binary people, after two Instagram pictures of bare-chested models were removed. The two images, posted by the same account, showed a transgender and non-binary couple bare-chested with their nipples covered.

The posts were intended to raise awareness of one member of the couple's need for top surgery, but both were removed for violating the Sexual Solicitation policy, first flagged by Meta’s automated systems and then confirmed by human review. The posts were reinstated after the user appealed to the Oversight Board, which argues the case exposes flaws in how Meta’s standards treat transgender and non-binary people.

According to the Board: “The Oversight Board finds that removing these posts is not in line with Meta’s Community Standards, values, or human rights responsibilities. These cases also highlight fundamental issues with Meta’s policies. Meta’s internal guidance to moderators on when to remove content under the Sexual Solicitation policy is far broader than the stated rationale for the policy, or the publicly available guidance. This creates confusion for users and moderators and, as Meta has recognised, leads to content being wrongly removed.”

The Board emphasises that Meta’s initial removal of these posts stemmed from a misinterpretation of its own rules, largely because of how those rules are worded.

“This policy is based on a binary view of gender and a distinction between male and female bodies. Such an approach makes it unclear how the rules apply to intersex, non-binary, and transgender people, and requires reviewers to make rapid and subjective assessments of sex and gender, which is not practical when moderating content at scale.”

The Board also observes that Meta’s rules on nudity are at times “convoluted and poorly defined” and may create barriers to expression for women, trans, and gender non-binary people on its platforms. The rules have a severe impact in contexts where women may traditionally go bare-chested, and LGBTQI+ people can be disproportionately affected, as these cases show. Although the material did not violate Meta’s rules, its automated systems flagged it multiple times.

The Board recommends that Meta clarify its Adult Nudity and Sexual Activity policy.

Doing so would help ensure that all users are treated in line with human rights standards. The Board also says Meta should assess whether the Adult Nudity and Sexual Activity policy adequately protects against non-consensual image sharing, and whether other policies need to be strengthened. It’s an interesting decision, one that reflects shifting perceptions of nudity and what they signify. Meta will need to adapt its rules to evolving social norms, a shift that should broaden acceptance and move these important conversations forward.

Meta’s policy enforcement benefits from the Oversight Board’s scrutiny, and cases like this show how external oversight can help social media platforms make difficult content decisions.

Meta has been advocating for governments around the world to establish regulations and standards that all social platforms would have to follow. That would take many of the most complicated and sensitive moderation decisions out of the hands of internal leaders and level the playing field across platforms. It seems like a better option, but creating universal, worldwide standards is complicated and will require collaboration and consensus.

Is that achievable? It’s impossible to say, but Meta’s experience with the Oversight Board shows that platform rules need some form of external oversight if they are to meet public expectations.
