Meta-owned social media platform Instagram is testing a new feature that will protect users from getting unsolicited nude photos.
An early screenshot tweeted by researcher Alessandro Paluzzi suggests that the "Nudity protection" technology "covers photos that may contain nudity in chat," giving users the option to view them or not.
Instagram parent Meta confirmed to The Verge that the feature is indeed in development and said it plans to share more details in the coming weeks.
Meta spokesperson Liz Fernandez said, “We’re working closely with experts to ensure these new features preserve people’s privacy while giving them control over the messages they receive.”
Last year, the Pew Research Center published a report that found 33% of women under 35 had been sexually harassed online.
#Instagram is working on nudity protection for chats 👀
ℹ️ Technology on your device covers photos that may contain nudity in chats. Instagram CAN'T access photos. pic.twitter.com/iA4wO89DFd
— Alessandro Paluzzi (@alex193a) September 19, 2022
According to Meta, the technology will not allow the firm to view the actual messages or share them with third parties.
The tech giant compared these controls to its "Hidden Words" feature, which allows users to automatically filter direct message requests containing offensive content.
Multiple jurisdictions, including California and the UK, have targeted cyber flashing, the sending of unwanted nude photos. If the Online Safety Bill is passed by the UK parliament, the practice could become a criminal offense there.
Other jurisdictions have similar rules. In California, for instance, both chambers of the state legislature voted unanimously last month to allow users to sue over unsolicited nude photos and other sexually graphic material.