The social media giant is working on a “nudity protection” filter that would let users block unwanted nude photos sent to them via direct messages. While the company confirmed the feature is in development, it didn’t say when it would be released. A spokesperson for Meta, Instagram’s parent company, said in a statement that the new features are designed to protect people’s privacy and give them control over how they receive messages.
Nudity protection will be an optional feature on Instagram. The company launched a similar feature last year, Hidden Words, which automatically filters abusive direct messages based on keywords and redirects them to a hidden folder.
Alessandro Paluzzi, an app researcher, tweeted a screenshot of the upcoming feature this week. According to the screenshot, technology on the user’s device covers photos that may contain nudity in chats, and Instagram itself cannot access the photos. If nudity is detected in a direct message, Instagram will hide the photo unless the user chooses to view it.
According to a study released in April by the Center for Countering Digital Hate, 125 separate examples of image-based sexual abuse were found in direct messages sent to five high-profile women — including Amber Heard, who has faced widespread hate online in the wake of her legal battle with Johnny Depp. According to the organization, Instagram failed to act within 48 hours on any of the instances of image-based sexual abuse reported to the platform.