Key points:

  • Instagram appears to be finally taking proactive measures to prevent users from sending unsolicited pictures
  • The application will automatically blur images sent as Direct Messages if they contain nudity
  • The nudity protection technology will cover photos in chat that may contain nudity

Better late than never. It appears that Instagram is finally working on ‘nudity protection’ technology to prevent users from sending unsolicited pictures. After continuous pressure from authorities, Instagram users, and other parties to stop users from sending unwanted nude pictures, the platform is now working to make the application safer for its users. 

YOU MAY LIKE: Brace yourself! NFTs are coming to Instagram soon!

How will the nudity protection technology work?

Researcher Alessandro Paluzzi shared a screenshot that provides a sneak peek into how the nudity protection technology would work. The new feature will blur out images that it thinks contain nudity and let users decide whether they want to see the clear, unblurred image.

Meta, Instagram’s parent company, has confirmed that this feature is in development. The function will add an optional user control to help users protect themselves from unsolicited nude photos and other unwanted messages. 

Meta has also stated that the new technology will not allow even Meta itself to uncover what’s beneath a blurred image or share it with third parties. Liz Fernandez, Meta’s spokesperson, said that the company is collaborating with several experts to ensure that the new functions safeguard users’ privacy while giving them control over the messages they receive. 

The company has said it will issue an update on the new features over the next few weeks. 

Why is Instagram introducing nudity protection technology?

According to a report released earlier this year by the Center for Countering Digital Hate, a British NGO, Instagram’s tools have ignored around 90% of abusive direct messages sent as images to high-profile women. In addition, men frequently sent pornographic photographs, and the “hidden words” option was unable to fully block out expletives such as “b*tch.”

Further, a report published by the Pew Research Center revealed that around 33% of women below the age of 35 have been sexually harassed online. 

The new Instagram feature comes at a time when cyber flashing, the practice of sending unwanted sexual images to strangers online, frequently women, may soon become a crime in the UK if the Online Safety Bill is approved by Parliament.

However, most of the US does not have laws against cyber flashing, although Texas has classified it as a misdemeanour since 2019. This is despite the fact that some professionals think it can have a psychological impact as severe as that of physical sexual abuse. 

As the calls for a safer internet grow, social media companies are expected to tighten their grip on such issues in the next few years. 

Keep reading iTMunch for the latest social media-related news.

YOU MAY LIKE: Instagram slapped with second largest European Union privacy fine

Feature Image: Photo by Souvik Banerjee on Unsplash