Instagram testing nudity protection feature to tackle sextortion

By Arya M Nair, Official Reporter

Meta has announced that it is testing new features in Instagram to protect users from financial sextortion and other forms of “intimate image abuse”.

While people overwhelmingly use DMs to share what they love with their friends, family or favorite creators, sextortion scammers may also use private messages to share or ask for intimate images.

To help address this, Meta is testing a new nudity protection feature in Instagram DMs, which blurs images detected as containing nudity and encourages people to think twice before sending nude images.

This feature is designed not only to protect people from seeing unwanted nudity in their DMs, but also to protect them from scammers who may send nude images to trick people into sending their own images in return.

Instagram nudity protection feature | Image Courtesy: Meta

Nudity protection will be turned on by default for teens under 18 globally, and it will show a notification to adults encouraging them to turn it on.

When nudity protection is turned on, people sending images containing nudity will see a message reminding them to be cautious when sending sensitive photos, and that they can unsend these photos if they’ve changed their mind.

Anyone who tries to forward a nude image they’ve received will see a message encouraging them to reconsider.

When someone receives an image containing nudity, it will be automatically blurred under a warning screen, meaning the recipient isn’t confronted with a nude image and can choose whether or not to view it. Instagram will also show a message encouraging them not to feel pressured to respond, with options to block the sender and report the chat.

When sending or receiving these images, people will be directed to safety tips, developed with guidance from experts, about the potential risks involved. These tips include reminders that people may screenshot or forward images without your knowledge, that your relationship to the person may change in the future, and that you should review profiles carefully in case they’re not who they say they are.

They also link to a range of resources, including Meta’s Safety Center, support helplines, StopNCII for those over 18, and Take It Down for those under 18.

Nudity protection uses on-device machine learning to analyze whether an image sent in a DM on Instagram contains nudity. Because the images are analyzed on the device itself, nudity protection will work in end-to-end encrypted chats, where Meta won’t have access to these images unless someone chooses to report them to Meta.
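The on-device flow described above can be illustrated with a minimal sketch. This is not Meta’s implementation: the classifier below is a hypothetical stub, the threshold is an assumed value, and the function names are invented for illustration. The point it shows is that the classification and the blur decision both happen locally, so the image never needs to leave the device, which is what makes the feature compatible with end-to-end encryption.

```python
# Illustrative sketch only -- NOT Meta's actual code or model.
# A hypothetical on-device check that decides how a DM client
# should display an incoming image, without sending it anywhere.

from dataclasses import dataclass


@dataclass
class InboundImage:
    data: bytes   # raw image bytes, available only on the device
    sender: str


def classify_nudity(image: InboundImage) -> float:
    """Hypothetical on-device model returning a nudity probability in [0, 1].

    Stubbed here; a real implementation would run a compact local
    image-classification model over `image.data`.
    """
    return 0.0  # placeholder score


# Assumed cutoff for illustration, not a published Meta value.
NUDITY_THRESHOLD = 0.8


def render_decision(image: InboundImage) -> str:
    """Decide locally whether to blur an image under a warning screen."""
    score = classify_nudity(image)
    if score >= NUDITY_THRESHOLD:
        # Blur under a warning screen; the recipient can tap to reveal,
        # and is offered options to block the sender or report the chat.
        return "blurred_with_warning"
    return "shown_normally"
```

Because `render_decision` consumes only data already present on the recipient’s device, the same check can run inside an end-to-end encrypted chat where the server never sees the plaintext image.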

Instagram new feature | Image Courtesy: Meta

For teens, Meta is going even further. It already restricts adults from starting DM chats with teens they’re not connected to, and in January it announced stricter messaging defaults for teens under 16 (under 18 in certain countries), meaning they can only be messaged by people they’re already connected to, no matter how old the sender is.

Now, Instagram won’t show the “Message” button on a teen’s profile to potential sextortion accounts, even if they’re already connected. Meta is also testing hiding teens from these accounts in people’s follower, following and like lists, and making it harder for them to find teen accounts in Search results.

Meta is also adding new child safety helplines from around the world into its in-app reporting flows. This means when teens report relevant issues, such as nudity, threats to share private images, or sexual exploitation or solicitation, they will be directed to local child safety helplines where available.

In November, Meta announced it was a founding member of Lantern, a program run by the Tech Coalition that enables technology companies to share signals about accounts and behaviors that violate their child safety policies.

According to Meta, this industry cooperation is critical, because predators don’t limit themselves to just one platform, and the same is true of sextortion scammers. These criminals target victims across the different apps they use, often moving their conversations from one app to another.
