Meta is introducing new features that blur images containing nudity on Instagram to protect teens.
Takeaway points
- Meta said it is testing new features that blur images containing nudity or other intimate content on Instagram and other Meta apps.
- The new features aim to protect young people from sextortion and intimate image abuse, and to make it harder for scammers and criminals to find and interact with teens.
- Meta said that nudity protection will be turned on by default for teens under 18 worldwide.
New Features to Protect Teens
Meta, the parent company of Instagram, said on Thursday that it is testing new features to blur images containing nudity or other intimate content on Instagram and other Meta apps. The features are intended to help protect young people from sextortion and intimate image abuse, and to make it harder for scammers and criminals to find and interact with teens.
The tech company said it has spent years working with experts and others to understand how these scammers operate and how to stop them.
“Today, we’re sharing an overview of our latest work to tackle these crimes. This includes new tools we’re testing to help protect people from sextortion and other forms of intimate image abuse, and to make it as hard as possible for scammers to find potential targets – on Meta’s apps and across the internet. We’re also testing new measures to support young people in recognizing and protecting themselves from sextortion scams,” Meta said.
John Shehan, Senior Vice President at the National Center for Missing and Exploited Children, said in a statement that companies have a responsibility to protect the teenagers who use their platforms, and called the new feature Meta is bringing to its apps encouraging.
“Companies have a responsibility to ensure the protection of minors who use their platforms. Meta’s proposed device-side safety measures within its encrypted environment is encouraging. We are hopeful these new measures will increase reporting by minors and curb the circulation of online child exploitation,” Shehan said.
Dr. Sameer Hinduja, Co-Director of the Cyberbullying Research Center and Faculty Associate at the Berkman Klein Center at Harvard University, said, “As an educator, parent, and researcher on adolescent online behavior, I applaud Meta’s new feature that handles the exchange of personal nude content in a thoughtful, nuanced, and appropriate way. It reduces unwanted exposure to potentially traumatic images, gently introduces cognitive dissonance to those who may be open to sharing nudes, and educates people about the potential downsides involved. Each of these should help decrease the incidence of sextortion and related harms, helping to keep young people safe online.”
How the New Features are Implemented
Meta said that nudity protection will be turned on by default for teens under 18 worldwide, and that it will show a notification to adults encouraging them to turn it on. Teens’ message settings will also be stricter, so that they cannot be messaged by anyone they are not connected to, and teens who are already connected to potential scammers will be shown a safety notice.
The company said it is giving people options to report DMs that threaten to share private images. It is also supporting the National Center for Missing and Exploited Children (NCMEC) in developing Take It Down, a platform that lets young people take back control of their intimate images and helps prevent them from being shared online, taking power away from scammers.
Preventive Measures
According to the report, Meta is developing technology to identify accounts that may be engaging in sextortion scams, based on a range of signals that could indicate sextortion behavior, and it will take strict action against any account involved in sextortion: it will delete the account, prevent its operator from creating new ones, and, if necessary, report them to the NCMEC and law enforcement.