Snapchat Plans to Watermark Images Created with Its Generative AI Tools

Snapchat plans to watermark images created with its generative AI tools.

Takeaway Points

  • Snapchat plans to watermark images created with its generative AI tools.
  • The aim of adding these watermarks is to help inform people viewing the images that they were made with AI on Snapchat.

Why did Snapchat want to watermark images created with its AI?

The technology company Snapchat said on Tuesday that it will soon add a watermark to images created with Snap’s generative AI tools when an image is exported or saved to the camera roll. The aim of these watermarks is to inform people viewing the images that they were made with AI on Snapchat.

According to the report, anyone who receives an AI-generated image created on Snapchat may see a small ghost logo with a widely recognized sparkle icon beside it.
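Snapchat has not published how its watermarking works, but the general idea of stamping a visible marker into a corner of an image at export time can be sketched in a few lines. The sketch below is purely illustrative: the `apply_watermark` function, the grid-of-pixels image model, and the tiny "logo" bitmap are all assumptions for demonstration, not Snapchat's actual implementation.

```python
# Illustrative sketch only: models an image as a 2D grid of pixel values and
# stamps a small "logo" bitmap into the bottom-right corner, the way a visible
# export-time watermark is typically overlaid.

def apply_watermark(image, logo):
    """Return a copy of `image` with `logo` overlaid in the bottom-right corner."""
    h, w = len(image), len(image[0])
    lh, lw = len(logo), len(logo[0])
    out = [row[:] for row in image]          # copy so the original is untouched
    for i in range(lh):
        for j in range(lw):
            if logo[i][j]:                   # only opaque logo pixels overwrite
                out[h - lh + i][w - lw + j] = logo[i][j]
    return out

# A 4x4 blank "image" and a 2x2 marker standing in for the ghost logo.
image = [[0] * 4 for _ in range(4)]
logo = [[9, 9], [9, 9]]
stamped = apply_watermark(image, logo)
```

In a real pipeline the overlay would be applied to actual pixel data (for example with an imaging library) at the moment the image is exported or saved, so the in-app version stays unmarked while every copy that leaves the app carries the watermark.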

Snapchat said: “We take seriously our responsibility to design products and experiences that prioritize privacy, safety, and age appropriateness. Like all of our products, AI-powered features have always undergone strict review to ensure they adhere to our safety and privacy principles – and through our learnings over time, we’ve developed additional safeguards.”

What is Red-teaming?

According to Snapchat, AI red-teaming is an increasingly common tactic for testing and identifying potential flaws in AI models and AI-enabled features, and for implementing solutions that improve the safety and consistency of AI outputs.

On February 27, Snapchat announced Snap’s AI red-teaming safety efforts with HackerOne. Snap said it has been developing new AI-powered functionality to expand its users’ creativity, and it wanted to stress-test two new features of its Lens and My AI products, Generative AI Lens and Text2Image, to see whether the guardrails it had in place could help prevent the creation of harmful content.

Snapchat partnered with HackerOne on more than 2,500 hours of testing to gauge the efficacy of its safeguards, making it one of the early adopters of AI red-teaming.

Ilana Arbisser, Technical Lead for AI Safety at Snap Inc., explained the reasoning behind the exercise:

“We ran the AI red teaming exercise before the launch of Snap’s first text-to-image generative AI product. A picture is worth a thousand words, and we wanted to prevent inappropriate or shocking material from hurting our community. We worked closely with Legal, Policy, Content Moderation, and Trust and Safety to design this red-teaming exercise.”

About Snapchat

Snapchat is a technology company that believes the camera presents the greatest opportunity to improve the way people live and communicate. The company contributes to human progress by empowering people to express themselves, live in the moment, learn about the world, and have fun together. Its products and services are designed to enhance relationships with friends, family, and the places around them.

According to Snapchat, over 800 million people use its platform every month on average, and more than 300 million Snapchatters engage with augmented reality every day.

What is Family Center? 

Family Center is a security and safety feature in Snapchat that helps parents get more insight into who their teens are friends with on Snapchat, and who they have been communicating with, while still respecting their teens’ privacy and autonomy. It’s designed to reflect the way parents engage with their teens in the real world, where parents usually know who their teens are friends with and when they are hanging out, but don’t eavesdrop on their private conversations.

Snapchat’s decision to watermark images generated with its AI tools represents a proactive step toward safeguarding content integrity and protecting creators’ rights. By embedding a visible marker in each exported image, Snapchat aims to deter misuse and unauthorized distribution, fostering a more secure and trustworthy environment for users.
