
Communications of the ACM

ACM News

This Tool Could Protect Your Pictures from AI Manipulation


PhotoGuard works by altering photos in tiny ways that are invisible to the human eye, but which prevent them from being manipulated.

Meta research scientist Emily Wenger says tools like PhotoGuard change the economics and incentives for attackers by making it more difficult to use AI in malicious ways.

Credit: MIT Technology Review

Remember that selfie you posted last week? There's currently nothing stopping someone from taking it and editing it with powerful generative AI systems. Even worse, thanks to the sophistication of these systems, it might be impossible to prove that the resulting image is fake.

The good news is that a new tool, created by researchers at MIT, could prevent this.

The tool, called PhotoGuard, works like a protective shield by altering photos in tiny ways that are invisible to the human eye but prevent them from being manipulated. If someone tries to use an editing app based on a generative AI model such as Stable Diffusion to manipulate an image that has been "immunized" by PhotoGuard, the result will look unrealistic or warped.
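The description above maps onto a familiar idea from adversarial machine learning: add a tiny, bounded perturbation to the image that disrupts the internal representation a generative editor relies on, so any edit built on top of it comes out distorted. The sketch below illustrates that general idea with a projected-gradient loop in PyTorch; it is not PhotoGuard's actual code, and the `encoder` argument, the stand-in toy encoder, and all hyperparameters are illustrative assumptions.

```python
import torch

def immunize(image, encoder, eps=8 / 255, steps=100, step_size=1 / 255):
    """Add an imperceptible perturbation (L-infinity bound `eps`) that pushes
    the encoder's output toward an uninformative target, so generative edits
    built on that representation tend to break down. A generic sketch only."""
    delta = torch.zeros_like(image, requires_grad=True)
    for _ in range(steps):
        latent = encoder(image + delta)
        # Drive the latent representation toward zero (an uninformative target).
        loss = latent.pow(2).mean()
        loss.backward()
        with torch.no_grad():
            delta -= step_size * delta.grad.sign()        # descend on the loss
            delta.clamp_(-eps, eps)                       # keep the change invisible
            delta.add_(image).clamp_(0, 1).sub_(image)    # keep pixels in [0, 1]
        delta.grad.zero_()
    return (image + delta).detach()

# Usage with a stand-in encoder (purely illustrative):
if __name__ == "__main__":
    toy_encoder = torch.nn.Conv2d(3, 4, kernel_size=3, padding=1)
    photo = torch.rand(1, 3, 64, 64)        # dummy RGB image in [0, 1]
    protected = immunize(photo, toy_encoder)
    print((protected - photo).abs().max())  # perturbation stays within eps
```

In practice the encoder would be the image encoder of the generative model being defended against, and the perturbation budget `eps` is what keeps the "immunized" photo visually indistinguishable from the original.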

From MIT Technology Review
View Full Article