Photographers are upset because Meta is marking their real photos as “Made with AI.” Meta’s new tag was supposed to help users distinguish between real and computer-generated images, but it’s not foolproof. Even a former White House photographer and a professional cricket team have seen their authentic images flagged as AI-generated.
What’s Happening & Why This Matters
Former White House photographer Pete Souza told TechCrunch that the mislabeling could be caused by basic photo editing performed before uploading. For example, if you open a photo in Photoshop, crop it, and then export it as a new .JPG file, Instagram’s detectors might mistake it for an AI-generated image. That’s because Instagram’s system relies on the metadata embedded in an image, rather than the pixels themselves, to judge its authenticity, and editing tools can write AI-related markers into that metadata.
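As a rough illustration of the failure mode described above, a metadata-based detector might simply check an image’s embedded provenance fields against known AI markers, such as the IPTC digital-source-type values Meta has said it reads. The sketch below is hypothetical, not Meta’s actual code, and the sample metadata dictionary is invented for illustration:

```python
# Illustrative sketch of a metadata-based "Made with AI" check.
# Not Meta's actual implementation; field names follow IPTC
# DigitalSourceType conventions, but the helper itself is hypothetical.

AI_SOURCE_TYPES = {
    "trainedAlgorithmicMedia",               # fully AI-generated
    "compositeWithTrainedAlgorithmicMedia",  # AI tools used during editing
}

def looks_ai_generated(metadata: dict) -> bool:
    """Return True if provenance metadata suggests AI involvement."""
    source = metadata.get("DigitalSourceType", "")
    # IPTC values are often stored as full URIs; compare the last segment.
    return source.rsplit("/", 1)[-1] in AI_SOURCE_TYPES

# A real photo cropped in an editor can pick up an AI-editing marker
# even though the pixels are authentic -- the scenario in the article.
cropped_photo = {
    "Software": "Adobe Photoshop",
    "DigitalSourceType": "compositeWithTrainedAlgorithmicMedia",
}
print(looks_ai_generated(cropped_photo))  # True: flagged despite being real
```

A detector like this never inspects the image content at all, which is why a genuine photograph with the wrong metadata gets tagged while a fully synthetic image with stripped metadata sails through.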
As election season approaches, the ability of massive social platforms like Meta to moderate AI-generated content will become increasingly crucial. But if the “Made with AI” tag is this unreliable, it raises the question of whether the label accomplishes anything at all.
TF Summary: What’s Next
Meta’s labeling of real photos as “Made with AI” has caused confusion and frustration among photographers. As the issue persists, Meta needs to refine its system so it accurately identifies AI-generated content without wrongfully tagging authentic images. Going forward, Meta must ensure its AI detection process is reliable and continuously updated to keep pace with evolving technology.