this post was submitted on 18 Sep 2024
155 points (94.3% liked)

Technology

[–] [email protected] 7 points 2 months ago* (last edited 2 months ago) (1 children)

Even if you assume the images you care about have this metadata, all it takes is a hacked camera (which could be as simple as carefully taking a photo of your AI-generated image) to fake authenticity.

And the vast majority of images you see online are heavily compressed, so you're not getting the 6 MB+ digitally signed raw files per image anyway.

[–] [email protected] 5 points 2 months ago (1 children)

You don't even need a hacked camera to edit the metadata, you just need exiftool.

[–] [email protected] 0 points 2 months ago* (last edited 2 months ago) (1 children)

It’s not that simple. It’s not just a “this is or isn’t AI” boolean in the metadata. Hash the image, then sign the hash with a private signing key. The signature will be invalid if the image has been tampered with, and you can’t produce a new valid signature without the signing key.

Once the image is signed, you can’t tamper with it and get away with it.

The vulnerability is, how do you ensure an image isn’t faked before it gets to the signature part? On some level, I think this is a fundamentally unsolvable problem. But there may be ways to make it practically impossible to fake, at least for the average user without highly advanced resources.
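The hash-then-sign flow described above can be sketched in a few lines. This is a toy illustration, not the actual C2PA scheme: the key and image bytes are made up, and HMAC (a symmetric construction from Python's standard library) stands in for the asymmetric signature a real camera would use, so that verifiers wouldn't need to hold the signing secret.

```python
import hashlib
import hmac

# Hypothetical secret baked into the camera's secure hardware.
# A real system would use an asymmetric key pair (e.g. Ed25519);
# HMAC is used here only as a stdlib stand-in for a signature.
CAMERA_KEY = b"secret-key-inside-camera-hardware"

def sign_image(image_bytes: bytes) -> bytes:
    """Hash the image, then sign the hash with the signing key."""
    digest = hashlib.sha256(image_bytes).digest()
    return hmac.new(CAMERA_KEY, digest, hashlib.sha256).digest()

def verify_image(image_bytes: bytes, signature: bytes) -> bool:
    """Recompute the hash and check the signature against it."""
    digest = hashlib.sha256(image_bytes).digest()
    expected = hmac.new(CAMERA_KEY, digest, hashlib.sha256).digest()
    return hmac.compare_digest(expected, signature)

original = b"\x89PNG...raw image bytes..."   # placeholder image data
sig = sign_image(original)

print(verify_image(original, sig))            # True: image untouched
print(verify_image(original + b"\x00", sig))  # False: any change breaks it
```

Note that flipping even one byte of the image invalidates the signature, which is exactly why the weak point is everything that happens *before* signing, not after.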

[–] [email protected] 2 points 2 months ago (1 children)

Cameras don't cryptographically sign the images they take. Even if that were added, there are billions of cameras in use that don't support signing. Also, any editing, resizing, or re-encoding would invalidate the signature, and almost no one posts pictures to the web without some sort of editing. Embedding 10+ MB images in a web page is not practical.

[–] [email protected] 1 points 2 months ago* (last edited 2 months ago)

We aren’t talking about current cameras. We are talking about the proposed plan to make cameras that do cryptographically sign the images they take.

Here’s the link from the start of the thread:

https://arstechnica.com/information-technology/2024/09/google-seeks-authenticity-in-the-age-of-ai-with-new-content-labeling-system

This system is specifically mentioned in the original post: https://www.seroundtable.com/google-search-image-labels-ai-edited-38082.html when they say “C2PA”.