Ask Lemmy
A Fediverse community for open-ended, thought-provoking questions
Please don't post about US Politics. If you need to do this, try [email protected]
Rules:
1) Be nice and have fun
Doxxing, trolling, sealioning, racism, and toxicity are not welcome in AskLemmy. Remember what your mother said: if you can't say something nice, don't say anything at all. In addition, the site-wide Lemmy.world terms of service also apply here. Please familiarize yourself with them.
2) All posts must end with a '?'
This is sort of like Jeopardy. Please phrase all post titles in the form of a proper question ending with '?'.
3) No spam
Please do not flood the community with nonsense. Actual suspected spammers will be banned on sight. No astroturfing.
4) NSFW is okay, within reason
Just remember to tag posts with either a content warning or a [NSFW] tag. Overtly sexual posts are not allowed, please direct them to either [email protected] or [email protected].
NSFW comments should be restricted to posts tagged [NSFW].
5) This is not a support community.
It is not a place for 'how do I?' type questions.
If you have any questions regarding the site itself or would like to report a community, please direct them to Lemmy.world Support or email [email protected]. For other questions check our partnered communities list, or use the search function.
Reminder: The terms of service apply here too.
Logo design credit goes to: tubbadu
I think other answers here are more essential - chain of custody, corroborating evidence, etc.
That said, Leica has released a camera that digitally signs its images, and other manufacturers are working on similar things. That will let people verify whether an image is original or has been edited. From what I understand, Leica also has a scheme where you can sign images when you edit them, so there's a whole chain of documentation. Here's a brief article
It's an interesting experiment, but why would we trust everything that Leica supposedly verified? Same shit with digital signatures and blockchain stuff. We're at the gates of a world where trust is zero by default and verification is only intentionally outsourced to third parties we trust, because the penalties for mistakes grow every day.
I don't think we should inherently. I've thought about the idea of digitally signed photos, and it seems sound unless someone is quite clever with electronics. I'm guessing there's an embedded key on the camera that is hard, but maybe not impossible, to access. If people can hack Teslas for "full autopilot" or run Doom on an ATM, I'm not confident that this kind of encryption will never be cracked. However, I would hope an expert witness would also examine the camera that supposedly took the picture; I would think it impossible for someone to extract the key without a third party detecting the intrusion.
Today we have EXIF data, and it's better to wipe all of it for privacy reasons, because every picture you take otherwise contains a lot of your data: geolocation, camera model, exposure, etc. That's the angle they have yet to tackle, because most of these things also leave us vulnerable.
They make hardware security modules (HSMs) that are very difficult to crack, to the point of being practically unbreakable at our current technology level. With a strong HSM, a high-bit per-device certificate signed by the company's private key gives you authenticity and validation until the root key or the HSM is broken, which is probably good enough for today while we try to figure out something better, IMO.
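A minimal sketch of that per-device scheme, using Python's third-party `cryptography` package. Ed25519 keys stand in for whatever the manufacturer actually uses, and all the names here are illustrative, not anyone's real API:

```python
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey
from cryptography.hazmat.primitives.serialization import Encoding, PublicFormat

# Manufacturer root key (in the real scheme this lives in the company's HSM)
root_key = Ed25519PrivateKey.generate()

# Per-device key (burned into the camera's secure element at the factory)
device_key = Ed25519PrivateKey.generate()
device_pub = device_key.public_key().public_bytes(Encoding.Raw, PublicFormat.Raw)

# The "certificate": the root key signs the device's public key
device_cert = root_key.sign(device_pub)

# The camera signs a photo with its device key
photo = b"raw sensor data"
photo_sig = device_key.sign(photo)

# A verifier needs only the manufacturer's public root key:
root_key.public_key().verify(device_cert, device_pub)  # 1) device key is genuine
device_key.public_key().verify(photo_sig, photo)       # 2) photo is untampered
# verify() raises InvalidSignature if either check fails
```

The point of the two-step check is that the verifier never needs to know each camera's key in advance; one published root key vouches for every device, which is why everything hinges on that root key and the HSMs staying unbroken.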
Well as I said, I think there's a collection of things we already use for judging what's true, this would just be one more tool.
A cryptographic signature (in the original sense, not just the Bitcoin sense) means that only someone who possesses a certain digital key is able to sign something. In the case of a digitally signed photo, it verifies "hey, I, the key holder, am signing this file". And if the file is edited, the signature won't match the tampered version.
Is it possible someone could hack and steal such a key? Yes. We see this with certificates for websites, where some bad actor is able to impersonate a trusted website. (And of course when NFT holders get their apes stolen)
But if something like that happened it's a cause for investigation, and it leaves a trail which authorities could look into. Not perfect, but right now there's not even a starting point for "did this image come from somewhere real?"
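That tamper-detection property fits in a few lines, again using the third-party `cryptography` package; the file contents here are made up:

```python
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey
from cryptography.exceptions import InvalidSignature

key = Ed25519PrivateKey.generate()   # the secret only the key holder has
original = b"JPEG bytes of the photo"
signature = key.sign(original)

public = key.public_key()            # anyone can hold and share this part
public.verify(signature, original)   # passes silently: file is unmodified

try:
    # change even one byte and verification fails
    public.verify(signature, original + b"x")
except InvalidSignature:
    print("edited file detected")
```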
A camera that authenticates the timestamp and contents of an image is great. But it's still limited. If I take that camera, mount it on a tripod, and take a perfect photograph of a poster of Van Gogh's Starry Night, the resulting image will be yet another one of millions of similar copies, only with a digital signature proving that it was newly created today, in 2024.
Authenticating what the camera sensor sees is only part of the problem, when the camera can be shown fake stuff, too. Special effects have been around for decades, and practical effects are even older.
You're right, cameras can be tricked. As Descartes pointed out there's very little we can truly be sure of, besides that we ourselves exist. And I think deepfakes are going to be a pretty challenging development in being confident about lots of things.
I could imagine something like photographers with a news agency using cameras that generate cryptographically signed photos, to ward off claims that newsworthy events are fake. It would place a higher burden on naysayers, and it would also become a story in itself if it could be shown that a signed photo had been faked. It would become a cause for further investigation, it would threaten a news agency's reputation.
Going further I think one way we might trust people we aren't personally standing in front of would be a cryptographic circle of trust. I "sign" that I know and trust my close circle of friends and they all do the same. When someone posts something online, I could see "oh, this person is a second degree connection, that seems fairly likely to be true" vs "this is a really crazy story if true, but I have no second or third or fourth degree connections with them, needs further investigation."
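That degree-of-connection lookup is just a breadth-first search over the signed-trust graph. A toy sketch in plain Python; the names and the graph are invented:

```python
from collections import deque

def degree_of_trust(graph, me, poster):
    """BFS over the trust graph; returns the hop count to poster, or None."""
    seen = {me}
    queue = deque([(me, 0)])
    while queue:
        person, hops = queue.popleft()
        if person == poster:
            return hops
        for friend in graph.get(person, []):
            if friend not in seen:
                seen.add(friend)
                queue.append((friend, hops + 1))
    return None  # no chain of trust at all

# hypothetical signed-trust graph: each person lists who they vouch for
trust = {
    "me": ["alice", "bob"],
    "alice": ["carol"],
    "bob": [],
    "carol": ["dave"],
}

degree_of_trust(trust, "me", "carol")  # second-degree connection
degree_of_trust(trust, "me", "eve")    # unknown: needs further investigation
```

A real system would need cryptographically signed attestations instead of a plain dict, plus some cap on search depth, but the lookup itself really is this simple.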
I'm not saying any of this will happen, just it's potentially a way to deal with uncertainty from AI content.
Hardware signing stuff is not a real solution. It's security through obscurity.
If someone has access to the hardware, they technically have access to the private key that the hardware uses to sign things.
A determined malicious actor could take that key and sign whatever they want to.
Cameras with stronger security will become more and more important, though on a theoretical level they could still be cracked or forged. I suppose it's the usual cat-and-mouse game.