A digital signature as a means of non-repudiation is exactly the way this should be done. Any official docs or releases should be signed and easily verifiable by any public official.
Maybe deepfakes are enough of a scare that this becomes standard practice, and protects encryption from getting government backdoors.
Hey, congresscritters didn't give a shit about robocalls till they were the ones getting robocalled.
We had a do not call list within a year and a half.
That's the secret, make it affect them personally.
Would someone have a high-level overview or ELI5 of what this would look like, especially for the average user? Would we need special apps to verify it? How would it work for stuff posted to social media?
linking an article is also ok :)
Depending on the implementation, there are two cryptographic functions that might be used (perhaps in conjunction; see the sketch after the list):
- Cryptographic hash: An arbitrary amount of data (like a video file) is used to create a “hash”—a shorter, (effectively) unique text string. Anyone can run the file through the same function to see if it produces the same hash; if even a single bit of the file is changed, the hash will be completely different and you’ll know the data was altered.
- Public key cryptography: A pair of keys are created, one of which can only encrypt data (but can’t decrypt its own output), and the other, “public” key can only decrypt data that was encrypted by the first key. Users (like the White House) can post their public key on their website; then if a subsequent message purporting to come from that user can be decrypted using their public key, it proves it came from them.
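Here's roughly what those two pieces look like in code. A minimal sketch in Python using the third-party cryptography package (my choice of tooling; nothing here reflects what the White House would actually use), with made-up file names and keys:

```python
# Hash + signature sketch; the file name and keys are illustrative only.
import hashlib
from cryptography.hazmat.primitives.asymmetric import ed25519
from cryptography.exceptions import InvalidSignature

# 1) Cryptographic hash: change a single bit of the file and the digest changes completely.
with open("briefing.mp4", "rb") as f:
    digest = hashlib.sha256(f.read()).hexdigest()
print("SHA-256:", digest)

# 2) Public-key signature: the private key signs, the published public key verifies.
private_key = ed25519.Ed25519PrivateKey.generate()  # kept secret by the publisher
public_key = private_key.public_key()               # posted publicly, e.g. on their website

signature = private_key.sign(digest.encode())       # sign the hash of the video

try:
    public_key.verify(signature, digest.encode())   # anyone can rerun this check
    print("Valid: unmodified and signed by the key holder.")
except InvalidSignature:
    print("Invalid: altered file or wrong signer.")
```

In practice the publisher would post the signature alongside the video, and anyone holding the public key can rerun the check.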
> a shorter, (effectively) unique text string
A note on this. There are other videos that will hash to the same value as a legitimate video. Finding one that is coherent is extraordinarily difficult. Maybe a state actor could do it?
But for practical purposes, it'll do the job. Hell, if a doctored video with the same hash comes out, the White House could just say, "No, we published this one," and that alone would be remarkable.
The best way this could be handled is a green check mark near the video that you could click on; it would show you all the metadata of the video (location, time, source, etc.) along with a digital signature (which would look like a random string of text) that you could also click on, and your browser would show you the chain of trust: where the signature came from, that it's valid, probably the manufacturer of the equipment it was recorded on, etc.
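Something like this under the hood, presumably. A toy sketch in Python of the check a player might run before showing that check mark; the metadata fields, keys, and badge function are all invented for illustration (real provenance schemes such as C2PA are far more involved):

```python
# Toy "show the green check mark?" decision; fields and keys are made up.
import base64
import json
from cryptography.hazmat.primitives.asymmetric import ed25519
from cryptography.exceptions import InvalidSignature

signing_key = ed25519.Ed25519PrivateKey.generate()
public_key = signing_key.public_key()  # what the viewer's browser would trust

metadata = {
    "source": "example.gov",
    "recorded_at": "2024-02-09T14:30:00Z",
    "location": "Washington, DC",
    "video_sha256": "<hash of the video file goes here>",
}
payload = json.dumps(metadata, sort_keys=True).encode()  # canonical form

# The "random string of text" the user would see if they clicked through:
signature_b64 = base64.b64encode(signing_key.sign(payload)).decode()

def show_badge(metadata: dict, signature_b64: str) -> bool:
    """Show the check mark only if the signature over the metadata verifies."""
    payload = json.dumps(metadata, sort_keys=True).encode()
    try:
        public_key.verify(base64.b64decode(signature_b64), payload)
        return True
    except InvalidSignature:
        return False

print(show_badge(metadata, signature_b64))  # True; edit any field and it flips to False
```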
I have said for years all media that needs to be verifiable needs to be signed. Gpg signing lets gooo
Very few people understand why a GPG signature is reliable or how to check it. Malicious actors will add a "GPG Signed" watermark to their fake videos and call it a day, and 90% of victims will believe it.
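For contrast, here's what actually checking one involves, as opposed to trusting a watermark. A sketch in Python that just shells out to GnuPG, assuming gpg is installed and the signer's public key is already in your keyring; the file names are made up:

```python
# Verifying a detached GPG signature; file names are illustrative.
# The signer would have created the signature with:
#   gpg --detach-sign --armor video.mp4    ->  video.mp4.asc
import subprocess

result = subprocess.run(
    ["gpg", "--verify", "video.mp4.asc", "video.mp4"],
    capture_output=True, text=True,
)
# gpg exits 0 only if the signature matches, and prints the signer's key details to stderr.
print("Good signature" if result.returncode == 0 else "BAD or unknown signature")
print(result.stderr)
```

A watermark can be pasted onto anything; this check fails the moment a single byte of the file changes.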
I just mentioned this in another comment tonight; cryptographic verification has existed for years but basically no one has adopted it for anything. Some people still seem to think pasting an image of your handwriting on a document is "signing" a document somehow.
Huh. They actually do something right for once instead of spending years trying to ban AI tools. I'm pleasantly surprised.
Bingo. If, at the limit, the purpose of a generative AI is to be indistinguishable from human content, then watermarking and AI detection algorithms are absolutely useless.
The ONLY means to do this is to have creators verify their human-generated (or vetted) content at the time of publication (providing positive proof), as opposed to retroactively trying to determine whether content was generated by a human (proving a negative).
Yeah, good luck getting the general public to understand what “cryptographically verified” videos mean.
The general public doesn't have to understand anything about how it works as long as they get a clear "verified by ..." statement in the UI.
It could work the same way the padlock icon worked for SSL sites in browsers back in the day. The video player checks the signature and displays the trusted icon.
Democrats will want cryptographically verified videos; Republicans will be happy with a stamp that has Trump's face on it.
I don't blame them for wanting to, but this won't work. Anyone who would be swayed by such a deepfake won't believe the verification if it is offered.
I don't think that's what this is for. I think this is for reasonable people, as well as for other governments.
Besides, passwords can be phished or socially engineered, and some people use "abc123." Does that mean we should get rid of password auth?
It would become quite easy to dismiss anything for not being cryptographically verified simply by not cryptographically verifying.
I can see the benefit of having such verification but I also see how prone it might be to suppressing unpopular/unsanctioned journalism.
Unless the proof is very clear and easy for the public to understand the new method of denial just becomes the old method of denial.
This doesn’t solve anything. The White House will only authenticate videos which make the President look good. Curated and carefully edited PR. Maybe the occasional press conference. The vast majority of content will not be authenticated. If anything this makes the problem worse, as it will give the President remit to claim videos which make them look bad are not authenticated and should therefore be distrusted.
It needs to be more general. A video should have multiple signatures. Each signature relies on the signer's reputation, which works both ways. It won't help those who don't care about their reputation, but will for those that do.
A photographer who passes off a fake photo as real will have their reputation hit, if they are caught out. The paper that published it will also take a hit. It's therefore in the paper's interest to figure out how trustworthy the supplier is.
I believe Canon recently announced a camera that cryptographically signs photographs at the point of creation. At that point, the photographer can prove the camera, the editor can prove the photographer, the paper can prove the editor, and the reader can prove the newspaper. If done right, the final viewer can also prove the whole chain, semi-independently. It won't be perfect (far from it), but it might be the best we'll get. Each party wants to protect their reputation, and so has a vested interest in catching fraud.
For this to work, we need a reliable way to sign images multiple times, as well as (optionally) encode an edit history into it. We also need a quick way to match cryptographic keys to a public key.
An option to upload a time-stamped key to a trusted 3rd party would also be of significant benefit. Ironically, blockchain might actually be a good use for this, in case a trusted 3rd party can't be established.
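As a toy illustration of the multi-signature idea (an invented scheme for this comment, not an existing standard, and the parties and keys are made up), each link in the chain signs the content hash plus everything signed before it, so a reader can replay the whole chain:

```python
# Toy signature chain: camera -> photographer -> editor -> newspaper.
import hashlib
from cryptography.hazmat.primitives.asymmetric import ed25519
from cryptography.exceptions import InvalidSignature

parties = ["camera", "photographer", "editor", "newspaper"]
keys = {name: ed25519.Ed25519PrivateKey.generate() for name in parties}

photo = b"...raw image bytes..."
record = hashlib.sha256(photo).digest()   # the chain starts from the content hash
chain = []                                # (party, signature) pairs: a crude edit history

for name in parties:
    signature = keys[name].sign(record)
    chain.append((name, signature))
    record = hashlib.sha256(record + signature).digest()  # next signer covers everything so far

def verify_chain(photo: bytes, chain) -> bool:
    """Replay the chain with the signers' public keys; any tampering breaks it."""
    record = hashlib.sha256(photo).digest()
    try:
        for name, signature in chain:
            keys[name].public_key().verify(signature, record)
            record = hashlib.sha256(record + signature).digest()
        return True
    except InvalidSignature:
        return False

print(verify_chain(photo, chain))  # True for the original photo and untouched chain
```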
The technology to do this has existed for decades, and it's crazy to me that people aren't doing it all the time yet.
Why not just use official channels of information, e.g. a White House Mastodon instance with politicians' accounts, government-hosted and auto-mirrored by third parties?
You mean to tell me that cryptography isn't the enemy, and that instead of fighting it in the name of "terrorism and child protection" we should be protecting children by having strong encryption instead??
I'm sure they do. AI regulation probably would have helped with that. I feel like congress was busy with shit that doesn't affect anything.
I salute whoever has the challenge of explaining basic cryptography principles to Congress.
I see no difference between creating a fake video/image with AI and Adobe's packages. So to me this isn't an AI problem, it's a problem that should have been resolved a couple of decades ago.
I think this is a great idea. Hopefully it becomes the standard soon, cryptographically signing clips or parts of clips so there's no doubt as to the original source.
When it comes to misinformation, I always remember when I was a kid in the early 90s, another kid told me confidently that the USSR had landed on Mars, gathered rocks, filmed it, and returned to Earth (it now occurs to me that this homeschooled kid was confusing it with the real moon landing). I remember knowing it was bullshit but not having a way to check the facts. The Internet solved that problem. Now, by God, the Internet has recreated the same problem.
What if I meet Joe and take a selfie of both of us using my phone? How will people know that my selfie shows the authentic Joe Biden?
We need something akin to the simplicity and ubiquity of Google that does this, government funded and with transparent oversight. We're past the point of your aunt needing a way to quickly check if something is obvious bullshit.
Call it something like Exx-Ray, the two Xs mean double check - "That sounds very unlikely that they said that Aunt Pat... You need to Exx-Ray shit like that before you talk about it at Thanksgiving"
Or same thing, but with the word Check, CHEXX - "No that sounds like bullshit, I'm gonna CHEXX it... Yup that's bullshit, Randy."
I've always thought that bank statements should require cryptographic signatures for ledger balances. Same with individual financial transactions, especially customer payments.
Without this we're pretty much at the mercy of trust with banks and payment card providers.
I imagine there are a lot of integrity requirements for financial transactions on the back end, but the consumer has no positive proof except easily forged statements.
This is sadly necessary
I've been saying for a long time now that camera manufacturers should just put encryption circuits right inside the sensors. Of course that wouldn't protect against pointing the camera at a screen showing a deepfake, or someone painstakingly dissolving the top layers and tracing out the private key manually, but it'd be enough of a deterrent against forgery. And media production companies should actually put out all their stuff digitally signed. Like, come on, it's 2024 and we still don't have a way to find out if something was filmed or rendered, cut or edited, original or freebooted.
So should Taylor Swift