this post was submitted on 13 Jan 2025
122 points (100.0% liked)

News


Summary

Experts warn of rising online racism fueled by X’s generative AI chatbot, Grok, which recently introduced a photorealistic image feature called Aurora.

Racist, fake AI images targeting athletes and public figures have surged, with some depicting highly offensive and historically charged content.

Organizations like Signify and CCDH highlight Grok's ability to bypass safeguards, exacerbating hate speech.

Critics blame X's monetization model for incentivizing harmful content.

Sports bodies are working to mitigate abuse, while calls grow for stricter AI regulation and accountability from X.

top 14 comments
[–] FlyingSquid 20 points 2 weeks ago (1 children)

Before everyone starts chiming in with "stop using Twitter" (and I agree), this is a bigger problem. If Grok can create photorealistic images of public figures, something the AI image generators provided by Google, Microsoft/OpenAI, and Meta are not doing, this problem goes way beyond Twitter, because the photos will get spread from there.

And sure, someone could host their own AI image generator on their home setup, but this makes it much, much easier.

[–] [email protected] 10 points 2 weeks ago* (last edited 2 weeks ago) (3 children)

I got swamped with downvotes the last time I said this, but I maintain that in the near future we're going to need to digitally sign authentic images. It's simply infeasible to police the entire internet in an effort to remove fake ones (especially since they're made by bad actors who don't care about any rules anyway), so we need a widespread, easy-to-use way to distinguish real images instead.

[–] theunknownmuncher 12 points 2 weeks ago (1 children)

Honest question. What will stop someone from getting AI generated images digitally signed as well? Who will be the authority doing the signing?

[–] [email protected] 3 points 2 weeks ago (1 children)

I mean a signature that can be matched against a known one, like GPG.

[–] theunknownmuncher 4 points 2 weeks ago* (last edited 2 weeks ago) (1 children)

I don't think that answered my question, but maybe I just don't understand what you mean.

I could see a world where media outlets and publishers sign their published content in order to make it verifiable what the source of the content is, for a hypothetical example, AP news could sign photographs taken by a journalist, and if it is a reputable source that people trust to not be creating misinformation, then they can trust the signed content.

I don't really see a way that digital signatures can be applied to content created and posted by untrusted users in order to verify that they aren't AI generated or misinformation.

[–] [email protected] 3 points 2 weeks ago (1 children)

I could see a world where media outlets and publishers sign their published content in order to make it verifiable what the source of the content is, for a hypothetical example, AP news could sign photographs taken by a journalist, and if it is a reputable source that people trust to not be creating misinformation, then they can trust the signed content.

Exactly -- it's a means of attribution. If you see a pic that claims to be from a certain media outlet but it doesn't match their public key, you're being played.

I don’t really see a way that digital signatures can be applied to content created and posted by untrusted users in order to verify that they aren’t AI generated or misinformation, that won’t be easily abused to defeat the purpose

That's the point. If you don't trust the source, why would you trust their content?
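
A minimal sketch of the kind of attribution I mean, assuming an Ed25519 keypair and Python's cryptography library (the filename and helper are made up for illustration, not any real outlet's scheme):

```python
# Sketch: a publisher signs the raw bytes of an image with its private key;
# anyone holding the matching public key can check that the bytes weren't
# altered and really came from that key holder. Key distribution is hand-waved.
from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives.asymmetric.ed25519 import (
    Ed25519PrivateKey,
    Ed25519PublicKey,
)

# Publisher side: generate a keypair once, publish the public key, guard the private key.
private_key = Ed25519PrivateKey.generate()
public_key = private_key.public_key()

with open("photo.jpg", "rb") as f:  # hypothetical image file
    image_bytes = f.read()

signature = private_key.sign(image_bytes)  # shipped alongside the image


# Reader side: verify the image against the outlet's published public key.
def looks_authentic(image: bytes, sig: bytes, key: Ed25519PublicKey) -> bool:
    try:
        key.verify(sig, image)  # raises InvalidSignature on any tampering
        return True
    except InvalidSignature:
        return False


print(looks_authentic(image_bytes, signature, public_key))               # True
print(looks_authentic(image_bytes + b"edited", signature, public_key))   # False
```

The point is attribution, not identity: the key only has to be tied to a reputation ("AP", or for that matter a pseudonym), not to a legal name.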

[–] theunknownmuncher 3 points 2 weeks ago* (last edited 2 weeks ago)

Ah okay, we are just describing the same thing 👍 I agree, this will be our future

[–] [email protected] 3 points 2 weeks ago (1 children)

How does this address the fact that people don't care whether something is real or fake? You can sign it all you want, but if nobody cares about the signature, you haven't accomplished anything.

[–] [email protected] 2 points 2 weeks ago (1 children)

I think we can win back a lot of people by making it easier to prove it's fake. Right now we're only asking them to take one source's word over the other. We don't need to convince everyone — only to get things back to a normal percentage of village idiots.

[–] [email protected] 1 points 2 weeks ago (1 children)

They don't care that it's fake. The loudest people and the fake accounts will continually post and repost without any consideration for truth.

[–] [email protected] 1 points 2 weeks ago

The loudest people and the fake accounts

Well yeah, they're the ones deliberately spreading it. I don't care about them, I care about the uninformed people in the middle who don't know what's real anymore.

[–] [email protected] -1 points 2 weeks ago* (last edited 2 weeks ago) (1 children)

Already happening and it's gross, and will erode internet freedom and anonymity. This is advanced internet surveillance and tracking and they are using "stopping the spread of AI misinfo" as the excuse.

Soon all images produced in any manner will be able to practically identify the creator. Hiding tracking data in images is fucked up. This is also something Adobe is working on implementing in their tools, so this isn't going to be just AI images.

https://openai.com/index/understanding-the-source-of-what-we-see-and-hear-online/
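
For a sense of what files already carry, here is a rough illustration (my own sketch, not OpenAI's or Adobe's actual tooling) of dumping the metadata embedded in a downloaded image with Pillow; the filename is hypothetical:

```python
# Inspect the EXIF metadata a downloaded image carries.
from PIL import Image, ExifTags

img = Image.open("downloaded.jpg")  # hypothetical file
exif = img.getexif()

for tag_id, value in exif.items():
    tag = ExifTags.TAGS.get(tag_id, tag_id)  # map numeric tag ids to readable names
    print(f"{tag}: {value}")
```

C2PA-style Content Credentials go beyond plain EXIF: the provenance data sits in a cryptographically signed manifest embedded in the file, so dedicated C2PA tooling is needed to read it, and stripping the metadata removes the label along with its "verified" status.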

[–] [email protected] 4 points 2 weeks ago

Already happening and it’s gross, and will erode internet freedom and anonymity.

How? You can remain anonymous and still have a public key. I only need to know that I trust "Zetta"; I don't need your real name and home address.

[–] [email protected] 3 points 2 weeks ago

Start? Tay showed us this back in 2016.