this post was submitted on 23 May 2024
1079 points (98.2% liked)

Technology

[–] pufferfisherpowder 14 points 1 month ago* (last edited 1 month ago) (2 children)

Google has a deal with Reddit as well. https://www.reuters.com/technology/reddit-ai-content-licensing-deal-with-google-sources-say-2024-02-22/?utm_source=reddit.com

But I don't think it's just an issue with the dataset. It's the false promise of these LLMs having a fucking clue what a good search result is and what is not. They don't. They are just good at creating text that sounds plausible. That's not what searching for factually correct information is about though.

[–] Twofacetony 1 points 1 month ago

Thank you for the link. I didn’t realise that Google had a deal with Reddit as well, which explains why it was clearly indexed from Reddit.

I agree that AI doesn’t have a clue what an accurate response is. It’s just not sentient enough to differentiate between shitposting and fact. I also totally agree that an answer given as a search result HAS to be accurate, and we’re heading down the path to a misinformation superhighway if LLMs are trained on incorrect data.

[–] [email protected] 1 points 1 month ago

Yep. LLMs are great for bouncing ideas off, and for getting "soft answers", but no one should ever be looking to them for factual answers.