this post was submitted on 11 Feb 2025

Hacker News


Posts from the RSS Feed of HackerNews.

The feed sometimes contains ads and posts that have been removed by the mod team at HN.

top 5 comments
[email protected] 8 points 2 weeks ago

Because they were trained on reddit...

[email protected] 6 points 2 weeks ago

They don't know that they don't know.

pixxelkick 4 points 2 weeks ago

Think about how often you see a question raised and then answered with "I don't know" in internet data.

Wikipedia, textbooks, actual books, forums, wikiHow...

The vast majority of the time, the format you see is:

<short question>?

<long-winded answer or how-to guide>

And that's what the LLMs are trained on; that's the most common pattern, so LLMs replicate that pattern.

The majority of their training data is from wikis and textbooks, which pretty much never say "I dunno" anywhere. If an answer isn't known, the page simply doesn't exist in the first place.
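
A minimal sketch of that point, using a hypothetical three-entry toy corpus rather than any real model's training data: if "I don't know" never follows a question in the data, a frequency-based predictor has nothing to count for it.

```python
# Toy illustration only: a hypothetical question -> answer corpus in which
# "I don't know" never appears as a continuation.
from collections import Counter

corpus = [
    ("How do I boil an egg?", "Place the egg in boiling water for about ten minutes."),
    ("What causes rainbows?", "Sunlight refracts and reflects inside water droplets."),
    ("How do I change a tire?", "Loosen the lug nuts, jack up the car, and swap the wheel."),
]

# Count which word opens the answer after each question in the "training data".
first_words = Counter(answer.split()[0] for _question, answer in corpus)

print(first_words)
# Counter({'Place': 1, 'Sunlight': 1, 'Loosen': 1})
# "I" (as in "I don't know") has a count of zero, so a predictor fit to this
# data assigns it no probability -- it always launches straight into an answer.
```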

theunknownmuncher 4 points 2 weeks ago

This just demonstrates a fundamentally flawed understanding of LLMs... they don't know anything; they are generating the text that is statistically likely to follow. They will still generate whatever is most statistically likely for things they "don't know".
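
A minimal sketch of that behaviour, with invented logits standing in for a real model's output: greedy decoding just picks the highest-probability next token, and there is no separate "do I actually know this?" check.

```python
import math

def softmax(logits):
    """Turn raw scores into a probability distribution."""
    exps = [math.exp(x) for x in logits]
    total = sum(exps)
    return [e / total for e in exps]

# Hypothetical scores for candidate opening tokens of a reply to a question
# the model has no real information about (values invented for illustration).
candidates = ["The", "According", "Sorry", "I"]
logits = [3.1, 2.4, 0.9, 0.7]  # confident-sounding openers score highest

probs = softmax(logits)
best = candidates[probs.index(max(probs))]
print(best)  # "The" -- generation proceeds with a fluent, confident answer either way
```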

[email protected] 3 points 2 weeks ago

I loved the confident response I got after a few corrections.

"Thank you for your patience and assistance in correcting my mistakes, I will finally provide you with the correct response to your question..." Continues with a completely incorrect response lol