this post was submitted on 18 Jun 2024
88 points (97.8% liked)

TechTakes


I followed these steps, but just so happened to check on my mason jar 3-4 days in and saw tiny carbonation bubbles rapidly rising throughout.

I thought that might just be part of the process, but double-checked with a Google search on day 7 (when there were no bubbles in the container at all).

Turns out I had just grown a botulism culture, and garlic in olive oil specifically is a fairly common way to grow this biotoxin.

Had I not checked on it 3-4 days in, I'd have been none the wiser and would have Darwinned my entire family.

Prompt with care and never trust AI, dear people...

[–] [email protected] 17 points 5 months ago (1 children)

Those are examples of actual hallucinations, where the model asserts something that did not happen.

Quoting a joke Reddit thread as factual is not hallucinating. There was such a thread, but it wasn't factual, and an LLM is wrong to present it as factual.

[–] SzethFriendOfNimi -2 points 5 months ago (2 children)

That’s the issue. LLMs aren’t trustworthy. They hallucinate.

I presume, as the default, that anything an LLM produces is a hallucination right out of the gate.

[–] [email protected] 13 points 5 months ago

"Hallucination" implies LLMs can meaningfully perceive. They can't, they're not made that way and they have no reason to be.

[–] [email protected] 12 points 5 months ago

We’re arguing about language now, though, and by definition it isn’t “hallucinating”. By saying that’s what’s happening, you’re unintentionally legitimizing the “AI is making decisions” misinformation.

To get really pedantic, “flashback” would be a better label. It’s not making things up out of whole cloth, just repeating stuff way out of context.