And some of those citations and quotes will be completely false and randomly generated, but they will sound very believable, so you can't tell truth from random fiction until you check every single one of them. At which point you should ask yourself why you added the unnecessary step of burning a small portion of the rainforest to ask a random word generator for stuff, when you could have skipped it and looked for sources directly, saving all that time and energy.
I guess it depends on your models and toolchain. I don't have this issue, though I have definitely seen it in the past with smaller models, no tools, and legal code.
You do have this issue; you can't not have this issue. Your LLM, no matter how big the model is or how much tooling you use, has no criterion for truth. The fact that you've made this invisible to yourself is worse, so much worse.
If I put text into a box and out comes something useful, I couldn't give a shit whether it has a criterion for truth. LLMs are a tool, like a mannequin: you can put clothes on it without thinking it's a person, but you don't seem to understand that.
I work in IT. I can write a bash script to set up a server, then pivot to an LLM and ask for a Dockerfile that does the same thing, and it gets me very close. Sure, I need to read it over and make changes, but that's just how it works in the tech world. You take something someone else wrote, read it over, and adapt it to your use case. Sometimes you find that real people make really stupid mistakes, sometimes college-educated people write trash software, and that's a waste of time to read and adapt too... much like working with an LLM. No matter what you're doing, buddy, you still have to use your brain.
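To make that concrete, here's a rough sketch of what I mean. The nginx example, the `site.conf` file, and the image tag are just placeholders, not my actual setup. First the hand-written version:

```bash
#!/usr/bin/env bash
# Hand-written setup: install nginx on a Debian/Ubuntu box and drop in
# a site config. Assumes root and an existing site.conf next to the script.
set -euo pipefail

apt-get update
apt-get install -y nginx
cp ./site.conf /etc/nginx/conf.d/site.conf
systemctl enable --now nginx
```

And roughly what an LLM hands back when you ask for a Dockerfile that does the same thing:

```dockerfile
# Close, but still needs a read-over: check the base image tag, the
# config path, and whether port 80 is really what you serve on.
FROM nginx:stable
COPY site.conf /etc/nginx/conf.d/site.conf
EXPOSE 80
```

Note the official image already runs nginx in the foreground, so the systemctl line has no Dockerfile equivalent. That's exactly the kind of detail you catch when you read the generated file over.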