this post was submitted on 28 Sep 2023
318 points (98.5% liked)
Technology
People and the internet are no different: the vast majority of information out there is incomplete or wrong at some level. Skepticism is always required, but judging any medium by its actual performance, without preconceived bias, is the only intelligent approach that can keep pace with improving technology. Very few people are running the larger models (65B parameters or more) in an environment where they fully control every aspect of the LLM. I have such a setup running offline on my own hardware. On its own, my system is ~95% accurate on the tasks I use it for, and it is more accurate at those tasks than the results I find when searching the internet.
There are already open-source offline models specifically designed to work over scientific whitepaper archives, where every result cites the source paper from the model's database.
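Roughly, citation-grounded retrieval works like this: every passage the system returns carries the identifier of the document it came from, so nothing is asserted without a source. Here's a minimal toy sketch of that idea; the archive entries and arXiv-style IDs below are invented placeholders, and a real system would use embeddings rather than keyword overlap:

```python
# Toy citation-grounded retrieval over a local "paper archive".
# Every hit is returned together with the ID of the paper it came from.
# Archive contents and IDs are made-up placeholders.

PAPERS = {
    "arXiv:0000.00001": "transformers use self-attention to weigh tokens",
    "arXiv:0000.00002": "mixture-of-experts routes tokens to sub-networks",
}

def retrieve(query: str):
    """Return (passage, paper_id) pairs whose text overlaps the query,
    best match first. A real system would use vector similarity."""
    terms = set(query.lower().split())
    hits = []
    for paper_id, text in PAPERS.items():
        overlap = terms & set(text.split())
        if overlap:
            hits.append((text, paper_id, len(overlap)))
    hits.sort(key=lambda h: h[2], reverse=True)
    return [(text, pid) for text, pid, _ in hits]

for passage, source in retrieve("how does self-attention work in transformers"):
    print(f"{passage}  [source: {source}]")
```

The point is structural: because the source ID travels with the passage through the whole pipeline, the final answer can always cite where it came from.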
Agents are a class of AI built as a multi-model system: a routing model forwards the prompt to more specialized models, or to a chain of models equipped to check and verify the response, cite sources, or validate the answer against a database.
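To make that concrete, here is a minimal toy sketch of the router → specialist → verifier flow. Everything in it is a stand-in: the "models" are plain functions, the routing is keyword-based, and the knowledge database is a hard-coded dict, so it only illustrates the shape of the pipeline, not a real LLM setup:

```python
# Toy agent pipeline: a router picks a specialist model for the prompt,
# then a verifier checks the specialist's answer against a local database
# and attaches a source citation. All components are stand-ins.

# Tiny local "database" the verifier cites against (made-up entry).
KNOWLEDGE = {
    "boiling point of water": ("100 C at 1 atm", "physics_db#42"),
}

def router(prompt: str) -> str:
    """Choose a specialist (keyword stand-in for a routing model)."""
    return "science" if "water" in prompt else "general"

def science_model(prompt: str) -> str:
    # Stand-in for a specialized model's raw answer.
    return "100 C at 1 atm"

def general_model(prompt: str) -> str:
    return "I am not sure."

def verifier(prompt: str, answer: str) -> dict:
    """Check the answer against the database and cite the source."""
    for topic, (fact, source) in KNOWLEDGE.items():
        if topic in prompt:
            return {"answer": answer,
                    "verified": answer == fact,
                    "source": source}
    return {"answer": answer, "verified": False, "source": None}

def agent(prompt: str) -> dict:
    specialist = science_model if router(prompt) == "science" else general_model
    return verifier(prompt, specialist(prompt))

print(agent("What is the boiling point of water?"))
```

In a real agent the router, specialists, and verifier would each be model calls (often with tool access), but the control flow is the same: no answer reaches the user without passing through the verification step.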