this post was submitted on 23 Jul 2023
47 points (89.8% liked)
Asklemmy
you are viewing a single comment's thread
For now, I think LLMs are not relevant in their current form, other than for helping to write out ideas.
I think in the future there will be an LLM that uses Google searches to infer specific information. Basically, the assistant that every tech company on this planet pretended to have at some point, but one that actually exists.
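That kind of search-backed assistant is usually described as a retrieval loop: run a web search for the user's question, then feed the results into the model's prompt so its answer is grounded in them. A minimal toy sketch of that flow (the `search` and `llm` functions here are hypothetical stand-ins, not any real search or model API):

```python
def search(query):
    # Stand-in for a real web search API call; returns text snippets.
    return [f"Result snippet about {query}"]

def build_prompt(question, snippets):
    # Ground the model's answer in the retrieved snippets.
    context = "\n".join(snippets)
    return (
        f"Context:\n{context}\n\n"
        f"Question: {question}\n"
        "Answer using only the context above."
    )

def assistant(question, llm):
    # llm is a stand-in for a real model call (prompt in, text out).
    snippets = search(question)
    return llm(build_prompt(question, snippets))

# Toy usage with an "LLM" that just echoes its prompt back:
print(assistant("capital of France", llm=lambda prompt: prompt))
```

The point of the sketch is only the shape of the loop: search first, stuff the results into the prompt, then generate, instead of relying on whatever the model memorized during training.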
Other than that, probably not much. Maybe translation, maybe predicting the truthfulness of information, maybe converting data or writing code, but all of these require a variety of specialized AIs designed specifically for those use cases. I don't see that being a commonly used thing until 2030. Maybe we'll have some of these things by 2025, but I have a feeling they will be merely okay, and not good enough to really substitute for human workers.