this post was submitted on 21 Jan 2024
138 points (94.8% liked)
Asklemmy
you are viewing a single comment's thread
view the rest of the comments
You know half the shit ChatGPT says isn't true, right?
I have not found that to be the case at all. While not perfect, it is miles above Google Search and has no more errors than the misinformation any search will yield. It is a significant business advantage as well, and those who are not embracing it are missing out.
Of all things, I think businesses should be at a disadvantage. Business caused millions of people to starve to death in Bengal.
How so? What size of business? I am a business of just 1 person.
I've found Bing AI is quite good if you ask for the source after anything it spits out.
These models can invent a source. Their only incentive is to have a convincing conversation with you. They are unconcerned with the truth.
What I mean is I use it to get the links to those sources, like when you use Wikipedia as a jumping-off point. I don't think we're at the point yet where we have the problem Wikipedia sometimes has, where the cited sources themselves just cite Wikipedia.
The links on Wikipedia are actual citations to real sources. An LLM basically just generates something that looks like a link to a credible source that might support what it's said. It doesn't care whether its "source" actually supports what it says.
I read an interesting article a few years ago about the Wikipedia source problem. It did a dive into how sources that seem legitimate on Wikipedia can end up citing sources that are less so. They were able to trace the citations back to Wikipedia itself. So no, they're not always real sources.
Which is why you read the page it has linked for you as a source. Unless you're trying to say it full-on generates a page for you.
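For what it's worth, the mechanical half of that check is easy to script. Here's a minimal sketch in Python (the requests library usage and the link_resolves helper are my own illustration, not anything the model gives you):

```python
import requests

def link_resolves(url: str, timeout: float = 10.0) -> bool:
    """Return True if the URL answers with a non-error HTTP status.

    Hypothetical helper for illustration: a link that resolves is
    necessary but not sufficient. You still have to read the page
    to see whether it actually supports the model's claim.
    """
    try:
        resp = requests.get(url, timeout=timeout, allow_redirects=True)
        return resp.status_code < 400
    except requests.RequestException:
        # DNS failure, timeout, refused connection, malformed URL, etc.
        return False

# Example: check a real page mentioned upthread.
url = "https://en.wikipedia.org/wiki/Bengal_famine_of_1943"
print(url, "->", "resolves" if link_resolves(url) else "dead or invented")
```

A dead or 404ing link is the cheap tell for an invented citation; a live link that doesn't actually say what the model claims will still get past this check.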
It's okay for things that are pretty low-stakes. If you ask for cooking or cleaning advice and it hallucinates, you're still at square zero regardless.
Unless it tells you to mix bleach and ammonia