this post was submitted on 13 May 2024
34 points (72.4% liked)
I've had this argument with friends a lot recently.
Them: It's so cool that I can just ask ChatGPT to summarise something and get a concise answer, rather than doing a lot of googling for the same thing.
Me: But it gets things wrong all the time.
Them: Oh, I know, so I Google it anyway.
Doesn't make sense to me.
This is why I now do a lot of my internet searches with perplexity.ai. It tells me exactly what it searched to get the answer, and it provides inline citations as well as a list of its sources at the end. I've never used it for anything in-depth, but in my experience the answer it gives is typically consistent with the sources it cites.
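For anyone curious what "inline citations plus a source list" actually looks like as a pattern, here's a toy sketch. This is not Perplexity's actual API or implementation, just an illustration of the idea: each claim carries a `[n]` marker that maps to a numbered entry in a sources footer, so you can check any sentence against where it came from.

```python
def cite(claims):
    """Format (sentence, source_url) pairs into an answer with
    inline [n] citation markers and a numbered source list.
    Purely illustrative; the pairs would come from whatever
    retrieval step the tool ran behind the scenes."""
    sources = []  # unique URLs, in order of first use
    parts = []
    for sentence, url in claims:
        if url not in sources:
            sources.append(url)
        n = sources.index(url) + 1  # citation number for this URL
        parts.append(f"{sentence} [{n}]")
    body = " ".join(parts)
    footer = "\n".join(f"[{i}] {u}" for i, u in enumerate(sources, 1))
    return f"{body}\n\nSources:\n{footer}"

print(cite([
    ("Lemmy is a federated link aggregator.", "https://join-lemmy.org"),
    ("It communicates over ActivityPub.", "https://www.w3.org/TR/activitypub/"),
    ("Instances can federate with each other.", "https://join-lemmy.org"),
]))
```

The useful property is that repeated sources get the same number, so the reader can spot when an answer leans on a single page for everything.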