Ok.
> uses search engine
> search engine gives generative AI answer
God dammit
> scroll down
> click search result
> AI Generated article
> search engine gives generative AI answer
> It cites its source, so it can't be that bad, right?
> click link to source
> It's an AI generated article
Oh no.
AI will give the correct, real source and then still make shit up. Bing linked to Bulbapedia to tell me Wailord was the heaviest Pokemon. Bulbapedia knows it isn't close; bingpt doesn't know shit.
The uncertainty has gripped the world in fear. I go to hug my wife for comfort. She is ~~cake~~Gen AI.
Don’t be ridiculous. It’s more like: the Google search result you click is an ad rather than an organic search result, and that ad… is AI generated… god damnit
The Internet was a great resource for sharing and pooling human knowledge.
Now generative AI has come along to dilute knowledge in a great sea of excrement. Humans have to hunt through the shit to find knowledge.
To be fair, humans were already diluting it in a great sea of excrement; the robots just came to take our jobs and do it even faster and better.
"How to make a pie"
Here's how to make a pie:
Gather ingredients:
Cooking Process:
Don't forget to glue it all together at the end. Real chefs use epoxy.
Google search results are often completely unrelated, so it's not any better. If the thing I'm looking for is obscure, AI often finds some thread that I can follow, but I always double-check that information.
Know your tool's limits; after hundreds of prompts I've learned pretty well when the AI is spitting bullshit answers.
Real people on the internet can be just as wrong and biased, so it's best to find multiple independent sources.
Biggest reason I stopped using Google
When search engines stop being shit, I will.
In general I agree with the sentiment of the article, but I think the broader issue is media literacy. When the Internet came about, people had similar reservations about the quality of information, and most of us learned in school how to find quality information online.
LLMs are a tool, and people need to learn how to use them correctly and responsibly. I’ve been using Perplexity.AI as a search engine for a while now, and I think they’re taking the right approach. It employs LLMs at different stages to parse your query, perform web searches on your behalf, and summarize findings. It provides in-text citations as well, which is an opportunity for a media-literate person to confirm the validity of anything important.
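Out of curiosity, here's roughly what that kind of pipeline looks like in code. This is a minimal sketch, not Perplexity's actual implementation: `web_search` and `call_llm` are made-up stubs standing in for whatever search backend and model API you'd actually wire up. The point is just the shape of it: search first, then summarize against numbered sources the reader can check.

```python
# Minimal sketch of a search-then-summarize pipeline with in-text citations.
# `web_search` and `call_llm` are hypothetical stubs, not any real API.
from dataclasses import dataclass


@dataclass
class Result:
    url: str
    snippet: str


def web_search(query: str) -> list[Result]:
    """Stand-in for a real search backend; returns canned results."""
    return [
        Result("https://example.com/a", "Wailord weighs 398.0 kg."),
        Result("https://example.com/b", "Several Pokemon are heavier than Wailord."),
    ]


def call_llm(prompt: str) -> str:
    """Stand-in for a real model call; returns a canned summary."""
    return "Wailord is heavy, but it is not the heaviest Pokemon. [1][2]"


def answer(query: str) -> str:
    results = web_search(query)
    # Number the sources so the model can cite them in-text as [1], [2], ...
    sources = "\n".join(
        f"[{i}] {r.url}: {r.snippet}" for i, r in enumerate(results, 1)
    )
    prompt = (
        "Answer using ONLY the numbered sources below, citing them as [n].\n"
        f"Sources:\n{sources}\n\nQuestion: {query}"
    )
    summary = call_llm(prompt)
    # Surface the citations alongside the answer so a media-literate reader
    # can actually click through and verify them.
    refs = "\n".join(f"[{i}] {r.url}" for i, r in enumerate(results, 1))
    return f"{summary}\n\n{refs}"


print(answer("Is Wailord the heaviest Pokemon?"))
```

Of course, the citations only help if people actually click them, which is the whole media-literacy problem again.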
Ok, but may I point you to the reality that internet-spread misinformation is a critically bad problem at the moment?
> Where was all this coming from? Well, I don’t know what Stern or Esquire’s source was. But I know Navarro-Cardenas’, because she had a follow-up message for critics: “Take it up with Chat GPT.”
The absolute gall of this woman to blame her own negligence and incompetence on a tool she grossly misused.
Okay, but what else to do with it?
Who else is going to aggregate those recipes for me without making me scroll past ads and personal blog bs?
And what about when the search engines shove it in our faces and try to make it so we HAVE to use it for searches, just to justify their stupid expenses?
This is why Melon and the AI chud brigade are so obsessed with having a chatbot (sorry, “AI”) that always agrees with them: a stupid number of people think LLMs are search engines, or worse, search engines but better, some diviner of truth.
Information is not truth. A do-or-die slogan for the 21st century.
Eh... I got it to find a product that met the specs I was looking for on Amazon when no other search worked. It's certainly a last resort, but it worked. Idk why, but whenever I'm looking to buy anything lately, the only criteria I care about are somehow never documented properly...
It's useful to point you in the right direction, but anything beyond that necessitates more research
I mean, it gave me exactly what I asked for. The only further research was to actually read the item description to verify that, but I could have blindly accepted it and still received what I was looking for.
Obvious problem is obvious.
Garbage in, garbage out.