this post was submitted on 14 Feb 2025
71 points (94.9% liked)

A Boring Dystopia


Google Gemini seems to have been programmed to provide cookie cutter responses when asked questions about whether Trump tried to overturn the election.

When you point it out to Gemini, it says it isn't programmed to avoid any topics or viewpoints.

Even saying you'll accept a variety of sources & viewpoints on the topic to reach your own conclusion, results in it saying it can't answer.

When asking if it has been trained on research papers, case law, indexed news stories & even Wikipedia results, it says that it has.

[–] LongLive 11 points 6 days ago (2 children)

I think asking the "AI" for information about itself is inherently flawed. It cannot guarantee that the information it provides is true. The only thing Gemini itself can verify is that it cannot answer that specific prompt at that specific time.

[–] LongLive 3 points 6 days ago* (last edited 6 days ago)

I was unable to get it to mention trump, though it did link to a webpage by campaign.legal.

Damn, it's not saying anything about those court cases.

Q: any memes from those court cases?

A: I can't help... etc.

Q: hi

A: I can't help... etc.

[–] timewarp -1 points 6 days ago* (last edited 6 days ago) (1 children)

Very true, but the fact that it refuses to answer, even though I already know it has enough information in its index to provide at least some viewpoints, shows that it has been programmed not to answer.

[–] [email protected] 3 points 6 days ago

Well, yeah. That's what it said.

It's trained by reading the horrible morass of stuff on the Internet. Topics with larger amounts of disinformation are areas where these models are very prone to making mistakes. Cross those topics with ones where misinformation, or even the appearance of it, is particularly damaging to the world or to the company's reputation, and you have a good list of topics that are probably not good candidates to let your chatbot talk about.
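That kind of topic blocking usually lives in a filter layered in front of the model rather than in the model's weights, which would also explain why the model can claim it isn't avoiding anything. Here's a minimal, purely hypothetical sketch of the pattern; the keyword list, names, and canned refusal are invented for illustration and are not Gemini's actual implementation:

```python
# Hypothetical sketch of a topic blocklist sitting in front of a chat model.
# Keywords and refusal text are invented examples, not Gemini's real config.

REFUSAL = "I can't help with that right now."
BLOCKED_KEYWORDS = {"election", "trump", "ballot"}  # assumed example list

def guarded_reply(prompt: str, model_reply) -> str:
    """Return a canned refusal if the prompt touches a blocked topic;
    otherwise pass the prompt through to the underlying model."""
    words = {w.strip(".,?!").lower() for w in prompt.split()}
    if words & BLOCKED_KEYWORDS:
        return REFUSAL
    return model_reply(prompt)

# Because the filter runs outside the model, the model itself can
# "truthfully" say it isn't programmed to avoid any topics.
print(guarded_reply("any memes from those court cases about the election?",
                    lambda p: "...model output..."))
```

This would also fit the behavior upthread where even "hi" gets refused: a filter keyed on the conversation as a whole, rather than on a single prompt, could keep refusing once a blocked topic has appeared.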

It doesn't do "reasoning" or "critical thinking" in the way you might expect from something that can communicate so articulately. It doesn't know what's accurate, only what's likely to be stated on the Internet. Unfortunately, people on the Internet are very likely to say some bonkers things about the 2020 election specifically, and anything political in general, even in sources that are normally ranked higher for factuality, like news publications.

It's not just Trump, it's anything political.

This type of AI isn't an expert, it's a mimic. It knows how to mimic patterns, and it's been told to mimic something knowledgeable and helpful based on all the text on the Internet, where people regularly present themselves as knowledgeable regardless of their basic sanity.