Been liking Alex O'Connor's ChatGPT explanation videos and ChatGPT-related experiments.

Alex O'Connor makes content related to philosophy and religion, but I particularly enjoyed, in addition to this video, one where he gaslights ChatGPT using moral dilemmas.

In this video he explains why it is so hard to get ChatGPT to generate an image of a wine glass filled to the brim. Short answer: most images of wine you find show glasses that are either empty or only partially full, because who fills their wine to the top?

[–] billwashere 3 points 1 day ago (1 children)

And this is a prime example of why these trained models will never be AGIs. A model only knows what it's been trained on and can't make inferences or extrapolations. It's not really generating an image so much as very quickly photoshopping and merging images it already knows about.

[–] HoneyMustardGas 1 points 1 day ago

It's just patterns of pixels. It recognizes an apple as just a bunch of reddish pixels, etc.; then, when given an image of a similarly colored red ball or something, it is corrected until it stops recognizing things that aren't apples as apples. It really does not know what an apple looks like to begin with. It's like declaring a variable: the computer does not know what the variable really means, just what to equate it to.
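
A minimal sketch of that "corrected until it stops calling the ball an apple" idea (my own toy illustration, not anything from the video or from how ChatGPT actually works; the names and numbers are made up): the classifier below stores nothing but pixel statistics paired with labels, so "apple" is just a tag attached to a shade of red.

```python
# Toy illustration only: a "classifier" that knows an apple purely as an
# average colour plus a label, and gets "corrected" by adding a counterexample.

def mean_rgb(pixels):
    """Average colour of an image given as a list of (r, g, b) tuples."""
    n = len(pixels)
    return tuple(sum(p[i] for p in pixels) / n for i in range(3))

class ToyClassifier:
    def __init__(self):
        self.examples = []  # (mean_rgb, label) pairs -- all it "knows"

    def train(self, pixels, label):
        self.examples.append((mean_rgb(pixels), label))

    def predict(self, pixels):
        # Nearest stored example by colour distance; no concept of "apple".
        target = mean_rgb(pixels)
        def dist(example):
            return sum((a - b) ** 2 for a, b in zip(example[0], target))
        return min(self.examples, key=dist)[1]

# Fake "images": a reddish apple and a reddish ball in a slightly different shade.
apple_img = [(200, 30, 30)] * 100
ball_img  = [(220, 40, 60)] * 100

clf = ToyClassifier()
clf.train(apple_img, "apple")
print(clf.predict(ball_img))      # "apple" -- it only sees reddish pixels

clf.train(ball_img, "not apple")  # the correction step described above
print(clf.predict(ball_img))      # now "not apple"
```

The label is just a value the program equates the pixel statistics to, exactly like assigning a variable; real models are vastly bigger and learn richer features, but the labels they output carry no more inherent meaning than these strings do.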