this post was submitted on 26 Feb 2024

Microblog Memes

5858 readers
2868 users here now

A place to share screenshots of Microblog posts, whether from Mastodon, tumblr, ~~Twitter~~ X, KBin, Threads or elsewhere.

Created as an evolution of White People Twitter and other tweet-capture subreddits.

Rules:

  1. Please put at least one word relevant to the post in the post title.
  2. Be nice.
  3. No advertising, brand promotion or guerilla marketing.
  4. Posters are encouraged to link to the toot or tweet etc in the description of posts.

Related communities:

founded 1 year ago
MODERATORS
 
you are viewing a single comment's thread
view the rest of the comments
[–] dejected_warp_core 2 points 9 months ago* (last edited 9 months ago)

I agree.

The most succinct way I can make this argument to the layperson is that "AI", as it exists today, is terrifyingly good at mimicry. But that's all it can do. Attributing more to these synthetic neural networks makes about as much sense as saying a parrot understands grammar and syntax because it can perfectly reproduce a few words in the right context, or with the right prompt.
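To make the "parrot" point concrete, here's a deliberately tiny sketch (the corpus and function names are made up for illustration): a bigram model that only replays word transitions it has already seen. It sounds fluent on familiar input and has nothing at all to say about anything outside its training data.

```python
import random
from collections import defaultdict

# Toy bigram "parrot": it learns which word follows which, nothing more.
# The corpus here is invented purely for illustration.
corpus = "polly wants a cracker polly wants a nap".split()

follows = defaultdict(list)
for prev, nxt in zip(corpus, corpus[1:]):
    follows[prev].append(nxt)

def parrot(word, n=4, seed=0):
    """Emit up to n more words by replaying observed transitions."""
    rng = random.Random(seed)
    out = [word]
    for _ in range(n):
        options = follows.get(out[-1])
        if not options:
            break  # never seen this word: the parrot falls silent
        out.append(rng.choice(options))
    return " ".join(out)

print(parrot("polly"))   # fluent-sounding replay of the training data
print(parrot("hello"))   # unseen word: no "understanding" to fall back on
```

Every word it ever emits was already in the corpus; there's no grammar in there, just memorized adjacency.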

From this vantage point, we can clearly see how this technology is severely limited. It can be asked to synthesize new outputs, but that's merely an extrapolation of the input training set. While this isn't all that different from what people can do, and often do, it's not a fully rational intelligence that solves problems outside that framing. For that, one needs a general intelligence, capable of extrapolating meaning from context and generating novel concepts.

Moreover, if you want an AI to generate something, you first need to define the general ballpark for the right answer(s). Data gathering, cleaning, and categorization (tagging) are a big labor problem that feeds into the AI itself. So there are also a lot of real-world problems that don't fit this model for a whole bunch of reasons: not having a working dataset at all, information that doesn't digitize well, or areas too small to properly feed this process in the first place. People function just fine in those spaces, so again, we can see a gap that is not easily closed.
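A minimal sketch of why that pipeline is a labor problem (all the data and names below are hypothetical): every example has to survive gather → clean → tag before training, and the tag step stands in for human annotators here as a lookup table. A model can only ever learn labels someone already wrote down.

```python
# Hypothetical pre-training pipeline: gather -> clean -> tag.
raw_samples = ["  A CAT sat ", "", "a dog ran", "a dog ran", None]

def clean(samples):
    """Drop empty/duplicate records, normalize whitespace and case."""
    seen, out = set(), []
    for s in samples:
        if not s:
            continue
        s = " ".join(s.lower().split())
        if s not in seen:
            seen.add(s)
            out.append(s)
    return out

# In reality this dict is hours of human annotation work;
# anything not in it simply never reaches the model.
human_labels = {"a cat sat": "cat", "a dog ran": "dog"}

def tag(samples):
    return [(s, human_labels[s]) for s in samples if s in human_labels]

dataset = tag(clean(raw_samples))
print(dataset)  # [('a cat sat', 'cat'), ('a dog ran', 'dog')]
```

No dataset, no labels, or a domain too small to fill that dict, and the whole approach stalls before training even starts.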