this post was submitted on 10 Aug 2023
Technology
tl;dr: you're right, but I'm more concerned about the media and lay people misusing the term, and it also just annoys me personally.
I'm sorry, I should have been clearer, and I do genuinely appreciate your comment; it's very useful information. I'm referring to lay people using the term 'AI'. The institutions and individuals you mention, who I agree are the most trusted on the topic, use the term because it's becoming the broadly recognized one. I have no problem with them using it, because they're aware of the context they use it in.

My problem is with people who aren't versed in the topic using it, and that's where I agree with Connover. When the media, and more broadly the general public, use the term 'AI', they usually treat it as a big spooky thing coming for everyone's jobs, and especially on the Internet the term is used to perpetuate grifts. I don't wish to argue either; I just wanted to append this to clear up what I meant.

And don't get me wrong, I'm no expert on the topic, as much as I made it sound like that. I'm just tired of the media and the actual grifters misusing the term without understanding its connotations. 'Artificial Intelligence' sounds like something that can replace a human mind, and to some extent a lot of generative LLMs can do that, but they aren't intelligent; they're just large algorithmic guessing machines, and calling them 'AI' feels misleading to me in that sense. I know this comes down to personal opinion in the end, at least on my part, but you're right, and I'm just tired of hearing people talk about AI like it's some existential threat as it is now.
Wow that ran on, sorry if you read the whole thing lol. Also you don't gotta reply, I just wanted to clarify what I really meant to say.
Also ETA: sorry I sounded so snarky, I'm just so tired of this whole topic
replying despite your warning. i also won't be offended if you don't read. and the frustration is fair.
TLDR: intelligence is weird, complex, and abstract. it is very difficult for us to comprehend the complex nature of intelligence alien to our own. the human mind is a very specific combination of different intelligent functions.
funny you mention the technology not being an existential threat, as the two researchers i'd mentioned were recently paired at the Munk debate, arguing against the "existential threat" narrative.
getting into the deep end of the topic, i think most with a decent understanding of it would agree it is a form of "intelligence" alien to what most people would understand.
technically a calculator can be seen as a very basic computational intelligence, though one very limited in capability or purpose outside of a greater system. LLMs mirror the stochastic word-generation element of our intelligence, along with a lot of weird, neat, amazing things that come with the particular type of intelligent system we've created, but they definitely lack much of what would be needed to mirror our own brand of intelligence. they're so alien in function, yet so capable at representing information we're used to, that it's almost impossible not to anthropomorphise them.
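(as a side note, the "stochastic word generation" idea can be sketched in a few lines. this is a toy illustration only, with a made-up vocabulary and made-up scores, not how any real model is implemented: given scores for candidate next words, convert them to probabilities and sample one at random, with a temperature knob controlling how surprising the choice is.)

```python
import math
import random

def softmax(logits):
    # Convert raw scores into a probability distribution.
    # Subtracting the max keeps exp() numerically stable.
    m = max(logits)
    exps = [math.exp(x - m) for x in logits]
    total = sum(exps)
    return [e / total for e in exps]

def sample_next_word(candidates, logits, temperature=1.0, rng=random):
    # Lower temperature sharpens the distribution (more predictable words);
    # higher temperature flattens it (more surprising words).
    probs = softmax([x / temperature for x in logits])
    return rng.choices(candidates, weights=probs, k=1)[0]

# Hypothetical candidates and scores, purely for illustration.
vocab = ["cat", "dog", "algorithm"]
scores = [2.0, 1.0, 0.1]
word = sample_next_word(vocab, scores, temperature=0.7)
assert word in vocab
```

the point being: each word is a weighted guess, which is why "large algorithmic guessing machine" is a fair description even when the output reads as fluent.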
i'm currently excited by the work being done in understanding our own intelligence as well.

but how would you represent a function as complex and abstract as this in a system like GPT? if qualia is an emergent experience developed through evolution, reliant on the particular structure and makeup of our brains, you would need more than the aforementioned system at any level of compute. while i don't think the function would be impossible to emulate in principle, i don't think it'd come about by upscaling GPT models. we will develop other facsimiles, more aligned with the specific intentions we have for the tools these intelligences are designed and directed to be. i think we can sculpt some useful forms of intelligence out of upscaled and altered generative models, although yann lecun might disagree. either way, there's still a fair way to go, and a lot of really neat developments to expect in the near future. (we just have to make sure the gains aren't hoarded like every other technological gain of the past half century)
Here is an alternative Piped link(s): https://piped.video/Ak5DdazBOow