this post was submitted on 10 Jul 2023
354 points (91.7% liked)

Technology


Which of the following sounds more reasonable?

  • I shouldn't have to pay for the content that I use to tune my LLM model and algorithm.

  • We shouldn't have to pay for the content we use to train and teach an AI.

By calling it AI, the corporations are able to advocate for a position that's blatantly pro-corporate and anti-writer/artist, and trick people into supporting it under the guise of technological development.

(page 2) 39 comments
[–] [email protected] 2 points 1 year ago

Aren’t these sentences exactly the same?

[–] [email protected] 2 points 1 year ago

We shouldn’t have to pay for the content we use to train and teach an AI.

If you replace "AI" with "person," it's not true, so why would it be true for AI?

[–] Greenskye 1 points 1 year ago (2 children)

We shouldn’t have to pay for the content we use to train and teach an AI

Wait, people think that sounds reasonable?

[–] DrQuint 1 points 1 year ago

If we're unmasking tech, LLMs right now are also just computer vision models with many more abstraction layers thrown at them. Nothing but fit-assessment machines with a ludicrous number of extra steps.

I am convinced this is all pedantry, and these models are going to become the de facto basis for true AI at some point. It's already weird enough that this type of tech was discovered while chasing the goal of checking whether an image has a cat in it or not.

[–] [email protected] 1 points 1 year ago (1 children)

What is meant by the term "AI" has definitely shifted over time. What I would have considered an AI back then is nowadays referred to as "AGI". So they simply changed the language. LLMs are not really capable of "intelligence"; they are just automated statistics. On the other hand, what really is intelligence? The output does appear intelligent. Maybe in the end it does not matter how it is generated.
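
To make the "automated statistics" point concrete, here is a minimal toy sketch (Python, purely illustrative; the corpus and function name are made up, and real LLMs learn neural-network weights over tokens rather than raw bigram counts). The core idea: predict the next word by looking at what most often came next in the training data.

```python
from collections import Counter, defaultdict

# Hypothetical toy "training data" -- just a handful of words for illustration.
corpus = "the cat sat on the mat the cat ate the fish".split()

# Automated statistics: count how often each word follows each other word.
following = defaultdict(Counter)
for prev, nxt in zip(corpus, corpus[1:]):
    following[prev][nxt] += 1

def predict_next(word):
    """Return the word most frequently observed after `word` in the corpus."""
    counts = following[word]
    return counts.most_common(1)[0][0] if counts else None

print(predict_next("the"))  # -> "cat" ("cat" follows "the" twice; "mat" and "fish" once each)
```

Scale that same "what usually comes next" idea up to billions of learned parameters and a long context window instead of a single preceding word, and the output starts to read as intelligent, which is exactly the question the comment raises.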

[–] [email protected] 1 points 1 year ago

There's a great Wikipedia article which talks about it. Basically, AI has always been used as a fluid term to describe forms of machine decision making. A lot of the time it's used as a marketing term (except when it's not, like during the AI Winter). I definitely think that a lot of the talk about regulation around "AI" is essentially trying to wall off advanced LLMs for the companies who can afford the regulatory paperwork, while making sure those pushing for regulation now stay ahead. However, I'm not so sure calling something AI vs. an LLM will make any difference when it comes to actual intellectual property litigation, given how the legal system operates.

[–] [email protected] 0 points 1 year ago

It’s just a happy coincidence for them; they call it AI because calling it “a search engine that steals stuff instead of linking to it and blends different sources together to look smarter” wouldn’t be as interesting to clueless financial-markets people.
