this post was submitted on 10 Jul 2023
354 points (91.7% liked)

Which of the following sounds more reasonable?

  • I shouldn't have to pay for the content that I use to tune my LLM model and algorithm.

  • We shouldn't have to pay for the content we use to train and teach an AI.

By calling it AI, the corporations are able to advocate for a position that's blatantly pro corporate and anti writer/artist, and trick people into supporting it under the guise of a technological development.

[–] [email protected] 15 points 1 year ago

In fairness, AI is a buzzword that's been around far longer than LLMs. It's used to mean "tHe cOmpUtER cAn tHink!". We play against "AI" in games all the time, but those aren't AI as we understand the term today.

ML (machine learning) is a more accurate descriptor, but it doesn't have the same pizzazz that AI does.

The larger issue is that innovation is sometimes done for innovation's sake. Profit gets mixed up in that: a board has to show returns to shareholders, and then you get VCs trying to "productize" and monetize everything.

What's more, there are only a handful of players in the AI space, but because they sell API access to other companies, those companies are building more and more sketchy uses of that tech.

It wouldn't be a huge deal if LLMs trained on copyrighted material and then gave the service away for free. As it stands, some LLMs are churning out work that, had a human made it, could be protected under copyright law (AI output can't be copyrighted under US law), and they're turning a profit on it.

I don't think "it was AI" will hold up in court though. May need to do some more innovation.

Also, there are some LLMs being trained only on public domain material to avoid copyright problems. But works don't enter the public domain until 70 years after the copyright holder's death (Disney being the biggest extender of that rule), so your AI will be a tad outdated in its "knowledge".