I mean, this isn't miles away from what the writers' strike is about. Certainly I think the technology is great, but after the last round of tech companies turning out to be fuckwits (Facebook, Google, etc.) it's only natural that people are going to want to make sure this stuff is properly regulated and run fairly (not at the expense of human creatives).
As it stands now, I actually think it is miles away.
Studios were raking in huge profits from digital residuals that weren't being passed on to creatives, but AI models aren't currently paying copyright holders anything. If they suddenly did start paying publishers for use, it would almost certainly exclude the actual creatives from payment.
I'd also point out that LLMs aren't like digital distribution models, because LLMs aren't distributing copyrighted works. At best you can say they're distributing a very lossy (to borrow someone else's term) compressed alternative that would have to be pieced back together manually if you really wanted to extract the original.
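To make the "lossy compression" point concrete, here's a back-of-the-envelope sketch. It uses publicly reported figures for Llama 2 70B (~70B parameters, ~2T training tokens); the 4-bytes-per-token average is my own rough assumption for English text:

```python
# Rough illustration: an LLM's weights are far smaller than its
# training corpus, so it cannot be storing that corpus verbatim.
# Parameter/token counts are reported figures for Llama 2 70B;
# bytes-per-token is an assumed average, for illustration only.

params = 70e9                # ~70B parameters
bytes_per_param = 2          # fp16 weights
weight_bytes = params * bytes_per_param

train_tokens = 2e12          # ~2T training tokens (reported)
bytes_per_token = 4          # assumed average for English text
corpus_bytes = train_tokens * bytes_per_token

ratio = corpus_bytes / weight_bytes
print(f"weights:  {weight_bytes / 1e12:.2f} TB")   # ~0.14 TB
print(f"corpus:  ~{corpus_bytes / 1e12:.2f} TB")   # ~8 TB
print(f"corpus is ~{ratio:.0f}x larger than the weights")
```

Even on generous assumptions the training text is tens of times larger than the model itself, so anything "extracted" from it is a reconstruction rather than a stored copy, which is why the lossy framing fits better than the distribution one.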
No argument that AI should be properly regulated, but I don't think copyright is the right framework for doing it.