this post was submitted on 20 Sep 2023
556 points (95.6% liked)

[–] AbouBenAdhem 54 points 1 year ago (56 children)

The authors added that OpenAI’s LLMs could result in derivative work “that is based on, mimics, summarizes, or paraphrases” their books, which could harm their market.

Ok, so why not wait until those hypothetical violations occur and then sue?

[–] [email protected] -4 points 1 year ago (17 children)

Because that is far harder to prove than showing OpenAI used his IP without permission.

In my opinion, training a generative model on data without the permission of the rights holder should not be allowed. So at the very least, OpenAI should publish (references to) the training data they have used so far, and future models should probably be restricted to public-domain and opt-in works.

[–] [email protected] 6 points 1 year ago* (last edited 1 year ago) (4 children)

I don't see why they (authors/copyright holders) have any right to prevent use of their product beyond purchasing. If I legally own a copy of Game of Thrones, I should be able to do whatever the crap I want with it.

And basically, I can. I can quote parts of it, I can give it to a friend to read, I can rip out a page and tape it to the wall, I can teach my kid how to read with it.

Why should I not be allowed to train my AI with it? Why do you think it's unethical?

[–] Anonymousllama 3 points 1 year ago

Next, if you come up with some ideas for your own fantasy setting after watching Game of Thrones, they'll want to chase you down because they didn't give you express permission to be "inspired" by their work 🙄

[–] gmtom 2 points 1 year ago

It's amazing how many people are against overly restrictive copyright rules that hamper creativity... until it involves AI.

[–] [email protected] 1 points 1 year ago* (last edited 1 year ago) (1 children)

And basically, I can. I can quote parts of it, I can give it to a friend to read, I can rip out a page and tape it to the wall, I can teach my kid how to read with it.

These are things you're allowed to do with your copy of the book. But you are not allowed to, for example, create a copy of it and give that to a friend, or turn it into a play or a movie. You don't own the story, you own a copy of it on a specific medium.

As to why it's unethical, see my comment here.

[–] [email protected] 3 points 1 year ago (1 children)

I agree, the ownership is not absolute.

However, just as a person does not own the work of an author, authors do not own words, grammar, sentences, or even their own style. Similarly, they do not own the names of the characters in their books or the universe in which the plot takes place. They do not even "own" their own name.

So the only remaining question is whether an AI is allowed to "read" a book. In the future authors might prohibit it, but then we'll just end up with a slightly more archaic-sounding GPT over time, because it won't train on new releases. And that's fine by me.

[–] [email protected] 1 points 1 year ago (1 children)

I think that in the end it should be a matter of licensing. The author might give you the right to train a model on their work if you pay them for it, just like you'd have to get permission if you wanted to turn their work into a play or a show.

I don't think the argument (not yours, but often seen in discussions like these) that "humans can be inspired by a work, so a computer should be allowed to be as well" holds any ground. It would take a human much more time to make a style their own, let alone to recreate large amounts of it. For an AI model, the same is a matter of minutes and seconds, respectively. So any comparison is moot, imho.

[–] [email protected] 1 points 1 year ago (1 children)

But the thing is, it's not similar to turning their work into a play or a TV show. You aren't replicating their story at all; they put words in a logical order, and you are using that to teach the AI what the next word could logically be.

As for humans taking much more time to properly mimic a style, of course that's true (assuming they're untrained). But an AI requires far more memory and data to do it. A human can replicate a style from just examples of that style, given time; an AI needs to scrape basically the entire internet (and label it, which takes quite some time) to be able to do so. They may need different things, but it's ridiculous to say they're completely incomparable. Besides, you make it sound like AI is its own entity, rather than something created, trained, and used by humans in the first place.

[–] [email protected] 1 points 1 year ago* (last edited 1 year ago) (1 children)

It's not the same as turning it into a play, but it's doing something with it beyond its intended purpose, specifically with the intention to produce derivatives of it at an enormous scale.

Whether or not a computer needs more or less of it than a human is not a factor, in my opinion. Actually, the fact that more input is required than for a human only makes it worse, since more of the creator's work has to be used without their permission.

Again, the reason why I think it's incomparable is that when a human learns to do this, the damage is relatively limited. Even the best writer can only produce so many pages per day. But when a model learns to do it, the ability to apply it is effectively unlimited. The scale of the infraction is so exponentially more extreme, that I don't think it's reasonable to compare them.

Lastly, if I made it sound like that, I apologise; that was not my intention. I don't think it's the model's fault, but the fault of the people who decided (directly, or indirectly by not vetting their input data) to take somebody's copyrighted work and train an LLM on it.

[–] [email protected] 1 points 1 year ago

I don't think the potential difference in how much damage can be caused is a reasonable argument. After all, the economic damage to writers from others copying or plagiarizing their work, style, or world is limited not because it's hard for humans to do, but because we made it illegal to produce something too similar to another person's copyrighted work.

For example, Harry Potter has absolutely been copied to the extent legally allowed, but no one cares about any of those books because they're not so similar that they affect the sales of Harry Potter at all. And that's also true for AI. It doesn't matter how closely it can replicate someone's style or story if that replication can never be used or sold due to copyright infringement, which is already the case right now. Sure, you can use it to generate thousands of books that are just different enough not to get struck down, but that wouldn't affect the original book at all.

Now, to be fair, with art you can be more similar to others art, because of how art works. But also, to be fair, the art market was never about how good an artist was, it was about how expensive the rich people who bought your art wanted it to be for tax purposes. And I doubt AI art is valuable for that.

[–] [email protected] 0 points 1 year ago

Ownership is never absolute. Just like with music: you are not allowed to use it commercially, e.g. in your restaurant, club, beauty salon, etc., without paying extra. The same goes for books: for example, you shouldn't share scans online, even though it's "your" book.

However, it is not clear how AI infringes on the rights of authors in this case, because a human may legally read a book and then produce a similar book in the same style.
