this post was submitted on 15 Jun 2023

Open Source


I really don't want my photos, writings, etc. to be used for things like Stable Diffusion or ChatGPT, but I still want to release some of them under an open-access license that's free for others to use in conventional derivative works, just specifically not AI training. Does such a license exist?

Or at the very least, if my work is to be used to train AI, then I think the resulting models and neural networks need to be open source and free for anyone to use (as in, people should be able to download and run the AI on their own computer, not be forced to use the company's web app). Does CC-BY-SA protect against this, since it requires that any derivative works be released under the same license? Does it work like the GPL in that regard?

top 16 comments
[–] [email protected] 7 points 1 year ago (1 children)

I have no actual answer, but given the very messy state of AI legality right now, I imagine it could be a while before we're even able to define everything well enough to establish a solid legal framework for this sort of thing.

That said, I'd be happy to be proven wrong - this is definitely an important idea for society moving forward.

[–] [email protected] 0 points 1 year ago (2 children)

I mean, isn't copyright specifically designed (by the big corporations, mind you) to default to disallowing use unless the rights holder explicitly grants permission? So shouldn't the answer to whether any content can be used be a big NO unless the author or distributor specifically allows it?

[–] azuth 2 points 1 year ago (1 children)

There's this thing called fair use.

The use is clearly limited, as you can tell by comparing the size of the training material to the size of the model. I would also argue the use is transformative enough: after all, you go from text/image inputs to, effectively, a tool that can produce text and images.

[–] [email protected] -1 points 1 year ago (1 children)

If I write a story too similar to a Disney movie I will get sued, yet this is okay? Wtf

[–] [email protected] 1 points 1 year ago

I am not a lawyer. This is not legal advice.

I just want to note here that ideas alone (in creative works) are not subject to copyright (see: Wikipedia). (Ideas involving a particular invention can be subject to patents, but a Disney movie is not an invention, so that doesn't apply here.)

So if you watch a movie, become inspired by it, take only the idea of the story, and implement it in your own creation with different characters, you should be fine. However, you cannot use the assets of that movie (original names, costumes, specific attributes of characters, scenery).

At the end of the day, for such things, you may want to consult a lawyer. Case law makes the dos and don'ts much clearer.

[–] [email protected] 1 points 1 year ago

I'm not a copyright lawyer or a regular lawyer or even a well-informed citizen really, so I couldn't really say. Certainly in this case, if a project is under a more permissive license, I imagine the intention could be argued either way as far as AI is concerned.

[–] [email protected] 7 points 1 year ago (2 children)

In theory, a copyleft licence should work. The problem however is a) how are you going to find out and b) how are you going to enforce it?

[–] [email protected] 1 points 1 year ago

So, CC-BY-SA, which would require the AI training database to be copyleft...

[–] [email protected] 1 points 1 year ago* (last edited 1 year ago)

You're going to enforce it the same way you enforce the license on every random GitHub project you make: you don't. Didn't they discover that Windows XP contained GNU code in it or something? I mean, who prevents corporations from stealing random open source code for use in their shitty corporate closed-source projects? It's not like anyone would ever find out.

[–] [email protected] 4 points 1 year ago

I don't think legal protections will be effective at preventing AIs from being trained on your works, because the datasets used to train the models are scraped from art-sharing websites, and it won't be possible to prove that your art was part of the training set, legal or not. A better approach is a tool like Glaze, which modifies your artwork so that it looks the same to a human viewer but introduces errors when fed to an AI model.
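Glaze's actual perturbations are computed by a neural network to disrupt style mimicry specifically, so this is only an illustrative sketch of the underlying idea (the `cloak` function and `epsilon` bound are made up for the example, not Glaze's API): a change bounded tightly enough to be invisible to a human can still alter what a model extracts from the image.

```python
import numpy as np

def cloak(image: np.ndarray, epsilon: float = 2.0) -> np.ndarray:
    """Return a copy of an 8-bit image with a small bounded perturbation.

    Illustrative only: real cloaking tools optimize the perturbation
    against a feature extractor rather than using random noise.
    """
    noise = np.random.uniform(-epsilon, epsilon, size=image.shape)
    return np.clip(image.astype(np.float64) + noise, 0, 255).astype(np.uint8)

img = np.full((64, 64, 3), 128, dtype=np.uint8)  # flat grey test image
cloaked = cloak(img)
# Every pixel moves by at most epsilon, far below what the eye notices,
# yet the pixel values a scraper ingests are no longer the originals.
```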

[–] [email protected] 4 points 1 year ago

You might have more luck asking on https://opensource.stackexchange.com/. I'd certainly hope that somebody using data from an AI trained on that image should be required to give attribution or shouldn't be allowed to use it if modification is not allowed.

[–] [email protected] 3 points 1 year ago

I worked on data collection for an AI project (in a specialized domain, so no text or pictures), and our lawyer advised us on two things: first, we needed explicit approval to use certain data, and second, if someone retracted their approval, we could keep the data as long as we couldn't "trace it back" to that person... I ended up leaving the project.

[–] Protegee9850 2 points 1 year ago* (last edited 1 year ago)

No, such a license does not exist. It is very likely that Stable Diffusion and the like are more akin to fair-use machines than plagiarism machines: transformative enough to rely on a fair-use defense even if you use all-rights-reserved and don't go open at all. The EFF has a good explainer, imo: https://www.eff.org/deeplinks/2023/04/how-we-think-about-copyright-and-ai-art-0 Source: am lawyer with copyright background.

[–] [email protected] 2 points 1 year ago (1 children)

I think one of the problems right now is the lack of a proper legal definition of what AI is doing with your material. A human who learns to create original work by reading yours would not be required to cite it. The question is why, and how exactly, AI is doing something different.

[–] [email protected] 0 points 1 year ago

But if a human straight up copies someone else's writing, that's illegal. AI spits out word for word passages from training data all the time.

[–] TheCakeWasNoLie 1 points 1 year ago* (last edited 1 year ago)

This is an excellent question, seeing as these AIs are mostly trained on publicly available materials. Bryan Lunduke created his own "The Lunduke Content Usage License 1.0", in which he charges violating AI companies a hefty fine.

One way to look at this is that AI training bots could be configured to avoid any content that falls under such a license, fearing the fines. That effect seems like the most likely way to successfully keep content out of AI training, far likelier, in any case, than trying to detect violations after the fact.
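There is already a (weaker, purely voluntary) mechanism along these lines: some crawler operators document user agents that honour robots.txt opt-outs, e.g. OpenAI's GPTBot and Common Crawl's CCBot. A minimal robots.txt blocking both would be:

```text
# Opt out of known AI-training crawlers (compliance is voluntary)
User-agent: GPTBot
Disallow: /

User-agent: CCBot
Disallow: /
```

Unlike a license with fines, this only works against crawlers that choose to check it.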
