this post was submitted on 15 Jun 2023
17 points (90.5% liked)
Open Source
you are viewing a single comment's thread
This is an excellent question, seeing as these AIs are mostly trained on publicly available material. Bryan Lunduke created his own Lunduke Content Usage License 1.0, which levies a hefty fine on any AI company that violates it.
One way to look at this is that AI training crawlers could be configured to avoid any content that falls under such a license, for fear of those fines. Deterrence of this kind seems the most likely way to successfully protect content from AI training; certainly more likely than trying to uncover violations after the fact.
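To illustrate, here is a minimal Python sketch of how such filtering might work in a training-data pipeline: scan each document for known restrictive-license markers and drop matches before training. The function name and matching logic are hypothetical assumptions for illustration, not how any real crawler is known to work.

```python
# Hypothetical license filter for a training-data pipeline.
# Documents mentioning a blocked license are excluded from the
# training set. Marker matching is a simplifying assumption; a
# real crawler would need something far more robust.

BLOCKED_LICENSE_MARKERS = [
    "the lunduke content usage license",
    # other "no AI training" licenses could be listed here
]

def is_trainable(document_text: str) -> bool:
    """Return False if the document declares a blocked license."""
    lowered = document_text.lower()
    return not any(marker in lowered for marker in BLOCKED_LICENSE_MARKERS)

docs = [
    "Some blog post with no license statement.",
    "This work is covered by The Lunduke Content Usage License 1.0.",
]
training_set = [d for d in docs if is_trainable(d)]
print(len(training_set))  # the licensed document is filtered out
```

In this sketch the economic incentive does the real work: the filter only has value to the crawler operator if the fine makes training on the content costlier than skipping it.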