I don't need to know the background of a piece of art to know it's art. I've seen AI-generated pieces that touch me, and I've seen "real art" that I do not consider art. How can this be if you're right?
The obvious answer is that art isn't defined by who created it or how it was created, but instead it's defined by the interpretation of whoever views it. An artist using generative AI to make something great is no less art than if they used a brush and canvas, and a non-artist doing the same doesn't suddenly make it "not art".
My point was made within the context of the argument that it's okay for copyrighted art to be fed to an AI without the artist's permission because AI learns like people do and is essentially doing what people already do with each other's work.
But AI don't participate in culture and they're not embodied entities. So they don't have the relational capacity to get art, as I understand it. And therefore they don't learn art in the same way people do, because they're not touched by art the way people are.
It's fine for AI to be used to make art. But feeding an AI copyrighted art so the style can be mimicked, automated, and profited from feels a lot more like theft to me than if I went to the art museum and tried to ape a Picasso.
AI don't participate in culture, but the people who make them do, and fair use protects their right to reverse engineer, index, and perform other forms of analysis that create new knowledge about works or bodies of work. These models consist only of original analysis of the training works in comparison with one another, works that were selected by the models' creators based on their own learned experiences and preferences.
These are tools made by humans for humans to use; we are in control of the input and the output. Every time you see generative AI output, it's because someone out there made the decision to share it. Restricting these models means restricting the rights of the people who use and train them. Mega-corporations will have their own models, no matter the price. What we say and do here will only affect our ability to catch up and stay competitive.
I recommend reading this article by Kit Walsh, a senior staff attorney at the EFF, if you haven't already. The EFF is a digital rights group that most recently won a historic case: border guards now need a warrant to search your phone. I'd like to hear your thoughts.
Alright, the article convinced me of the legal argument.
Morally and ethically, I think my main issue is knowing that for-profit corporations will be putting many of my favorite flesh-and-blood artists out of work without any sort of compensation.
Really we just need UBI. I think the issue is less about plagiarism and more about livelihood for most people worried about it.
Capitalism trained us to see anything we do as a way to amass wealth. As more things approach post-scarcity, it's going to drive more people to enforce artificial scarcity to keep prices up, like jewelers do with diamonds.
It's not all downsides. There are plenty of free and open-source generative models that anyone can use. Ordinary people have new ways to express themselves creatively, learn new things, entertain themselves, and improve their lives. We're already connecting with each other in ways we couldn't before, and inspiring one another to get out and start creating.
Here is an alternative Piped link(s): https://piped.video/watch?v=q1TjszE0vDc
Piped is a privacy-respecting open-source alternative frontend to YouTube.
I'm open-source, check me out at GitHub.
I'll read the links.
Personally, I don't have an issue with people copyrighting things they use an AI to make. I'll let you know if my opinion changes on already copyrighted work being used under fair use to make (commercial) AI.