this post was submitted on 16 Feb 2024
Technology
This will be an unpopular opinion here.
I'm not against AI, but the rules have to be set in laws and regulations. First, AI can't use copyrighted material without paying for it. Nor can it use material without asking each rights holder individually.
The second point is that AI can't create copyrighted material. Whatever an AI creates is free of copyright, and everyone can use it.
Third, an AI can't be a black box. It has to be comprehensible how it works and what the AI is doing. A solution would be to have source-available code.
Fourth, AI can't violate laws, or create and spread misinformation or material intended to misinform.
And, of course, anything created using AI has to be identified as such.
The money is in what the AI can do, the quality of the result, and the quality of the code. None of the rest is where the value lies.
Your third point is an active research topic; we can't explain exactly what generative (and other) models do beyond their generic operation.
If we could explain it, it would just be another rules engine 😅
Most of those laws are unenforceable, and some violations are even undetectable.
Your ideology is getting in the way of objective fact.
Are you kidding? #3 is the second most feasible one of that set; it's just a matter of setting up Reproducible / Deterministic Builds.
If you can't replicate a result given control of the software version + the exact inputs + the randomness seed, then "something else is going on".
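The replication idea above can be sketched in a few lines. This is a toy stand-in, not a real model: the hypothetical `generate` function just draws from a seeded RNG, but it illustrates the claim that fixing the code version, the inputs, and the seed should fix the output.

```python
import random

def generate(seed: int, steps: int = 5) -> list[float]:
    """Toy stand-in for a model's sampling loop.

    With the same code version and the same seed, the output
    sequence is identical on every run.
    """
    rng = random.Random(seed)  # isolated RNG, deterministically seeded
    return [rng.random() for _ in range(steps)]

# Same seed -> same "generation". If a rerun with identical
# code + inputs + seed ever differed, "something else is going on".
assert generate(42) == generate(42)
assert generate(42) != generate(43)
```

In a real pipeline, the analogous knobs are the model weights hash, the prompt, and the sampler seed; GPU nondeterminism can still break bit-exact replay unless deterministic kernels are forced, which is part of why this is harder in practice than in this sketch.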
The only way to make a clear-text LLM would be to dedicate most of the hard storage humanity produces over the next ten years to it, and we'd need about 1/4 the processing power of Bitcoin mining to have it run at ChatGPT speeds.
That said, black-box self-modifying AIs will be the models that win the usefulness wars, and if one country outlaws them, the only result is that it will have no defense against countries that don't feel the need to comply.
You really don't understand how LLM data blobs are created, do you? Nor do you understand how ridiculously compressed it is?
I imagine that if AI devs didn't sneak around copying people's works in bulk but instead asked for permission or paid for a license, artists wouldn't hate it like they do now.
My gut feeling says that's not entirely true. Generative AI has so many qualities that could make it offensive to so many people, I think we were going to see a pushback from artists regardless. The devs' shitty training practices just happened to give the artists a particularly strong case for grievances.
Yeah, artists were fine with publishing companies doing this since the dawn of literacy, but this time it is completely different.
I'm fine with this as long as the "pay & ask" has an exception for non-commercial, open source projects. Otherwise it would mean that only corpos can create models, and everyone else is SOL and thoroughly fucked, because the corpos will pay a license fee to the platforms, and the platforms will just add a new TOS clause saying that by using the platform you consent and waive your rights to compensation.