Most of those laws are unenforceable, and some are even undetectable.
Your ideology is getting in the way of objective fact.
Are you kidding? #3 is the second most feasible of that set; it's just a matter of setting up reproducible / deterministic builds.
If you can't replicate a result while controlling the software version, the exact input, and the randomness seed, then "something else is going on."
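A minimal sketch of that reproducibility point, assuming a Hugging Face-style stack; the model name, prompt, and sampling settings here are placeholders for illustration, not anything specified in this thread:

```python
# Pin the software version, the input, and the RNG seed, and the output
# should be repeatable. A mismatch means "something else is going on"
# (nondeterministic GPU kernels, different hardware, a changed model, etc.).
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

MODEL = "gpt2"                  # assumed placeholder model
SEED = 1234                     # the "randomness seed" under discussion
PROMPT = "The quick brown fox"  # the fixed input

tok = AutoTokenizer.from_pretrained(MODEL)
model = AutoModelForCausalLM.from_pretrained(MODEL)
model.eval()

def generate(seed: int) -> str:
    torch.manual_seed(seed)  # control the randomness seed
    inputs = tok(PROMPT, return_tensors="pt")
    out = model.generate(**inputs, do_sample=True, max_new_tokens=20)
    return tok.decode(out[0], skip_special_tokens=True)

# Same version + same input + same seed -> same output.
print(generate(SEED) == generate(SEED))
```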
The only way to make a cleartext LLM is to dedicate most of the hard-drive storage humanity produces over the next ten years to it, and we'd need about a quarter of the processing power of Bitcoin mining to run it at ChatGPT speeds.
That said, black-box self-modifying AIs will be the models that win the usefulness wars, and if one country outlaws them, the only result is that it will have no defense against countries that don't feel the need to comply.
You really don't understand how LLM data blobs are created, do you? Or how ridiculously compressed they are?
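A rough back-of-envelope sketch of that compression point; every number below is an assumed round figure for illustration, not data from this thread:

```python
# Compare the size of a typical training corpus with the size of the weights
# it gets distilled into, to show how aggressively an LLM "compresses" its data.
TRAINING_TOKENS = 10e12   # assume ~10 trillion training tokens
BYTES_PER_TOKEN = 4       # assume ~4 bytes of raw text per token
PARAMETERS = 70e9         # assume a 70B-parameter model
BYTES_PER_PARAM = 2       # 16-bit weights

corpus_bytes = TRAINING_TOKENS * BYTES_PER_TOKEN
weight_bytes = PARAMETERS * BYTES_PER_PARAM

print(f"corpus:  ~{corpus_bytes / 1e12:.0f} TB of text")
print(f"weights: ~{weight_bytes / 1e9:.0f} GB of parameters")
print(f"ratio:   ~{corpus_bytes / weight_bytes:.0f}x smaller than the training text")
```

Under those assumptions the weights come out hundreds of times smaller than the text they were trained on, which is why you can't just "read back" the training data from the blob.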