this post was submitted on 08 Dec 2024
457 points (94.5% liked)
Technology
Imagine if computer engineers had been just as short-sighted, and had stopped prioritizing development back when computers were massive, room-sized machines with limited computing power and obscene inefficiency.
Not all AI development is focused on increasing complexity. Much of it is focused on refinement and increasing efficiency, and there's been a ton of progress in this area.
This article and discussion are specifically about massively upscaling LLMs. Go follow the links and read OpenAI's CEO literally proposing data centers that would require multiple dedicated grid-scale nuclear reactors.
I'm not sure what your definition of optimization and efficiency is, but that sure as heck does not fit mine.
Sounds like you're only reading a certain narrative, then. There are plenty of articles about increasing efficiency, too.