Sounds like some sensationalized bullshit. They don't give a single number or meaningful statement, and the article is paywalled.
I don't disagree that they should back up their claim, but it does intuitively make sense. AI models - GPT-style LLMs in particular - are typically designed to push the limits of what modern hardware can provide, essentially eating whatever power you can throw at them.
Pair this with a huge AI boom and corporate hype cycle, and it wouldn't surprise me if it was consuming an incredible amount of power. It's reminiscent of Bitcoin, from a resource perspective.
No, it makes no sense. India has over a billion people. There's no way that amount of computing power could just magically have poofed into existence over the past few years, nor the power plants necessary to run all of that.
This is a future prediction, not a current observation.
I'm not saying it's correct as a prediction, but "where are the extra power plants" is not a good counter-argument.
A couple of months ago the average temperature where I live was well below freezing. Now it's around twenty degrees C.
By this time next year it'll be thousands of degrees!
The current LLMs kinda suck, but companies have fired huge swaths of their staff and plan on putting LLMs in their place. Either those companies hire back all those workers, or they get the programs to not suck. And making LLMs actually capable of working unsupervised will take more and more energy.
My take is that LLMs are absolutely incredible... for personal use and hobby projects. I can't think of a single task I would trust an LLM to perform entirely unsupervised in a business context.
Of course, that's just where LLMs are today. They'll improve.
LLMs will probably improve at an exponential rate - similar to how CPUs did in the 80s/90s. In ~10 generations, LLMs will likely be very useful.
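If capability really did compound each generation the way transistor counts once did, the arithmetic is easy to sketch; the growth factor here is purely an assumed number for illustration, not a measured trend:

```python
# Illustrative only: assume capability doubles each generation (assumed rate, not data).
GROWTH_FACTOR_PER_GENERATION = 2
GENERATIONS = 10

improvement = GROWTH_FACTOR_PER_GENERATION ** GENERATIONS
print(f"After {GENERATIONS} generations: ~{improvement}x")  # ~1024x under this assumption
```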
Sure, but it's simply not physically possible for AI to be consuming that much power. Not enough computers exist, and there's no way to manufacture new ones fast enough. There hasn't been a giant surge of new power plants built in just the past few years, so if something were suddenly drawing an India's worth of power, then somewhere an India's worth of consumers just went dark.
This just isn't plausible.
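A rough back-of-envelope sketch puts the scale in perspective; the India consumption figure and per-accelerator power draw below are ballpark assumptions, not sourced numbers:

```python
# Back-of-envelope check of the "an India's worth of power" claim.
# All figures are rough assumptions for illustration, not measured data.

INDIA_ANNUAL_TWH = 1600        # assumed: India's yearly electricity use, roughly 1,500-1,700 TWh
HOURS_PER_YEAR = 8760
WATTS_PER_ACCELERATOR = 1500   # assumed: one H100-class GPU plus server and cooling overhead

avg_power_watts = INDIA_ANNUAL_TWH * 1e12 / HOURS_PER_YEAR   # TWh -> Wh, divided by hours = average watts
accelerators_needed = avg_power_watts / WATTS_PER_ACCELERATOR

print(f"Average draw: ~{avg_power_watts / 1e9:.0f} GW")
print(f"Accelerators running 24/7 to match it: ~{accelerators_needed / 1e6:.0f} million")
```

Under those assumptions you'd need well over a hundred million accelerators running flat out around the clock, which is the point: that hardware simply hasn't been built.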
If only there had been another widespread, wasteful prior use of expensive and power hungry compute equipment that suddenly became less valuable/effective and could quickly be repurposed to run LLMs...
Pretty sure the big AI corps aren't depending on obsolete second-hand half-burned-out Ethereum mining rigs for their AI training.