I'm gonna disagree - it's not like DeepSeek uncovered some upper limit to how much compute you can throw at the problem. More efficient hardware use should be amazing for AI, since it lets you scale even further (rough numbers after this comment).
This means that MS isn't expecting these data centers to generate enough revenue to be profitable, and they're not willing to bet on further advancements that might make them profitable. In other words, MS doesn't have a positive outlook for AI.
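To make that concrete, here's a minimal back-of-envelope sketch in Python. Every number in it is invented for illustration; the only point is that at a fixed budget, an efficiency gain multiplies the compute you can buy.

```python
# Back-of-envelope sketch of the efficiency argument above.
# All numbers are hypothetical; only the shape of the math matters.

budget_usd = 1_000_000          # made-up annual compute budget
cost_per_pflop_hour = 2.0       # made-up cost before the efficiency gain
efficiency_gain = 10            # suppose DeepSeek-style tricks cut cost 10x

compute_before = budget_usd / cost_per_pflop_hour
compute_after = budget_usd / (cost_per_pflop_hour / efficiency_gain)

print(f"PFLOP-hours before: {compute_before:,.0f}")   # 500,000
print(f"PFLOP-hours after:  {compute_after:,.0f}")    # 5,000,000
```

That's the Jevons paradox argument: efficiency gains tend to increase total consumption of a resource, not reduce it.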
Exactly. If AI were scaling like the people at OpenAI hoped, they would be celebrating like crazy, because their scaling goal was literally infinity. Seriously, the plan OpenAI had a year ago was to scale their AI compute into the biggest energy consumer in the world, with multiple dedicated nuclear power plants just for their data centers. So if they aren't grabbing onto any and every opportunity for more energy, they have lost faith in that original plan.
If you can achieve scaling with software, you can delay current plans for expensive hardware. If a new driver came out that gave Nvidia 5090 performance to games on GTX 1080-equivalent hardware, would you still buy a new video card this year?
When all the telcos scaled back on building fiber in 2000, was that because they didn't have a positive outlook for the Internet?
Or when video game companies went bankrupt in the 1980s, was it because video games were over as entertainment?
There's a huge leap between not spending billions on new data centers (which are used for more than just AI) and claiming that this means AI is over.
It doesn't make sense to compare games and AI. Games have a well-defined upper bound for performance; even Crysis has "maximum settings" that you can't go above. Supposedly, this doesn't hold true for AI: scaling it should continually improve it (rough sketch below).
So: yes, in your analogy, MS would still buy a new video card this year if they believed that progress was possible and reasonably likely.
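For what "no maximum settings" looks like, here's a hedged sketch of a power-law scaling curve, loosely shaped like published LLM scaling laws. The constants (`irreducible`, `a`, `b`) are made up for illustration, not fitted to any real model:

```python
# Illustrative power-law scaling curve: loss(C) = irreducible + a * C**(-b).
# Constants are invented; only the qualitative shape is the point.

irreducible = 1.7   # hypothetical loss floor no amount of compute beats
a, b = 50.0, 0.05   # made-up constants controlling the curve

def loss(compute: float) -> float:
    """Predicted loss for a given compute budget (arbitrary units)."""
    return irreducible + a * compute ** (-b)

for c in (1e20, 1e22, 1e24, 1e26):
    print(f"compute={c:.0e}  loss={loss(c):.3f}")
```

Each extra 100x of compute still buys a measurable improvement; the curve flattens but never hits a wall, which is the structural difference from a game's fixed graphics ceiling.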
If buying a new video card made me money, yes.
This doesn't really work, because the goal when you buy a video card isn't to have the most processing power possible, and playing video games doesn't scale linearly, so having an additional card doesn't add anything.
If I were mining crypto or selling GPU compute (which is basically what AI companies are doing), and the existing cards got an update that made them perform on par with new cards, I would buy out the existing cards, and once there were no more, I would buy up the newer cards; both would still be generating revenue.
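As a rough sketch of that payback logic (all prices and revenue figures are hypothetical placeholders):

```python
# Payback comparison for the buy-the-old-cards argument.
# Prices and revenue below are placeholders, not real market figures.

revenue_per_card_month = 300.0   # what one card earns renting out compute

old_card_price = 400.0           # used GTX 1080-class card, post-update
new_card_price = 2000.0          # current-generation card

# After the hypothetical driver update, both earn the same revenue,
# so payback time is just price divided by monthly revenue.
print(f"old card pays back in {old_card_price / revenue_per_card_month:.1f} months")
print(f"new card pays back in {new_card_price / revenue_per_card_month:.1f} months")
```

Both are profitable, which is why you'd buy out the cheap cards first and then keep buying new ones anyway.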
But this rests on the supposition that not buying a video card makes you the same money. You're forecasting free performance upgrades, so there's no need to spend money now when you can wait and upgrade the hardware once the software improvements stop (rough sketch at the end of this comment).
And that's assuming it has anything to do with AI at all, rather than the long-term macroeconomics of Trump destroying the economy: MS is putting off spending because businesses will be slowing down under the tariff war.
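Here's that wait-and-upgrade logic as a sketch. The 2x-per-year "free" software speedup is an invented assumption, not a forecast:

```python
# Sketch of the wait-and-see logic: if free software gains keep landing,
# money spent later buys more effective performance per dollar.

hardware_cost = 1_000_000.0      # hypothetical spend on a tranche of GPUs
software_speedup_per_year = 2.0  # assumed free software gain per year

for year in range(4):
    # Effective performance per dollar if you buy in this year and
    # inherit all software gains accumulated so far.
    effective_per_dollar = software_speedup_per_year ** year / hardware_cost
    print(f"buy in year {year}: {effective_per_dollar:.2e} perf units per $")
```

Under that assumption, deferring the spend looks rational right up until the software improvements stop, which is exactly the bet being described.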