this post was submitted on 16 Oct 2024
171 points (98.9% liked)

Technology

59896 readers
2748 users here now

This is a most excellent place for technology news and articles.


Our Rules


  1. Follow the lemmy.world rules.
  2. Only tech related content.
  3. Be excellent to each another!
  4. Mod approved content bots can post up to 10 articles per day.
  5. Threads asking for personal tech support may be deleted.
  6. Politics threads may be removed.
  7. No memes allowed as posts, OK to post as comments.
  8. Only approved bots from the list below, to ask if your bot can be added please contact us.
  9. Check for duplicates before posting, duplicates may be removed

Approved Bots


founded 2 years ago
MODERATORS
you are viewing a single comment's thread
view the rest of the comments
[–] Treczoks 22 points 1 month ago (16 children)

This is a sign of ARM approaching the "enough" level. I remember the times when it was actually important to buy the latest PC at least every other year to have enough power to run a basic office suite or similar programs with acceptable speed.

Nowadays, you can staff offices with just about any PC off the shelf - it is powerful enough to fulfill the needs of the majority of users. Of course there are servers, there are power users, engineers running simulations, and of course gamers who need more power, and who still fuel the cutting edge of PC building. But the masses don't need cutting edge anymore. A rather basic machine is enough.

Here comes ARM: For many years, ARM-based chips were used as SoCs, running everything from washing machines to mobile phones. But they have grown bigger and faster, and I can see them approaching the point where they can cover the basic needs of the average office and home user - which would be a damn big chunk of the market. It would be enough for those needs, but it would be cheaper and in many respects less troublesome than Intel and AMD. Take for example power consumption in relation to computational power, where ARM is way better than the old and crusty x86 architecture. And less power means lower cooling requirements, making the machines smaller, more energy efficient, and less noisy.

I can see ARM-based systems approaching this enough level, and I can see that Intel and AMD are deathly afraid of that scenario.

[–] [email protected] 7 points 1 month ago* (last edited 1 month ago) (1 children)

> basic needs of the average office and home user

I mean, ARM chips have been at that level of performance for at least a decade by now. Normal people's most strenuous activity is watching YouTube, which every cellphone since, what, 2005? could do.

> power consumption in relation to computational power

The thing is that's very much not the actual situation for most people.

Only Apple really has high-performance, very low-power ARM chips you can't really outclass.

Qualcomm's stuff is within single-digit percentage points of current-gen AMD and Intel chips in power usage, performance, and battery life alike.

I mean, that's a FANTASTIC achievement for a 1st gen product, but like, it's not nearly as good as it should be.

The problem is that the current tradeoff is that huge amounts of the software you've been using just does not work, and a huge portion of it might NEVER work, because nobody is going to invest time in making it behave.

(Edit: assuming the software you need doesn't work in the emulation layer, of course.) You might get Photoshop, but you won't get that version of CS3 you actually own updated. You might get new games, but you probably won't get that 10 year old one you like playing twice a year. And so on.
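As an illustrative tangent (not from the thread itself): whether a given Windows program would run natively or through the emulation layer on an ARM PC comes down to the "machine" field in its PE header. A rough sketch of reading that field, using the machine constants from Microsoft's PE/COFF specification:

```python
import struct

# Machine values defined in the Microsoft PE/COFF specification.
MACHINE_NAMES = {
    0x014C: "x86 (32-bit)",   # IMAGE_FILE_MACHINE_I386
    0x8664: "x86-64",         # IMAGE_FILE_MACHINE_AMD64
    0x01C4: "ARM (Thumb-2)",  # IMAGE_FILE_MACHINE_ARMNT
    0xAA64: "ARM64",          # IMAGE_FILE_MACHINE_ARM64
}

def pe_architecture(data: bytes) -> str:
    """Return the target architecture of a Windows PE executable.

    On an ARM64 PC, x86/x86-64 binaries go through the emulation
    layer; ARM64 binaries run natively.
    """
    if data[:2] != b"MZ":
        raise ValueError("not a PE file (missing MZ header)")
    # The DOS header stores the PE header offset at 0x3C.
    (pe_offset,) = struct.unpack_from("<I", data, 0x3C)
    if data[pe_offset:pe_offset + 4] != b"PE\x00\x00":
        raise ValueError("invalid PE signature")
    # The COFF machine field follows the 4-byte PE signature.
    (machine,) = struct.unpack_from("<H", data, pe_offset + 4)
    return MACHINE_NAMES.get(machine, f"unknown (0x{machine:04X})")
```

For example, `pe_architecture(open("app.exe", "rb").read())` on a typical desktop app today would report "x86-64", i.e. emulated on ARM.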

The future might be ARM, but only Apple has a real hat in the ring, still.

(Please someone make better ARM chips than Apple, thanks.)

[–] IMALlama 2 points 1 month ago

> Qualcomm's stuff is within single-digit percentage points of the current-gen AMD and Intel chips both in power usage, performance, and battery life

Back in June, the new Snapdragon X processors were a lot more efficient than their x86-based counterparts. I can personally attest to much lower levels of heat generation.

> The problem is that the current tradeoff is that huge amounts of the software you've been using just does not work, and a huge portion of it might NEVER work, because nobody is going to invest time in making it behave.

I agree with the sentiment, but IMO this is a PC and Windows problem. I would also extend this beyond pure compatibility. I say this for a few reasons:

  • I lose about 5% charge/day with my laptop asleep. It does wake up very quickly, but 5%/day feels like a lot. At this point, I don't think Microsoft has a strong incentive to really optimize the kernel for efficiency.
  • Historically, massive variability in hardware across devices has also made it hard to optimize for efficiency, although the current crop of Snapdragon X laptops seems to have less variability.
  • One of the strengths of Windows is that it can run applications written 20+ years ago fairly reliably. There's a ton of software still floating around that hasn't been actively supported in years. I don't see all of these software companies wanting to port their code over, either without guarantees that the market will adopt ARM (the Apple approach) or until they see the ARM adoption rate go up (the current Windows approach).

All that said, I've had zero issues with emulation so far. I never personally used an M1 Max when it launched, but going by reports from that era, the current Windows experience is at least as good.
