this post was submitted on 10 Jun 2024
960 points (98.3% liked)
Programmer Humor
It really isn't. Modern mobile CPUs barely sip power.
PL2 on a 14900T is 106W
Edit: I'm an idiot, the T series is low-power socketed, not mobile. The 14900HX has a TDP of 55W but boosts short-term to 157W, which is still pretty ridiculous
My 11950H (like all other “full power” Intel mobile CPUs) has a PL1 of over 100 watts (109 for mine) and a PL2 of 139 watts. This laptop is about an inch thick.
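On Linux you can check these limits yourself through the `intel-rapl` powercap interface in sysfs (a minimal sketch; the path and constraint numbering below follow the standard powercap layout, but the node may be absent or need root on some systems):

```python
from pathlib import Path

# Standard intel-rapl powercap node for the CPU package (assumed path;
# may differ or be missing on some systems).
RAPL = Path("/sys/class/powercap/intel-rapl/intel-rapl:0")

def uw_to_w(raw: str) -> float:
    """sysfs reports power limits in microwatts; convert to watts."""
    return int(raw.strip()) / 1_000_000

if RAPL.exists():
    # constraint_0 is the long-term limit (PL1),
    # constraint_1 the short-term limit (PL2)
    pl1 = uw_to_w((RAPL / "constraint_0_power_limit_uw").read_text())
    pl2 = uw_to_w((RAPL / "constraint_1_power_limit_uw").read_text())
    print(f"PL1: {pl1} W, PL2: {pl2} W")
```

On my machine that prints the same 109/139 split, so the numbers above aren't marketing, they're what the firmware actually programs.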
Nothing about this laptop sips power; I've gotten as little as 30 minutes of battery life out of a 90 watt-hour battery without even playing games.
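For scale, that works out to a brutal average system draw (quick arithmetic from the numbers in the comment above, assuming the pack drains from full to empty):

```python
battery_wh = 90   # battery capacity in watt-hours
runtime_h = 0.5   # 30 minutes of observed runtime

# average system power draw implied by draining the pack in that time
avg_draw_w = battery_wh / runtime_h
print(avg_draw_w)  # → 180.0 watts
```

That 180 W is the whole system (CPU, GPU, screen), not the CPU alone, but it's firmly in desktop territory.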
If you meant cell phones and tablets, that's mostly down to the different architecture. RISC processors are very energy efficient, which also keeps them much cooler to run.
x86-64 is a CISC architecture, which tends to be much more power hungry. Only a couple of very low-power Celeron CPUs run under a 10W TDP, while that's very common among phone CPUs.
In many cases it's actually RISC-like under the hood: a hardware decoder translates the CISC instructions into simpler micro-ops and runs those in the most optimal manner on the silicon
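A toy sketch of the idea (purely illustrative — these are not real x86 encodings or Intel's actual micro-ops):

```python
# Toy front-end decoder: complex "CISC"-style instructions get split into
# simpler RISC-like micro-ops; simple instructions map nearly 1:1.
DECODE_TABLE = {
    # a memory-operand add becomes a load plus a register-register add
    "add eax, [mem]": ["uop_load tmp, [mem]", "uop_add eax, tmp"],
    # a push becomes a stack-pointer adjust plus a store
    "push eax": ["uop_sub esp, 4", "uop_store [esp], eax"],
}

def decode(instr: str) -> list[str]:
    # simple instructions pass through as a single micro-op
    return DECODE_TABLE.get(instr, [f"uop_{instr}"])

print(decode("push eax"))  # → ['uop_sub esp, 4', 'uop_store [esp], eax']
```

The execution core only ever sees the simple micro-ops, which is why "x86 is CISC" stopped being a clean power argument a long time ago.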
ARM and RISC-V absolutely scale up to multi-hundred-watt server CPUs quite easily. Just look at the Ampere systems you can rent from various VPS providers, for example
The big benefit ARM and RISC-V have is no established backwards compatibility carrying technical debt forward. ARM versions its instruction sets and software has to be released for given versions of ARM cores, while RISC-V is simply too new to have accumulated significant technical debt on the instruction-set side.
Atom cores were notable for optimizing the architecture around a subset of instructions, while other instructions were a slog to execute, so they were really good at certain things but painful for desktop use (especially in the extremely budget machines they got shoved into). Much like E-cores now: they're very carefully architected for power efficiency and do their job extremely well, but an all-E-core CPU is a slog for desktop use in many cases