this post was submitted on 19 Oct 2024
391 points (99.0% liked)

Technology

[–] Buffalox 5 points 1 month ago (1 children)

Lack of competition results in complacency and stagnation.

This is absolutely true, but it wasn't the case with 64-bit x86. That was a bad miscalculation: Intel wanted a bigger share of the more profitable server market.
Intel was so focused on profit maximization that it wanted to sell Itanium for servers and keep x86 for personal computers.

The result, of course, was that 32-bit x86 couldn't compete once AMD extended it to 64 bits, and Itanium failed even though HP-Compaq killed off the world's fastest CPU at the time, the DEC Alpha, in order to jump to Itanium instead. Frankly, Itanium was an awful CPU, built on an idea they could never get to work properly.

This was not complacency, and it wasn't stagnation in the usual sense either: Intel actually built genuinely new products and tried to innovate. The problem was that the product was bad and too expensive for what it offered.

I don't understand why the Alpha was never brought back. As mentioned, it was AFAIK the world's fastest CPU when it was discontinued.

[–] [email protected] 1 points 1 month ago

> so they wanted to sell Itanium for servers, and keep the x86 for personal computers.

That's still complacency. They assumed consumers would never want to run workloads capable of using more than 4 GiB of address space.

Sure, they'd already implemented Physical Address Extension (PAE), but that only let the OS address more physical memory, by widening page table entries to 36-bit physical addresses (up to 64 GiB of RAM). It didn't increase the 4 GiB virtual address space available to each application.
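A back-of-the-envelope sketch of that gap. The constants here are just the standard PAE figures for 32-bit x86, not anything specific from this thread:

```python
# Illustrative arithmetic: PAE widened *physical* addressing on 32-bit x86,
# but each process still lived in a 32-bit *virtual* address space.
GiB = 1 << 30

virt_bits = 32        # per-process virtual address width (unchanged by PAE)
pae_phys_bits = 36    # physical address width with PAE enabled

per_process_limit = (1 << virt_bits) // GiB      # 4 GiB per application
os_physical_limit = (1 << pae_phys_bits) // GiB  # 64 GiB the OS can manage

print(per_process_limit, os_physical_limit)  # 4 64
```

So the machine could hold 64 GiB of RAM, yet no single application could ever see more than 4 GiB of it.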

The application didn't necessarily need to use 4 GiB of RAM to hit those limits, either. Dynamic libraries, memory-mapped files, thread stacks, and various paging tricks all eat up the available address space without needing to be resident in RAM.
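A minimal sketch of that effect, assuming Linux-style demand paging: an anonymous mapping consumes its full size in virtual address space the moment it's created, but physical pages are only allocated when the memory is actually touched.

```python
import mmap

# Reserve 256 MiB of anonymous address space. Under demand paging, no
# physical pages back this region until it is written to, yet the full
# 256 MiB of virtual address space is consumed immediately.
size = 256 * 1024 * 1024
region = mmap.mmap(-1, size)

print(len(region))  # the whole reservation counts against address space

region.close()
```

Do that sixteen times in a 32-bit process and you're out of address space, regardless of how little RAM you've actually used.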