this post was submitted on 17 Nov 2024
35 points (100.0% liked)

top 7 comments
[–] theunknownmuncher 17 points 3 months ago

From house fires to datacenter fires 🥲 they grow up so fast

[–] over_clox 10 points 3 months ago (1 children)

Nvidia? Overheating?

Some things never change... 🔥

[–] [email protected] 2 points 3 months ago

Nvidia ~~Thermi~~ Fermi, never forget

[–] [email protected] 6 points 3 months ago

> 120 kW per rack

I knew GPU compute took a lot of energy, but I didn't realize it was 120 kW per rack. That is a stupid amount.
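For scale, a rough back-of-envelope sketch (the ~10 kW conventional-rack figure and the ~10,500 kWh/year average US household figure are ballpark assumptions for illustration, not numbers from the article):

```python
# Back-of-envelope scale check for a 120 kW GPU rack.
# The 10 kW conventional-rack and 10,500 kWh/year household figures
# are rough assumptions for illustration, not values from the article.

RACK_POWER_KW = 120               # reported GPU rack power
CONVENTIONAL_RACK_KW = 10         # typical air-cooled datacenter rack (assumed)
HOUSEHOLD_KWH_PER_YEAR = 10_500   # rough average US household usage (assumed)

HOURS_PER_YEAR = 24 * 365

annual_kwh = RACK_POWER_KW * HOURS_PER_YEAR            # energy if run flat out all year
equivalent_racks = RACK_POWER_KW / CONVENTIONAL_RACK_KW
equivalent_households = annual_kwh / HOUSEHOLD_KWH_PER_YEAR

print(f"One rack ≈ {equivalent_racks:.0f}x a conventional rack")
print(f"Run continuously: ~{annual_kwh:,.0f} kWh/year ≈ {equivalent_households:.0f} households")
```

Even as a rough sketch, that puts a single rack around 12x a conventional rack and on the order of a hundred homes' worth of annual electricity.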

[–] [email protected] 4 points 3 months ago
[–] UnfortunateShort 3 points 3 months ago (1 children)

Oof, combined with demand and their strong MI300 series, AMD might finally gain some meaningful market share in data centers.

[–] Acters 2 points 3 months ago

Really hope AMD can compete on power efficiency, just like how they are crushing Intel with the very power-efficient X3D chips for gaming.