And DIY ML use cases (local LLM, video/audio upscaling, image generation).
Hardware
All things related to technology hardware, with a focus on computing hardware.
Rules:
- Follow the Lemmy.world Rules - https://mastodon.world/about
- Be kind. No bullying, harassment, racism, sexism, etc. against other users.
- No spam, illegal content, or NSFW content.
- Please stay on topic. Adjacent topics (e.g. software) are fine if they are strongly relevant to technology hardware; business news about hardware-focused companies is another example.
- Please try to post original sources when possible (as opposed to summaries).
- If posting an archived version of an article, please include a URL link to the original article in the body of the post.
Some other hardware communities across Lemmy:
- Augmented Reality - [email protected]
- Gaming Laptops - [email protected]
- Laptops - [email protected]
- Linux Hardware - [email protected]
- Mechanical Keyboards - [email protected]
- Microcontrollers - [email protected]
- Monitors - [email protected]
- Raspberry Pi - [email protected]
- Retro Computing - [email protected]
- Single Board Computers - [email protected]
- Virtual Reality - [email protected]
Icon by "icon lauk" under CC BY 3.0
Hope it will be possible to use two cards together for 48 GB of VRAM.
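For a rough sense of why 48 GB of pooled VRAM matters for local LLMs, here is a back-of-the-envelope sketch. The function, the 20% overhead factor, and the example parameter counts are illustrative assumptions, not benchmarks:

```python
def vram_needed_gib(n_params_billion: float, bytes_per_param: float,
                    overhead: float = 1.2) -> float:
    """Crude inference-time VRAM estimate: model weights plus an assumed
    ~20% extra for activations and KV cache."""
    bytes_needed = n_params_billion * 1e9 * bytes_per_param * overhead
    return bytes_needed / 2**30  # convert bytes to GiB

# A 70B-parameter model at 4-bit quantization (~0.5 bytes/param)
# comes in around 39 GiB -- within reach of two pooled 24 GB cards:
print(round(vram_needed_gib(70, 0.5), 1))

# The same model at fp16 (2 bytes/param) is roughly 156 GiB -- far too large:
print(round(vram_needed_gib(70, 2.0), 1))
```

The takeaway is that doubling VRAM shifts which model sizes and quantization levels become feasible at all, which is why multi-card setups are attractive even when per-card performance lags.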
But does it support PyTorch?
You just have to install five different Python versions and then somehow debug each tool until it links against the right one.
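For what it's worth, recent PyTorch builds expose Intel GPUs through an `xpu` backend (older setups used the separate intel-extension-for-pytorch package). A minimal device-selection sketch; the boolean flags here are stand-ins for the real `torch.cuda.is_available()` / `torch.xpu.is_available()` checks:

```python
def pick_device(cuda_available: bool, xpu_available: bool) -> str:
    """Choose a compute device string, assuming the common preference
    order: CUDA first, then Intel's XPU backend, then CPU fallback.
    In real code the two flags come from torch.cuda.is_available()
    and torch.xpu.is_available()."""
    if cuda_available:
        return "cuda"
    if xpu_available:
        return "xpu"
    return "cpu"

# On a machine with only an Intel Arc card:
print(pick_device(cuda_available=False, xpu_available=True))  # xpu
```

In actual PyTorch code you would then pass the chosen string to `torch.device(...)` and move tensors and models there, the same way you would for CUDA.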
If I understand correctly, these cards are less energy efficient than at least Nvidia's. It makes me sad that nobody takes this into account, both environmentally and price-wise.
Less efficient than NVIDIA, but still more efficient than last gen.
They just entered the market; it will take some time to mature. The B580 is definitely a great contender when you can get it for MSRP. In my country in Western Europe I'd pay €329, which is about $344, so the benefit of a sharply priced alternative to a 4060 with 50% more VRAM is moot, as it's actually the same price or more for me.
It is a second gen product though, and it's leaps ahead of first gen. Imagine where the next architecture could be. Meanwhile Nvidia relies heavily on making the chips gigantic to improve performance between generations.
Also, these cards will greatly improve as the drivers mature, while Nvidia's drivers are already mature and carry 25 years of baggage.
I certainly hope they improve and become serious competition. However, we've seen that AMD is kind of stuck behind Nvidia, and Intel is anything but guaranteed to succeed, especially given the downfall of its power-hungry CPUs.
It's their second generation. You cannot possibly expect them to be competitive with the big players who have been established for many years already. For what it's worth, I think Intel is doing pretty decently here already.
Probably less efficient in absolute terms, too. I believe the 5090 will draw 600 watts and the 5080 will draw 400 watts.
You should compare it to a model that's in the same range, not those monsters.