this post was submitted on 24 Jan 2024
143 points (93.3% liked)
Linux
If you avoid Nvidia, it has been ready for many years. And to be honest, I'm not sure X11 was ever really stable with Nvidia either. My main issue is that X doesn't have multi-DPI support... and for that I really cannot blame Wayland. Also, Skype doesn't have screen sharing. Well, they actually had it for a while, but then removed it... still, that's hard to blame on Wayland.
But as a general rule, if you have Nvidia, then you are not allowed to complain about anything... that was your choice, and with Nvidia under Linux, all bets are off. I thought that was clear a long time ago, especially after Linus's not-so-subtle outburst.
Machine learning pays my bills, and I never had a choice of graphics card brand. To be sure, I wanted an AMD card for the open source drivers, but CUDA remains essential to me. AMD's ROCm support is a joke and isn't anywhere close to an alternative. Researchers release code that only runs on CUDA for a good reason. To say that I don't get to complain is going too far.
Exactly. You'd think that with the two things they're really competitive on being raw FLOPS and memory, they'd be a viable option for ML and scientific compute, but they're such a pain to work with that they're pretty much irrelevant.
You get to complain to Nvidia, not Linux developers and maintainers.
That's true, but that didn't make it fair to be a Wayland detractor either.
Nvidia needed to do the work to make that combination viable, and their delay in doing so wasn't anyone's fault but Nvidia's.