this post was submitted on 01 Sep 2023
122 points (85.5% liked)
Linux
Scaling my third monitor doesn't require a meaningful or measurable amount of extra CPU/GPU. In practical terms, actual usage of real applications so dwarfs any overhead that it is immeasurable statistical noise. In every case, nearly all of the CPU power goes to the multitude of applications, not to drawing more pixels.
The concern about battery life is probably equally pointless. People normally worry about scaling multiple monitors in places where they have another exciting innovation available: the power cord. If you are kicking it with portable monitors at the coffee shop, you are infinitely more worried about powering the actual display than about the GPU power required to scale it. Also, some of us have actual desktops.
There just aren't. It's not blurry. There aren't artifacts. It doesn't take a meaningful amount of resources. I set literally one environment variable and it works without issue. For you to feel justified, you absolutely NEED this to be a hacky, broken configuration with disadvantages. It isn't; it's a perfectly trivial configuration, and Wayland basically offers nothing over it except running in place to get back to the same spot. You complain about the need to set an env var, but switching to Wayland would be a substantial amount of effort, and you can't articulate one actual benefit, just fictional deficits I can refute by turning my head slightly.
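For context, this is roughly what that kind of X11 setup looks like. The comment doesn't name the exact variable, so the toolkit variables, output names, and scale factors below are illustrative, not the commenter's actual configuration:

```shell
# Illustrative X11 fractional-scaling setup. Output names (DP-1) and
# the 1.5 factor are examples only; check `xrandr --query` for yours.

# Per-toolkit fractional scaling via environment variables
# (GTK and Qt each read their own):
export GDK_DPI_SCALE=1.5        # GTK: scale fonts/UI fractionally
export QT_SCALE_FACTOR=1.5      # Qt: scale the whole UI fractionally

# Or scale a single output server-side with xrandr:
xrandr --output DP-1 --scale 1.5x1.5
```

Note that `xrandr --scale` transforms the output server-side, while the environment variables tell the toolkits to render at a different size; mixed-DPI setups on X11 typically combine the two.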
Your responses make me think you aren't actually listening. For instance:
Please pay closer attention. Scaling and HiDPI were a thing on X back when Wayland didn't work at all. xrandr supported --scale back in 2001, and HiDPI support arrived in 2012. Wayland development started in 2008, and in 2018 it was still an unusable, buggy pile of shit. Those of us who aren't in junior high school needed things like HiDPI and scaling back when Wayland wasn't remotely usable, and now that it is starting to get semi-usable, I for one see nothing but hassle.
I don't have a bunch of screen tearing, I don't have bad battery life, I have working HiDPI, I have mixed DPI, and I don't have a blurry mess. These aren't actual disadvantages; this is just you failing to notice features that already exist.
Imagine if, at the advent of automatic transmissions, you had 500 assholes on car forums claiming that manual-transmission cars can't drive over 50 MPH / 80 KPH and break down constantly, instead of touting actual advantages. It's obnoxious to those of us who discovered Linux 20 years ago rather than last week.
Let me summarize this with your own statement, because you clearly just disregarded everything I said:
Yeah, you are now just outright ignoring people's opinions. Two hours of battery life: statistical noise, pointless. Laptops: who neeeeeeeeds those, we have desktops! Lack of fractional scaling, which people literally listed as a "disadvantage" of Wayland before it got the protocol: yeah, I guess X11 is magic and somehow things are not blurry on X11, which has the same problem when XRandR is used.
Do I need to quote more?
Also, regarding this:
Maybe you should take note of when Wayland development actually started picking up. 2008 was when the idea came up; 2012 was when the concrete foundation started being laid.
Not to mention that it was 2021 when Fedora and Ubuntu made it the default. Your experience in 2018 is not representative of the Wayland ecosystem in 2021 at all, never mind that it's now 2023. The three years from 2018 to 2021 saw various applications either implementing their first Wayland support or maturing it. Maybe you should try again before asserting a bunch of outdated opinions.
Wayland was effectively a rebuild of the Linux graphics stack from the ground up. (No, it wasn't a rebuild for its own sake. The rebuilding actually started inside X.org, but people were severely burned out by the end; hence Wayland. X.org still contains an atomic KMS implementation, it's just disabled by default.)
Four years of design and eight years of implementation across the entire ecosystem is impressive, not obnoxious.
Something makes me think that you weren't actually using it 20 years ago.
Maybe it's just my memory of the modelines failing me. Hmmm... did I just hallucinate the XFree86 server taking down my system?
Oh noes, I am getting old. Damn.
You ably demonstrate your own inability to listen. The monitor on my right-hand side, right here as I type this, isn't blurry. No amount of proving that it MUST be blurry is more persuasive than the fact that, as I type this, I'm looking at it.
Furthermore, I didn't say that the existence of desktops obviated the need to worry about the impact of resolution/scaling on battery life. I said that the impact on battery life is both minimal and meaningless, because mixed-DPI concerns by definition apply only to desktops and to laptops plugged into external monitors, at which point your computer is logically also plugged into power. In fact, the overwhelmingly common configuration for those who use external monitors is a dock, which delivers both connectivity to peripherals and power. If you are using a desktop or a plugged-in laptop, the benefit of scaling more efficiently is zero.
I started using Linux with the very first release of Fedora, then called Fedora "Core" 1. I'm not sure how you hallucinated that Wayland got 4 years of design and 8 years of implementation. First off, by the end of the month it will be 15 years old, so you fail at the most basic of math. Next, I'm guessing you want to pretend it got four years of design to make the second number look less egregious.
With graphics programming relatively in its infancy, X11 didn't require 15 years to become usable, and how many years did Apple take to produce their stack? Was it even one? Working with incredibly powerful hardware, with a wide variety of well-understood and documented approaches, 15 years is downright embarrassing. Much as I enjoy Linux, the ecosystem is kind of a joke.
Or was it you?
2012-2021, or to clarify, "late 2012 to early-mid 2021", seems to be 8-point-something years to me. I dunno, did mathematics change recently or something?
I hope you understand that graphics weren't as complicated back then. Compositing of windows was not an idea (at least, not a widespread one) in the 90s. Nor was sandboxing. Or multi-display (we hacked it onto X11 later through XRandR). Or HDR nowadays. Or HiDPI. Or touch input and gestures. We software-rendered everything too, so DRI and friends hadn't been thought of.
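That "hacked on later" multi-display support is the XRandR extension, driven from the `xrandr` CLI. A typical invocation looks something like this; the output names here are illustrative and vary per machine:

```shell
# List the connected outputs and their modes first:
xrandr --query

# Place an external monitor to the right of the laptop panel.
# eDP-1 and HDMI-1 are example names; substitute what --query reports.
xrandr --output eDP-1 --auto \
       --output HDMI-1 --auto --right-of eDP-1
```

This per-output, bolt-on model is exactly what the poster means by "hacked onto X11": the core protocol predates multi-display, and XRandR retrofits it.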
In a way... you are actually insulting the kernel developers.
It's not my fault if their work is of poor quality. Here is how people actually experience Wayland out of the box.
https://www.reddit.com/r/Fedora/comments/xdvy7z/multimonitor_scaling_in_wayland_is_totally_broken/ Oh, I know it's now 11 months old, and someone even suggested a magic incantation you can insert into the Linux version of the Windows Registry that might fix some of the problems, but this was 2022. In 2015, Wayland proponents were already promoting it as ready for prime time.
Lmao, so we're just gonna ignore the people complaining about X11...