Will Cinnamon merge this with Muffin? (Hopefully, since I know you can use Gamescope to also get HDR)
One day someone will explain to us what HDR is and why we should care.
Currently most monitors use 16bits for colour (65,536 different possible colours).
The human eye can see about 10,000,000.
HDR / True colour is 24bits, 16,777,216 colour variations, which is more than what humans can see.
You should care because it means images on your device will look true to life; especially as screens get brighter, materials like gold will look much nicer.
That's not right. Most monitors use 8 bits per color / 24 bits per pixel, though some are still using 6 bpc / 18bpp.
HDR doesn't mean or really require more than 8bpc; it's more complicated than that. To skip all the complicated details: it means more brightness, more contrast and better colors, and it makes a big difference for OLED displays especially.
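If you want to sanity-check those numbers, it's just powers of two of the total bits per pixel. A quick illustrative sketch, not tied to any particular monitor:

```python
# Colour counts follow directly from bits per channel: 3 channels (R, G, B),
# so total colours = 2 ** (bits_per_channel * 3).
for bpc in (6, 8, 10):
    bpp = bpc * 3
    print(f"{bpc} bpc = {bpp} bpp -> {2 ** bpp:,} colours")

# 6 bpc  = 18 bpp -> 262,144 colours
# 8 bpc  = 24 bpp -> 16,777,216 colours
# 10 bpc = 30 bpp -> 1,073,741,824 colours
```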
That’s incorrect. While HDR content can usually be assumed to support at least 10-bit colour, that isn't strictly required of either the monitor or the content. The main difference is contrast and brightness. SDR is mastered for a brightness of 100 nits and a fairly low contrast. HDR is mastered for brightnesses of usually 1000 or even 2000 nits, since modern displays are brighter and capable of higher contrast, and thus can produce a more lifelike picture through the additional information within HDR.
Of course you need a sufficiently bright and/or contrasty monitor for it to make a difference. An OLED screen or displays with a lot of dimming zones would produce the best results there. But even a 350nit cheap TV can look a bit better in HDR.
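To make the "mastered for 1000 or 2000 nits" part a bit more concrete, here's a small sketch of the SMPTE ST 2084 (PQ) transfer function that HDR10-style content typically uses (assuming PQ here; other HDR formats use different curves). It maps absolute luminance in nits onto a 10-bit code value; the constants are from the standard, and the snippet is only an illustration of the curve, not something tied to any particular monitor or compositor:

```python
# SMPTE ST 2084 (PQ) inverse EOTF: absolute luminance in nits -> normalized
# signal in [0, 1], then quantized to a full-range 10-bit code value.
m1, m2 = 2610 / 16384, 2523 / 4096 * 128
c1, c2, c3 = 3424 / 4096, 2413 / 4096 * 32, 2392 / 4096 * 32

def pq_code(nits: float) -> int:
    y = (nits / 10_000) ** m1              # PQ is defined up to a 10,000-nit ceiling
    v = ((c1 + c2 * y) / (1 + c3 * y)) ** m2
    return round(v * 1023)

for nits in (100, 1000, 2000, 10_000):
    print(f"{nits:>6} nits -> 10-bit code {pq_code(nits)}")

# Roughly: 100 nits lands around half of the code range, 1000 nits around 75%,
# 2000 nits around 83%, and 10,000 nits at the top (1023).
```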
I have a laptop with HDR and back when I was still using Windows I don't think I've ever used it either. It felt like the hardware equivalent to those programs that add screenspace shaders over games lol. Maybe if I played a game or watched a movie that supports HDR I'd change my mind but right now I am clueless. Maybe with the new GNOME
Don't worry. Everyone who needs it uses Windows.
See what I did there?
how does that work if the Wayland color management thing still isn't merged?
About fucking time!
Is HDR useless on OLED?
HDR shines the most on OLED. Pun not intended. 😅
So HDR makes the picture brighter? I thought it made colors more vibrant, that's why I thought it's useless on OLED XD
No, HDR can’t make your monitor brighter than it is. But it can take full advantage of the brightness and contrast of modern displays in a way SDR cannot. In almost every case HDR looks better than SDR, but brighter and/or higher-contrast displays benefit the most.
In a more technical sense, SDR content is mastered with a peak brightness of 100 nits in mind. HDR is mastered for a peak brightness of 1000 nits, sometimes 2000 nits and the resulting improved contrast.
If you don’t watch movies in HDR on a modern TV, you’re not taking full advantage of its capabilities.
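As a rough back-of-the-envelope illustration of that gap: going from a 100-nit SDR master to a 1000- or 2000-nit HDR master buys a few extra stops of highlight headroom (each stop being a doubling of luminance).

```python
import math

# Extra highlight headroom of HDR masters over a 100-nit SDR master,
# expressed in photographic stops (one stop = a doubling of luminance).
sdr_peak = 100  # nits
for hdr_peak in (1000, 2000):
    stops = math.log2(hdr_peak / sdr_peak)
    print(f"{hdr_peak} nits ≈ {stops:.1f} stops above a {sdr_peak}-nit SDR peak")

# 1000 nits ≈ 3.3 stops, 2000 nits ≈ 4.3 stops of extra headroom
```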
It's brightest on qled 😂
OLED practically depends on the HDR colour space for an optimal experience. You wouldn't want to limit yourself to SDR on that type of display.