this post was submitted on 15 Dec 2024
186 points (95.1% liked)
Technology
On HDMI 2.1, you can do 8K at 30 fps before you have to compress the stream with DSC, which is only "visually lossless", i.e. actually lossy. We don't even have 5K at 120 fps or 4K at 240 fps without compression, and those are common refresh rates for gaming. So you could say the highest resolution that supports all use cases without compromise is 1440p, and that's definitely not enough even by today's standards. I think you're underestimating the time it takes for a standard to reach widespread adoption. The average viewer won't have access to it until the technology is cheap and enough time has passed for at least a few hundred million units to reach the market. If you expect to be using it in 2030 or later for your "average broadcast", it needs to be designed today.
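To see why those modes blow past HDMI 2.1, here's a back-of-the-envelope sketch of the raw pixel bandwidth each one needs. This is deliberately simplified: it assumes 8-bit RGB (24 bits per pixel) and ignores blanking intervals, which add real overhead on the wire, so the true requirements are somewhat higher than these numbers.

```python
# Rough uncompressed video bandwidth vs. HDMI 2.1 capacity.
# Simplifications (stated assumptions, not exact HDMI timing math):
#   - 8-bit RGB, i.e. 24 bits per pixel
#   - blanking intervals ignored (real links also transmit blanking,
#     so actual requirements are higher than computed here)

def uncompressed_gbps(width, height, fps, bits_per_pixel=24):
    """Raw pixel data rate in Gbit/s."""
    return width * height * fps * bits_per_pixel / 1e9

# HDMI 2.1 FRL signals at 48 Gbit/s, but its 16b/18b line coding
# leaves roughly 48 * 16/18 ~= 42.7 Gbit/s of usable payload.
HDMI_2_1_PAYLOAD_GBPS = 48 * 16 / 18

modes = [
    ("8K30",     7680, 4320,  30),
    ("5K120",    5120, 2880, 120),
    ("4K240",    3840, 2160, 240),
    ("1440p240", 2560, 1440, 240),
]

for name, w, h, fps in modes:
    need = uncompressed_gbps(w, h, fps)
    verdict = "fits" if need < HDMI_2_1_PAYLOAD_GBPS else "needs DSC"
    print(f"{name:9s} ~{need:5.1f} Gbit/s  ({verdict})")
```

8K30 comes out around 24 Gbit/s and fits; 4K240 needs about 48 Gbit/s of pixel data alone, over the payload limit. 5K120 lands right at ~42.5 Gbit/s, just under the naive limit, but once blanking overhead is added it no longer fits uncompressed, which is why it needs DSC in practice.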
Of course HDMI is shit for the reasons you mention. DisplayPort is better, but it's not an open standard either, and it supports DRM as well. Still, designing for higher bandwidth is necessary, and it has nothing to do with HDMI's problems.