I posted this in another thread, but there are some applications where this display technology is actually needed. For example, with VR/AR, a 1000Hz display would mean each frame is only displayed for 1ms. Being that quick means the headset can better reflect the micro movements your head and body make, which in turn reduces the disconnect and motion sickness people get with VR/AR.
Yes, but it also reduces the computation budget for that frame to 1ms.
As a dev, that's daunting.
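For a sense of scale, the frame budget is just the reciprocal of the refresh rate. A quick back-of-the-envelope sketch (Python; the refresh rates are picked purely for illustration):

    # frame budget in milliseconds = 1000 / refresh rate in Hz
    for hz in (90, 120, 1000):
        print(f"{hz:4d} Hz -> {1000 / hz:5.2f} ms per frame")
    # prints:
    #   90 Hz -> 11.11 ms per frame
    #  120 Hz ->  8.33 ms per frame
    # 1000 Hz ->  1.00 ms per frame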
90Hz is generally enough for most people not to get motion sick. Some headsets do 120Hz, which is about an 8ms frame time. Humans can barely detect a flash of light that lasts that long.
The last sentence is simply incorrect. Humans can detect single photons in specific environments. https://www.nature.com/articles/ncomms12172
In real environments it depends very much on the brightness of the flash of light.
You absolutely can tell the difference between 90Hz and 1kHz. Just draw a squiggly line! See this video for a rather dramatic demonstration:
Microsoft Research: High Performance Touch
This is a demonstration of latency, not frame rate. Did you intend to link something different?
A 1000Hz display necessarily has a latency of 1ms between frames. For 100Hz, that’s 10ms.
But this is only the lower bound. You have to include all other sources of latency, such as software, input hardware, drivers, graphics card, etc.
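To make that concrete, here's a rough sketch of how those other sources stack on top of the refresh interval (Python; the individual numbers are made-up placeholders, not measurements):

    refresh_hz = 1000
    display_interval_ms = 1000 / refresh_hz   # 1 ms lower bound from the display alone
    # Hypothetical contributions from the rest of the pipeline:
    other_latency_ms = {
        "input sampling": 1.0,
        "OS / driver": 2.0,
        "render + GPU": 5.0,
        "scanout / panel response": 1.5,
    }
    total_ms = display_interval_ms + sum(other_latency_ms.values())
    print(f"motion-to-photon >= {total_ms:.1f} ms")   # well above the 1 ms frame interval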
Ahhh, now I see the connection! It's the update interval. I had to chew on it for a minute but the math checks out.