this post was submitted on 16 Jan 2025
28 points (100.0% liked)

Hardware

[–] [email protected] 5 points 1 day ago (1 children)

It was rather obvious Nvidia was fudging performance results via framegen. Now let's see those numbers verified independently and we'll probably land at a typical generation-to-generation improvement. I wouldn't be surprised if the neural shaders demo was done by an actual demoscene coder, because it looked like something someone could do in a 64k demo competition, not like something a developer could do at scale without a horde of artists.

I hope that things like input lag, frame persistence, and frame-time consistency become metrics we use to judge GPUs as soon as possible, because the current ones are no longer representative of hardware performance.
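
Something like the sketch below is what I mean, just back-of-the-envelope Python with made-up frame times (the numbers and names are mine for illustration, not from any real benchmark):

```python
# Rough sketch of the kind of metrics I mean, computed from per-frame
# render times in milliseconds. Sample data is invented, not measured.
import statistics

frame_times_ms = [8.3] * 95 + [40.0] * 5  # mostly smooth, a few big stutters

avg_fps = 1000 / statistics.mean(frame_times_ms)

# "1% low" FPS: the frame rate implied by the slowest 1% of frames.
worst = sorted(frame_times_ms)[-max(1, len(frame_times_ms) // 100):]
one_percent_low_fps = 1000 / statistics.mean(worst)

# Frame-time consistency: how much individual frame times deviate.
jitter_ms = statistics.pstdev(frame_times_ms)

print(f"average FPS:      {avg_fps:.0f}")
print(f"1% low FPS:       {one_percent_low_fps:.0f}")
print(f"frame-time stdev: {jitter_ms:.1f} ms")
```

An average near 100 FPS looks great on a spec sheet, while the 1% lows and the jitter tell you how it actually feels to play.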

[–] I_Has_A_Hat -1 points 1 day ago* (last edited 1 day ago) (1 children)

"Nvidia is faking their boosted performance by using technology that boosts performance!"

I seriously do not get the complaints about DLSS. Does it make games look better with more frames? Oh it does? And only people doing a frame-by-frame analysis to find things to complain about can tell any difference? Ok, so what's the issue? Seems like people are only complaining because it uses the scary AI words and they absolutely refuse to acknowledge that AI could be useful in any way, shape, or form.

[–] [email protected] 5 points 1 day ago* (last edited 1 day ago)

I’m not complaining about AI upscaling or image quality; I’m complaining that real performance gets lost when you focus only on FPS the way Nvidia presents it.

Modern frame gen has a miraculously small latency cost, but the important thing is that it doesn't fix latency or inconsistent frame pacing at all. You'll smooth out frame-pacing issues at a high enough FPS, which is cool, but that's it. When you're at too low a frame rate, you won't fix latency by adding more AI-interpolated frames (because they don't include your inputs) but by churning out real frames faster (upscaled is still real in this context). Once your base frame rate is at a good enough level, frame gen sees diminishing returns and you wonder what the point was in the first place. See Immortals of Aveum on Xbox Series X as an example.
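
To put rough numbers on it, here's the simplified model I'm working from (a sketch with assumed pipeline behaviour, ignoring render-queue and display latency, so treat the figures as illustrative only):

```python
# Back-of-the-envelope latency math for frame generation (toy model,
# illustrative numbers only). Interpolated frames raise the displayed
# frame rate, but inputs are only sampled on the "real" frames, and the
# interpolator holds back one real frame so it can blend between two.

def frame_gen_numbers(base_fps: float, generated_per_real: int) -> dict:
    base_frame_ms = 1000 / base_fps
    displayed_fps = base_fps * (1 + generated_per_real)
    # Assume roughly one base frame of input latency without frame gen,
    # and roughly one extra base frame of hold-back with it.
    input_latency_ms = base_frame_ms * (2 if generated_per_real else 1)
    return {
        "displayed_fps": displayed_fps,
        "approx_input_latency_ms": round(input_latency_ms, 1),
    }

print(frame_gen_numbers(base_fps=30, generated_per_real=3))   # "120 FPS", ~67 ms
print(frame_gen_numbers(base_fps=120, generated_per_real=0))  # real 120 FPS, ~8 ms
```

Same "120 FPS" on the slide, but in this toy model roughly eight times the input latency, and that gap is exactly what an FPS-only comparison hides.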