this post was submitted on 11 Nov 2024
PC Gaming
One of the main points of this video is that 1080p testing is the only thing you should be looking at for CPU benchmarks (to the point that HWUnboxed may stop doing 2k/4k CPU testing going forward, I think?), and although I was skeptical at first, the future-proof section did finally convince me. The new problem with this line of thinking is that you really need to cross-reference a GPU benchmark to figure out what a real-world 2k/4k scenario will look like for the CPU you're interested in.
I don't understand this line of thinking. I mean, from the point of view of a reviewer, and maybe some enthusiasts, it makes sense, but for the average Joe like me with a 5900X who only plays games at 4k and is planning to upgrade to a 5800/5900 GPU when they release, all we want to know is: will this new CPU improve my FPS, and if so, by how much?
As I understand it, the assertion is that the 1080p FPS would match the 2k/4k FPS if you had an infinitely powerful GPU. So the 1080p FPS is your maximum potential FPS at any resolution with that CPU, and then you look at a 2k/4k GPU chart to see how much of that potential your GPU can actually reach. HWUnboxed also reasons that gamers are not blindly using ultra settings, so in real scenarios people are going to be lowering their settings to hit a specific FPS target anyway. They also mention that lowering in-game settings doesn't usually change the CPU-limited FPS result.
So, in summary, the 1080p CPU benchmark is roughly the highest FPS that CPU can achieve, and it's then up to your GPU and in-game settings how much of that target you actually reach. It's a little harder to grasp and calculate mentally, but it keeps the 2k/4k benchmark data from becoming effectively misleading "point in time" data that won't be useful if you have a different GPU or different in-game settings. This is most clearly demonstrated by the future-proof section's re-reviews of older CPUs: putting massive GPUs on old CPUs brings the FPS benchmarks at all resolutions to roughly the same value, i.e. resolution itself has little effect on the CPU's contribution; the differences at higher resolutions are mainly down to the GPU.
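A minimal sketch of that reasoning, with entirely hypothetical numbers (the figures and the min() model below are illustrative assumptions, not HWUnboxed's data): the 1080p CPU result acts as an FPS ceiling, and your real-world frame rate at any resolution is roughly whichever limit you hit first, that ceiling or the GPU.

```python
# Illustrative sketch of the "1080p CPU result as an FPS ceiling" argument.
# All numbers are hypothetical, not taken from any actual benchmark.

def expected_fps(cpu_fps_1080p: float, gpu_fps_at_target: float) -> float:
    """Rough real-world estimate: you hit whichever limit comes first,
    the CPU ceiling (measured at 1080p) or the GPU at your resolution/settings."""
    return min(cpu_fps_1080p, gpu_fps_at_target)

# Hypothetical 1080p CPU benchmark results (the "ceiling" for each CPU).
cpu_ceiling = {"older CPU": 150.0, "newer CPU": 180.0}

# Hypothetical GPU-limited results at 4k and 1440p with the same GPU.
gpu_limit = {"4k": 110.0, "1440p": 165.0}

for res, gpu_fps in gpu_limit.items():
    old = expected_fps(cpu_ceiling["older CPU"], gpu_fps)
    new = expected_fps(cpu_ceiling["newer CPU"], gpu_fps)
    gap = (new - old) / new * 100
    print(f"{res}: older ~{old:.0f} FPS, newer ~{new:.0f} FPS, gap ~{gap:.0f}%")

# At 4k both CPUs sit at the GPU limit (gap ~0%); at 1440p part of the CPU
# difference shows up again because the GPU limit is above the older CPU's
# ceiling -- which is why the 1080p number is the "potential" figure.
```

The min() is obviously a simplification since real scaling isn't a hard cutoff, but it captures why 2k/4k CPU charts mostly end up reflecting whichever GPU was used in the test.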
Yes, I understand the theory, but at the end of the day I watch reviews so I don't have to interpolate data or make guesses. If I see something like this, I can clearly see that, given the same GPU, my 5900X should be around 6% slower than the 9800X3D at 4k on average, and here I can see that it is around 15% slower at 1440p, which is useful when using DLSS. That makes it clear to me that an upgrade is not worth it. Those are the kinds of details I want to see in reviews.
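For what it's worth, that kind of chart reading boils down to simple relative differences; here's a rough sketch with hypothetical FPS values (not the actual review numbers) of turning them into an upgrade call against a personal threshold:

```python
# Hypothetical chart readings (FPS) -- not the actual review numbers.
current = {"4k": 100.0, "1440p": 130.0}    # e.g. the CPU you already own
candidate = {"4k": 106.0, "1440p": 150.0}  # e.g. the upgrade you're eyeing

UPGRADE_THRESHOLD = 0.20  # personal cutoff: only upgrade for a 20%+ gain

for res, fps in current.items():
    gain = candidate[res] / fps - 1
    verdict = "worth it" if gain >= UPGRADE_THRESHOLD else "not worth it"
    print(f"{res}: ~{gain:.0%} faster -> {verdict}")
```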