this post was submitted on 11 Sep 2024
124 points (96.3% liked)
Games
Welcome to the largest gaming community on Lemmy! Discussion for all kinds of games. Video games, tabletop games, card games etc.
Rules:
- Submissions have to be related to games
- No bigotry or harassment, be civil
- No excessive self-promotion
- Stay on-topic; no memes, funny videos, giveaways, reposts, or low-effort posts
- Mark Spoilers and NSFW
- No linking to piracy
More information about the community rules can be found here.
you are viewing a single comment's thread
It’s hard to tell if this would have been the case in past generations. I think this only became true once we crested the graphical plateau where all games look “good enough” in HD.
It's not like console gamers were ever given a choice, but PC gamers kept wanting PC ports precisely for more frames than the 30 fps standard. Graphics were already good in the PS4 era, and the PS5 is still leaning so hard on PS4 games in its PS5 Pro showcase. Console users wanting the same thing now that they finally have the option, over a decade later, shows they aren't so different from PC gamers in loving frames.
That sends me back to when people in online discussions regularly claimed that anything above 60 fps is pointless because the human eye can't see more than that anyway.
That claim is such a pet peeve of mine. That's not even how our eyes work, and it's demonstrably untrue.
You can even prove it false yourself: flick the mouse cursor quickly across the screen and, instead of a smooth motion blur, you see a trail of distinct cursor positions, and the gaps between them shrink noticeably at higher refresh rates.
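For the curious, here's a rough back-of-the-envelope sketch in Python of why that trail is visible. The 3000 px/s flick speed is just an assumed, illustrative number; the point is that the gap between successive cursor positions is the flick speed divided by the refresh rate.

```python
# Rough sketch: gap between successive cursor positions during a fast flick.
# The 3000 px/s flick speed is an assumed, illustrative number.
def cursor_gap_px(speed_px_per_s: float, refresh_hz: float) -> float:
    """Distance the cursor jumps between two consecutive refreshes."""
    return speed_px_per_s / refresh_hz

for hz in (60, 120, 240):
    print(f"{hz:>3} Hz: ~{cursor_gap_px(3000, hz):.0f} px between cursor positions")

# 60 Hz: ~50 px gaps -> a clearly visible dotted trail
# 120 Hz: ~25 px
# 240 Hz: ~12 px
```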
Even if the eye notices it, it's not really a big deal most of the time unless you're playing some real-time multiplayer game, and going from 60 to 120 literally doubles the number of frames the GPU has to render, raising the GPU requirement for no fucking reason 99% of the time.
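To put numbers on the cost side, here's a quick sketch (assuming the GPU fully renders every frame, with no frame generation or interpolation): doubling the target frame rate halves the time budget per frame.

```python
# Quick sketch: per-frame time budget at different target frame rates.
# Assumes every frame is fully rendered by the GPU (no frame generation).
def frame_budget_ms(target_fps: float) -> float:
    """Milliseconds the GPU has to finish one frame."""
    return 1000.0 / target_fps

for fps in (30, 60, 120):
    print(f"{fps:>3} fps: {frame_budget_ms(fps):.1f} ms per frame")

# 30 fps: 33.3 ms, 60 fps: 16.7 ms, 120 fps: 8.3 ms;
# twice the frames means half the time to render each one.
```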