this post was submitted on 07 Sep 2023
50 points (96.3% liked)

PC Master Race

Lately we've been talking about games not performing well enough on current hardware. It's had me wondering just what we should be asking for. I think the basic principle is that components from the last 5 years should be adequate to play current-generation titles at 1080p60. Not at max settings, of course, but certainly playable without resorting to DLSS and FSR.

It makes me wonder: is it really so much to ask? There are games from 10+ years ago that still look great, or at least acceptable. Should we expect new games like Starfield to be configurable down to the demands of an older game like Portal 2 or CS:GO? If the gameplay is what really matters, and games of the 2010s looked good at the time, why can't we expect current games to scale that low?

From what I've seen, GTX 1070 users need to play Starfield at 720p with FSR to get 60fps. Which is better: 60fps at 720p with FSR, or 1080p with reduced texture resolution and model detail?
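
The raw arithmetic behind that tradeoff is easy to check: 720p shades well under half the pixels of 1080p, which is why dropping the render resolution buys so much performance before the upscaler even runs. A back-of-the-envelope sketch (plain arithmetic, no engine specifics):

```python
# Rough pixel-throughput comparison between render resolutions.
def pixels(width, height):
    return width * height

native_1080p = pixels(1920, 1080)   # 2,073,600 pixels per frame
native_720p = pixels(1280, 720)     #   921,600 pixels per frame

# Rendering at 720p shades roughly 44% of the pixels of native 1080p,
# before FSR upscales the result back to the display resolution.
ratio = native_720p / native_1080p
print(f"720p renders {ratio:.0%} of the pixels of 1080p")
# → 720p renders 44% of the pixels of 1080p
```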

It shouldn't even be that hard to pull off. It should be possible to automatically create lower-detail models and textures, and other details can simply be turned off.
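
For textures, at least, the "automatic" part does exist in a simple form: generating a half-resolution level is essentially a 2x2 box filter, the same idea behind mipmaps. A minimal grayscale sketch (real pipelines also handle compressed formats, normal maps, and artist review, so this is only the easy first step):

```python
# Hedged sketch: halve a texture's resolution with a 2x2 box filter,
# as when generating a mipmap level. Grayscale values only.
def downscale_half(texture):
    """texture: 2D list of grayscale values; returns a half-size copy."""
    h, w = len(texture), len(texture[0])
    out = []
    for y in range(0, h - 1, 2):
        row = []
        for x in range(0, w - 1, 2):
            # Average each 2x2 block of texels into one output texel.
            block = (texture[y][x] + texture[y][x + 1] +
                     texture[y + 1][x] + texture[y + 1][x + 1])
            row.append(block / 4)
        out.append(row)
    return out

tex = [[0, 0, 255, 255],
       [0, 0, 255, 255],
       [255, 255, 0, 0],
       [255, 255, 0, 0]]
print(downscale_half(tex))  # [[0.0, 255.0], [255.0, 0.0]]
```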

[–] [email protected] 6 points 1 year ago* (last edited 1 year ago)

Game dev working at a veteran studio here (not a veteran myself, but I get lots of exposure to vets).

I agree, modern games spec their minimums a bit too high for what's possible. I'm also very much against games judging their performance with DLSS or FSR enabled. They're perfectly good tools for getting MORE fps, or for using higher resolutions without tanking performance, but modern games need to stop using them as the baseline for performance.

But that very last line really isn't true, at all. Rendering is incredibly complicated. Automatically creating lower-detail models and textures is not simple. LODs and lower-res assets are made easier with tools, but it's still a complex process that requires a lot of effort from many talented artists, and ensuring they work well in your engine is not an automatic process.
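
To make the gap concrete, here is a toy sketch of vertex-clustering decimation, about the simplest "automatic LOD" technique there is: snap vertices to a coarse grid and drop triangles that collapse. Even this version degenerates geometry, ignores UVs, normals, and silhouettes, and keeps duplicate triangles, which hints at why production LOD generation still needs tooling and artist oversight:

```python
# Hedged sketch of vertex-clustering mesh decimation. Not a production
# algorithm: real pipelines use error metrics (e.g. quadric error),
# preserve attributes, and are reviewed by artists.
def decimate(vertices, triangles, cell=1.0):
    clusters = {}       # grid cell -> index into new_vertices
    remap = []          # old vertex index -> new vertex index
    new_vertices = []
    for (x, y, z) in vertices:
        # Snap each vertex to the center of its grid cell.
        key = (round(x / cell), round(y / cell), round(z / cell))
        if key not in clusters:
            clusters[key] = len(new_vertices)
            new_vertices.append((key[0] * cell, key[1] * cell, key[2] * cell))
        remap.append(clusters[key])
    # Keep only triangles whose three corners remain distinct.
    new_triangles = []
    for (a, b, c) in triangles:
        ra, rb, rc = remap[a], remap[b], remap[c]
        if len({ra, rb, rc}) == 3:
            new_triangles.append((ra, rb, rc))
    return new_vertices, new_triangles

low_verts, low_tris = decimate(
    vertices=[(0, 0, 0), (0.1, 0, 0), (2, 0, 0), (2, 2, 0)],
    triangles=[(0, 2, 3), (1, 2, 3)],
    cell=1.0)
print(len(low_verts), len(low_tris))  # 3 2
```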

I know somebody is going to mention something like blah blah Nanite, blah blah Lumen, blah blah Unreal Engine, but Unreal Engine is not a cure-all. We don't want the whole games industry using a single engine; that's unhealthy for software and for the industry, and it locks all talent into one piece of software. Also, Lumen and Nanite don't even help performance on lower-end devices: they're designed mostly for mid-to-high-end graphics, as both are intensive processes in their own right.

Then there's the whole matter of modern rendering techniques. Ever since the birth of graphics, hardware has constantly improved, techniques have changed, and things have had to be left behind...

  • Fixed-pipeline cards vs. programmable-pipeline cards posed a huge challenge to developers at the time, because they suddenly had to support both types of card. We no longer support fixed-pipeline cards, as they are obsolete.

  • Compute shaders allowed work to be offloaded to the GPU and enabled modern, enhanced post effects and rendering techniques. That meant games had to ship both a compute and a non-compute solution for the necessary graphics effects, which is not an easy process. Modern games tend to require compute cores, since, as far as I know, all current- and last-gen consoles support some form of compute shader, as do modern GPUs.
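
That "both solutions" burden is the kind of thing that quietly doubles maintenance work. A hedged, engine-agnostic sketch of the pattern (all names here are hypothetical, not a real engine API):

```python
# Illustrates the two-code-path burden: one effect, two implementations,
# selected by a capability flag. Both paths must be written, tested, and
# maintained for every effect across the supported hardware range.
def blur_fallback(pixels):
    # Non-compute path: simple three-tap blur on the CPU, clamped at edges.
    n = len(pixels)
    return [(pixels[max(i - 1, 0)] + pixels[i] + pixels[min(i + 1, n - 1)]) / 3
            for i in range(n)]

def blur_compute(pixels):
    # Stand-in for work that a real engine would dispatch to GPU compute
    # cores; here it just reuses the CPU path so the sketch is runnable.
    return blur_fallback(pixels)

def apply_blur(pixels, supports_compute):
    # Every effect needs a branch like this while both paths are supported.
    return blur_compute(pixels) if supports_compute else blur_fallback(pixels)
```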

The same will happen with modern rendering techniques and raw GPU power. The difference between a 980 Ti and an RTX 4080 is absolutely insane, and the advent of ray tracing and AI cores has widened the gap even more. Devs need to make concessions and cut off a certain range of hardware to make their games achievable. Tech innovations let devs use tools and methods, and realise concepts, that were previously impossible or badly compromised by technological limitations, but they can't pursue those innovations while held back by a much older set of hardware that can't do what modern hardware can. That balance matters, and some games (Teardown, for example) need to leave aging hardware behind so that the game is possible at all.

That said, I know for a fact that if they can make a game run on a Switch, an Xbox One, or a PS4, then they can most definitely make it run on a graphics card of that era. Game devs do a lot of hacky shit to get games running on hardware that old (the PS4 came out ten years ago), so I understand if a PC port doesn't quite reach that level of optimisation, but if your game runs on a PS4, it should run pretty well at low settings on a 980 Ti.

Anyway, I'm pretty tired, so mind any mistakes. The issue isn't just "game devs are lazy"; there are many layers to it. The tools available today are vast, but games are still incredibly hard to make as their complexity keeps rising, so the issues you see are likely ones that software engineers genuinely struggle to resolve. I'm not saying games like Starfield don't deserve criticism, just be mindful and check your assumptions before concluding it's a simple problem to solve.