No, it's literally how software works. New hardware comes out, you do more with the hardware, and old hardware can't do the new things, so it runs worse.
I get what you're saying, but I've been upgrading my PC over the years and still notice that big game companies care less and less about performance. I firmly believe that publishers, in an attempt to cut costs, tell their studios not to prioritize performance and to instead rely on software like super-resolution (upscaling) algorithms to make their games run acceptably. In some instances they reused old game engines for a new and bigger game, for example with Cyberpunk, Stellaris, and Elden Ring. Smaller developers are doing everything they can to make a game run smoothly; the best example of this is Factorio. That is my opinion, and I totally understand your point of view.
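For what it's worth, here's a minimal C++ sketch of the core idea those super-resolution techniques rely on: shade the scene at a reduced internal resolution, then upscale to the display. The struct, function, and preset numbers here are illustrative assumptions, not any engine's actual API; real techniques like DLSS and FSR add temporal and ML-based reconstruction on top of this.

```cpp
// Sketch: render at a reduced internal resolution, upscale to the display.
// Names and numbers are illustrative, not any real engine's API.
#include <cstdio>

struct Resolution {
    int width;
    int height;
};

// Compute the internal resolution the renderer actually shades at.
Resolution internalResolution(Resolution display, float renderScale) {
    return {static_cast<int>(display.width * renderScale),
            static_cast<int>(display.height * renderScale)};
}

int main() {
    Resolution display{3840, 2160};   // 4K output target
    float renderScale = 0.5f;         // hypothetical "performance" preset

    Resolution internal = internalResolution(display, renderScale);
    long displayPixels  = static_cast<long>(display.width) * display.height;
    long internalPixels = static_cast<long>(internal.width) * internal.height;

    // Per-pixel shading cost scales roughly with pixels shaded, so a 0.5x
    // render scale cuts that GPU work to about a quarter.
    std::printf("Display:  %dx%d (%ld px)\n",
                display.width, display.height, displayPixels);
    std::printf("Internal: %dx%d (%ld px, %.0f%% of display)\n",
                internal.width, internal.height, internalPixels,
                100.0 * internalPixels / displayPixels);
}
```

At a 0.5x render scale the GPU shades roughly a quarter of the pixels, which is why leaning on upscaling can substitute for optimization work.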
So, I agree there's some amount of that. You also have things like DICE (the studio that makes Battlefield), which lost its veteran development team to poor internal management.
There are also some (now fairly large) studios that are just absolutely terrible at game performance, like Studio Wildcard (makers of the Ark games).
There's definitely some of this too. I believe the bigger issue is that games have gotten so much bigger and more expensive to develop. Making and shipping a game with 4K textures, dynamic (possibly ray-traced) lighting, automatic level-of-detail systems (instead of hand-authored LOD meshes), etc. is a lot to get right.
A common thing in any software development is to take advantage of newer abstractions that make your life easier. For instance, I'm fairly confident Hunt: Showdown 1896 has moved to some form of automatic or continuous level-of-detail instead of discrete hand-authored models (before the 1896 update, when you zoomed in on some of the trees they'd literally change shape as they flipped between models in the worst case; I've yet to see that since). Not having to make a bunch of models, and instead having the software "just figure out" good lower-poly models for things that are sufficiently far away, is presumably a huge productivity boost. Similarly, when ray-traced lighting becomes the standard, a lot of game development will get easier because setting up lighting won't (per my understanding) require as many tricks. In both cases it's less work for developers and a better result for players with the hardware to run it.
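To illustrate why hand-authored LOD pops are visible, here's a minimal sketch of discrete distance-threshold LOD selection. The mesh names and distances are made up for the example and aren't any engine's actual data.

```cpp
// Sketch: discrete, hand-authored LOD selection by camera distance.
// Mesh names and thresholds are hypothetical.
#include <cstdio>
#include <initializer_list>

struct LodLevel {
    const char* mesh;      // hand-authored model for this detail level
    float maxDistance;     // use this mesh up to this camera distance
};

// Artists author a few discrete meshes per object.
const LodLevel kTreeLods[] = {
    {"tree_lod0_high.mesh",   30.0f},
    {"tree_lod1_medium.mesh", 90.0f},
    {"tree_lod2_low.mesh",   250.0f},
    {"tree_billboard.mesh",  1e9f},   // flat card beyond 250 m
};

const char* selectLod(float cameraDistance) {
    for (const LodLevel& lod : kTreeLods) {
        if (cameraDistance <= lod.maxDistance) return lod.mesh;
    }
    return kTreeLods[3].mesh;
}

int main() {
    // Zooming in: the mesh swaps abruptly each time a threshold is crossed.
    for (float d : {300.0f, 100.0f, 89.0f, 31.0f, 29.0f}) {
        std::printf("%6.1f m -> %s\n", d, selectLod(d));
    }
}
```

Because each level is a separate hand-made mesh, crossing a threshold swaps the whole model in a single frame, which is exactly the shape flip described above; automatic systems sidestep that by generating or streaming detail continuously.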
Old engines aren't necessarily a bad thing (if they're appropriately updated), and I think people focus too much on the engine versus the gameplay. Take Starfield: I've heard a lot of people on forums complain that it copies a formula similar to some of Bethesda's past titles.
The issue almost certainly isn't the engine used, but the design choices associated with using that engine (and the decision to not make new things work).
Linux, Darwin (macOS), Windows, Chrome, Firefox, etc. are all long-running software projects (as are Unreal Engine, Unity, Source Engine, CryEngine, etc.). Occasionally someone throws out their current product entirely and replaces it, but normally incremental upgrades provide the new functionality that's desired.
The performance profile of something like Factorio versus Cyberpunk, Elden Ring, or Hunt: Showdown is extremely different: Factorio is dominated by CPU-side simulation work, while the others are dominated by GPU rendering work.
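One common way to structure that difference is a fixed-timestep loop that decouples the simulation tick from rendering (Factorio famously targets 60 updates per second). Here's a minimal C++ sketch of that pattern; the tick rate, frame cap, and the sleep standing in for render work are illustrative assumptions, not Factorio's or any other game's actual code.

```cpp
// Sketch: fixed-timestep simulation decoupled from variable-rate rendering.
// Numbers and the sleep-as-render stand-in are illustrative.
#include <chrono>
#include <cstdio>
#include <thread>

int main() {
    using clock = std::chrono::steady_clock;
    const double tickSeconds = 1.0 / 60.0;  // fixed 60 UPS simulation
    double accumulator = 0.0;
    auto previous = clock::now();

    long ticks = 0, frames = 0;
    while (frames < 120) {  // run a couple of seconds' worth of frames
        auto now = clock::now();
        accumulator += std::chrono::duration<double>(now - previous).count();
        previous = now;

        // Simulation-bound games live here: huge entity updates per tick.
        // If this loop can't keep up, UPS drops.
        while (accumulator >= tickSeconds) {
            ++ticks;                 // updateWorld() would go here
            accumulator -= tickSeconds;
        }

        // Render-bound games live here: the GPU frame dominates, and
        // upscaling/LOD exist to shrink this cost. Sleep simulates it.
        std::this_thread::sleep_for(std::chrono::milliseconds(8));
        ++frames;                    // renderFrame() would go here
    }
    std::printf("ticks=%ld frames=%ld\n", ticks, frames);
}
```

In a simulation-bound game the inner tick loop is what falls behind; in a render-bound game the per-frame GPU cost is what features like upscaling and automatic LOD exist to shrink.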