this post was submitted on 31 Jul 2023
63 points (93.2% liked)
I want to rant real quick.
I want to preface this by saying I'm not a game developer, but I have played a fair share of Unreal Engine games, and my honest opinion as a consumer is that it's a plague, especially in the indie game world. Show me one second of gameplay of any game and I could tell you with 100% certainty whether it's an Unreal Engine game or not. And the main issue isn't the engine itself; I bet it's a fine engine that can do everything a developer needs it to do.
The main issue in gaming generally, but most noticeably in UE, is the absolutely horrible TAA anti-aliasing. Somehow we went from crisp, sharp-looking games in 2010 to blurry messes today. UE is the biggest offender: every single one of its games uses TAA as the main AA method, and only with the sharpening filter turned up to 100 is it barely serviceable. On top of the blurriness you get visual artifacts, especially in picture-in-picture (PiP) rendering, so forget realistic scopes, mirrors, or particle effects. And if you use any other AA method, every character's hair looks like an unacceptable flickering wire mesh. We always see tech demos of amazing lighting and huge open landscapes rendered in real time with UE, but it all amounts to nothing if everything is blurred beyond recognition.
The second biggest gripe is the abysmal performance. Sure, if a game looks good you can expect it to be a bit more demanding on the hardware side. But thanks to TAA, no UE game actually looks good, so you're just left with the hardware demands. In the past, if your PC couldn't handle a game at max settings, you just toned them down a little and, voilà, your game ran well. That is practically impossible with UE. I have three UE games that I regularly play, and the difference between lowest and max settings in all of them is ~5 FPS. So your game looks like a PS2 game and you get barely any performance gain. Awesome, good job, UE. Not to mention that, in an attempt to maximize "performance", most NPCs farther than 50 m away are animated at 5 FPS, which looks really good on those big open landscapes with amazing lighting.
I am sure all of those problems can be solved when the engine is in the hands of a talented developer who knows what they're doing and values visual clarity and performance. But that is not what the vast majority of UE developers do. UE feels to me like a modular package: you just slap things together and it supposedly works. But you can't expect to create art by just slapping things together. It also feels like UE tries to be a jack of all trades but master of none to appeal to the broadest market, so that Epic can cash in on all that licensing money.
These are real issues but what is the alternative?
Most other engines are not better. Creating a new engine is very expensive, takes time, and is risky.
It seems like Unity is the go-to engine for 2D applications, but I'm always surprised how much developers can squeeze out of it for 3D games. Konami could get their heads out of their asses and sell the Fox Engine, or make it publicly available, since they aren't using it anymore. CryEngine has always looked stellar and is available for licensing.
I just don't understand: is Unreal Engine so much cheaper and better for development than any alternative? Is Epic's support better than its competitors'? Why does it seem like every second indie or double-A title uses UE?
We also see more and more developers moving to UE for sequels even when they already have a working engine (Insurgency ran on Source; Insurgency: Sandstorm runs on Unreal).
Our studio uses Godot. It's fantastic and open source.
I just watched the 2022 desktop/console showcase; out of all of them I've only played Brotato and Cassette Beasts (Switch). Looks very clean, but so far it mostly focuses on 2D and 2.5D games. I also saw a VR game in the showcase. Looks very interesting.
It is great for 2D. The 3D is definitely getting there, but it's not on the same level as Unreal or Unity yet. I think within a year or two it should catch up to Unity at least. We're super pleased with it.
CDPR are switching from their proprietary Red Engine to UE5 as well.
Yeah, anyone who says studios should just develop their own engine, or that it's not that hard, should look at Cyberpunk. Most of its bugs were engine-related, and all of its performance woes were too.
I'm actually sad; it ended up being a fine engine after they fixed it up for a year, and it'd be nice to have more alternatives to Unreal.
I completely agree, on both counts. I'm sad about the demise of Red Engine too, especially since the look of Cyberpunk was one of the things they nailed. Not just graphically, but things like small character movement animations during dialogues and facial expressions.
I'm fearful that the upcoming Cyberpunk 2 (when it releases in 10 years) will lose a lot of identity by being Unrealified.
Those character animations are an engine agnostic problem. That’s on the art department, any engine can handle it with ease.
That's possible. I know rigging for facial expressions used to be a big deal and varied a lot between engines, but at this point perhaps every option is at a sufficient level for it not to matter.
Unreal is way more versatile and easier to use than CryEngine, and a lot more capable for AAA game development than Unity. Looking at UE5, none of these alternatives have equivalents for features like Nanite or Lumen.
I saw the presentation of Nanite and Lumen a month ago and they seem like very interesting technologies. I still haven't seen a game implement Nanite to get a significant performance boost, though. Lumen is more of a filmmaker's tool, since lighting in games is often preferred to be more stylized than realistic. But this brings up another issue with UE: the constant updates distract developers from fulfilling their vision and finishing the game. Early Access titles often stall development to update to a new engine version and implement new technologies instead of providing content and bug fixes. And if they don't update to a new engine version, the community whines about it, so the devs have no choice. The versatility has its price; it's like UE tries to be a jack of all trades but master of none in an effort to provide everybody with a platform.
Lumen is not a filmmakers tool. Fortnite is already using it in production on current gen consoles, and Immortals of Aveum will be using it exclusively when it launches later this month.
Nanite is about eliminating LoD pop in without a performance penalty. I wouldn’t expect games to run faster, only look better.
Another big factor is developer engine knowledge. It's expensive to train developers on a new or unpopular engine when you can hire plenty of devs who are already familiar with a popular engine like Unreal. 343i continues to have this issue with Halo Infinite running on their Slipspace engine, which is why (IIRC) they're switching to Unreal for future games.
There isn’t a great alternative. SSAA is way too expensive, and old anti-aliasing techniques do not work well with shader-heavy games or really fine detail.
The fucktaa crowd would rather live with really nasty shimmering and other aliasing artifacts, or they have obnoxiously expensive setups that can drive SSAA, or displays with really high pixel densities. Personally, I think they're crazy; I find most TAA implementations look way better on my 27" 1440p monitor than no AA at all.
Lemmy needs its own /r/fucktaa
I wonder why exactly somebody decided that the search for a perfect AA method had to stop at TAA. We went from jagged edges to edge detection and oversampling (MSAA) being the standard in 2000–2012, but people were unsatisfied with the performance hit, so we needed a lighter method. So we got post-processing AA like SMAA, which is a scam and does absolutely nothing, or FXAA, which simply applies a blur filter to edges. Not the most elegant solutions, but they'll do if you can't afford MSAA. Then TAA came around the corner, and I don't even know how it manages to look so bad, because it sounds fine on paper. Using multiple frames to detect differences in contrast and then smoothing out those differences seems like an OK alternative, but it should never have become the main AA method.
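The "sounds fine on paper" part really is simple. A toy sketch of the core temporal accumulation (hypothetical, plain Python; real TAA also jitters the camera, reprojects the history buffer with motion vectors, and clamps history samples to fight ghosting, all of which this omits):

```python
# Minimal temporal accumulation: blend each new frame into a running
# history buffer with a small per-frame weight.

def taa_blend(history, current, alpha=0.1):
    """Blend the current frame into the history buffer (2D lists of floats)."""
    return [[(1 - alpha) * h + alpha * c for h, c in zip(hrow, crow)]
            for hrow, crow in zip(history, current)]

# On a static image the history converges and edges smooth out nicely.
# The problem is motion: stale history bleeds into the new frame, which
# is exactly the smearing and ghosting people complain about.
frame = [[0.0, 1.0, 0.0]]
history = frame
for _ in range(10):
    history = taa_blend(history, frame)
```

With a weight of 0.1, any pixel that moved keeps ~90% of its old value each frame, which is why slow, fine motion smears the worst.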
I honestly expected the AA journey to end with 4K resolution becoming the standard. AA is mostly a matter of pixel density over viewing distance. Mobile games mostly have no AA because their pixel density is ridiculous; console games also rarely have AA because you sit ten feet away from the screen. PC is the only outlier, but it certainly has the spare power to run at higher resolutions than consoles. Somewhere along the way, though, Nvidia decided to go all in on ray tracing and dynamic resolution instead of raw 4K performance. And Nvidia basically dictates where the gaming industry goes.
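The pixel-density-over-viewing-distance point can be made concrete with a back-of-the-envelope calculation (the screen sizes and distances below are assumed, typical setups, not measurements):

```python
import math

def pixels_per_degree(width_px, screen_width_in, distance_in):
    """Horizontal pixels per degree of visual angle at a viewing distance."""
    ppi = width_px / screen_width_in
    # Width on screen subtended by one degree of visual angle, in inches.
    inches_per_degree = 2 * distance_in * math.tan(math.radians(0.5))
    return ppi * inches_per_degree

# Assumed, typical setups:
phone   = pixels_per_degree(1080, 2.7, 12)    # ~400 ppi phone at 12 in: ~84 px/deg
console = pixels_per_degree(3840, 48.0, 120)  # 55" 4K TV at 10 ft: ~168 px/deg
pc      = pixels_per_degree(2560, 23.5, 24)   # 27" 1440p monitor at 24 in: ~46 px/deg
# Around 60 px/deg, aliasing stops being obvious: the phone and the couch
# clear that bar easily; the desktop monitor does not. Hence PC is the outlier.
```

The exact threshold varies per person, but the ordering is the point: the desktop setup is the only one where the eye can still resolve individual pixels.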
So I honestly blame Nvidia for this whole mess, and most people can agree that Nvidia has dropped the ball over the last couple of years. Their flagship cards cost more than all the consoles from Sony, Microsoft, and Nintendo combined. They cost more than mid- to high-range gaming laptops. And the raw power gain has been something like 80% over the last 10 years, because they put all their R&D into gimmicks.
I get quite good AA by rendering at 4K and letting the graphics card downscale it to the screen's 1080p resolution. No AA needed; looks fine.
That is basically MSAA without the edge detection. Rendering in 4K and downscaling is the dirtiest but most effective AA method. But downscaling the whole screen also applies to UI elements, which often results in tiny blurry fonts if the UI isn't scaled appropriately. More and more games have started adding a render resolution scale option that goes beyond 100% without affecting the UI, though. Downscaling also causes latency issues: I can run Metal Gear Solid V at a stable 60 FPS at 4K, but the display latency is very noticeable compared to 1440p at 60.
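Conceptually, that downscale is just brute-force supersampling: each output pixel averages a block of rendered pixels. A toy 2×2 box filter (a simplification; real GPU scalers may use fancier filters) shows why the jaggies disappear:

```python
def downscale_2x(image):
    """Average each 2x2 block of a rendered image into one output pixel."""
    h, w = len(image), len(image[0])
    return [[(image[y][x] + image[y][x + 1] +
              image[y + 1][x] + image[y + 1][x + 1]) / 4
             for x in range(0, w, 2)]
            for y in range(0, h, 2)]

# A hard diagonal edge in the high-res render...
hi_res = [
    [1, 1, 0, 0],
    [1, 0, 0, 0],
    [0, 0, 0, 0],
    [0, 0, 0, 0],
]
# ...becomes smooth intermediate coverage values at the lower resolution:
lo_res = downscale_2x(hi_res)  # [[0.75, 0.0], [0.0, 0.0]]
```

The 0.75 is the edge pixel picking up partial coverage, which is exactly what AA is supposed to produce, just paid for with four times the rendering work.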
I miss the time when you could just disable a game's native AA and force MSAA through the Nvidia Control Panel. But most newer titles don't accept Nvidia's override, especially Unreal games.
MSAA only samples the geometry multiple times, not the whole scene. It doesn’t work very well in games with a lot of shaders and other post process work, which is basically every game made in the last decade.
What GP is describing is SSAA (Super sampled anti-aliasing).
That's what I meant by edge detection. I think part of the downfall of MSAA in modern gaming is foliage. Nowadays every field in video games is filled with lush grass, and the same goes for trees and bushes; they aren't flat textures or low-poly models anymore. Most engines use completely different rendering methods for foliage to get thousands of swaying leaves and blades of grass on screen with minimal performance impact. But having to detect all the edges of every single blade of grass and apply oversampling to them would make any game run at single-digit frame rates. There are certainly other things that GPUs have to render in bulk to justify novel rendering methods, but foliage is by far the best example. So I can understand why post-processing AA is easier to implement. But is TAA really the best we can do? Especially because things like swaying grass become a green blob through TAA; slow, fine movement like swaying is really the bane of temporal sampling.
That is an insanely expensive solution to the problem. You are cutting performance by 75% or more, meaning your 30 FPS game could be doing 120 if you stuck to native 1080p.
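The 75% figure follows straight from the pixel counts (assuming shading cost scales roughly linearly with resolution, which it does when the game is GPU-bound):

```python
# Pixel counts behind the "75%" figure.
native = 1920 * 1080           # 2,073,600 pixels per frame
supersampled = 3840 * 2160     # 8,294,400 pixels per frame
ratio = supersampled / native  # 4.0: four times the pixels to shade
# So a frame budget that yields 120 FPS at native 1080p leaves roughly
# 120 / 4 = 30 FPS when rendering at 4K and downscaling.
```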
That's the thing, my game is running at 60+, and I don't need more.
In any case, new graphics cards are built to run games at 4K, so having a 1080p screen, which I'm content with, is a godsend performance-wise; it lets me do stuff like this with no practical performance loss.
Well, there's always DLSS and FSR. I don't even use AA anymore, because DLSS Balanced looks so much better than even native resolution with 8x MSAA.
MSAA doesn’t do anything for modern games because just about every surface has multiple pixel shaders applied on top. This is why few games bother to support it.
Yes, I agree; I wrote another comment about how I think the prevalence of realistic foliage in modern games might have been the biggest factor in MSAA's abandonment.
DLSS, and especially FSR, are basically TAA repurposed for upscaling.
But unlike the vast majority of TAA implementations, they are actually good.
I wish I could experience DLSS. I'm still rocking a 1080 Ti, so no DLSS for me, only FSR. But in my opinion FSR is such a visual downgrade for a minuscule performance boost. Especially in PvP games, where you can get killed by a single pixel, playing at a reduced resolution is a dealbreaker. I've heard DLSS looks a lot better than FSR, but I'm going to run the 1080 Ti until it dies, since it still runs nearly everything maxed out at 1440p.
It gets even worse when non-game applications use frameworks designed for games. Take Stud.io, the virtual LEGO-building CAD: even if you don't touch anything, it still renders 60 frames per second. Whenever I use it, the fans spin up even when it's idling. And don't even think of running it on a battery-powered laptop...
I could be wrong, but with a lot of complaints like that, the issue ends up being at least partially the fault of the studio being too lazy to adjust Unreal's defaults. I'd be surprised if the engine doesn't let you stop rendering and preserve the current image on screen.
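The fix for that kind of idle churn is render-on-demand: only redraw when something actually changed. A minimal sketch of the idea in plain Python (the function names and loop shape here are made up for illustration, not Unreal's or Stud.io's real API):

```python
import time

def run_loop(get_events, render, max_fps=60, idle_poll=0.1):
    """Redraw only when the scene is dirty; otherwise sleep.

    get_events/render and the loop structure are hypothetical
    placeholders, not any real engine's API.
    """
    dirty = True                      # first frame always draws
    frames_drawn = 0
    while True:
        if dirty:
            render()
            frames_drawn += 1
            dirty = False
            time.sleep(1 / max_fps)   # cap the redraw rate
        else:
            time.sleep(idle_poll)     # idle: near-zero CPU/GPU work
        events = get_events()
        if "quit" in events:
            return frames_drawn
        if events:                    # any input marks the scene dirty
            dirty = True
```

With a loop like this, an untouched scene costs one redraw total instead of 60 per second, which is exactly what a CAD tool sitting idle should be doing.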
I don't know if they're using Unreal or Unity or whatever, but it really sucks, and it's also so f-ed up that it won't run properly in a Wine environment under Linux.