this post was submitted on 08 Aug 2023
314 points (94.6% liked)

Linux Gaming

[–] [email protected] 98 points 11 months ago (3 children)

I'm glad they found a way, but at the same time - what the hell? Why is it OK for game devs of this magnitude to have a hardcoded hardware list? Look for feature support, not a string that is easy to manipulate outside your control!

[–] [email protected] 34 points 11 months ago (1 children)

The problem in this case is that they automatically trigger XeSS, which isn't bad in itself (unless it can't be deactivated, which it sounds like here).

The GPU does support XeSS, but it crashes on Linux. If they just added a toggle or command-line flag to disable the feature, changing the vendorId wouldn't be necessary.

[–] [email protected] 11 points 11 months ago (1 children)

Could the game developers simply add this toggle for XeSS?

[–] [email protected] 34 points 11 months ago (1 children)

LinUX iS nOt A sUpPoRtEd PlaTfOrm

[–] [email protected] 12 points 11 months ago* (last edited 11 months ago) (1 children)

I am playing Baldur's Gate 3 (can't say enough great things about this game), and it has a toggle in the game settings for upscaling options. DLSS runs great on my Linux PC. I thought I heard Larian say they're trying to add XeSS too.

You can enable either Nvidia's DLSS or AMD's FSR via the settings menu.

[–] [email protected] 8 points 11 months ago

I imagine Larian cares, especially since they're pushing Steam Deck support.

The reason this is a "supported platform" issue is that the developers of Hogwarts Legacy know their supported platforms support XeSS, so anything beyond "just turn it on" is additional work for no gain.

[–] [email protected] 11 points 11 months ago

I would bet money that Intel's dev rel team worked closely with Avalanche to add XeSS support to sell more Intel GPUs.

Most likely the Hogwarts devs said, "sure, do whatever you want on your own hardware, just don't you dare break anything on any other platform while we're trying to ship." The easiest way to green-light this and know nothing else would be affected was to hard-code everything behind Intel's vendor IDs.

So this probably isn't a case of Intel working around a game dev's code, it's probably a case of Intel working around its own code.

[–] [email protected] 8 points 11 months ago (2 children)

IIRC, with an Nvidia card DXVK will spoof an AMD card in a lot of games, because otherwise the game will try to interact with the Windows Nvidia drivers, which aren't there.

[–] [email protected] 7 points 11 months ago

You remember correctly. From the DXVK conf file:

# Report Nvidia GPUs as AMD GPUs by default. This is enabled by default
# to work around issues with NVAPI, but may cause issues in some games.
#
# Supported values: True, False

# dxgi.nvapiHack = True

[–] [email protected] 4 points 11 months ago

interact with the Windows Nvidia drivers which aren't there

Funny story. I was trying to get ray tracing working under Wine for a few days and finally found the solution (I needed to download the nvlibs zip from GitHub and run the installer).

A couple of weeks later I went back into Wine and it was broken. After another three days of struggling, I decided to redownload nvlibs and run the installer, when I noticed it only symlinks the needed libraries into the WINEPREFIX. Me, being the resource miser I am, had removed the folder from ~/Downloads when I thought I was done with it ...