this post was submitted on 22 Nov 2024

PC Gaming

[–] Ptsf 4 points 1 month ago* (last edited 1 month ago)

Not to defend Nvidia entirely, but there used to be real cost savings from actual die shrinks, since process-node improvements delivered substantial increases in transistor density. Node gains in recent years have been smaller, so now they have to use larger and larger dies to increase performance despite the process improvements. That's how you get things like the 450 W 4090 even though it's significantly more efficient per watt, and it means they get fewer GPUs per silicon wafer, since wafer sizes are industry-standardized for the extremely specialized chip-manufacturing equipment. Fewer dies per wafer means higher chip costs by a pretty big factor. That being said, they're certainly... "proud of their work."
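To put rough numbers on the "fewer dies per wafer" point, here's a sketch using the common gross-dies-per-wafer approximation (it ignores defect yield and scribe lines, so treat it as illustrative only). The ~609 mm² figure is the commonly cited AD102 (RTX 4090) die area; the 300 mm² die is a hypothetical smaller chip for contrast.

```python
import math

def dies_per_wafer(wafer_diameter_mm: float, die_area_mm2: float) -> int:
    """Approximate gross dies per wafer.

    Uses the standard estimate: wafer area / die area, minus an
    edge-loss correction term. Ignores defects and scribe lines.
    """
    radius = wafer_diameter_mm / 2
    gross = math.pi * radius**2 / die_area_mm2
    edge_loss = math.pi * wafer_diameter_mm / math.sqrt(2 * die_area_mm2)
    return int(gross - edge_loss)

# Standard 300 mm wafer; compare a hypothetical ~300 mm^2 die
# against a large ~609 mm^2 die (roughly AD102-sized).
for area in (300, 609):
    print(f"{area} mm^2 die: ~{dies_per_wafer(300, area)} dies per wafer")
```

Roughly doubling the die area cuts the candidate dies per wafer by more than half (edge losses hit big dies harder), and that's before defect yield, which also falls as die area grows.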