this post was submitted on 04 Aug 2023
53 points (96.5% liked)

PC Gaming

[–] [email protected] 2 points 2 years ago

They're not that different, really. Both Nvidia's consumer desktop cards and its enterprise machine-learning cards are built around the same CUDA cores, which are the workhorses of AI training. As "AI" demand rises, more of the supply of GPU dies and VRAM chips will be diverted to enterprise products, which fetch higher prices through corporate deals. That leaves fewer components for the consumer GPU supply, which drives prices up for regular buyers. Nvidia has been banking on this for a long time; that's why it doesn't care about overpricing the consumer market and has been pushing people toward cloud-based GeForce Now subscriptions, where you don't even own the hardware and basically just rent the processing power to play games.

Also, just to be anal: the 3090 and 4090 have 24 GB of VRAM, not 32 GB. And unlike gaming nowadays, ML workloads can be distributed across multiple GPUs in one system, or over a network of machines.
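To sketch what "distributing the workload" means in the data-parallel case: each batch is split into shards, one per device, and the partial results are combined afterward. This is a toy illustration in plain Python, not real GPU code; the device loop and the `process_on_device` stand-in are hypothetical, not any framework's API.

```python
# Hypothetical sketch of data-parallel work splitting across N devices.
# `process_on_device` is a placeholder for a per-GPU forward/backward pass.

def shard_batch(batch, num_devices):
    """Split a batch into roughly equal shards, one per device."""
    shard_size = -(-len(batch) // num_devices)  # ceiling division
    return [batch[i:i + shard_size] for i in range(0, len(batch), shard_size)]

def process_on_device(device_id, shard):
    """Stand-in for the real per-device computation."""
    return sum(shard)  # placeholder "work"

batch = list(range(8))          # one training batch
shards = shard_batch(batch, 4)  # e.g. four GPUs in one machine
partials = [process_on_device(i, s) for i, s in enumerate(shards)]
total = sum(partials)           # gradients would be all-reduced similarly
print(shards)   # [[0, 1], [2, 3], [4, 5], [6, 7]]
print(total)    # 28
```

In a real framework the combine step is an all-reduce over gradients rather than a plain sum, and the shards live in each GPU's own VRAM, which is why per-card memory (that 24 GB) still caps how big each shard's model replica can be.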