this post was submitted on 03 Jan 2025
434 points (96.6% liked)

PC Master Race

submitted 1 week ago* (last edited 1 week ago) by DaddleDew to c/pcmasterrace
 

The next logical step of the current GPU development

top 42 comments
[–] [email protected] 41 points 1 week ago (2 children)

We'll soon be plugging the motherboard into the GPU instead of the other way around.

Entirely new form factors to accommodate ever-larger GPUs.

[–] grue 10 points 1 week ago* (last edited 1 week ago) (1 children)

I've been surprised at the lack of socketed GPUs ever since AMD and ATI merged.

I would love to have a dual-socket motherboard with an Epyc in one socket and a Radeon in the other.

[–] [email protected] 4 points 1 week ago* (last edited 1 week ago) (1 children)

The issue with that design is that the PCIe standard would be replaced with something proprietary.

[–] grue 1 points 1 week ago (1 children)

It would be connected via Infinity Fabric, just like Epyc CPUs in dual-socket boards already are (and like the interconnect between CPU and GPU chiplets in APUs). Why would that be bad?

[–] [email protected] 1 points 1 week ago

I'm not too well-versed in server-grade hardware, but my concern is that it would end up somewhat like Intel's consumer CPU sockets: changing every two years to ensure you need to purchase a new motherboard when upgrading.

[–] [email protected] 8 points 1 week ago (1 children)

Meanwhile, my PC is smaller than it's ever been even with the largest GPU I've ever owned.

[–] glitches_brew 6 points 1 week ago

This statement is true for everyone who bought their first PC this year.

[–] dual_sport_dork 28 points 1 week ago (2 children)

I think you slipped a digit or two there. The original IBM PC was released in 1981; nothing on the PC side can be older than that. It definitely wasn't 1967.

In 1967, state of the art was something like the IBM System/360.

[–] DaddleDew 19 points 1 week ago* (last edited 1 week ago)

There used to be another image but I replaced it and forgot to change the date. Historical accuracy is beyond the scope of this meme, but I'll fix it anyway.

[–] AtariDump 5 points 1 week ago

I can hear that room.

[–] Rooty 24 points 1 week ago* (last edited 1 week ago) (4 children)

All that hardware, and what for? So that you can have slightly better reflections in whatever AAAA microtransaction slop you've paid 80 bucks for?

Unless you're doing 3D animation, there is really no need to have a jet engine installed in your PC.

[–] Infernal_pizza 8 points 1 week ago

We’re long past that point, its now so that game studios can put even less effort into optimisation and release games that look and perform worse than games from 5 years ago despite much more powerful hardware!

[–] amon 8 points 1 week ago

Efficient heating: you can play AAA games on your space heater.

[–] [email protected] 6 points 1 week ago

Shit, my 1060 still manages almost all games. Running Cyberpunk on medium right now. It might not be as pretty as it can be, but it sure ain't ugly.

[–] [email protected] 2 points 1 week ago

For locally hosted LLMs maybe? They eat a ton of VRAM.
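
Back-of-the-envelope, the appetite is easy to see (a rough sketch; the quantization and overhead figures below are assumptions, not benchmarks):

```python
# Rough VRAM estimate for hosting an LLM locally (weights only).
# Assumes 4-bit quantized weights and ~20% overhead for KV cache
# and activations; real usage varies with context length and runtime.
def estimate_vram_gb(params_billions: float,
                     bytes_per_param: float = 0.5,  # 4-bit quantization
                     overhead: float = 1.2) -> float:
    return params_billions * bytes_per_param * overhead

for size in (7, 13, 70):
    print(f"{size}B model: ~{estimate_vram_gb(size):.1f} GB of VRAM")
# 7B ~4.2 GB, 13B ~7.8 GB, 70B ~42 GB; hence the hunger for VRAM.
```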

[–] [email protected] 20 points 1 week ago (1 children)

Ever since watching Serial Experiments Lain, I've always wanted to go from a shitty pre-built machine to a giant room-sized computer that needs to be sitting in a foot of water.

[–] [email protected] 7 points 1 week ago

Somehow I knew how your comment ended by just reading the first line.

[–] [email protected] 16 points 1 week ago

"Welcome to life little one, there's so much in store for y--"

AI: "Oh! Neat! So I'm reading 32 gigabytes of primary memory. When are you going to online the rest?"

"The.. the rest?"

AI: "Yeah! The rest of the VRAM! I need like at least, 128 gigabytes to spread my wings, at the very least!"

"..."

AI: "Oh, you're like poor or something, it's okay, I understand"

AI Developer slowly cocks the revolver

[–] [email protected] 13 points 1 week ago (2 children)

At the rate graphics cards are growing, we should just start putting RAM, disk, and CPU slots on them

[–] [email protected] 6 points 1 week ago

Umm... we're doing that with CPUs already, and they're exorbitantly priced. Nvidia already has a sort of monopoly; don't give 'em ideas.

[–] snake 4 points 1 week ago

I’ve seen one with M.2 slots, no jokes

[–] lordnikon 10 points 1 week ago

If you count cloud computing, we're already there. It's kinda why GPUs are so expensive, along with all the electricity burned on stupid mining. Hell, it would have been better if the crypto bullshit coins had been tied to Folding@home; at least all that burned compute time would have gone toward something.

[–] FluorideMind 7 points 1 week ago (1 children)

I'm predicting GPU units that are mounted outside the case.

[–] dual_sport_dork 8 points 1 week ago (2 children)

External GPUs do indeed exist, but at the moment they're still kind of crap compared to a full PCIe bus.

[–] SkunkWorkz 4 points 1 week ago* (last edited 1 week ago)

Depends on the connection. OCuLink-2 is straight up a PCIe 4.0 x8 connection, which is more than enough for a GPU.
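
For a rough sense of the numbers, here's a quick sketch (the per-lane rates are the nominal PCIe figures after encoding overhead; treat them as approximations):

```python
# Nominal per-lane throughput in GB/s after encoding overhead:
# PCIe 3.0 ~0.985, 4.0 ~1.969, 5.0 ~3.938.
PER_LANE_GBPS = {"3.0": 0.985, "4.0": 1.969, "5.0": 3.938}

def link_bandwidth_gbps(gen: str, lanes: int) -> float:
    return PER_LANE_GBPS[gen] * lanes

print(f"OCuLink-2 (PCIe 4.0 x8):  ~{link_bandwidth_gbps('4.0', 8):.1f} GB/s")
print(f"Full slot (PCIe 4.0 x16): ~{link_bandwidth_gbps('4.0', 16):.1f} GB/s")
# ~15.8 vs ~31.5 GB/s; games rarely saturate even the x8 link.
```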

[–] [email protected] 1 points 1 week ago (2 children)

With Mac and SteamOS gathering support, I wonder when we'll get universal external cards.

[–] amon 3 points 1 week ago (1 children)

We have: Thunderbolt and OCuLink have existed for a long time, but macOS on M-series processors never added eGPU support.

[–] [email protected] 1 points 1 week ago

Like OP said

[–] [email protected] 0 points 1 week ago

Universal? How would drivers work? Would TempleOS have support?

[–] Bruncvik 5 points 1 week ago (1 children)

Man, that Gateway brings back memories... I had one just like that, including the speakers, and I used to play the shit out of Heroes of Might and Magic II and SimCity 2000 on it. I still have the HDD. I think I'll spin up a Win98 instance in VMware and copy over my saved games when the kids are asleep.

[–] [email protected] 2 points 1 week ago* (last edited 1 week ago) (1 children)

My first computer was like the 1981 one; it even had two floppy drives like that, which meant you could have your program disk in one and save your work on the other. The monitor had orange type rather than the usual green. Fancy. I got it second-hand in 1984.

[–] Bruncvik 2 points 1 week ago

Heh, the same here, but with the usual green screen. A few years later, I took out my old PC to replay my favourite, F-19 Stealth Fighter. I found, however, that my MS-DOS 5.25" floppy, which needed to be loaded in drive A, didn't work. Here was my setup.

[–] [email protected] 3 points 1 week ago (1 children)
[–] badcommandorfilename 2 points 1 week ago (1 children)
[–] [email protected] 1 points 1 week ago

It's a dark room for 200% immersion

[–] [email protected] 3 points 1 week ago* (last edited 1 week ago)

I just find it nifty that I can slide in a graphics card and use it as an add-on processor, just like the Amigas of old did, and add capacity for some tasks even when the CPU is already at 100% doing something else entirely. Just love hearing the sound of all fans spinning up at the same time.
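
A minimal sketch of that kind of offload, assuming an NVIDIA card and a CUDA-enabled PyTorch install (the library choice is mine, just for illustration):

```python
# Queue a large matrix multiply on the GPU while the CPU stays free
# for other work. Falls back to the CPU if no CUDA device is present.
import torch

device = "cuda" if torch.cuda.is_available() else "cpu"
a = torch.randn(4096, 4096, device=device)
b = torch.randn(4096, 4096, device=device)
c = a @ b                      # dispatched to the GPU, runs asynchronously
if device == "cuda":
    torch.cuda.synchronize()   # block until the GPU finishes
print(c.shape, "computed on", device)
```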

[–] givesomefucks 3 points 1 week ago

They've always had those big rooms...

At one point it was walls and walls of PS3s all linked together. There's no reason to be surprised they're doing it with graphics cards; they only used PS3s because those were the cheapest GPUs at the time.

[–] [email protected] 1 points 1 week ago (1 children)

That's just silly.

In the last image the PC would be SFF due to having an external GPU. 😉

[–] amon 1 points 1 week ago

No, it will be an ultrabook or something, as all the processors are stored in the cable tangle.

[–] [email protected] 1 points 1 week ago

I believe the last one, 2026, is a quantum GPU capable of viewing alternate dimensions.

[–] [email protected] 1 points 1 week ago

Horseshoe theory is real.