this post was submitted on 31 Jan 2025
66 points (91.2% liked)
Technology
It's for running AI on the GPU. You need a really expensive PC GPU to get more than about 16 GB of VRAM, so the bottleneck for large AI models is swapping data in and out of system RAM over PCIe.
The Mac has an SoC with unified memory, where the GPU can access all 192 GB at full speed, which is perfect for AI workloads that need the GPU to touch all of that RAM. There's a tradeoff, though: PC GPUs have faster processors (since they have a much bigger power budget), but the Mac GPU has faster access to large amounts of memory, so it's not always a slam dunk which is better.
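A back-of-the-envelope sketch of why the PCIe swap hurts. The bandwidth figures are approximate ballpark numbers (roughly PCIe 4.0 x16 vs. an M2 Ultra's unified memory), and the 100 GB model size is just a hypothetical example:

```python
# Rough comparison: streaming model weights that don't fit in VRAM.
# Assumed ballpark bandwidths, not exact specs:
pcie_gbps = 32       # ~PCIe 4.0 x16 host-to-GPU
unified_gbps = 800   # ~M2 Ultra unified memory

model_gb = 100       # hypothetical model too big for a 24 GB card

# If the GPU has to pull the weights across the bus every pass,
# the transfer time alone dominates:
print(f"over PCIe:          {model_gb / pcie_gbps:.2f} s per pass")
print(f"from unified memory: {model_gb / unified_gbps:.2f} s per pass")
```

So even though the PC GPU's cores are faster, they can end up sitting idle waiting on the bus.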
APUs/integrated GPUs on PCs also have unified memory, but they've always targeted the low end, so they aren't as useful here.
To run, say, the full-fat version of DeepSeek R1, you need to stick something like eight Mac minis together.
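You can sanity-check that cluster size with some napkin math. This assumes DeepSeek R1's headline 671B parameter count and the 192 GB unified-memory figure from above; real deployments also need headroom for the KV cache and activations, which this ignores:

```python
# Rough memory footprint of a 671B-parameter model at common
# weight precisions, versus machines with 192 GB each.
params_billions = 671

for name, bytes_per_param in [("FP16", 2), ("8-bit", 1), ("4-bit", 0.5)]:
    gb = params_billions * bytes_per_param   # 1e9 params * bytes ~= GB
    machines = gb / 192
    print(f"{name}: ~{gb:.0f} GB of weights -> ~{machines:.1f} machines")
```

At FP16 that works out to roughly seven machines for the weights alone, which is in the same ballpark as the "eight Mac minis" figure once you add overhead.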
A single device doesn't seem likely to be capable of that long term, given that no single device is capable of it right now.