this post was submitted on 11 Oct 2023
506 points (92.6% liked)
Technology
This is why I, as a user, am far more interested in open-source projects that can be run locally on pro/consumer hardware. All of these cloud services are headed down the crapper.
My prediction is that in the next couple years we'll see a move away from monolithic LLMs like ChatGPT and toward programs that integrate smaller, more specialized models. Apple and even Google are pushing for more locally-run AI, and designing their own silicon to run it. It's faster, cheaper, and private. We will not be able to run something as big as ChatGPT on consumer hardware for decades (it takes hundreds of gigabytes of memory at minimum), but we can get a lot of the functionality with smaller, faster, cheaper models.
Hundreds of gigabytes of memory in consumer PCs is not decades away. There are already motherboards that accept 128 GB.
You're right, I shouldn't say decades. It will be decades before that's standard or common in the consumer space, but it could be possible to run such models on desktops within the next generation (~5 years). It'd just be very expensive.
High-end consumer PCs can currently support 192 GB, and that might increase to 256 within this generation once 64 GB DDR5 modules arrive. But we'd need 384 GB to run BLOOM, for instance. That requires either a platform that supports more than 4 DIMMs (e.g. Intel Xeon or AMD Threadripper) or 96 GB DIMMs, which aren't yet available in the consumer space. Not sure when we'll get consumer mobos that support that much.
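For anyone curious where numbers like that come from, here's a back-of-the-envelope sketch. BLOOM's published size is ~176B parameters, and the memory needed just to hold the weights scales with bytes per parameter at a given precision (the 384 GB figure above presumably adds runtime overhead on top of raw weights; the function name here is mine, not from any library):

```python
# Rough memory estimate for holding an LLM's weights in memory.
# Ignores activations, KV cache, and framework overhead.

def model_memory_gb(n_params: float, bytes_per_param: float) -> float:
    """Approximate weight-storage memory in gigabytes."""
    return n_params * bytes_per_param / 1e9

BLOOM_PARAMS = 176e9  # BLOOM's published parameter count

for precision, nbytes in [("fp32", 4), ("fp16", 2), ("int8", 1), ("int4", 0.5)]:
    print(f"{precision}: {model_memory_gb(BLOOM_PARAMS, nbytes):.0f} GB")
# fp16 alone comes out to 352 GB, which is why 384 GB of system
# memory is roughly the floor for running it unquantized.
```

The same arithmetic shows why smaller models are so much friendlier to consumer hardware: a 7B model at 4-bit quantization is only ~3.5 GB.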
Technically I could upgrade my desktop to 192GB of memory (4x48). That's still only about half the amount required for the largest BLOOM model, for instance.
To go beyond that today, you'd need to move beyond the Intel Core or AMD Ryzen platforms and get something like a Xeon. At that point you're spending 5 figures on hardware.
I know you're just joking, but figured I'd add context for anyone wondering.
Don't worry about the RAM. Worry about the VRAM.