this post was submitted on 10 Feb 2025
545 points (95.8% liked)
Technology
you are viewing a single comment's thread
view the rest of the comments
Well, to be 100% fair, it's all total bullshit.
I use a distilled 16b version of deepseek-r1 on my PC at home, and it's definitely not total bullshit. It's 10 GB of local model that can solve mathematics and physics problems for you, or program in Python or Bash. It doesn't hallucinate (or at least I haven't been able to elicit a hallucination), and it seems aware of the limits of its knowledge. It runs incredibly fast on an old Ryzen 1600 with a 6600 XT. Having an open-source reasoning AI that takes 10 GB of SSD and about 13 GB of RAM is so weird that the only thing weirder is seeing smart people dismiss it as bullshit out of hand.
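For anyone curious what "running it at home" actually looks like, here's a minimal sketch of querying a locally served distill through Ollama's HTTP API. The model tag `deepseek-r1:14b` and the default port are assumptions on my part; the commenter doesn't say which build or runtime they use.

```python
# Minimal sketch: asking a locally served DeepSeek-R1 distill a question via
# Ollama's HTTP API. Model tag and port are assumptions, not the commenter's setup.
import json
import urllib.request

def ask_local_model(prompt: str, model: str = "deepseek-r1:14b") -> str:
    payload = json.dumps({
        "model": model,
        "prompt": prompt,
        "stream": False,  # return one complete JSON object instead of a token stream
    }).encode("utf-8")
    req = urllib.request.Request(
        "http://localhost:11434/api/generate",  # Ollama's default local endpoint
        data=payload,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        body = json.loads(resp.read().decode("utf-8"))
    # R1-style models typically include their reasoning trace in the reply text
    return body["response"]

if __name__ == "__main__":
    print(ask_local_model("Solve for x: 3x + 7 = 22. Show your steps."))
```

Everything runs on the local machine; the only cost is the disk and RAM footprint the commenter describes.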
It's still an LLM, right? I'm going to have to take issue with your use of the word "reasoning" here.