this post was submitted on 27 Jan 2025
47 points (89.8% liked)
World News
you are viewing a single comment's thread
OK, hold on, so I went over to huggingface and took a look at this.
DeepSeek is huge. Like Llama 3.3 huge. I haven't done any benchmarking myself (I'm guessing it's out there), but surely it would take as much Nvidia muscle to run this at scale as ChatGPT, even if it was much, much cheaper to train, right?
So is the rout based on the idea that the need for training hardware is much smaller than suspected, even if the operating cost is the same... or is the stock market just clueless and dumb and running on vibes at all times anyway?
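For a sense of the scale in question, here's a rough back-of-envelope of the memory needed just to hold the weights at different precisions. The parameter counts come from the published model cards; everything else is an assumption, not a benchmark:
```python
# Rough VRAM needed just to store model weights (ignores KV cache, activations, overhead).
# Parameter counts are assumptions taken from the public model cards.

MODELS = {
    "DeepSeek-V3/R1 (671B total, MoE)": 671e9,  # ~37B parameters active per token
    "Llama 3.3 70B": 70e9,
}

BYTES_PER_PARAM = {
    "fp16/bf16": 2.0,
    "int8": 1.0,
    "int4": 0.5,
}

for name, params in MODELS.items():
    for precision, bytes_per in BYTES_PER_PARAM.items():
        gib = params * bytes_per / 2**30
        print(f"{name:35s} {precision:9s} ~{gib:6.0f} GiB")
```
Even at 4-bit that's hundreds of GiB for the full model, though the MoE design means far fewer parameters are active per token than the total suggests.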
I thought everyone knew stocks were all vibes by now. Private markets might improve with competition, but a public stock will always pick the flashiest option just for appeal, even if it's shit, or they quite literally lose everything if it goes slightly wrong.
Everything I've seen from looking into it seems to imply it's on par with other (LLM-only) models in training cost and performance.
I feel like I'm missing something here or that the market is "correcting" for other reasons.
The market is correcting because the AI bubble is gonna pop and someone needs to be left holding the bag.
DeepSeek is based on either Llama or Qwen, but can be put on top of any model?
I tested plain Qwen, which sucked IMHO.
Now the DeepSeek Qwen distill is the best thing I've tried locally.
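If anyone else wants to try it, here's a minimal sketch of loading one of the distilled Qwen checkpoints with Hugging Face transformers. The model id is my assumption from the hub listing, so double-check the exact name and pick a size that fits your GPU:
```python
# Minimal sketch: run a DeepSeek-R1 distilled Qwen checkpoint locally with transformers.
# The model id below is an assumption; verify it on the Hugging Face hub first.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "deepseek-ai/DeepSeek-R1-Distill-Qwen-7B"  # assumed checkpoint name; smaller/larger sizes exist

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype="auto",   # load in the checkpoint's native precision
    device_map="auto",    # spread across available GPU/CPU memory (requires accelerate)
)

messages = [{"role": "user", "content": "Explain mixture-of-experts in two sentences."}]
inputs = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

outputs = model.generate(inputs, max_new_tokens=256)
print(tokenizer.decode(outputs[0][inputs.shape[-1]:], skip_special_tokens=True))
```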