this post was submitted on 10 Apr 2024
28 points (93.8% liked)

LocalLLaMA


Community to discuss LLaMA, the large language model created by Meta AI.

This is intended to be a replacement for r/LocalLLaMA on Reddit.

submitted 8 months ago* (last edited 8 months ago) by [email protected] to c/[email protected]
 

From Simon Willison: "Mistral tweet a link to a 281GB magnet BitTorrent of Mixtral 8x22B—their latest openly licensed model release, significantly larger than their previous best open model Mixtral 8x7B. I’ve not seen anyone get this running yet but it’s likely to perform extremely well, given how good the original Mixtral was."
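As a back-of-the-envelope check on the 281GB figure, a sketch assuming Mixtral 8x22B's commonly cited total of roughly 141B parameters (the 8 experts share attention layers, so the total is well under 8 × 22B) stored at 16-bit precision:

```python
# Rough sanity check: does a ~141B-parameter model at 16-bit precision
# roughly match the reported 281GB torrent size?
def checkpoint_size_gb(n_params: float, bytes_per_param: int = 2) -> float:
    """Approximate checkpoint size in decimal gigabytes."""
    return n_params * bytes_per_param / 1e9

# ~141B total parameters is an assumed figure for Mixtral 8x22B,
# not taken from the post itself.
print(checkpoint_size_gb(141e9))  # ~282 GB, close to the 281GB download
```

Two bytes per parameter corresponds to fp16/bf16 weights; a quantized release would be proportionally smaller.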

[–] [email protected] 11 points 8 months ago* (last edited 8 months ago) (1 children)

They've done it that way for their previous models, too. I suppose it's to add a bit of "mystery" around the release and give people a riddle to solve.

[–] pennomi 1 points 8 months ago (1 children)

Likely they’re trying to get in before Llama 3 drops, because I suspect that’s all people will talk about for a fair bit.

[–] [email protected] 3 points 8 months ago

Probably not. They've done exactly this (dropped a magnet link out of the blue) for the third time or so in a row, so it's more likely an established routine than a reaction to current events.