brucethemoose

joined 8 months ago
[–] brucethemoose 51 points 1 week ago* (last edited 1 week ago) (15 children)

I don't think anyone wants a hot war in NK, and I'm not sure what good it would do.

Europe needs to get off its butt (and should have long ago) and send every piece of hardware it has to Ukraine though, even the cutting-edge stuff. Maybe even enforce a no-fly zone. As I keep asking, what are they waiting for... Spain to invade France? No, they built all this hardware to deter Soviet aggression, and it's just sitting there, rotting instead of doing its job. If Ukraine had stayed secure, they would basically never have to worry about this again.

Now they have no excuse. Russia clearly has no shame. And it's almost (but not quite) too late.

[–] brucethemoose 2 points 1 week ago* (last edited 1 week ago) (4 children)

Bitnet is still theoretical at this point, and unsupported by NPUs anyway.

Basically they are useless for large models :P

The IGPs on the newest AMD/Intel chips are OK for hosting models up to like 14B though. Maybe 32B with the right BIOS settings, if you don't mind very slow output.

If I were you on a 3080, and if you keep desktop VRAM usage VERY minimal, I would run TabbyAPI with a 4bpw exl2 quantization of Qwen 2.5 14B: coder, instruct, or an RP finetune... pick your flavor. I'd recommend this one in particular.

https://huggingface.co/bartowski/SuperNova-Medius-exl2/tree/4_25

Run it with Q6 cache and set the context to like 16K, or whatever you can fit in your VRAM.

I guarantee this will blow away whatever Llama (8B) setup you have.
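Once TabbyAPI has the model loaded, it exposes an OpenAI-compatible API, so you can query it from pretty much anything. A minimal Python sketch, assuming TabbyAPI's default port of 5000 and whatever API key is in your config.yml (both are assumptions, check your own config):

```python
# Minimal sketch: querying a local TabbyAPI instance through its
# OpenAI-compatible chat completions endpoint.
# Port 5000 and the API key are assumptions -- check your TabbyAPI config.yml.
import requests

resp = requests.post(
    "http://localhost:5000/v1/chat/completions",
    headers={"Authorization": "Bearer YOUR_TABBY_API_KEY"},
    json={
        # With a single loaded model, the name here is mostly informational.
        "model": "SuperNova-Medius-exl2",
        "messages": [
            {"role": "user", "content": "Write a Python function that reverses a string."},
        ],
        "max_tokens": 512,
    },
    timeout=120,
)
resp.raise_for_status()
print(resp.json()["choices"][0]["message"]["content"])
```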

[–] brucethemoose 1 points 1 week ago* (last edited 1 week ago)

What's ironic is that the local LLM/diffusion communities won't touch these. They're just too slow, and impossibly finicky to set up with models big enough for people to actually want.

AMD's next gen could change that, but they've already poisoned the branding. Good job.

[–] brucethemoose 1 points 1 week ago (6 children)

NPUs are basically useless for LLMs because no software supports them. They also can't allocate much memory, and they don't handle the exotic quantization schemes modern runtimes use very well.

And speed-wise, they're limited by the slow memory buses they're attached to.

Even on Apple, where there is a little support for running LLMs on NPUs, everyone just does the compute on the GPU anyway because it's so much faster and more flexible.

This MIGHT change if bitnet LLMs take off, or if Intel/AMD start regularly shipping quad-channel designs.

[–] brucethemoose 1 points 1 week ago* (last edited 1 week ago)

Apple is also much faster because the integrated graphics are actually usable for LLMs.

The base M is just a bit faster than an Intel/AMD laptop if you can get their graphics working. The M Pro is 2x as fast (as its memory bus is 2x as wide). The M Max is 4x as fast.
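To put rough numbers on that: single-stream token generation is mostly memory-bandwidth bound, since every weight has to be read once per token. A back-of-the-envelope sketch, assuming approximate M2-family bandwidth figures (real throughput lands below this ceiling):

```python
# Rough sketch: memory-bandwidth ceiling for single-stream token generation.
# tokens/s is roughly capped by bandwidth / bytes-read-per-token, and
# bytes-per-token is about the size of the quantized model, since every
# weight is read once per token. Bandwidth figures are approximate
# M2-family numbers (assumption).
model_size_gb = 8.0  # e.g. a ~14B model quantized to ~4.5 bits per weight

bandwidth_gb_s = {
    "M2": 100,
    "M2 Pro": 200,
    "M2 Max": 400,
}

for chip, bw in bandwidth_gb_s.items():
    print(f"{chip}: ~{bw / model_size_gb:.0f} tokens/s ceiling")
```

Which is basically why each tier scales with how wide its memory bus is.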

AMD is coming out with something more competitive in 2025 though, Strix Halo.

[–] brucethemoose 9 points 1 week ago (13 children)

Unless you have an Nvidia card.

I've been on Linux for years, I work with the Nvidia libraries all the time, I alternate booting Wayland and X... I even use my AMD IGP for output these days, instead of the Nvidia card.

And I STILL hold my breath wondering if I'm going to get a black screen and have to drop into a TTY or boot from a USB stick to investigate and fix it.

[–] brucethemoose 10 points 1 week ago (1 children)

Honestly, hardly anyone I know IRL knows specifics about Israel and Palestine.

[–] brucethemoose 10 points 1 week ago* (last edited 1 week ago)

Imagine if it found its way into Musk's feed!

I would not be against this. I absolutely know how.

We can rebuild him. We have the technology...

[–] brucethemoose 2 points 1 week ago

Beerhall Coup

I was not aware of this.

Wonderful...

[–] brucethemoose -1 points 1 week ago

Older Americans remember an old Israel, struggling to just get by.

Younger Americans... largely don't know anything about it. They know it's some country in the Middle East, but have no idea what's going on in Gaza. TBH I didn't either growing up. It's not taught in school, and it's not common knowledge unless your family/origins are from the region.

[–] brucethemoose 3 points 1 week ago

They need to take their phones away in class.

I hate to sound so salty and old, but jeez.

[–] brucethemoose 25 points 1 week ago (2 children)

I think they will become a Chinese vassal.

I don't know exactly what that'd look like, but the pure economic/political dynamics make it seem kinda inevitable.
