this post was submitted on 18 Nov 2024
13 points (100.0% liked)

Hardware

top 4 comments
Alphane_Moon · 7 points · 17 hours ago

Lenovo claimed it won 24 percent of the PC market and saw AI PCs account for over ten percent of notebook sales. Group president Luca Rossi said PC sales should improve as buyers have two good reasons to upgrade: end of life for devices bought to run Windows 10 which are now ripe for replacement, and a desire to adopt AI PCs and offer users new experiences.

Windows 10 EOL is a fair argument. I have yet to see any research showing consumers looking to buy "AI PCs" specifically. I suspect most people just get a new laptop and it happens to be an "AI PC".

Brkdncr · 3 points · 17 hours ago

Enterprise is different. Lots of business decision makers are prepping their workforce for AI, and don't want to put their data on someone's cloud. Local AI will be a big deal.

Alphane_Moon · 3 points · 17 hours ago

Are the current crop of NPUs really suitable for this?

I play around with video upscaling and local LLMs. I have a 3080, which is supposed to deliver 238 TOPS. It takes about 25 min to upscale a ~5 min SD video to HD (sometimes longer, depending on the source content). The "AI PC" NPUs are rated at around ~50 TOPS, so that would be a massive increase in upscale time (closer to 2 hours for a ~5 min SD source).
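The back-of-envelope math above can be sketched out; this assumes upscale time scales linearly with raw TOPS, which is a rough simplification (real throughput also depends on memory bandwidth, precision, and software support):

```python
# Rough estimate of NPU upscale time, assuming time scales
# inversely with rated TOPS (a simplification, not a benchmark).

gpu_tops = 238      # RTX 3080, as cited above
npu_tops = 50       # typical "AI PC" NPU rating
gpu_minutes = 25    # observed ~5 min SD -> HD upscale on the 3080

npu_minutes = gpu_minutes * (gpu_tops / npu_tops)
print(f"Estimated NPU upscale time: {npu_minutes:.0f} min")  # ~119 min, i.e. close to 2 hours
```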

I also have a local LLM that I've been comparing against ChatGPT. For my limited use case (elaborate spelling/typo/style checking), the local LLM (Llama) performs comparably to ChatGPT, but I run it on a 3080. Is this true for local LLMs that run on NPUs? I would speculate that for more complex use cases (programming support?), you would need even more throughput from your NPU.

That said, I have much more experience with upscaling, and my experiments with local LLMs are somewhat limited compared to my ChatGPT usage.

Brkdncr · 5 points · 16 hours ago

No one said it was a smart decision. It’s just one that’s being made.