Welcome to the world of venture capitalism. It's all "come on, guy! This is the next thing! Trust me bro!"
But by that same token, there is the possibility of collusion and cooperation within the industry to front these technologies in boardrooms and to shareholders. The problem is the how and why. We can compare the current AI boom with the crypto boom.
The crypto boom just made NVIDIA more exploitative and fronted scams, grifts and rugpulls in the form of smart contracts and NFTs.
Everyone pretty much abandoned it, the gaming industry included, because being associated with crypto was tantamount to being declared a plague bearer.
Then we see NPUs being integrated into SoCs by Intel, AMD, Apple, etc., platforms like Hugging Face, and frameworks like PyTorch.
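To make that concrete, here's a minimal sketch of what "local" inference with those tools looks like, assuming the Hugging Face transformers library on top of PyTorch; distilgpt2 is just a small stand-in checkpoint:

```python
# Minimal local text-generation sketch with Hugging Face transformers
# (PyTorch backend). distilgpt2 is a small stand-in model; the same
# pipeline call works for other checkpoints.
from transformers import pipeline

generator = pipeline("text-generation", model="distilgpt2")

result = generator("NPUs in consumer SoCs will", max_new_tokens=20)
print(result[0]["generated_text"])
```

The model downloads once and then runs entirely on your own hardware, which is the point.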
Sure, there's a crapton of illegal data harvesting and new swathes of content farms, as well as the premonition of mass layoffs to come. But all of these things are, strictly speaking, speculation.
I personally think that some of the moves being made to distribute AI processing are good, because it is far better to have access to AI processing from within the SoC of your device than to be locked to the GPU market.
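A rough sketch of what that device-agnostic access could look like, using ONNX Runtime's execution providers (the model path and provider list here are hypothetical, and provider names vary by platform):

```python
# Sketch: device-agnostic inference with ONNX Runtime. The appeal of
# on-SoC NPUs is graceful fallback: NPU -> GPU -> CPU, same model file.
# "model.onnx" is a hypothetical path.
import onnxruntime as ort

preferred = [
    "QNNExecutionProvider",   # e.g. Qualcomm NPUs
    "CUDAExecutionProvider",  # discrete GPUs
    "CPUExecutionProvider",   # always available
]
available = ort.get_available_providers()
providers = [p for p in preferred if p in available]

session = ort.InferenceSession("model.onnx", providers=providers)
print("Running on:", session.get_providers()[0])
```

If the NPU provider isn't there, the same code silently runs on whatever is, which is exactly the kind of decoupling from the GPU market I mean.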
But the question still remains: will localised SLMs, LLMs and Stable Diffusion really take off, or will these NPUs be gangrenous limbs come the next decade? Will we all have to bend over to our AGI overlords? Only time will tell.
Place your bets.
What's surprising is that more and more people keep falling for it.
Twenty or thirty years ago it made sense. But we have "kids" all the way up to their 20s who literally grew up in this overhyped environment and still believe all this bullshit is days away from changing the world.
It's like a cult insisting the Messiah is coming back tomorrow, and then every night declaring "tomorrow for sure."
seems like chip designers are being a lot more conservative from a design perspective. NPUs are generally a shitton of 8-bit multiply-accumulate units optimized for matrix multiplication. the "AI" that's important isn't the stuff in the news or the startups; it's the things we're already taking for granted: speech to text, text to speech, semantic analysis, image processing, semantic search, etc. sure, there's a drive to put larger language models or image generation models on embedded devices, but a lot of these applications are battle tested and would be missed or hampered if that hardware wasn't there.

"AI" is a buzzword and a goalpost that moves at 90 mph. machine learning, and the hardware and software ecosystem that's developed more or less quietly in the background over the past 15 or so years (at least compared to ChatGPT), is revolutionary tech that will be with us for a while.
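to illustrate, here's a toy version of the core operation those NPUs accelerate: int8 matrix multiply with wider accumulation. real hardware does this in parallel MAC arrays; numpy just shows the arithmetic, and the quantization scales are made up:

```python
# Toy sketch of the core NPU workload: int8 matmul with int32 accumulation.
import numpy as np

rng = np.random.default_rng(0)
a = rng.integers(-128, 127, size=(4, 8), dtype=np.int8)
b = rng.integers(-128, 127, size=(8, 4), dtype=np.int8)

# accumulate in int32 so products of 8-bit values can't overflow
acc = a.astype(np.int32) @ b.astype(np.int32)

# hypothetical per-tensor scales map the int32 result back to real values,
# as in typical post-training quantization schemes
scale_a, scale_b = 0.02, 0.05
print(acc * (scale_a * scale_b))
```

it's the same multiply-accumulate pattern whether it's serving speech to text or a language model, which is why the hardware generalizes.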
blockchain currencies never made sense to me from a UX or ROI perspective. they were designed to get more power hungry as adoption took off, and the promised power and compute optimizations were always conjecture. the way wallets are handled, and how privacy was barely a concern, was never going to fly with the masses. pile on that finance is just a trash profession that requires goggles that turn every person and thing into an evaluated commodity, and you have a recipe for a grift economy.
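the power-hunger-by-design part is easy to see in a toy proof-of-work loop: each extra bit of difficulty roughly doubles the hashes (and therefore the energy) needed per block, and difficulty rises as more miners join. this is a sketch, not real mining:

```python
# Toy proof-of-work: find a nonce whose SHA-256 hash has n leading zero
# bits. Expected work doubles with each extra difficulty bit.
import hashlib
import itertools

def mine(block_data: bytes, difficulty_bits: int) -> int:
    """Return the number of hashes tried before meeting the target."""
    target = 1 << (256 - difficulty_bits)
    for nonce in itertools.count():
        digest = hashlib.sha256(block_data + nonce.to_bytes(8, "big")).digest()
        if int.from_bytes(digest, "big") < target:
            return nonce + 1

for bits in (8, 12, 16, 20):
    print(f"{bits} bits of difficulty: ~{mine(b'example block', bits)} hashes")
```

no optimization fixes that; the waste is the security model.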
a lot of startups will fail, but "AI" isn't going anywhere. it's been around as long as computers have. i think we're going to see a similarly cautious approach (as with the chip designers) from companies like Google and Apple, as more semantic search, image editing, and conversation-bot advancements make their way to the edge.
Generative AI will, at the least, always be used to produce something that's in very high demand: mediocre, derivative, repetitive crap.
Because mediocre crap is often good enough, especially if it's nearly free. So I expect most spam to be AI-generated in the very near future.
I would prefer built-in, not-very-powerful FPGAs getting into consumer hardware.