this post was submitted on 06 Jul 2024
1024 points (97.3% liked)

Technology

[–] [email protected] 179 points 5 months ago (7 children)

Don't worry, folks: if we all stop using plastic straws and take 30-second showers, we'll be able to offset 5% of the carbon emissions this AI produces!

[–] [email protected] 33 points 5 months ago* (last edited 5 months ago) (10 children)

Google's GHG emissions in 2023 were 14.3 million metric tons, which is a ridiculously small percentage of global emissions.

Commercial aviation emits roughly 935 million metric tons per year.

So IDK about plastic straws or Google, but if people stopped flying around so much, that really would make a dent in global emissions.

Don't get me wrong, Google is a piece of shit. But they are not the ones causing climate change, and neither is AI technology. Planes, cars, the meat industry, offshore production... those are some of the truly big culprits.
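To put those two figures side by side, here is a quick back-of-the-envelope comparison that uses only the numbers quoted above (rounded, rough figures, nothing more precise than the comment itself):

```python
# Rough comparison using only the figures cited in the comment above.
google_2023_mt = 14.3   # Google's reported 2023 GHG emissions, million metric tons CO2e
aviation_mt = 935.0     # commercial aviation, roughly, million metric tons CO2 per year

ratio = aviation_mt / google_2023_mt
print(f"Commercial aviation emits roughly {ratio:.0f}x Google's entire annual footprint")
# -> roughly 65x
```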

[–] masquenox 33 points 5 months ago (4 children)

But they are not the ones causing climate change

The owners of google are capitalists. They are as responsible for climate change as any other capitalist.

[–] [email protected] 7 points 5 months ago (2 children)

I can't afford to ride airplanes. You're welcome.

[–] [email protected] 79 points 5 months ago (1 children)

The annoying part is how many mainstream tech companies have ham-fisted AI into every crevice of every product. It isn't necessary and I'm not convinced it results in a "better search result" for 90% of the crap people throw into Google. Basic indexed searches are fine for most use cases.

[–] [email protected] 16 points 5 months ago (1 children)

As a buzzword or whatever, this is leagues worse than "agile", whose overuse and forced integration I already loathed.

[–] [email protected] 7 points 5 months ago (2 children)

Before AI it was IoT. Nobody asked for an Internet-connected toaster or fridge...

[–] Raxiel 72 points 5 months ago (1 children)

If only Google had a working search engine before AI

[–] [email protected] 50 points 5 months ago

Yes, but now we can get much worse results and three pages of ads for ten times the energy cost. Capitalism at its finest.

[–] set_secret 59 points 5 months ago (1 children)

And yet it's still garbage... like their search.

[–] [email protected] 14 points 5 months ago (1 children)

With adblock enabled, I feel like their results are often better than, for example, DuckDuckGo's. I recently switched to DDG as my default search engine, but I regularly find myself using Google instead to get the results I'm looking for.

[–] Ledivin 7 points 5 months ago (1 children)

Interesting, I'm actually the exact opposite. I always start with Google because it's usually good enough, but whenever it takes 2-3 tries to get something relevant, I switch to DDG and get it on the first try.

[–] [email protected] 9 points 5 months ago* (last edited 5 months ago) (8 children)

My issue is mostly with image search results. DDG's images tend to be less relevant than Google's. DDG also lacks "smart" results (I don't know the official term).

For example, when you search "rng 25" on Google, it immediately presents you with a random number between 1 and 25. On DDG you have to click one of the search results and then use some website to generate the number.

Or when searching for the results of a soccer game, Google immediately presents all the stats to you, while on DDG you will only find some articles about it.

Of course it really depends on the kind of search and I'm sure DDG will regularly have better results than Google too.

[–] [email protected] 44 points 5 months ago (8 children)

AI is just what crypto bros moved onto after people realized that was a scam. It's immature technology that uses absurd amounts of energy for a solution in search of a problem, being pushed as the future, all for the prospect of making more money. Except this time it's being backed by major corporations because it means fewer employees they have to pay.

[–] pycorax 10 points 5 months ago (1 children)

There are legitimate uses of AI in certain fields like medical research and 3D reconstruction that aren't just a scam. However, most of these are not consumer facing and the average person won't really hear about them.

It's unfortunate that what you said is very true on the consumer side of things...

[–] [email protected] 43 points 5 months ago (3 children)

I skimmed the article, but it seems to assume that Google's LLM uses the same architecture as everyone else's. I'm pretty sure Google runs inference on its own TPU chips instead of regular GPUs, and those are generally pretty energy efficient.

That, and it doesn't seem to consider how many responses are simply cached for identical questions. A lot of Google searches are going to be identical anyway, just because search suggestions funnel people into the same phrasing of a question.
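As a rough illustration of that caching point, here is a minimal sketch (hypothetical function names, not Google's actual serving stack): normalize the query, key a cache on it, and only pay the expensive model call on a cache miss.

```python
from functools import lru_cache

def run_model(normalized_query: str) -> str:
    # Stand-in for the expensive generative step; in reality this is where the energy goes.
    return f"generated answer for: {normalized_query}"

def normalize(query: str) -> str:
    # Collapse case and whitespace so "What is a TPU?" and "what is a  TPU?" share an entry.
    return " ".join(query.lower().split())

@lru_cache(maxsize=100_000)
def cached_answer(normalized_query: str) -> str:
    # Only a cache miss pays the full inference cost; repeated questions are nearly free.
    return run_model(normalized_query)

def answer(query: str) -> str:
    return cached_answer(normalize(query))

if __name__ == "__main__":
    answer("What is a TPU?")
    answer("what is a   TPU?")         # identical after normalization, served from cache
    print(cached_answer.cache_info())  # hits=1, misses=1
```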

[–] kromem 16 points 5 months ago (2 children)

Exactly. Even for non-AI queries, the difference between a cached response and a live one is an order of magnitude.

At this point, though, a lot of people just care about the 'feel' of anti-AI articles, even if the substance is BS.

And then people just feed whatever gets clicks and shares.

[–] [email protected] 12 points 5 months ago (2 children)

I hadn't really heard of TPU chips until a couple of weeks ago, when my boss told me how he uses the USB versions for at-home ML processing of his closed-network camera feeds. At first I thought he was using NVIDIA GPUs in some sort of desktop unit and just burning energy... but I looked the USB things up, and they're wildly efficient, and he says they work just fine for his applications. I was impressed.

[–] [email protected] 8 points 5 months ago

The Coral is fantastic for use cases that don't need large models. Object recognition for security cameras (using Blue Iris or Frigate) is a common use case, but you can also do things like object tracking (track where individual objects move in a video), pose estimation, keyphrase detection, sound classification, and more.

It runs TensorFlow Lite, so you can also build your own models.

Pretty good for a $25 device!
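For anyone curious what that looks like in practice, here is a minimal sketch of running an Edge-TPU-compiled TFLite model on a Coral accelerator via the documented tflite_runtime API. The model path and input data are placeholders, and the delegate library name varies by platform (shown here for Linux).

```python
import numpy as np
import tflite_runtime.interpreter as tflite

# Load a model compiled with the Edge TPU compiler and attach the Edge TPU delegate.
interpreter = tflite.Interpreter(
    model_path="model_edgetpu.tflite",
    experimental_delegates=[tflite.load_delegate("libedgetpu.so.1")],
)
interpreter.allocate_tensors()

input_details = interpreter.get_input_details()[0]
output_details = interpreter.get_output_details()[0]

# Feed a frame (e.g. a resized camera image) as a uint8 tensor of the expected shape.
frame = np.zeros(input_details["shape"], dtype=np.uint8)
interpreter.set_tensor(input_details["index"], frame)
interpreter.invoke()

scores = interpreter.get_tensor(output_details["index"])
print("top class index:", int(np.argmax(scores)))
```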

[–] [email protected] 8 points 5 months ago

Yeah, they're pretty impressive for some at-home stuff, and they're not even that costly.

[–] jj4211 33 points 5 months ago

The confounding part is that when I do get offered an "AI result", it's basically identical to the excerpt in the top "traditional search" result. It wastes a fair amount more time and energy just to repeat what the top of the search said anyway. I've never seen the AI overview be more useful than the top snippet.

[–] [email protected] 32 points 5 months ago* (last edited 5 months ago)

It's not even hidden; people just give zero fucks about how their magical rectangle works, and they get mad if you try to tell them.

[–] blackwateropeth 31 points 5 months ago

And it’s only 10x more useless :)

[–] [email protected] 28 points 5 months ago (14 children)

The results used to be better too. AI just produces junk faster.

[–] [email protected] 25 points 5 months ago (1 children)

If only they did what DuckDuckGo did: make it pop up only in very specific circumstances, draw primarily from current summarized Wikipedia information in addition to the existing context, and let the user turn it off completely with a single settings toggle.

I find it useful in DuckDuckGo because it's out of the way, unobtrusive, and only pops up when necessary. I've tried using Google with its search AI enabled, and it was the most unusable search engine I've used in years.

[–] [email protected] 11 points 5 months ago (1 children)

DDG has also gotten much worse since the introduction of AI features.

[–] [email protected] 23 points 5 months ago (23 children)

What's up with these shit-ass titles? It's not even REMOTELY hidden; it takes two fucking seconds of googling to figure this shit out.

The entire AI industry has been dependent on GPU hardware manufacturers, and Nvidia is STILL back-ordered (to my knowledge).

This is like saying that crypto has a hidden energy cost.

[–] [email protected] 11 points 5 months ago* (last edited 5 months ago) (1 children)

It's hidden in the sense that a normal user doesn't see the true cost on their energy bill. You perform a search and get the result in milliseconds, which makes it easy to get the false impression that it's just a minor operation. It's not like driving a car, watching the fuel gauge, and seeing the consumption.

Of course you can research how much energy Google consumes and find out the background, IF you're interested. But most people just use the tech and never question or even understand it.
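For a sense of why it stays invisible on an individual bill, here is a back-of-the-envelope sketch using commonly cited rough estimates (roughly 0.3 Wh for a traditional search and around 3 Wh for an LLM-assisted query; these are external ballpark figures, not numbers from the article):

```python
# Rough, commonly cited estimates; not figures from the article.
SEARCH_WH = 0.3        # ~0.3 Wh per traditional web search (Google's old public figure)
LLM_WH = 3.0           # ~3 Wh per LLM-assisted query (a frequently cited external estimate)
QUERIES_PER_DAY = 20   # hypothetical heavy searcher

daily_extra_wh = QUERIES_PER_DAY * (LLM_WH - SEARCH_WH)
monthly_extra_kwh = daily_extra_wh * 30 / 1000
print(f"Extra energy per user: ~{monthly_extra_kwh:.2f} kWh/month")
# ~1.62 kWh/month: invisible on one person's bill, enormous across billions of queries per day.
```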

[–] just_another_person 18 points 5 months ago* (last edited 5 months ago) (1 children)

To be fair, it was never "hidden", since all of the top five decided that GPUs were the way to go with this monetization.

Guess who is waiting on the other side of this idiocy with a solution? AMD, with cheap FPGAs that will do all this work at 10x the speed and a similar reduction in energy, at a fraction of the cost and hassle for cloud providers.

[–] repungnant_canary 14 points 5 months ago (5 children)

I'm genuinely curious where their penny-pinching went. All of these tech companies shove ads down our throats and steal our privacy, justifying it by saying they operate at a loss and need to increase income. But suddenly they can afford to spend huge amounts on some shit that won't bring them any more income. How do they justify that?

[–] [email protected] 7 points 5 months ago* (last edited 5 months ago)

It's another untapped market they can monopolize. (Or just run at a loss because investors are happy with another imaginary pot of gold at the end of another rainbow.)

[–] afraid_of_zombies 7 points 5 months ago

This is terrible. Why don't we build nuclear power plants, roll out a carbon tax, and create incentives for companies to generate their own energy from renewables?

You know, the shit we should have been doing before I was born.

[–] homesweethomeMrL 7 points 5 months ago

Wow AI is just so amazing

[–] [email protected] 6 points 5 months ago (1 children)

I'm surprised it's only 10x. Running a prompt through an LLM takes quite a bit of energy, so I guess even regular searches take more energy than I thought.
