this post was submitted on 28 Jan 2025
359 points (99.2% liked)

Technology


Last week, Copilot made an unsolicited appearance in Microsoft 365. This week, Apple turned on Apple Intelligence by default in its upcoming operating system releases. And it isn't easy to get through any of Google's services without stumbling over Gemini.

Regulators worldwide are keen to ensure that marketing and similar services are opt-in. When dark patterns are used to steer users in one direction or another, lawmakers pay close attention.

But, for some reason, forcing AI on customers is acceptable. Rather than asking, "We're going to shovel a load of AI services, which you never asked for but our investors really need you to use, into your apps. Is this OK?", the assumption is that users will be delighted to see their formerly pristine applications cluttered with AI features.

Customers have not asked for any of this. There has been no clamoring for search summaries, no pent-up demand for the revival of a jumped-up Clippy. There is no desire to wreak further havoc on the environment to get an almost-correct recipe for tomato soup. And yet here we are, ready or not.

Without a choice to opt in, the beatings will continue until AI adoption improves or users find that pesky opt-out option.

top 37 comments
[–] [email protected] 13 points 1 day ago

For the user data. That's it. That's why it exists. That and the dream of replacing some jobs.

[–] PrivacyDingus 9 points 1 day ago

lmao, who is going to OPT INto the anti-privacy replace-your-job machine?

[–] [email protected] 97 points 1 day ago (7 children)

I am convinced that everyone really has to make this work one way or another because so goddamn much money was - and still is - being spent on this garbage.

[–] [email protected] 76 points 1 day ago (3 children)

We're being asked at work to "up our Copilot usage" to justify the license costs. Pretty sad when you need to be forced to use it.

[–] BroBot9000 39 points 1 day ago (1 children)

Please refuse and at every opportunity let them know how stupid they are for wasting that money.

[–] spankmonkey 7 points 1 day ago (1 children)

Use it and then explain how much of a waste of time it was to get the wrong results.

[–] [email protected] 43 points 1 day ago* (last edited 1 day ago) (1 children)

No, that just plays into their hands. If you complain that it sucks, you're just "using it wrong".

It's better to not use it at all so they end up with dogshit engagement metrics and the exec who approved the spend has to explain to the board why they wasted so much money on something their employees clearly don't want or need to use.

Remember, they won't show the complaints, just the numbers, so those numbers have to suck if you really want the message to get through.

[–] BroBot9000 10 points 1 day ago

This! ☝️

[–] ObviouslyNotBanana 3 points 1 day ago

Just because you brought up copilot, I think people need to see this

[–] eager_eagle 2 points 1 day ago

lmao my workplace encourages use / exploration of LLMs when useful, but that's stupid

[–] [email protected] 22 points 1 day ago* (last edited 1 day ago)

Correct. It's about metrics. They're making AI opt-out because they desperately need to pump user engagement numbers, even if those numbers don't mean anything.

It's all for the shareholders. Big tech has been, for a while now, chasing a new avenue for meteoric growth, because that's what investors have come to expect. So they went all in on AI, to the tune of billions upon billions, and came crashing, hard, into the reality that consumers don't need it and enterprise can't use it.

Transformer models have two fatal flaws: the hallucination problem, to which there is still no solution, makes them unsuitable for enterprise applications, and their cost per operation makes them unaffordable for retail customer applications (i.e., a chatbot that gives you synonyms while you write is the sort of thing people will happily use, but won't pay $40 a month for).

So now the C-suites are standing over the edge of the burning trash fire they pushed all that money into, knowing that at any moment their shareholders are going to wake up and shove them into it too. They've got to come up with some kind of proof that this investment is paying off. They can't find that proof in sales, because no one is buying, so instead they're going to use "engagement"; shove AI into everything, to the point where people basically wind up using it by accident, then use those metrics to claim that everyone loves it. And then pray to God that one of those two fatal flaws will be solved in time to make their investments pay off in the long run.
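The cost-per-operation point above can be sketched with some back-of-envelope arithmetic. All numbers here are assumed for illustration, not real vendor prices:

```python
# Back-of-envelope sketch with ASSUMED illustrative numbers (not real prices):
# why per-operation inference cost can eat a consumer subscription.
price_per_1k_tokens = 0.01          # assumed blended inference cost, USD
tokens_per_request = 1_500          # prompt + completion, assumed
requests_per_user_per_day = 50      # an "AI in every textbox" heavy user, assumed

monthly_cost = (price_per_1k_tokens * tokens_per_request / 1000
                * requests_per_user_per_day * 30)
print(f"${monthly_cost:.2f} per user per month")  # → $22.50 per user per month
```

Under these assumptions, inference alone eats more than half of a hypothetical $40/month subscription before any other costs, which is the commenter's point about retail economics.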

[–] [email protected] 24 points 1 day ago

Yeah, it's sunk cost fallacy all the way down. We're just being harvested because...fuck us I guess.

[–] [email protected] 14 points 1 day ago

It's a combination of sunk cost and FOMO.

[–] drahardja 2 points 1 day ago

This is it. “We spent so damn much money on this, we gotta see some NUMBERS on the dashboard!”

[–] SlopppyEngineer 2 points 1 day ago

They have to put it into everything and have people and apps depend on it before the AI bubble pops, so that after the pop it's too difficult to remove or break the dependency. As long as it's in there, they can charge subscriptions and fees for it.

[–] [email protected] 0 points 1 day ago

"coPilot, what is the Sunk Cost Fallacy?"

[–] [email protected] 28 points 1 day ago* (last edited 1 day ago)

AI is data hungry. If they opt you in, they also get more data from you automatically.

Even if everyone opts out on day 2, they get a crap ton of free stuff in the meantime.

[–] [email protected] 5 points 1 day ago* (last edited 1 day ago)

Because most people are too lazy and/or stupid to bother opting out. If they had an authentic metric for demand, then they couldn't trick the share value into going up.

Edit: fixed autocowrecks

[–] orclev 32 points 1 day ago

Because they need a constant stream of data to feed the models. If people had to opt in, they'd be less likely to do so; the models would starve, become less accurate, and therefore be less valuable to sell. Remember, the trained model is the valuable piece of the entire thing; that's what companies pay money to gain access to. There's no point in sitting on all that user data if they can't turn it into a marketable product by feeding it into a model.

[–] [email protected] 19 points 1 day ago

lmao when have tech companies ever given a semblance of a shit about consent. It’s an industry that has deep roots in misogynistic nerds and drunk frat bros

[–] gaael 13 points 1 day ago

what happened to asking permission first?
Masculine energy I guess.

[–] ilmagico 20 points 1 day ago (3 children)

At least there's still an opt out...

[–] spankmonkey 27 points 1 day ago (1 children)

Odds on the opt out actually opting you out instead of pretending to?

[–] TheTechnician27 11 points 1 day ago

Yeah, it's called switching away from these shit-ass invasive products wherever possible.

[–] r_deckard 6 points 1 day ago (1 children)

I tried co-pilot. Once.

I asked it "why does Windows Defender peg my HDD* at 100%"

The reply was "I don't know, but here are some Google searches that might help"

Microsoft's own Copilot doesn't seem to have access to Microsoft products, so now I uninstall/deactivate it at every opportunity I can.

*yes, a HDD. Not ideal for performance these days, but it's the last laptop I have with a HDD, and I use it for experiments.

[–] witten 3 points 1 day ago

I call bullshit. Because no LLM ever says, "I don't know." It just confidently invents an answer out of thin air.

Only mostly facetious here...

[–] [email protected] 7 points 1 day ago* (last edited 1 day ago)

I guessed it was the classic "first one comes free, the next one we'll see." They are offering AI for free to make people dependent on the technology; once people can't live without their AI, corporations will start profiting from subscriptions.

[–] [email protected] 8 points 1 day ago (1 children)

Because very few people are ever going to turn an optional feature on, whether they would ultimately like it or not. They need to be shown it. If they hate it, they will turn it off.

[–] [email protected] 8 points 1 day ago

They can still be offered the choice.

Opt-out is always bad because it is meant to exploit users who are not aware that a certain feature can be turned off. And even among those who are aware, not everyone is confident digging through settings.

[–] [email protected] 8 points 1 day ago* (last edited 1 day ago) (1 children)

It is to circumvent E2E encryption and get all your data. Here is a good video that gives a great summary: https://youtu.be/yh1pF1zaauc

[–] tabular 5 points 1 day ago (1 children)

I heard Windows was adding a "snapshot" feature and guessed that's what they were doing, but uhh.. is there really "AI screen" hardware doing this too?

[–] [email protected] 2 points 1 day ago

Not yet on PC, but the iPhone 16 already has a chip for it, and the new Copilot-certified laptops will have one as well. macOS already has the capability with that daemon mentioned in the video.

The era of surveillance is making huge progress every week.

[–] paraphrand 5 points 1 day ago

Data collection. And hoping to stumble upon consistently useful, yet-unknown use cases.

[–] AA5B 3 points 1 day ago* (last edited 1 day ago)

The reason is probably that the indexing takes a lot of time. You can't just turn AI on and expect it to work. You need to turn it on and wait for it to dig through all your data and index it for fast retrieval.

It's kind of funny that even for "dumb search," Microsoft turned on indexing by default and always made it difficult to disable. However, the search was so crappy that there was never any benefit to it.
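The trade-off the commenter describes — a slow one-time pass so that later lookups are cheap — is the classic inverted-index pattern. A minimal sketch (my own illustration, not Microsoft's implementation):

```python
# Minimal inverted-index sketch: one expensive pass over all documents
# buys fast lookups afterwards. Illustrative only, not any vendor's code.
from collections import defaultdict

def build_index(docs):
    """The slow, one-time pass: map every word to the documents containing it."""
    index = defaultdict(set)
    for doc_id, text in docs.items():
        for word in text.lower().split():
            index[word].add(doc_id)
    return index

def search(index, word):
    """The cheap lookup once the index exists: a single dictionary access."""
    return index.get(word.lower(), set())

docs = {1: "quarterly sales report", 2: "holiday photos", 3: "sales pitch deck"}
idx = build_index(docs)
print(search(idx, "sales"))  # documents 1 and 3
```

Scanning every file happens once at index-build time (which is why turning the feature on "takes a while"); each later query is just a hash lookup.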

[–] [email protected] 1 points 1 day ago

Because they are trash that wish to force their garbage on us. Next question.