this post was submitted on 12 Jun 2024
77 points (92.3% liked)

Technology


Why are people happy with or approving of AI on Apple products, when the same thing was (rightly) treated horribly when Microsoft just did it?

Is Apple doing it better in some way? Both said it would be local-only, but now Apple is doing some cloud processing. Do people really just trust Apple more???

all 43 comments
[–] garretble 55 points 6 months ago (1 children)

The biggest thing in the last couple of weeks is Microsoft showing off the half-baked Recall "feature" that lets your computer take screenshots of basically everything you do. The idea that you could search for something you did in the past using natural language is interesting, but the implementation was terrible. That's a big strike against MS, so much so that they're now recalling the beta release of it. MS doesn't have a good track record with things that are supposed to be local somehow ending up not local; I believe there was a big issue on Xbox where local screenshots were still being monitored in the cloud somewhere. MS also loves shoving ads down your throat and turning back on features you have explicitly turned off. There's no trust.

Apple certainly has its own issues, but as others have said, they have at least outwardly been a privacy-first company, at least in their marketing materials. They were one of the first to build "secure enclaves" into phones and PCs so biometrics couldn't leave your device, for example. There's a bit of a history, earned or otherwise, of Apple not doing bad things with your data, so when they say their AI junk is private, it's easier to swallow.

That said, I have yet to find a use for any of this AI junk on any platform. I wish it had all stayed in the realm of intelligently making your photos a little sharper or whatever, rather than hallucinating things out of whole cloth. I'm actually happy my iPhone isn't new enough to take advantage of this new stuff.

[–] [email protected] 5 points 6 months ago (2 children)

I’m excited for it. I don’t think hallucinations will be a huge concern. Knowing about all (or most) of the content on my devices is a MUCH easier prospect than knowing everything about everything, an idea that OpenAI and Google certainly aren’t trying too hard to refute about their models.

[–] jacksilver 5 points 6 months ago (1 children)

Are there any good videos/articles detailing real-world use cases for this stuff? I watched a couple of things on Recall, but there wasn't much about what I would actually use it for. While I do have trouble finding things from time to time, it doesn't feel like that big of a win for the cost (privacy or compute).

[–] [email protected] 8 points 6 months ago

I thought Apple's WWDC keynote showed some good uses for it, but you're right, it is kind of incremental, and may or may not be worth the privacy/compute cost. I personally am most excited that Siri will be able to contextualize my calendars, notes, messages, etc. There are lots of bits of information I've lost over the years that aren't actually lost, just buried, and current search isn't up to the task of finding them. Or searching through notes: instead of having to remember when I took a note and where I put it, I can just ask Siri a question and it'll basically search through my notes and find the answer.

I also think it's going to completely change academic research. Instead of going to Jstor and using a traditional search bar, you could just tell the AI assistant what you're thinking about, what your theories are, etc, and it will search the catalog and find relevant sources for you. It removes a layer of friction, which I think will make a lot of people more efficient/effective.

The main argument I see against it is "well that is all well and good, but none of that will matter when the internet is full of AI-generated crap." I mean yeah, that's true, but the internet is already full of non-AI-generated crap. Sifting through the shitty ads and "sponsored posts" has already made the internet nearly unusable IMO. That's a bigger problem that we need to deal with, that's separate from AI.

[–] garretble 1 points 6 months ago (1 children)

Yeah, I agree with this take, though I did see an article quoting Tim Cook as saying they wouldn't be able to totally get rid of hallucinations, so I'm still a little reserved about it all.

[–] [email protected] 3 points 6 months ago

I think healthy skepticism is always a good thing. A lot of people seem to be looking at this tech as a panacea, which it absolutely isn’t. It’s still really important that we have the ability to identify when it may be hallucinating, just like we really need the ability to think critically about literally anything on the internet.

[–] lemmylommy 35 points 6 months ago

Well. One company stared down the FBI when they wanted assistance unlocking a terrorist's phone, because it would weaken security for everyone else.

The other keeps adding "features" to my operating system that are designed to siphon data from me. They build, at the very least, misleading dialogs for those "features" to trick me into enabling them (often not even allowing "no" as a choice; usually it's just "yes" or "not now"), and even when meticulously disabled, they have a tendency to magically re-enable themselves after updates.

Who would you trust more?

[–] [email protected] 27 points 6 months ago (2 children)

Apple knows how to market their stuff. They’ve built a strong reputation among their customers.

[–] vermyndax 26 points 6 months ago (1 children)

I actually like Apple’s approach to AI more than all of the others. I don’t care for Microsoft’s implementation at all. I just try to avoid Microsoft in general on top of that, so no need to complain about it.

But I do think Apple’s approach to AI from a privacy and implementation perspective is what I would prefer from a software vendor.

[–] [email protected] 11 points 6 months ago (1 children)

I like how Apple seems to understand where AI is useful and doesn't shove it everywhere. It's also opt-in.

I don't like Apple, but their AI implementation is quite nice.

[–] Semi_Hemi_Demigod 9 points 6 months ago

Just the fact that it asks for permission before contacting ChatGPT is head and shoulders above every other company's implementation.

[–] [email protected] 19 points 6 months ago (1 children)

I want neither anywhere near me nor my devices.

[–] [email protected] 3 points 6 months ago

No transcription?

Only verbatim search?

No multi-step queries?

[–] crossover 19 points 6 months ago* (last edited 6 months ago) (1 children)

Apple lays out some details here: https://security.apple.com/blog/private-cloud-compute/

They control the cloud hardware. Information used for cloud requests is deleted as soon as the request is done. Everything is end-to-end encrypted. Server builds are publicly available to inspect. And all of this is only used when the on-device processing can't handle a request.

If somebody wanted to actually create a private AI system, this is probably how they’d do it.

You can disagree with this, or claim that they are somehow actually accessing and selling people's data, but Apple is going out of its way to show (and cryptographically prove) that it's not. It would also be incredibly fraudulent and illegal for them to make these claims and not follow through.
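The flow described in that blog post can be roughly sketched in code. This is a hypothetical illustration, not Apple's actual API: the model functions, the `ON_DEVICE_LIMIT` threshold, and the class names are all made up, and the real system's end-to-end encryption and attested server builds are not modeled here.

```python
# Hypothetical sketch of the Private Cloud Compute pattern described above:
# try on-device processing first, fall back to a cloud node only when
# needed, and discard request data as soon as the response is produced.

ON_DEVICE_LIMIT = 20  # pretend the local model only handles short prompts

def on_device_model(prompt):
    """Stand-in for a small local model."""
    return f"local answer to: {prompt}"

class PrivateCloudNode:
    """Stand-in for a stateless cloud node: no request data survives the call."""
    def handle(self, prompt):
        response = f"cloud answer to: {prompt}"
        # Per the described design, the request is deleted once the response
        # exists; nothing is retained for logging or training.
        del prompt
        return response

def answer(prompt):
    # The cloud is only used when on-device processing can't handle the request.
    if len(prompt) <= ON_DEVICE_LIMIT:
        return on_device_model(prompt)
    return PrivateCloudNode().handle(prompt)
```

The key design point the comment highlights is the last function: the cloud path is a fallback, not the default.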

[–] Redex68 3 points 6 months ago* (last edited 6 months ago)

To add to that, apart from the Apple cloud processing, data can be sent to OpenAI if a prompt is deemed too complex, but even then you're asked whether or not you want it to talk to OpenAI's servers each time, and apparently OpenAI isn't allowed to store any of that data, tho idk how much I'd trust that part.

They also claim that whenever data is sent off device, only the data directly relevant to the prompt is sent.

[–] Thekingoflorda 15 points 6 months ago (1 children)

I think it’s also partly because Apple’s business model is more compatible with not earning money by selling your data (not saying they aren’t also doing that). You buy a Windows key once and that’s it, but Apple customers often buy into the whole ecosystem, meaning Apple can earn recurring money from overpriced hardware.

[–] Telodzrum 8 points 6 months ago (1 children)

It's not just the hardware price that benefits Apple. Their services revenue was 30% of total revenue last year, and it's growing. Between the App Store cut, Apple TV+, iCloud, and all the others, they are running headlong into a world where selling an iPhone is a smaller piece of the pie than selling apps and cloud storage.

[–] Thekingoflorda 2 points 6 months ago (1 children)

Damn… 30% is actually way more than I thought. Profit margins are probably lower, right?

[–] Telodzrum 2 points 6 months ago (1 children)

They have to be, but even still it’s digital goods. So, price of production is peanuts once development is covered — just infrastructure.

[–] Thekingoflorda 1 points 6 months ago

Not with Apple TV+; I believe only Netflix makes a profit off streaming.

[–] [email protected] 13 points 6 months ago

You mean Apple I vs Microsoft AI 😅

[–] [email protected] 12 points 6 months ago* (last edited 6 months ago)

Unsure, but I think it’s because one of apple’s main talking points in recent years has been privacy (think the Apple logo with the lock). Whether it’s true or not, Apple has built trust with its users (misplaced or otherwise), whereas MS has lost a lot of that trust (especially with the recent Recall fiasco).

I think both companies are storing and using user data and telling you they aren’t, but I think it’s like what you said. People for whatever reason trust Apple more

[–] helpImTrappedOnline 10 points 6 months ago

From what I saw,

MS Recall is a 24/7 AI monitoring system that captures everything you look at and saves it for later. They didn't even do the bare minimum to protect the data; it was just dumped in an unencrypted folder where anyone could get wholesale access to it. All trust has been lost.

Apple is using AI as a tool to improve specific tasks/features that a user invokes, things like assistant queries and the new calculator. They have said some promising things in regards to privacy, specifically around the use of ChatGPT: any inquiry sent to ChatGPT will ask the user's permission first and obscure their IP. This shows they care enough to try; they have not lost our trust, but we remain skeptical.

[–] [email protected] 9 points 6 months ago (1 children)

@[email protected] people trust apple, for some reason.

I don't think they should - Apple made a deal with OpenAI, and they're an American company subject to American law (#schremsii), but people love and trust them for some reason.

[–] whereisk 1 points 6 months ago (2 children)

See how much an exploit for iPhone vs. Android will run you on the open market.

Also how fast a discovered security hole will be patched and distributed to the fleet between the two systems.

Most Android phones will never get a patch; some will get it 6-12 months later, and very few within the month.

Also one is run by an advertising company.

[–] [email protected] 3 points 6 months ago

On the other hand, many parts of Android, including the default system WebView, are updated from the Play Store like regular apps, and don't need a full OS update.

[–] [email protected] 1 points 6 months ago

@[email protected] yeah, software sucks. your point?

[–] HeyThisIsntTheYMCA 8 points 6 months ago

trust is a funny word

[–] paraphrand 7 points 6 months ago

Look into how Apple is designing the private compute cloud for this; it's pretty cool, and unlike what other companies are doing with the data that passes through their LLMs.

[–] mojoaar 7 points 6 months ago (1 children)

I don't trust either with my data - the same goes for Google and Meta for that matter 🙂

But based on what I saw in the WWDC video vs. M365 Copilot, I have to say I'm looking forward to seeing what Apple brings in terms of functionality. Specifically, it's the actions part I'm looking forward to.

We have been POCing M365 Copilot at work, and I have to say that for something they charge 30 dollars a month for (with no ability to do actions), the feeling is: meeeh...

[–] dojan 5 points 6 months ago (1 children)

Apple talks big about privacy but we only have their word for it, and they’re a corporation just as likely to lie and muddle things to fool their customers. That’s my main problem concerning Apple and privacy.

On paper their approach to ML is probably the best I’ve seen from any corp, and it’s probably the best I could’ve hoped to see since there’s no way they’d outright not go down that route.

[–] [email protected] 1 points 6 months ago

They mentioned in their keynote that they will have ways for third-party companies to verify their privacy claims. How this will look, I'm not sure, but it's a good step.

[–] [email protected] 6 points 6 months ago* (last edited 6 months ago)

It's about perceived reputation. I have to admit Apple is really good at software and integration, with a somewhat balanced approach, although I'd rate their business practices a negative score.

[–] simplejack 1 points 6 months ago* (last edited 6 months ago)

Re: the cloud processing.

It looks like Apple has 3 models: 1 local, 2 in the cloud.

The local model is lightweight and the dumbest of the 3. The second model is Apple-owned, in the cloud for more compute, but it also appears not to learn from the confidential data people submit to it. So this one is probably better than the local model, but still much dumber than GPT or Gemini.

The third model is whatever external LLM you plug into it; OpenAI is the default. Any request to ChatGPT throws a consent dialog before any data is sent.
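The three-tier setup described above can be sketched as a routing decision with a consent gate in front of the external tier. Everything here is illustrative: the tier names, the length threshold, the `"world:"` prefix standing in for world-knowledge requests, and the `ask_consent` callback are all invented for the sketch, not Apple's real logic.

```python
# Hypothetical three-tier routing: on-device model, Apple's cloud model,
# and an external LLM (ChatGPT by default) that is only contacted after
# the user explicitly consents via a dialog.

def route_request(prompt, ask_consent):
    """Pick a tier for a prompt; ask_consent() is only called for the external tier."""
    if len(prompt) <= 20:  # simple requests stay on-device
        return "on-device"
    if not prompt.startswith("world:"):  # heavier personal-context work: Apple's cloud
        return "apple-cloud"
    # World-knowledge requests would go to the external LLM, gated on consent.
    return "external-llm" if ask_consent() else "declined"
```

The design point the thread keeps coming back to is that last branch: the external model is never contacted unless the consent callback returns true.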

[–] TechNerdWizard42 -4 points 6 months ago (2 children)

People who use and trust Apple are idiots. They gasp in wonder when they receive such new advances as arranging icons freely on your screen. I did this with my Palm Pilot in the '90s, and on every phone since, even Windows Mobile phones in 2002/2003. But now it's "NEW"!

Same thing with AI and Apple. Too stupid to actually know any better. But when Daddy Apple says you are going to use it, everyone fawns.

Sending your data, all of it, to a cloud is not privacy. I guarantee this is part of the content-scanning and reporting requirements being seen across the globe. It's sold under the public-relations marketing of preventing CSAM, human trafficking, drug crimes, etc. But anyone with a brain cell knows that's not the real why; that's just the way it can be sold to fear-mongering groups.

[–] [email protected] 2 points 6 months ago

I know some really smart people who fawn over Apple. They get excited while telling me about a "new" feature I already have on my 2019 Samsung or my Win10 desktop.

You ever notice that it kinda sounds a lot like your little nephew talking about dinosaurs?

[–] zeppo 1 points 6 months ago* (last edited 6 months ago)

Pretty sure I was able to rearrange icons on my iPod Touch 4 in 2010.

[–] EarMaster -4 points 6 months ago (1 children)

I think it is because Siri is barely usable anymore. Other solutions have shown how bad it is, and everyone hopes real AI will make it better...

[–] [email protected] 1 points 6 months ago

...they control how good or bad Siri is. They chose to make it shit on purpose.