this post was submitted on 22 Dec 2023
229 points (95.6% liked)

Technology


Apple wants AI to run directly on its hardware instead of in the cloud::iPhone maker wants to catch up to its rivals when it comes to AI.

top 50 comments
[–] Ghostalmedia 31 points 1 year ago (2 children)

Remember, this probably isn't an either/or thing. Both Apple and Google have been offloading certain AI tasks to devices to speed up response time and process certain requests offline.

[–] OscarRobin 14 points 1 year ago (1 children)

Yep, though Google is happy to constantly process your data in the cloud, while Apple consistently tries to find ways to do it locally, which is generally better for privacy and security, and cheaper for them too.

[–] zwaetschgeraeuber 9 points 1 year ago (1 children)

Yeah, that's why they look through your images for "CSAM"

[–] Squizzy 3 points 1 year ago (1 children)
[–] zwaetschgeraeuber 1 points 1 year ago (1 children)

Ok, I have to correct myself: they walked this back two weeks ago due to backlash, but I doubt they won't do it in at least a similar way, or hidden, like they did with reducing power on older devices to "save battery": https://www.wired.com/story/apple-photo-scanning-csam-communication-safety-messages/

[–] [email protected] 1 points 1 year ago (1 children)

All phones reduce power on older devices. Would you rather your phone be a little slower, or just cut out at 15%? The problem with Apple is that they weren't upfront about it.

[–] zwaetschgeraeuber 1 points 1 year ago (1 children)
[–] [email protected] 1 points 1 year ago

Oh fair. I was mainly saying all phones do that hahaha

[–] [email protected] 28 points 1 year ago (4 children)

As an Android user, I am really jealous that Apple usually has these things running on their own devices with more privacy in mind. But I just can't accept the closed garden Apple pushes.

[–] bonnetbee 18 points 1 year ago (1 children)

They do say that they have privacy in mind. But they are also collecting the same data on their users as Google. Don't be too jealous; they suck just as much as your next Android phone company, but with a higher price tag and a walled garden.

[–] [email protected] 7 points 1 year ago (1 children)

They’re both bad, but Apple is clearly less bad about privacy than most big hardware or software companies by far.

[–] bonnetbee 6 points 1 year ago

How do you know? Because they promise?

[–] Squizzy 5 points 1 year ago (2 children)

I would change to Apple if it weren't for a few annoying bits, mostly to do with the walled garden. Like the Apple TV I have has a terrible input for typing unless you have an iPhone... so now I want neither.

[–] [email protected] 6 points 1 year ago (1 children)

I like to own my stuff, have control of it, and install whatever I want.

[–] Squizzy 1 points 1 year ago (3 children)

Yeah, but Android is owned by an advertising company; all they want is data. Apple has a better system and less need to push for more data.

[–] [email protected] 3 points 1 year ago

Apple is also an advertising company, and its walled-garden ecosystem makes it harder to avoid its reach.

[–] [email protected] 2 points 1 year ago

But I can remove them. With Apple, if they want to track you, you can't remove it. Even in their secure mode, DNS requests were reaching Apple servers.

[–] bonnetbee 1 points 1 year ago (1 children)

Android is not owned by Google, but used and heavily supported by Google. It is open source; you can check how much data bare Android collects.

And 'all they want is data' is wrong, too, all they want is money. Same with Apple, all they want is money. And if they get more money by collecting user data to target more people to buy Apple products, they will do that.

[–] Squizzy 1 points 1 year ago

All they want is data because that's their model for how they make money. Android may not be outright owned by Google, but its development and funding without Google would be on a very different level. Yeah, it is open source to an extent, but the versions put out by Google have their changes made, and those go across the vast majority of Android devices.

[–] RGB3x3 5 points 1 year ago (2 children)

I'm in the same boat, but it seems there are some things they're going to change in 2024. Namely, sideloading apps, actual Firefox with add-ons, and the ability to actually move app icons wherever I want on the home screen (not sure if that'll ever change).

They've already gone to USB-C, which was the main reason I never would have switched over.

[–] [email protected] 3 points 1 year ago

Even with all those changes I wouldn't switch.

[–] [email protected] 2 points 1 year ago

Sideloading's possible and surprisingly easy on iOS, even for non-technical folks. I use AltStore, but I think there are other options. Don't know what I'd do without uYou+ or TikTok LRD

[–] [email protected] 2 points 1 year ago (1 children)

Funny that you mention it: a few months ago when updating stuff, I got a new feature on my Android phone... offline subtitle generation based on audio, generated in real time from anything outputting sound on my phone.

A Google search suggests this might be an older feature - not sure if my phone didn't support it, or if I maybe just missed it, or if they added a more obvious button.

Google has a separate app for that stuff, called Private Compute Services. Right now it's nothing like an offline Google assistant replacement, but I thought it's really nice to have that stuff available without relying on internet access.

[–] [email protected] 1 points 1 year ago

Did you try turning off the internet to see if it actually works? That's pretty amazing, thank you for sharing!

[–] [email protected] 12 points 1 year ago* (last edited 1 year ago)

It's already possible. A 4-bit quant of Phi-1.5 (1.5B, as smart as a 7B model) takes 1 GB of RAM. Phi-2 (2.6B, as smart as a 13B model) was recently released and would likely take 2 GB of RAM with a 4-bit quant (not tried yet). The research-only license on these makes people not want to port them to Android, and instead focus on weak 3B models or bigger models (7B+), which heavily limits any potential usability.
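Those RAM figures can be sanity-checked with back-of-the-envelope math. A sketch, assuming roughly 4.5 effective bits per weight (4-bit values plus quantization scales); this is an illustrative approximation, not llama.cpp's exact GGUF layout:

```python
# Rough RAM estimate for a quantized LLM's weights.
# Assumes ~4.5 bits/weight effective (4-bit values + per-block scales);
# real formats vary, and runtime context/KV-cache memory comes on top.
def quantized_ram_gb(n_params_billions: float, bits_per_weight: float = 4.5) -> float:
    bytes_total = n_params_billions * 1e9 * bits_per_weight / 8
    return bytes_total / 1e9

print(round(quantized_ram_gb(1.5), 2))  # Phi-1.5: ~0.84 GB, in line with the ~1 GB above
print(round(quantized_ram_gb(2.6), 2))  # Phi-2: ~1.46 GB, in line with the ~2 GB estimate
```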

  1. Apple could mimic and improve on the Phi models' training to make their own powerful but small model, then leverage the fact that they have full knowledge and control over the hardware architecture to maximize every drop of performance. Kinda like how some people used their deep knowledge of console architecture to make it do things that seemed impossible.

Or

  2. The Apple engineers will choose, either due to time constraints or laziness, to simply use llama.cpp (which will certainly implement this flash attention), use an already-available model that allows commercial use, like Mistral, add some secret-sauce optimizations based on the hardware, and voilà.

I bet on 2.

[–] OhmsLawn 9 points 1 year ago (5 children)

How's that supposed to work?

I'm picturing a backpack full of batteries and graphics cards. Maybe they're talking about a more limited model?

[–] abhibeckert 29 points 1 year ago* (last edited 1 year ago)

This is a Financial Times article, regurgitated by Ars Technica. The article isn't by a tech journalist, it's by a business journalist, and their definition of "AI" is a lot looser than what you're thinking of.

I'm pretty sure they're talking about things that Apple is already doing not just on current hardware but even on hardware from a few years ago. For example the keyboard on iOS now uses pretty much the same technology as ChatGPT but scaled way way down to the point where "Tiny Language Model" would probably be more accurate. I wouldn't be surprised if the training data is as small as ten megabytes, compared to half a terabyte for ChatGPT.

The model will learn that you say "Fuck Yeah!" to one person and "That is interesting, thanks for sharing it with me." to someone else. Very cool technology - but it's not AI. The keyboard really will suggest swear words now by the way - if you've used them previously in a similar context to the current one. The old algorithmic keyboard had hardcoded "do not swear, ever" logic.
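That per-contact learning can be sketched with nothing more than bigram counts. A hypothetical toy, not Apple's implementation; the class and method names are invented for illustration:

```python
from collections import Counter, defaultdict

# Toy "tiny language model" keyboard: bigram counts kept per recipient,
# so the same prefix yields different suggestions for different people.
class TinyKeyboardModel:
    def __init__(self):
        # (recipient, previous word) -> counts of the word that followed
        self.bigrams = defaultdict(Counter)

    def learn(self, recipient: str, sentence: str) -> None:
        words = sentence.lower().split()
        for prev, nxt in zip(words, words[1:]):
            self.bigrams[(recipient, prev)][nxt] += 1

    def suggest(self, recipient: str, word: str):
        counts = self.bigrams[(recipient, word.lower())]
        return counts.most_common(1)[0][0] if counts else None

model = TinyKeyboardModel()
model.learn("friend", "fuck yeah that rules")
model.learn("boss", "that is interesting thanks")
print(model.suggest("friend", "fuck"))  # → yeah
print(model.suggest("boss", "that"))    # → is
```

Nothing is hardcoded as forbidden: the model suggests a swear word only where your own history for that contact contains one, which matches the behavior described above.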

[–] [email protected] 12 points 1 year ago (1 children)

I've been playing with llama.cpp a bit for the last week and it's surprisingly workable on a recent laptop just using the CPU. It's not really hard to imagine Apple and others adding (more) AI accelerators on mobile.
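Quantization is a big part of why that works on a CPU. A minimal sketch of symmetric 4-bit weight quantization; real llama.cpp formats (e.g. Q4_K) work block-wise with per-block scales, so this shows only the per-tensor idea:

```python
# Symmetric 4-bit quantization: store one int4 per weight plus a scale,
# cutting memory roughly 8x versus float32 at some loss of precision.
def quantize_4bit(weights):
    scale = max(abs(w) for w in weights) / 7  # map into int4 range -7..7
    return [round(w / scale) for w in weights], scale

def dequantize(q, scale):
    return [v * scale for v in q]

w = [0.7, -0.28, 0.14, 0.0]
q, s = quantize_4bit(w)
print(q)  # → [7, -3, 1, 0] — each entry fits in 4 bits instead of 32
print([round(x, 2) for x in dequantize(q, s)])  # lossy reconstruction
```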

[–] [email protected] 5 points 1 year ago

Oh yes, and the CPUs on phones have been getting more powerful every year with nothing that could take advantage of their full potential. Now a local AI will be great for privacy and response time.

[–] MiltownClowns 6 points 1 year ago (2 children)

They're making their own silicone now. You can achieve a lot more efficiency when you've streamlined the whole way through.

[–] hips_and_nips 6 points 1 year ago

silicone

It’s silicon. Silicon is a naturally occurring chemical element, whereas silicone is a synthetic substance.

Silicon is for computer chips, silicone is for boobies.

[–] [email protected] 4 points 1 year ago (2 children)

By making their own, you mean telling Taiwan Semiconductor Manufacturing Company, "hey, we are going to buy enough of these units that you have to give us the specs we chose at a better price than the competitors, and since we chose the specs off your manufacturing capacity sheets, we will say 'Engineered in Cupertino™'."

Btw, I'm not shitting on Apple here. I love my M2 processor.

[–] eager_eagle 4 points 1 year ago* (last edited 1 year ago)

Yes, like Google is doing with their Tensor chips in the Pixels.

[–] blahsay 7 points 1 year ago (3 children)

I'm going to blow your mind here... the 'cloud' is just two or three data centres with replication turned on. It's mostly a buzzword to charge a bit more.

[–] [email protected] 11 points 1 year ago (1 children)

Wait, so you mean it's not actual rain clouds in the sky??

[–] blahsay 3 points 1 year ago

They looked into it but apparently no

[–] Vash63 8 points 1 year ago (1 children)

Eh, it's a bit more than that. I work on a private cloud; the implications of it being a cloud versus traditional bare metal or virtualization platforms are around the APIs, quick spin-up/down cycles, fully integrated recovery, imaging and remote console systems, integration with automated deployment platforms, and others. It's not just a buzzword.

[–] blahsay 3 points 1 year ago

Most of that's on any half-decent commercial server. You're right, there are definitely some differences though.

I actually worked on our corporate move from private servers (main, backup, and DR) to Azure cloud, which had only two server locations (Melbourne and Sydney), and the mythology around cloud seemed a bit much.

[–] eager_eagle 2 points 1 year ago* (last edited 1 year ago)

Charging more for cloud? As if Apple won't find an excuse to charge even more for their overpriced phones by going offline.

[–] [email protected] 4 points 1 year ago

This is the best summary I could come up with:


Apple’s latest research about running large language models on smartphones offers the clearest signal yet that the iPhone maker plans to catch up with its Silicon Valley rivals in generative artificial intelligence.

The paper was published on December 12 but caught wider attention after Hugging Face, a popular site for AI researchers to showcase their work, highlighted it late on Wednesday.

Device manufacturers and chipmakers are hoping that new AI features will help revive the smartphone market, which has had its worst year in a decade, with shipments falling an estimated 5 percent, according to Counterpoint Research.

Running the kind of large AI model that powers ChatGPT or Google’s Bard on a personal device brings formidable technical challenges, because smartphones lack the huge computing resources and energy available in a data center.

Apple tested its approach on models including Falcon 7B, a smaller version of an open source LLM originally developed by the Technology Innovation Institute in Abu Dhabi.

Academic papers are not a direct indicator of how Apple intends to add new features to its products, but they offer a rare glimpse into its secretive research labs and the company’s latest technical breakthroughs.


The original article contains 741 words, the summary contains 194 words. Saved 74%. I'm a bot and I'm open source!

[–] Ghostalmedia 3 points 1 year ago* (last edited 1 year ago) (1 children)

Google is doing this exact same thing with Gemini, the platform behind Bard / Assistant.

Gemini has large scale models, that live in data centers, and handles complex queries. They also have a “Nano” version of the model that can live on a phone and handle simpler on-device tasks.

The smaller models are great for things like natural language UI and smart home controls. It’s also way faster and capable of working offline. A big use case for offline AI has been hiking with the Apple Watch in areas with no reception.
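The hybrid setup described above can be sketched as a simple router: short or offline queries stay on the small on-device model, the rest go to the large cloud model. The word-count heuristic and all names here are illustrative assumptions, not how Gemini actually decides:

```python
# Hypothetical router between an on-device model and a cloud model.
# Heuristic: short queries (or any query while offline) stay local.
def route_query(query: str, is_online: bool, max_on_device_words: int = 12) -> str:
    simple_enough = len(query.split()) <= max_on_device_words
    if simple_enough or not is_online:
        return "on-device"
    return "cloud"

print(route_query("turn off the lights", is_online=True))   # → on-device
print(route_query("plan a detailed three day hiking trip " * 3, is_online=True))  # → cloud
# No reception on the trail: everything falls back to the local model
print(route_query("what's my pace per mile", is_online=False))  # → on-device
```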

[–] bruhduh 1 points 1 year ago

Also battery management, background-task power distribution, and hardware energy efficiency. I mean, it would be great to have AI that adapted hardware energy-consumption settings depending on my use case. Yes, I know algorithms already exist to do that, but it would be great to have a much more flexible, AI-based energy manager that accommodates and adapts to my use cases.

[–] kromem 3 points 1 year ago (1 children)

AKA "we completely missed the boat on this thing and are going to pretend it was intentional by focusing on an inevitable inflection point a few years out from today instead."
