this post was submitted on 19 Jul 2023
61 points (98.4% liked)

Apple


Apple is creating its own AI-powered chatbot that some engineers are calling “Apple GPT,” according to a report from Bloomberg. The company reportedly doesn’t have any solid plans to release the technology to the public yet.

As noted by Bloomberg, the chatbot uses its own large language model (LLM) framework called “Ajax,” running on Google Cloud and built with Google JAX, a framework created to accelerate machine learning research. Sources close to the situation tell the outlet that Apple has multiple teams working on the project, which includes addressing potential privacy implications.
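For a sense of what JAX brings to the table (and why a team building an LLM stack might sit on top of it), here is a minimal, purely illustrative JAX sketch of a jitted gradient step. This is only an example of the framework the report mentions; nothing here reflects Apple's unreleased Ajax code, which is not public.

```python
import jax
import jax.numpy as jnp

def loss(w, x, y):
    # Squared error of a simple linear model; JAX can differentiate this directly.
    pred = x @ w
    return jnp.mean((pred - y) ** 2)

# grad differentiates the loss with respect to its first argument (w);
# jit compiles the whole step with XLA for fast CPU/GPU/TPU execution.
grad_fn = jax.jit(jax.grad(loss))

w = jnp.zeros(3)
x = jnp.ones((8, 3))
y = jnp.ones(8)
w = w - 0.1 * grad_fn(w, x, y)  # one gradient-descent step
```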

top 19 comments
[–] [email protected] 46 points 1 year ago (5 children)

If they could integrate that development into making Siri better, that’d be great.

[–] [email protected] 20 points 1 year ago* (last edited 1 year ago)

Hey Siri, what's the weather today?

Expect sunny conditions and a high of 27.

Hey Siri, so it's not supposed to rain right?

It is raining right now.

[–] shodgdon 19 points 1 year ago

Even the most limited GPT that gave a lot of "I can't help you with that" responses to be on the safe side would be light years ahead of Siri at this point

[–] [email protected] 9 points 1 year ago (1 children)

Oh my god yes. Siri needs some serious help.

[–] deong 5 points 1 year ago (1 children)

I think there are probably some ways to cross over a bit, but really, LLMs aren't necessarily aimed at the kind of things we want a virtual assistant to do today. Siri falls down mostly on its ability to do things correctly, quickly, and reliably. Generating 5,000 words of convincingly human-sounding explanation isn't what I want from a thing I quickly trigger on my phone. What I want is a very short reply, or none at all, accompanying the action I asked for. Call this person. Start navigation to an address. Turn on the lights. Play the version of a song I like from this specific live album. Some of those are things Siri really sucks at today, and none of them are likely to get much better with an LLM in place. Maybe playing music benefits from a more robust understanding of the language of my query, but for the rest, the suckage is more that Siri takes eight seconds for the server to respond or just inexplicably decides that today it doesn't know how to turn on a light.

At this point it feels like a great LLM would just let Siri fail to respond to a much wider variety of ways of asking my question in English, and that's not really the target we should be shooting for here.
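As a rough illustration of the "do the thing, say almost nothing" behavior this comment describes, here is a toy Python sketch; the intent names and handlers are hypothetical stand-ins, not any real Siri or SiriKit API.

```python
# Toy sketch: map a recognized intent straight to an action and a short reply,
# rather than generating paragraphs of text. All names here are made up.
def handle(intent: str, args: dict) -> str:
    actions = {
        "call_contact":     lambda a: f"Calling {a['name']}.",
        "start_navigation": lambda a: f"Starting directions to {a['address']}.",
        "lights_on":        lambda a: "Lights are on.",
        "play_track":       lambda a: f"Playing {a['title']} from {a['album']}.",
    }
    if intent not in actions:
        # Fail fast with a short refusal instead of a long generated answer.
        return "Sorry, I can't do that."
    return actions[intent](args)

print(handle("start_navigation", {"address": "1 Apple Park Way"}))
```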

[–] [email protected] 8 points 1 year ago

I agree with you to an extent, in that I would not want Siri producing a thesis every time I ask a simple question. But I think one thing that would help is if she remembered the last few things you requested and built some sort of context around them. That's what impressed me most about ChatGPT: if it doesn't quite give me what I'm looking for, I can clarify and we eventually get there. Siri is like a person with severe short-term memory loss, and much of my frustration comes from that.
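A minimal sketch of that "short-term memory" idea, assuming a generic assistant loop in Python: keep a small rolling transcript and pass it along with each new request. The ask_llm function below is a placeholder, not a real Siri or OpenAI API.

```python
from collections import deque

# Keep only the last few exchanges, so each new request carries recent context.
history = deque(maxlen=6)

def ask_llm(messages):
    # Placeholder model call; a real assistant would send `messages` to an LLM,
    # which could then resolve references like "the second one" from prior turns.
    return f"(model reply, given {len(messages)} turns of context)"

def ask_assistant(user_text: str) -> str:
    history.append({"role": "user", "content": user_text})
    reply = ask_llm(list(history))
    history.append({"role": "assistant", "content": reply})
    return reply

print(ask_assistant("Any good hikes near Cupertino?"))
print(ask_assistant("How long is the second one?"))
```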

[–] ForgetReddit 3 points 1 year ago (1 children)

They indeed need to make it more conversational. I think this is a big thing Jobs would harp on if he were still alive. It should feel like always having a friend/assistant in the room who knows everything.

It sucks for privacy, but if you trust Apple enough it'd be nice to have an always-on microphone for Siri, so you could be like, "Hey Siri, that tour we talked about at breakfast - can you bring up directions to that?" Stuff like that.

[–] StarManta 2 points 1 year ago* (last edited 1 year ago)

Holy shit that sounds like an absolute nightmare.

Let's ignore for the moment all the megacorporation and cloud data-security implications of that (and there are MANY) and pretend it does all processing and storage locally and never needs to transmit any of those conversations offsite.

That STILL sounds like an absolute nightmare. I could spy on the people who live with me in an extraordinarily efficient way. "Hey Siri, what did my wife talk about on the phone call over breakfast?" "Hey Siri, is my daughter gay?" "Hey Siri, summarize all the conversations you heard at this dinner party."

[–] mysoulishome 11 points 1 year ago

Having something like Siri or Alexa that was actually smart would be great…

[–] [email protected] 9 points 1 year ago (1 children)

How about make Siri not trash?

[–] [email protected] 1 points 1 year ago

@nicetriangle I'm sure that's their goal, but releasing an AI can let things spin pretty far out of control if that AI starts thinking against human logic. It could be a PR nightmare if Apple's AI starts going rogue. So they'll have to bridle it a lot.

@holo_nexus

[–] [email protected] 6 points 1 year ago (1 children)

Stop trying to make assistants. That includes Siri. The ML integrations, like making text selectable in images, are fucking amazing; invest only there, please.

[–] burak 2 points 1 year ago

Why? I use Siri all the time and it's useful.

[–] PBJ 5 points 1 year ago

F I X S I R I

[–] sulungskwa 4 points 1 year ago* (last edited 1 year ago)

Me in 2 years: "Hey Siri what time is it"

Siri, enabled with GPT: "I found this on the web for 'hey siri what time is it': " <posts a link to chat.openai.com/chat>

[–] alexius 3 points 1 year ago* (last edited 1 year ago)

I strongly believe that GPT is a (really impressive) gimmick. I'm not convinced that it has the potential for growth every outlet is pushing. No matter the brand or model (Bard, Bing, OpenAI, Apple if it happens), I don't think this will improve exponentially over time like other tech has. It relies on growth in computational power, bandwidth/cloud connectivity, and access to quality content for learning. The first two are already mature technologies that will improve only marginally over time, and access to new content will become increasingly difficult, especially if the web gets flooded with GPT-written text.

So yeah, it's cool that they're creating their own model and adding it to their services, but it's not a big deal.

[–] [email protected] 1 points 1 year ago* (last edited 1 year ago)

Oh we already have an Apple GPT at home, it even comes with D3DMetal.
