Wow, this is... underwhelming. Some text summaries for apps nobody uses, and minor Siri improvements that bring it up to where Google Assistant was 5-10 years ago? Even the "do you want to change it, or send it?" prompt is straight outta Android. It also seems like they copied Google Photos and Gmail features.
And the place a better Siri would be really useful, Apple TV, isn't included at all :(
All that marketing for this...? None of these things require a dramatic new AI chip or months of announcements. They're minor improvements at best.
FWIW, the bigger things aren't included yet. Image generation, emoji, and the Siri improvements haven't arrived.
I know this is an honest response, but it's a bit funny that most (if not all) of those features are not useful at all in daily use. And they will be added in the future™!
I am with you on that. I am an Apple user for sure (TV, MacBook, Mac Pro, iPhones, iPads, AVP), but this whole Apple AI push is just ridiculous; they are so behind. Even the ChatGPT integration is laughable, with a very questionable interface and no history through that interface.
I just don't understand why we keep defending Apple so much in this case. They have the ability to do better, and the company has not been going the right route in the last few years.
They might be behind, but at least consider that it's because they've seen what "the future" of this technology looks like and aren't at all impressed.
The only reason they're even implementing the current feature set is because of pressure from investors.
The first large headline at apple.com right now is "Apple Intelligence is here". The last 10 or so ads from Apple are all about AI.
Which is not true; it is not here. The features are very limited, there are not a lot of them, and a lot of people are getting put on a waitlist right now (based on the Reddit comments).
I really hope that there is going to be a class action lawsuit against Apple, as they are doing exactly what Elon Musk, Google, and Microsoft have liked to do for so long: trying to sell you something that does not exist.
Behind what? The Windows ChatGPT interface? Apple has been quietly adding AI features, sometimes with no equal. For example, having OCR built in everywhere in the OS is amazing. I've used it many times already, but it's a quiet, seamless addition, so I guess people don't notice.
Their Photos improvements are pretty solid, bringing it closer to par with Google Photos. Their translation and voice recognition/dictation are state of the art. Their photo processing is equal to or better than Google's. Not really seeing where they are "so behind", to be honest.
"so behind" in that after months of pumping themselves up through glossy marketing they are now just barely on parity with Google Pixel's AI feature set?
C'mon man.
> months of pumping themselves up through glossy marketing
I don’t think I’ve come across their marketing outside tech-specific channels, which are reporting on it because AI gets clicks.
If you aren't seeing them, it is more likely down to how you handle ads in your everyday experience. They have a very active ad campaign that I've seen on several TV channels and streaming services (mainly Prime ads).
https://www.adweek.com/brand-marketing/apple-intelligence-de...
It's obvious they just shoehorned this stuff in after missing the bus. Now they've made the promises and are working feverishly to deliver in order to protect the stock price.
It's sad really. They could have had the "courage" to say: you know what, there's not much to LLMs yet. We've been shipping NPUs since 2017 and already use ML across our ecosystem to improve the user experience, but we're not going to fall for the AI hype. We'll continue shipping useful features as they become ready. And hey, you already have lots of ways to do gen stuff with apps!
But no, instead we get THE FIRST IPHONE BUILT FROM THE GROUND UP FOR AI!!!1!1! which... doesn't even ship with it, and when they finally roll it out it's... standard gen stuff. And not even all that great compared to what you can already do, even locally.
Which apps did you suppose no one is using? I can't say the summaries in Mail, Messages, and random notifications on my Mac have been super useful yet, although I just started using those features. The summaries I have looked at due to the novelty of the feature have been satisfactorily accurate.
I welcome the writing tools. I'd been using Grammarly as a glorified text and grammar checker. While I have zero interest in using it or any other AI tool to write text, it's nice for finding minor mistakes like using the the same word twice in a sentence. And now I have something free and built-in that's as good as Grammarly at the things I want to use it for.
It's nice that it's built in but I would have preferred they wrote an API for it and let apps provide the service. There's no particular reason we should have to use Apple's models. I already do all the same things locally with ollama with my choice of models.
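For anyone who wants the same local setup, here's a minimal sketch of the kind of rewrite call I mean, assuming an Ollama instance on its default port with a llama3 model already pulled (both are my assumptions, nothing to do with Apple's stack):

    # Minimal sketch: proofreading text with a local model via Ollama's
    # REST API. Assumes `ollama serve` is running on the default port and
    # `ollama pull llama3` has been done; swap in your preferred model.
    import json
    import urllib.request

    def proofread(text, model="llama3"):
        prompt = ("Fix typos and grammar in the following text. "
                  "Return only the corrected text.\n\n" + text)
        body = json.dumps({"model": model, "prompt": prompt,
                           "stream": False}).encode()
        req = urllib.request.Request(
            "http://localhost:11434/api/generate", data=body,
            headers={"Content-Type": "application/json"})
        with urllib.request.urlopen(req) as resp:
            return json.loads(resp.read())["response"]

    print(proofread("Their the best team, and they no it."))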
Siri already works great with Apple TV.
Have you actually used Apple Intelligence? It sounds like you don’t even use iOS, so I'm not sure how you’re able to judge “minor improvements at best.”
Also privacy. That’s substantially different than Android or anything from Google.
It's called a "soft opening".
Apple said they had to delay rolling out some of the features they had planned for this release.
Fortunately for users the features cost them nothing.
Does anyone else want to talk to Siri like a normal human? Like an actual assistant?
It drives me nuts that Siri can't interact correctly when spoken to like this: 'Siri, could you text my wife that I will be home in 20 minutes'
Converts to:
Text Wife: That I will be home in 20 minutes
Should be: I will be home in 20 minutes
Drives me nuts. This is what I actually want. It's just so much more natural. This is my biggest grievance with virtual assistants. I want to talk to it like a real assistant. Hopefully after the LLM refactor of Siri this will happen, but on 18.2, still doesn't work with redesigned Siri. I don't know if they have added the LLM integration with her, but I thought they had in 18.2
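To make the complaint concrete, the fix being asked for is tiny: drop the complementizer ("that"/"saying") from the dictated body. A toy sketch of the desired behavior - purely my illustration, not how Siri actually works:

    # Toy sketch of the desired parsing: strip the leading "that"/"saying"
    # from the dictated message body. Illustrative only; not Siri's actual
    # implementation.
    import re

    def parse_text_command(utterance):
        m = re.match(r"(?:siri,?\s*)?(?:could you\s+)?text (?P<who>my wife|\w+)\s+"
                     r"(?:that\s+|saying\s+)?(?P<body>.+)",
                     utterance, re.IGNORECASE)
        return (m.group("who"), m.group("body")) if m else None

    print(parse_text_command(
        "Siri, could you text my wife that I will be home in 20 minutes"))
    # -> ('my wife', 'I will be home in 20 minutes')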
I just tried this and it sent "I will be home in 20 minutes". I am on 18.1 but I doubt that matters.
This sort of misunderstanding w.r.t. Siri is actually an area where it historically does quite well. I can't replicate this issue on my phone and i almost suspect you pulled it out of thin air.
I have zero interest in AI helping me to write or rewrite email or other text. Siri, maybe; let's hope it finally becomes somewhat useful, considering how useless and stupid the current version is.
For me, AI helping with writing email has been wonderful. I do a lot of email that is fairly generic and really just needs a basic template for replying or informing people. For those tasks it works great. I also had a more serious email that I wasn't sure how to respond to and needed to make sure the tone was appropriate and AI helped me get a good draft to work from.
Email is one of those things that I would put off as I just found it hard to get started and would worry about grammatical errors and trivial mistakes that people seem to focus on more than they should. AI has helped me just be better at email.
I imagine the value of this feature varies depending on how much email you send. If you only send one or two emails a day the value may not be obvious - if most of your job is email communication this could be a whole lot more impactful.
It is more about how much effort email takes for you. I don't do that much email but find the task unpleasant and more time-consuming than I would like. Letting an LLM give me a first draft to work from, based on the email and a couple of points I want to make in reply, makes the whole process much easier and more enjoyable to me, and I don't have to worry as much about silly grammatical errors or other small mistakes.
I wonder how long it will take for the sort of tone this generates to become the baseline for what to avoid in high-volume email.
Out of curiosity, as a user, have you tried to get it to format an executive email?
My impression of LLM-generated emails has been that they tend towards verbosity. At least relative to the minimal-characters style some higher-role folks prefer.
I haven't spent much time trying to get them to edit/format information in exec-memo style.
I usually give it a short draft or points to use and my experience is it usually gives a clear and concise email for me to then work with. You can always ask it to be shorter and see if it gets what you are looking for.
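For anyone who wants to reproduce that workflow outside of Mail, the prompt shape I use is roughly this (my own phrasing; any local or hosted model should do):

    # Rough sketch of the prompt shape for a short reply draft;
    # model-agnostic, wording is my own.
    def email_draft_prompt(original_email, points):
        bullets = "\n".join("- " + p for p in points)
        return ("Draft a short, plain, professional reply to the email "
                "below. Make only these points:\n" + bullets +
                "\n\n--- Original email ---\n" + original_email)

    print(email_draft_prompt(
        "Can you send the Q3 numbers before Friday?",
        ["yes, by Thursday", "note the EU figures are preliminary"]))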
Summarization is useful
I enjoy seeing entire text threads and email chains summarized in a notification. Really helps me choose what is worth reading now vs later.
I've appreciated the novelty of seeing a whole wall of text shrunk down to "can you give me a ride to the airport?"
I’ll take ‘not relying on my computer to do tasks that are inherently and inextricably human, like actually reading a text message from my mother or daughter, or replying to them,’ for $5, Alex.
I’ll stay on Sonoma for as long as I safely can.
You can also upgrade and just ... not enable it. In order to use the feature, not only do you have to join a waitlist, but you then have to also explicitly opt-in after getting it. Even after opting-in, you can opt out.
Or, you can use some features and not others. You can disable summaries, for instance.
Blanket, knee-jerk reactions like this are silly, and this is coming from someone who, after playing with it a bit, is underwhelmed.
I find the whole "summarize" use-case bizarre. The core problem with too many emails and messages isn't that I want to read all of it but it's too much. It's that most of it is stuff I fundamentally don't care about, from people who just invited themselves into my comms. I don't want a summary of random email blasts. I want my email inbox to only contain messages from people I actually want to hear from in full, and that's simple enough to do with filters, stars, and "high priority" tags.
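That kind of allow-list triage doesn't need AI at all; it's scriptable over plain IMAP. A minimal sketch (server, credentials, and the "Later" folder are placeholders):

    # Minimal sketch: sweep inbox mail from unknown senders into a "Later"
    # folder over IMAP. Server, login, and folder name are placeholders.
    import imaplib
    from email.parser import BytesHeaderParser
    from email.utils import parseaddr

    WANTED = {"mom@example.com", "boss@example.com"}

    with imaplib.IMAP4_SSL("imap.example.com") as conn:
        conn.login("me@example.com", "app-password")
        conn.select("INBOX")
        _, data = conn.search(None, "ALL")
        for num in data[0].split():
            _, msg = conn.fetch(num, "(BODY.PEEK[HEADER.FIELDS (FROM)])")
            hdr = BytesHeaderParser().parsebytes(msg[0][1])
            sender = parseaddr(hdr["From"] or "")[1].lower()
            if sender not in WANTED:
                conn.copy(num, "Later")  # folder must already exist
                conn.store(num, "+FLAGS", "\\Deleted")
        conn.expunge()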
The Apple Intelligence features are opt in. I’d suggest upgrading to Sequoia and just keeping those off if that’s what you are concerned about.
I did upgrade to it after the first week and it was the worst MacOS upgrade I've ever done (since Leopard), so I went back to Sonoma.
Firewall was broken, Wi-Fi was broken, Contact Key Verification was broken. It was the biggest pile of shit I've ever seen, even worse than Windows Vista or Windows 8 ever were for me, by comparison. And the second worst thing, besides how buggy it was, was that there was literally no benefit to using it - Sequoia is basically all AI nonsense, which I don't intend to use anyway.
Yes, it's my fault for upgrading, especially on a dot-0 release, but I was curious and had a free weekend day, so I figured I'd take the plunge. I ended up spending more time investigating just how long I can stay on Sonoma, especially once it gets end-of-lifed, because I have absolutely no faith in Apple's software going forward and I have zero interest in having my OS do any kind of AI processing, whether on device or off.
A few more random fuck-ups I noticed: Bluetooth codec quality can no longer be manually controlled or even seen by holding Option and clicking the BT icon in the menu bar, nor can you use the Bluetooth Explorer tool to edit the codec and bitrate. Just gone. Sequoia also completely broke my Homebrew and Python3 installs, along with doing something to pip independently of the other two such that I couldn't run any programs that had pip dependencies. That was all on a freshly formatted USB install of Sequoia, too.
That OS is genuinely an unbelievable pile of shit.
Also inextricably human tasks like reading spam messages, scam messages, marketing messages, overly verbose work emails, enjoy that!
Not sure I've spent more than 10 minutes combined over the past 20 years 'reading' a spam or scam email. When those things do manage to get through spam filters it's usually pretty obvious what they are. As for 'marketing messages' I can only assume you mean spam and added this to make the list look longer.
Get back to me in 3-5 years and let me know how getting AI to condense your work emails is going for you - my guess is that the first time ChatGPT manages to fuck up a distillation for you, either by garbling the meaning of something important or just missing a crucial point altogether, you'll swear off it for good. If you still have a job by then...
Point is, if you've been so genuinely bothered by spam and Nigerian princes that you're happy to outsource judgment and critical thinking to a probabilistic bot that hallucinates once in every 15 to 20 tries, then - aside from your skill issue getting your spam filter to work - you and I have very divergent views on what makes man's brain valuable and unique and indeed what parts of our cognition are worth preserving.
Question. Is this phoning home all the time?
I'd recommend reading up on Private Cloud Compute, the system that Apple designed to implement this: https://security.apple.com/blog/private-cloud-compute/
It's pretty impressive. I read it back in June when this came out, but the basic gist of it is that everything that _can_ be done on device _is_ done on device, and everything else is run in a _provably_ secure and private cloud environment.
Not to dunk on you, but your misuse of "provably" makes me discount what you said.
If we could "prove" security, we would. Proving security in a networked environment? Hahaha - there have been successful attacks on airgapped envs.
I think it’s “provably secure” in the same sense that end-to-end chat apps can be provably secure. You can compare codes in the apps to prove that your conversation isn't being man-in-the-middled. You still have to trust the software running on both ends to be doing the right thing.
Apple goes further than standard end-to-end encrypted messaging by adding a software attestation component to the handshake. And they say they will publish the server-side software for researchers to poke at. And there is a Certificate Transparency-style log, so you can be sure that the server-side software is actually published.
I’m not saying the system is good or works, I’m just saying don’t totally discount the idea of designing a system that has provable security properties.
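To make that concrete, here is the shape of the idea as a heavily simplified sketch - the measurements and the "log" are made up; the real protocol is the one described in Apple's PCC post linked above:

    # Heavily simplified sketch of attestation + transparency-log checking.
    # The build names and the "log" are invented; see Apple's PCC post for
    # the real protocol.
    import hashlib

    # Stand-in for the public, append-only log of released server builds.
    PUBLISHED = {hashlib.sha256(b"pcc-server-build-2024.10").hexdigest()}

    def client_will_send(attested_measurement):
        # Refuse to send data unless the server attests to running a build
        # whose measurement appears in the public log.
        return attested_measurement in PUBLISHED

    print(client_will_send(hashlib.sha256(b"pcc-server-build-2024.10").hexdigest()))  # True
    print(client_will_send(hashlib.sha256(b"tampered-build").hexdigest()))            # False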
I'm engaging in pedantry here, but computer science has proofs. When we have hardware-level security vulnerabilities, I trust my device as far as I can throw it. Proofs are only sound and valid in maths.
https://securityintelligence.com/news/apple-m-series-chips-h...
Everything Apple's done yields verifiable security - but it's not "provably secure". The two are distinct, and when you try to sell to me with sloppy language I get squinty-eyed. Especially since "confidential computing" already exists on x86/AWS, and I struggle to see the difference. It just sounds like Apple marketing to me.
> don’t totally discount the idea of designing a system that has provable security properties
They're only as provable as your assumptions/givens. Given a hardware vuln, where is your security now?
Thanks for this reply, it was more informative than the original "dunking" one.
Agreed, I think a lot of people do not appreciate the complexity of software systems. If we could prove software, we wouldn't have bugs. https://wiki.c2.com/?ProofsCantProveTheAbsenceOfBugs
> Security researchers need to be able to verify, with a high degree of confidence, that our privacy and security guarantees for Private Cloud Compute match our public promises.
How is this possible if the software runs on Apple hardware? Do the security researchers get access to the VLSI designs?
From the article (emphasis mine):
> Private Cloud Compute hardware security starts at manufacturing, where we inventory and perform high-resolution imaging of the components of the PCC node before each server is sealed and its tamper switch is activated. When they arrive in the data center, we perform extensive revalidation before the servers are allowed to be provisioned for PCC. The process involves multiple Apple teams that cross-check data from independent sources, and the process is further monitored by a third-party observer not affiliated with Apple. At the end, a certificate is issued for keys rooted in the Secure Enclave UID for each PCC node. The user’s device will not send data to any PCC nodes if it cannot validate their certificates.
> high-resolution imaging of the components of the PCC node
Does that mean they image the internals of the ICs? Or do they just take some pictures of the PCBs?
> monitored by a third-party observer not affiliated with Apple
Your friendly neighborhood NSA agent :p
Honestly, this was the biggest thing that pushed me from Android to iOS.
I don't trust Google to be (a) incentivized (because ad revenue) or (b) organizationally-capable (because product fiefdoms) to ship privacy arch to that level in Android.
And if I'm going to adopt LLMs on mobile as part of my workflow, I want very strong guarantees about where that data is ending up.
I believe that’s called Gemini Nano.
One model does not a strategy make. And I'm loath to bet on a horse that makes money every time it bites me.
Answered in the linked article. The fact that Apple Intelligence only works on hardware with their neural chip should give a clue. It mostly happens on device. In December they will offer anonymous ChatGPT integration, free and opt-in.
To give you a preliminary answer: IIRC, these models run locally, and older hardware therefore isn't supported. But what "... it’s all built on a foundation of privacy with on-device processing and Private Cloud Compute" exactly entails, I'm not sure.
Edit: from what I gather, "Private Cloud Compute" is indeed phoning home, but (supposedly) secure/private.
Observation: the primary user base for this is not us (technology professionals already using Copilot, and dealing with management layers who do); it's "everyday" people doing everyday tasks.
I updated my macOS and iOS device as soon as I could because I was curious to finally see how these features will work.
Turns out it's not even available today! The Apple Intelligence settings just showed a "Join waitlist" button, I clicked it and it says "You'll be notified when Apple Intelligence is available for [you]".
Try setting your language to US English. Might not be the issue, but only US English devices get it at the moment.
FWIW to anyone reading this: I got in off the waitlist in ~2 hours.
Today's release only supports English (US) and you appear to be in the UK. English (UK) support is slated for the end of this year.
My phone is on English (US) and I live in San Francisco.
For me it was a formality; I got the invite a minute later.
I also got it around 40 minutes later!
I'm in the US and was asked to join a waitlist.
UK English is available in the developer beta 18.2 if you have an Apple developer account.
A not-so-fun footnote for those who looked forward to this:
> The first set of Apple Intelligence features is available now as a free software update with iOS 18.1, iPadOS 18.1, and macOS Sequoia 15.1, and can be accessed in most regions around the world when the device and Siri language are set to U.S. English.
More to come later:
> Apple Intelligence is quickly adding support for more languages. In December, Apple Intelligence will be available for localized English in Australia, Canada, Ireland, New Zealand, South Africa, and the U.K., and in April, a software update will deliver expanded language support, with more coming throughout the year. Chinese, English (India), English (Singapore), French, German, Italian, Japanese, Korean, Portuguese, Spanish, Vietnamese, and other languages will be supported.
How much of this can be disabled, either through the GUI or via other means (e.g. terminal commands, binary/library removal, hosts file / firewall network block) on Mac OS?
All of it is opt-in and even requires joining a waiting list.
Apple hasn't provided any documentation about how it works, beyond that there may be a waiting list.
There's a checkbox at Settings > Apple Intelligence & Siri to turn it on.
It’s all opt-in. So nothing to disable.
FWIW, the good thing to come out of Apple Intelligence is their Private Cloud Compute, which should open the door to safer and more secure processing of data in the cloud. I know there's a push to make everything run on the edge nowadays, but cloud computing still has its strengths in some areas.
Honestly, making Siri even a little bit better (which, as far as I can tell, has stubbornly refused to improve over the last few years) would make me more excited about Apple Intelligence than all the text summarization and rewriting features.
The only ML improvements I've noticed and used in my Apple products in years are Chinese text OCR in images and Chinese handwriting recognition on iPad with the Pencil. Nothing in Apple Intelligence is making me bat an eye.
I've been using the 18.1 beta for a month now, and Siri is noticeably better at recognizing your speech and working around less explicit requests, e.g. "set a timer for 15 minutes... uh, make that 10".
Okay, well, it says that's part of the update.
I signed up for the iOS beta and haven't used any of the writing tools, but the AI summary of texts and emails has been really nice for glancing at the phone and getting the gist, especially for wall-of-text texters.
In honor of election season, I hope I can use the 'priority messages' stuff to better filter the political spam I get.
Is this the introduction of native call recording on iPhone? I've always had to use an app to record calls.
Yep, it's built in now.
I updated MacOS to 15.1 but it turns out that I need to be on some kind of waitlist to access the actual new Siri features. Kinda false advertising to call this "available today"...
I can’t believe they didn’t update the USB location on the Magic Mouse. Still on the bottom. Unbelievable.
That would be a pretty impressively sweeping thing to accomplish with a software update launching Apple Intelligence. :D
sorry just griping about apple’s announcements today in general.
Rather disappointed that this isn't available on my 14 Pro.
> Apple Intelligence allows users to stay on top of their inbox like never before with Priority Messages and message summaries.
Ugh.
I mean, really - my manager already gets enough emails that he doesn't read or fully understand; he pipes them through Copilot, still doesn't grok 'em, and answers and delegates sh't he shouldn't (and wouldn't, if he'd read the email himself).
Looking forward to this excuse of having even more overworked people /s
My experience of this feature in the betas over the last few months (for Notifications) has been excellent. I used to have so many notifs I would just ignore them all; now I can quickly glance and see which groups of notifs I want to actually read. In most cases, the summary contains all the info I'd want.
If you glance at the notification to decide whether something needs attention now or later, then read the mail(s) in full - that's something entirely different, and not a problem, I'd think.
I’m interested to see where this goes. That said, I am now planning to delay upgrading my iPhone 11 Pro until iPhone 18 comes out.
IMHO Apple and everyone else is moving way too fast with adoption. The deployment surface of iPhone is huge - I’m interested to see how Apple handles their first really serious issue (like “diverse Nazis”).
Also - current AI programs are complete and total pigs. iPhone 16 offers 8GB of memory and 1TB storage. I know the programs need the memory and so forth but still. I get it but I’m also going to wait now for the vendors to figure out the new future.
In the meantime, I will watch and wait. Plus, if Apple's history is any indicator, the first 2-3 versions will be lame, but then around 4 or 5 it will take off.
Imho, one of the reasons Apple built out their hybrid security/privacy arch was so they could trade data transfer for cpu/mem, when it makes sense.
The user sees some additional latency, but Apple can deliver equivalent functionality across hardware generations, with some calling out to servers and some doing on-device processing.
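As a toy illustration of that trade-off (my own numbers and rule of thumb, not Apple's actual heuristics):

    # Toy illustration of hybrid dispatch: run on device when the model
    # fits comfortably in local memory, else fall back to private cloud.
    # The 0.5 safety margin is invented.
    def route(model_mem_gb, device_mem_gb):
        if model_mem_gb <= device_mem_gb * 0.5:
            return "on-device"
        return "private-cloud"

    print(route(2.0, 8.0))   # on-device
    print(route(12.0, 8.0))  # private-cloud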
Honestly, I'm mostly impressed that Apple is aiming to deliver OS-level, works-in-all-apps functionality.
Imho, that's what users really want, but MS and Google's product org structures hamstring them from being able to deliver that quickly.