58 comments

  • solardev 3 hours ago

    Wow, this is... underwhelming. Some text summaries for apps nobody uses, and minor Siri improvements that bring it up to par with where Google Assistant was 5-10 years ago? Even the "do you want to change it, or send it?" prompt is straight outta Android. It also seems like they copied Google Photos and Gmail features.

    And the place a better Siri would be really useful, Apple TV, isn't included at all :(

    All that marketing for this...? None of these things require a dramatic new AI chip or months of announcements. They're minor improvements at best.

    • hondo77 3 minutes ago

      It's called a "soft opening".

    • CubsFan1060 2 hours ago

      FWIW, the more major things aren't included yet: image generation, emoji, and the bigger Siri improvements haven't arrived.

      • XzAeRosho 2 hours ago

        I know this is an honest response, but it's a bit funny that most (if not all) of those features are not useful at all in daily use. And they will be added in the future™!

        • outcoldman an hour ago

          I am with you on that. I am an Apple user for sure (TV, MacBook, Mac Pro, iPhones, iPads, AVP), but this whole Apple AI push is just ridiculous; they are so far behind. Even the ChatGPT integration is laughable, with a very questionable interface and no history through it.

          I just don't understand why we defend Apple so much in this case. They have the ability to do better, and the company has not been heading in the right direction over the last few years.

    • baggachipz an hour ago

      It's obvious they just shoehorned this stuff in after missing the bus. Now they've made the promises and are working feverishly to deliver in order to protect the stock price.

    • dmix 2 hours ago

      Apple said they had to delay rolling out some of the features they had planned for this release.

    • linotype 2 hours ago

      Fortunately for users the features cost them nothing.

  • andiareso 2 hours ago

    Does anyone else want to talk to Siri like a normal human? Like an actual assistant?

    It drives me nuts that Siri can't interact correctly when spoken to like this: 'Siri, could you text my wife that I will be home in 20 minutes'

    Converts to:

    Text Wife: That I will be home in 20 minutes

    Should be: I will be home in 20 minutes

    Drives me nuts. This is what I actually want; it's just so much more natural, and it's my biggest grievance with virtual assistants. I want to talk to it like a real assistant. Hopefully the LLM refactor of Siri will fix this (roughly the normalization sketched below), but on 18.2 it still doesn't work with the redesigned Siri. I don't know whether they have added the LLM integration yet, but I thought they had in 18.2.
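
    Roughly what I mean, as a toy sketch (made-up function name; obviously not how Siri actually parses requests):

        // Toy sketch: drop the leading complementizer "that" before
        // filling in the message body. Not Siri's real pipeline.
        func messageBody(fromDictated fragment: String) -> String {
            // "that I will be home in 20 minutes" -> "I will be home in 20 minutes"
            if fragment.lowercased().hasPrefix("that ") {
                return String(fragment.dropFirst("that ".count))
            }
            return fragment
        }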

    • bjtitus an hour ago

      I just tried this and it sent "I will be home in 20 minutes". I am on 18.1 but I doubt that matters.

  • garyclarke27 3 hours ago

    I have zero interest in AI helping me to write or rewrite email or other text. Siri maybe; let's hope it finally becomes somewhat useful, considering how useless and stupid the current version is.

    • mulderc 2 hours ago

      For me, AI helping with writing email has been wonderful. I do a lot of email that is fairly generic and really just needs a basic template for replying or informing people. For those tasks it works great. I also had a more serious email that I wasn't sure how to respond to and needed to make sure the tone was appropriate; AI helped me get a good draft to work from.

      Email is one of those things that I would put off, as I just found it hard to get started and would worry about grammatical errors and trivial mistakes that people seem to focus on more than they should. AI has helped me just be better at email.

      • simonw 2 hours ago

        I imagine the value of this feature varies depending on how much email you send. If you only send one or two emails a day the value may not be obvious - if most of your job is email communication this could be a whole lot more impactful.

        • throwaway19972 2 hours ago

          I wonder how long it will take for the sort of tone this generates to become the baseline for what to avoid in high-volume email.

      • ethbr1 2 hours ago

        Out of curiosity, as a user, have you tried to get it to format an executive email?

        My impression of LLM-generated emails has been that they tend towards verbosity, at least relative to the minimal-character style some higher-role folks prefer.

        I haven't spent much time trying to get them to edit/format information in an exec-memo style.

    • mingus88 2 hours ago

      Summarization is useful.

      I enjoy seeing entire text threads and email chains summarized in a notification. It really helps me choose what is worth reading now vs. later.

    • 2 hours ago
      [deleted]
  • avazhi 2 hours ago

    I’ll take ‘not relying on my computer to do tasks that are inherently and inextricably human, like actually reading a text message from my mother or daughter, or replying to them,’ for $5, Alex.

    I’ll stay on Sonoma for as long as I safely can.

    • orionsbelt an hour ago

      The Apple Intelligence features are opt in. I’d suggest upgrading to Sequoia and just keeping those off if that’s what you are concerned about.

  • amelius 3 hours ago

    Question. Is this phoning home all the time?

    • justusthane 2 hours ago

      I'd recommend reading up on Private Cloud Compute, the system that Apple designed to implement this: https://security.apple.com/blog/private-cloud-compute/

      It's pretty impressive. I read it back in June when this came out, but the basic gist of it is that everything that _can_ be done on device _is_ done on device, and everything else is run in a _provably_ secure and private cloud environment.

      • AlexErrant 2 hours ago

        Not to dunk on you, but your misuse of "provably" makes me discount what you said.

        If we could "prove" security, we would. Proving security in a networked environment? Hahaha - there have been successful attacks on airgapped envs.

      • amelius 2 hours ago

        > Security researchers need to be able to verify, with a high degree of confidence, that our privacy and security guarantees for Private Cloud Compute match our public promises.

        How is this possible if the software runs on Apple hardware? Do the security researchers get access to the VLSI designs?

        • judofyr 2 hours ago

          From the article (emphasis mine):

          > Private Cloud Compute hardware security starts at manufacturing, where we inventory and perform high-resolution imaging of the components of the PCC node before each server is sealed and its tamper switch is activated. When they arrive in the data center, we perform extensive revalidation before the servers are allowed to be provisioned for PCC. The process involves multiple Apple teams that cross-check data from independent sources, and the process is further monitored by a third-party observer not affiliated with Apple. At the end, a certificate is issued for keys rooted in the Secure Enclave UID for each PCC node. The user’s device will not send data to any PCC nodes if it cannot validate their certificates.
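
          In code terms, the guarantee described there is roughly this (toy sketch; all type and function names are made up, not Apple's actual PCC API):

              // Toy sketch only: made-up types, not Apple's real implementation.
              import Foundation

              struct PCCNode {
                  let attestationCertificate: Data
              }

              enum PCCError: Error {
                  case untrustedNode
              }

              // Placeholder check: a real client would cryptographically verify a
              // certificate chain rooted in keys tied to the node's Secure Enclave UID.
              func isTrusted(_ certificate: Data, roots: [Data]) -> Bool {
                  roots.contains(certificate)
              }

              // The property the quoted paragraph describes: no data leaves the
              // device unless the node's certificate validates.
              func send(_ payload: Data, to node: PCCNode, trustedRoots: [Data]) throws {
                  guard isTrusted(node.attestationCertificate, roots: trustedRoots) else {
                      throw PCCError.untrustedNode
                  }
                  // ...actual transport would happen here
              }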

          • amelius 2 hours ago

            > high-resolution imaging of the components of the PCC node

            Does that mean they image the internals of the ICs? Or do they just take some pictures of the PCBs?

      • ethbr1 2 hours ago

        Honestly, this was the biggest thing that pushed me from Android to iOS.

        I don't trust Google to be (a) incentivized (because ad revenue) or (b) organizationally-capable (because product fiefdoms) to ship privacy arch to that level in Android.

        And if I'm going to adopt LLMs on mobile as part of my workflow, I want very strong guarantees about where that data is ending up.

        • lrem 2 hours ago

          I believe that’s called Gemini Nano.

    • gregjor 2 hours ago

      Answered in the linked article. That Apple Intelligence only works on hardware with their neural chip should give a clue. It mostly happens on device. In December they will offer anonymous ChatGPT integration, free, opt-in.

    • 2 hours ago
      [deleted]
    • tgv 2 hours ago

      To give you a preliminary answer: IIRC, these models run locally, and therefore aren't supported on older hardware. But what "... it’s all built on a foundation of privacy with on-device processing and Private Cloud Compute" exactly entails, I'm not sure.

      Edit: from what I gather, "Private Cloud Compute" is indeed phoning home, but (supposedly) secure/private.

  • underyx 2 hours ago

    I updated my macOS and iOS devices as soon as I could because I was curious to finally see how these features work.

    Turns out it's not even available today! The Apple Intelligence settings just showed a "Join waitlist" button; I clicked it, and it says "You'll be notified when Apple Intelligence is available for [you]".

    • graeme 2 hours ago

      Try setting your language to US English. Might not be the issue, but only US English devices get it at the moment.

    • cbhl 2 hours ago

      Today's release only supports English (US) and you appear to be in the UK. English (UK) support is slated for the end of this year.

      • underyx 2 hours ago

        My phone is on English (US) and I live in San Francisco.

        • browningstreet 2 hours ago

          For me it was a formality; I got the invite a minute later.

          • underyx 2 hours ago

            I also got it around 40 minutes later!

      • daft_pink 2 hours ago

        UK English is available in the developer beta 18.2 if you have an Apple developer account.

      • purpleblue 2 hours ago

        I'm in the US and was asked to join a waitlist.

  • 6gvONxR4sf7o 42 minutes ago

    In honor of election season, I hope I can use the 'priority messages' stuff to better filter the political spam I get.

  • Y-bar 3 hours ago

    A not-so-fun footnote for those who looked forward to this:

    > The first set of Apple Intelligence features is available now as a free software update with iOS 18.1, iPadOS 18.1, and macOS Sequoia 15.1, and can be accessed in most regions around the world when the device and Siri language are set to U.S. English.

    More to come later:

    > Apple Intelligence is quickly adding support for more languages. In December, Apple Intelligence will be available for localized English in Australia, Canada, Ireland, New Zealand, South Africa, and the U.K., and in April, a software update will deliver expanded language support, with more coming throughout the year. Chinese, English (India), English (Singapore), French, German, Italian, Japanese, Korean, Portuguese, Spanish, Vietnamese, and other languages will be supported.

  • drooopy 2 hours ago

    How much of this can be disabled, either through the GUI or via other means (e.g. terminal commands, binary/library removal, hosts file / firewall network block), on macOS?

    • pilif 2 hours ago

      All of it is opt-in and even requires joining a waiting list.

  • superfamicom 2 hours ago

    I signed up for the iOS beta and haven't used any of the writing tools, but the AI summary of texts or emails has been really nice for glancing at the phone and getting the gist, especially for wall-of-text texters.

  • anvil-on-my-toe 2 hours ago

    Is this the introduction of native call recording on iPhone? I've always had to use an app to record calls.

  • abe94 3 hours ago

    Honestly, making Siri even a little bit better (which, as far as I can tell, has stubbornly refused to improve over the last few years) would make me more excited about Apple Intelligence than all the text summarization and rewriting features.

    • gwervc 3 hours ago

      The only ML improvements I've noticed and use in my Apple products in years are Chinese text OCR in images, and Chinese handwriting recognition on the iPad with the Pencil. Nothing in Apple Intelligence is making me bat an eye.

    • mfro 3 hours ago

      I've been using the 18.1 beta for a month now, and Siri is noticeably better at recognizing your speech and handling less explicit requests, e.g. "set a timer for 15 minutes... uh, make that 10".

    • furyofantares 3 hours ago

      Okay, well, it says that's part of the update.

  • aaroninsf 2 hours ago

    Observation: the primary user base for this is not us (technology professionals already using Copilot, and dealing with management layers who do),

    it's "everyday" people doing everyday tasks.

  • daft_pink 2 hours ago

    I can’t believe they didn’t update the USB port location on the Magic Mouse. Still on the bottom. Unbelievable.

    • kemayo an hour ago

      That would be a pretty impressively sweeping thing to accomplish with a software update launching Apple Intelligence. :D

      • daft_pink an hour ago

        Sorry, just griping about Apple’s announcements today in general.

  • belfalas 2 hours ago

    I’m interested to see where this goes. That said, I am now planning to delay upgrading my iPhone 11 Pro until iPhone 18 comes out.

    IMHO Apple and everyone else is moving way too fast with adoption. The deployment surface of iPhone is huge - I’m interested to see how Apple handles their first really serious issue (like “diverse Nazis”).

    Also - current AI programs are complete and total pigs. iPhone 16 offers 8GB of memory and 1TB of storage. I know the programs need the memory and so forth, but still. I get it, but I'm also going to wait for the vendors to figure out the new future.

    In the meantime, I will watch and wait. Plus, if Apple's history is any indicator, the first 2-3 versions will be lame, but around version 4 or 5 it will take off.

    • ethbr1 2 hours ago

      Imho, one of the reasons Apple built out their hybrid security/privacy arch was so they could trade data transfer for cpu/mem, when it makes sense.

      The user sees some additional latency, but Apple can deliver equivalent functionality across hardware generations, with some calling out to servers and some doing on-device processing.
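
      As a toy sketch of that trade-off (hypothetical names and thresholds, nothing Apple has published):

          // Hypothetical routing sketch: run on device when the local model and
          // memory budget allow, otherwise fall back to Private Cloud Compute,
          // trading a bit of latency for capability on older hardware.
          enum ExecutionTarget {
              case onDevice
              case privateCloud
          }

          func chooseTarget(requiredMemoryMB: Int,
                            availableMemoryMB: Int,
                            modelAvailableLocally: Bool) -> ExecutionTarget {
              if modelAvailableLocally && requiredMemoryMB <= availableMemoryMB {
                  return .onDevice
              }
              return .privateCloud
          }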

      Honestly, I'm mostly impressed that Apple is aiming to deliver OS-level, works-in-all-apps functionality.

      Imho, that's what users really want, but MS and Google's product org structures hamstring them from being able to deliver that quickly.

  • moepstar 2 hours ago

    > Apple Intelligence allows users to stay on top of their inbox like never before with Priority Messages and message summaries.

    Ugh.

    I mean, really - my manager already gets enough emails he doesn't read or fully understand; he pipes them through Copilot, still doesn't grok 'em, and answers and delegates sh't he shouldn't (and wouldn't, if he'd read the email himself).

    Looking forward to this being the excuse for having even more overworked people /s

    • jshreder 2 hours ago

      My experience of this feature in the betas over the last few months (for Notifications) has been excellent. I used to have so many notifs I would just ignore them all; now I can quickly glance and see which groups of notifs I want to actually read. In most cases, the summary contains all the info I'd want.

      • moepstar 2 hours ago

        If you glance at the notification to decide whether something needs attention now or later, and then read the mail(s) in full - that's something entirely different and not a problem, I'd think.

  • 3 hours ago
    [deleted]