Control iPhone with the movement of your eyes

(support.apple.com)

75 points | by 9woc 4 days ago

34 comments

  • w10-1 an hour ago

    The cursor tracks OK, but the implementation seems to just replace a low-level pointing device, i.e., it's very precise and jittery: all attribution and no salience.

    Also, maybe like Siri it should be modal. E.g., dwell away to silence it, then dwell on a leading corner to say "Hey, listen..."

    Holding the phone seemed to cause problems ("you're not holding it right"). Probably best with fixed positioning, e.g., attached to a screen (like a Continuity Camera), assuming you're lying down with a fixed head position.

    Tracking needs a magnetic (gravity well?) behavior, where the pointer is drawn to UI features (and the user can retract by resisting). Salience weighting could make it quite useful; a rough sketch follows this comment.

    It's possible that weighting could piggyback on existing accessibility metadata, or it might require a different application programming model.

    Similarly, it would be interesting to combine it with voice input that prioritized things near where you are looking.

    I'm willing to try, and eager to see how it gets integrated with other features.
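
    A minimal sketch of what that gravity-well weighting might look like, assuming a hypothetical Target type whose salience field stands in for accessibility metadata (none of this is Apple API): nearby targets pull the pointer with an inverse-square falloff.

      import CoreGraphics

      // Hypothetical UI target: an element's frame plus a salience
      // weight (which could perhaps come from accessibility metadata).
      struct Target {
          let frame: CGRect
          let salience: CGFloat   // 0...1, how "important" the element is
      }

      // Pulls a raw gaze point toward nearby targets with an
      // inverse-square falloff weighted by salience, so the pointer
      // settles into controls instead of jittering between them.
      func gravityAdjusted(gaze: CGPoint, targets: [Target],
                           pullRadius: CGFloat = 80) -> CGPoint {
          var pull = CGVector(dx: 0, dy: 0)
          var totalWeight: CGFloat = 0
          for target in targets {
              let dx = target.frame.midX - gaze.x
              let dy = target.frame.midY - gaze.y
              let distance = max((dx * dx + dy * dy).squareRoot(), 1)
              guard distance < pullRadius else { continue }
              let weight = target.salience / (distance * distance)
              pull.dx += dx * weight
              pull.dy += dy * weight
              totalWeight += weight
          }
          guard totalWeight > 0 else { return gaze }
          // Strong nearby wells capture the pointer; weak ones only nudge.
          let capture = min(totalWeight * pullRadius, 1)
          return CGPoint(x: gaze.x + (pull.dx / totalWeight) * capture,
                         y: gaze.y + (pull.dy / totalWeight) * capture)
      }

    The "retract by resisting" part would need gaze velocity on top of this and is omitted here.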

  • augstein 8 hours ago

    Amazing how well eye tracking works on my phone (15 Pro).

    Unfortunately, there seems to be no way to press buttons by blinking, only by "dwelling" on an item for a few seconds, which, to me, makes using my phone feel quite hectic and prone to inadvertent inputs.

    • nomel 35 minutes ago

      I think a biological necessity that exists to maintain eye health probably isn't a good candidate for HID input. I suspect the user would end up with very dry eyes as they subconsciously/consciously refrained from blinking, even if it were only long blinks (which I do often if my eyes feel the need).

      Now I could see a wink working! Left wink, right wink. And, with a wink, you don't lose tracking during the action (just half the signal).
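
      For what it's worth, per-eye wink detection is roughly expressible with ARKit's face-tracking blend shapes; a hedged sketch, with made-up thresholds that would need per-user tuning:

        import ARKit

        enum Wink { case left, right }

        // Classifies a wink from ARKit face-tracking blend shapes: one
        // eye well past a "shut" threshold while the other stays open,
        // so gaze tracking keeps half its signal. Thresholds are guesses.
        func detectWink(in anchor: ARFaceAnchor) -> Wink? {
            let left = anchor.blendShapes[.eyeBlinkLeft]?.floatValue ?? 0
            let right = anchor.blendShapes[.eyeBlinkRight]?.floatValue ?? 0
            let shut: Float = 0.8
            let open: Float = 0.3
            if left > shut && right < open { return .left }
            if right > shut && left < open { return .right }
            return nil   // both open, a full blink, or ambiguous
        }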

    • bdavbdav 30 minutes ago

      Hectic is a good way to describe it. I was frantically trying to turn it off when it seemed to just start clicking on things at random.

    • cush 2 hours ago

      It's really bad on my iPhone 13. Surprised they released it here. After one or two clicks it needs to recalibrate. There doesn't seem to be a way to not click on things, either, and no way to change apps. After the third recalibration, I selected "yes" to "would you like to disable eye tracking", and while eye tracking was disabled, I also lost the ability to swipe down for Control Center or swipe up for the app switcher. I had to restart the phone to get things back to a usable state.

    • helsinki 33 minutes ago

      Mine doesn’t work at all on a 16 Pro. Maybe due to astigmatism?

  • canuckintime 2 hours ago

    Eye tracking is very impressive technology, and foveated rendering is an excellent application, but eye control is poor UX. Yes, I'm saying the emperor has no clothes.

      Imagine having a jittery cursor in the center of your vision at all times. If I had a mouse/trackpad that worked like that, it would immediately be replaced, but that's Apple's eye control. Imagine scrolling a page where everywhere you glance there's a spotlighted/popup control or ad. That's Apple's eye control utilizing dwell and snap-to-item.

      It's telling that the 'best window' apps/use cases for the Vision Pro are video watching and the Mac virtual display, both of which rely very little on eye control during use. Trying to browse the web with Apple's eye control is a clear regression compared to touch/keyboard/mouse/trackpad; it's only useful as an accessibility feature.

    • cush 2 hours ago

      > eye control is poor UX... a jittery cursor in the center of your vision

        For a lot of folks with tremors or other mobility issues, their eyes may be much more stable than their fingers. It might be helpful to weigh the tradeoffs you're presenting against the alternatives: a jittery, inaccurate finger in the center of their vision, or not being able to use the UI at all.

      > only useful as an accessibility feature

      For the above reasons, that's exactly how they're marketing it

      • canuckintime an hour ago

        > For a lot of folks with tremors or other mobility issues, their eyes may be much more stable than their fingers. It might be helpful to weigh the tradeoffs you're presenting against the alternatives: a jittery, inaccurate finger in the center of their vision, or not being able to use the UI at all.

        I did not suggest otherwise

        > For the above reasons, that's exactly how they're marketing it

        That's not how it's being received (see other HN users in this very thread), nor how Apple is marketing it for the Vision Pro.

    • kube-system 2 hours ago

      Yeah, disability can be frustrating. But good on Apple for giving people some options here.

      • canuckintime an hour ago

        Hey, I did not say otherwise. It's a good accessibility feature. It becomes frustrating when Apple makes it the main control scheme, as with the Vision Pro.

  • Willingham 8 hours ago

    I too tried this for a short while and was not impressed. However, I can’t help but wonder how ‘good’ I could get at using it if I invested more time in it. Would love to hear from someone who truly uses this tool regularly. Flying a plane is also quite cumbersome for the first 15 minutes.

  • Diti 31 minutes ago

    Doesn’t work in a dark room (visual hypersensitivity from ASD requires me to keep it dark).

  • Jhsto an hour ago

    Anyone know if the color of your eyes affects how well this feature works?

  • maxglute 3 hours ago

    If we were to expand on the face-control scheme further, what face gestures would be used for clicking/tapping? For click-and-hold? What would be least exhausting for the facial muscles, and what would look least ridiculous?

    • layer8 2 hours ago

      Double-blinking for tapping seems the most obvious. Closing your eyes for a second for tap-and-hold (and a second time to release, when necessary, e.g. for drag and drop).
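
      A hedged sketch of that mapping as a timing state machine, assuming an eyes-closed boolean is sampled each frame (e.g., from face-tracking blend shapes); every constant is a guess:

        import Foundation

        enum EyeGesture { case tap, holdStart, holdEnd }

        // Timing-based recognizer for the mapping above: a quick
        // double-blink taps, a ~1 s close starts a hold, and a second
        // long close releases it.
        final class BlinkGestureRecognizer {
            private var closedAt: Date?
            private var lastBlinkAt: Date?
            private var holding = false

            private let blinkMax: TimeInterval = 0.25
            private let holdMin: TimeInterval = 1.0
            private let doubleBlinkGap: TimeInterval = 0.5

            // Feed one sample per frame: are both eyes closed right now?
            func update(eyesClosed: Bool, at now: Date = Date()) -> EyeGesture? {
                if eyesClosed {
                    if closedAt == nil { closedAt = now }
                    return nil
                }
                guard let began = closedAt else { return nil }
                closedAt = nil
                let duration = now.timeIntervalSince(began)

                if duration >= holdMin {           // deliberate long close
                    holding.toggle()
                    return holding ? .holdStart : .holdEnd
                }
                if duration <= blinkMax {          // quick blink
                    if let last = lastBlinkAt,
                       now.timeIntervalSince(last) < doubleBlinkGap {
                        lastBlinkAt = nil
                        return .tap                // second blink of a pair
                    }
                    lastBlinkAt = now              // first blink; wait for pair
                }
                return nil
            }
        }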

      • inlined 44 minutes ago

        I think the hardest gesture to do would be to scroll. Cheekily I nominate sticking your tongue out and virtually licking the UX up/down

  • dt3ft 6 hours ago

    This is rather buggy. You can get locked out of your device.

  • ciceryadam 4 hours ago

    So somebody finally made the Opera face control from 15 years ago a reality: https://www.youtube.com/watch?v=kkNxbyp6thM

  • dyeje 9 hours ago

    I played around with this a bit. It doesn’t work amazingly well on my iPhone model (SE 3rd gen), but it’s pretty cool. I don’t think there’s an API to use it in apps yet, but I would love to make an eye-controlled mobile game.

  • Thoreandan 4 hours ago

    Kinda feels like an unfinished project?

  • ClassyJacket an hour ago

    I can't wait for this to be implemented on other devices and used to make sure you actually look at the screen during ads! Google is going to love it.

  • chakintosh 9 hours ago

    Tried this a while ago, it's really bad. Often freezes or overshoots the target.

  • GrumpyNl 5 hours ago

    I see lots of people walking and scrolling on their phones. Every once in a while they look up and continue. What will happen when you control it with your eyes and you look up? Will it scroll?

    • pndy 4 hours ago

      When I set it up and placed the phone on a table, I got a "bring your face into view" banner at the top of the screen.

      Still, on my 13 Pro it doesn't seem to follow my eyes correctly past the halfway point of the screen: I need to look a little bit up, away from the center of my field of view, to get the pointer to select anything. I tried setting it up a few times and the same thing happens over and over. Scrolling seems to be done via a widget similar to the AssistiveTouch one, unless there's some "eye gesture" I haven't figured out yet.

      It's a really interesting feature, surely helpful for people with mobility issues, but for the majority it's rather a novelty you can show your friends. The feature I do use quite often is Voice Control, which can be activated without training; it helps when you have busy or dirty hands and don't want to touch the device.

    • nonameiguess 2 hours ago

      It obviously can't work in all circumstances. Walking is one. Driving, whether you're controlling the phone via CarPlay or just have it mounted, is another. In the dark, presumably the phone alone doesn't light your face well enough. And you can't use it with sunglasses on.

  • gnrlst 8 hours ago

    It doesn't mention how scrolling would work - is that supported?

    • canuckintime 3 hours ago

      Only supported through the Dwell Control + AssistiveTouch feature for scroll gestures.

    • flurdy 6 hours ago

      Eye rolling takes on a new purpose...

      • xyst 2 hours ago

        Imagine doomscrolling with your eyes. Eye fatigue from screens. Eye fatigue from too much rolling.

  • xyst 2 hours ago

    How long until this is turned on silently across all devices and adtech folks, native mobile apps, and website operators are able to use your eye tracking data for A/B testing?

    The selfie normalized surveillance. Social media normalized "transparency" (i.e., posting every little dumb detail about yourself). Advertisements have invaded every aspect of life (TV, radio, streaming services, ads in your taskbar).

    • crooked-v 2 hours ago

      Apple has been pretty thorough about not allowing actual eye tracking data through to apps (just the resulting interactions), to the point that a lot of Vision devs have complained about it getting in the way of immersive UX design.