It looks like these cameras are infrared and intended to see gestures from the wearer. There are some more details in the linked article:
https://www.macrumors.com/2024/06/30/new-airpods-to-feature-...
> The IR cameras can detect environmental image changes, facilitating a broader range of gestures to improve user interaction. For example, if a user watches a video using Apple Vision Pro and the new AirPods, and turns their head to look in a specific direction, the sound source in that direction can be "emphasized to enhance the spatial audio/computing experience."
I wonder if the signal could be integrated into AR glasses or headset to provide a wider FOV to the wearer.
> For example, if a user watches a video using Apple Vision Pro and the new AirPods, and turns their head to look in a specific direction, the sound source in that direction can be "emphasized to enhance the spatial audio/computing experience.
Geez, if only the Apple Vision had some kind of gyroscope and accelerometer so it could detect head motion without relying on external hardware...
> It looks like these cameras are infrared and intended to see gestures from the wearer.
I had an AVP for a while; controlling it with just gestures was sweet. If they're looking to bring those kinds of gestures to other Apple devices via AirPods (i.e. beyond just bringing more gestures to the AirPods themselves), I'm intrigued.
Remember the fuss people made about the Google Glass?
Turns out that people become okay with cameras every few inches observing every action as long as you say "it's for gestures," even though the data stream will inevitably make it back to the corpo for training (and voyeurism[0] by the other parties that'll be in the room).
Same excuse for the VR headsets. "Oh, it has a red LED that fires when recording!" Meanwhile the thing has 30 other IR cams in non-stop loops consuming the environment from power-on til death.
[0]: https://www.reuters.com/article/world/uk/nsa-staff-used-spy-...
My experience working at Apple was that private information does not leave the device. All our training data came from contractors who were hired to perform data collection and not from the general public.
If Apple really walked the walk on their privacy marketing, it would sunset all the APIs used for tracking, making it impractically hard for advertisers to track you. Not warning "do you still want to be tracked?", but remaking their whole stack to make tracking unreasonably hard.
Currently I see Apple as safer than, say, Google or Microsoft, but not as the privacy bastion it claims to be.
>Not warning "do you still want to be tracked?", but remaking their whole stack to make tracking unreasonably hard.
It's opt-in, and the bolded option is "Ask App Not to Track", so I'm really not sure what the issue is here.
It's clear it doesn't bother you, but I'll try to explain my position.
Years ago, Apple's Weather app sourced its data from The Weather Channel. That meant these three tracking options regarding your location:
- Always share - You get real-time weather alerts, very useful some seasons
- Share while using - You get current weather, but lose real-time alerts
- Do not share - Might as well uninstall the app
Then Apple made Apple Weather, which collects weather data from multiple sources, and is supposedly safer to share real-time location with since Apple won't share it with anyone. Before this, The Weather Channel had the real-time location of millions worldwide, and all Apple had for privacy was that prompt.
This is the kind of stack reengineering I'm talking about, the kind that makes privacy a real proposition; it just needs to be applied deeper so it really makes a difference.
>- Do not share - Might as well uninstall the app
Unless you're some sort of globetrotter going to a new city every week, the app is quite usable just by adding your city.
>Before this, The Weather Channel had the real-time location of millions worldwide
Are you sure apple wasn't proxying the traffic through their servers?
edit: for instance the stocks app very prominently shows the data is from yahoo stocks, but if you check "most contacted domains" in app privacy report, they're all apple domains. It doesn't contact yahoo at all.
You are one PRISM-type request and one gag order away from a silent update changing that.
In fact, it could already be the case and you would not have known it.
>You are one PRISM-type request and one gag order away from a silent update changing that.
Wouldn't the bigger issue be that they can abuse the same thing to grab the camera and/or microphone from your phone? Probably more useful than airpods too, given that a phone's always on, unlike airpods.
It reminds me of that principle of how people are more likely to be receptive to your request if you preface it with "because..."
I just want a feature that helps me find the one that runs away after I fall asleep with them in >:(
The Pros have Find My support, and you can even ping each earbud separately via the app.
Also, the new model can recognize when you fall asleep and stop the media. I think it works, but I'm not sure how quickly it detects sleep.
Mine are older and support Find My, but only when they’re out of the case. If I can’t find my case when they’re in it, I’m stuck. Does pro do anything for that?
Yes it does! Each ear separately and the case itself.
Doesn’t that already exist within the Find My app? It’s not perfect, but it’s always worked fine for me.
For me, they fell off during sleep less often after I did the tip size testing.
(I don't really want to be wearing them while asleep, but my body sometimes has other plans.)
That’s why I use the Pro Max for sleep, but I wish it had the newer chip with sleep detection.
Get memory foam tips for them. Mine rarely fall out if I sleep with them in.
Put a damn tracker in the case. I hate how the case is completely dumb if the AirPods are outside of it.
The AirPods Pro 2, AirPods Pro 3, and AirPods 4 WITH ANC all have cases with speakers and separate Find My integration, which I believe is what you are looking for.
It's the only reason I upgraded to the ANC version of the AirPods 4: I don't really like ANC, but I want the smart case.
They did. The AirPods Pro 2 has a tracker in the case; Find My can find all three pieces separately, and each can make the pinging noise.
People commonly continue to wear their AirPods in the changing rooms, because why not? Keep the tunes/podcast/etc going while you get ready/even shower.
If they add cameras to them, regardless of the implementation, I'm pretty sure that's not only against every gym policy but may be an actual criminal offense in certain states.
So I guess what I am saying is: This could be an anti-feature for certain people, or get people into trouble who continue to do a preexisting habit.
>If they add cameras to them, regardless of the implementation, I'm pretty sure that's not only against every gym policy but may be an actual criminal offense in certain states.
Is having the camera on illegal in and of itself, or only when you're actively recording?
If the purpose of these is gestures, won't they always be "actively recording?"
If you open the camera app but don't hit record, it's still "recording" in the sense that the image sensor is receiving light and sending over data, but it's not "recording" in what most people interpret the word to mean.
I think if people take out their phone and start pointing the camera at others in the changing room, most people would interpret that to not be ok and that's what we're discussing here.
You're essentially setting the scene so that people cannot know whether they're being recorded while a camera is always pointed at them. That's a problem, not least because law enforcement will need to investigate each complaint to determine whether the law was broken.
"But internal processing!" isn't quite the slam dunk defense you seem to think it is. It won't work like that in the real world; people being recorded while changing won't care about that distinction.
>I think if people take out their phone and start pointing the camera at others in the changing room, most people would interpret that to not be ok and that's what we're discussing here.
Right, because there are very few plausible justifications for why someone would be aiming their phone at others in a changing room. The same doesn't apply to airpods.
>You're essentially setting the scene so that people cannot know whether they're being recorded while a camera is always pointed at them. That's a problem, not least because law enforcement will need to investigate each complaint to determine whether the law was broken.
Is there any indication that you'll be able to get a video feed from airpods? If not, and you basically have to jailbreak them to do surreptitious recordings, what makes them more or less notable than all the other spy cameras you can get today and are more or less tolerated by the authorities?
>"But internal processing!" isn't quite the slam dunk defense you seem to think it is. It won't work like that in the real world, people being recorded while changing won't care about that distinction.
Do you think it's some outrageous breach of privacy that face id (and similar face scanning technologies) are constantly on the lookout for faces, including in bathrooms?
> The same doesn't apply to airpods.
It doesn't today because they don't have cameras on them; it won't tomorrow if they do. People will definitely need to justify (maybe legally) why they're pointing the camera on their AirPods at people changing.
> Is there any indication that you'll be able to get a video feed from airpods?
You're just repeating the same "internal processing" point you've made, which I've already pointed out isn't a legal or practical difference in the real world.
>You're just repeating the same "internal processing" point you've made, which I've already pointed out isn't a legal or practical difference in the real world.
You're not answering my question. Do you or do you not think that face id phones are facing legal obstacles because they also have cameras that randomly turn on for "internal processing"? If face id phones get a pass, why don't you think airpods would get a pass?
IR cameras, not visible light cameras, so this should be a non-issue.
Have you never used an IR camera? IR can see more than visible-light cameras can, including through layers of people's clothing, and can certainly see a detailed image of a naked human.
So it's more of an issue in that case; and the laws I'm talking about don't have an "IR camera" exception.
I wonder what this would do to battery life -- continuously-on IR cameras are going to be a significant power draw. And then there's the question of whether the video processing is done on the earbuds, or how much Bluetooth bandwidth is used sending the video stream to your phone for processing.
Using this to detect gestures does seem very cool, however. Seems like a fascinating engineering challenge.
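The power question invites a back-of-envelope estimate. Everything below is an assumed figure for illustration (earbud battery capacities and IR sensor draws aren't published specs), but it shows how hard an always-on sensor hits a battery this small:

```python
# Back-of-envelope battery estimate for an always-on IR camera in an
# earbud. Every number here is an illustrative assumption, not a spec.

BATTERY_MAH = 50       # assumed earbud battery capacity (mAh)
VOLTAGE_V = 3.7        # nominal Li-ion cell voltage (V)
AUDIO_MW = 30          # assumed baseline draw for audio playback (mW)
IR_CAMERA_MW = 40      # assumed draw for a low-res IR sensor + processing (mW)

battery_mwh = BATTERY_MAH * VOLTAGE_V  # capacity in mWh

hours_audio_only = battery_mwh / AUDIO_MW
hours_with_camera = battery_mwh / (AUDIO_MW + IR_CAMERA_MW)

print(f"audio only:     {hours_audio_only:.1f} h")
print(f"audio + IR cam: {hours_with_camera:.1f} h")
```

Under these assumptions, playback time drops from roughly six hours to under three, which is why duty-cycling the sensor (waking it only when, say, the accelerometer detects a raised hand) would likely be essential.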
>Using this to detect gestures does seem very cool, however.
Don't know, sounds totally useless, like most on-air gesture interfaces.
It would be incredibly useful for scrubbing through commercials during podcasts, if you could just pinch your fingers and drag through the air. Infinitely better than double-pinching the stem 15 times in a row.
Finer volume control. Different gestures for fast forward vs next track. Additional gestures e.g. for liking the current track.
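What makes these gestures different from today's stem presses is that they could carry a magnitude. A toy sketch of what that dispatch might look like on the host side (the gesture names and actions are entirely invented for illustration, not any Apple API):

```python
# Hypothetical gesture dispatch for earbud media controls.
# Gesture names, magnitudes, and actions are invented for illustration.

class Player:
    def __init__(self):
        self.volume = 0.5      # 0.0 .. 1.0
        self.position_s = 0.0  # playback position in seconds
        self.liked = False

player = Player()

def pinch_drag_x(p, dx):
    # Continuous scrub: 30 s of audio per unit of horizontal drag.
    p.position_s = max(0.0, p.position_s + dx * 30)

def pinch_drag_y(p, dy):
    # Fine-grained volume, clamped to [0, 1].
    p.volume = min(1.0, max(0.0, p.volume + dy))

def double_tap_air(p, _):
    p.liked = True  # e.g. "like the current track"

GESTURES = {
    "pinch_drag_x": pinch_drag_x,
    "pinch_drag_y": pinch_drag_y,
    "double_tap_air": double_tap_air,
}

def on_gesture(name, magnitude=0.0):
    GESTURES[name](player, magnitude)

on_gesture("pinch_drag_x", 2.0)   # drag right: skip 60 s of ads in one motion
on_gesture("pinch_drag_y", 0.05)  # nudge the volume up slightly
on_gesture("double_tap_air")
print(player.position_s, round(player.volume, 2), player.liked)
```

The point of the continuous gestures is that one drag replaces fifteen discrete stem presses, and the magnitude gives you the fine control the stem can't.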
Doesn't sound useless to me at all.
Reminds me of when Google introduced radar on their Pixel phones, called Project Soli: https://research.google/blog/soli-radar-based-perception-and.... I have a feeling these will be about as successful. It's a solution in search of a problem, in my eyes.
I still own a Google device with that tech in it (Home Display), and, yeah, it isn't useful. They just hide certain UI elements until your hand gets close, which is obnoxious and feels like they invented something and then invented a usage to justify it.
UI should be consistent; consistency lets users build muscle memory. This "hide stuff until you're 20 cm away" stuff is the antithesis of that (and of all good design in general).
Would be quite handy for gesture control. When wearing thick gloves you need to take them off to operate the current AirPods.
This was a solved problem in the 1st and 2nd generation of AirPods with tap controls[1]. I'm still surprised that they removed that feature in favor of pressure, although now that I'm reflecting more on it, I wonder if it's part of Apple using their manufacturing and engineering as a moat[2]. i.e. Tap controls are relatively easy, so once wireless earbuds became commodities, they had to figure out some way to differentiate themselves.
That said, as someone who does pottery (messy hands), wears gloves/hats (stuff in the way), and has relatively poor fine motor control, I guess I welcome any solution that doesn't mean getting clay or cold air in my hair/ear.
The battery consumption and latency of the IR cameras will be interesting though. Too sensitive, and you'll eat up your battery. Not sensitive enough, and UX suffers.
1: https://support.apple.com/en-us/102628 2: https://news.ycombinator.com/item?id=45186975
Depending on where the cameras are placed, they can be used to work out the pose (i.e., what position) the upper body is in.
Meta showed that this was possible from cameras mounted on glasses.
However, the power required to do that is quite high (30-60 mW just to detect, more to do the pose extraction).
So I suspect it's just hand recognition.
Are accelerometers not sufficient for pose determination? I would assume they'd work as well as cameras if not better.
For your head, more than enough; for the position of the arm you need something else.
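For head orientation the parent is right: with the earbud at rest, pitch and roll fall straight out of the gravity vector, no camera needed. A minimal sketch of the standard tilt computation (the readings are made up; yaw, and arm position entirely, genuinely need another sensor):

```python
# Static tilt (pitch/roll) from a 3-axis accelerometer measuring gravity.
# Yaw is unobservable this way, and arm pose is unobservable entirely,
# which is why cameras enter the picture for body pose at all.

import math

def tilt_from_accel(ax, ay, az):
    """Return (pitch, roll) in degrees from accelerometer axes, in g."""
    pitch = math.degrees(math.atan2(-ax, math.hypot(ay, az)))
    roll = math.degrees(math.atan2(ay, az))
    return pitch, roll

# Level: gravity entirely on the z axis, so both angles are ~0.
p0, r0 = tilt_from_accel(0.0, 0.0, 1.0)
print(p0, r0)

# Pitched down 30 degrees (illustrative reading).
pitch, roll = tilt_from_accel(-0.5, 0.0, math.sqrt(3) / 2)
print(round(pitch, 1), round(roll, 1))  # 30.0 0.0
```

This is why "just use the IMU" covers head tracking but not the hand gestures the cameras are rumored to target.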
How does this work out for 'observers' who are getting their windows broken, ripped out of cars?
Are we going to have to take our airpods off when we go in the bathroom now?
Seems like a negative tradeoff.
So in Japan, will the audio play on the outside of the headphone?
What? Where did you get that from?
I suspect it could be due to Japan requiring that cameras play a sound when taking a photo or video. Here's some background:
https://japandaily.jp/why-you-cant-turn-off-the-camera-shutt...
> Japan’s requirement for an audible camera shutter sound isn’t just a quirky design decision — it’s a deliberate policy meant to prevent secret photography.
>I suspect it could be due to Japan requiring that cameras play a sound when taking a photo or video.
That depends on what "taking a photo or video" means. If it only covers making a recording, then it won't apply to AirPods. The same applies to Face ID, for instance: I doubt Japanese iPhones make a shutter sound every time you pull out your phone, even though that obviously uses the camera.
Looks like it will be an infrared camera, so perhaps that changes the requirement?
Doesn't that just make it worse? It can't see color, but instead it can potentially see through thin clothing
That law regulates captured images. It doesn't require continuous shutter sounds while the phone is processing or even displaying the camera input, only once an image is captured. It seems unlikely that the AirPods will let users capture the IR images used for gestural control and environmental awareness by system functions.
The law doesn't make a distinction for what the AirPods' internal processing does. Infrared definitely violates people's privacy and dignity.
But that’s unrelated to the shutter sound law
A shutter sound I kinda get, since that's simple to implement, but how would an audible signal for video even work?
This
So you want to get AirPods banned everywhere… genius
I don't believe these Facebook Ray-Ban / modern Google Glass things are banned everywhere, so why would this one be?
It is interesting to see camera-in-AirPods as a rumor instead of wireless-radio-as-camera [0] to detect the same kinds of gestures. Maybe it is less power- and volume-intensive to add very limited cameras than to up the processing power enough to run inference on the radio signals?
I suppose the "camera" could be as simple as an optical flow sensor [1], like those commonly used in mice and quadcopters, placed behind the black plastic so there would be no visible lens [2].
0. https://web.ece.ucsb.edu/~ymostofi/WiFiReadingThroughWall
1. https://ieeexplore.ieee.org/document/10164626
2. https://www.schneier.com/blog/archives/2012/07/camera-transp...
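To make the optical-flow idea concrete, here is a toy version of what such a sensor computes: the shift that best aligns two consecutive low-resolution frames. The 2x4 frames are invented for illustration; a real mouse-style sensor does essentially this in silicon at kilohertz rates.

```python
# Toy optical-flow estimate: find the integer horizontal shift between
# two tiny grayscale frames by minimizing sum-of-absolute-differences.

def sad(a, b):
    # Sum of absolute differences between two equal-sized frames.
    return sum(abs(x - y) for row_a, row_b in zip(a, b)
               for x, y in zip(row_a, row_b))

def shift(frame, dx):
    # Shift columns by dx, padding with zeros at the edge.
    w = len(frame[0])
    return [[row[c - dx] if 0 <= c - dx < w else 0 for c in range(w)]
            for row in frame]

def estimate_dx(prev, curr, max_shift=2):
    # Pick the shift whose shifted prev frame best matches curr.
    return min(range(-max_shift, max_shift + 1),
               key=lambda dx: sad(shift(prev, dx), curr))

# A bright blob moves one pixel to the right between frames.
prev = [[0, 9, 0, 0],
        [0, 9, 0, 0]]
curr = [[0, 0, 9, 0],
        [0, 0, 9, 0]]

print(estimate_dx(prev, curr))  # 1 -> interpret as a rightward swipe
```

A host could then map a sustained positive dx to a "swipe right" gesture. Whether the rumored parts are flow sensors like this or true imagers is exactly what the rumor leaves open.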
Can we get a bloody better sound driver, protection from ANC feedback shriek, one or two physical buttons (or at least raised haptic buttons, not just a touch-sensitive stem area), and a battery level indicator on the case?
We don't want no fucking infra cameras for "better hand gestures and enhanced spatial audio experience".
There was an old video about a new app for your BlackBerry that would show you what's on the other side of your BlackBerry. At the time we believed this to be a joke rather than a product development brief.
[flagged]
Would you be kind enough to explain your gut reaction here with logical arguments as to why this is definitely not a feature that would ever be released?
Why do you think they don't want to add gesture controls?
Is it possible for tech to "jump the shark"?
Yes. It happens all the time. You would usually call the features "gimmicks".
Have you seen anything that Microsoft has done in the past 10 years?
I suspect most devices will have cameras and mics on them and will mesh connect as a collective system. OpenAI is most likely working on a suite of devices that would fit this "regalia" of sorts.