A few weeks ago, while listening to music and developing an app with my AirPods, I noticed their spatial audio feature. I started thinking about what else could be done with reverse engineering, including the possibility of using AirPods as a motion controller. I built the world's first AirPods-controlled game: a motorcycle driven by head movements. In fact, it's not just for AirPods; it's a game that uses a headset as a motion controller. And today it was approved on the App Store. Who knows, maybe the AirPods Arcade era has begun? :)
This is an amazing idea! Even more surprising is the fact that Apple approved it; usually they're not fans of features being used in ways they weren't intended for...
Very curious how you get the motion data from the AirPods. I read the app description and noticed that no modification to the AirPods is needed.
Apple has public developer-accessible Core Motion APIs for this
https://developer.apple.com/documentation/coremotion/cmheadp...
Fun idea!
This is amazing! I'm about to waste my entire afternoon playing this. Feature request: nod to restart
Reminds me of when the first iPhones came out and developers got very creative with the features available at the time: flashlight apps, bubble-level apps, Asphalt 4.
First off, kudos and congrats on the launch; it seems like a fun idea! I'm curious, since you mentioned reverse engineering: how difficult was it to retrieve the raw gyroscope data from the AirPods? AFAIK there is no API to access this information, right?
If you're looking to develop based on raw or processed gyroscope data then Core Motion is helpful
https://developer.apple.com/documentation/coremotion
"For example, a game might use accelerometer and gyroscope input to control onscreen game behavior."
The part specifically about obtaining headphone orientation is here (added in iOS 14):
https://developer.apple.com/documentation/coremotion/cmheadp...
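For anyone wondering what that API looks like in practice, here's a minimal sketch using `CMHeadphoneMotionManager`. It only works on a real device paired with supported AirPods, and you need the `NSMotionUsageDescription` key in Info.plist (which would explain the permission prompt mentioned elsewhere in this thread):

```swift
import CoreMotion

// Minimal sketch: read head orientation from AirPods via Core Motion.
// Requires iOS 14+, supported AirPods (Pro/Max/3rd gen), and
// NSMotionUsageDescription in Info.plist.
let manager = CMHeadphoneMotionManager()

func startHeadTracking() {
    guard manager.isDeviceMotionAvailable else {
        print("Headphone motion data not available")
        return
    }
    manager.startDeviceMotionUpdates(to: .main) { motion, error in
        guard let motion = motion else { return }
        // Attitude is the wearer's head orientation in radians.
        let yaw = motion.attitude.yaw      // turn left/right -> e.g. steering
        let pitch = motion.attitude.pitch  // nod up/down
        print(String(format: "yaw %.2f  pitch %.2f", yaw, pitch))
    }
}
```

Mapping `yaw` to a steering input is presumably all a game like this needs; no reverse engineering required.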
Put a sine wave emitter (or multiple) on the scene. Enable head tracking. Analyze stereo sound at the output. Mute output. There you go: you now can track user’s head without direct access to gyroscope data.
Apple does not secretly analyze sine waves to infer head motion. Instead, AirPods Pro, AirPods Max, and 3rd-gen AirPods include actual IMUs (inertial measurement units), and iOS exposes their readings through Core Motion.
What you describe resembles a known research technique called acoustic motion tracking (some labs use inaudible chirps to locate phones or headsets), but it's not how AirPods head tracking works.
I think they're talking more about measuring the attenuation Apple applies for the "spatial audio" effect (after Apple has done all of the fancy IMU tracking for you): with a known input amplitude and the ability to programmatically monitor the signal after the effect, you could reverse engineer a crude estimated angle from the delta between the two.
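As a toy illustration of that idea: if you assume a simple constant-power pan law (left = cos θ, right = sin θ), a crude azimuth falls out of the measured per-channel levels. Apple's spatial audio actually uses HRTF filtering, not simple panning, and there's no public way to tap the post-spatialization output, so this is purely a sketch of the geometry, not something that would work against real AirPods:

```swift
import Foundation

// Toy sketch: recover a pan angle from left/right channel levels,
// assuming a constant-power pan law (L = cos θ, R = sin θ).
// Real spatial audio uses HRTFs, so any real estimate would be far noisier.
func estimatedPanAngle(leftRMS: Double, rightRMS: Double) -> Double {
    // θ = 0 means fully left, π/2 fully right, π/4 dead center.
    return atan2(rightRMS, leftRMS)
}

// A source panned dead center under this law (both channels at 1/√2):
let centered = estimatedPanAngle(leftRMS: 0.707, rightRMS: 0.707)
print(centered) // ≈ π/4
```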
I don't think that's how this app works, though; after installing it I got a permission prompt for motion tracking.
Looks like there is an API for this. Here's an example: https://github.com/tukuyo/AirPodsPro-Motion-Sampler/blob/8ac...
Is it possible for an app to access the actual output played by the AirPods after 3D audio is applied?
I see that global leaderboard… over 6000! This is great! Gives me old-school arcade vibes and, obviously, Need for Speed.
Love it - quite a unique idea!
Can I ask about the tech stack? What did you use to build it: just plain Swift and SceneKit? I did notice the app download is over 100 MB, which seems a bit excessive for the gameplay.
Can one also use AirPods for e.g. Google Maps navigation?
I love this! It’s so reminiscent of the way we all would swerve our bodies when playing these types of games as a kid.
(Flying would be amazing.)
Amazing that it has taken this long for someone to do this.
This is so dumb - I love it. So much of modern computing is monetised and sanitised.
It’s so cool to see an interesting toy-like tech demo that does something new and different.