> The input mechanism I describe doesn’t have to be a physical button. In fact, gesture-based inputs might be even more convenient. If AirPods had built-in accelerometers, users could interact with audio content by nodding or shaking their heads. Radar-based sensors like Google’s Motion Sense could also create an interesting new interaction language for audio content.
> You could also think about the Apple Watch as the main input device. In contrast to the AirPods, Apple opened the Watch for developers from the start, but it hasn’t really seen much success as a platform. Perhaps a combination of Watch and AirPods has a better chance of creating an ecosystem with its own unique applications?
All AirPods do in fact contain an accelerometer, and the Pro add a gyroscope as well. However, I'm not aware of these sensors being opened to developers, so they're of no use for anything beyond Apple's own features unless Apple decides to expose them or builds the feature itself.
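If the motion data ever were exposed, the nod detection described in the quoted post wouldn't need much: watch the pitch stream for a down-then-up swing within a short window. Here's a minimal, platform-agnostic sketch of that idea (all function names and thresholds are illustrative, not any real API):

```python
def detect_nod(pitch_samples, threshold=0.3, window=20):
    """Detect a nod in a stream of pitch readings (radians, negative =
    head tilted down): the pitch dips below -threshold and recovers
    above -threshold/2 within `window` samples. Thresholds are
    illustrative and would need tuning against real IMU data."""
    for i, p in enumerate(pitch_samples):
        if p < -threshold:  # head dipped down far enough
            # look for the head coming back up within the window
            for q in pitch_samples[i + 1 : i + 1 + window]:
                if q > -threshold / 2:
                    return True
    return False
```

In practice you'd feed this from the earbuds' IMU at something like 50 Hz and debounce repeated triggers, but the core gesture is just this dip-and-recover pattern.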
Apple has accelerometers in the Pencil too, and they kind of suck. On the AirPods the double-tap action isn't too bad, but on the Pencil it's horrible: it's a difficult gesture to make while you're actually using it, and I trigger it accidentally all the time. I'd much prefer a Wacom-style button bar.