Artificial Intelligence by Apple by Air?


After only a few days with Apple’s wireless AirPod headphones, it’s clear that there will be a huge platform business based on the reliable, persistent availability of a contextual artificial intelligence that can talk to you and receive commands.

That platform will benefit Apple first, but it will then expand — along with Siri — to developers and startup companies.

The key bits of that, of course, are “reliable” and “persistent.” It is very similar to the way that the increasing sophistication of push notifications and the Apple Watch are making opening apps more optional than ever. If you know an iPhone user will have AirPods in their ears and can access the services you offer via Siri at any moment, then you’ve got a powerful new conduit to that user.

It’s not well known outside the industry, but a couple of years ago Apple switched away from its voice provider, Nuance, and fired up its own internal voice team. Not coincidentally, Siri’s second major revision shipped a few months later, bringing a big jump in reliable recognition of commands. There’s still a lot of room to grow, but it’s improving.

Then, earlier this year at its developer conference, Apple shipped the first wave of SiriKit to developers, allowing a handful of categories of apps to offer their services up to Siri so that users can command them by voice. It was a lot less robust than some had hoped, but Apple will iterate on it. And there are competitors out there, like Viv, that are also pushing the power boundaries of these interconnected contextual systems.

But before Apple’s AI becomes a true audio platform, it needs hardware that makes it easier to put Siri in your ear — and no real reason to take it out. Enter the AirPods.


Interacting with the AirPods is super slick. Inserting them into your ears triggers an audio cue that says they’re on and activated (they turn off when out of your ears to conserve battery, with the W1 chip monitoring the sensors). Taking a single bud out of your ear pauses your audio, and re-inserting it starts the audio again. You can also insert just one and use it like a phone headset; they work independently of one another.

To control them, you double-tap gently (tapping hard will likely leave you with a sore tragus) to bring up Siri, then give Siri your commands. Double-tapping while the phone is ringing answers the call. That’s it; there are no other controls.

Because everything happens via Siri, I was left feeling that the opportunity cost was too high for minor interactions like adjusting volume or advancing tracks. I’ve been forcing myself to use Siri, but I think many people will be reaching into their pockets to make those adjustments at first.

There’s going to be a big cultural adjustment here, both in getting used to seeing these cordless buds hanging out of people’s ears like a postmodern Ceti eel and in people getting comfortable talking out loud to Siri for their every desire.

I did find, however, that speaking commands sotto voce — not whispering, but in a low register — worked just fine. The two beamforming microphones and the accelerometer that detects when your jaw is moving make this one of the best in-ear microphone options I’ve used.

I can’t help thinking that “AI voice,” speaking low to your personal thinking machine, will become a thing as this kind of system becomes more commonplace.

There is so much interest in the AirPods. Be sure to read the full iPhone 7/7 Plus review from TechCrunch.