You Think Apple AirPods Look Stupid? Join the Club.
I have to say, the first time I saw someone wearing these I thought to myself, boy, that looks dumb. Of course, I was also one of the people who said Google Glass looked dumb. If you think about it, Google Glass and Apple AirPods have a lot in common.
They both look “different,” but in some ways they moved in the same direction: both were augmented reality devices. Where Google went wrong, though, Apple went right.
Glass made you look like the Terminator. These make you look like someone chopped the cord off your headphones. When you look at someone head-on with these in, it almost looks like they are wearing earrings. I foresee a huge market in dressing them up somehow – I bet some enterprising folks out there are already thinking about how to skin these things. Now, some of you Apple fanboys and fangirls are probably saying, “How can you compare these beautifully designed devices to something as ugly as Google Glass?”
It’s simple. Both are augmented reality devices. Glass attempted to augment your reality by dropping things into your visual path. AirPods work in what I believe is the next hot space: audio augmented reality.
I love science fiction, and not just Star Trek. Sci-fi authors really are futurists, mapping out possible futures by envisioning where today’s technology might lead. In this case, my sense is that the folks at Apple watched the movie Her and realized that was the future they wanted for us. If you ask me, it’s the right way to go.
If you haven’t seen the movie, let me recap the key piece of technology for you. I won’t go into plot details in case you would like to watch it – and as an innovator, I highly recommend that you do (one of these days I need to write a list of “awesome movies for innovators and inventors to watch” – I’d better put that on my to-do list). The key technology in the film is a highly advanced “audio chatbot,” an AI you can purchase and interact with to help you live your life, do things, and basically assist you. A super virtual assistant, so to speak. The interface is not Google Glass-like – it works via an earbud. The hero of our tale, and seemingly most people in the movie, use a single earbud to interact with their virtual assistant. Instead of using a high-tech visual interface, like Tony Stark in Iron Man or John Anderton in Minority Report, they use an audio interface. Like any chatbot, such as Amazon Alexa, you ask things and it responds. In Her, instead of sitting in a speaker in your house, the assistant is always with our hero, in his ear, whispering answers to his questions, joking with him, and helping him live his life.
Some people might say: so what? It’s just another chatbot. Chatbots are a pain to use. We will all eventually be using visual interfaces anyway.
I disagree, and here’s why AirPods are such a big deal:
- They were designed to be worn all the time. You should leave them in, just like you would leave earrings on your ears. The sensors on the devices can tell when you are talking and when you are listening. The five-hour listening time (and 24 hours of battery with the charging case) is tuned to having them in your ears all day, just like in Her. Coupled with instant access to Siri, they provide instant answers to anything she can respond to. While Siri is pretty rudimentary at this point, as time goes by and Apple continues to gather data, she will get more and more sophisticated. I see no reason why she can’t get close to, or reach the same level as, Samantha, the AI in Her.
- They don’t sound all that great for music. My guess is that they were designed more as interfaces to Siri than as music players. While they can play music, their primary purpose in life is to be an always-on interface to Siri. In short, they are designed to be Apple’s version of Alexa, except totally portable, as befits our mobile lifestyle.
- As an audio-based augmented reality chatbot, they let us receive information and ask questions without the interruptive nature of a visual interface. For example, let’s say you are talking to someone. If you need to get an answer using a typical interface, or an augmented reality device like Google Glass, Microsoft HoloLens, or even the LaForge optical glasses, you have to bring the device between you and the other person. You have to divert your attention away from the conversation in order to look at the device. Staring at devices has become one of the major reasons many couples have communication and intimacy issues. Remove the visual requirement, and people can talk to people once again. This is a key differentiator between AirPods and Google Glass.
AirPods are a great first step toward a complete ecosystem of audio-based, portable virtual assistants that can help us live our lives and do all of the time-wasting stuff computers are frankly better at – and we can direct them to do these things through an interface that doesn’t force us to break away from our own human-to-human conversations. It’s possible that as these progress, visual interfaces like the screen on your phone or your Apple Watch will become less and less important, and maybe even no longer necessary.