How your ear just became the most exciting place in technology

By Oliver Smith 14 March 2017

Welcome to the weird world of ‘hearables’.

Your ear is a marvellous place.

Not only do our ears help us hear and understand the world but, over the past few years, they’ve become the focus for a feverish development of technology.

Today Apple and Samsung aren’t just competing to own the smartphone in your pocket and the smartwatch on your wrist – they’re also battling to build the computer in your ear.

Indeed, Apple’s AirPods and Samsung’s Gear IconX might present themselves merely as wireless earbuds, but there’s a far bigger fight going on.

Enter your ear

Hang on, you might be thinking, what’s so special about my ear?

Quite a lot, it turns out.

There are biological reasons why your ear is a good place to put lots of sensors.

The ear is obviously a great place to pick up any speech or sounds you’re making, but sensors there can also track your steps and the turn of your head as you cycle or even swim.

Then there’s the pulse that can be measured from the tiny capillaries in your ear.

“Even measuring which direction your eyes are looking can be done by sensing the muscles that are in your ear,” Nikolaj Hviid, CEO of Bragi, told The Memo at Mobile World Congress earlier this month.

Your ear is a fascinating place, and an especially well-designed spot for a tiny bud of sensors.


Four years ago Hviid and his team at Bragi began working on what he calls the first ‘audio contextual computer’, or ‘hearables’ as they’re coming to be known.

“In essence we’re making an operating system for your ear.”

After a record-breaking crowdfunding campaign, Bragi released The Dash last year: a €299 pair of Bluetooth earbuds that do far more than play music.

With 27 sensors, 4 GB storage, a tiny processor and an advanced system of speakers and microphones, The Dash is a highly advanced micro-computer, albeit one that fits inside your ear.

As well as fitness tracking and the ability to play music from your phone or The Dash’s in-built storage, you can answer a call from your smartphone with a nod, or swipe a finger across an earbud to change the volume or skip a song.

With the array of microphones in each earbud, The Dash also offers a feature called ‘audio transparency’, which blends the song or walking directions you’re listening to with the sounds around you, keeping you attuned to your surroundings.

This, says Hviid, really demonstrates the potential of an ‘audio contextual computer’.

The sound of computing

Screens make glorious, colourful, bright, distracting computers.

On the way to work you’ll probably see dozens of smartphone zombies walking down the road or sitting mindlessly on the train.

In industry jargon, a smartphone’s screen is a ‘serial immersive interface’: when you look at it, everything else is gone.

It’s so hard to tear your eyes away from that you’ll walk yourself into danger without even noticing.

Hearing is the complete opposite.

“Hearing is beautiful because it’s a ‘parallel discrete interface’,” says Hviid.

“You can hear me, you can hear someone outside, you can hear a scratching noise or someone knocking on the door, all at the same time.”

You may have sat in a noisy pub, music playing, talking to a friend, yet still been able to hear all these things and switch your focus as needed.

This is the start of an operating system for your ear. It’s the beauty of how our brains gracefully handle sound, and it’s why so many players, from Apple and Samsung to Bragi, are scrambling to put their own computer in your ear.

But this is only the start.

As voice assistants in our homes, like Amazon’s Alexa, open people up to the idea of controlling a computer with their voice, putting those computers in your ear, along with dozens of sensors, is the next logical step.

That’s why your ear just became the most exciting place in technology.

Are you listening?