A bicycle for the senses
For the past seven decades, computers have been primarily designed to enhance what your brain can do — think and remember. New kinds of computers will enhance what your senses can do — see, hear, touch, smell, taste.
The term spatial computing is emerging to encompass both augmented and virtual reality. I believe we are exploring an even broader paradigm: sensory computing. The phone was a keyhole for peering into this world, and now we’re opening the door.
In the early days of Apple, Steve Jobs was fond of describing the computer as a bicycle for the mind:
I read a study that measured the efficiency of locomotion for various animals across the planet. The condor used the least amount of energy to move one kilometer and humans came in with a rather unimpressive showing about a third of the way down the list. So that didn’t look so good, but then someone at Scientific American had the insight to test the efficiency of locomotion for a man on a bicycle. And a man on a bicycle blew the condor away, completely off the top of the charts. That’s what a computer is to me. A computer is the most remarkable tool that we’ve ever come up with and it’s the equivalent of a bicycle for our minds.
Because of my hybrid training in biology and industrial design, Steve Jobs’s analogy always spoke to me. It made me wonder: what would a bicycle for the senses be like?
A bicycle for your ears
The first mass-market bicycle for the senses was Apple’s AirPods. Their noise cancellation and transparency modes replace and enhance your hearing.
Earbuds are turning into ear computers that will become more easily programmable. This can enable many more kinds of hearing. For example, instantaneous translation may soon be a reality, akin to the Babel fish from The Hitchhiker’s Guide to the Galaxy:
The Babel fish is small, yellow, leech-like, and probably the oddest thing in the Universe. It feeds on brainwave energy received not from its own carrier, but from those around it. It absorbs all unconscious mental frequencies from this brainwave energy to nourish itself with. It then excretes into the mind of its carrier a telepathic matrix formed by combining the conscious thought frequencies with nerve signals picked up from the speech centres of the brain which has supplied them. The practical upshot of all this is that if you stick a Babel fish in your ear you can instantly understand anything said to you in any form of language. The speech patterns you actually hear decode the brainwave matrix which has been fed into your mind by your Babel fish.
One can imagine other kinds of hearing enhancements. Similar to hearing aids, specific frequencies could be fine-tuned to accommodate hearing loss. But what if you could see like a bat? By integrating earbuds with a headset, a sensory computer could translate what you can’t see into verbal descriptions you can interpret. As Thomas Nagel wrote in “What Is It Like to Be a Bat?”:
I have said that the essence of the belief that bats have experience is that there is something that it is like to be a bat. Now we know that most bats (the microchiroptera, to be precise) perceive the external world primarily by sonar, or echolocation, detecting the reflections, from objects within range, of their own rapid, subtly modulated, high-frequency shrieks. Their brains are designed to correlate the outgoing impulses with the subsequent echoes, and the information thus acquired enables bats to make precise discriminations of distance, size, shape, motion, and texture comparable to those we make by vision.
We are advancing towards a set of technologies that will expand and personalize our individual umwelt.
A bicycle for your eyes
Headset displays connect sensory extensions directly to your vision. Equipped with sensors that perceive beyond human capabilities, and access to the internet, they can provide information about your surroundings wherever you are.
Until now, visual augmentation has been constrained by the tiny display on our phone. By virtue of being integrated with your eyesight, headsets can open up new kinds of apps that feel more natural.
Every app is a superpower. Sensory computing opens up new superpowers that we can borrow from nature. Animals, plants and other organisms can sense things that humans can’t:
- Snakes sense heat to locate prey
- Birds sense magnetic fields to guide their migrations
- Eagles have visual acuity several times sharper than ours
- Cats see in the dark
- Sharks sense electrical currents
- Jellyfish detect ocean currents
- Chameleons have nearly 360-degree vision
- Ticks smell butyric acid to find mammals
How could these superpowers be useful to humans in daily life?
We can take nature’s superpowers and expand them across many more vectors that are interesting to humans:
- Across scale — far and near, binoculars, zoom, telescope, microscope
- Across wavelength — UV, IR, heatmaps, night vision, Wi-Fi, magnetic fields, electrical and water currents
- Across time — view historical imagery, architectural, terrain, geological, and climate changes
- Across culture — experience the relevance of a place in books, movies, photography, paintings, and language
- Across space — travel immersively to other locations for tourism, business, and personal connections
- Across perspective — upside down, inside out, around corners, top down, wider, narrower, out of body
- Across interpretation — alter the visual and artistic interpretation of your environment, color-shifting, saturation, contrast, sharpness
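To make one of these layers concrete, here is a minimal sketch of how a wavelength layer might turn a thermal reading into a heatmap color. The 0 to 40 °C range and the blue-to-red ramp are illustrative assumptions, not any real device’s calibration:

```python
def temperature_to_rgb(temp_c, min_c=0.0, max_c=40.0):
    """Map a temperature reading to a blue-to-red heatmap color.

    Coldest readings render blue, hottest render red; the range
    is an arbitrary example, not a real sensor specification.
    """
    # Clamp and normalize the reading to [0, 1].
    t = max(0.0, min(1.0, (temp_c - min_c) / (max_c - min_c)))
    # Linearly interpolate blue (0, 0, 255) -> red (255, 0, 0).
    return (round(255 * t), 0, round(255 * (1 - t)))
```

Any of the other layers — night vision, magnetic fields, elevation — would be a different mapping from sensor data to pixels, but the same basic shape.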
Every domain becomes a layer or a lens through which you can sense the world:
- Geography, terrain, elevation
- Biology, chemistry, physics, astronomy
- Structural engineering, architecture, interior design
- Sports, fitness
- Real estate, shopping
What happens when you put on a headset and open the “Math” app? How could seeing the world through math help you understand both the math and the world better?
A bicycle for your nose?
We are still in the early days of spatial and sensory computing, but you may be surprised by how quickly these new capabilities will evolve.
We may be closer to creating an electronic nose than you might imagine. Researchers are finding that neural networks may open up new ways we can digitize smells. It may sound far-fetched, but converting olfactory patterns into visual patterns could open up some interesting applications. Perhaps a new kind of cooking experience? Or new medical applications that convert imperceptible scents into visible patterns?
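As one hedged sketch of what that conversion might look like: suppose a hypothetical electronic-nose array reports one response per chemical sensor. The strongest channel could pick a hue and the overall signal strength a brightness. Nothing below reflects a real olfactory encoding; it only illustrates the idea of rendering a smell as something you can see:

```python
import colorsys

def smell_to_color(sensor_readings):
    """Render a hypothetical e-nose reading as a single RGB color.

    sensor_readings: one response value per chemical sensor channel.
    The dominant channel picks the hue; total signal strength sets
    the brightness. Purely illustrative, not a real encoding.
    """
    total = sum(sensor_readings)
    if total == 0:
        return (0, 0, 0)  # no detectable odor -> black
    # Hue: position of the strongest channel around the color wheel.
    strongest = max(range(len(sensor_readings)),
                    key=sensor_readings.__getitem__)
    hue = strongest / len(sensor_readings)
    # Brightness: average signal strength, clamped to [0, 1].
    value = min(1.0, total / len(sensor_readings))
    r, g, b = colorsys.hsv_to_rgb(hue, 1.0, value)
    return (round(255 * r), round(255 * g), round(255 * b))
```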
Advances in haptics may open up new kinds of tactile sensations. A kind of second skin, or softwear, if you will. Consider that Apple shipped a lost-item finder that vibrates more strongly as you get closer. What other kinds of data could be translated into haptic feedback?
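In the spirit of that lost-item feature, here is a hypothetical distance-to-vibration mapping. The inverse-linear ramp and 10-meter range are arbitrary assumptions, not how any shipping device actually works:

```python
def haptic_intensity(distance_m, max_range_m=10.0):
    """Map distance to a lost item onto a vibration intensity in [0, 1].

    Intensity ramps up linearly as you get closer; beyond
    max_range_m the motor stays silent. Illustrative only.
    """
    if distance_m >= max_range_m:
        return 0.0
    return 1.0 - (distance_m / max_range_m)
```

Swap distance for any other scalar — air quality, account balance, a child’s location relative to home — and the same mapping turns abstract data into something you can feel.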
Sensory computing opens up many new questions that I am curious to explore, and see explored.
This essay is a revised compilation of personal notes written between 2011 and 2013 while working on a headset that never shipped. Apple’s Vision Pro inspired me to dust off these ideas and update them.