Can we go beyond 5-sense based humanity?

David Eagleman

We are built out of very small stuff, and we are embedded in a very large cosmos, and the fact is that we are not very good at understanding reality at either of those scales, and that’s because our brains haven’t evolved to understand the world at that scale… Instead, we’re trapped on this very thin slice of perception right in the middle. But it gets strange, because even at that slice of reality that we call home, we’re not seeing most of the action that’s going on…our brains are sampling just a little bit of the world.

Across the animal kingdom, different animals pick up on different parts of reality. So in the blind and deaf world of the tick, the important signals are temperature and butyric acid; in the world of the black ghost knife fish, its sensory world is lavishly colored by electrical fields; and for the echolocating bat, its reality is constructed out of air compression waves.

That’s the slice of their ecosystem that they can pick up on, and we have a word for this in science. It’s called the umwelt, which is the German word for the surrounding world. Now, presumably, every animal assumes that its umwelt is the entire objective reality out there.

The brain figures out what to do with the data that comes in. And when you look across the animal kingdom, you find lots of peripheral devices. So snakes have heat pits with which to detect infrared, and the ghost knife fish has electroreceptors, and the star-nosed mole has this appendage with 22 fingers on it with which it feels around and constructs a 3D model of the world, and many birds have magnetite so they can orient to the magnetic field of the planet. So what this means is that nature doesn’t have to continually redesign the brain. Instead, with the principles of brain operation established, all nature has to worry about is designing new peripherals.

So here’s the concept…as I’m speaking, the sound of my voice is getting captured by the tablet, and then it’s getting mapped onto a vest that’s covered in vibratory motors, just like the motors in your cell phone. So as I’m speaking, the sound is getting translated to a pattern of vibration on the vest. Now, this is not just conceptual: this tablet is transmitting Bluetooth, and I’m wearing the vest right now. So as I’m speaking, the sound is getting translated into dynamic patterns of vibration. I’m feeling the sonic world around me.
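
To make the vest idea concrete, here is a minimal sketch, not Eagleman's actual implementation, of how live audio could be turned into a pattern of vibration: the sound is split into frequency bands and each band's energy drives one motor. The motor count, sample rate, and the `set_motor_intensity` driver call are all assumptions for illustration.

```python
# Hypothetical sketch: map live audio frames onto an array of vest motors.
import numpy as np

NUM_MOTORS = 32          # assumed number of vibratory motors on the vest
SAMPLE_RATE = 16_000     # assumed audio sampling rate in Hz
FRAME_SIZE = 512         # samples per analysis frame

def set_motor_intensity(index: int, intensity: float) -> None:
    """Placeholder for the Bluetooth command that sets one motor's strength."""
    pass

def frame_to_motor_pattern(frame: np.ndarray) -> np.ndarray:
    """Convert one audio frame into per-motor vibration intensities (0..1)."""
    spectrum = np.abs(np.fft.rfft(frame * np.hanning(len(frame))))
    # Group spectral bins into as many bands as there are motors.
    bands = np.array_split(spectrum, NUM_MOTORS)
    energies = np.array([band.mean() for band in bands])
    # Normalize so the loudest band drives its motor at full strength.
    peak = energies.max()
    return energies / peak if peak > 0 else energies

def drive_vest(audio_stream):
    """Continuously translate incoming audio frames into vibration patterns."""
    for frame in audio_stream:                   # frames of FRAME_SIZE samples
        pattern = frame_to_motor_pattern(frame)
        for motor_index, intensity in enumerate(pattern):
            set_motor_intensity(motor_index, intensity)
```

In this sketch the spatial layout of the motors stands in for the frequency axis, so a low rumble and a high whistle vibrate different parts of the vest, which is the core of the sensory-substitution idea described above.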

We’ve been very encouraged by our results with sensory substitution, but what we’ve been thinking a lot about is sensory addition. How could we use a technology like this to add a completely new kind of sense, to expand the human umwelt? For example, could we feed real-time data from the Internet directly into somebody’s brain, and could they develop a direct perceptual experience?
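
The sensory-addition idea is the same pipeline with a different input: any real-time numeric feed can be rescaled into per-motor intensities. The sketch below assumes the same hypothetical vest interface; `fetch_readings` stands in for whatever Internet data source is being streamed.

```python
# Hypothetical sketch: push an arbitrary real-time data feed to the vest.
import time

NUM_MOTORS = 32   # assumed motor count, matching the sketch above

def set_motor_intensity(index: int, intensity: float) -> None:
    """Placeholder for the command that drives one vest motor."""
    pass

def normalize(values, lo, hi):
    """Rescale raw readings into 0..1 intensities, one per motor."""
    span = (hi - lo) or 1.0
    return [min(max((v - lo) / span, 0.0), 1.0) for v in values[:NUM_MOTORS]]

def stream_data_to_vest(fetch_readings, lo=0.0, hi=100.0, period_s=0.1):
    """Poll a data source and translate each snapshot into vibration."""
    while True:
        pattern = normalize(fetch_readings(), lo, hi)
        for i, intensity in enumerate(pattern):
            set_motor_intensity(i, intensity)
        time.sleep(period_s)
```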

David Eagleman is an American neuroscientist, author, and science communicator.