Can we create new senses for humans? | David Eagleman

TED
18 Mar 2015 · 20:39

Summary

TL;DR: The transcript explores the concept of 'umwelt,' or the subjective reality experienced by different species, highlighting how human perception is limited by biology. Neuroscientist David Eagleman discusses how technology can expand our sensory experiences beyond natural limits, enabling sensory substitution and even sensory addition. He demonstrates how devices like vests can convert sound into vibrations, allowing the deaf to perceive speech, and suggests future possibilities, such as astronauts monitoring space station health through sensory input. Eagleman envisions a future where humans can customize their sensory experience, unlocking new ways to interact with the world.

Takeaways

  • 🌌 Humans are limited in their perception of reality, only perceiving a small fraction of the vast cosmos around them.
  • 👁 Our experience of the world is constrained by biology, with our senses only capturing a tiny portion of available information.
  • 🐍 Different animals perceive reality in unique ways, such as snakes detecting infrared or bats using echolocation.
  • 🧠 The brain is highly adaptable and can process sensory information from various sources, regardless of how it is received.
  • 🔌 Sensory substitution technologies, like those that allow blind people to 'see' through their tongues or backs, demonstrate the brain's flexibility.
  • 🔊 The vest that translates sound into vibrations on the skin shows potential for expanding sensory experiences for the deaf.
  • 💡 Sensory addition could allow humans to experience new kinds of data, like real-time stock market information or the emotional state of a crowd.
  • 🚀 This technology could be applied in various fields, such as allowing astronauts to feel the health of their spacecraft or pilots to feel the status of their aircraft.
  • 📊 The brain's ability to handle multidimensional data could revolutionize how we interact with complex information, making it more intuitive.
  • 🔄 The future may involve humans choosing their sensory experiences, enhancing and expanding their perception beyond natural limitations.

Q & A

  • What is the 'umwelt' and how does it relate to our perception of reality?

    -The 'umwelt' is a term used to describe the surrounding world as experienced by an organism. It relates to our perception of reality by highlighting that each species perceives the world differently based on their sensory capabilities. Humans, for example, perceive only a small fraction of reality due to the limitations of our sensory organs.

  • Why do humans only perceive a small fraction of the electromagnetic spectrum?

    -Humans only perceive a small fraction of the electromagnetic spectrum because our biological receptors, specifically our eyes, are only sensitive to certain wavelengths of light, known as the visible spectrum. We are unaware of other wavelengths, such as radio waves or X-rays, because we lack the necessary biological sensors to detect them.

  • How do animals like snakes or honeybees perceive reality differently from humans?

    -Snakes and honeybees perceive reality differently because they have sensory capabilities that humans do not. Snakes can detect infrared radiation, allowing them to sense heat, while honeybees can see ultraviolet light, giving them a different visual experience of the world.

  • What is sensory substitution, and how has it been demonstrated?

    -Sensory substitution is the process of feeding information into the brain through unconventional sensory channels, allowing the brain to interpret this data in a meaningful way. It has been demonstrated with blind individuals who could 'see' objects by feeling patterns on their back or tongue, showing that the brain can adapt to different types of input.

  • What is the significance of the brain's ability to process different types of sensory input?

    -The significance of the brain's ability to process different types of sensory input is that it shows the brain's flexibility and capacity to adapt. It can interpret data from various sources and construct a coherent understanding of the environment, even when the data comes from unconventional sources, such as vibrations on the skin.

  • How does the vest used in the research translate sound into a sensory experience for deaf individuals?

    -The vest used in the research translates sound into a sensory experience by converting sound waves into patterns of vibration that can be felt on the skin. These vibrations correspond to the sounds, allowing deaf individuals to 'feel' speech and other auditory information, which their brains can then learn to interpret.

  • What potential does the vest technology have for sensory addition?

    -The vest technology has the potential for sensory addition by allowing humans to experience new kinds of data in real-time, such as stock market trends or environmental conditions, directly through their bodies. This could expand human perception and interaction with the world in unprecedented ways.

  • Why does the brain not care about the origin of sensory data?

    -The brain does not care about the origin of sensory data because it is designed to process and make sense of any electrochemical signals it receives, regardless of where they come from. This allows the brain to interpret data from various sources, whether they are natural senses or artificial inputs, and construct a coherent perception of reality.

  • What are the implications of the brain's 'general-purpose' computing ability?

    -The implications of the brain's 'general-purpose' computing ability are vast. It means that the brain can potentially adapt to new forms of sensory input, allowing for the development of technologies that can enhance or even expand our sensory experience. This could lead to new ways of interacting with the world, beyond our natural capabilities.

  • What are some possible future applications of sensory expansion technologies?

    -Possible future applications of sensory expansion technologies include allowing astronauts to feel the health of a space station, enabling people to monitor their own biological states like blood sugar levels, or providing pilots with a direct sense of their aircraft's condition. These technologies could revolutionize how we interact with and understand complex data in real-time.
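Several answers above describe the vest converting sound waves into patterns of vibration on the skin. The talk does not specify the actual signal processing, but a minimal sketch, assuming a simple frequency-band-to-motor mapping with made-up parameters (8 motors, a speech-oriented 8 kHz cutoff), conveys the general idea:

```python
import math

def sound_to_motor_pattern(samples, sample_rate, n_motors=8):
    """Map one audio frame onto vibration-motor intensities.

    Illustrative only: split the frame's magnitude spectrum into
    n_motors frequency bands and drive each motor with its mean band
    energy. The real vest's mapping is not described in the talk.
    """
    n = len(samples)
    max_bin = int(8000 * n / sample_rate)  # keep the speech range (an assumption)
    # Naive DFT magnitude spectrum (fine for a sketch; use an FFT in practice).
    spectrum = []
    for k in range(max_bin):
        re = sum(s * math.cos(2 * math.pi * k * i / n) for i, s in enumerate(samples))
        im = sum(s * math.sin(2 * math.pi * k * i / n) for i, s in enumerate(samples))
        spectrum.append(math.hypot(re, im))
    band = max_bin // n_motors
    energies = [sum(spectrum[m * band:(m + 1) * band]) / band for m in range(n_motors)]
    peak = max(energies) or 1.0
    # Normalize to 0..255 PWM-style motor intensities.
    return [int(255 * e / peak) for e in energies]

# A pure 440 Hz tone should drive mainly the lowest-frequency motor.
rate = 16000
frame = [math.sin(2 * math.pi * 440 * i / rate) for i in range(256)]
pattern = sound_to_motor_pattern(frame, rate)
```

In a real wearable, each frame's pattern would be streamed (e.g. over Bluetooth, as in the demo) to the motor array many times per second, so speech unfolds as a time-varying vibration pattern the brain can learn.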

Outlines

00:00

🔬 Understanding the Limits of Human Perception

This paragraph explores the limitations of human perception, emphasizing that we are only able to perceive a tiny fraction of the electromagnetic spectrum. While our biology allows us to see colors and light, it leaves us blind to other wavelengths like radio waves and infrared. The paragraph introduces the concept of 'umwelt,' a term used to describe the limited slice of reality each species can perceive, highlighting how different animals sense the world differently based on their biological makeup.

05:03

🧠 The Brain's Ability to Interpret Signals

This section delves into how the brain processes information. Despite being isolated in the darkness of the skull, the brain interprets electrochemical signals to create our subjective experience of the world. The brain's adaptability is highlighted, as it can make sense of signals from artificial devices like cochlear implants or retinal implants. This ability to interpret different data sources suggests that the brain is a general-purpose computing device, capable of making sense of any sensory input it receives.

10:04

🔧 Sensory Substitution and Human Potential

The focus here is on sensory substitution, where the brain is fed information through unconventional channels and learns to interpret it. The paragraph discusses how blind individuals have been able to 'see' through their backs, tongues, and other body parts using various devices. This demonstrates the brain's incredible flexibility in processing new types of sensory data. The section ends by introducing a project aimed at enabling deaf people to 'hear' through vibrations on a vest, showing early success in this innovative approach.

15:05

📡 Expanding Human Senses with Technology

This paragraph explores the potential for sensory addition, where technology could introduce entirely new senses to humans. Examples include real-time data feeds from the internet being converted into sensory input, such as economic data or emotional sentiment analysis from Twitter. The possibilities for enhancing human experience are vast, ranging from monitoring the International Space Station's health to feeling your own blood sugar levels. The concept suggests that we can transcend our biological limitations by integrating new sensory devices into our experience.
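The automated sentiment analysis described here, classifying tweets as positive, negative, or neutral and aggregating them into one real-time signal, can be illustrated with a tiny word-list scorer. This is purely illustrative; the talk does not describe the actual method, and the word lists are invented:

```python
# Hand-made word lists, a stand-in for a real sentiment lexicon.
POSITIVE = {"love", "great", "amazing", "awesome", "inspiring"}
NEGATIVE = {"boring", "bad", "awful", "terrible", "dull"}

def sentiment(tweet):
    """Score one tweet: mean of +1 (positive word), -1 (negative), 0 (neutral)."""
    words = [w.strip(".,!?") for w in tweet.lower().split()]
    scores = [1 if w in POSITIVE else -1 if w in NEGATIVE else 0 for w in words]
    return sum(scores) / len(scores) if scores else 0.0

def aggregate(tweets):
    """Mean sentiment across a stream of tweets, in [-1, 1]."""
    return sum(sentiment(t) for t in tweets) / len(tweets)

mood = aggregate([
    "this talk is amazing",
    "so inspiring, love it",
    "a bit dull honestly",
])
```

The resulting scalar in [-1, 1] could then be mapped onto vibration intensity, which is the step that turns an analytics pipeline into a felt sense of the crowd.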

20:06

🚀 The Future of Human Sensory Expansion

In the final paragraph, the potential applications of sensory expansion technology are discussed. The speaker envisions a future where humans can feel complex data streams, such as the state of machinery or health, through sensory input rather than visual or auditory cues. The talk concludes with the idea that humans will soon be able to choose their sensory experiences, no longer confined by the limitations of our natural biology. This marks a significant leap in human evolution, enabling us to experience the universe in entirely new ways.

Keywords

💡Umwelt

The concept of 'Umwelt' refers to the surrounding sensory world that an organism perceives, based on its biological sensory apparatus. In the video, it's used to explain how different species experience reality differently. For example, a tick's umwelt revolves around temperature and butyric acid, while a human's umwelt is centered on the senses available to them, such as sight and sound.

💡Sensory Substitution

Sensory substitution involves feeding information into the brain through unusual sensory channels, allowing the brain to interpret this information as if it were coming from a traditional sense. The video discusses how blind people can 'see' using tactile feedback on their back or tongue, showing that the brain can adapt to new forms of input and generate meaningful perceptions from them.

💡Sensory Addition

Sensory addition refers to the idea of introducing completely new sensory experiences beyond the ones evolution has provided. The video explores this concept by discussing the potential to feed real-time data from sources like the internet or stock market directly into the brain, allowing humans to experience new dimensions of reality.

💡Peripheral Devices

'Peripheral devices' in the video are compared to sensory organs like eyes and ears, which provide data to the brain. The brain, like a general-purpose computing device, interprets this data without concern for its source. This analogy highlights the flexibility of the brain to integrate new sensory inputs through technological advancements.

💡General Purpose Computing Device

This term describes the brain's ability to process and interpret various types of data, regardless of its origin. The brain does not inherently distinguish between traditional sensory inputs and those provided by technology, as demonstrated by sensory substitution devices. This concept is crucial to understanding how the brain adapts to new sensory information.

💡Electrochemical Signals

Electrochemical signals are the primary language of the brain, representing the data it receives from sensory organs. The video explains that regardless of the sensory organ or technological device providing the input, the brain translates these signals into our subjective experience of reality. This underpins the brain's adaptability and the potential for sensory substitution.

💡Sensory Technology

Sensory technology refers to devices that can enhance or replace biological senses. Examples from the video include cochlear implants for hearing and retinal implants for vision, which demonstrate that technology can interface with the brain to provide sensory experiences, expanding or restoring the umwelt for individuals with sensory impairments.

💡Artificial Hearing/Vision

These are examples of sensory substitution where technology, such as cochlear implants or retinal implants, is used to replicate the function of biological senses. The video uses these examples to illustrate how technology can bridge the gap between sensory deficits and the brain's ability to interpret new forms of sensory data.

💡Pattern Recognition

Pattern recognition is the brain's ability to detect and interpret regularities in the data it receives. The video emphasizes how the brain excels at this task, allowing it to make sense of complex sensory inputs, whether they come from traditional senses or new technological channels like the vest that translates sound into vibrations.

💡Human Expansion

Human expansion refers to the idea of extending the sensory and perceptual capabilities of humans beyond natural limitations using technology. The video discusses potential applications, such as astronauts feeling the health of a space station or people sensing data from the internet, which could lead to new ways of experiencing and interacting with the world.

Highlights

Humans are not naturally equipped to perceive the full spectrum of reality, as our senses are limited to a narrow band of perception.

Most of the electromagnetic spectrum, such as radio waves and X-rays, is invisible to us, though other animals and machines can detect them.

The concept of 'umwelt' describes the limited sensory world of each species, emphasizing that humans are confined by their biological senses.

The brain is a general-purpose computing device that interprets signals from various sensory inputs, regardless of their source.

Technology can expand human perception by creating new sensory inputs, such as artificial hearing and vision devices.

Sensory substitution allows the brain to process information from unconventional sources, such as converting visual data into tactile signals for the blind.

The 'brainport' device enables blind people to perceive their surroundings through electrotactile signals on their tongue, effectively allowing them to 'see' through their tongue.

The brain does not differentiate between sensory inputs, as long as the signals are interpretable, making it possible to experience new types of perception.

A vest equipped with vibratory motors can translate sound into patterns of vibration, enabling deaf individuals to understand spoken language through touch.

The concept of sensory addition involves using technology to introduce entirely new senses, potentially allowing humans to experience data like real-time stock market movements.

Experiments are being conducted to determine if people can develop a direct perceptual experience of abstract data, such as economic trends, through sensory addition.
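The experiment in this highlight is essentially a feedback-training loop: feel a pattern, choose one of two buttons, get right/wrong feedback a second later. A toy simulation of that protocol (every detail here is hypothetical; the talk gives none) shows why learning is measurable against a chance baseline:

```python
import random

random.seed(0)  # deterministic toy run

def run_trials(n_trials, policy):
    """Simulate the two-button task: hidden market move -> pattern ->
    buy/sell guess -> right/wrong feedback. Returns accuracy."""
    correct = 0
    for _ in range(n_trials):
        delta = random.gauss(0, 1)          # hidden price movement
        pattern = [delta] * 8                # stand-in for the vest pattern
        guess = policy(pattern)              # "buy" or "sell"
        correct += (guess == "buy") == (delta > 0)
    return correct / n_trials

# A policy that has implicitly decoded the pattern is near-perfect;
# random guessing hovers around 50 %.
learned = run_trials(1000, lambda p: "buy" if p[0] > 0 else "sell")
chance = run_trials(1000, lambda p: random.choice(["buy", "sell"]))
```

In the lab version, the "policy" is the subject's brain gradually extracting structure from the vibrations; rising accuracy over weeks would be the evidence for a direct perceptual experience of the data.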

The potential applications of sensory addition are vast, including enhancing astronauts' perception of spacecraft health or providing 360-degree vision.

The brain's flexibility in adapting to new sensory inputs opens up possibilities for redefining human perception and experience through technology.

The project demonstrates that wearable technology could be a cost-effective alternative to invasive medical procedures, like cochlear implants.

The future of human experience may involve choosing and customizing our sensory peripherals, enabling us to interact with and understand the universe in entirely new ways.

Transcripts

00:12

We are built out of very small stuff, and we are embedded in a very large cosmos, and the fact is that we are not very good at understanding reality at either of those scales, and that's because our brains haven't evolved to understand the world at that scale. Instead, we're trapped on this very thin slice of perception right in the middle. But it gets strange, because even at that slice of reality that we call home, we're not seeing most of the action that's going on. So take the colors of our world. This is light waves, electromagnetic radiation that bounces off objects and it hits specialized receptors in the back of our eyes. But we're not seeing all the waves out there. In fact, what we see is less than a 10 trillionth of what's out there. So you have radio waves and microwaves and X-rays and gamma rays passing through your body right now and you're completely unaware of it, because you don't come with the proper biological receptors for picking it up. There are thousands of cell phone conversations passing through you right now, and you're utterly blind to it.

01:28

Now, it's not that these things are inherently unseeable. Snakes include some infrared in their reality, and honeybees include ultraviolet in their view of the world, and of course we build machines in the dashboards of our cars to pick up on signals in the radio frequency range, and we built machines in hospitals to pick up on the X-ray range. But you can't sense any of those by yourself, at least not yet, because you don't come equipped with the proper sensors.

01:59

Now, what this means is that our experience of reality is constrained by our biology, and that goes against the common sense notion that our eyes and our ears and our fingertips are just picking up the objective reality that's out there. Instead, our brains are sampling just a little bit of the world.

02:22

Now, across the animal kingdom, different animals pick up on different parts of reality. So in the blind and deaf world of the tick, the important signals are temperature and butyric acid; in the world of the black ghost knifefish, its sensory world is lavishly colored by electrical fields; and for the echolocating bat, its reality is constructed out of air compression waves. That's the slice of their ecosystem that they can pick up on, and we have a word for this in science. It's called the umwelt, which is the German word for the surrounding world.

03:00

Now, presumably, every animal assumes that its umwelt is the entire objective reality out there, because why would you ever stop to imagine that there's something beyond what we can sense. Instead, what we all do is we accept reality as it's presented to us.

03:19

Let's do a consciousness-raiser on this. Imagine that you are a bloodhound dog. Your whole world is about smelling. You've got a long snout that has 200 million scent receptors in it, and you have wet nostrils that attract and trap scent molecules, and your nostrils even have slits so you can take big nosefuls of air. Everything is about smell for you. So one day, you stop in your tracks with a revelation. You look at your human owner and you think, "What is it like to have the pitiful, impoverished nose of a human? (Laughter) What is it like when you take a feeble little noseful of air? How can you not know that there's a cat 100 yards away, or that your neighbor was on this very spot six hours ago?" (Laughter)

04:10

So because we're humans, we've never experienced that world of smell, so we don't miss it, because we are firmly settled into our umwelt. But the question is, do we have to be stuck there?

04:26

So as a neuroscientist, I'm interested in the way that technology might expand our umwelt, and how that's going to change the experience of being human.

04:38

So we already know that we can marry our technology to our biology, because there are hundreds of thousands of people walking around with artificial hearing and artificial vision. So the way this works is, you take a microphone and you digitize the signal, and you put an electrode strip directly into the inner ear. Or, with the retinal implant, you take a camera and you digitize the signal, and then you plug an electrode grid directly into the optic nerve. And as recently as 15 years ago, there were a lot of scientists who thought these technologies wouldn't work. Why? It's because these technologies speak the language of Silicon Valley, and it's not exactly the same dialect as our natural biological sense organs. But the fact is that it works; the brain figures out how to use the signals just fine.

05:31

Now, how do we understand that? Well, here's the big secret: Your brain is not hearing or seeing any of this. Your brain is locked in a vault of silence and darkness inside your skull. All it ever sees are electrochemical signals that come in along different data cables, and this is all it has to work with, and nothing more.

05:58

Now, amazingly, the brain is really good at taking in these signals and extracting patterns and assigning meaning, so that it takes this inner cosmos and puts together a story of this, your subjective world. But here's the key point: Your brain doesn't know, and it doesn't care, where it gets the data from. Whatever information comes in, it just figures out what to do with it. And this is a very efficient kind of machine. It's essentially a general purpose computing device, and it just takes in everything and figures out what it's going to do with it, and that, I think, frees up Mother Nature to tinker around with different sorts of input channels.

06:49

So I call this the P.H. model of evolution, and I don't want to get too technical here, but P.H. stands for Potato Head, and I use this name to emphasize that all these sensors that we know and love, like our eyes and our ears and our fingertips, these are merely peripheral plug-and-play devices: You stick them in, and you're good to go. The brain figures out what to do with the data that comes in.

07:18

And when you look across the animal kingdom, you find lots of peripheral devices. So snakes have heat pits with which to detect infrared, and the ghost knifefish has electroreceptors, and the star-nosed mole has this appendage with 22 fingers on it with which it feels around and constructs a 3D model of the world, and many birds have magnetite so they can orient to the magnetic field of the planet. So what this means is that nature doesn't have to continually redesign the brain. Instead, with the principles of brain operation established, all nature has to worry about is designing new peripherals.

08:01

Okay. So what this means is this: The lesson that surfaces is that there's nothing really special or fundamental about the biology that we come to the table with. It's just what we have inherited from a complex road of evolution. But it's not what we have to stick with, and our best proof of principle of this comes from what's called sensory substitution. And that refers to feeding information into the brain via unusual sensory channels, and the brain just figures out what to do with it.

08:35

Now, that might sound speculative, but the first paper demonstrating this was published in the journal Nature in 1969. So a scientist named Paul Bach-y-Rita put blind people in a modified dental chair, and he set up a video feed, and he put something in front of the camera, and then you would feel that poked into your back with a grid of solenoids. So if you wiggle a coffee cup in front of the camera, you're feeling that in your back, and amazingly, blind people got pretty good at being able to determine what was in front of the camera just by feeling it in the small of their back.

09:14

Now, there have been many modern incarnations of this. The sonic glasses take a video feed right in front of you and turn that into a sonic landscape, so as things move around, and get closer and farther, it sounds like "Bzz, bzz, bzz." It sounds like a cacophony, but after several weeks, blind people start getting pretty good at understanding what's in front of them just based on what they're hearing. And it doesn't have to be through the ears: this system uses an electrotactile grid on the forehead, so whatever's in front of the video feed, you're feeling it on your forehead. Why the forehead? Because you're not using it for much else. The most modern incarnation is called the brainport, and this is a little electrogrid that sits on your tongue, and the video feed gets turned into these little electrotactile signals, and blind people get so good at using this that they can throw a ball into a basket, or they can navigate complex obstacle courses. They can come to see through their tongue.

10:19

Now, that sounds completely insane, right? But remember, all vision ever is is electrochemical signals coursing around in your brain. Your brain doesn't know where the signals come from. It just figures out what to do with them.

10:34

So my interest in my lab is sensory substitution for the deaf, and this is a project I've undertaken with a graduate student in my lab, Scott Novich, who is spearheading this for his thesis. And here is what we wanted to do: we wanted to make it so that sound from the world gets converted in some way so that a deaf person can understand what is being said. And we wanted to do this, given the power and ubiquity of portable computing, we wanted to make sure that this would run on cell phones and tablets, and also we wanted to make this a wearable, something that you could wear under your clothing.

11:14

So here's the concept. So as I'm speaking, my sound is getting captured by the tablet, and then it's getting mapped onto a vest that's covered in vibratory motors, just like the motors in your cell phone. So as I'm speaking, the sound is getting translated to a pattern of vibration on the vest. Now, this is not just conceptual: this tablet is transmitting Bluetooth, and I'm wearing the vest right now. So as I'm speaking -- (Applause) -- the sound is getting translated into dynamic patterns of vibration. I'm feeling the sonic world around me.

12:01

So, we've been testing this with deaf people now, and it turns out that after just a little bit of time, people can start feeling, they can start understanding the language of the vest. So this is Jonathan. He's 37 years old. He has a master's degree. He was born profoundly deaf, which means that there's a part of his umwelt that's unavailable to him. So we had Jonathan train with the vest for four days, two hours a day, and here he is on the fifth day.

12:33

Scott Novich: You.

12:36

David Eagleman: So Scott says a word, Jonathan feels it on the vest, and he writes it on the board.

12:42

SN: Where. Where.

12:46

DE: Jonathan is able to translate this complicated pattern of vibrations into an understanding of what's being said.

12:52

SN: Touch. Touch.

12:56

DE: Now, he's not doing this -- (Applause) -- Jonathan is not doing this consciously, because the patterns are too complicated, but his brain is starting to unlock the pattern that allows it to figure out what the data mean, and our expectation is that, after wearing this for about three months, he will have a direct perceptual experience of hearing, in the same way that when a blind person passes a finger over braille, the meaning comes directly off the page without any conscious intervention at all.

13:38

Now, this technology has the potential to be a game-changer, because the only other solution for deafness is a cochlear implant, and that requires an invasive surgery. And this can be built for 40 times cheaper than a cochlear implant, which opens up this technology globally, even for the poorest countries.

14:00

Now, we've been very encouraged by our results with sensory substitution, but what we've been thinking a lot about is sensory addition. How could we use a technology like this to add a completely new kind of sense, to expand the human umwelt? For example, could we feed real-time data from the Internet directly into somebody's brain, and can they develop a direct perceptual experience?

14:27

So here's an experiment we're doing in the lab. A subject is feeling a real-time streaming feed from the Net of data for five seconds. Then, two buttons appear, and he has to make a choice. He doesn't know what's going on. He makes a choice, and he gets feedback after one second. Now, here's the thing: The subject has no idea what all the patterns mean, but we're seeing if he gets better at figuring out which button to press. He doesn't know that what we're feeding is real-time data from the stock market, and he's making buy and sell decisions. (Laughter) And the feedback is telling him whether he did the right thing or not. And what we're seeing is, can we expand the human umwelt so that he comes to have, after several weeks, a direct perceptual experience of the economic movements of the planet. So we'll report on that later to see how well this goes. (Laughter)

play15:22

Here's another thing we're doing: During the talks this morning, we've been automatically scraping Twitter for the TED2015 hashtag, and we've been doing an automated sentiment analysis, which means: are people using positive words, negative words, or neutral ones? And while this has been going on, I have been feeling this, and so I am plugged in to the aggregate emotion of thousands of people in real time, and that's a new kind of human experience, because now I can know how everyone's doing and how much you're loving this. (Laughter) (Applause)

It's a bigger experience than a human can normally have.
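The sentiment analysis described here, classifying each tweet as positive, negative, or neutral and then aggregating, can be sketched with a tiny word-list classifier. The word lists and function names are illustrative only; a production system would use a trained model rather than keyword counts.

```python
POSITIVE = {"love", "great", "amazing", "awesome", "inspiring"}
NEGATIVE = {"boring", "bad", "terrible", "awful", "confusing"}

def sentiment(tweet):
    """Classify one tweet as 'positive', 'negative', or 'neutral'
    by counting hits against tiny illustrative word lists."""
    words = tweet.lower().split()
    score = (sum(w.strip(".,!?") in POSITIVE for w in words)
             - sum(w.strip(".,!?") in NEGATIVE for w in words))
    if score > 0:
        return "positive"
    if score < 0:
        return "negative"
    return "neutral"

def aggregate(tweets):
    """Tally sentiment over a batch of tweets -- the aggregate
    quantity the vest would then encode as vibration patterns."""
    counts = {"positive": 0, "negative": 0, "neutral": 0}
    for t in tweets:
        counts[sentiment(t)] += 1
    return counts
```

For example, `aggregate(["I love this talk!", "so boring", "hello world"])` tallies one tweet in each category.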

play16:11

We're also expanding the umwelt of pilots. In this case, the vest is streaming nine different measures from this quadcopter, so pitch and yaw and roll and orientation and heading, and that improves this pilot's ability to fly it. It's essentially like he's extending his skin up there, far away.

And that's just the beginning. What we're envisioning is taking a modern cockpit full of gauges and, instead of trying to read the whole thing, you feel it.
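The core of such a vest is mapping each telemetry channel onto a vibration-motor intensity. A minimal sketch, assuming a normalize-and-clamp encoding: the channel names and ranges below are illustrative, not the actual device's interface, and only a few of the nine measures are shown.

```python
def encode_telemetry(sample, ranges):
    """Map each telemetry channel onto a motor intensity in [0, 1]
    by normalizing against that channel's expected range."""
    intensities = {}
    for name, value in sample.items():
        lo, hi = ranges[name]
        x = (value - lo) / (hi - lo)
        intensities[name] = min(1.0, max(0.0, x))  # clamp to motor range
    return intensities

# Illustrative channels and ranges (degrees, meters).
RANGES = {
    "pitch": (-45.0, 45.0), "roll": (-45.0, 45.0), "yaw": (-180.0, 180.0),
    "heading": (0.0, 360.0), "altitude": (0.0, 120.0),
}

sample = {"pitch": 0.0, "roll": 22.5, "yaw": -180.0,
          "heading": 90.0, "altitude": 60.0}
motors = encode_telemetry(sample, RANGES)
```

Level flight then feels like mid-intensity vibration on the pitch and roll motors, and any drift from it shows up as an asymmetry the pilot can feel without looking at a gauge.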

play16:44

We live in a world of information now, and there is a difference between accessing big data and experiencing it. So I think there's really no end to the possibilities on the horizon for human expansion. Just imagine an astronaut being able to feel the overall health of the International Space Station, or, for that matter, having you feel the invisible states of your own health, like your blood sugar and the state of your microbiome, or having 360-degree vision or seeing in infrared or ultraviolet.

play17:23

So the key is this: As we move into the future, we're going to increasingly be able to choose our own peripheral devices. We no longer have to wait for Mother Nature's sensory gifts on her timescales, but instead, like any good parent, she's given us the tools that we need to go out and define our own trajectory. So the question now is: how do you want to go out and experience your universe?

Thank you.

(Applause)

play18:11

Chris Anderson: Can you feel it?

David Eagleman: Yeah. Actually, this was the first time I felt applause on the vest. It's nice. It's like a massage. (Laughter)

CA: Twitter's going crazy. Twitter's going mad. So that stock market experiment. This could be the first experiment that secures its funding forevermore, right, if successful?

DE: Well, that's right, I wouldn't have to write to NIH anymore.

CA: Well look, just to be skeptical for a minute, I mean, this is amazing, but isn't most of the evidence so far that sensory substitution works, not necessarily that sensory addition works? I mean, isn't it possible that the blind person can see through their tongue because the visual cortex is still there, ready to process, and that that is needed as part of it?

DE: That's a great question. We actually have no idea what the theoretical limits are of what kind of data the brain can take in. The general story, though, is that it's extraordinarily flexible. So when a person goes blind, what we used to call their visual cortex gets taken over by other things: by touch, by hearing, by vocabulary. So what that tells us is that the cortex is kind of a one-trick pony. It just runs certain kinds of computations on things. And when we look around at things like braille, for example, people are getting information through bumps on their fingers. So I don't think we have any reason to think there's a theoretical limit that we know the edge of.

CA: If this checks out, you're going to be deluged. There are so many possible applications for this. Are you ready for this? What are you most excited about, the direction it might go?

DE: I think there's a lot of applications here. Beyond sensory substitution, there are the things I started mentioning about astronauts on the space station: they spend a lot of their time monitoring things, and they could instead just get what's going on, because what this is really good for is multidimensional data. The key is this: Our visual systems are good at detecting blobs and edges, but they're really bad at what our world has become, which is screens with lots and lots of data. We have to crawl that with our attentional systems. So this is a way of just feeling the state of something, just like the way you know the state of your body as you're standing around. So I think heavy machinery, safety, feeling the state of a factory, of your equipment, that's one place it'll go right away.

CA: David Eagleman, that was one mind-blowing talk. Thank you very much.

DE: Thank you, Chris. (Applause)
