Curiosity Daily

Neuroprosthetics And The Future Of Artificial Touch

Episode Summary

Modern medicine can do extraordinary things – but how? This week, the Curiosity Podcast welcomes Dr. Sliman Bensmaia to explain how scientists are able to develop prosthetic devices that someday may be able to transmit a realistic sense of touch to their owners. An assistant professor in the Department of Organismal Biology and Anatomy at the University of Chicago, Dr. Bensmaia discusses how his lab is working on cutting-edge technology, and what's in store for the future of prosthetics. The Bensmaia Lab studies how the peripheral and central nervous systems represent the world around us. Dr. Bensmaia's research has led to groundbreaking insights about how we perceive objects and textures through the sense of touch, and may one day lead to prosthetic devices that completely restore a realistic sense of touch for amputees and tetraplegic patients.

Additional resources discussed:

Bensmaia Lab
Bionic Touch Through a Brain Interface | Sliman Bensmaia | TEDxColumbiaCollegeChicago
UChicago Discovery Series│'Brain Teasers: Cracking the Mind's Toughest Riddles' with Sliman Bensmaia
Watch President Obama fist bump a robotic arm powered by a brain chip

Follow Curiosity Daily on your favorite podcast app to get smarter with Cody Gough and Ashley Hamer — for free! Still curious? Get exclusive science shows, nature documentaries, and more real-life entertainment on discovery+! Go to https://discoveryplus.com/curiosity to start your 7-day free trial. discovery+ is currently only available for US subscribers.

Episode Notes

Modern medicine can do extraordinary things – but how? This week, the Curiosity Podcast welcomes Dr. Sliman Bensmaia to explain how scientists are able to develop prosthetic devices that someday may be able to transmit a realistic sense of touch to their owners. An assistant professor in the Department of Organismal Biology and Anatomy at the University of Chicago, Dr. Bensmaia discusses how his lab is working on cutting-edge technology, and what's in store for the future of prosthetics.

The Bensmaia Lab studies how the peripheral and central nervous systems represent the world around us. Dr. Bensmaia's research has led to groundbreaking insights about how we perceive objects and textures through the sense of touch, and may one day lead to prosthetic devices that completely restore a realistic sense of touch for amputees and tetraplegic patients.

Additional resources discussed:

Bensmaia Lab
Bionic Touch Through a Brain Interface | Sliman Bensmaia | TEDxColumbiaCollegeChicago
UChicago Discovery Series│'Brain Teasers: Cracking the Mind's Toughest Riddles' with Sliman Bensmaia
Watch President Obama fist bump a robotic arm powered by a brain chip

Follow Curiosity Daily on your favorite podcast app to get smarter with Cody Gough and Ashley Hamer — for free! Still curious? Get exclusive science shows, nature documentaries, and more real-life entertainment on discovery+! Go to https://discoveryplus.com/curiosity to start your 7-day free trial. discovery+ is currently only available for US subscribers.

 

Full episode transcript here: https://curiosity-daily-4e53644e.simplecast.com/episodes/neuroprosthetics-and-the-future-of-artificial-touch

Episode Transcription

CODY GOUGH: I'm curious what's the future of prosthetics?

 

Eventually we will have a prosthesis that approximates the dexterity of a human hand.

 

[MUSIC PLAYING]

 

CODY GOUGH: Hi. I'm Cody Gough.

 

ASHLEY HAMER: And I'm Ashley Hamer. Today, we're going to talk about the science of touch and the future of prosthetics.

 

CODY GOUGH: Every week we explore what we don't know because curiosity makes you smarter.

 

ASHLEY HAMER: From the award-winning Curiosity.com, this is the Curiosity Podcast.

 

CODY GOUGH: The human brain is extremely complicated. So my guest today is an expert when it comes to how our brain works. He runs a lab that works on cutting edge technology, studying how our brain processes our sense of touch, and helping pave the way for the future of prosthetics. And most importantly, he is able to explain everything in a way that's not too complicated considering that he's working on some pretty sophisticated science. If you've ever wondered how our brain works or how scientists research how our brain works, this is the podcast for you.

 

I'm here with Dr. Sliman Bensmaia, neuroscientist, principal investigator, and associate professor at the University of Chicago. Principal investigator, you just explained to me, means basically that you lead your research lab, right?

 

SLIMAN BENSMAIA: That's right.

 

CODY GOUGH: And your research lab is doing some really cool things.

 

SLIMAN BENSMAIA: Thank you, I think so too.

 

CODY GOUGH: Well, good. I would hope so. You're working on prosthetics, but not just prosthetics. If I understand correctly, prosthetics that allow people to feel a sense of touch.

 

SLIMAN BENSMAIA: That's right. So let's take like three steps back, if that's cool. It turns out that any time we use our hands to do anything, not only are we moving our fingers in very precise ways to interact with objects, but we're getting this barrage of signals back from our fingers. We think of these signals as touch. And a maybe lesser-known sense, or modality, from the hand is proprioception, which sort of tracks the movement and position of the fingers.

 

And these signals are critical to our ability to use our hands; without these signals, if our hands were completely insensate, we basically couldn't use them. So that's the premise of a lot of the work that we do, especially the work in prosthetics. What most of my lab is devoted to is understanding these signals. How is information about the objects we interact with, about the shape of our hand, how it moves, where it is-- how are these signals represented in the nervous system? And we study various stages of processing in the nervous system.

 

We study what signals the hand sends to the brain, and then how the brain interprets those signals. So that's what we spend most of our time doing. But in the context of neuroprosthetics, if you're trying to control a robotic hand, it's still true that you need to have these sensory signals from the hand. Sensory signals that tell you, for instance: are you touching the object or not? What part of the hand is touching the object? If you want to pick it up, you need to touch it with at least your thumb and one of your fingers.

 

How much pressure are you exerting on the object? You need to exert enough pressure to pick it up, but not so much pressure that you're going to crush it, right? All of these abilities require the sensory signals from the hand, and without these sensory signals, basically you would suck at using your hands. You would suck at using your own hand, and you certainly wouldn't be able to use a robotic hand. So that's sort of the premise for our prosthetics work. Because you might ask, well, it's hard enough to control a robotic hand. Why do you want to make it feel too, right? Well, it turns out it's useless to control it if you can't make it feel also.

 

CODY GOUGH: Right. If you're really cold if you're in a freezing environment and you can't feel anything, yeah you can't.

 

SLIMAN BENSMAIA: Yeah exactly. That's right.

 

CODY GOUGH: Can't get that hot cup of coffee, and then once you get your hand around it, you can finally start to get that feeling back.

 

ASHLEY HAMER: Touch is way more than just a nice thing to have. Like Sliman said, if we didn't have a sense of touch our limbs would be useless. The sense of touch comes down to a combination of many, many different receptors that work together to let your brain know what your body is doing. You've got kinesthetic and proprioceptive nerve fibers telling you how fast and in what direction to move your hand to reach for, say, a coffee cup. Pressure receptors that gauge how hard you're gripping the handle, vibration receptors that tell you if the cup is slipping from your grasp, temperature receptors that tell you how hot the coffee is, and pain receptors that tell you if the heat is burning your skin. With all that, is it even a question that artificial limbs would need a sense of touch?

 

CODY GOUGH: I am just totally blown away at how you would even start any of that research. When you talk about a robotic hand, for example, I know that certain prosthetics people can control. How old is that technology?

 

SLIMAN BENSMAIA: I mean, the cutting edge of that technology is like-- I don't know, a day old. It's constantly growing. These robotic hands are increasingly anthropomorphic and increasingly approximate normal human hands. So it's been a process. Initially, people demonstrated the ability to control, for instance, a cursor on a screen-- a cursor moving up and down, and then moving up and down and left and right-- and then eventually the same kind of mathematics and neural interfaces that allow this were transferred to interface these neural signals with a different type of device. So working on how to record neural signals and in real time convert them into control signals, that's been going on for a couple of decades.

 

And it started out very primitive, from controlling cursors to controlling more and more complex devices. In a parallel stream, people started working on robotic hands. And initially you had hands that just didn't do anything, and eventually you had hands that could open and close. And then hands that could move maybe the thumb independently of the other fingers. And then again, each generation of hand became closer and closer to a human hand. And now, I mean, we don't have an exact replica of a human hand that's a robot, because it's remarkable how amazing human hands are. They're really sophisticated, complex devices, and no robot approximates a human hand. But they're getting closer and closer, and close enough now that we can really start thinking about equipping either amputees or tetraplegic patients with these hands.

 

CODY GOUGH: Yeah, and in terms of how they are actually controlled, your brain's got a lot of neurons in it. And neurons are the cells that can serve different functions, but one is shooting electrical signals to a part of your body to move it, right? So I mean, how do you isolate and identify what neurons are firing to what, and then manipulate it so that it's firing to an object that isn't part of a human being?

 

SLIMAN BENSMAIA: So I work on two different types of things that I think are equally cool. One of them is for amputees. So for amputees, the idea is this: part of the arm is missing, but part of the arm is still there, and the nerve that used to serve the arm, that used to innervate the hand, is still there. So the idea is you can interface with that nerve and sort of replace the signals that used to go to the brain.

 

CODY GOUGH: So the nerve is, it's still there. It's just been severed at a certain point.

 

SLIMAN BENSMAIA: It's been severed, that's right.

 

But it turns out if you now electrically interface with that nerve, you can create sensations. When you electrically stimulate that nerve, you create patterns of neural activation in it. You basically wake that nerve up, and that evokes in the amputee the sensation of something touching the hand that is no longer there.

 

CODY GOUGH: Is that like a phantom limb kind of thing?

 

SLIMAN BENSMAIA: It's not even a phantom limb, because a phantom limb is this faint, weird sensation that seems to be emanating from a hand that's not there. This is a real sensation from a hand that's not there. Patients, or subjects, describe it as: it's like my hand popped right back. It's not really there, but I feel as though it were there.

 

CODY GOUGH: Interesting.

 

SLIMAN BENSMAIA: So it's a much more vivid, natural sensation than a phantom sensation. So that's one type.

 

And then that type of prosthesis is in some ways similar to, but in some ways very different from, the other type of prosthesis, which is directed at people who are tetraplegic. So they're paralyzed and insensate from the neck down. That means the nerve is still there, but it's no longer attached to the brain. Right? So you can stimulate the nerve and it doesn't have any conscious-- it doesn't have any consequence.

 

So for those patients, the only solution is to go straight to the brain and interface directly with the brain. So the problem is we have a pretty good sense of how the nerve works. And we have much less of a good sense of how the brain works. The brain is this amazingly complex system with 100 billion neurons connected with something like 100 trillion synapses. And thankfully maybe for us because we're going to have a job until we die, there's a lot of work to be done still to understand how the brain works.

 

Nonetheless, we're trying to interface with it. And I have to say, so I started out as just a card-carrying neuroscientist trying to understand these signals and how the sense of touch was represented in the nervous system. That was what I spent most of my time doing, all my time doing.

 

And I was approached by someone from Applied Physics Lab when I was still at Johns Hopkins University. And he was like, listen. We're trying to do this thing where we're trying to have these tetraplegic patients control a robotic hand with their brain. And we would like you to help us restore sensation by electrically stimulating the sensory parts of the brain.

 

And I was like, that sounds like the craziest idea I've ever heard. There's no way that's ever going to work. And then, they were like, well, we have some resources to bring to bear on the problem.

 

ASHLEY HAMER: That's researcher speak for we'll give you money.

 

SLIMAN BENSMAIA: And then I was like, OK, fine. We'll give it a shot. And I have to say, even though it seems almost impossible, given how little we know about how the brain works and given how relatively crude these technologies are, what we've been able to accomplish--

 

CODY GOUGH: Pretty amazing.

 

[LAUGHTER]

 

SLIMAN BENSMAIA: There's still a lot of work to be done. But I think we made some promising first steps.

 

CODY GOUGH: Yeah. I mean, occasionally I'll see on Reddit or some other news feed, it'll say something like, yeah, you're able to touch things or you're able to get this feedback. And there was a-- I believe it was a tetraplegic patient that met President Obama?

 

SLIMAN BENSMAIA: Yes, yes. So that was part of the team that I'm part of. It was part of this DARPA-funded project called Revolutionizing Prosthetics. So this was probably, as far as I know, the first human to not only be able to control a prosthetic hand by thought, but also to be able to feel objects through it. And President Obama went to visit him and gave him a fist bump, which I thought was kind of disappointing. He should have shaken his hand so he could actually feel the handshake. There was no sensor on the fist. But anyway, so that was the first attempt. Yeah.

 

CODY GOUGH: So you're disappointed in President Obama's fist bump?

 

SLIMAN BENSMAIA: It's cool. I understand it's his thing. It would have been cooler, from a scientific standpoint, had it been a--

 

CODY GOUGH: Of course.

 

SLIMAN BENSMAIA: --handshake.

 

CODY GOUGH: To actually demonstrate the touch.

 

SLIMAN BENSMAIA: Exactly.

 

CODY GOUGH: OK. So your focus right now, you personally-- or is your lab working on both kinds of--

 

SLIMAN BENSMAIA: Both kinds. Both kinds, yeah.

 

CODY GOUGH: OK.

 

SLIMAN BENSMAIA: So what we contribute to really both kinds is to say, again, what we're experts on is to understand when we touch an object, how does the nervous system respond? And how in that response is information conveyed? Right? So when you touch an object you know the size, the shape, the texture of the object. If it's moving across your skin, you have information about that too. And all these types of signals contribute to your ability to interact dexterously with it. And again, without these signals we would be unable to use our hands. We basically wouldn't have hands because they'd be useless.

 

So that's what we study in the lab, or what we spend most of our time studying: what those signals are. So then the step into prosthetics is to say, OK, how are we going to leverage what we've learned about how these signals are represented in your nervous system and my nervous system-- in our intact nervous systems-- to mimic these patterns of activation in these patients by electrically stimulating either their nerves, in the amputees, or their brains, in the tetraplegic patients.

 

CODY GOUGH: So this sounds like some of the most complicated science that there is. Walk me through, at a very high level, a day in the life at your lab.

 

[LAUGHTER]

 

What does an experiment look like? What kind of data are you collecting? And how are you analyzing it?

 

SLIMAN BENSMAIA: The scientific branch of the lab basically presents very carefully controlled stimuli to the hand. And just as an aside, what makes this particularly challenging, and why I emphasize the words carefully controlled, is that we're exquisitely sensitive. Our skin is exquisitely sensitive. So if you're trying to understand how the nervous system represents this touch information, you have to have exquisite control over how you stimulate the skin. So we spend a lot of time developing these robots that can stimulate the skin in very precise and very repeatable ways.

 

So for instance, if we want to study texture and how the nervous system encodes texture information, we have this robot that allows us to slide textures across the skin at very precise speeds and contact forces.

 

CODY GOUGH: So by precision, you don't just mean location? You also mean pressure.

 

SLIMAN BENSMAIA: Pressure, speed, every single way in which it interacts with the skin needs to be very carefully controlled.

 

CODY GOUGH: OK.

 

SLIMAN BENSMAIA: So we present these stimuli. We record the neural activity either in the nerve or at various stages of processing in the brain. And then, we try to understand: how is information about texture encoded in these neural signals? And that usually involves some math I won't bore you with.

 

But here's the outcome: if we do our job right, you can give me a pattern of neural activation, and I can tell you a lot about the thing that was touched. I can tell you what its texture was, how fast it was moving, what direction it was moving in, how much pressure was exerted on the skin, how intense it felt-- things about how it felt. How rough did it feel? How hard did it feel? If you give me two patterns of activation, I can tell you which one will feel rougher, which of these patterns of activation corresponds to the thing that was moving faster, things like that.
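
[Editor's note: to make the decoding idea concrete, here is a minimal, hypothetical sketch. It is not the Bensmaia Lab's actual analysis pipeline; the neuron counts, firing rates, and texture labels are invented, and synthetic Poisson spike counts stand in for real recordings. It simply shows how a pattern of neural activation can be turned into a guess about what was touched, using a standard linear classifier.]

```python
# Hypothetical sketch: decoding which texture was touched from neural spike counts.
# Synthetic data stands in for real recordings; this is NOT the lab's actual pipeline.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

n_neurons = 30                 # e.g., an array recording tens of neurons at a time
n_trials_per_texture = 200
textures = ["silk", "denim", "sandpaper"]   # made-up stimulus set

# Each texture drives each neuron at a different mean rate (spikes per trial).
mean_rates = rng.uniform(5, 50, size=(len(textures), n_neurons))

X, y = [], []
for label, rates in enumerate(mean_rates):
    # Poisson spike counts around each neuron's texture-specific mean rate.
    X.append(rng.poisson(rates, size=(n_trials_per_texture, n_neurons)))
    y.append(np.full(n_trials_per_texture, label))
X, y = np.vstack(X), np.concatenate(y)

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.25, random_state=0)

# "Decoding the neural code": predict the texture from the pattern of activation.
decoder = LogisticRegression(max_iter=1000).fit(X_train, y_train)
print("Decoding accuracy:", decoder.score(X_test, y_test))
```

[The same setup could just as well regress continuous properties, such as scanning speed or contact force, instead of classifying textures.]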

 

CODY GOUGH: And you can tell that just from the code essentially that you're seeing?

 

SLIMAN BENSMAIA: Yeah, we're decoding the neural code.

 

CODY GOUGH: And how do you even record those neural signals?

 

SLIMAN BENSMAIA: Well, we have a-- it depends on where we're recording from. So we have different types of electrodes. Sometimes we just have a single electrode that we're carefully just driving into the neural tissue. Sometimes we have arrays of electrodes that we stick in the brain and leave there. And so then, that allows us, if we're lucky, to record from tens of neurons at a time.

 

CODY GOUGH: Stick in the brain? What do you mean by that?

 

SLIMAN BENSMAIA: Like a surgical implantation of an array of electrodes.

 

CODY GOUGH: Wow. Does that hurt?

 

SLIMAN BENSMAIA: No. No, it doesn't.

 

CODY GOUGH: I didn't realize you could-- I mean--

 

SLIMAN BENSMAIA: It's really remarkable. You can take this thing. It looks like a little mini bed of nails. It's like 4 by 4 millimeters, 100 electrodes. You can punch that thing into the brain. And it really does not hurt the brain. And it allows you to record from it for years at a time.

 

ASHLEY HAMER: The brain doesn't actually have any pain receptors. The only reason you experience pain coming from what you think is your brain, as in headaches, is that the areas that process pain signals aren't very good at pinpointing the source. A lot of the nerve superhighways that lead to various parts of your body converge when they get to the brain, which makes it hard to know exactly where pain signals are coming from.

 

That can cause something called referred pain, a phenomenon that explains why you get brain freeze when the roof of your mouth gets too cold and why people feel arm pain when they're about to have a heart attack. So while you don't have pain receptors in your brain, you do have them in the layers of tissue that insulate your brain inside your skull, in the muscles you tense when you're stressed, and inside your sinuses. All of those pain receptors can send signals that your brain interprets as a headache.

 

SLIMAN BENSMAIA: So the outcome of these kinds of experiments are an understanding of what the neural code is, how this type of information is encoded. So for instance, how is it that if you close your eyes and I touch your index fingertip, you're going to know that your index fingertip was touched. Not your thumb, not your middle finger, your index. Somehow your brain can keep track of that. How does it do that, right?

 

And so, what is the essential aspect of the evoked neural activity that helps you track which parts of your body are touching things. Right? Do you know the answer to that question?

 

CODY GOUGH: I certainly do not.

 

[LAUGHTER]

 

SLIMAN BENSMAIA: I would have been impressed. So that's an easy one, actually. It turns out-- you've probably heard of the fact that we have the somatosensory homunculus. You have a whole body map in your brain. And so, there's a part of your brain that responds to the face. And then next to it, there's a part of the brain that responds to the hand. And next to that, there's a part of the brain that responds to the arm.

 

CODY GOUGH: When you say, responds you mean?

 

SLIMAN BENSMAIA: That means anytime you touch something with your face, something touches your face, this part of the brain is activated.

 

CODY GOUGH: It gives you the feedback.

 

SLIMAN BENSMAIA: Well, hold on. So you're getting ahead of yourself.

 

CODY GOUGH: Sorry.

 

SLIMAN BENSMAIA: Any time something touches your face, the face part of your brain is activated. Any time something touches your hand, the hand part of your brain is activated, which is right next to the face part. And right next to the hand part is the arm part. Right? And it turns out that if you electrically stimulate, you artificially activate, the hand part of the brain you get a sensation on your hand. It's like you feel like your hand is being touched. In fact, even more precise than that, if you activate the index fingertip part of your brain, the subject will feel a sensation on his or her index fingertip. And so, there's this map there of the body that helps the brain keep track of what parts of the body are interacting with other objects.

 

CODY GOUGH: You are doing an exceptionally good job of explaining how the brain encodes and transmits sensory information in a way that is actually really highly understandable.

 

SLIMAN BENSMAIA: Great.

 

CODY GOUGH: So I'm not trying to be jokey about this. But for that reason--

 

SLIMAN BENSMAIA: No jokes.

 

[LAUGHTER]

 

CODY GOUGH: For my next question, for that reason that you talked about with the fact that the brain has kind of a map--

 

SLIMAN BENSMAIA: Right.

 

CODY GOUGH: Is that why, and I'm guessing, scientists are working on prosthetics that resemble the human body as it is, instead of making, like, a hook, or a gun arm, or something like that, or other things?

 

SLIMAN BENSMAIA: Excellent question. I really love that question. So there are really two schools of thought on that point. One school of thought, that I do not at all espouse for reasons I will explain in a minute, is that the brain is a device that is built to learn new stuff. And so, it's learned to use hands. But if you gave it tentacles, it would learn to use the tentacles. And if you replace the hand with a tentacle, after like a week or a month, it'll learn to use that tentacle. I don't think that's true.

 

The other view is that evolution has shaped neural circuits, or the development of neural circuits through development. And in early childhood, up through puberty, these neural circuits are plastic enough to sort of tweak those hardwired circuits to adapt them to the environment.

 

CODY GOUGH: To the individual.

 

SLIMAN BENSMAIA: To the individual and to the environment.

 

CODY GOUGH: Big hand versus small hand kind of thing.

 

SLIMAN BENSMAIA: That's right. As the hands grow, as the body grows, as the body changes, this plasticity is there to sort of accommodate those changes. And then, eventually that plasticity decreases substantially. It doesn't disappear, of course, because we can learn and we can change still as adults. But what's left is an ability to learn certain types of things but not others.

 

And people have shown that you can really learn arbitrary things. Like you can learn to modulate your own brain in seemingly arbitrary ways but only to a certain degree. Like if you think about a hand and what a single hand or a single arm can do, it can do so many different things. It can move in dozens of different dimensions. And it receives thousands and thousands of different kinds of sensory signals that we instantly know how to interpret.

 

And the question is, and this is a matter of debate that I take one side of, whether in adulthood you could learn a completely new plant, as in a tentacle instead of an arm, and you could learn to move it in useful ways? And something that had the complexity of an arm, right? It would have to have the sort of complexity of an arm. If it's something easy like a cursor going up and down, sure, you can learn that. And whether you could learn to interpret the barrage of sensory signals you get back from this tentacle.

 

And I think the answer to that question is no. And no one has ever proven that you can learn to do anything more complex than a thing that's one or two dimensions like a cursor moving up and down, left and right.

 

CODY GOUGH: Sure.

 

SLIMAN BENSMAIA: But reasonable people can disagree. We actually wrote a paper on this that if you look at the literature to date on the ability of the human brain to create completely novel sensory representations, it seems more limited than people think.

 

CODY GOUGH: So it's different than learning a new language?

 

ASHLEY HAMER: Right. It's different than learning a new language. First of all, I want to point out that learning a new language is much more difficult in adulthood than it is in childhood. That kind of highlights the extra plasticity that we have as children that we seem to have less of as adults.

 

ASHLEY HAMER: Plasticity refers to the brain's ability to form new connections between neurons, the way it does every time you learn someone's name or make a mental note of where you parked your car. In early childhood, your brain goes through what are called critical periods, where it becomes more plastic or malleable to help you develop visual skills, emotional control, and language. At those points, the brain is primed and ready to take in new information, which is why young children can learn musical instruments and foreign languages so much faster than adults.

 

As you get older, your brain becomes less plastic and more stuck in its ways, making it harder to learn new things. Making new connections in a brain is a little bit like wiring a house for electricity. It's easy to put in a brand new electrical system when the house is being built. But it's a lot harder to replace the existing wiring later. But that loss of plasticity isn't necessarily a bad thing. The more stable your brain's connections are, the more efficient they are at processing the world around you. You can still learn new things as an adult. It just takes a little longer.

 

SLIMAN BENSMAIA: But a new language-- if you think about the dimensionality of a sound signal, it's basically one dimension. Right? You just have one stream of information to keep track of at any given time. Whereas if you think of a hand, there are dozens of streams of information. So for instance, here's a way I like to try to explain the difficulty of creating a new representation of even a hand, forget a tentacle. Because with a tentacle, we have zero intuition about how to start controlling it. I don't have any intuition.

 

CODY GOUGH: Sure.

 

SLIMAN BENSMAIA: With the hand, let's say I was going to just completely remap the sensory signals from your hand, so that any time your thumb touched something, you would feel like your pinky is moving. And every time your index finger moved, you would feel like something touched your thumb. And you just kind of scrambled all these sensations, so there was no correspondence between what you felt and what your hand was doing. Right? And you had to suddenly learn that from scratch. Could you learn that from scratch, so that you could now intuitively interpret what your hand is telling you? I don't think so, because there's dozens and dozens of things that your hands can do.

 

CODY GOUGH: Even in terms of how dexterous some people are, I used to play saxophone.

 

SLIMAN BENSMAIA: Cool.

 

CODY GOUGH: And even training my fingers to not respond together-- if I'm trying to put down just my index finger and my ring finger and not my middle finger-- even things like that are hard, even when we know how to control our bodies.

 

SLIMAN BENSMAIA: Right. And I would even submit that as hard as that is, that's on the motor side. And you're using your hand in ways that you've been using your hand. I mean, it's like a slight tweak on what you've been doing with your hand. It's like an elaboration of stuff that you've done already with your hand.

 

Whereas on the sensory side, it's even harder because that's not what sensory systems-- you're not supposed to relearn sensory stuff. The sensory system, I submit, is there to tell you what your body is doing. And it shouldn't be plastic. It shouldn't be kind of reinterpreting these signals all the time. It should just say, this is what your body is doing right now. Right? Like for instance, if you spend a lot of time feeling a specific type of texture, it's not like suddenly your brain should really become super fine tuned to this texture and not feel it, not be able to feel any other texture. You still want it to, if you feel something else, be able to sense that other thing. Anyways, I'm kind of getting lost in the weeds.

 

But the point is, I think the motor parts are somewhat-- you can still learn new things. You can learn new languages. You can learn new instruments. It's going to be slow going. But you can still kind of do it. Whereas, I don't think you can learn a completely new sensory representation of your hand.

 

CODY GOUGH: Yeah, that makes a lot of sense. I imagine it would be difficult because our brain doesn't continue to-- the neurons don't really continue to grow or something as we get older. Right? They kind of grow when we're small. And then--

 

SLIMAN BENSMAIA: Pretty much. That's right. That's exactly right. There's some structural stuff that happens in your nervous system during the so-called critical period, when you're very young, that just does not happen on that scale when you're an adult.

 

CODY GOUGH: Yeah.

 

ASHLEY HAMER: I've already told you about how adults' brains can't learn the same way as children's. But what about the brain cells themselves? You've probably heard that you can't grow new brain cells. You're born with a certain number of them. And once those cells die, you've got that much less brain to work with. Well, I'm happy to tell you that recent research finds that to be a myth.

 

Some types of brain cells never grow or change structure in adulthood. But many do. For example, in the hippocampus, the brain region responsible for learning, long-term memory, and emotion regulation, even elderly brains still produce about 700 neurons every day. That's not a lot when you compare it to the billions of neurons that make up your brain. But it's something. And studies suggest that you might be able to give that growth rate a boost with things like aerobic exercise, 3D video games, or even using your non-dominant hand to brush your teeth.

 

CODY GOUGH: You've talked a little bit about the limits of what we can do. We probably can't program a person's brain to have a bunch of tentacles or anything. But now, we're getting that feedback. You're finding ways to create feedback. So what barriers have been shattered in your time working on this? And what's next?

 

SLIMAN BENSMAIA: Right. So let me talk about the amputees and the tetraplegic patients separately, because I think there we're facing two different types of challenges. On the sensory side for amputees-- like, to restore the sense of touch to amputees-- there again, we're interfacing with the residual nerve, the nerve that used to innervate the hand. And I think it's fair to say that we have a pretty detailed understanding of how information is encoded in the nerve.

 

And we've recently developed a model that really pretty precisely, at least at a first approximation, can reconstruct the response of every single nerve fiber that innervates the palmar surface of the hand to any touch applied to the skin. So that tells you how the nerve responds when an intact hand touches something.

 

And so what happens, by the way, when you touch something is that almost all the nerve fibers, or a large number of nerve fibers, in your hand become activated. And each of them becomes activated in a slightly different way depending on where they're located in the hand and depending on what type it is-- there are different types of nerve fibers. Their patterns of activation are very idiosyncratic-- very predictable, but very idiosyncratic. But we can predict it. We can reconstruct it pretty accurately.

 

So if we could electrically stimulate each nerve fiber individually to produce the desired pattern of activation in each nerve fiber, we'd be able to restore the sense of touch pretty veridically-- create a very similar sensation of touch. So the challenge is to get there. The technology now is, if you're lucky, you have maybe up to 100 or 200 channels. You would need 10,000 channels to do that.

 

CODY GOUGH: Wow.

 

SLIMAN BENSMAIA: And that number, 100, 200, is very generous. So that's a technological challenge: how is it that we're going to be able to selectively activate each of these fibers in the desired way? And a lot of smart people are working on strategies to do that. And I think that there's a path forward there, a clear path forward. We're not going to get to 10,000 tomorrow. But if we could get to even 100 tomorrow-- like, 100 reliable channels that we can count on and stimulate very reliably-- that's going to be enough to have a pretty good prosthetic hand. I think that's pretty close to ready for primetime.
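
[Editor's note: as a purely illustrative sketch of the biomimetic idea described here (stimulate the residual nerve so it fires the way it would during natural touch), imagine a toy encoding rule in which each electrode channel covers one patch of the hand and its stimulation rate tracks the contact force a prosthetic sensor measures on that patch. The channel layout, gains, and rate limits below are invented for illustration and are not taken from any real device or from the lab's actual models.]

```python
# Toy illustration of biomimetic sensory feedback (all numbers invented):
# map sensed contact force on each skin region to a stimulation pulse rate on
# the hypothetical electrode channel that evokes sensation from that region.
from dataclasses import dataclass

@dataclass
class Channel:
    name: str        # which skin region this channel evokes sensation from
    gain: float      # pulses per second per newton of contact force (made up)
    max_rate: float  # safety ceiling on stimulation rate, in Hz (made up)

CHANNELS = [
    Channel("thumb tip", gain=60.0, max_rate=300.0),
    Channel("index tip", gain=60.0, max_rate=300.0),
    Channel("palm", gain=30.0, max_rate=200.0),
]

def encode(forces_newtons):
    """Convert per-region contact forces into per-channel stimulation rates (Hz)."""
    rates = []
    for ch, force in zip(CHANNELS, forces_newtons):
        rate = min(ch.gain * max(force, 0.0), ch.max_rate)
        rates.append((ch.name, round(rate, 1)))
    return rates

# Example: gripping a cup mostly with the thumb and index finger.
print(encode([1.5, 1.2, 0.2]))
# [('thumb tip', 90.0), ('index tip', 72.0), ('palm', 6.0)]
```

[A real interface would need far more channels and a far richer encoding model than this force-to-rate mapping, which is exactly the gap between 100-200 channels and the 10,000 mentioned above.]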

 

On the brain side, like I said earlier, the brain is this amazingly complex thing. And we've, I think, made a lot of progress in the last century or so in understanding it. But there's a lot more progress to be made. And so that path forward is not as clear.

 

On the other hand, when you think about tetraplegia and how devastating it is-- you're insensate and paralyzed from the neck down, so you really become completely dependent on others for almost everything-- if you can restore some independence through these anthropomorphic robotic hands or whatever device, that's a major step. So maybe the bar isn't that high, in that we don't need to be able to play Rachmaninoff with a prosthetic hand for it to be a clinically viable device.

 

And so the short-term horizon is unclear. But I think we have made enough progress now that the current state of these brain-controlled prostheses is to the point where even a modest improvement will be clinically significant and potentially make the device clinically viable.

 

CODY GOUGH: And that's what you're talking about with the Obama fist bump?

 

SLIMAN BENSMAIA: Yes. So that was one of the very first steps where you have a robotic hand that you can move by thought and get the sensory feedback from it.

 

CODY GOUGH: One more question, since we're talking about nerves and nerve endings.

 

SLIMAN BENSMAIA: Yes.

 

CODY GOUGH: Does this research and this line of thought apply to things like eyesight?

 

SLIMAN BENSMAIA: Absolutely. Yeah, yeah. Absolutely. So there are efforts throughout the world to restore vision through either retinal implants for people who have these peripheral neuropathies where the retina, the neural tissue in the eye, begins to malfunction so you can just electrically stimulate the neural tissue there. Or for folks who have more severe peripheral neuropathies, you can interface directly with the brain. So people are working on that.

 

I would submit though the challenges for artificial vision are even greater than the challenges for artificial touch. Because for artificial touch, it's a major improvement to know that you're touching something and which part of your hand you're touching it with. That's way better than nothing. With artificial vision, what is good enough? Is it good enough to know that there's a kind of a blotch of light over here? Or do you need to be able to read? And if you need to be able to read, that is a very, very high bar to achieve through a neural interface given the current state of neural interfaces.

 

CODY GOUGH: Yeah, that makes a lot of sense. But they're making some progress there.

 

SLIMAN BENSMAIA: Absolutely. In fact, there are several groups starting human trials with these visual neural prostheses, cortical visual neural prostheses, one of which is actually at the University of Chicago.

 

CODY GOUGH: If people want to learn more about your work and all the work in this field, what's a good newsletter they can subscribe to, a website they can go to, or YouTube video series they can watch, or any other resources you can think of?

 

SLIMAN BENSMAIA: I don't know. There's a lot of stuff online. I don't want to plug anyone in particular. I certainly don't want to plug myself. But if you put in anything like "brain-controlled robotic arms," there's going to be all kinds of different videos. I encourage you to go online and look at the videos. They're really impressive. These patients are controlling these robots that look really pretty-- like, movie, science-fictiony robotic hands. And they're very good at it.

 

CODY GOUGH: Is this a popular growing field of science?

 

SLIMAN BENSMAIA: I think so. And here's another thing that I don't think came up. But what's interesting about neuroprosthetics, in addition to the obvious, like, help that you can provide people who have sometimes debilitating diseases or injuries, is that it's really scientifically interesting. It's a way to test theories about how information is encoded in the nervous system.

 

Because for instance, if you can use a neural code to control an arm or use a neural code to create sensations in a predictable and systematic way, that means that you've understood something about how the brain works or how the nervous system works. And so, it's sort of science informs neuroprosthetics. And neuroprosthetics sort of informs science as well. So, it's a nice bidirectional sort of thing.

 

CODY GOUGH: The last question has to be, since you brought that up--

 

SLIMAN BENSMAIA: OK.

 

CODY GOUGH: Understanding and manipulating neural signals-- what's the potential for evil here?

 

SLIMAN BENSMAIA: Right. There's always potential for evil. But let me say one thing. Let me try to turn your question around and make it sort of a more fun, positive ending. Elon Musk, a couple-- two or three-- months ago, came out with this new company called Neuralink, which is really to try to commercialize this whole idea of brain-machine interfaces. And so, we are working strictly on rehabilitation. We're taking amputees. Or we're taking tetraplegic patients. And we're trying to make their lives better.

 

But other people are thinking about augmentation. How can we, thinking forward now-- how can we leverage these technologies to do, like, potentially really cool stuff? And so, if I take off my scientific hat and put on my let's-look-into-the-future hat, I can imagine a world in which we could flexibly and sort of bidirectionally interface with machines. Imagine you could interact with Wikipedia the way you can interact with your own memories. You could just flexibly think about stuff, bring the whole of human knowledge into your deliberations about whatever problem is at hand. That would make you a lot smarter than you are, whoever you are.

 

And so, I think there's some people who think, well, what is the potential of this? If we could control extracorporeal devices with the same flexibility that we can control our own bodies, that would have a potential that I think some people are going after. And that's not something that I'm particularly part of, but I go to meetings where people talk about that stuff. And I listen.

 

CODY GOUGH: Wow. Well, good. That's what our listeners do. They listen. Awesome. Well, we want to wrap up with a quick segment called the Curiosity Challenge.

 

SLIMAN BENSMAIA: OK. Uh-oh.

 

CODY GOUGH: And I'm going to ask you about something. Well, I'm kind of going to try to teach you something.

 

SLIMAN BENSMAIA: OK, cool.

 

CODY GOUGH: Actually, because you taught me so many things about everything.

 

SLIMAN BENSMAIA: All right.

 

CODY GOUGH: So when you see a brain scan in a textbook or in a scientific study, see the photo of it, do you ever wonder who the brain belongs to? Statistically, do you know who it probably belongs to? There's a brain that has graced the pages of more than 800 scientific papers. Any idea where it came from?

 

SLIMAN BENSMAIA: One brain?

 

CODY GOUGH: Mm-hm.

 

SLIMAN BENSMAIA: Is this like, you're talking about patient HM? Is it like the patient who was the amnesic patient?

 

CODY GOUGH: No, this is an average brain.

 

SLIMAN BENSMAIA: Oh, no. I have no idea.

 

CODY GOUGH: Not sure of the identity of the average brain? All right. It is called-- well, Colin Holmes. It's his brain.

 

SLIMAN BENSMAIA: OK.

 

CODY GOUGH: And Colin Holmes, he was a graduate student at the Montreal Neurological Institute. He didn't like the low quality of brain images. But at the time, an MRI lasted several hours. So it was hard to get a still photograph of a brain. So he figured, you might be able to approximate a really high resolution image if you combine multiple scans from a single live brain.

 

So in 1993, he took 27 scans of the brain. So he had to sit in an MRI for quite a long time. He aligned the scans to create one high resolution image, then coauthored a paper about it. And they call his brain Colin 27 or sometimes Average Colin. And yeah, he's been featured in more than 800 scientific papers. And many, many papers have used that image for kind of the average brain. So it's Colin Holmes.
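
[Editor's note: the trick behind "Colin 27" is essentially averaging. If the scans are aligned to one another, the anatomy adds up coherently while the random scanner noise cancels out, dropping roughly with the square root of the number of scans. Here is a minimal numpy sketch of that idea; the synthetic volumes and noise level are invented stand-ins for real, already-co-registered MRI data, not a description of the actual Colin 27 processing pipeline.]

```python
# Minimal sketch of the "Colin 27" idea: average many co-registered scans of the
# same brain so the anatomy reinforces itself and the random noise averages out.
# Synthetic 3D volumes stand in for real, already-aligned MRI scans.
import numpy as np

rng = np.random.default_rng(42)

true_anatomy = rng.uniform(0, 1, size=(64, 64, 64))   # stand-in for the real brain
n_scans = 27
scans = [true_anatomy + rng.normal(0, 0.5, true_anatomy.shape) for _ in range(n_scans)]

average = np.mean(scans, axis=0)   # the "Colin 27"-style high-quality template

def rms_error(volume):
    """Root-mean-square difference from the noise-free anatomy."""
    return np.sqrt(np.mean((volume - true_anatomy) ** 2))

print("single-scan error:   ", round(rms_error(scans[0]), 3))
print("27-scan average error:", round(rms_error(average), 3))  # ~1/sqrt(27) as large
```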

 

SLIMAN BENSMAIA: I did not know that. Thank you.

 

CODY GOUGH: Who knew what the average brain looked like?

 

SLIMAN BENSMAIA: I didn't, no. I mean, right. Maybe his brain is weird in some way and will always be considered to be some kind of paragon when, in fact, it shouldn't be.

 

[LAUGHTER]

 

(WHISPERING) Watch out for Colin.

 

[LAUGHTER]

 

(SPEAKING) No, cool. I did not know that.

 

CODY GOUGH: Yeah, good. And I believe I put in my email that I was going to ask you to give me a trivia question that has nothing to do with your field of expertise.

 

SLIMAN BENSMAIA: Here's a question. Why should you not chew gum and ride a motorcycle?

 

[LAUGHTER]

 

Does that qualify as a trivia question?

 

CODY GOUGH: I think it does. But I think the answer is actually pretty obvious. Wouldn't you swallow it accidentally?

 

SLIMAN BENSMAIA: No, that's not it.

 

CODY GOUGH: Wait, will the wind get caught in your mouth and create a bubble that blows inward?

 

[LAUGHTER]

 

SLIMAN BENSMAIA: That's not it either.

 

CODY GOUGH: Oh, all right.

 

SLIMAN BENSMAIA: I phrased it purposefully so that the answer would not be obvious. It's because when you're wearing a helmet, it sort of pushes your cheeks into your mouth. And so you're just chewing on your cheek. And so I have all kinds of wounds.

 

CODY GOUGH: So you came here and did this whole podcast having eaten the inside of your cheek?

 

SLIMAN BENSMAIA: Pretty much.

 

[LAUGHTER]

 

CODY GOUGH: Wow. Well, that was something I didn't know.

 

[LAUGHTER]

 

CODY GOUGH: So there you go. For any of you bikers out there, or anybody else who wears a helmet, keep that in mind. Well, again, thank you so much for joining me. I was here with Sliman Bensmaia-- Dr. Sliman Bensmaia.

 

CODY GOUGH: It's like, I would say, neuroscientist, principal investigator, and associate professor, which is what you said.

 

CODY GOUGH: An associate professor at the University of Chicago. Thanks for joining me.

 

SLIMAN BENSMAIA: Thanks for having me. This was fun.

 

[MUSIC PLAYING]

 

ASHLEY HAMER: We like to end every episode with an extra credit question. You should know the answer if you learn something new every day on curiosity.com. Here's your question. What makes someone a night owl or a morning person? We're not talking about age or gender but something else. Stick around for the answer in just a minute.

 

CODY GOUGH: Do you like surveys? Well, I've got some really good news for you if you do. We want to hear your thoughts on the curiosity podcast. So we created a super quick and easy survey. Please visit curiosity.com/survey and answer a few questions so we can make our podcast better. Again, that's curiosity.com/survey. It's quick and easy and will really help us bring you better content every week. There's a link in the show notes too. But one more time, that URL is curiosity.com/survey. We really appreciate the help.

 

ASHLEY HAMER: Explore history's surprising connections with a new podcast, The Thread with OZY. It's like a cross between revisionist history and six degrees of separation. You'll discover how various historical strands are woven together to create a historic figure, a big idea, or an unthinkable tragedy, like how John Lennon's murder was actually 63 years in the making. Witness how their stories hinge on the past and influence the future. The show is already a chart topper. Get The Thread with OZY. That's O-Z-Y on Apple Podcasts or wherever you listen.

 

CODY GOUGH: Have you ever been listening to the Curiosity Podcast and wanted to share a clip on Facebook or Twitter? Well, here's some super exciting news. Now you can, thanks to Gretta.com. That's G-R-E-T-T-A. You can stream our podcast on Gretta.com/curiosity. And their podcast player will follow along with a written transcript of each episode while you listen. When you hear a clip you want to share, just find it and click share. Gretta will build a video for you to share with your friends so that you can help spread the word about our podcast. Again, that's Gretta.com/curiosity. And drop us a line to let us know what you think of this super cool new service.

 

ASHLEY HAMER: Here's today's extra credit answer brought to you by curiosity.com. According to a study published in the journal Nature in 2016, whether you're a night owl or a morning person might come down to your genes. The study looked at the DNA of nearly 90,000 people who submitted their genetic material to one of those mail-in DNA analysis companies. And the researchers found 15 genetic patterns associated with being a morning person. So if you tend to wake up early, it could be because your genes are telling you to. We've got a link to learn more in the show notes or search for the word early in the Curiosity app for your Android or iOS device.

 

ASHLEY HAMER: Thank you for listening. We really appreciate your support. And we'll get you next time. For the Curiosity Podcast, I'm Ashley Hamer.

 

CODY GOUGH: And I'm Cody Gough.

 

[MUSIC PLAYING]