Q&A with Katherine Kuchenbecker

Mechanical engineer Katherine Kuchenbecker tells a story about one of the most compelling videos she’s ever seen about the sense of touch. In it, a woman is lighting a match—an everyday act that takes just a few seconds. But then researchers in a lab numb the woman’s hand and ask her to perform the simple task again. Over and over, Kuchenbecker says, the woman attempts to strike the match but fails to do so. Forty-five seconds later, she is finally able to complete the task.

“This is the most vivid demonstration I’ve ever seen of the importance of the sense of touch in everything we do,” says Kuchenbecker, the Skirkanich Assistant Professor of Innovation in the Department of Mechanical Engineering and Applied Mechanics in Penn’s School of Engineering and Applied Science. “The only way we change anything in the world is through moving and touching it, acting on it, unless you have a brain-computer interface.”

Kuchenbecker has focused her research on haptics, the field concerned with the interface between technology and the complicated human sense of touch. Just 32 years old, Kuchenbecker was recently named one of Popular Science magazine’s Brilliant 10, a group of young researchers selected for being “the sort of minds that improve our world.”

A prolific researcher who is also a member of the Bioengineering Graduate Group, Kuchenbecker is currently focused on fooling the human sense of touch through electromechanical sensors and actuators. Her projects range from the VerroTouch system, which provides tactile feedback for surgeons performing robotic surgery, to a tactile gaming vest, which delivers a quick tap to the wearer in place of a gunshot during the computer game “Half-Life 2.”

Kuchenbecker is also teaching two classes this fall—a freshman-level mechanics lab and a graduate-level course on haptic interfaces. She is effusive about teaching and Penn: “You need to try to put yourself in [students’] shoes and imagine what they already know about and what they care about, so the things I find to be most effective are bringing real, physical examples into the classroom so it’s more than just equations on the board—it’s connecting those equations, that calculus, those derivatives, with real physical things.”

The Current recently met with Kuchenbecker in her office to discuss her many research projects, what brought her to Penn, and the under-recognized sense of touch.

Q. What do you enjoy about teaching?
A.
You’re helping [students] fill their toolbox with new techniques, new ways of seeing the world, new frameworks, new methods of understanding. Sometimes it’s mathematical equations. Sometimes it’s a new sensing system or a new programming language. I feel like I have a great responsibility to the students who take my classes to try to show them how cool this stuff really is and why I love it. I think I can’t hide that I really enjoy the things that I do.
I like working with young people and trying to see things through their eyes. Sometimes, they’ll give a novel answer, the answer you weren’t expecting. I just love those moments. I love finding those instances where the students thought even beyond what I expected they would be able to figure out or solved a problem in a really novel way.

Q. What brought you here?
A.
When I came to Penn, it just resonated. The students I met here, and especially the other faculty, were just fantastic, and it was clear that they were doing really interesting work. I like that [SEAS is] a smaller school so you can actually know all of your colleagues, and there is a real center of excellence here in robotics, so it was really appealing to be able to be a part of that. I love being a mechanical engineer and being a part of the mechanical engineering faculty, but also being part of the GRASP Lab [General Robotics, Automation, Sensing and Perception] gives me a broader academic community. In the new robotics master’s program that we’re running, I’m getting students in my class who are not traditional mechanical engineers. I’m getting computer scientists, electrical engineers, systems engineers and bioengineers. I have always gotten the sense that Penn really embraces and celebrates interdisciplinary work.
Often we work on letting someone do something that they couldn’t do without assistance or without extension, so robotic surgery is a good example of what my group studies.

Q. What are you working on in the area of robotic surgery?
A.
A recent revolution in surgical technology has been the creation of robot-assisted, minimally invasive surgery systems. Instead of directly holding long, thin laparoscopic tools, the doctor holds a pair of hand controllers that they can easily maneuver. The robotic system measures those movements and makes the surgical tools inside the patient follow those motions in real time. There’s a commercial system [that does this]—actually my Ph.D. advisor at Stanford was one of the engineers who worked on it. The Intuitive da Vinci Surgical System has an extremely intuitive interface. It’s very natural to move [the tools] around, but unfortunately you cannot feel what the tools are touching. This is an example of a teleoperation system where there’s a nice interface to control movement, but humans are used to touch-based interactions being bidirectional. When I move this tool and it collides with something, I should be able to feel the response instead of [having] to rely on my eyes to see what’s happening.
The big mission behind what a lot of my lab does is to try to make these interfaces more natural and easy to use for the human operator. In the domain of robotic surgery, our goal is to try to restore the sense of touch to the operator of the da Vinci Surgical System. We call it the VerroTouch system.
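The teleoperation loop she describes (hand-controller motion measured and reproduced by the tools in real time) can be pictured with a minimal sketch. The Python below is purely illustrative and is not Intuitive’s or her lab’s software; the motion-scaling factor, function names, and overall structure are assumptions made only for the example.

```python
# Illustrative sketch only -- not the da Vinci's control software.
# Idea: measure the change in the surgeon's hand-controller position each
# cycle and command the tool to follow it, optionally scaled down.
import numpy as np

MOTION_SCALE = 0.3  # assumed: ~3 cm of hand motion -> ~1 cm of tool motion


def update_tool_target(tool_pos, prev_hand_pos, hand_pos, scale=MOTION_SCALE):
    """One cycle of the follow-the-hand loop: apply the scaled change in
    hand position to the tool's commanded position."""
    delta = np.asarray(hand_pos, dtype=float) - np.asarray(prev_hand_pos, dtype=float)
    return np.asarray(tool_pos, dtype=float) + scale * delta


# Example: the hand moves 10 mm along x; the tool target moves 3 mm.
tool = np.zeros(3)
tool = update_tool_target(tool, [0, 0, 0], [10.0, 0, 0])
print(tool)  # -> [3. 0. 0.]
```

Note that the loop runs in only one direction, from hand to tool, which is exactly the limitation she points out: nothing flows back to the hand when the tool touches something.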

Q. How does it work?
A.
There are many dimensions to the sense of touch—you can feel temperature, you can feel vibrations, you can feel pressure. There are actually different receptors in your skin that convey those different channels. The sense of touch also encompasses your body configuration, so you can close your eyes and touch your nose. So when we took on this challenge of adding touch feedback to the da Vinci, there were many things we could have chosen to restore. The thing that we ended up focusing on is high-frequency vibrations. As I take this tool and tap on something, at the start of contact there’s a little vibration wave. If we put a little vibration sensor on the tool, you can hear the tap, tap, tap, and you can hear the click, click, click, and if I drag it, there’s a little noise between the tool and the surface. Those signals are really important for a surgeon. They let the surgeon know if the interaction is progressing naturally or if something went wrong ... We sought to add these high-frequency vibrations into the da Vinci. We can take the commercially available robot and add vibration sensors near the tools plus vibration actuators near the surgeon’s hands and speakers near their ears. When we connect them together, the surgeon can all of a sudden feel and hear what the tools are touching, which they’ve never been able to do before.
We did a study this summer where we had 11 different surgeons come in and use our system. [They did] several manipulation tasks with and without the touch-based feedback, and with and without the auditory feedback of the tool vibrations. The most striking finding was that the surgeons showed a strong preference for having this vibration feedback. They said they thought it made them more aware of what the tools were touching.
[Collaborator and Assistant Professor of Surgery in the School of Medicine David Lee] and I believe that this additional feedback would probably be most beneficial for people who are first learning how to use the robot. ... During training, you’re trying to translate skills you’ve honed doing open surgery, where you can directly see and feel and hear contacts with the tissue, to this robotically mediated interaction, where you can see, but you can’t feel or hear what you’re doing. We think that adding [vibration] back in may make it easier for young surgeons, or surgeons new to the da Vinci, to learn how to use the robot to its fullest extent.
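For readers who want a concrete picture of the signal path Kuchenbecker describes (sense vibrations at the tool, keep only the high-frequency part, and replay it at the surgeon’s hands and ears), here is a minimal sketch. It is hypothetical and is not the VerroTouch code; the sample rate, filter cutoff, and function names are assumptions.

```python
# Illustrative sketch only -- not the VerroTouch implementation.
# Idea: keep the high-frequency vibration component of a tool-mounted
# accelerometer signal and replay it on a voice-coil actuator / speaker.
import numpy as np
from scipy.signal import butter, lfilter

FS = 5000      # assumed sample rate in Hz
CUTOFF = 100   # assumed cutoff: keep contact vibrations above ~100 Hz


def highpass(signal, fs=FS, cutoff=CUTOFF, order=2):
    """Remove slow motion components, keeping the sharp contact vibrations."""
    b, a = butter(order, cutoff / (fs / 2), btype="high")
    return lfilter(b, a, signal)


def feedback_drive(accel_block, gain=1.0):
    """Map filtered tool vibrations to a normalized actuator/speaker drive signal."""
    vib = highpass(accel_block)
    return np.clip(gain * vib, -1.0, 1.0)


# Example: a synthetic "tap" (decaying 250 Hz burst) riding on slow tool motion.
t = np.arange(0, 0.2, 1 / FS)
slow_motion = 0.5 * np.sin(2 * np.pi * 2 * t)
tap = np.exp(-40 * t) * np.sin(2 * np.pi * 250 * t)
drive = feedback_drive(slow_motion + tap)
print("peak actuator drive:", round(float(np.max(np.abs(drive))), 3))
```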

Q. You’re not necessarily interested in making commercial products.
A.
For anyone who is a technological innovator, it’s extremely exciting to think that the things you create might actually help someone someday. I try to steer my research to problems like that, where I can see a medium-term potential benefit to society or humanity.

Q. I’d be remiss if I didn’t ask about your lab’s tactile gaming vest. It’s not often that a research product catches on with the general public in such a big way. What was that experience like for you as a researcher?
A.
I think there are so many cool projects in our lab, I couldn’t have said which one would catch the imagination of the public. I think what we did right in that project, and what we have done right in a lot of other projects, is to pay very close attention to the quality of the haptic sensation that our system delivers. ... None of us knew what it was like to be shot—nor did we want to inflict pain on the user—but in that project, we wanted the user of a first-person shooter game to feel like something had collided with their skin from outside when they got shot in the game. We tried to design tactile actuators that gave you that impression, something that would arrive quickly. We made it adjustable so the user could pick the strength of feedback they wanted. ... It’s not hurting the person; it’s like a little hit, surprising you and coming from the correct direction, but we were careful to keep it pain-free.
The borderline between the real and the virtual gets blurry as soon as the game comes into your physical space and lets you feel it. I think that’s why that project got so much attention. It jumps the boundary of being something on the computer screen that I’m playing but that isn’t real. As soon as you feel the game physically hitting you, you jump out of your chair.
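The behavior she describes for the vest (a quick, adjustable, pain-free tap that arrives from the correct direction) can be sketched very roughly as picking the actuator nearest the incoming hit and firing it briefly. The code below is a hypothetical illustration, not the lab’s firmware; the actuator layout, pulse length, and function names are assumptions.

```python
# Illustrative sketch only -- not the Penn tactile gaming vest firmware.
# Idea: when the game reports a hit from some direction, fire the actuator
# closest to that direction with a brief, user-adjustable tap.

# Hypothetical actuator positions as angles around the torso (degrees).
ACTUATOR_ANGLES = {"front": 0, "right": 90, "back": 180, "left": 270}


def pick_actuator(hit_angle_deg):
    """Choose the actuator whose angle is closest to the incoming hit."""
    def angular_gap(a, b):
        d = abs(a - b) % 360
        return min(d, 360 - d)
    return min(ACTUATOR_ANGLES, key=lambda k: angular_gap(ACTUATOR_ANGLES[k], hit_angle_deg))


def tap_command(hit_angle_deg, user_strength=0.5, pulse_ms=40):
    """Return an (actuator, strength, duration) command for a quick, pain-free tap."""
    strength = max(0.0, min(1.0, user_strength))  # user-adjustable, capped
    return (pick_actuator(hit_angle_deg), strength, pulse_ms)


print(tap_command(200))  # hit from behind-left -> ('back', 0.5, 40)
```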

Q. Describe another one of your projects—the tablet computing project.
A.
We started with a Wacom tablet that has co-located graphics and a stylus so it can measure where you touch on the screen and how hard you’re pushing. But normally, when you touch the screen you can’t feel anything except the screen. So we have added voice coil actuators. When you drag across the picture of a texture on a screen, we can make it feel like you’re touching the real texture. I call this haptography. We take an instrumented tool and drag it across the surface we want to capture and we do a bunch of math to process those recorded signals and build models of them. Then, as you touch the surface on the computer screen, we measure how fast you’re moving and how hard you’re pushing. And then we use these voice coil actuators to shake the tool back and forth to fake you out, to give you the illusion that you’re touching the real surface. We ran a big study with this system this summer. ... We captured eight different textures and the median realism ratings were about a 5 or a 5 1/2 out of 7. Quite realistic.
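Her description of haptography amounts to a pipeline: record vibrations while dragging an instrumented tool over a real texture, fit a model to those recordings, then at playback measure the user’s speed and force and drive the voice-coil actuators accordingly. The sketch below is a toy stand-in for that last step; reducing a texture to a single frequency and a roughness gain is an assumption made only to show the speed-and-force dependence, and none of the names come from her lab’s code.

```python
# Illustrative sketch only -- a toy stand-in for data-driven texture models.
# Idea: vibration pitch scales with drag speed, amplitude with pressing force.
import math
import random

TEXTURES = {  # hypothetical captured-texture parameters
    "canvas": {"freq_per_mm": 4.0, "roughness": 0.6},
    "sandpaper": {"freq_per_mm": 9.0, "roughness": 1.0},
}


def vibration_sample(texture, speed_mm_s, force_n, t):
    """Return one actuator sample for the given texture, drag speed, and force."""
    p = TEXTURES[texture]
    freq_hz = p["freq_per_mm"] * speed_mm_s   # faster drag -> higher pitch
    amp = p["roughness"] * force_n            # push harder -> stronger buzz
    noise = 0.1 * (random.random() - 0.5)     # crude stochastic surface detail
    return amp * (math.sin(2 * math.pi * freq_hz * t) + noise)


# Example: stylus dragging at 50 mm/s with 1.5 N of force.
samples = [vibration_sample("sandpaper", 50.0, 1.5, n / 2000) for n in range(5)]
print([round(s, 3) for s in samples])
```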

Q. Your dad is a surgeon. Have you ever gotten inspiration from him for some of your biomedical ideas?
A.
I have interviewed him several times about surgery to try [to learn] how to speak the language of surgeons ... but I also try to go out of my way to make connections with other people here at Penn. About a month-and-a-half ago, I gave a talk at the neurosurgery grand rounds at HUP. One of the faculty there is starting to use the da Vinci robot to do brain surgery. He came over to my lab and gave a talk and brought one of his residents. They taught me and my students about the things that they are doing, and it turns out there’s a really exciting potential collaboration. We’re working on a prototype to take some of the things we’re doing in the da Vinci and turn them into surgical training tools.
Sometimes you meet collaborators like that and they describe a problem and you’re like, ‘That is a home run! I know how to do that and we can do that together.’ This interface between medicine and engineering is just so exciting; doctors are trying to do really audacious things. They’re always in need of new tools, new techniques, new training methods.

Q. It’s striking how important touch is.
A.
I think it’s really under-recognized. You know what it’s like to close your eyes. You understand the value of vision and it’s high. ... And audio, you know what it’s like not to be able to hear for a short while. But you don’t know what it’s like to turn off your sense of touch. It’s hard to appreciate the important role that touch plays in your everyday existence.