Teaching Robots to ‘Feel with Their Eyes’

An engineering Ph.D. student is leading a project to build a database of surfaces so that robots can better identify what objects are made of and how to handle them.

At first glance, Alex Burka, a Ph.D. student in Penn’s School of Engineering and Applied Science, looks like a Ghostbuster. He walks into the Penn Bookstore with a bulky orange backpack strapped to his back, holding a long, narrow instrument with various sensors attached.

The device looks exactly like the nuclear-accelerator backpack and particle-thrower attachment used in the movies to attack and contain ghosts. For this reason, Burka and his colleagues jokingly refer to it as a “Proton Pack.”

But Burka is not in the business of hunting ghosts. Instead, he’s leading a project designed to enable robots to “feel with their eyes.” Using the Proton Pack, he hopes to build a database of one thousand surfaces that will help robots identify objects, determine what they’re made of, and decide how best to handle them. The project is funded by the National Science Foundation as part of the National Robotics Initiative.

“We want to give robots common sense to be able to interact with the world like humans can,” Burka says. “To get that understanding for machines, we want a large dataset of materials and what they look and feel like. Then we can use some technologies emerging in AI, like deep learning, to take all that data and distill it down to models of the surface properties.”

Burka joined associate professor Katherine J. Kuchenbecker’s Penn Haptics Group, part of the GRASP Lab, three years ago. Building on several earlier studies from Kuchenbecker’s lab, including Heather Culbertson’s database of one hundred surfaces recorded with a stationary instrument, Burka worked with several students to develop a portable version of the instrument. His current intern, graduate student Myles Cai, helps with data collection; Penn students Siyao Hu, Stuart Helgeson, Abhinav Rajvanshi, Sarah Allen, and Shweta Krishnan have also contributed in the past.

The Proton Pack incorporates a variety of visual and haptic sensors: a camera, a depth sensor, force/torque sensors, a microphone, and several accelerometers. The researchers hope to take the Proton Pack to stores like Home Depot and IKEA, as well as museums like the Franklin Institute, to get a feel for different surfaces. They are also working with Material ConneXion, a company that maintains a physical library of different kinds of plastics and construction materials.

At the bookstore, Burka clips a pink sweatshirt to a table underneath tracking indicators, then taps and drags the handheld portion of the instrument across the material at a variety of speeds and pressures while the camera tracks his movements. To touch different surfaces, he switches among three end-effectors, which can be thought of as the device’s “fingertips.”

“The computer knows which one is connected,” Burka says. “When I click, ‘Start a new data set,’ like hitting play on a recorder, it begins recording images from the camera and force and accelerometer measurements. Then I do my various surface motions, like tapping, rubbing, and poking. When I hit stop, it asks me to give the surface a name so we can file it away. We also take human ratings of each surface, so I give my impression of how hard, smooth, sticky and cold the surface feels. That's part of what we'll be training our algorithms to identify.”
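
The article doesn’t publish the project’s data format, but the workflow Burka describes maps naturally onto one record per trial. Here is a minimal sketch, with all names, field choices, and the rating scale assumed purely for illustration:

```python
# A minimal sketch of how one recording trial might be organized, assuming
# hypothetical names (SurfaceRecording, the field names, the 1-5 scale);
# the project's actual data format is not described in the article.
from dataclasses import dataclass, field

@dataclass
class SurfaceRecording:
    name: str                 # label typed in when the trial is saved
    end_effector: str         # which of the three "fingertips" was attached
    camera_frames: list = field(default_factory=list)  # RGB images
    depth_frames: list = field(default_factory=list)   # depth-sensor frames
    force_torque: list = field(default_factory=list)   # force/torque samples
    acceleration: list = field(default_factory=list)   # accelerometer samples
    audio: list = field(default_factory=list)          # microphone samples
    ratings: dict = field(default_factory=dict)        # human impressions

# One trial at the bookstore might end up looking like this:
trial = SurfaceRecording(name="pink sweatshirt", end_effector="rigid tip")
trial.ratings = {"hard": 1, "smooth": 3, "sticky": 2, "cold": 2}  # 1-5 scale
```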

One of the project’s immediate goals is to take an image of a surface that hasn’t necessarily been seen before and predict properties such as how soft it will feel or how slippery it will be. Burka says they’ve been collaborating with Professor Trevor Darrell’s lab at UC Berkeley, specifically with students Yang Gao and Lisa Anne Hendricks, to figure out which machine learning techniques to use.
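
The article doesn’t say which techniques the Berkeley collaboration settled on, but a minimal sketch of one conventional route, assuming a standard convolutional backbone and the human ratings as regression targets, might look like this:

```python
# A hedged sketch of one standard approach: fine-tune an off-the-shelf image
# backbone to regress the four human-rated properties from a photo. The
# choice of ResNet-18, the 1-5 targets, and all hyperparameters are
# assumptions for illustration, not the collaboration's actual models.
import torch
import torch.nn as nn
from torchvision import models

model = models.resnet18()                      # generic CNN image backbone
model.fc = nn.Linear(model.fc.in_features, 4)  # outputs: hard, smooth, sticky, cold

loss_fn = nn.MSELoss()  # regression against the collected human ratings
optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)

# One illustrative training step on dummy data (a batch of 8 RGB images).
images = torch.randn(8, 3, 224, 224)
targets = torch.rand(8, 4) * 4 + 1             # fake ratings on a 1-5 scale
optimizer.zero_grad()
loss = loss_fn(model(images), targets)
loss.backward()
optimizer.step()
```

In a setup like this, the images come paired with the ratings and haptic measurements gathered by the Proton Pack at training time; at test time, only the image is needed.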

Burka recently presented the project as a demo and poster at the World Haptics Conference in Munich, and gave a presentation at ICRA in Singapore.

One future application for this research, he says, is self-driving cars.

“You would need to anticipate the surfaces you’re going to be driving on and the road ahead,” he says. “If you see an icy patch or a patch of sand, you want to be able to anticipate that it has properties different from the asphalt.”

Another application is designing helper robots for kitchen or housekeeping tasks, which often involve picking things up.

“Some objects are slippery and some of them are fragile, so you want to know about the material,” Burka says. “There’s a specific grip force that’s just enough to pick something up without slipping, but not so much that you’re going to crush it or waste energy.”
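
Burka’s point can be made concrete with a back-of-the-envelope friction model. Everything in the sketch below, the function name, the safety factor, the example numbers, is an illustrative assumption, but the friction coefficient is exactly the kind of surface property the database aims to supply:

```python
# A back-of-the-envelope Coulomb-friction model of the grip-force tradeoff,
# assuming an idealized two-finger pinch grip; the names and numbers here
# are illustrative, not from the article.
G = 9.81  # gravitational acceleration, m/s^2

def min_grip_force(mass_kg: float, mu: float, safety_factor: float = 1.5) -> float:
    """Minimum normal force per finger so friction supports the weight.

    Each finger presses with normal force N, contributing friction mu * N.
    Two contacts must carry the weight: 2 * mu * N >= m * g, so
    N >= m * g / (2 * mu). A safety factor adds margin without crushing.
    """
    return safety_factor * mass_kg * G / (2 * mu)

# e.g. a 0.3 kg glass with an assumed rubber-on-glass mu of about 0.4:
print(f"{min_grip_force(0.3, 0.4):.2f} N per finger")  # -> 5.52 N
```

With a friction coefficient predicted from vision, a robot could choose a force like this before it ever touches the object.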

Burka says that, for him, the most exciting part of the project is just putting everything together.

“It involves hardware design, real-time software for recording the data, and machine learning. It’s all the elements of robotics.”
