A new system allows robots to feel human touch without artificial skin

Even the most capable robots are not very good at detecting human touch; you typically need a computer science degree, or at least a tablet, to interact with them efficiently. That could change thanks to robots that can now feel and interpret touch without being covered in high-tech artificial skin. It is a significant step toward robots that can interact more intuitively with humans.

To understand the new approach, led by the German Aerospace Center and recently published in Science Robotics, consider the two distinct ways our own bodies sense touch. If you hold your left palm facing up and lightly press on your left pinky, you may first recognize this touch through the skin of your fingertip. That makes sense: you have thousands of touch receptors in your hands and fingers alone. Roboticists often try to replicate this sensor coverage on machines with artificial skins, but these can be expensive and poor at withstanding impacts or harsh environments.

If you press harder, however, you may notice a second way of feeling touch: through your knuckles and other joints. This sensitivity, a sensation of torque in robotics jargon, is exactly what the researchers recreated in their new system.

The system's robotic arm contains six sensors, each of which can register even tiny amounts of force applied anywhere along the device. By measuring the magnitude and direction of that force precisely, a series of algorithms can map where a person is touching the robot and work out what they are trying to communicate. For example, a person could draw letters or numbers anywhere on the surface of the robotic arm with a finger, and the robot could interpret those movements as directions. Any part of the arm's surface can also act as a virtual button.
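The paper's actual pipeline is more sophisticated than this (six torque sensors distributed along a full arm, plus learned models), but the basic statics behind joint-based touch sensing can be illustrated with a toy case. The sketch below, a minimal example assuming a planar two-link arm and a single press perpendicular to the forearm link, recovers the force magnitude and contact location from just the two joint torques; the function name, geometry, and numbers are illustrative assumptions, not taken from the paper.

```python
import math

def locate_contact_2link(tau1, tau2, q2, l1):
    """Toy single-contact localisation for a planar two-link arm.

    Assumes the external force acts perpendicular to link 2 (a common
    simplification). tau1, tau2 are the externally induced joint torques
    in N*m, q2 is the elbow angle in rad, l1 the upper-link length in m.

    Statics for this geometry:
        tau2 = F * d                    # moment about the elbow
        tau1 = F * (d + l1 * cos(q2))   # moment about the shoulder
    so  F = (tau1 - tau2) / (l1 * cos(q2))  and  d = tau2 / F.
    Returns (force magnitude in N, contact distance from the elbow in m).
    """
    denom = l1 * math.cos(q2)
    if abs(denom) < 1e-9:
        raise ValueError("near-singular elbow angle; cannot separate F and d")
    force = (tau1 - tau2) / denom
    if abs(force) < 1e-9:
        raise ValueError("no measurable contact force")
    distance = tau2 / force
    return force, distance

# Hypothetical reading: a light 2 N press 0.3 m past the elbow, with the
# elbow bent 30 degrees and a 0.5 m upper link, yields tau2 = 0.6 N*m and
# tau1 = 0.6 + 2 * 0.5 * cos(30 deg) ~= 1.466 N*m.
f, d = locate_contact_2link(tau1=1.466, tau2=0.6, q2=math.radians(30), l1=0.5)
print(f"force ~ {f:.2f} N at {d:.2f} m from the elbow")
```

Tracking such estimated contact points over time is what would turn individual presses into strokes, and strokes into letters or commands, though the paper's own mapping and interpretation steps go well beyond this two-equation example.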

This means that every square centimeter of the robot essentially becomes a touchscreen, but without the cost, fragility and wiring of one, says Maged Iskandar, a scientist at the German Aerospace Center and lead author of the study.

“Human-robot interaction, in which a person can closely interact with and command a machine, is still not ideal, because the human needs an input device,” says Iskandar. “If you can use the robot itself as a device, interactions will be more fluid.”
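To make the "robot itself as an input device" idea concrete, here is a minimal conceptual sketch of treating patches of the arm's surface as virtual buttons. It assumes an upstream localisation step (like the toy example above) has already produced contact locations as (link, distance-along-link) pairs; all class names, patch boundaries, and handlers here are hypothetical, not the study's interface.

```python
from dataclasses import dataclass

@dataclass
class VirtualButton:
    """A hypothetical 'button' defined as a patch of the arm's surface."""
    link: int     # which link of the arm the patch sits on
    start: float  # patch start, metres from the link's joint
    end: float    # patch end

    def contains(self, contact):
        link, dist = contact
        return link == self.link and self.start <= dist <= self.end

def dispatch(contact, buttons, handlers):
    """Route a detected contact to the handler of whichever patch it hit."""
    for name, button in buttons.items():
        if button.contains(contact):
            handlers[name]()
            return name
    return None

# Example: two patches on the forearm link act as 'stop' and 'resume' buttons.
buttons = {
    "stop":   VirtualButton(link=2, start=0.00, end=0.15),
    "resume": VirtualButton(link=2, start=0.20, end=0.35),
}
handlers = {
    "stop":   lambda: print("halting motion"),
    "resume": lambda: print("resuming task"),
}
dispatch((2, 0.08), buttons, handlers)  # -> halting motion
```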

A system like this could offer a cheaper, simpler way to give robots a sense of touch, and with it a new way to communicate with them. That could matter especially for larger models, like humanoids, which continue to attract billions of dollars in venture capital investment.

Calogero Maria Oddo, a roboticist who leads the Neuro-Robotic Touch Laboratory at the BioRobotics Institute and was not involved in the work, says the development is significant because of how the research combines sensors, an elegant use of mathematics to map touch, and new AI methods to integrate all these elements. Oddo says commercial adoption could be fairly quick, since the investment required falls more on the software than on the hardware, which is much more expensive.

However, there are caveats. For one, the new system cannot handle more than two points of contact at a time. In a highly controlled environment, such as a factory floor, that may not be a problem, but in settings where interactions between humans and robots are less predictable, it could prove limiting. And the kinds of sensors needed to convey touch to a robot, while commercially available, can still cost tens of thousands of dollars.

Overall, though, Oddo envisions a future in which skin-based sensors and joint-based sensors are integrated to give robots a more comprehensive sense of touch.

“We humans and other animals integrate the two solutions,” he says. “I hope that robots working in the real world will also use both to interact safely and fluidly with the world and learn.”

(Source: MIT Technology Review)