Artificial hands, even the most sophisticated prostheses, are still far inferior to human hands. What they lack are the tactile abilities crucial for dexterity. Other challenges include linking sensing to action within the robotic system and effectively linking it to the human user. Prof. Dr. Philipp Beckerle from FAU has joined with international colleagues to summarize the latest findings in this field of robotics and to establish an agenda for future research. Their piece in the research journal Science Robotics suggests a sensorimotor control framework for haptically enabled robotic hands, inspired by principles of the human central nervous system. Their aim is to link tactile sensing to movement in human-centred, haptically enabled artificial hands. According to the European and American team of researchers, this approach promises improved dexterity for humans controlling robotic hands.
Tactile sensing needs to play a bigger role
“Human manual dexterity relies critically on touch”, explains Prof. Dr. Philipp Beckerle, head of FAU’s Chair of Autonomous Systems and Mechatronics (ASM). “Humans with intact motor function but insensate fingertips can find it very difficult to grasp or manipulate things.” This, he says, indicates that tactile sensing is necessary for human dexterity. “Bioinspired design suggests that lessons from human haptics could enhance the currently limited dexterity of artificial hands. But robotic and prosthetic hands make little use of the many tactile sensors available today and are hence much less dexterous.”
Beckerle, a mechatronics engineer, has just had the paper “A hierarchical sensorimotor control framework for human-in-the-loop robotic hands” published in the research journal Science Robotics. In it, he and his international colleagues set out how advanced technologies now provide not only mechatronic and computational components for anthropomorphic limbs, but also sensing components. The scientists therefore suggest that such recently developed tactile sensing technologies could be incorporated into a general concept of “electronic skins”. “These include dense arrays of normal-force-sensing tactile elements in contrast to fingertips with a more comprehensive force perception”, the paper reads. “This would provide a directional force-distribution map over the entire sensing surface, and complex three-dimensional architectures, mimicking the mechanical properties and multimodal sensing of human fingertips.” Tactile sensing systems mounted on mechatronic limbs could therefore provide robotic systems with the complex representations needed to characterize, identify and manipulate objects, for example.
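To make the idea of a force-distribution map more concrete, the following minimal sketch shows how readings from a dense tactile array could be reduced to simple contact features such as total normal force, centre of pressure and net shear direction. The array shape, function names and feature choices are illustrative assumptions, not taken from the paper.

```python
import numpy as np

# Hypothetical sketch: one "electronic skin" patch read as a per-taxel
# 3-axis force map. Shapes and names are illustrative, not from the paper.
def summarize_force_map(force_map: np.ndarray) -> dict:
    """Reduce an (H, W, 3) force-distribution map to simple contact features.

    force_map[..., 0:2] are tangential (shear) components and
    force_map[..., 2] is the normal component, one 3-vector per tactile
    element (taxel).
    """
    normal = force_map[..., 2]
    total_normal = normal.sum()

    # Centre of pressure: normal-force-weighted taxel position.
    h, w = normal.shape
    ys, xs = np.mgrid[0:h, 0:w]
    weight = normal / total_normal if total_normal > 0 else np.zeros_like(normal)
    cop = (float((ys * weight).sum()), float((xs * weight).sum()))

    # Net shear over the patch, e.g. as a crude slip or load-direction cue.
    shear = force_map[..., :2].reshape(-1, 2).sum(axis=0)

    return {"total_normal_force": float(total_normal),
            "centre_of_pressure": cop,
            "net_shear": shear.tolist()}

# Example with a synthetic 16x16 taxel patch carrying a localized load.
patch = np.zeros((16, 16, 3))
patch[6:10, 6:10, 2] = 0.5   # normal load on a small contact region
patch[6:10, 6:10, 0] = 0.05  # slight shear in the x direction
print(summarize_force_map(patch))
```

Features like these are one plausible way such a map could feed the object characterization and manipulation mentioned above; richer representations are of course possible.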
Human principles as inspiration for future designs
To achieve haptically informed and dexterous machines, the researchers also propose taking inspiration from the principles of the hierarchically organised human central nervous system (CNS). The CNS controls which signals the brain receives from the tactile senses and which it sends back to the body. The authors propose a conceptual framework in which a bioinspired, touch-enabled robot shares control with the human, to a degree that the human sets. Principles of the framework include parallel processing of tasks, the integration of feedforward and feedback control, and a dynamic balance between subconscious and conscious processing. These could be applied not only in the design of bionic limbs but also in that of virtual avatars or remotely operated telerobots.
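The sketch below illustrates, in a very reduced form, what such shared control could look like: a reflex-like feedback loop corrects grip force from tactile slip cues, a human feedforward command sets the target, and a human-chosen sharing factor blends the two. All names, gains and signals are assumptions made for illustration; this is not the authors' algorithm.

```python
# Illustrative-only sketch of shared grip-force control: a fast, "subconscious"
# feedback term reacts to slip, a "conscious" human feedforward term sets the
# target, and an autonomy factor chosen by the human blends them.
def blended_grip_command(human_target: float,
                         measured_force: float,
                         slip_detected: bool,
                         autonomy: float,
                         kp: float = 0.8,
                         slip_boost: float = 0.3) -> float:
    """Return the next grip-force command.

    autonomy in [0, 1]: 0 = human feedforward only, 1 = autonomous feedback only.
    """
    # Reflex-like feedback term (would run at high rate, below awareness).
    feedback = measured_force + kp * (human_target - measured_force)
    if slip_detected:
        feedback += slip_boost  # quick corrective squeeze on incipient slip

    # Feedforward term: here simply the human's commanded target force.
    feedforward = human_target

    # Dynamic balance between conscious (human) and subconscious (machine) control.
    return (1.0 - autonomy) * feedforward + autonomy * feedback

# The human grants more autonomy, so the slip reflex dominates the blended command.
print(blended_grip_command(human_target=5.0, measured_force=4.2,
                           slip_detected=True, autonomy=0.7))
```

In a real system the feedback loop would run much faster than the human command channel; the single blended equation here only conveys the idea of a dynamic balance between the two levels.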
Yet another challenge is to effectively interface a human user with touch-enabled robotic hands. “Enhancing haptic robots with high-density tactile sensing can substantially improve their capabilities but raises questions about how best to transmit these signals to a human controller and how to navigate shared perception and action in human-machine systems”, the paper reads. It remains largely unclear how to manage agency and task assignment in order to maximize utility and user experience in human-in-the-loop systems. “Particularly challenging is how to exploit the varied and abundant tactile data generated by haptic devices. Yet, human principles provide inspiration for the future design of mechatronic systems that can function like humans, alongside humans, and even as replacement parts for humans.”
Philipp Beckerle’s Chair is part of both FAU’s Department of Electrical Engineering, Electronics and Information Technology and its Department of Artificial Intelligence in Biomedical Engineering. “Our mission at ASM is to research human-centric mechatronics and robotics and to strive for solutions that combine the desired performance with user-friendly interaction properties”, Beckerle explains. “Our focus is on wearable systems such as prostheses or exoskeletons, cognitive systems such as collaborative or humanoid robots, and generally on tasks involving close human-robot interaction. Human factors are crucial in such scenarios in order to meet the user’s needs and to achieve a synergetic interface as well as interaction between humans and machines.”
Apart from Prof. Dr. Beckerle, scientists from the Universities of Genoa, Pisa, Rome, Aalborg, Bangor and Pittsburgh as well as Imperial College London and the University of Southern California, Los Angeles contributed to the paper.
Friedrich-Alexander-Universität Erlangen-Nürnberg