Objectives

Recent years of European robotics research funding have seen a sustained effort in the development of human-like hands and actuators, often centred on grasping tasks in household or industrial settings. In parallel, many projects have investigated and indeed developed skin-like sensors that mimic human tactile sensitivity in one way or another, usually with the goal of sensorising fingers. Most researchers would agree (and not just to secure more funding to build more hardware) that the resulting hands and sensors are not perfect, nor anywhere close to the human hand with its superb tactile sensitivity, on which many of these developments are modelled. Nonetheless, the number of interesting approaches to both robotic hands and tactile sensors is greater than ever before, and we argue that a state of mechatronic availability has been reached with which unique grasping and manipulation results can be achieved.

To clarify the terminology we use to describe human hand use, we distinguish three stages. In the first stage, the hand and arm move towards the object, the hand orientation is prepared, the hand is preshaped, and finger and arm stiffness are set for grasping the object. This stage usually requires knowledge of an object model (e.g., the weight of the object and its intended use) and does not involve any somatosensory feedback.
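To make the feedforward character of this first stage concrete, the minimal sketch below selects a preshape and stiffness from prior object knowledge alone. The object model fields, thresholds, and stiffness values are purely illustrative assumptions, not part of the TACMAN design.

```python
# Sketch of the feedforward first stage: a hand preshape and stiffness are
# chosen from prior object knowledge only, with no somatosensory feedback.
# All field names and numeric values are illustrative assumptions.
from dataclasses import dataclass

@dataclass
class ObjectModel:
    weight_kg: float      # expected object weight
    intended_use: str     # e.g. "pour", "hand_over"

def select_preshape(model: ObjectModel) -> dict:
    """Pick a hand preshape and joint stiffness from the object model alone."""
    # Heavier objects get a wider power-grasp preshape and stiffer fingers;
    # light objects intended for fine use get a precision preshape.
    if model.weight_kg > 0.5 or model.intended_use == "hand_over":
        return {"preshape": "power", "finger_stiffness": 0.8, "arm_stiffness": 0.9}
    return {"preshape": "precision", "finger_stiffness": 0.4, "arm_stiffness": 0.6}

if __name__ == "__main__":
    print(select_preshape(ObjectModel(weight_kg=0.2, intended_use="pour")))
```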

The transition to the next stage is crucial: once initial contact is made, updated models of the object's inertial properties and of grasp stability can be built from skin data. Shortly after that, the object is taken into the hand and lifted until a stable grip is reached. In the grip stage, the foremost task is to hold the object stably in the hand, taking its intended use into account, and to withstand perturbations due to movement, gravity, and so on. The sensors involved are mostly tactile and proprioceptive, with an arguable influence of vision in some situations.
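As an illustration of how tactile data can serve grip stability, the following sketch adjusts the commanded grip force so that the measured tangential load stays inside an estimated friction cone. The sensor interface, friction coefficient, and margins are assumptions made for the example, not the project's controller.

```python
# Sketch of a grip-stage adjustment: keep the tangential load reconstructed
# from the tactile array inside an estimated friction cone, with some margin,
# despite perturbations from movement and gravity. Values are illustrative.
def update_grip_force(tangential_force: float,
                      mu_estimate: float = 0.6,
                      safety_margin: float = 1.2,
                      min_force: float = 0.5) -> float:
    """Return a new grip (normal) force command in newtons.

    tangential_force: tangential load estimated from tactile data (assumed
    available); mu_estimate: estimated friction coefficient; safety_margin:
    how far inside the friction cone to stay.
    """
    # The grip is stable while tangential <= mu * normal; command enough
    # normal force to keep that inequality satisfied with a margin.
    required = safety_margin * tangential_force / mu_estimate
    return max(required, min_force)

# Example: a 1 N tangential load (e.g. from gravity while lifting) with
# mu = 0.6 asks for at least 2 N of grip force.
print(update_grip_force(tangential_force=1.0))
```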

In the third stage, the object is used in the hand. Here, a tactile sensory-driven feedback loop is closed which (a) ensures a stable grip at all times, and (b) lets the fingers move the object in the hand towards an object-dependent goal. We call this last stage manipulation, and it is the focus of the TACMAN project, i.e., the control of finger movement based on tactile inputs from the fingers while maintaining a stable grip.
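The sketch below illustrates one possible shape of such a loop: a per-step update that first checks contact from tactile readings and only then moves the object towards its goal. The function names, thresholds, and the simple proportional law stand in for whatever policy is eventually learned; none of them reflects the actual TACMAN implementation.

```python
# Sketch of a tactile-driven manipulation loop. The sensor/actuator calls in
# the example are hypothetical placeholders, not an existing API.
import numpy as np

def manipulation_step(tactile: np.ndarray,
                      object_pose: np.ndarray,
                      goal_pose: np.ndarray,
                      gain: float = 0.1,
                      contact_threshold: float = 0.2) -> np.ndarray:
    """One iteration of the in-hand manipulation loop.

    tactile:      per-finger contact pressures (arbitrary units)
    object_pose:  current in-hand object pose estimate
    goal_pose:    object-dependent target pose
    Returns a small fingertip motion command.
    """
    # (a) stable grip: if any contact pressure drops below the threshold,
    # stop moving and let the grip controller re-establish contact first.
    if np.any(tactile < contact_threshold):
        return np.zeros_like(goal_pose)
    # (b) object motion: take a small step that reduces the pose error
    # (a proportional law standing in for a learned policy).
    return gain * (goal_pose - object_pose)

# Loop skeleton (read_tactile, estimate_pose, command_fingers would come
# from the robot middleware and are hypothetical here):
# while not done:
#     cmd = manipulation_step(read_tactile(), estimate_pose(), goal_pose)
#     command_fingers(cmd)
```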

We focus on the development of new algorithms combining the two fields of tactile sensory data processing and finger control, and will, of course, work with correspondingly complex hands and sensors. Our project does not set out to develop better hands or superior sensors; instead, we aim to solve robotic manipulation with hardware developed in previous European projects. In effect, we are testing the hypothesis that currently available hardware is sufficient to achieve robotic dexterity when combined with biomimetic strategies for information processing.

TACMAN is a highly focused project, set out to bring robotic hand use to its next stage. It involves partners focussing on tactile sensors (IIT, TUM), robotic hands (IIT), machine learning (TUD, TUM), and human sensing physiology (UMU). Central to our developments are machine learning approaches, which are required to relate noisy, high-dimensional tactile sensory data to the physical events arising from object and finger movement. Our results will be made available to the iCub community and the wider robotics community in the form of an open-source TACMAN software package.
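To indicate the kind of learning problem this entails, the sketch below trains a standard classifier to separate stable contact from incipient slip on synthetic tactile frames. The array size, features, and data are invented for illustration only and do not represent TACMAN results.

```python
# Sketch of mapping noisy, high-dimensional tactile frames to a discrete
# physical event ("stable contact" vs. "incipient slip") on synthetic data.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n_frames, n_taxels = 500, 64          # assumed 64-taxel fingertip array

# Synthetic stand-in for tactile frames: slip frames show larger variation
# across taxels than stable-contact frames.
stable = rng.normal(1.0, 0.05, size=(n_frames, n_taxels))
slip = rng.normal(1.0, 0.30, size=(n_frames, n_taxels))
X = np.vstack([stable, slip])
y = np.array([0] * n_frames + [1] * n_frames)   # 0 = stable, 1 = slip

# Simple per-frame features: mean and standard deviation over the array.
features = np.column_stack([X.mean(axis=1), X.std(axis=1)])

clf = LogisticRegression().fit(features, y)
print("training accuracy:", clf.score(features, y))
```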
