The RTD part of TACMAN has been divided into four work packages of incremental complexity:
Manipulation with one finger does not at all have a Zen-like character, as in Flanagan’s book “The Sound of One Hand Clapping”; rather, we refer to the tactile properties necessary for the grip and manipulation tasks in the higher work packages. Consequently, WP1 is mostly focused on human and tactile sensor data processing and representation. Tasks include tactile sensor “calibration”, slip detection, detection of material properties, and human stable grip. Note that, with one finger, stable grip can still be realised and studied when in contact with, e.g., a wall or a table, and it is with this that we intend to start.
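As a concrete illustration of the kind of slip detection WP1 targets, a common baseline is to flag incipient slip when the frame-to-frame change of the tactile array readings exceeds a threshold, since slip produces high-frequency vibration at the contact. The sketch below assumes this simple approach; the array shape, function name, and threshold are illustrative and do not describe the project's actual sensor interface.

```python
import numpy as np

def detect_slip(frames, threshold=0.5):
    """Flag incipient slip from a sequence of tactile-array frames.

    frames: array of shape (T, H, W) with pressure readings over time.
    Returns one boolean per transition: True where the mean absolute
    frame-to-frame change exceeds `threshold` (a vibration proxy).
    """
    diffs = np.abs(np.diff(frames, axis=0))  # (T-1, H, W) temporal changes
    activity = diffs.mean(axis=(1, 2))       # mean change per time step
    return activity > threshold
```

In practice the threshold would be set per sensor during the “calibration” task, and a band-pass filter on the signal is a common refinement.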
WP2 adds a second finger, and stable grip becomes more challenging; indeed, stable grip with two robotic fingers is, to date, not a completely solved problem, since the required sensing is usually not available and external disturbances (such as a moving arm) are often ignored. We will tackle this problem here and study how we can combine two tasks in one sensor representation. We will also address the challenging task of rolling an object between two fingers.
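For two-finger stable grip, a textbook starting point is the antipodal force-closure condition: a grasp with two frictional point contacts is force-closure exactly when the line joining the contacts lies inside both friction cones. The sketch below checks this geometric condition only; the function name, inputs, and friction coefficient are illustrative, and it deliberately ignores the sensing limitations and external disturbances that this work package is actually about.

```python
import numpy as np

def two_finger_force_closure(p1, n1, p2, n2, mu=0.5):
    """Check force closure for a two-finger frictional point grasp.

    p1, p2: contact points; n1, n2: inward-pointing unit contact normals;
    mu: Coulomb friction coefficient. The grasp is force-closure iff the
    line joining the contacts lies inside both friction cones, whose
    half-angle is arctan(mu).
    """
    d = np.asarray(p2, float) - np.asarray(p1, float)
    d /= np.linalg.norm(d)                    # unit vector from p1 to p2
    half_angle = np.arctan(mu)                # friction-cone half-angle
    ang1 = np.arccos(np.clip(np.dot(n1, d), -1.0, 1.0))   # n1 vs p1->p2
    ang2 = np.arccos(np.clip(np.dot(n2, -d), -1.0, 1.0))  # n2 vs p2->p1
    return bool(ang1 <= half_angle and ang2 <= half_angle)
```

A purely geometric test like this is exactly what breaks down once the arm moves or the contact normals are uncertain, which is why the work package couples the stability criterion to tactile sensing.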
WP3 extends to three- and four-finger grips, which adds a level of complexity; here the emphasis is on complex, learnable actions based on tactile data. The major focus is on regripping: to move an object within the hand, fingers must be repositioned while the stability criteria remain satisfied.
Finally, WP4 places the object in the sensorised hand: how do we move from a precision grip to a power grip, and how do we detect the state of the object in the hand? Although these tasks are highly challenging and we do not aim to completely solve in-hand manipulation within the short time frame of the project, we will establish the principles (within WP1, 2, and 3) and implement their use in a robotic setting in WP4.