The Intelligence of Touch addresses touch in robots and humans in all its aspects. How do they process and interpret tactile information? How do they react to, and actively explore, the world based on touch? How do they extract information from the experience of touching and being touched? And how can touch inspire the design of new robots, sensors, and intelligence?
Integrating Multisensory Information for Modeling Human Dexterous Bimanual Manipulation Skills, Kunpeng Yao, Bernardo Fichera, Anaïs Haget, Ilaria Lauzana, Aude Billard, Learning Algorithms and Systems Laboratory (LASA), École polytechnique fédérale de Lausanne (EPFL), Lausanne, Switzerland.
We will demonstrate the experimental setup that we have developed to measure human bimanual dexterity in watchmaking tasks.
The DESC Glove, Francesco Clemente, Marco Controzzi, and Christian Cipriani, The BioRobotics Institute | Scuola Superiore Sant’Anna Pisa, Italy.
Non-Invasive, Temporally Discrete Feedback of Object Contact and Release Improves Myoelectric Hand Controllability
Shape-based Object Classification through Touch-based Continuum Manipulation, Huitan Mao, Junius Santoso, Cagdas Onal, and Jing Xiao, Department of Computer Science, University of North Carolina at Charlotte and Robotics Engineering Program, Worcester Polytechnic Institute.
In this video, we present an approach to shape-based object classification using a new form of continuum manipulator that we developed, which consists of origami-based modules and is equipped with sparse tactile sensors.
Online myocontrol of combined actions via Tactile myography, Mathilde Connan, Claudio Castellini, Risto Kõiva, Robert Haschke, DLR, Robotics and Mechatronics Institute and CITEC, Bielefeld University.
In this demo, we show how tactile myography can be used online to control combined actions of the wrist and hand using plain linear ridge regression.
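The mapping described above can be sketched in a few lines: ridge regression fits a linear map from tactile-bracelet readings to wrist/hand activation values, which can then be evaluated online with a single matrix product per new sample. This is a minimal illustration with synthetic data; the array shapes, regularization strength, and variable names are assumptions, not the authors' implementation.

```python
import numpy as np

rng = np.random.default_rng(0)
n_samples, n_taxels, n_dofs = 200, 64, 3  # hypothetical sensor/DOF sizes

# Synthetic training data: taxel pressures X and target activations Y.
X = rng.normal(size=(n_samples, n_taxels))
W_true = rng.normal(size=(n_taxels, n_dofs))
Y = X @ W_true + 0.01 * rng.normal(size=(n_samples, n_dofs))

# Closed-form ridge solution: W = (X^T X + lam I)^{-1} X^T Y
lam = 1.0  # regularization strength (illustrative value)
W = np.linalg.solve(X.T @ X + lam * np.eye(n_taxels), X.T @ Y)

# Online use: one matrix product maps a new tactile frame to DOF commands.
pred = X @ W
```

The closed-form solve is cheap at this scale, which is what makes per-frame online control straightforward once `W` is trained.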
Catch Me If You Can: Slip Detection with a Multi-fingered Tactile Hand, Jasper James & Nathan Lepora, University of Bristol & Bristol Robotics Laboratory.
We demonstrate a newly modified GRAB Lab Model O fitted with TacTip optical biomimetic tactile sensors suitable for slip detection.
3D-printed biomimetic optical tactile sensors and robots, Nathan Lepora, University of Bristol and Bristol Robotics Laboratory.
A demo of a range of 3D-printed biomimetic tactile robots, including tactile fingertips, tactile hands, and a tactile whisker array, all based on the TacTip from Bristol Robotics Laboratory.
Slip detection, grasp force adaptation and stiffness classification with tactile sensors on an iLimb hand prosthesis, Gereon Büscher, Akshat Tandon, Risto Kõiva, and Robert Haschke, CITEC, Bielefeld University.
Using tactile sensors developed within the TACT-HAND project, the iLimb hand prosthesis can detect incipient slippage and adjust its grasping force accordingly to stably hold an object. By gently squeezing an object and observing the corresponding changes in tactile response, an iLimb hand equipped with two tactile sensors per finger can predict the stiffness of the held object.
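The detect-and-adapt loop described above can be sketched as a simple threshold on the frame-to-frame change of a tactile signal: a sudden drop flags incipient slip, and the controller responds by raising the commanded grip force. This is an illustrative toy, not the TACT-HAND controller; the signal trace, threshold, and gain are assumptions.

```python
import numpy as np

def detect_slip(signal, threshold=0.05):
    """Flag samples where the sample-to-sample drop exceeds the threshold."""
    return np.diff(signal) < -threshold

# Simulated normal-force trace: stable hold, then slip starting at sample 50.
force = np.concatenate([
    np.full(50, 1.0),            # stable grasp
    np.linspace(1.0, 0.2, 10),   # rapid drop: object slipping
    np.full(40, 0.2),            # settled at lower force
])

slips = detect_slip(force)
grip_cmd = 1.0
if slips.any():
    grip_cmd *= 1.5  # simple reactive force increase on detected slip
```

In practice, slip detectors use richer features (vibration spectra, shear-field changes) than a raw derivative, but the structure of the loop, detect then tighten, is the same.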
Versatile gripping system with camera-based tactile sensors, Nicolas Alt, Clemens Schuwerk, Stefan Lochbrunner, Eckehard Steinbach, RoVi Robot Vision project, Technical University of Munich.
We demonstrate a versatile gripping system with a grasp controller based on tactile feedback, integrating cameras and jaws with tactile sensors. The latter use the concept of camera-based sensing, extracting tactile data from a camera image of a passive, deformable element mounted on the jaws.
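The camera-based sensing concept above, inferring contact from the deformation of a passive element observed by a camera, can be sketched by tracking marker positions between a reference frame and the current frame. The marker coordinates, displacement threshold, and function names below are illustrative assumptions, not the RoVi implementation.

```python
import numpy as np

def marker_displacements(ref_pts, cur_pts):
    """Per-marker displacement vectors between reference and current frame."""
    return cur_pts - ref_pts

# Marker pixel positions on the deformable jaw element.
ref = np.array([[10.0, 10.0], [20.0, 10.0], [30.0, 10.0]])  # undeformed
cur = np.array([[10.0, 10.0], [20.5, 11.2], [30.0, 10.0]])  # under contact

disp = marker_displacements(ref, cur)
magnitudes = np.linalg.norm(disp, axis=1)
contact = magnitudes.max() > 0.5  # contact if any marker moved > 0.5 px
```

The displacement field carries more than a contact flag: its spatial pattern approximates the local deformation of the element, which a grasp controller can use as tactile feedback.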
A Multimodal Embedded Sensor System for Scalable Robotic and Prosthetic Fingers, Pascal Weiner and Tamim Asfour, Institute for Anthropomatics and Robotics, High Performance Humanoid Technologies, Karlsruhe Institute of Technology.
In this demonstration, we present the design of a scalable, low-cost robotic finger with a soft fingertip and sensors for position, temperature, and normal and shear forces. All cables and sensors are completely enclosed inside the finger, ensuring an anthropometric appearance. As the finger is meant to be integrated into our prosthetic hand, it is modelled on the 50th percentile of a male little finger and is easily adapted to other dimensions in terms of size and sensor configuration.