Abstract

Transferring manipulation methods between touch sensors via cross-modal prediction: we deploy an object manipulation method designed for one touch sensor, Soft Bubble, on a robot equipped with a different sensor, GelSlim. To do this, a cross-modal diffusion model translates the signal from one sensor into that of the other, i.e., it predicts what the object would have felt like had it been sensed with Soft Bubble rather than GelSlim. We then apply a simple, off-the-shelf manipulation method to the estimated tactile signal.
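
To make this pipeline concrete, the sketch below translates a GelSlim image into an estimated Soft Bubble signal and hands it to an unmodified downstream estimator. It is a minimal sketch: the names translator, pose_estimator, and transfer_and_estimate are hypothetical stand-ins, not the released code.

import torch

@torch.no_grad()
def transfer_and_estimate(gelslim_image: torch.Tensor,
                          translator: torch.nn.Module,
                          pose_estimator):
    # Cross-modal prediction: estimate what the contact would have
    # "felt like" on a Soft Bubble sensor.
    soft_bubble_estimate = translator(gelslim_image)
    # The off-the-shelf Soft Bubble method runs unmodified; it never
    # sees the GelSlim signal directly.
    return pose_estimator(soft_bubble_estimate)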

Robotic tactile perception is an important enabler of precise and robust manipulation in uncertain, cluttered, or vision-limited environments. However, there is no standardized tactile sensor for robotic applications, so algorithms for the same task must often be re-developed for whichever sensor is available. To address this challenge, we propose tactile cross-modal transfer: a method of rendering a tactile signature collected by one sensor (“the source”) as though it were perceived by another sensor (“the target”). This method enables us to deploy algorithms designed for the target sensor, seamlessly and without modification, on signatures collected by the source. We implement this idea using several generative models for cross-modal transfer between the popular GelSlim 3.0 and Soft Bubble tactile sensors. As a downstream task, we test the method on in-hand object pose estimation: an estimator designed for Soft Bubble signals is run using GelSlim images as input. This task demonstrates the transferability of tactile sensing as a building block toward more complex tasks.
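
As one hedged example of how a diffusion-based variant of this transfer could work, the sketch below runs standard DDPM ancestral sampling with a denoiser conditioned on the source (GelSlim) image. The denoiser signature, image shape, and linear noise schedule are illustrative assumptions, not details taken from the paper.

import torch

@torch.no_grad()
def sample_target_image(denoiser, gelslim_cond,
                        steps=1000, shape=(1, 3, 128, 128)):
    # Illustrative linear noise schedule (an assumption, not the paper's).
    betas = torch.linspace(1e-4, 0.02, steps)
    alphas = 1.0 - betas
    alpha_bars = torch.cumprod(alphas, dim=0)

    x = torch.randn(shape)  # start from pure Gaussian noise
    for t in reversed(range(steps)):
        t_batch = torch.full((shape[0],), t, dtype=torch.long)
        # Noise estimate conditioned on the source-sensor image.
        eps = denoiser(x, t_batch, gelslim_cond)
        # Standard DDPM posterior mean.
        mean = (x - betas[t] / torch.sqrt(1 - alpha_bars[t]) * eps) \
               / torch.sqrt(alphas[t])
        noise = torch.randn_like(x) if t > 0 else torch.zeros_like(x)
        x = mean + torch.sqrt(betas[t]) * noise
    return x  # estimated Soft Bubble tactile image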

Method

Experiments

Pen Insertion
Tool 1 Insertion
Tool 2 Insertion
Tool 3 Insertion

Authors

Video

Code