Visibility-Inspired Models of Touch Sensors for Navigation

For robotic navigation, sensors such as cameras and lidars are typically used. In nature, however, mammals and fish rely on the sense of touch to navigate. Inspired by this observation, a recent study investigates the utility of touch sensors mounted on a mobile robot for navigation.


Example of a touch sensor on a robotic manipulator. Image credit: UCLA Engineering via Flickr, CC BY 2.0

Researchers consider two types of touch sensors: rigid and compliant (the latter can compress or bend). Compliant touch sensors can be used for motion planning with collision-resilient robots, while simple bumper sensors provide more limited information.

Several virtual touch sensor models are introduced on a mobile robot. The researchers compare the sensors in terms of their preimages and develop mathematical models that are independent of any particular hardware realization. These models provide a useful characterization of task-relevant information that can increase the task success rate when deploying mobile robots in unstructured environments.
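The preimage comparison can be made concrete with a toy example. In the sketch below (the one-dimensional world, the sensor mappings, and the function names are illustrative assumptions, not the paper's formulation), a sensor is a mapping h from states to observations, and the preimage of an observation y is the set of states that produce it; coarser preimages mean the sensor conveys less information about the state:

```python
from collections import defaultdict

def preimages(states, h):
    """Group states by the observation they produce under sensor mapping h."""
    groups = defaultdict(set)
    for x in states:
        groups[h(x)].add(x)
    return dict(groups)

# Toy 1-D world: robot at integer position x in [0, 5]; a wall sits at x = 5.
states = range(6)

# Binary contact ("bumper") sensor: reads 1 only when touching the wall.
bumper = lambda x: int(x == 5)

# Idealized depth sensor: exact distance to the wall.
depth = lambda x: 5 - x

print(preimages(states, bumper))  # {0: {0, 1, 2, 3, 4}, 1: {5}}
print(preimages(states, depth))   # every observation pins down a single state
```

The bumper's preimage for observation 0 lumps five states together, while the depth sensor's preimages are all singletons; this kind of comparison is what makes preimages a natural yardstick for ranking sensors by the task-relevant information they provide.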

This paper introduces mathematical models of touch sensors for mobile robotics based on visibility. Serving a purpose similar to the pinhole camera model in computer vision, the introduced models are expected to provide a useful, idealized characterization of the task-relevant information that can be inferred from their outputs or observations. This allows direct comparisons with traditional depth sensors, highlighting cases in which touch sensing may be interchangeable with time-of-flight or vision sensors, and characterizing the unique advantages provided by touch sensing. The models include contact detection, compression, load bearing, and deflection. The results could serve as a basic building block for innovative touch sensor designs in mobile robot sensor fusion systems.
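As a rough illustration of how such idealized models can be stated, the sketch below expresses two of the listed models, contact detection and compression, as functions of a disk robot's clearance to point obstacles. All names, the disk-robot geometry, and the point-obstacle world are assumptions made for illustration, not the paper's definitions:

```python
from dataclasses import dataclass

@dataclass
class DiskRobot:
    x: float       # center position
    y: float
    radius: float  # radius of the (possibly compliant) circular shell

def clearance(robot, obstacles):
    """Signed distance from the robot's boundary to the nearest point obstacle."""
    return min(((robot.x - ox) ** 2 + (robot.y - oy) ** 2) ** 0.5
               for ox, oy in obstacles) - robot.radius

def contact_sensor(robot, obstacles):
    """Idealized binary contact detection: 1 iff the boundary touches an obstacle."""
    return int(clearance(robot, obstacles) <= 0.0)

def compression_sensor(robot, obstacles):
    """Idealized compression: how far a compliant shell is pressed in (0 if free)."""
    return max(0.0, -clearance(robot, obstacles))

obstacles = [(3.0, 0.0)]
print(contact_sensor(DiskRobot(0.0, 0.0, 1.0), obstacles))      # 0: clearance 2.0
print(compression_sensor(DiskRobot(2.5, 0.0, 1.0), obstacles))  # 0.5
```

The contact sensor returns a single bit, while the compression sensor additionally reports how deeply a compliant shell is deformed, which is one way to see why compliant sensors carry more information than simple bumpers.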

Research paper: Tiwari, K., Sakcak, B., Routray, P., Manivannan, M., and LaValle, S. M., “Visibility-Inspired Models of Touch Sensors for Navigation”, 2022. Link: