Visuotactile sensors provide high-resolution tactile information but cannot perceive the material properties of objects. We present UltraTac, an integrated sensor that combines visuotactile imaging with ultrasound sensing through a coaxial optoacoustic architecture. The design shares structural components and provides consistent sensing regions for both modalities. Additionally, we incorporate acoustic matching into the traditional visuotactile sensor structure, enabling the integration of the ultrasound sensing modality without compromising visuotactile performance. Through tactile feedback, we can dynamically adjust the operating state of the ultrasound module, enabling more flexible functional coordination.
Systematic experiments demonstrate three key capabilities: proximity detection in the 3–8 cm range (R² = 0.99), material classification (precision: 99.20%), and texture–material dual-mode object recognition (92.11% accuracy on a 15-class task). Finally, we integrate the sensor into a robotic manipulation system to concurrently detect container surface patterns and internal contents, verifying its promising potential for advanced human-machine interaction and precise robotic manipulation.
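The reported 3–8 cm proximity range with a linear fit (R² = 0.99) is consistent with a pulse-echo, time-of-flight style ultrasound measurement. As a minimal sketch (not the paper's implementation; the function name, the `echo_delay_s` input, and the room-temperature speed of sound are assumptions), echo delay converts to distance as follows:

```python
SPEED_OF_SOUND_AIR = 343.0  # m/s at ~20 °C (assumed ambient condition)

def tof_to_distance_cm(echo_delay_s: float) -> float:
    """Convert a round-trip ultrasound echo delay (seconds) to a
    one-way target distance in centimeters: d = c * t / 2."""
    return SPEED_OF_SOUND_AIR * echo_delay_s / 2.0 * 100.0

# A 0.35 ms round-trip delay corresponds to roughly 6 cm,
# near the middle of the sensor's reported working range.
print(round(tof_to_distance_cm(0.35e-3), 1))
```

In practice, a calibrated linear model over measured delays (rather than the nominal speed of sound) would account for transducer and matching-layer offsets, which is where a fit quality like R² = 0.99 would come from.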
UltraTac delivers dual-modal perception by combining surface texture information, captured by a camera imaging through an elastomer membrane, with material properties detected via ultrasound.
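One common way to exploit such dual-modal data is to concatenate per-modality feature vectors into a single descriptor before classification. This is only an illustrative sketch, not the paper's recognition pipeline; the feature dimensions and values are hypothetical:

```python
import numpy as np

def fuse_features(texture_feat: np.ndarray, material_feat: np.ndarray) -> np.ndarray:
    """Concatenate camera-derived texture features and ultrasound-derived
    material features into one descriptor for a downstream classifier."""
    return np.concatenate([texture_feat, material_feat])

# Hypothetical 4-D texture descriptor and 2-D ultrasound descriptor
fused = fuse_features(np.array([0.1, 0.4, 0.2, 0.3]),
                      np.array([0.9, 0.05]))
print(fused.shape)  # (6,)
```

A fused descriptor like this would then feed a standard classifier for the texture–material object-recognition task described above.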
The fabrication process follows an eight-step procedure, with parallel assembly for the upper and lower sensor components.
The pipeline is structured into three hierarchical levels: sensor, preprocessing, and processing.
The inspection process follows a structured timeline and consists of three phases: approach and grasp, touch, and take.
To Appear
@article{gong2025ultratac,
  title={UltraTac: Integrated Ultrasound-Augmented Visuotactile Sensor for Enhanced Robotic Perception},
  author={Gong, Junhao and Sou, Kit-Wa and Li, Shoujie and Guo, Changqing and Huang, Yan and Lyu, Chuqiao and Song, Ziwu and Ding, Wenbo},
  journal={arXiv preprint arXiv:2508.20982},
  year={2025}
}