ArrayTac: A Closed-loop Piezoelectric Tactile Platform for Continuous and Intuitive Multidimensional Rendering

TL;DR

We present ArrayTac, a closed-loop piezoelectric tactile display for multidimensional tactile interaction.

  • ArrayTac simultaneously renders three tactile dimensions on the same interface: shape, stiffness, and friction.
  • We propose Tele-Touch and demonstrate cross-city remote palpation over more than 1,000 km.
  • We propose Tac-Anything, which extracts tactile semantics from a single RGB image and renders them through ArrayTac.

Movie S1. System-level demonstration of ArrayTac for multidimensional tactile rendering.

Abstract

ArrayTac overview figure

Figure 1. The ArrayTac system architecture, hardware stack, and control pipeline.

Human touch depends on the integration of shape, stiffness, and friction, yet existing tactile displays cannot render these cues together as continuous, high-fidelity signals for intuitive perception. Here, we present ArrayTac, a closed-loop piezoelectric tactile display that simultaneously and continuously renders these three dimensions on a 4 x 4 actuator array. Each unit integrates a three-stage micro-lever amplifier with end-effector Hall-effect feedback, enabling up to 5 mm displacement, >500 Hz array refresh, and 123 Hz closed-loop bandwidth. In psychophysical experiments, naive participants identified complex three-dimensional shapes and distinguished multiple stiffness and friction levels through touch alone, without training. We further demonstrate image-to-touch rendering from a single RGB image and remote palpation of a medical-grade breast tumor phantom over >1,000 km, in which all 11 naive participants correctly identified tumor number and type with sub-centimeter localization error.

Cross-city Remote Palpation over More Than 1,000 km

We demonstrate cross-city remote palpation of a breast tumor phantom over more than 1,000 km. In this setup, a participant at the local site receives real-time tactile feedback through ArrayTac while teleoperating a remote robotic system, enabling identification of tumor locations and types with sub-centimeter localization error.
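The Tele-Touch round trip described above requires streaming tactile frames from the remote robot back to the local ArrayTac display. The page does not specify the transport or frame format, so the sketch below is an illustrative assumption: a minimal serializer for one 4 x 4 tactile frame with a sequence number for ordering over a lossy link.

```python
# Hedged sketch of one piece of a Tele-Touch-style link: serializing a
# 4 x 4 tactile frame from the remote site. The JSON wire format and
# the `seq` field are assumptions for illustration, not the authors'
# actual protocol.
import json

def pack_tactile_frame(pressures_4x4, seq):
    """Serialize one 4x4 tactile frame (e.g. contact pressures) with a
    sequence number so the local side can detect reordering or loss."""
    return json.dumps({"seq": seq, "frame": pressures_4x4}).encode()

def unpack_tactile_frame(payload):
    """Recover the sequence number and 4x4 frame at the local site."""
    msg = json.loads(payload.decode())
    return msg["seq"], msg["frame"]
```

In a real deployment the frame rate and latency budget over >1,000 km would dominate the design; the serializer above only fixes the payload shape.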

Cross-city remote palpation of a breast tumor phantom using ArrayTac.

System Overview Video

This video introduces the overall system workflow, including tactile data acquisition, sliding-window interaction, the 4 x 4 tactile display array, the XYZ platform, actuator structure, and system-level information flow.

Movie S2. System overview and hardware architecture of ArrayTac.

Rendering Methods Video

This video explains how ArrayTac renders shape, stiffness, and friction, including Hall-sensor-based position sensing, closed-loop shape rendering, nonlinear stiffness control, Shore 00 calibration, and friction rendering through programmable vibrotactile modulation.

Movie S3. Rendering methods for shape, stiffness, and friction.
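The closed-loop shape rendering and vibrotactile friction rendering described above can be sketched as a per-actuator control step: a feedback update driven by the Hall-sensed displacement, plus a programmable vibration superimposed for friction. This is a minimal sketch, not the authors' controller; the PI structure, the gains, and the friction waveform are all illustrative assumptions.

```python
# Hedged sketch (assumed PI structure, illustrative gains): one
# closed-loop position update per actuator, using the Hall-sensed
# end-effector displacement as feedback, at the ~500 Hz array rate.
import math

def pi_position_step(target_mm, hall_mm, integ, kp=2.0, ki=0.5, dt=1 / 500):
    """One PI update: drive command from the Hall-sensed displacement error.

    Returns the drive command and the updated integrator state.
    """
    err = target_mm - hall_mm          # shape-rendering error
    integ += err * dt                  # integrator state
    drive = kp * err + ki * integ
    return drive, integ

def friction_overlay(t, amplitude=0.2, freq_hz=120.0):
    """Assumed friction cue: programmable vibrotactile modulation
    superimposed on the shape command (waveform/frequency illustrative)."""
    return amplitude * math.sin(2 * math.pi * freq_hz * t)
```

Stiffness rendering would enter by modulating how `target_mm` yields under measured fingertip load (the nonlinear stiffness control mentioned above), which this sketch does not model.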

Experiments and Applications Video

This video summarizes the perceptual experiments on shape, stiffness, and friction, and then presents the main downstream applications enabled by ArrayTac, including Tac-Anything, Tele-Touch, and usability-related results.

Movie S4. Experimental evaluation of perceptual performance and application frameworks enabled by ArrayTac.

Tac-Anything Pipeline

Tac-Anything turns a single RGB image into tactile semantics by combining depth estimation, segmentation, and tactile-property inference. The resulting shape, stiffness, and friction layers are then rendered through ArrayTac for tactile scene understanding.
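The three-stage flow above can be sketched as a simple pipeline. The stage functions below are placeholders for the actual models (the page does not name them), so their outputs and the uniform per-region property values are purely illustrative assumptions.

```python
# Hedged sketch of the Tac-Anything flow: RGB image -> depth (shape),
# segmentation, and tactile-property inference -> three rendering
# layers. Each stage is a stand-in for an unspecified model.

def estimate_depth(image):
    """Placeholder monocular depth estimate (inverted intensity)."""
    return [[1.0 - px for px in row] for row in image]

def segment(image):
    """Placeholder per-pixel object mask (simple threshold)."""
    return [[1 if px > 0.5 else 0 for px in row] for row in image]

def infer_properties(mask):
    """Placeholder tactile-property inference for the masked region."""
    return {"stiffness": 0.6, "friction": 0.3}

def tac_anything(image):
    """Single RGB image -> shape, stiffness, and friction layers."""
    depth = estimate_depth(image)              # shape layer
    mask = segment(image)
    props = infer_properties(mask)
    stiffness = [[props["stiffness"] * m for m in row] for row in mask]
    friction = [[props["friction"] * m for m in row] for row in mask]
    return {"shape": depth, "stiffness": stiffness, "friction": friction}
```

The key design point is that the three layers share one spatial grid, so ArrayTac can render them on the same 4 x 4 array without re-registration.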

Tac-Anything framework

Figure 4. Tac-Anything framework, validation, and user performance.

Hardware and Control

These figures summarize the hardware design of the actuator array and platform, together with the measured control performance that supports high-fidelity multidimensional tactile rendering.

System design and schematics

Figure 7. Exploded views and schematics of the actuator array, stage, and electronics.