We present ArrayTac, a closed-loop piezoelectric tactile display for multidimensional tactile interaction.
Movie S1. System-level demonstration of ArrayTac for multidimensional tactile rendering.
Figure 1. The ArrayTac system architecture, hardware stack, and control pipeline.
Human touch depends on the integration of shape, stiffness, and friction, yet existing tactile displays cannot render these cues together as continuous, high-fidelity signals for intuitive perception. Here, we present ArrayTac, a closed-loop piezoelectric tactile display that simultaneously and continuously renders these three dimensions on a 4 × 4 actuator array. Each unit integrates a three-stage micro-lever amplifier with end-effector Hall-effect feedback, enabling up to 5 mm displacement, >500 Hz array refresh, and 123 Hz closed-loop bandwidth. In psychophysical experiments, naive participants identified complex three-dimensional shapes and distinguished multiple stiffness and friction levels through touch alone, without training. We further demonstrate image-to-touch rendering from a single RGB image and remote palpation of a medical-grade breast tumor phantom over >1,000 km, in which all 11 naive participants correctly identified tumor number and type with sub-centimeter localization error.
We demonstrate cross-city remote palpation of a breast tumor phantom over more than 1,000 km. In this setup, a participant at the local site receives real-time tactile feedback through ArrayTac while teleoperating a remote robotic system, enabling identification of tumor locations and types with sub-centimeter localization error.
Cross-city remote palpation of a breast tumor phantom using ArrayTac.
This video introduces the overall system workflow, including tactile data acquisition, sliding-window interaction, the 4 × 4 tactile display array, the XYZ platform, actuator structure, and system-level information flow.
Movie S2. System overview and hardware architecture of ArrayTac.
This video explains how ArrayTac renders shape, stiffness, and friction, including Hall-sensor-based position sensing, closed-loop shape rendering, nonlinear stiffness control, Shore 00 calibration, and friction rendering through programmable vibrotactile modulation.
Movie S3. Rendering methods for shape, stiffness, and friction.
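The closed-loop shape rendering described above can be pictured as a per-actuator position loop on Hall-sensor feedback. The sketch below runs a simple PID controller against a toy first-order plant; the gains, the plant time constant, and all names are illustrative assumptions, not the paper's controller.

```python
# Minimal sketch of closed-loop displacement control for one actuator,
# assuming a PID law on Hall-sensor position feedback. Gains, the
# first-order plant model, and all names here are illustrative.

DT = 1.0 / 500.0  # control step, matching the >500 Hz refresh figure

class PID:
    def __init__(self, kp, ki, kd):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.integral = 0.0
        self.prev_err = 0.0

    def step(self, setpoint, measured):
        # Classic discrete PID: proportional + integral + derivative terms.
        err = setpoint - measured
        self.integral += err * DT
        deriv = (err - self.prev_err) / DT
        self.prev_err = err
        return self.kp * err + self.ki * self.integral + self.kd * deriv

def simulate(setpoint_mm=5.0, steps=2000):
    """Drive a toy first-order plant (tau = 20 ms) toward the setpoint."""
    pid = PID(kp=4.0, ki=10.0, kd=0.01)
    pos = 0.0   # simulated Hall-sensor reading, in mm
    tau = 0.02  # plant time constant (assumed)
    for _ in range(steps):
        u = pid.step(setpoint_mm, pos)
        pos += DT * (u - pos) / tau  # plant: tau * dpos/dt = u - pos
    return pos
```

With the integral term, the loop settles on the 5 mm setpoint (the full stroke quoted in the abstract) with no steady-state offset; in the real system the plant would be the micro-lever amplifier rather than this toy model.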
This video summarizes the perceptual experiments on shape, stiffness, and friction, and then presents the main downstream applications enabled by ArrayTac, including Tac-Anything, Tele-Touch, and usability-related results.
Movie S4. Experimental evaluation of perceptual performance and application frameworks enabled by ArrayTac.
Tac-Anything turns a single RGB image into tactile semantics by combining depth estimation, segmentation, and tactile-property inference. The resulting shape, stiffness, and friction layers are then rendered through ArrayTac for tactile scene understanding.
Figure 4. Tac-Anything framework, validation, and user performance.
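The image-to-touch mapping described above can be sketched end to end: a dense depth map drives the shape layer, and a per-pixel material segmentation is looked up in a property table to fill the stiffness and friction layers, downsampled to the 4 × 4 array. The material table, sampling scheme, and all names below are illustrative assumptions, not the paper's models.

```python
# Hedged sketch of an image-to-touch pipeline in the spirit of Tac-Anything.
# Depth estimation and segmentation outputs are taken as given inputs here;
# the material -> property table is hypothetical.

MATERIAL_TABLE = {
    "metal":  {"stiffness": 0.9, "friction": 0.3},
    "fabric": {"stiffness": 0.2, "friction": 0.7},
    "skin":   {"stiffness": 0.4, "friction": 0.5},
}

def render_layers(depth, labels, n_taxels=4):
    """Downsample dense depth/label maps to n_taxels x n_taxels layers.

    depth:  2D list of relative depths in [0, 1] (near = high).
    labels: 2D list of material names, same shape as depth.
    Returns (shape_mm, stiffness, friction), each n_taxels x n_taxels.
    """
    h, w = len(depth), len(depth[0])
    max_disp_mm = 5.0  # full actuator stroke quoted in the abstract
    shape = [[0.0] * n_taxels for _ in range(n_taxels)]
    stiff = [[0.0] * n_taxels for _ in range(n_taxels)]
    fric = [[0.0] * n_taxels for _ in range(n_taxels)]
    for i in range(n_taxels):
        for j in range(n_taxels):
            # Sample the pixel at the centre of each taxel's image patch.
            r = (i * h + h // 2) // n_taxels
            c = (j * w + w // 2) // n_taxels
            props = MATERIAL_TABLE[labels[r][c]]
            shape[i][j] = depth[r][c] * max_disp_mm
            stiff[i][j] = props["stiffness"]
            fric[i][j] = props["friction"]
    return shape, stiff, fric
```

Center-of-patch sampling is only one downsampling choice; per-patch averaging or max-pooling would be equally plausible for the shape layer.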
These figures summarize the hardware design of the actuator array and platform, together with the measured control performance that supports high-fidelity multidimensional tactile rendering.
Figure 7. Exploded views and schematics of the actuator array, stage, and electronics.