Abstract: Despite the ubiquitous importance of tool-mediated interactions with physical objects in human life, we currently lack a formal method for analyzing and understanding how such interactions feel. Computer-controlled haptic interfaces allow users to touch virtual and remote environments through a hand-held tool, feeling forces and torques in response to tool movements. These systems have enabled new applications such as computer-aided sculpture and robot-assisted surgery, but their haptic renderings seldom feel like authentic re-creations of the richly textured and nuanced surfaces one encounters in the real world.
Aiming to create virtual and remote surfaces that feel indistinguishable from their real counterparts, I have blended insights on the human sense of touch and prior work in haptic modeling and rendering to envision the new approach of haptography. Analogous to photography in the visual domain, haptography first enables an individual to quickly record all aspects of the haptic interaction between a hand-held tool and a real surface. Haptographic capture tools are highly sensorized to enable recording of the forces, torques, and accelerations caused by the target interaction, plus the associated tool movements and grip force levels. Notably, these tools are wielded by a human haptographer rather than a robot to speed the capture process and ensure that the data accurately represent the motions and dynamics of a human hand. Custom signal processing techniques can then be used to distill these multi-dimensional recordings into a haptograph, a compact parametric model that embodies the perceptually relevant characteristics of the touched surface, including stiffness, roughness, and friction.
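One way the distillation step described above might look in practice is fitting a low-order autoregressive (AR) model to the recorded high-frequency acceleration signal, so that a few coefficients plus a residual variance stand in for the raw waveform. This is a purely illustrative sketch: the function name, model order, and the choice of an AR texture representation are assumptions, not the abstract's specified method.

```python
import numpy as np

def fit_ar_model(accel, order=5):
    """Fit an AR model  a[n] = sum_k c[k] * a[n-k] + e[n]  to a
    recorded acceleration trace via least squares. The coefficients
    and residual variance form one compact (hypothetical) haptograph
    entry describing surface texture."""
    # Lagged regression matrix: row i holds the `order` samples that
    # precede sample i + order, which they jointly predict.
    X = np.column_stack([accel[order - k - 1 : len(accel) - k - 1]
                         for k in range(order)])
    y = accel[order:]
    coeffs, *_ = np.linalg.lstsq(X, y, rcond=None)
    residual = y - X @ coeffs
    return coeffs, float(np.var(residual))

# Example: synthesize a noisy "texture" vibration and distill it.
rng = np.random.default_rng(0)
t = np.arange(5000) / 10_000.0                  # 10 kHz capture rate
accel = np.sin(2 * np.pi * 250 * t) + 0.1 * rng.standard_normal(t.size)
coeffs, noise_var = fit_ar_model(accel, order=5)
```

The appeal of a parametric fit like this is that the residual variance separates the stochastic part of the texture from its resonant structure, so a renderer can later regenerate perceptually similar vibrations of any duration.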
After the capture process is complete, haptography enables an individual to accurately reproduce the feel of a recorded interaction for others to interactively experience. The primary hypothesis of this work is that the feel of tool-mediated contact with real and virtual objects is directly governed by the high-frequency accelerations that occur during the interaction, as opposed to the low-frequency impedance of the contact. Thus, haptographic rendering tools are designed to move with a high-frequency acceleration profile that is precisely matched to the signals recorded from the corresponding real interaction. Because typical impedance-type haptic interfaces are optimized for the display of low-frequency forces, we attach a dedicated voice coil actuator to the handle of such a device and use a combination of feedforward and feedback control to render the desired tool accelerations at the hand of the user.
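The combined feedforward and feedback control mentioned above can be sketched in simulation. In this minimal illustration the voice coil and tool handle are idealized as a pure mass, the feedforward term inverts that nominal model, and a proportional feedback term corrects the measured acceleration error; all gains and the plant model are assumptions made for the sake of the example, not parameters of the actual device.

```python
import numpy as np

def render_acceleration(a_desired, mass=0.05, kp=0.3):
    """Drive an idealized voice coil (modeled as a pure mass) so its
    acceleration tracks a desired high-frequency profile, using
    feedforward model inversion plus proportional feedback."""
    a_measured = np.zeros_like(a_desired)
    a_prev = 0.0
    for n, a_ref in enumerate(a_desired):
        f_ff = mass * a_ref                 # feedforward: F = m * a_ref
        f_fb = kp * mass * (a_ref - a_prev) # feedback on measured error
        force = f_ff + f_fb
        a_prev = force / mass               # ideal mass plant response
        a_measured[n] = a_prev
    return a_measured

# Track a 300 Hz texture component at a 10 kHz control rate.
t = np.arange(0, 0.01, 1e-4)
a_des = np.sin(2 * np.pi * 300 * t)
a_out = render_acceleration(a_des)
```

On real hardware the feedforward term would invert an identified dynamic model of the actuator and handle rather than a pure mass, and the feedback signal would come from an accelerometer mounted near the user's grip.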
In addition to sharing my vision for haptography, this talk will cover my research group’s preliminary work on both capturing and recreating the rich feel of real surfaces. We anticipate exciting applications of this research in areas such as robot-assisted surgery, medical training, interactive museum exhibits, and perhaps even online shopping, all situations in which one might like to feel the fine details of contact with a virtual or remote object.