Abstract: Robotic hands are still a long way from matching the grasping and manipulation capability of their human counterparts. One way to advance this research is through computer modeling and simulation, which let us learn more about both human and robotic grasping. We have created a publicly available simulator called GraspIt! to serve this purpose. It can accommodate a wide variety of hand designs, evaluate the grasps formed by these hands, and perform full dynamic simulation of the grasping process. In this talk, we will describe a range of research on intelligent grasping that we have performed using the GraspIt! simulator.
The first project uses low-dimensional grasping subspaces to plan stable grasps. This reduced-dimensionality framework can serve as an interface between the human and automated components of a prosthetic system. We have built an online grasp planner using GraspIt! that allows a human user to perform dexterous grasping tasks with an artificial hand, even if the user has no direct control over finger posture. This method combines human and automated control for dexterous grasping; we will discuss the interplay between the two and present new directions for research in interactive grasping. The second project uses GraspIt! to pre-compute stable grasps on a wide variety of object models, building an indexed database that can be searched efficiently to find stable grasps for objects sensed in the environment.
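To make the low-dimensional subspace idea concrete, the sketch below shows one common way such a subspace can be obtained and used: example hand postures are reduced to a few basis directions via PCA, and a planner then searches over a handful of amplitudes instead of every joint angle. All dimensions, data, and function names here are illustrative assumptions, not the GraspIt! implementation itself.

```python
import numpy as np

def posture_from_amplitudes(mean, basis, amplitudes):
    """Map low-dimensional amplitudes back to full joint angles:
    p = mean + sum_i a_i * e_i, where e_i are the basis vectors."""
    return mean + basis.T @ amplitudes

# Illustrative data: 50 example postures of a hypothetical 10-DOF hand
# that actually vary along only 2 hidden directions.
rng = np.random.default_rng(0)
latent = rng.normal(size=(50, 2))      # hidden 2-D grasp parameters
mixing = rng.normal(size=(2, 10))      # map to 10 joint angles
postures = latent @ mixing             # 50 example hand postures

# PCA via SVD: the top singular vectors span the grasp subspace.
mean = postures.mean(axis=0)
_, _, vt = np.linalg.svd(postures - mean, full_matrices=False)
basis = vt[:2]                         # 2 basis vectors, each of length 10

# A planner (or a human operator) now controls just 2 amplitudes;
# the mapping recovers a full 10-joint posture.
p = posture_from_amplitudes(mean, basis, np.array([0.5, -0.3]))
```

Because the example postures lie in a two-dimensional subspace by construction, projecting any of them into the basis and back reconstructs them almost exactly; for real hand data the subspace is approximate, and the residual error is the price paid for the reduced search space.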