Sensing is essential for autonomy in robotic applications. Our focus is on providing sensing to low-power systems so that they can cope with the high dynamics of the underlying hardware. A drawback of compact, low-power sensors is often their lower speed and accuracy, which makes them unsuitable for directly capturing and controlling highly dynamic motion. At the same time, the inherent instability of some systems (e.g. helicopters or quadrotors), their limited on-board resources and payload, their multi-DoF design, and the uncertain, dynamic environments they operate in present unique challenges both for robust low-level control and for implementing higher-level functions. We have developed tracking (AGAST) and localization (Z_inf) techniques that can be used for navigation on embedded systems. I will show their application on OMAP3 processors (BeagleBoard.org systems).
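AGAST, like FAST, is built on the accelerated segment test for corner detection: a pixel is a corner candidate if a long enough contiguous arc of pixels on a circle around it is uniformly brighter or darker than the center. As an illustration only, here is a minimal, unoptimized Python sketch of that test; the actual AGAST detector gains its speed from adaptively generated decision trees, which this sketch does not reproduce.

```python
import numpy as np

# Offsets approximating a Bresenham circle of radius 3 around the candidate
# pixel, as used by FAST/AGAST-style segment tests.
CIRCLE = [(0, 3), (1, 3), (2, 2), (3, 1), (3, 0), (3, -1), (2, -2), (1, -3),
          (0, -3), (-1, -3), (-2, -2), (-3, -1), (-3, 0), (-3, 1), (-2, 2), (-1, 3)]

def is_corner(img, y, x, t=20, n=9):
    """Segment test: (y, x) is a corner if at least `n` contiguous circle
    pixels are all brighter than center+t or all darker than center-t."""
    c = int(img[y, x])
    ring = np.array([int(img[y + dy, x + dx]) for dx, dy in CIRCLE])
    for sign in (1, -1):                      # brighter arc, then darker arc
        mask = sign * (ring - c) > t
        # Duplicate the ring so runs that wrap around the circle are counted.
        run, best = 0, 0
        for m in np.concatenate([mask, mask]):
            run = run + 1 if m else 0
            best = max(best, run)
        if best >= n:
            return True
    return False
```

A real detector would apply this test at every pixel (skipping a border of 3) and follow it with non-maximum suppression on a corner score.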
The perception capabilities of such sensors can be boosted by incorporating external data, either through sensor data fusion or by indexing into external databases. I will present an efficient 3D object recognition and pose estimation approach for grasping in cluttered and occluded environments. In contrast to common appearance-based approaches, it relies solely on 3D geometry. The method combines a robust geometric descriptor, a hashing technique, and an efficient, localized RANSAC-like sampling strategy.
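The abstract does not spell out the descriptor, so the following is only an illustrative sketch of the general pipeline: offline, pose-invariant descriptors of model point tuples are stored in a hash table; online, scene tuples are sampled RANSAC-style, matching model tuples are retrieved by hash lookup, and each resulting pose hypothesis is verified by an inlier count. For simplicity it uses quantized triangle edge lengths as a stand-in descriptor (the actual method's descriptor differs); all function names here are hypothetical.

```python
import numpy as np
from itertools import combinations, permutations
from collections import defaultdict

def tri_key(pts, res=0.05):
    """Quantized, sorted edge lengths of a point triple: pose-invariant key."""
    d = [np.linalg.norm(pts[a] - pts[b]) for a, b in ((0, 1), (0, 2), (1, 2))]
    return tuple(sorted(int(x / res) for x in d))

def build_table(model, res=0.05):
    """Offline: hash every model triangle by its key (fine for small models)."""
    table = defaultdict(list)
    for idx in combinations(range(len(model)), 3):
        table[tri_key(model[list(idx)], res)].append(idx)
    return table

def kabsch(A, B):
    """Least-squares rigid transform (R, t) with R @ A[i] + t ≈ B[i]."""
    ca, cb = A.mean(0), B.mean(0)
    U, _, Vt = np.linalg.svd((A - ca).T @ (B - cb))
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])
    R = Vt.T @ D @ U.T
    return R, cb - R @ ca

def recognize(scene, model, table, iters=100, res=0.05, tol=0.01, seed=0):
    """Online: sample scene triples, retrieve matching model triangles from
    the hash table, and keep the pose hypothesis with the most inliers."""
    rng = np.random.default_rng(seed)
    best_score, best_pose = -1, None
    for _ in range(iters):
        S = scene[rng.choice(len(scene), 3, replace=False)]
        for idx in table.get(tri_key(S, res), []):
            for perm in permutations(idx):        # resolve vertex ordering
                R, t = kabsch(model[list(perm)], S)
                proj = model @ R.T + t
                # Inliers: model points landing near some scene point.
                d = np.linalg.norm(proj[:, None, :] - scene[None, :, :], axis=2)
                score = int((d.min(axis=1) < tol).sum())
                if score > best_score:
                    best_score, best_pose = score, (R, t)
    return best_pose, best_score
```

The hash lookup is what makes the sampling "localized": instead of pairing random scene and model points blindly, each scene sample only generates hypotheses from model tuples with a matching descriptor, which drastically cuts the number of poses that need verification.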