Abstract: Reliable motion estimation on resource-limited platforms is a crucial task for many applications. While insects solve this problem in an exemplary manner, mobile robots still require bulky computation and sensing equipment to achieve sufficient robustness. In this talk, I will motivate the use of an inertial-visual system as a minimal sensor concept that still allows efficient and robust navigation. I will focus on image processing, especially efficient feature tracking and motion estimation algorithms, resulting in an algorithm that tracks several hundred features within a few milliseconds on low-power processing units. Such high frame rates are of great interest when highly dynamic mobile robots, such as multicopters, have to be controlled. A perturbation analysis of motion estimation algorithms gives insight into how the resulting accuracy depends on the aperture angle, the tracking accuracy, and the number of features. Furthermore, an algorithm is presented which accurately aligns sensors such as an IMU and a camera in both time and space. Finally, I will discuss an insect-inspired navigation concept which enables long-distance navigation even on memory-limited systems. Applications to which these methods are applied, such as the DLR 3D-Modeler and the DLR multicopters, will be presented.
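The abstract's perturbation analysis relates estimation accuracy to tracking accuracy and feature count, among other factors. As a toy illustration of one of those dependencies (not the speaker's actual analysis, and ignoring the aperture angle), the Monte-Carlo sketch below estimates a 2D image-plane translation from noisy feature displacements and shows the familiar statistical effect that the RMS error shrinks roughly with the square root of the number of tracked features; all names here are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

def rms_translation_error(true_t, n_features, pixel_noise, trials=2000):
    """Monte-Carlo sketch: estimate a 2D image-plane translation from
    n_features feature displacements corrupted by Gaussian tracking noise
    (std = pixel_noise, in pixels); return the RMS estimation error."""
    sq_errs = []
    for _ in range(trials):
        # each tracked feature reports the true displacement plus noise
        tracks = true_t + rng.normal(0.0, pixel_noise, size=(n_features, 2))
        t_hat = tracks.mean(axis=0)  # least-squares estimate of the shift
        sq_errs.append(np.sum((t_hat - true_t) ** 2))
    return float(np.sqrt(np.mean(sq_errs)))

true_t = np.array([3.0, -1.5])        # hypothetical image shift in pixels
e_100 = rms_translation_error(true_t, n_features=100, pixel_noise=0.5)
e_400 = rms_translation_error(true_t, n_features=400, pixel_noise=0.5)
# quadrupling the feature count roughly halves the RMS error (1/sqrt(N))
```

This only models the averaging effect of many features; the real analysis in the talk also covers the geometric influence of the camera's aperture angle, which a flat 2D model like this cannot capture.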