This was a hybrid event with in-person attendance in Levine 307 and virtual attendance…
The human fascination with mimicking ultra-efficient living beings like insects and birds has led to the rise of small autonomous robots. Smaller robots are safer, more agile, and can be deployed as swarms that distribute tasks among themselves. One might then wonder: why are small robots not deployed in the wild today? Smaller robots are constrained by a severe shortage of onboard computation and sensor quality. To make matters worse, today's mainstream approach to autonomy on small robots relies on building a 3D map of the scene, which is then used to plan paths for executing a control algorithm. This methodology, with its strict separation of perception, planning, and control, has severely limited the potential of small autonomous robots.

Instead, we re-imagine each agent by drawing inspiration from insects, which sit at the bottom of the size and computation spectrum. Specifically, each of our agents is built as a hierarchy of competences on top of bio-inspired sensorimotor AI loops that exploit the synergy between action and perception. Here, the agent controls its own movement and physical interaction to compensate for what it lacks in computation and sensing. Such an approach imposes additional constraints on the data gathered to solve the problem using Active and Interactive Perception. I will present how the world's first prototype of a RoboBeeHive was built using this philosophy.

Finally, I will conclude with a recent theory called Novel Perception, which utilizes the statistics of motion fields to tackle various classes of problems, from navigation to interaction. This method has the potential to become the go-to mathematical formulation for tackling motion-field-based problems in robotics.
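To make the last idea concrete, here is a minimal sketch of what "statistics of motion fields" can look like in practice. It is not from the talk: the function name, the use of OpenCV's Farneback optical flow as a stand-in for the motion field, and the specific statistics (per-half mean flow magnitude and flow divergence) are illustrative assumptions only.

```python
# Hypothetical sketch: summarize a dense motion field with simple statistics
# that can serve as navigation cues on a computation-starved robot.
import numpy as np
import cv2

def motion_field_statistics(prev_gray: np.ndarray, curr_gray: np.ndarray) -> dict:
    # Dense optical flow (Farneback) used here as a stand-in for the motion field.
    flow = cv2.calcOpticalFlowFarneback(
        prev_gray, curr_gray, None,
        pyr_scale=0.5, levels=3, winsize=15,
        iterations=3, poly_n=5, poly_sigma=1.2, flags=0)
    u, v = flow[..., 0], flow[..., 1]
    magnitude = np.hypot(u, v)

    # Divergence of the flow field: positive values indicate expansion,
    # a classic looming cue for an approaching obstacle.
    divergence = np.gradient(u, axis=1) + np.gradient(v, axis=0)

    half = magnitude.shape[1] // 2
    return {
        "mean_flow_left": float(magnitude[:, :half].mean()),
        "mean_flow_right": float(magnitude[:, half:].mean()),
        "mean_divergence": float(divergence.mean()),
    }
```

A controller could, for example, steer toward the image half with the smaller mean flow (a bee-like flow-balancing strategy) and slow down when mean divergence rises, all without ever building a 3D map.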