Published by Electronic Design
Authored by Melanie Lefkowitz
A Cornell University-led team developed modular robots that can perceive their surroundings, make decisions, and autonomously assume different shapes to perform various tasks.
“This is the first time modular robots have been demonstrated with autonomous reconfiguration and behavior that is perception-driven,” said Hadas Kress-Gazit, associate professor in the Sibley School of Mechanical and Aerospace Engineering and principal investigator on the project. “We are creating a modular system that is able to do different tasks autonomously. By changing the high-level task, it totally changes its behavior.”
The results of this research were published in Science Robotics.
The robots consist of wheeled, cube-shaped modules that can detach and reattach to form new shapes with different capabilities. The modules, developed by researchers at the University of Pennsylvania, have magnets to attach to each other, and Wi-Fi to communicate with a centralized system.
These interchangeable modules are connected to a sensor module, which is equipped with multiple cameras and a small computer for collecting and processing data about its surroundings. The robot’s software includes a high-level planner to direct its actions and reconfiguration, as well as perception algorithms that can map, navigate, and classify the environment.
In earlier work, the researchers created an open-source online tool where users could create, simulate, and test designs for robot configurations and behaviors. They populated the library by hosting design competitions and inviting students to invent and test different shapes.
The library now consists of 57 possible robot configurations, such as Proboscis (with a long arm in front), Scorpion (modules arranged in perpendicular lines, with a horizontal row in front), and Snake (modules in a single line), and 97 behaviors, such as pickUp, highReach, drive, or drop. Once the robot is given a task, its high-level planner searches the library for shapes and behaviors that meet the current needs.
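The library search described above can be sketched as a simple covering check: each configuration advertises the behaviors it supports, and the planner selects a shape whose behaviors satisfy the task's requirements. This is an illustrative sketch only; the names (`Configuration`, `LIBRARY`, `plan`) and the selection logic are assumptions, not the researchers' actual implementation.

```python
from dataclasses import dataclass

# Hypothetical model of a library entry: a named shape plus the set of
# behaviors that shape can perform.
@dataclass(frozen=True)
class Configuration:
    name: str
    behaviors: frozenset

# A tiny stand-in for the 57-configuration / 97-behavior library, using
# shapes and behaviors named in the article.
LIBRARY = [
    Configuration("Car", frozenset({"drive", "explore"})),
    Configuration("Proboscis", frozenset({"drive", "pickUp", "highReach"})),
    Configuration("Snake", frozenset({"drive", "drop"})),
]

def plan(required_behaviors):
    """Return the first configuration whose behaviors cover the task's needs."""
    needed = set(required_behaviors)
    for config in LIBRARY:
        if needed <= config.behaviors:  # subset test: shape covers the task
            return config.name
    return None  # no single shape suffices; a real planner could sequence shapes
```

For example, `plan({"pickUp"})` would select `"Proboscis"`, matching the retrieval step in the experiment described below, while `plan({"drive", "explore"})` would select `"Car"`.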
Other modular robot systems have successfully performed specific tasks in controlled environments, but these robots are the first to demonstrate fully autonomous behavior and reconfiguration based on the assigned task in an unfamiliar environment, Kress-Gazit said.
“I want to tell the robot what it should be doing, what its goals are, but not how it should be doing it,” she said. “I don’t actually prescribe, ‘Move to the left, change your shape.’ All these decisions are made autonomously by the robot.”
The team proved the effectiveness of its system with three experiments. In the first, a robot was instructed to find, retrieve, and deliver all pink and green objects to a designated zone marked with a blue square on the wall. The robot used the “Car” configuration to explore, and then reshaped itself into “Proboscis” to retrieve a pink object from a narrow pathway, finally returning to its car shape to deliver its haul.