Monday June 21, 2010

The GRASP Lab video of autonomous flying robots, with nearly a million views, is New Scientist's pick for the number one video of the month on New Scientist TV - Best of the Web. View Original Video.

Presenter: David Hu (Homepage)

Event Dates:
  Friday June 18, 2010 from 11:00am to 12:00pm

* Alternate Location: Wu and Chen Auditorium*

Snakes propel themselves over land using a variety of techniques, including sidewinding, lateral sinuous slithering and a unidirectional accordion-like mode. We explore these friction-based propulsion mechanisms through a combined experimental and theoretical investigation. Particular attention is given to classifying the gaits of snakes according to Froude number and the relative magnitudes of the frictional forces in the tangential and normal directions.
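
For context on the dimensionless quantities mentioned above, here is a minimal sketch using conventional definitions (the talk's exact normalization may differ). For a snake moving at speed $v$ with body length $L$, the Froude number compares inertial to gravitational forces, and the friction-based gaits are further distinguished by the anisotropy of the sliding-friction coefficients normal and tangential to the body:

\[
\mathrm{Fr} = \frac{v^{2}}{g\,L},
\qquad
\text{friction anisotropy} = \frac{\mu_{n}}{\mu_{t}}.
\]

Low-Froude-number gaits are friction-dominated, and in simple models slithering propulsion relies on the normal coefficient $\mu_{n}$ exceeding the tangential coefficient $\mu_{t}$.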

Presenter's Biography:

David Hu is an assistant professor of mechanical engineering at Georgia Tech. Previously, he was an instructor at New York University and a student at the Massachusetts Institute of Technology (B.S. in mechanical engineering, 2001; Ph.D. in mathematics, 2005). His lab studies animal locomotion using an array of techniques from interface science (friction and surface tension). The lab's work has been featured in The Economist, Nature, The New York Times, the Discovery Channel, and National Geographic. This month, their work on snakes is discussed in American Scientist.

Thursday June 10, 2010

The World Cup gets under way Friday, but the injured player was practicing for another international tournament this month: RoboCup. Soccer played by robots. No water breaks required.

Presenter: Davide Scaramuzza (Homepage)

Event Dates:
  Tuesday July 20, 2010 from 11:00am to 12:00pm

* Alternate Location: Levine 307 (3330 Walnut Street)*

Cameras are having a large impact on our society: every cell phone is equipped with one, in the very near future all cars will feature cameras to improve traffic safety, and even clothing will be equipped with integrated cameras. As cameras continue to pervade our daily lives, optics developers are working to increase their field of view. If the field of view of a camera exceeds 180 degrees, it is called an omnidirectional camera. Up until three years ago, these cameras were too heavy, expensive, and bulky to be integrated into commercial products.

Presenter's Biography:

Davide Scaramuzza (born 1980, Italian) received his PhD in Computer Vision and Robotics from ETH Zurich in February 2008. His PhD thesis won the Robotdalen Scientific Award, which is the most prestigious award for PhD theses in the field of robotics and automation. He is currently a post-doctoral fellow at ETH Zurich, where he is the leader and scientific manager of the European project "sFly", which focuses on autonomous navigation of micro helicopters in urban environments using vision as the main sensor modality. Davide lectures the Master's course "Autonomous Mobile Robots" at ETH Zurich and leads the ETH-Maverick team, which took second place at the international Micro Aerial Vehicle competition in September 2009 with the first purely vision-based autonomous helicopter. He is also an author, along with Roland Siegwart (ETH) and Illah Nourbakhsh (CMU), of the second edition of the textbook "Introduction to Autonomous Mobile Robots" (MIT Press), which will appear in late 2010. Finally, Davide is the author of the first open-source Omnidirectional Camera Calibration Toolbox for MATLAB, which has been downloaded thousands of times worldwide and is currently used at NASA, Philips, Bosch, Daimler, and Chrysler.

Friday May 28, 2010

We don't know whether we should be terrified or overjoyed. We've just come across a video demo from the University of Pennsylvania's GRASP Lab that shows an autonomous quadrotor helicopter performing "precise aggressive maneuvers." And trust us when we say, nothing in the foregoing sentence is an overstatement -- the thing moves with the speed and grace of an angry bee, while accompanied by the perfectly menacing whine of its little engine. See this work of scientific art in motion after the break.

Presenter: Xu Chu Ding (Homepage)

Event Dates:
  Wednesday May 19, 2010 from 1:00pm to 2:00pm

* Alternate Location: Levine 315 (3330 Walnut Street)*

In this work, we provide a real-time algorithmic optimal control framework for autonomous switched systems. Traditional optimal control approaches for autonomous switched systems are open-loop in nature; the switching times of the system therefore cannot be adjusted or adapted when the system parameters or the operating environment change. We aim to close this loop and adapt the optimal switching strategy based on new information that can only be captured online, as sketched below.
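
To make the role of switching times as decision variables concrete, here is a minimal single-switch sketch drawn from the standard switched-systems formulation (not necessarily the exact setup of this talk). With dynamics $\dot{x} = f_{1}(x)$ for $t < \tau$ and $\dot{x} = f_{2}(x)$ for $t \ge \tau$ on a horizon $[0, T]$, and running cost

\[
J(\tau) = \int_{0}^{T} L\bigl(x(t)\bigr)\,dt,
\]

the sensitivity of the cost to the switching time takes the form

\[
\frac{dJ}{d\tau} = p(\tau)^{\top}\bigl[f_{1}(x(\tau)) - f_{2}(x(\tau))\bigr],
\qquad
\dot{p} = -\Bigl(\frac{\partial f_{i}}{\partial x}\Bigr)^{\top} p - \Bigl(\frac{\partial L}{\partial x}\Bigr)^{\top},
\quad p(T) = 0,
\]

so a gradient step on $\tau$ can be recomputed whenever new online information changes $f_{1}$, $f_{2}$, or $L$, which is the loop the abstract proposes to close.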

Presenter's Biography:

Xu Chu Ding received his B.S., M.S., and Ph.D. degrees in Electrical and Computer Engineering from the Georgia Institute of Technology, Atlanta, in 2004, 2007, and 2009, respectively. He is currently a post-doctoral fellow in the Department of Mechanical Engineering at Boston University. His research interests include optimal control of hybrid systems, coordination and control of multi-agent networked systems, and intelligent and persistent surveillance with a network of mobile agents.

Presenter: John Doyle (Homepage)

Event Dates:
  Friday May 21, 2010 from 11:00am to 12:00pm

* Alternate Location: Wu and Chen Auditorium*

This talk will review recent progress on developing a unified theory for complex networks, covering both the hard limits on achievable robust performance (laws) and the organizing principles that succeed or fail in achieving it (architectures and protocols). A collection of new unified hard-limit theorems will be compared with case studies drawn from cell biology, development, human physiology and medicine, the Internet, and wildfire ecology, and more whimsically with Lego, clothing and fashion, and market economics.

Presenter's Biography:

Dr. John Doyle is currently the John G. Braun Professor of Control and Dynamical Systems, Electrical Engineering, and Bioengineering at the California Institute of Technology. He received his BS and MS degrees in Electrical Engineering from MIT in 1977 and his PhD in Mathematics from UC Berkeley in 1984. He has been a consultant to the Honeywell Technology Center since 1976, became an Associate Professor (with tenure) at Caltech in 1986, and a full Professor in 1991. Doyle's research interests are in theoretical foundations for complex networks in engineering and biology, as well as multi-scale physics, and include integrating modeling, identification, analysis, and design of uncertain nonlinear systems, along with computation in analysis and simulation, including complexity theory to guide algorithm development. His applications interests are motivated by the interplay between control, dynamical systems, and the design and analysis of large, complex systems.

Presenter: Jim Rehg (Homepage)

Event Dates:
  Wednesday May 19, 2010 from 11:00am to 12:00pm

* Alternate Location: Levine 307 (3330 Walnut Street)*

A basic goal of video understanding is the organization of video data into sets of events with associated temporal dependencies. For example, a soccer goal could be explained using a vocabulary of events such as passing, dribbling, tackling, etc. In describing the dependencies between events it is natural to invoke the concept of causality, but previous attempts to perform causal reasoning in video analysis have been limited to special cases, such as sporting events or naïve physics, where strong domain models are available.

Presenter's Biography:

Jim Rehg is a Professor in the School of Interactive Computing at the Georgia Institute of Technology. He is co-Director of the Computational Perception Lab and Associate Director of Research in the Center for Robotics and Intelligent Machines. He received his Ph.D. from CMU in 1995 and worked at the Cambridge Research Lab of DEC (and then Compaq) from 1995-2001, where he managed the computer vision research group. His research interests include computer vision, robotics, machine learning, and computer graphics. He recently served as the general co-chair for CVPR 2009 in Miami.

Tuesday May 4, 2010

After reviewing 78 submissions to the PR2 Beta Program Call for Proposals, Willow Garage has announced its selection of eleven recipients, each of whom will receive a PR2 Beta robot at no cost. Details on the recipients can be found here.

Monday April 26, 2010

Congratulations to the GRASP Lab!