Friday May 28, 2010

We don't know whether we should be terrified or overjoyed. We've just come across a video demo from the University of Pennsylvania's GRASP Lab that shows an autonomous quadrotor helicopter performing "precise aggressive maneuvers." And trust us when we say, nothing in the foregoing sentence is an overstatement -- the thing moves with the speed and grace of an angry bee, while accompanied by the perfectly menacing whine of its little engine. See this work of scientific art in motion after the break.

Presenter: Xu Chu Ding (Homepage)

Event Dates:
  Wednesday May 19, 2010 from 1:00pm to 2:00pm

* Alternate Location: Levine 315 (3330 Walnut Street)*

In this work, we provide a real-time algorithmic optimal control framework for autonomous switched systems. Traditional optimal control approaches for autonomous switched systems are open-loop in nature; the switching times of the system therefore cannot be adjusted or adapted when the system parameters or the operational environment change. We aim to close this loop and adapt the optimal switching strategy based on new information that can only be captured online.
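
As a rough illustration of the closed-loop idea (an invented toy example, not the algorithm presented in the talk), one can adapt the switching time of a two-mode switched linear system by descending the gradient of a quadratic cost; all dynamics and parameters below are hypothetical:

    # Illustrative sketch only: adapt the switching time of a two-mode
    # switched linear system by finite-difference gradient descent on a cost.
    import numpy as np

    A1 = np.array([[0.0, 1.0], [-1.0, -0.2]])   # mode-1 dynamics (hypothetical)
    A2 = np.array([[0.0, 1.0], [-4.0, -1.0]])   # mode-2 dynamics (hypothetical)
    x0 = np.array([1.0, 0.0])
    T, dt = 5.0, 0.001

    def cost(tau):
        """Integrate x'x over [0, T], switching from A1 to A2 at time tau."""
        x, J = x0.copy(), 0.0
        for k in range(int(T / dt)):
            A = A1 if k * dt < tau else A2
            x = x + dt * (A @ x)          # forward-Euler integration
            J += dt * float(x @ x)        # quadratic running cost with Q = I
        return J

    # Online-style adaptation: nudge tau downhill with a finite-difference gradient.
    tau, step, eps = 2.0, 0.05, 1e-3
    for _ in range(20):
        grad = (cost(tau + eps) - cost(tau - eps)) / (2 * eps)
        tau = float(np.clip(tau - step * grad, 0.0, T))
    print("adapted switching time:", round(tau, 3))

In a genuinely online setting, the same update would be driven by measurements gathered while the system runs rather than by re-simulating a model.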

Presenter's Biography:

Xu Chu Ding received his B.S., M.S., and Ph.D. degrees in Electrical and Computer Engineering from the Georgia Institute of Technology, Atlanta, in 2004, 2007, and 2009, respectively. He is currently a postdoctoral fellow in the Department of Mechanical Engineering at Boston University. His research interests include optimal control of hybrid systems, coordination and control of multi-agent networked systems, and intelligent and persistent surveillance with a network of mobile agents.

Presenter: John Doyle (Homepage)

Event Dates:
  Friday May 21, 2010 from 11:00am to 12:00pm

* Alternate Location: Wu and Chen Auditorium*

This talk will review recent progress toward a unified theory for complex networks, consisting of hard limits on achievable robust performance (laws) and the organizing principles that succeed or fail in achieving them (architectures and protocols). A collection of new unified hard-limit theorems will be compared with case studies drawn from cell biology, development, human physiology and medicine, the Internet, and wildfire ecology, and more whimsically with Lego, clothing and fashion, and market economics.
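
For readers unfamiliar with what a "hard limit" looks like in this setting, a classical example from control theory (offered only for orientation; it is not one of the new theorems in the talk) is Bode's sensitivity integral: for a stable closed loop whose open-loop transfer function L(s) has right-half-plane poles p_k and at least two more poles than zeros,

    \int_0^{\infty} \ln\lvert S(j\omega)\rvert \, d\omega \;=\; \pi \sum_{k} \operatorname{Re}(p_k),
    \qquad S(s) = \frac{1}{1 + L(s)},

so sensitivity cannot be reduced at all frequencies at once: any attenuation in one band must be paid for with amplification elsewhere, no matter how the controller is designed.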

Presenter's Biography:

Dr. John Doyle is currently the John G. Braun Professor of Control and Dynamical Systems, Electrical Engineering, and Bioengineering at the California Institute of Technology. He received his BS and MS degrees in Electrical Engineering from MIT in 1977 and his PhD in Mathematics from UC Berkeley in 1984. He has been a consultant to the Honeywell Technology Center since 1976, became an Associate Professor (with tenure) at Caltech in 1986, and was promoted to Professor in 1991. Doyle's research interests are in theoretical foundations for complex networks in engineering and biology, as well as multiscale physics, and include integrated modeling, identification, analysis, and design of uncertain nonlinear systems, and computation in analysis and simulation, including complexity theory to guide algorithm development. His application interests are motivated by the interplay between control, dynamical systems, and the design and analysis of large, complex systems.

Presenter: Jim Rehg (Homepage)

Event Dates:
  Wednesday May 19, 2010 from 11:00am to 12:00pm

* Alternate Location: Levine 307 (3330 Walnut Street)*

A basic goal of video understanding is the organization of video data into sets of events with associated temporal dependencies. For example, a soccer goal could be explained using a vocabulary of events such as passing, dribbling, tackling, etc. In describing the dependencies between events it is natural to invoke the concept of causality, but previous attempts to perform causal reasoning in video analysis have been limited to special cases, such as sporting events or naïve physics, where strong domain models are available.
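
One generic way to make causality between event streams concrete (a hypothetical illustration only, not necessarily the speaker's approach) is a Granger-style test: event type A is said to influence event type B if A's past improves the prediction of B's binned counts beyond what B's own past provides. A minimal sketch, with all data and names invented:

    # Hypothetical Granger-style test between two event-count series:
    # does the past of series a help predict series b?
    import numpy as np

    def granger_gain(a, b, lag=3):
        """Ratio of residual variances when predicting b from its own past only
        versus from the past of both b and a (ratio > 1 suggests a -> b)."""
        y, rows_own, rows_joint = [], [], []
        for t in range(lag, len(b)):
            rows_own.append(b[t - lag:t])
            rows_joint.append(np.concatenate([b[t - lag:t], a[t - lag:t]]))
            y.append(b[t])
        y = np.array(y)

        def resid_var(rows):
            X = np.column_stack([np.ones(len(rows)), np.array(rows)])
            beta, *_ = np.linalg.lstsq(X, y, rcond=None)
            return np.var(y - X @ beta)

        return resid_var(rows_own) / resid_var(rows_joint)

    # Toy data: "passing" events tend to precede "shooting" events by one bin.
    rng = np.random.default_rng(0)
    passing = rng.poisson(1.0, 200).astype(float)
    shooting = np.roll(passing, 1) + rng.poisson(0.2, 200)
    print("gain pass -> shoot:", round(granger_gain(passing, shooting), 2))
    print("gain shoot -> pass:", round(granger_gain(shooting, passing), 2))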

Presenter's Biography:

Jim Rehg is a Professor in the School of Interactive Computing at the Georgia Institute of Technology. He is co-Director of the Computational Perception Lab and Associate Director of Research in the Center for Robotics and Intelligent Machines. He received his Ph.D. from CMU in 1995 and worked at the Cambridge Research Lab of DEC (and then Compaq) from 1995 to 2001, where he managed the computer vision research group. His research interests include computer vision, robotics, machine learning, and computer graphics. He recently served as general co-chair of CVPR 2009 in Miami.

Tuesday May 4, 2010

After reviewing 78 submissions to the PR2 Beta Program Call for Proposals, Willow Garage has announced the eleven recipients who will receive PR2 Beta robots at no cost. Details on the recipients can be found here.

Monday April 26, 2010

Congratulations to the GRASP Lab!

Friday April 16, 2010

Check out the GRASP Lab Open House Collage Video...

Presenter: Roger Cholewiak (Homepage)

Event Dates:
  Wednesday May 26, 2010 from 11:00am to 12:00pm

* Alternate Location: Wu and Chen Auditorium*

Dr. Roger Cholewiak will present an overview of his programs of basic and applied research on vibrotactile sensitivity and pattern perception at the Cutaneous Communication Laboratory at Princeton and at the Naval Aerospace Medical Research Laboratory in Pensacola. He will describe how the findings from over 35 years of work have been brought out of the laboratory to address real-world problems in his consulting work with engineering firms throughout the country.

Presenter's Biography:

Dr. Roger W. Cholewiak is a Research Psychologist who has published and presented extensively in national and international venues. His expertise is in experimental design and tactile pattern perception, particularly as related to the ability of humans to use the sense of touch as a communication system. He has worked in these areas for over 35 years, since receiving his doctorate at the University of Virginia. He is the retired director of research laboratories at two institutions, the Cutaneous Communication Laboratory at Princeton University and the Naval Aerospace Medical Research Laboratory (NAMRL) in Pensacola, FL, and was a NASA summer research fellow at the Johnson Space Center. At these institutions, he studied the perception of patterned vibrotactile stimuli presented to numerous body sites of young and older individuals, for sensory substitution (e.g., for blind persons) as well as for sensory augmentation, particularly by aircraft pilots and astronauts. Factors explored in this research included the number and location of vibrotactile sites, the vibrotactile sensitivity and acuity of those body sites, the design of tactors, and the optimal modes of pattern presentation, with the goal of exploiting the capabilities of each locus to communicate the relevant information for the task. He has directed international doctoral students at Utrecht and Mannheim, and has served as a reviewer of manuscripts for numerous professional journals as well as grants for NIH, ONR, NSF, NASA, and the Air Force.
    Since his retirement, in addition to teaching Sensation and Perception and research seminars as an adjunct professor at The College of New Jersey, Dr. Cholewiak has continued to write professionally and to serve as a consultant in tactile communication. He is currently consulting with companies and organizations such as the US Army Research Labs, Engineering Acoustics in Florida, Angel Medical in New Jersey, Barron Associates in Virginia, and SME in Massachusetts.

Presenter: Paolo Robuffo Giordano and Antonio Franchi (Homepage)

Event Dates:
  Monday May 10, 2010 from 11:00am to 12:00pm

* Alternate Location: Levine 307 (3330 Walnut Street)*

In this talk, we will present some examples of human-machine interfaces for VR applications and real-world scenarios. We will focus on two main areas: interactive simulation of vehicle motion, and remote control of individual semi-autonomous agents or swarms of them. The underlying theme is the full exploitation of different sensory inputs (e.g., visual, acoustic, haptic, vestibular) to reproduce a convincing illusion of realism, to improve the situational awareness of human operators, and, ultimately, to facilitate their tasks.
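
As one hypothetical ingredient of interactive vehicle-motion simulation (a classical "washout" filter, sketched here for illustration and not claimed to be the approach of the talk), sustained accelerations of the simulated vehicle are high-pass filtered so that a motion platform with limited travel reproduces only the transient, perceptually salient part of the cue:

    # Hypothetical sketch of classical motion cueing: high-pass "washout"
    # filtering of a vehicle acceleration signal for a travel-limited platform.
    import numpy as np

    def washout_highpass(accel, dt, cutoff_hz=0.5):
        """First-order discrete high-pass filter applied sample by sample."""
        rc = 1.0 / (2.0 * np.pi * cutoff_hz)
        alpha = rc / (rc + dt)
        out = np.zeros_like(accel)
        for k in range(1, len(accel)):
            out[k] = alpha * (out[k - 1] + accel[k] - accel[k - 1])
        return out

    dt = 0.01
    t = np.arange(0.0, 10.0, dt)
    accel = np.where(t > 2.0, 2.0, 0.0)   # sustained 2 m/s^2 step (e.g., braking)
    cue = washout_highpass(accel, dt)     # platform feels the onset, then washes out
    print("peak cue:", round(float(cue.max()), 2), "m/s^2;",
          "cue after 8 s:", round(float(cue[-1]), 3))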


Presenter's Biography:

Paolo Robuffo Giordano received the M.Sc. degree in Computer Science Engineering and the Ph.D. degree in Systems Engineering from the Dipartimento di Informatica e Sistemistica, Università di Roma "La Sapienza", in 2001 and 2008, respectively. He was a postdoc at the Robotics Institute of the German Aerospace Center (DLR) from 2007 to 2008. He is currently a Project Leader at the Max Planck Institute for Biological Cybernetics, Department Bülthoff, where he is in charge of the robotics and control-related activities. His interests are in the general areas of robotics and nonlinear control. In particular, he has been working on kinematic and dynamical modeling of physical systems, motion control of fixed and mobile manipulators, visual servoing, nonlinear state estimation, nonholonomic systems, control design for VR applications, and motion simulation technologies. http://www.kyb.mpg.de/de/~robu_pa


Antonio Franchi is a research scientist at the Max Planck Institute for Biological Cybernetics, Department of Human Perception, Cognition and Action. He received the M.Sc. degree in Electrical Engineering and the Ph.D. degree in System Engineering from the Department of Computer and System Sciences of the University of Rome "La Sapienza" in 2005 and 2009, respectively. He was a visiting scholar at the Center for Control, Dynamical Systems, and Computation, UCSB, California, under the supervision of Prof. Francesco Bullo. His research interests lie in the areas of multi-robot control, human-robot interaction, state estimation, and decentralized algorithms, with applications to exploration, localization, motion coordination, pursuit-evasion, and patrolling problems. See http://www.kyb.mpg.de/~antonio for more information.

Presenter: Karlin Bark (Homepage)

Event Dates:
  Tuesday April 20, 2010 from 3:00pm to 4:00pm

* Alternate Location: Levine 315 (3330 Walnut Street)*

This work is motivated by the idea that new modes of haptic interaction are needed to expand the range of activities and applications for wearable electronic devices. In applications ranging from motion training and physical rehabilitation to teleoperation of a remote system, haptic feedback can provide valuable information about forces and motions, particularly when vision and audition are otherwise occupied. An underappreciated component of haptic sensation, particularly for applications involving portable devices, is skin stretch.

Presenter's Biography:

Karlin Bark received her Ph.D. in mechanical engineering from Stanford University in 2009, working under Professor Mark Cutkosky in the Biomimetics and Dexterous Manipulation Laboratory. She received her B.S. degree in mechanical engineering from the University of Michigan in 2003 and her M.S. degree in mechanical engineering from Stanford University in 2005, with a focus on design methodology and controls. Her research interests include haptics, human-factors-based design, and robotics with medical applications. When not pondering the challenges of skin stretch, she enjoys keeping up to date on the latest in movies, television, and theater, and spending quality time with her friends and family.