
Fall 2015 GRASP Seminar: Aaron Steinfeld, Carnegie Mellon University, “Understanding and Creating Appropriate Robot Behavior”

November 13, 2015 @ 11:00 am


People expect appropriate robot actions, interventions, and requests for assistance. As with most technologies, robots that behave in unexpected and inappropriate ways face misuse, abandonment, and sabotage. Complicating this challenge are human misperceptions of robot capability, intelligence, and performance. This talk will summarize work from several projects focused on this human-robot interaction challenge. Findings and examples will be shown from work on human trust in robots, deceptive robot behavior, robot motion, and robot characteristics. It is also important to examine the human-robot system, rather than just the robot. To this end, it is possible to draw lessons from related work in crowdsourcing (e.g., Tiramisu Transit) to help inform methods for enabling and supporting contributions by the general public and domain experts.



Aaron Steinfeld is an Associate Research Professor in the Robotics Institute (RI) at Carnegie Mellon University. He received his BSE, MSE, and Ph.D. degrees in Industrial and Operations Engineering from the University of Michigan and completed a Post Doc at U.C. Berkeley. He is the Co-Director of the Rehabilitation Engineering Research Center on Accessible Public Transportation (RERC-APT), Director of the DRRP on Inclusive Cloud and Web Computing, and the area lead for transportation related projects in the Quality of Life Technology Center (QoLT). His research focuses on operator assistance under constraints, i.e., how to enable timely and appropriate interaction when technology use is restricted through design, tasks, the environment, time pressures, and/or user abilities. His work includes intelligent transportation systems, crowdsourcing, human-robot interaction, rehabilitation, and universal design.

