Robots are becoming increasingly capable of reasoning about people, objects, and activities in their environments. The ability to extract high-level semantic information from sensor data provides new opportunities for human-robot interaction. One such opportunity is interacting with robots via natural language. In this talk I will present our recent work toward enabling robots to interpret, or ground, natural language commands in robot control systems. We build on techniques developed by the semantic parsing community for learning combinatory categorial grammars (CCGs) that parse natural language input into logic-based meaning representations.
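As a rough, self-contained illustration of this style of grammar-driven grounding (a toy sketch, not the system presented in the talk; the lexicon, categories, and the go/kitchen predicates are invented for the example), a CCG-style derivation by forward application might look like:

```python
# Toy CCG-style parse: each word carries a syntactic category and a
# lambda-calculus meaning; forward application (X/Y  Y  =>  X) combines
# adjacent constituents until a logical form for the command is derived.
# All categories and predicates below are illustrative assumptions.

from dataclasses import dataclass
from typing import Callable, Union

Sem = Union[str, Callable]

@dataclass
class Entry:
    cat: str   # e.g. "S/PP" means: needs a PP on the right to form S
    sem: Sem   # meaning: a constant, or a function over meanings

LEXICON = {
    "go":      Entry("S/PP",  lambda pp: f"go({pp})"),
    "to":      Entry("PP/NP", lambda np: np),
    "the":     Entry("NP/N",  lambda n: n),
    "kitchen": Entry("N",     "kitchen"),
}

def forward_apply(left: Entry, right: Entry):
    """Forward application: X/Y  Y  =>  X, applying left.sem to right.sem."""
    if "/" in left.cat:
        result, arg = left.cat.split("/", 1)
        if arg == right.cat:
            return Entry(result, left.sem(right.sem))
    return None

def parse(words: list[str]) -> str:
    """Shift-reduce, right to left, over a purely right-branching command."""
    stack: list[Entry] = []
    for word in reversed(words):
        stack.insert(0, LEXICON[word])
        while len(stack) >= 2 and (combined := forward_apply(stack[0], stack[1])):
            stack[:2] = [combined]
    assert len(stack) == 1 and stack[0].cat == "S"
    return stack[0].sem

print(parse("go to the kitchen".split()))   # -> go(kitchen)
```

A learned CCG generalizes this idea: rather than a hand-written lexicon, the word-to-category-and-meaning pairings are induced from training data.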
I will demonstrate results in two application domains: first, learning to follow natural language directions through indoor environments; and second, learning to ground object attributes via weakly supervised training.
Joint work with Luke Zettlemoyer, Cynthia Matuszek, Nicolas Fitzgerald, Yuyin Sun, and Liefeng Bo. Support provided by Intel ISTC-PC, NSF, ARL, and ONR.