Abstract: In this talk, we will present some examples of human-machine interfaces
for VR applications and real-world scenarios. We will focus on two main
areas: interactive simulation of vehicle motion, and remote control of
individual semi-autonomous agents or swarms thereof. The underlying theme
is the full exploitation of different sensory inputs (e.g., visual,
acoustic, haptic, vestibular) to create a convincing illusion of
realism, to improve the situational awareness of human operators, and
ultimately to facilitate their tasks.
We will then present some preliminary results obtained with our
CyberMotion simulator, a versatile motion platform for providing
vestibular (self-motion) cues.