Abstract: In this talk, I will describe
how our lab’s collaborative work in understanding the
neurophysiological basis of touch (skin, receptors and neural coding;
psychophysical limits) informs the applied design of neural sensors and
human-machine interfaces, including neural prosthetics and training
simulators in medical environments. Our sense of touch, while not yet as
well understood as vision and audition, is essential for behaviors that
range from avoiding bodily harm to vital social interactions.
Discoveries in this field may help restore sensory function for disabled
populations and enhance human performance and information processing
capability. In particular, I will discuss work using
computational models (finite element, neural transduction) and artificial
sensor correlates to capture the neural behavior of the skin
mechanics–receptor end organ interaction for the slowly adapting type I
tactile afferent.
This work spans science and engineering: modeling
of intact sensory systems is used to define transfer functions for
application to upper limb neural prosthetics and to define the appropriate
range of sensory stimuli for medical simulators.