Multi-sensory-motor research: Investigating auditory, visual, and motor interaction in virtual reality environments (Abstract)
by T. Kluss, N. Schult, T. Hantel, C. Zetzsche, K. Schill
Abstract:
Perception in natural environments is inseparably linked to motor action; in fact, we consider action an essential component of perceptual representation. These representations are inherently difficult to investigate, however: traditional experimental setups lack the flexibility to manipulate spatial features. Virtual reality (VR) experiments are a feasible alternative, but such setups typically lack ecological realism because they rely on "unnatural" interface devices such as a joystick. We therefore propose an experimental apparatus that combines multisensory perception and action in an ecologically realistic way. Its basis is a 10-foot hollow sphere (VirtuSphere) placed on a platform that allows free rotation; a subject inside can walk in any direction for any distance while immersed in a virtual environment. Both the rotation of the sphere and the movement of the subject's head are tracked to render the subject's view of the VR environment on a head-mounted display. Moreover, auditory features are processed dynamically, taking great care to align sound sources exactly with visual objects, using ambisonic-encoded audio processed through an HRTF filter bank. We present empirical data confirming the ecological realism of this setup and discuss its suitability for multi-sensory-motor research.
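The audio pipeline the abstract describes (ambisonic-encoded sound rendered binaurally through an HRTF filter bank) can be sketched in a few lines. The following is a minimal illustration only, not the authors' implementation: the virtual-loudspeaker layout, the basic (sampling) decoder gains, and the HRIR format are all assumptions made for this sketch.

```python
import numpy as np

def fo_ambisonic_to_binaural(bformat, speaker_dirs, hrirs_left, hrirs_right):
    """Render first-order B-format (W, X, Y, Z) to binaural audio.

    bformat:      (4, n_samples) array of ambisonic channels.
    speaker_dirs: list of (azimuth, elevation) pairs in radians, one per
                  virtual loudspeaker of the decoder (assumed layout).
    hrirs_left / hrirs_right: per-speaker head-related impulse responses.
    """
    w, x, y, z = bformat
    n_out = bformat.shape[1] + max(len(h) for h in hrirs_left) - 1
    left = np.zeros(n_out)
    right = np.zeros(n_out)
    for (az, el), hl, hr in zip(speaker_dirs, hrirs_left, hrirs_right):
        # Basic (sampling) decode: project the sound field onto the
        # direction of each virtual loudspeaker.
        feed = 0.5 * (w
                      + x * np.cos(az) * np.cos(el)
                      + y * np.sin(az) * np.cos(el)
                      + z * np.sin(el))
        # Spatialize the speaker feed with that speaker's HRIR pair
        # and mix into the two ear signals.
        out_l = np.convolve(feed, hl)
        out_r = np.convolve(feed, hr)
        left[:len(out_l)] += out_l
        right[:len(out_r)] += out_r
    return left, right
```

In a head-tracked setup like the one described, the B-format scene would be rotated to counter the listener's head orientation before this decode step, which is what keeps sound sources aligned with the visual objects on the head-mounted display.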
Reference:
Multi-sensory-motor research: Investigating auditory, visual, and motor interaction in virtual reality environments (Abstract) (T. Kluss, N. Schult, T. Hantel, C. Zetzsche, K. Schill), 2011.
Bibtex Entry:
@Misc{Kluss2011,
  author   = {T. Kluss and N. Schult and T. Hantel and C. Zetzsche and K. Schill},
  title    = {Multi-sensory-motor research: Investigating auditory, visual, and motor interaction in virtual reality environments (Abstract)},
  year     = {2011},
  abstract = {Perception in natural environments is inseparably linked to motor action; in fact, we consider action an essential component of perceptual representation. These representations are inherently difficult to investigate, however: traditional experimental setups lack the flexibility to manipulate spatial features. Virtual reality (VR) experiments are a feasible alternative, but such setups typically lack ecological realism because they rely on "unnatural" interface devices such as a joystick. We therefore propose an experimental apparatus that combines multisensory perception and action in an ecologically realistic way. Its basis is a 10-foot hollow sphere (VirtuSphere) placed on a platform that allows free rotation; a subject inside can walk in any direction for any distance while immersed in a virtual environment. Both the rotation of the sphere and the movement of the subject's head are tracked to render the subject's view of the VR environment on a head-mounted display. Moreover, auditory features are processed dynamically, taking great care to align sound sources exactly with visual objects, using ambisonic-encoded audio processed through an HRTF filter bank. We present empirical data confirming the ecological realism of this setup and discuss its suitability for multi-sensory-motor research.},
}