Visual search during navigation in complex virtual environments: an eyetracking study


Authors: Ramsey, C., Gatzidis, C., Miellet, S. and Wiener, J.

Visual search in real situations often involves head and body movement through space. We present results from a visual search task in which participants actively navigated through complex virtual scenes whilst searching for a target object. Using a large, complex virtual environment, eye movements were recorded for three distinct phases: static visual search, visual search during navigation, and locomotion without search. Movement through the virtual environment was also characterised. By these means we are able to disentangle the specific contributions of search and locomotion to gaze behaviour, and to analyse how gaze behaviour differs depending on the form of trajectory. Beyond allowing participants to navigate freely through the virtual environment, we show further benefits of integrating eye-tracking with virtual environments by demonstrating how to go beyond the screen and translate 2D gaze coordinates into actual 3D world coordinates, allowing for novel analyses. We investigate how gaze control behaves within the virtual environment during visual search. For example, the distance into the environment of a fixation or gaze point, and the distance between a fixation and the target object in the virtual environment, are two measures that go beyond standard 2D analysis of eye-movement recordings.
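The 2D-to-3D translation described above can be thought of as casting a ray from the viewpoint through the gaze point on the image plane and intersecting it with the scene. The following Python sketch illustrates the idea under a simple pinhole-camera assumption; all function names and parameters here are illustrative, not taken from the study's actual pipeline.

```python
import numpy as np

def gaze_to_world_ray(gaze_px, screen_size, fov_y_deg, cam_pos, cam_forward, cam_up):
    """Convert a 2D gaze point (in pixels) into a 3D ray in world space.

    Illustrative sketch assuming a pinhole camera; the real pipeline would
    use the rendering engine's own camera matrices.
    """
    w, h = screen_size
    aspect = w / h
    # Normalised device coordinates in [-1, 1], with y pointing up
    ndx = (2.0 * gaze_px[0] / w) - 1.0
    ndy = 1.0 - (2.0 * gaze_px[1] / h)
    # Orthonormal camera basis from forward and up vectors
    f = cam_forward / np.linalg.norm(cam_forward)
    r = np.cross(f, cam_up)
    r /= np.linalg.norm(r)
    u = np.cross(r, f)
    # Direction through the gaze point on the image plane
    half_h = np.tan(np.radians(fov_y_deg) / 2.0)
    half_w = half_h * aspect
    d = f + ndx * half_w * r + ndy * half_h * u
    return cam_pos, d / np.linalg.norm(d)

def fixation_metrics(origin, direction, hit_t, target_pos):
    """The two example measures from the abstract: depth of a fixation into
    the environment, and its 3D distance to the target object.

    hit_t is the ray parameter at which a scene raycast (e.g. from the game
    engine's physics system) reports an intersection.
    """
    fixation_3d = origin + hit_t * direction
    depth = hit_t  # distance along the gaze ray into the environment
    to_target = np.linalg.norm(fixation_3d - np.asarray(target_pos))
    return depth, to_target
```

A gaze point at the screen centre, for instance, yields a ray along the camera's forward axis; the engine's raycast then supplies `hit_t`, from which both measures follow directly.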
