PerimetryNet: A multiscale fine grained deep network for three-dimensional eye gaze estimation using visual field analysis

Authors: Yu, S., Wang, Z., Zhou, S., Yang, X. and Wu, C.

Journal: Computer Animation and Virtual Worlds

Volume: 34

Issue: 5

eISSN: 1546-427X

ISSN: 1546-4261

DOI: 10.1002/cav.2141

Abstract:

Three-dimensional gaze estimation aims to reveal where a person is looking, which plays an important role in identifying users' points of interest in terms of direction, attention, and interaction. Appearance-based gaze estimation methods can provide relatively unconstrained gaze tracking from commodity hardware. Inspired by the medical perimetry test, we propose a multiscale framework with a visual field analysis branch to improve estimation accuracy. The model is based on feature pyramids and predicts the visual field to assist gaze estimation. In particular, we analyze the effect of the multiscale component and the visual field branch on challenging benchmark datasets: MPIIGaze and EYEDIAP. Based on these studies, our proposed PerimetryNet significantly outperforms state-of-the-art methods. In addition, the multiscale mechanism and visual field branch can be easily applied to existing network architectures for gaze estimation. Related code is available at the public repository https://github.com/gazeEs/PerimetryNet.
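The record does not include implementation details, but the perimetry-inspired idea — pairing gaze regression with a coarse visual-field prediction — can be sketched. The hypothetical helper below (not from the paper) maps a gaze direction, given as pitch and yaw angles, to one of several angular visual-field sectors; such sector labels could serve as an auxiliary classification target alongside the main regression task:

```python
import math

def gaze_to_field_sector(pitch, yaw, n_sectors=8):
    """Map a gaze direction, given as (pitch, yaw) in radians,
    to a coarse visual-field sector index in [0, n_sectors).

    Hypothetical helper: PerimetryNet's actual field discretization
    may differ; this only illustrates the auxiliary-label idea.
    """
    # Angle of the gaze direction within the frontal plane.
    angle = math.atan2(pitch, yaw)          # range (-pi, pi]
    # Shift to [0, 2*pi) and bucket into equal angular sectors,
    # loosely analogous to the regions probed in a perimetry test.
    angle = angle % (2 * math.pi)
    return int(angle / (2 * math.pi / n_sectors)) % n_sectors
```

A multi-task loss could then combine an angular regression term on (pitch, yaw) with a cross-entropy term over these sector labels, so the coarse branch regularizes the fine-grained estimate.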

https://eprints.bournemouth.ac.uk/38516/

Source: Scopus

Source: BURO EPrints