Gaze Behavior Analysis and Gaze Position Prediction in Immersive Virtual Reality
In virtual reality (VR) systems, users’ gaze information has gained importance in recent years: it can be applied to VR content design, eye-movement-based interaction, gaze-contingent rendering, and more. In this context, it becomes increasingly important to understand users’ gaze behavior in virtual reality and to predict users’ gaze positions. This paper presents research on gaze behavior analysis and gaze position prediction in virtual reality, focusing on static and dynamic virtual scenes under free-viewing conditions. Users’ gaze data in virtual scenes are collected, and statistical analysis is performed on the recorded data. The analysis reveals that users’ gaze positions are correlated with their head rotation velocities and with the salient regions of the content; in dynamic scenes, users’ gaze positions also correlate strongly with the positions of dynamic objects. Based on these findings, a data-driven eye-head coordination model is proposed for real-time gaze prediction in static scenes, and a CNN-based model is derived for predicting gaze positions in dynamic scenes.
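To make the eye-head coordination idea concrete, below is a minimal Python sketch that fits a per-axis linear mapping from head angular velocity to gaze offset by least squares. The data here are synthetic and the simple linear form is only an illustration of the general approach; the actual SGaze model additionally accounts for velocity ranges, latency, and saliency.

```python
import numpy as np

# Synthetic stand-in for recorded data: per-frame head angular velocity
# (deg/s) and gaze offset from the view center (deg). In practice these
# would come from a VR headset with an integrated eye tracker.
rng = np.random.default_rng(0)
head_vel = rng.uniform(-30, 30, size=(1000, 2))   # (horizontal, vertical)
gaze_pos = 0.4 * head_vel + rng.normal(0, 1.5, size=(1000, 2))

# Fit gaze = a * head_vel + b per axis by least squares.
coeffs = []
for axis in range(2):
    A = np.stack([head_vel[:, axis], np.ones(len(head_vel))], axis=1)
    a, b = np.linalg.lstsq(A, gaze_pos[:, axis], rcond=None)[0]
    coeffs.append((a, b))

def predict_gaze(vel_xy):
    """Predict the gaze offset (deg) from the current head angular velocity."""
    return np.array([a * v + b for (a, b), v in zip(coeffs, vel_xy)])

print(predict_gaze([10.0, -5.0]))
```

For dynamic scenes, a CNN-based predictor can be sketched as a small 1D convolutional network over a temporal window of per-frame features. The feature layout below (head angular velocity, a dynamic object position, and a small saliency descriptor per frame) and the layer sizes are illustrative assumptions, not the DGaze architecture itself.

```python
import torch
import torch.nn as nn

class GazeCNN(nn.Module):
    """1D CNN over a short temporal window of gaze-related features.

    Assumed per-frame features: head angular velocity (2), a dynamic
    object position on the image plane (2), and a saliency descriptor
    (4), giving 8 input channels.
    """
    def __init__(self, in_channels=8):
        super().__init__()
        self.conv = nn.Sequential(
            nn.Conv1d(in_channels, 32, kernel_size=3, padding=1),
            nn.ReLU(),
            nn.Conv1d(32, 64, kernel_size=3, padding=1),
            nn.ReLU(),
            nn.AdaptiveAvgPool1d(1),   # pool over the temporal window
        )
        self.head = nn.Linear(64, 2)   # regress a 2D gaze position

    def forward(self, x):              # x: (batch, channels, window)
        feat = self.conv(x).squeeze(-1)
        return self.head(feat)

model = GazeCNN()
dummy = torch.randn(4, 8, 25)          # batch of 4 feature windows, 25 frames
print(model(dummy).shape)              # torch.Size([4, 2])
```

Such a network would be trained on recorded gaze data with, e.g., an L2 loss between predicted and measured gaze positions.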
Our related work:
EHTask: Recognizing User Tasks from Eye and Head Movements in Immersive Virtual Reality
Research progress of user task prediction and algorithm analysis (in Chinese)
Eye Fixation Forecasting in Task-Oriented Virtual Reality
FixationNet: Forecasting Eye Fixations in Task-Oriented Virtual Environments
DGaze: CNN-Based Gaze Prediction in Dynamic Scenes
Temporal Continuity of Visual Attention for Future Gaze Prediction in Immersive Virtual Reality
SGaze: A Data-Driven Eye-Head Coordination Model for Realtime Gaze Prediction