A resource-rational account of human eye movements during immersive visual search

When you search for something in a rich 3D scene, your eyes follow specific patterns that reveal how the mind decides what to attend to. Virtual reality (VR) lets us measure that behavior in settings closer to everyday life. Standard models often assume an unrealistically simple world, so in this paper we ask how a searcher with limited time and attention ought to move their gaze when the environment is complex and preserves the structure of the real world. A computational model trained with reinforcement learning predicts gaze shifts that match human data collected in VR, suggesting that naturalistic visual search may reflect a rational use of limited mental resources.

Angela Radulescu
Assistant Professor

My research focuses on the learning mechanisms underlying changes in mental health.