Gaze Allocation Analysis for a Visually Guided Manipulation Task

From Animals to Animats 12: The 12th International Conference on the Simulation of Adaptive Behavior (SAB'12), MIT Press, 2012
In this paper we present a detailed analysis of a reward-based gaze allocation mechanism in a simulated humanoid robot. This mechanism coordinates the robot’s oculomotor system with the rest of its body motor systems whilst the robot is engaged in a pick-and-place task. The robot has to decide where its gaze should be directed in order to gather information that is relevant to the success of its physical actions. We test three predictions about gaze behaviour by varying the reach/grasp sensitivity of the actions, the observation noise, and the camera’s field of view. The results show that our reward-based gaze allocation strategy performs better and is more robust than random and round-robin gaze allocation strategies.
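To give a flavour of the comparison, the toy simulation below contrasts a greedy reward-based gaze policy with random and round-robin baselines. This is only an illustrative sketch, not the paper's actual mechanism: the three gaze targets, the uncertainty-halving fixation model, the drift factor, and the reward weights are all assumptions introduced here for demonstration.

```python
import random

def simulate(policy, steps=20, seed=0):
    """Run a gaze policy and return the reward-weighted residual
    uncertainty (lower = better task outlook). All dynamics are
    illustrative assumptions, not the paper's model."""
    rng = random.Random(seed)
    # Three hypothetical gaze targets (e.g. hand, object, placement
    # site) with different task-reward weights.
    rewards = [1.0, 2.0, 0.5]
    uncertainty = [1.0, 1.0, 1.0]
    for t in range(steps):
        i = policy(t, uncertainty, rewards, rng)
        uncertainty[i] *= 0.5  # fixating a target halves its uncertainty
        for j in range(len(uncertainty)):
            if j != i:  # unobserved targets drift back up, capped at 1.0
                uncertainty[j] = min(1.0, uncertainty[j] * 1.1)
    return sum(u * r for u, r in zip(uncertainty, rewards))

def reward_based(t, u, r, rng):
    # Greedily fixate the target with the largest expected reward gain,
    # modelled here (assumption) as uncertainty * reward.
    return max(range(len(u)), key=lambda i: u[i] * r[i])

def round_robin(t, u, r, rng):
    return t % len(u)

def random_pick(t, u, r, rng):
    return rng.randrange(len(u))

if __name__ == "__main__":
    for name, pol in [("reward", reward_based),
                      ("round-robin", round_robin),
                      ("random", random_pick)]:
        print(f"{name}: residual risk = {simulate(pol):.4f}")
```

In this toy setting the reward-based policy concentrates fixations on high-reward, high-uncertainty targets and ends with lower weighted residual uncertainty than the round-robin baseline, mirroring the qualitative result reported in the abstract.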
