Real Saccades for Virtual Robots

Vision is a central theme of research in both robotics and neuroscience. Yet, even though the requirements faced by robots and humans that need to perceive their environments are quite similar (high resolution, low latency, wide field of view, etc.), technical vision systems are fundamentally different from the human visual system. One particular reason for these differences is the special set of properties of the human eye.

Saccadic eye movements are a defining characteristic of human vision:

“Saccade refers to a rapid jerk-like movement of the eyeball which subserves vision by redirecting the visual axis to a new location.”

From John Findlay and Robin Walker (2012) Human saccadic eye movements. Scholarpedia, 7(7):5095.

Clearly, compared to a camera that is statically mounted next to a robot, human vision strongly relies on active actuation of the eyes. Importantly, the perspective on the scene changes after every eye movement which in turn influences the following saccades. Investigating computational models for saccadic eye movements therefore calls for a neurorobotics approach which directly captures this closed-loop interdependence between changing visual input and saccade generation.
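This closed loop can be illustrated with a minimal, purely hypothetical sketch (plain Python/NumPy, not the Neurorobotics Platform API): each saccade shifts the gaze, which changes the retinal image, which in turn determines the next saccade target.

```python
# Hypothetical sketch of the closed perception-action loop behind saccades:
# every gaze shift changes the visible crop of the scene, and the next
# saccade target is chosen from that *new* view. All names are illustrative.
import numpy as np

rng = np.random.default_rng(0)
scene = rng.random((100, 100))   # static "world" luminance map
gaze = np.array([50, 50])        # current fixation point (row, col)
fov = 20                         # half-width of the simulated field of view

def retinal_image(scene, gaze, fov):
    """Crop the scene around the current fixation point."""
    origin = np.clip(gaze - fov, 0, np.array(scene.shape) - 2 * fov)
    r0, c0 = origin
    return scene[r0:r0 + 2 * fov, c0:c0 + 2 * fov], origin

fixations = [gaze.copy()]
for _ in range(5):
    view, origin = retinal_image(scene, gaze, fov)
    # Pick the most conspicuous point in the *current* view as the target
    # (no inhibition of return here, so the loop may settle on one spot):
    target = np.unravel_index(np.argmax(view), view.shape)
    gaze = origin + np.array(target)   # saccade: an instantaneous jump
    fixations.append(gaze.copy())
```

The point of the sketch is only the dependency structure: the input to each iteration is produced by the motor output of the previous one, which is exactly what makes an open-loop, pre-recorded-image evaluation of saccade models insufficient.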

To address the challenge of developing and evaluating realistic models of saccadic eye movements, SP10 is closely collaborating with other partners from the Human Brain Project in Co-Design Project 4 on Visuo-Motor Integration. In this project, Rainer Goebel and Mario Senden from Maastricht University are currently developing a neural model for saccade generation with NEST.

Last week, Mario visited us in Munich to integrate a first working version of the model into the Neurorobotics Platform together with SP10 colleague Florian Walter. After an introductory talk by Mario on visuo-motor integration, we directly started developing a new experiment for the Neurorobotics Platform that controls the eyes of our virtual iCub robot based on the output of the saccade model. This experiment, for the first time, interfaces a NEST model composed of analog non-spiking neurons with the Neurorobotics Platform. In future releases of the platform, these new neuron types will give both neuroscientists and roboticists even more freedom in defining their brain models.
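To give a feel for what "analog non-spiking" means here, the following is a generic textbook rate-neuron sketch (a leaky integrator whose output is a continuous rate rather than a spike train), not the specific NEST implementation used in the experiment:

```python
# A minimal analog (non-spiking) neuron: a leaky integrator obeying
#     tau * dr/dt = -r + I(t),
# integrated with the forward Euler method. Its state r is a continuous
# value at every time step, in contrast to a spiking neuron's discrete events.
def simulate_rate_neuron(inputs, tau=0.05, dt=0.001, r0=0.0):
    """Return the rate trace for a list of input samples I(t)."""
    r, trace = r0, []
    for i in inputs:
        r += (dt / tau) * (-r + i)   # Euler step of the leaky integrator
        trace.append(r)
    return trace

# Under a constant drive I = 1.0 the rate relaxes toward the steady
# state r = I, with time constant tau:
trace = simulate_rate_neuron([1.0] * 500)
```

Because such units exchange continuous signals at every step, coupling them to a robot simulation is a matter of streaming values in both directions, without the spike-event handling needed for spiking models.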

In the next step, the saccade generation model will be connected to a salience map model to study saccades in complex visual scenes that are simulated on the Neurorobotics Platform.
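How a salience map could drive target selection can be sketched as follows; the mechanism shown (winner-take-all with inhibition of return) is a standard textbook scheme and an assumption here, not a description of the Maastricht model:

```python
# Hypothetical sketch: selecting a sequence of saccade targets from a
# precomputed salience map via winner-take-all, with inhibition of return
# so that successive saccades visit different locations.
import numpy as np

def scanpath(salience, n_fixations=3, ior_radius=5):
    """Return n_fixations (row, col) targets in decreasing salience order."""
    s = salience.astype(float).copy()
    rows, cols = np.ogrid[:s.shape[0], :s.shape[1]]
    path = []
    for _ in range(n_fixations):
        r, c = np.unravel_index(np.argmax(s), s.shape)  # winner-take-all
        path.append((r, c))
        # Inhibition of return: suppress a disc around the chosen location
        s[(rows - r) ** 2 + (cols - c) ** 2 <= ior_radius ** 2] = -np.inf
    return path
```

For example, a map with three isolated peaks yields a scanpath that visits them from strongest to weakest, which is the qualitative behavior one would want to compare against human fixation sequences in complex scenes.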

Many thanks to the prompt support from the SP10 development team, especially Kenny Sharma!

The prototype experiment running on the Neurorobotics Platform.
Florian Walter (Technical University of Munich) and Mario Senden (Maastricht University) in front of the integrated prototype experiment on saccade generation.
