Paper accepted: Towards Grasping with Spiking Neural Networks for Anthropomorphic Robot Hands

We got a paper about grasping with spiking neural networks accepted for ICANN 2017!

The complete architecture is shown in the figure below. The hand network (left) receives the proprioception of all fingers and a grasp-type signal and generates fingertip targets. Each finger network (middle) receives its finger's proprioception and fingertip target and generates motor commands.

[Figure: complete network architecture, with the hand network (left) and the finger networks (middle)]
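
To make the data flow concrete, here is a minimal, hypothetical sketch of the hierarchy described above: a hand network mapping proprioception and a grasp-type cue to fingertip targets, which feed per-finger networks that map each finger's proprioception and target to motor commands. This is not the paper's implementation; the simple LIF model, population sizes, and random weights are illustrative assumptions.

```python
# Hypothetical sketch of the hierarchical hand/finger structure (NumPy only).
import numpy as np

rng = np.random.default_rng(0)

class LIFPopulation:
    """Leaky integrate-and-fire neurons driven by an input current vector."""
    def __init__(self, n, tau=20.0, v_thresh=1.0, dt=1.0):
        self.tau, self.v_thresh, self.dt = tau, v_thresh, dt
        self.v = np.zeros(n)

    def step(self, current):
        # Leak, integrate the input current, then emit spikes at threshold.
        self.v += self.dt / self.tau * (-self.v + current)
        spikes = (self.v >= self.v_thresh).astype(float)
        self.v[spikes > 0] = 0.0  # reset after spiking
        return spikes

class HandNetwork:
    """Maps finger proprioception plus a grasp-type cue to fingertip targets."""
    def __init__(self, n_proprio, n_grasp_types, n_fingers, n_hidden=64):
        self.pop = LIFPopulation(n_hidden)
        self.w_in = rng.normal(0, 0.5, (n_hidden, n_proprio + n_grasp_types))
        self.w_out = rng.normal(0, 0.5, (n_fingers * 3, n_hidden))  # xyz per fingertip

    def step(self, proprio, grasp_type):
        spikes = self.pop.step(self.w_in @ np.concatenate([proprio, grasp_type]))
        return self.w_out @ spikes  # decoded fingertip targets

class FingerNetwork:
    """Maps one finger's proprioception plus its fingertip target to motor commands."""
    def __init__(self, n_joints, n_hidden=32):
        self.pop = LIFPopulation(n_hidden)
        self.w_in = rng.normal(0, 0.5, (n_hidden, n_joints + 3))
        self.w_out = rng.normal(0, 0.5, (n_joints, n_hidden))

    def step(self, proprio, target):
        spikes = self.pop.step(self.w_in @ np.concatenate([proprio, target]))
        return self.w_out @ spikes  # joint motor commands

# Wire the hierarchy: one hand network feeding five finger networks.
n_joints, n_fingers = 3, 5
hand = HandNetwork(n_proprio=n_fingers * n_joints, n_grasp_types=2, n_fingers=n_fingers)
fingers = [FingerNetwork(n_joints) for _ in range(n_fingers)]

proprio = rng.uniform(0, 1, n_fingers * n_joints)
grasp_type = np.array([1.0, 0.0])  # e.g. a "pinch" vs "cylinder" cue
for _ in range(50):  # let the populations integrate and spike
    targets = hand.step(proprio, grasp_type).reshape(n_fingers, 3)
    commands = [f.step(proprio[i * 3:(i + 1) * 3], targets[i])
                for i, f in enumerate(fingers)]
print(np.round(commands, 2))
```
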

Abstract:

Representation and execution of movement in biology is an active field of research relevant to neurorobotics. Humans can remember grasp motions and modify them during execution based on the shape and the intended interaction with objects. We present a hierarchical spiking neural network with a biologically inspired architecture for representing different grasp motions. We demonstrate the ability of our network to learn from human demonstration using synaptic plasticity on two different exemplary grasp types (pinch and cylinder). We evaluate the performance of the network in simulation and on a real anthropomorphic robotic hand. The network exhibits the ability to learn finger coordination and synergies between joints that can be used for grasping.

Keywords:

grasp motion representation, spiking networks, neurorobotics, motor primitives.

[1] J. C. Vasquez Tieck, H. Donat, J. Kaiser, I. Peric, S. Ulbrich, A. Roennau, M. Zöllner, and R. Dillmann, "Towards Grasping with Spiking Neural Networks for Anthropomorphic Robot Hands," in International Conference on Artificial Neural Networks (ICANN), 2017.
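
The abstract mentions learning from human demonstration with synaptic plasticity. Below is a minimal, hypothetical sketch of a pair-based STDP weight update of the kind often used in spiking networks for this purpose; it is not the rule from the paper, and the time constants, learning rates, and array shapes are illustrative assumptions.

```python
# Hypothetical pair-based STDP update with exponentially decaying traces.
import numpy as np

def stdp_step(w, pre_spikes, post_spikes, pre_trace, post_trace,
              a_plus=0.01, a_minus=0.012, tau=20.0, dt=1.0, w_max=1.0):
    # Decay the eligibility traces, then add the new spikes.
    pre_trace = pre_trace * np.exp(-dt / tau) + pre_spikes
    post_trace = post_trace * np.exp(-dt / tau) + post_spikes
    # Potentiate when a post spike follows recent pre activity,
    # depress when a pre spike follows recent post activity.
    dw = a_plus * np.outer(post_spikes, pre_trace) \
       - a_minus * np.outer(post_trace, pre_spikes)
    return np.clip(w + dw, 0.0, w_max), pre_trace, post_trace

# Example: one update for 4 presynaptic and 3 postsynaptic neurons.
w = np.full((3, 4), 0.5)
pre_tr, post_tr = np.zeros(4), np.zeros(3)
pre = np.array([1.0, 0.0, 0.0, 1.0])
post = np.array([0.0, 1.0, 0.0])
w, pre_tr, post_tr = stdp_step(w, pre, post, pre_tr, post_tr)
print(np.round(w, 3))
```
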
