
Brain-inspired Computing Course (Summer 2019)

Synopsis

In the course "Brain-inspired computing", we give an introduction to biophysical models of nerve cells (neurons) and explore principles of computation and self-organization (learning) in biological and artificial neural networks.

Starting from the ionic current flow across cell membranes, we retrace the Nobel Prize-winning work of Alan Hodgkin and Andrew Huxley to extract simplified, mathematically tractable models of neural input integration and the generation of neural responses, so-called action potentials. We study signaling between neurons via chemical synapses and analyze the firing response of leaky integrate-and-fire (LIF) models to spatio-temporal input. Since such a low-level description quickly becomes intractable in larger networks, we characterize network evolution and neural coding by statistical methods, such as Fokker-Planck equations, auto- and cross-correlation functions and dynamical systems theory.

We then turn to the wide field of neural plasticity, that is, the brain's fascinating capacity for self-organized adaptation and learning. We review experimental findings, functional plasticity models and emergent computational capabilities of neural networks, with a particular focus on short-term and long-term synaptic plasticity.

We touch on the design of artificial physical implementations of simplified neuron and synapse models in micro-electronic circuitry, enabling the development of novel high-performance neuro-inspired computing platforms. Finally, we examine deep conceptual similarities between information processing and learning in spiking neural networks on the one hand, and principles of Bayesian computation and machine learning on the other, using the examples of multilayer perceptron networks and Boltzmann machines.
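To give a flavor of the kind of model treated in the course, the following is a minimal sketch of a leaky integrate-and-fire neuron driven by a step current, written in Python (the language used in the exercises). All parameter values and variable names are illustrative assumptions, not taken from the course material:

    import numpy as np

    # Illustrative LIF parameters (assumed values, not from the course)
    tau_m    = 20e-3    # membrane time constant [s]
    v_rest   = -65e-3   # resting potential [V]
    v_reset  = -70e-3   # reset potential after a spike [V]
    v_thresh = -50e-3   # firing threshold [V]
    R_m      = 1e7      # membrane resistance [Ohm]
    dt       = 1e-4     # integration time step [s]

    t = np.arange(0.0, 0.5, dt)
    I = np.where((t > 0.1) & (t < 0.4), 2.0e-9, 0.0)  # step input current [A]

    v = np.full_like(t, v_rest)
    spike_times = []
    for k in range(1, len(t)):
        # Euler step of  tau_m * dv/dt = -(v - v_rest) + R_m * I
        v[k] = v[k-1] + (-(v[k-1] - v_rest) + R_m * I[k]) * dt / tau_m
        if v[k] >= v_thresh:          # threshold crossing: emit spike, reset
            spike_times.append(t[k])
            v[k] = v_reset

    print(f"{len(spike_times)} spikes in response to the step input")

The sketch integrates the membrane equation with a simple forward Euler scheme and replaces the action potential by a threshold-and-reset rule, which is exactly the simplification that makes the LIF model analytically and numerically tractable compared to the full Hodgkin-Huxley description.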

A recurring mathematical challenge in computational neuroscience, which will also accompany us throughout the course, is the development of a consistent mathematical description across multiple levels of abstraction, in order to balance the biophysical accuracy and the analytical tractability of a complex system. Students are expected to be familiar with calculus and linear algebra. For the computer simulations, which take a significant share of the exercises, at least basic knowledge of the Python programming language is strongly recommended.

Team

Sebastian Schmitt (Lecturer)

LSF listing

Class material (Moodle)

Practical Groups

Book Petrovici, "Form Versus Function: Theory and Models for Neuronal Substrates" (free from university network)

Book Gerstner et al., "Neuronal Dynamics: From Single Neurons to Networks and Models of Cognition" (free online version)