
Brain-inspired Computing Course (Winter 2021/2022)

Synopsis

In the course "Brain-inspired computing", we give an introduction to biophysical models of nerve cells (neurons) and explore principles of computation and self-organization (learning) in biological and artificial neural networks.

Starting from the ionic current flow across cell membranes, we retrace the Nobel Prize-winning work of Alan Hodgkin and Andrew Huxley to extract simplified, mathematically tractable models of neural input integration and the generation of neural responses, so-called action potentials. We study signaling between neurons via chemical synapses and analyze the firing response of leaky integrate-and-fire (LIF) models to spatio-temporal input. Since such a low-level description quickly becomes intractable in larger networks, we characterize network evolution and neural coding by statistical methods, such as Fokker-Planck equations, auto- and cross-correlation functions, and dynamical systems theory.

We then turn to the wide field of neural plasticity, that is, the brain's fascinating ability of self-organized adaptation and learning. We review experimental findings, functional plasticity models, and emergent computational capabilities of neural networks, with a particular focus on short-term and long-term synaptic plasticity. We also touch on the design of artificial physical implementations of simplified neuron and synapse models in micro-electronic circuitry, enabling the development of novel high-performance neuro-inspired computing platforms. Finally, we examine deep conceptual similarities between information processing and learning in spiking neural networks on the one hand, and principles of Bayesian computation and machine learning on the other, using multilayer perceptron networks and Boltzmann machines as examples.
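To give a flavour of the models treated in the course, the following is a minimal, self-contained sketch of the kind of simulation mentioned above: a single leaky integrate-and-fire neuron driven by a Poisson input spike train through an exponential current-based synapse, integrated with the forward Euler method. The parameter values and the implementation details are illustrative assumptions, not course material.

```python
import numpy as np

# --- Illustrative parameters (assumed values, not taken from the course) ---
dt      = 0.1e-3   # integration time step [s]
T       = 1.0      # simulated duration [s]
tau_m   = 20e-3    # membrane time constant [s]
tau_syn = 5e-3     # synaptic time constant [s]
E_L     = -65e-3   # leak (resting) potential [V]
V_th    = -50e-3   # spike threshold [V]
V_reset = -65e-3   # reset potential [V]
R_m     = 100e6    # membrane resistance [Ohm]
w_syn   = 50e-12   # synaptic weight: current jump per input spike [A]
rate_in = 800.0    # total Poisson input rate [Hz]

rng = np.random.default_rng(seed=42)
n_steps = int(T / dt)

V = E_L          # membrane potential
I_syn = 0.0      # synaptic current
spike_times = []

for step in range(n_steps):
    t = step * dt
    # Poisson input: at most one input spike per time bin (dt is small)
    if rng.random() < rate_in * dt:
        I_syn += w_syn
    # exponential decay of the synaptic current
    I_syn -= dt * I_syn / tau_syn
    # leaky integration of the membrane potential (forward Euler)
    V += dt * (-(V - E_L) + R_m * I_syn) / tau_m
    # threshold crossing: register an output spike and reset
    if V >= V_th:
        spike_times.append(t)
        V = V_reset

print(f"{len(spike_times)} output spikes -> rate of about {len(spike_times) / T:.1f} Hz")
```

Varying the (hypothetical) parameters w_syn and rate_in moves the neuron between a mean-driven and a fluctuation-driven regime; characterizing the latter is exactly where statistical tools such as the Fokker-Planck equation become useful.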

A recurring mathematical challenge in computational neuroscience, which will also accompany us throughout the course, is the development of a consistent mathematical description across multiple levels of abstraction, in order to balance the biophysical accuracy and the analytical tractability of a complex system. Students are expected to be familiar with calculus and linear algebra. For the computer simulations, which make up a significant share of the exercises, at least basic knowledge of the Python programming language is strongly recommended.

Team

Andreas Baumbach (Lecturer), Julian Göltz (Lead tutor), Timo Gierlich (Tutor), Philipp Spilger (Tutor)

Covid-19 information

The lecture is currently planned to take place in person during the winter semester. We will also stream the lecture via Zoom and provide the recordings via the faculty's Übungsgruppensystem. If you do not have access but feel that you should, please contact us. All in-person participants need to adhere to the university's Covid rules.

Homework

Homework sheets are distributed every Tuesday via the Übungsgruppensystem. Solutions are due the following Tuesday before the lecture (14:00), either physically in the lecture hall or digitally via the Übungsgruppensystem. Solutions to programming exercises should be executable on the EBRAINS Collaboratory and must likewise be handed in via the Übungsgruppensystem.

Literature

Petrovici, "Form Versus Function: Theory and Models for Neuronal Substrates" (free access from the university network)

Gerstner et al., "Neuronal Dynamics: From Single Neurons to Networks and Models of Cognition" (free online version)