The rule of the simulation that I would like to have is that the number of computer elements required to simulate a large physical system is only proportional to the space-time-volume of the physical system. I don't want to have an explosion. That is, if you say I want to explain this much physics, I can do it exactly and I need a certain size computer. If doubling the volume of space and time means I'll need an exponentially larger computer, I consider that against the rules.
Richard Feynman, Int. J. Theor. Phys., Vol. 21, Nos. 6/7, 1982
Physics traditionally employs three methods to study complex systems in nature: experiments (experimental physics), analytical calculations (theoretical physics) and numerical simulations (computational physics). A fourth method is to construct synthetic physical models that emulate constituents and their interactions, and to study the temporal evolution of a system under controlled conditions. This approach becomes advantageous when the number of constituents and the complexity of their interactions make numerical simulations intractable due to prohibitive power requirements or simulation times. Emulated physical models respond to internal parameter changes or external stimuli with the physical time constants of their constituents, which is an essential prerequisite for their scalability.
Our Heidelberg research group pursues this fourth approach. We are involved in international research programs aiming at building and using very-large-scale physical model electronic systems of neural circuits.
The goal of our research is to contribute to understanding the principles of information processing in the brain and to develop fundamentally different information processing devices. Such devices will help to bridge the enormous energy gap between current von Neumann machines and the brain. They are expected to exhibit similar fault tolerance because computation and memory are not spatially separated but rather spread across a network of almost identical constituents. Finally, and arguably most fundamentally, neural systems have the ability to self-organize based on internal rules and external inputs.
An overview of my present, past and planned projects can be found here. I have developed a strong engagement in the European Human Brain Project (HBP), a FET Flagship, to link our physical model approach with large-scale numerical simulations, notably those of the Blue Brain Project at EPFL Lausanne. The HBP emerged as one of two winners of an international competition and started in October 2013. In 2016 the first major milestone was achieved with the opening of the large-scale neuromorphic computing facility in Heidelberg.
Over the last 30 years I have contributed to several major particle physics experiments at DESY and CERN: JADE at PETRA, UA2 at the CERN Proton-Antiproton Collider, H1 at the HERA Electron-Proton Collider, and ATLAS at the LHC. I have also contributed to European initiatives in detector research, notably the RD1 and RD27 collaborations at CERN. An overview of my publications from this work can be found in INSPIRE.
Based on my experience designing and running custom electronic data filters ('triggers') for particle detectors, and on establishing the Heidelberg ASIC Laboratory in 1994, I initiated and developed a new line of research in Heidelberg (see above).