KIP Publications

Year 2015
Author(s) Jakob Jordan, Tom Tetzlaff, Mihai Petrovici, Oliver Breitwieser, Ilja Bytschok, Johannes Bill, Johannes Schemmel, Karlheinz Meier and Markus Diesmann
Title Deterministic neural networks as sources of uncorrelated noise for probabilistic computations
KIP number HD-KIP 15-95
KIP group(s) F9
Document type Paper
Keywords (displayed) spiking noise, decorrelation, inhibitory feedback, neural sampling
Source BMC Neuroscience 2015, 16(Suppl 1):P62
doi 10.1186/1471-2202-16-S1-P62
Abstract (en)

Neural-network models of brain function often rely on the presence of noise. To date, the interplay of microscopic noise sources and network function is only poorly understood. In computer simulations and in neuromorphic hardware, the number of noise sources (random-number generators) is limited. In consequence, neurons in large functional network models have to share noise sources and are therefore correlated. In general, it is unclear how shared-noise correlations affect the performance of functional network models. Further, there is so far no solution to the problem of how a limited number of noise sources can supply a large number of functional units with uncorrelated noise.
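The effect of sharing a limited pool of noise sources can be illustrated with a minimal sketch (not taken from the paper; all parameters and the one-source-per-unit assignment are illustrative assumptions): each of N units draws its noise stream from one of K independent generators, so two units are perfectly correlated whenever they happen to share a generator, and the mean pairwise correlation across the population falls off roughly as 1/K.

```python
import numpy as np

rng = np.random.default_rng(42)
T = 20_000   # number of time steps per noise stream
N = 100      # number of units consuming noise

for K in (5, 20, 80):
    sources = rng.normal(size=(K, T))        # K independent noise sources
    assignment = rng.integers(0, K, size=N)  # each unit draws from one source
    noise = sources[assignment]              # (N, T) shared-noise streams
    C = np.corrcoef(noise)
    # mean off-diagonal correlation: ~1 for pairs sharing a source, ~0 otherwise,
    # so on average roughly 1/K
    mean_corr = (C.sum() - N) / (N * (N - 1))
    print(f"K={K:3d}  mean pairwise correlation ~ {mean_corr:.3f}")
```

Increasing K dilutes the sharing, which matches the scaling of sampling errors with the number of noise sources described above.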

Here, we investigate the performance of neural Boltzmann machines. We show that correlations in the background activity are detrimental to the sampling performance and that the deviations from the target distribution scale inversely with the number of noise sources. Further, we show that this problem can be overcome by replacing the finite ensemble of independent noise sources with a recurrent neural network of the same number of units. As shown recently, inhibitory feedback, abundant in biological neural networks, serves as a powerful decorrelation mechanism: shared-noise correlations are actively suppressed by the network dynamics. By exploiting this effect, the network performance is significantly improved. Hence, recurrent neural networks can serve as natural finite-size noise sources for functional neural networks, both in biological and in synthetic neuromorphic substrates. Finally, we investigate the impact of the sampling network's parameters on its ability to faithfully represent a given well-defined distribution. We show that sampling networks with sufficiently strong negative feedback can intrinsically suppress correlations in the background activity, and thereby improve their performance substantially.
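The decorrelating effect of inhibitory feedback can be sketched with a toy linear-rate model (a deliberate simplification; the paper's networks are spiking, and the timescale, gain, and population size below are illustrative assumptions): all units receive a common slow fluctuation plus private noise, and a feedback term subtracts a fraction g of the previous population mean. The feedback tracks and cancels the shared slow component, lowering pairwise correlations.

```python
import numpy as np

rng = np.random.default_rng(1)
N, T, tau = 200, 20_000, 50.0  # units, time steps, timescale of shared input

# shared slow fluctuation (discretized Ornstein-Uhlenbeck-like process)
c = np.zeros(T)
for t in range(1, T):
    c[t] = c[t - 1] * (1 - 1 / tau) + rng.normal() * np.sqrt(2 / tau)

def mean_pairwise_corr(g):
    """Mean off-diagonal correlation of N units with feedback gain g."""
    x = np.zeros(N)
    xs = np.empty((T, N))
    for t in range(T):
        # inhibitory feedback: subtract g times the previous population mean
        x = c[t] + rng.normal(size=N) - g * x.mean()
        xs[t] = x
    C = np.corrcoef(xs[T // 10:].T)  # discard initial transient
    return (C.sum() - N) / (N * (N - 1))

print("no feedback:", round(mean_pairwise_corr(0.0), 3))
print("g = 0.9    :", round(mean_pairwise_corr(0.9), 3))
```

With feedback the shared low-frequency component is attenuated while the private noise passes through largely unchanged, so the mean pairwise correlation drops — a caricature of the suppression of shared-noise correlations by negative feedback described in the abstract.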

bibtex
@conference{jordan2015noise,
  author    = {Jordan, Jakob and Tetzlaff, Tom and Petrovici, Mihai and Breitwieser, Oliver and Bytschok, Ilja and Bill, Johannes and Schemmel, Johannes and Meier, Karlheinz and Diesmann, Markus},
  title     = {Deterministic neural networks as sources of uncorrelated noise for probabilistic computations},
  booktitle = {BMC Neuroscience},
  year      = {2015},
  volume    = {16},
  number    = {Suppl 1},
  pages     = {P62},
  doi       = {10.1186/1471-2202-16-S1-P62}
}
File Poster
URL BMC short abstract