
TU Berlin


Models of Neural Systems

Selected Abstracts

Optimal information transmission in a parallel array of integrate-and-fire neurons
Citation key hocht03
Author Hoch, T. and Wenning, G. and Obermayer, K.
Book title Proceedings of the 5th Göttingen Neurobiology Conference
Year 2003
Abstract It is well established that noise can improve the transmission of weak signals in threshold elements. This effect, called stochastic resonance, has been studied extensively in the context of single neurons. Metabolic considerations and neurophysiological measurements indicate that biological neural systems prefer information transmission via many parallel low-intensity channels over a few high-intensity ones. The information transmission properties of parallel summing arrays of neurons are therefore of substantial interest, and it was recently shown that the optimal noise level, which maximizes the information transmission over the array, depends on the statistical properties of the input signal and on the number of array elements (Stocks, Phys. Rev. E, 63, 041114, 2001, pp. 1-4). Adaptation of single neurons to the optimal noise level would thus require knowledge of the array size. But this is information about the global architecture of the system, which is not available locally at the single cell in any obvious way.

We consider a parallel summing array of leaky integrate-and-fire neurons in which the input to each neuron is the sum of a common aperiodic Gaussian stimulus and additive white Gaussian noise. We show that the information rate is maximized at an optimal noise level and that this optimal noise level increases sub-logarithmically with the number of neurons in the array. Furthermore, the information rate curves are very flat at the maximum, which indicates that the amount of transmitted information does not degrade dramatically over a relatively broad region around the optimal noise level. We therefore show that a local learning rule performs almost as well as a global one for a sufficiently large number of neurons in the array (N > 100). Note that this would be the case for the size of typical cell assemblies in cortex.

These findings change dramatically, however, once metabolic considerations are taken into account. Information transmission in nervous systems is metabolically expensive; in particular, the generation of action potentials consumes a large amount of energy. To ensure energy-efficient coding, we therefore postulate that it is advantageous for neurons to maximize the ratio between transmitted information and cost. We assume that the average energetic cost per unit time is a function of the activity of the neurons in the array plus a fixed baseline cost. Numerical simulations demonstrate that the optimal noise level depends on the number of neurons in the array only in the supra-threshold regime, whereas for sub-threshold signals it does not depend on the array size (N > 10). In the latter case, the single neurons can thus use locally available information to adjust to the optimal noise level, even if they do not know the global architecture of the entire system. [Supported by the DFG (SFB 618).]
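The model setup described in the abstract can be sketched in simulation: each leaky integrate-and-fire neuron integrates a common Gaussian stimulus plus its own independent white Gaussian noise, and the array output is the summed spike count per time step. This is only a minimal illustration; all parameter values (time constant, threshold, noise amplitudes) are hypothetical and are not taken from the paper.

```python
# Minimal sketch of a parallel summing array of leaky integrate-and-fire
# neurons: common Gaussian stimulus + independent white Gaussian noise per
# neuron. All parameters below are illustrative assumptions, not the
# authors' values.
import numpy as np

rng = np.random.default_rng(0)

def simulate_array(n_neurons=100, n_steps=5000, dt=0.1,
                   tau=10.0, v_thresh=0.1, v_reset=0.0,
                   signal_sd=0.5, noise_sd=0.3):
    """Return the summed spike train of the array (one count per step)."""
    # Common aperiodic Gaussian stimulus, shared by all neurons.
    signal = signal_sd * rng.standard_normal(n_steps)
    v = np.zeros(n_neurons)            # membrane potentials
    counts = np.zeros(n_steps, dtype=int)
    for t in range(n_steps):
        # Independent white Gaussian noise for each neuron.
        noise = noise_sd * rng.standard_normal(n_neurons)
        # Euler step of the leaky integrator: dv = (-v + I) * dt / tau.
        v += (-v + signal[t] + noise) * dt / tau
        spiked = v >= v_thresh
        v[spiked] = v_reset            # reset neurons that fired
        counts[t] = spiked.sum()       # summed array output
    return counts

counts = simulate_array()
print(counts.sum())
```

Sweeping `noise_sd` while estimating the mutual information between `signal` and `counts` would reproduce the qualitative effect the abstract describes: an information rate that peaks at a nonzero noise level.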
