Energy Efficient Neurons With Generalized Inverse Gaussian Conditional and Marginal Hitting Times


Bibliographic Details
Published in: IEEE Transactions on Information Theory, Vol. 61, no. 8, pp. 4390-4398
Main Authors: Xing, Jie; Berger, Toby; Sungkar, Mustafa; Levy, William B.
Format: Journal Article
Language: English
Published: New York: The Institute of Electrical and Electronics Engineers, Inc. (IEEE), 01.08.2015
ISSN: 0018-9448, 1557-9654
DOI: 10.1109/TIT.2015.2444401


More Information
Summary: Neuronal information processing is energetically costly. Energy supply restrictions on information processing have resulted in the evolution of brains to compute and communicate information with remarkable energy efficiency. Indeed, energy minimization subject to functional constraints is a unifying principle. Toward better comprehension of neuronal information processing and communication from an information-energy standpoint, we consider a continuous time, continuous state-space neuron model with a generalized inverse Gaussian (GIG) conditional density. This GIG model arises from a Lévy diffusion that contains both homogeneous Poisson processes and Wiener processes with drift as special cases. We show that, when the energy constraints consist of a tripartite family apropos of the GIG model, the distribution of input excitation, A, that maximizes bits per Joule (bpJ) generates an output interspike interval duration T that possesses a related GIG marginal distribution. Most importantly, we obtain the exact expression for the bpJ-maximizing distribution of A in terms of Bessel functions that are readily evaluated numerically. This permits display of the strictly concave curves of bpJ-maximized mutual information I(A; T). Because bpJ equals bits per second/watt, these curves serve as the Shannon-theory capacity versus power curves for GIG neuron models. We show that the noise in GIG channels is essentially multiplicative, so our results may have significance for broader classes of fading channels.
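The summary notes that the bpJ-maximizing distribution is expressed in Bessel functions that are "readily evaluated numerically." As an illustrative sketch (not the paper's actual computation), the snippet below evaluates a GIG density in one common (p, a, b) parameterization, using only the Python standard library: the modified Bessel function K_p is approximated from its integral representation, and the density is checked to integrate to 1. The parameter values are arbitrary assumptions for illustration.

```python
import math

def bessel_k(p, x, n=4000, t_max=30.0):
    """Modified Bessel function of the second kind, K_p(x), via the
    integral representation K_p(x) = \int_0^inf exp(-x*cosh t)*cosh(p*t) dt,
    approximated with the trapezoidal rule (adequate for moderate p and x)."""
    h = t_max / n
    def f(t):
        return math.exp(-x * math.cosh(t)) * math.cosh(p * t)
    s = 0.5 * (f(0.0) + f(t_max))
    for i in range(1, n):
        s += f(i * h)
    return s * h

# Illustrative GIG parameters (p, a, b) -- arbitrary, not from the paper.
p, a, b = 1.0, 2.0, 1.0
norm = (a / b) ** (p / 2) / (2.0 * bessel_k(p, math.sqrt(a * b)))

def gig_pdf(x):
    """GIG density f(x) = norm * x^(p-1) * exp(-(a*x + b/x)/2), for x > 0."""
    return norm * x ** (p - 1) * math.exp(-(a * x + b / x) / 2.0)

# Sanity check: the density should integrate to ~1 over (0, infinity).
lo, hi, m = 1e-4, 60.0, 60000
h = (hi - lo) / m
mass = 0.5 * (gig_pdf(lo) + gig_pdf(hi)) + sum(gig_pdf(lo + i * h) for i in range(1, m))
mass *= h
print(f"total probability mass = {mass:.4f}")
```

In practice one would call a library routine such as `scipy.special.kv` for K_p rather than hand-rolled quadrature; the stdlib version above just makes the "readily evaluated numerically" claim concrete in a self-contained way.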