Synaptic energy drives the information processing mechanisms in spiking neural networks

  • Received: 01 November 2012 · Accepted: 29 June 2018 · Published: 01 October 2013
  • MSC: Primary: 58F15, 58F17; Secondary: 53C35.

  • Flow of energy and free-energy minimization underpin almost every naturally occurring physical mechanism. Inspired by this fact, this work establishes an energy-based framework that spans the multiple scales of biological neural systems and integrates synaptic dynamics, synchronous spiking activity, and neural states into one consistent working paradigm. Following a bottom-up approach, a hypothetical energy function is proposed for dynamic synaptic models, based on theoretical thermodynamic principles and Hopfield networks (see the illustrative sketch after the citation below). We show that a synapse exhibits stable operating points in terms of its excitatory postsynaptic potential as a function of its synaptic strength. We postulate that synapses in a network operating at these stable points can drive the network into an internal state of synchronous firing. The presented analysis relates to the widely investigated temporally coherent activities (cell assemblies) over a certain range of time scales (binding-by-synchrony) and offers a novel explanation of the observed (poly)synchronous activities within networks in terms of synaptic (coupling) functionality. On the network level, transitions from one firing scheme to another express discrete sets of neural states; these states persist as long as the network sustains its internal synaptic energy.

    Citation: Karim El Laithy, Martin Bogdan. Synaptic energy drives the information processing mechanisms in spiking neural networks[J]. Mathematical Biosciences and Engineering, 2014, 11(2): 233-256. doi: 10.3934/mbe.2014.11.233
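    The paper's own synaptic energy function is not reproduced on this page. As a known point of reference for what the abstract alludes to (standard notation, not the paper's formula), the classical Hopfield energy for binary neuron states s_i ∈ {−1, +1}, symmetric couplings w_ij (with w_ii = 0), and biases θ_i is

    ```latex
    % Classical Hopfield network energy (Hopfield, 1982).
    % Dynamics that never increase E settle into local minima (stable states).
    E = -\frac{1}{2} \sum_{i \ne j} w_{ij}\, s_i s_j \;-\; \sum_i \theta_i s_i
    ```

    By analogy, the abstract's claim that a synapse exhibits stable operating points can be pictured as local minima of an energy landscape over synaptic strength. The following is a minimal, self-contained sketch under that assumption; the double-well `energy` function is purely hypothetical and merely stands in for the paper's (unavailable) synaptic energy:

    ```python
    # Toy double-well energy in synaptic strength w, chosen only to
    # illustrate "stable operating points" as local energy minima.
    # This is NOT the authors' energy function, which this page does not give.
    def energy(w: float) -> float:
        return (w - 0.2) ** 2 * (w - 1.0) ** 2  # minima at w = 0.2 and w = 1.0

    def grad(w: float, h: float = 1e-6) -> float:
        # Central-difference approximation of dE/dw.
        return (energy(w + h) - energy(w - h)) / (2.0 * h)

    def relax(w0: float, lr: float = 0.05, steps: int = 10_000) -> float:
        # Gradient descent: the synapse relaxes to the nearest stable point.
        w = w0
        for _ in range(steps):
            w -= lr * grad(w)
        return w

    for w0 in (0.0, 0.5, 0.7, 1.4):
        print(f"w0 = {w0:.2f}  ->  stable operating point w* = {relax(w0):.4f}")
    ```

    Running the sketch prints the stable point each initial strength relaxes to (w* ≈ 0.2 or w* ≈ 1.0), mirroring the idea that a synapse settles at one of a discrete set of operating points of its energy landscape.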

  • © 2014 the Author(s), licensee AIMS Press. This is an open access article distributed under the terms of the Creative Commons Attribution License (http://creativecommons.org/licenses/by/4.0)