Synaptic energy drives the information processing mechanisms in spiking neural networks

  • Received: 01 November 2012 Accepted: 29 June 2018 Published: 01 October 2013
  • MSC : Primary: 58F15, 58F17; Secondary: 53C35.

  • Flow of energy and free-energy minimization underpin almost every naturally occurring physical mechanism. Inspired by this fact, this work establishes an energy-based framework that spans the multiple scales of biological neural systems and integrates synaptic dynamics, synchronous spiking activity, and neural states into one consistent working paradigm. Following a bottom-up approach, a hypothetical energy function is proposed for dynamic synaptic models, based on theoretical thermodynamic principles and Hopfield networks. We show that a synapse exhibits stable operating points in terms of its excitatory postsynaptic potential as a function of its synaptic strength. We postulate that synapses in a network operating at these stable points can drive the network to an internal state of synchronous firing. The presented analysis relates to the widely investigated temporally coherent activities (cell assemblies) over a certain range of time scales (binding-by-synchrony). This introduces a novel explanation of the observed (poly)synchronous activities within networks in terms of synaptic (coupling) functionality. On the network level, transitions from one firing scheme to another express discrete sets of neural states. These neural states persist as long as the network sustains its internal synaptic energy.
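    The article's specific synaptic energy function is not reproduced in the abstract. As a minimal sketch of the Hopfield-style energy it builds on, the classic quadratic energy of a binary network decreases monotonically under asynchronous updates until the state settles into an attractor — the network-level analogue of the "stable operating points" discussed above. The coupling matrix and update loop below are illustrative, not the authors' model:

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    # Symmetric coupling matrix with zero diagonal (classic Hopfield assumptions).
    n = 8
    W = rng.normal(size=(n, n))
    W = (W + W.T) / 2.0
    np.fill_diagonal(W, 0.0)

    def energy(s, W):
        """Hopfield energy E = -1/2 * s^T W s for a state s in {-1, +1}^n."""
        return -0.5 * s @ W @ s

    # Asynchronous sign updates never increase the energy, so the trajectory
    # converges to a stable fixed point (an energy minimum / attractor).
    s = rng.choice([-1.0, 1.0], size=n)
    energies = [energy(s, W)]
    for _ in range(50):
        i = rng.integers(n)                      # pick one unit at random
        s[i] = 1.0 if W[i] @ s >= 0 else -1.0    # align with its local field
        energies.append(energy(s, W))
    ```

    The monotone decrease holds because flipping unit `i` to the sign of its local field `W[i] @ s` can only lower the term `-s_i h_i` in the energy, given symmetric weights and a zero diagonal.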

    Citation: Karim El Laithy, Martin Bogdan. Synaptic energy drives the information processing mechanisms in spiking neural networks[J]. Mathematical Biosciences and Engineering, 2014, 11(2): 233-256. doi: 10.3934/mbe.2014.11.233

    Related Papers:

    [1] Achilleas Koutsou, Jacob Kanev, Maria Economidou, Chris Christodoulou . Integrator or coincidence detector --- what shapes the relation of stimulus synchrony and the operational mode of a neuron?. Mathematical Biosciences and Engineering, 2016, 13(3): 521-535. doi: 10.3934/mbe.2016005
    [2] Stefano Cosenza, Paolo Crucitti, Luigi Fortuna, Mattia Frasca, Manuela La Rosa, Cecilia Stagni, Lisa Usai . From Net Topology to Synchronization in HR Neuron Grids. Mathematical Biosciences and Engineering, 2005, 2(1): 53-77. doi: 10.3934/mbe.2005.2.53
    [3] Guowei Wang, Yan Fu . Spatiotemporal patterns and collective dynamics of bi-layer coupled Izhikevich neural networks with multi-area channels. Mathematical Biosciences and Engineering, 2023, 20(2): 3944-3969. doi: 10.3934/mbe.2023184
    [4] Manuela Aguiar, Ana Dias, Miriam Manoel . Gradient and Hamiltonian coupled systems on undirected networks. Mathematical Biosciences and Engineering, 2019, 16(5): 4622-4644. doi: 10.3934/mbe.2019232
    [5] Diego Fasoli, Stefano Panzeri . Mathematical studies of the dynamics of finite-size binary neural networks: A review of recent progress. Mathematical Biosciences and Engineering, 2019, 16(6): 8025-8059. doi: 10.3934/mbe.2019404
    [6] Hwayeon Ryu, Sue Ann Campbell . Stability, bifurcation and phase-locking of time-delayed excitatory-inhibitory neural networks. Mathematical Biosciences and Engineering, 2020, 17(6): 7931-7957. doi: 10.3934/mbe.2020403
    [7] Anna Cattani . FitzHugh-Nagumo equations with generalized diffusive coupling. Mathematical Biosciences and Engineering, 2014, 11(2): 203-215. doi: 10.3934/mbe.2014.11.203
    [8] Xiaomeng Feng, Taiping Wang, Xiaohang Yang, Minfei Zhang, Wanpeng Guo, Weina Wang . ConvWin-UNet: UNet-like hierarchical vision Transformer combined with convolution for medical image segmentation. Mathematical Biosciences and Engineering, 2023, 20(1): 128-144. doi: 10.3934/mbe.2023007
    [9] Ningning Zhao, Shihao Cui . Study on 4D taxiing path planning of aircraft based on spatio-temporal network. Mathematical Biosciences and Engineering, 2023, 20(3): 4592-4608. doi: 10.3934/mbe.2023213
    [10] Kyle Wendling, Cheng Ly . Firing rate distributions in a feedforward network of neural oscillators with intrinsic and network heterogeneity. Mathematical Biosciences and Engineering, 2019, 16(4): 2023-2048. doi: 10.3934/mbe.2019099


  • This article has been cited by:

    1. Karim Ellatihy, Martin Bogdan, 2017, Chapter 45, 978-3-319-68599-1, 389, 10.1007/978-3-319-68600-4_45
  • © 2014 the Author(s), licensee AIMS Press. This is an open access article distributed under the terms of the Creative Commons Attribution License (http://creativecommons.org/licenses/by/4.0)

