
Enhancement of gamma oscillations in E/I neural networks by increase of difference between external inputs

  • Experimental observations suggest that gamma oscillations are enhanced when the difference between the components of an external stimulus increases. To explain these observations, we first construct a small excitatory/inhibitory (E/I) neural network of IAF neurons in which the external current input to the E-neuron population differs from that to the I-neuron population. Simulation results show that the greater the difference between the external inputs to excitatory and inhibitory neurons, the stronger the gamma oscillations in the small E/I network. Furthermore, we construct a large-scale complicated neural network with multi-layer columns, simulated with a novel CUDA-based algorithm, to explore how gamma oscillations are regulated by external stimuli. We find that gamma oscillations can also be caused and enhanced by the difference between the external inputs in a large-scale neural network with a complicated structure. These results agree well with existing experimental findings.

    Citation: Xiaochun Gu, Fang Han, Zhijie Wang, Kaleem Kashif, Wenlian Lu. Enhancement of gamma oscillations in E/I neural networks by increase of difference between external inputs[J]. Electronic Research Archive, 2021, 29(5): 3227-3241. doi: 10.3934/era.2021035




    Neuronal oscillations in the gamma range (30-90 Hz), which appear in different areas of the brain, are thought to carry important information for cognitive and perceptual functions [7, 3]. Experimental observations have suggested that gamma oscillations in the visual cortex can be enhanced by increasing the difference between the components of an external stimulus (for example, by increasing the illumination contrast of a grating stimulus). Adjamian et al. [1] showed that gamma activity was stronger in response to a higher luminance difference in gratings. Henrie et al. [6] found that the gamma-band power in V1 was strengthened as the difference between the light and dark areas of the stimulus increased. Saleem et al. [15] reported that the narrowband gamma oscillation in the mouse visual cortex was enhanced with the difference in light intensity.

    These experimental results can be summarized as follows: gamma oscillations are strengthened by an increase of the difference between the external inputs to neurons; that is, if a neural system receives two types of external inputs, the greater the difference between the two inputs, the stronger the gamma oscillation of the system. However, this is counterintuitive and differs from the conventional understanding that a neural network tends to synchronize [19, 5, 20] and generate gamma oscillations when the neurons are uniform and the external inputs to them are equal. Therefore, it is necessary to establish models of biological neural networks that reproduce the experimental observations and to investigate how gamma oscillations change with the external inputs.

    In this paper, we first establish a small excitatory/inhibitory (E/I) neuronal network [11, 16] composed of Integrate-and-Fire (IAF) neurons in which the external current input to E-neurons differs from that to I-neurons. The simulation results of the small E/I network show that gamma oscillations are enhanced as the difference between the external inputs to E- and I-neurons increases, which agrees well with the biological experimental results. We then study gamma oscillations in a large-scale neural network with a complicated structure, using a novel CUDA-based algorithm for its simulation. We further find that gamma oscillations can be caused and enhanced by the difference between the external inputs in large-scale neural networks with complicated structures.

    In this section, we study gamma oscillations regulated by external inputs in a small E/I neural network. Both the excitatory neurons (E-neurons) and the inhibitory neurons (I-neurons) in this small network are described by the Integrate-and-Fire (IAF) neuron model [14] and a conductance-based synapse model [4]. The IAF neuron model is given by Eq. (1):

    $$\tau \frac{dV_i}{dt} = -(V_i - V_L) + R\left(\sum_{j=1,\, j \neq i}^{N} I_{syn,ij} + I_{ext}\right) \tag{1}$$

    where Vi represents the membrane potential of neuron i, τ is the membrane time constant, VL is the equilibrium potential of the leakage current, R is the membrane resistance, Isyn,ij is the synaptic current transmitted from neuron j to neuron i, N denotes the number of neurons in the network, and Iext is the external stimulus to neuron i.

    The conductance-based synapse model is described as follows:

    $$\begin{cases} I_{syn,ij} = -g_{max}\, s_{ij}\,(V_i - E_{syn}), \\[4pt] \dfrac{ds_{ij}}{dt} = \alpha F(V_j)(1 - s_{ij}) - \beta s_{ij} \end{cases} \tag{2}$$

    where gmax is the maximum conductance of the synapse (the synaptic weight), Esyn denotes the reversal potential of the synapse, sij denotes the opening level of the ion-channel gate of the synapse from neuron j to neuron i, α is the gate enhancement factor, β is the gate decay factor, and Vj is the membrane potential of the presynaptic neuron j. After presynaptic neuron j fires, the resulting action potential reaches the synapse after a certain time (the synaptic delay) [21], at which point F(Vj) = 1; otherwise F(Vj) = 0.

    Assuming that the step length is Δt, the model network can be described by the following discrete equations for each time step t1 → t2 (t2 = t1 + Δt) [13]:

    $$\begin{cases} V_i(t_2) = V_i(t_1) + \dfrac{\Delta t}{\tau}\left(-V_i(t_1) + V_L + R\left(\sum_{j=1}^{N} I_{syn,ij}(t_1) + I_{ext}\right)\right) \\[4pt] I_{syn,ij}(t_1) = -g_{max}\, s_{ij}(t_1)\,\big(V_i(t_1) - E_{syn}\big) \quad (j \neq i) \\[4pt] s_{ij}(t_2) = s_{ij}(t_1) + \big(\alpha F(V_j)(1 - s_{ij}(t_1)) - \beta s_{ij}(t_1)\big)\Delta t \end{cases} \tag{3}$$
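    As a minimal illustration of how the discrete update in Eq. (3) can be stepped forward in time, the sketch below advances the membrane potentials and synaptic gating variables of an all-to-all network by one Euler step. It is written in plain C++ (and also compiles in a CUDA source file); the container layout, the function name euler_step, and the simplified spike handling are illustrative assumptions rather than the authors' code.

```cpp
// Sketch of one Euler step of Eq. (3) for an all-to-all E/I network.
// Data layout and function name are illustrative assumptions.
#include <vector>

struct NetParams {
    double tau, V_L, R, V_th, V_reset;   // per-neuron constants (shared here for brevity)
    double alpha, beta;                  // synaptic gate factors
};

// V: membrane potentials; s[i][j]: gating variable of the synapse j -> i;
// gmax/Esyn: synaptic parameters indexed by the presynaptic neuron j;
// F[j]: 1.0 if the delayed spike of presynaptic neuron j arrives at this step, else 0.0;
// Iext: external current per neuron; dt: time step.
void euler_step(std::vector<double>& V, std::vector<std::vector<double>>& s,
                const std::vector<double>& gmax, const std::vector<double>& Esyn,
                const std::vector<double>& F, const std::vector<double>& Iext,
                const NetParams& p, double dt)
{
    const std::size_t N = V.size();
    std::vector<double> Vnew(N);
    for (std::size_t i = 0; i < N; ++i) {
        // Sum the conductance-based synaptic currents onto neuron i (Eq. (3), middle line).
        double Isyn = 0.0;
        for (std::size_t j = 0; j < N; ++j) {
            if (j == i) continue;
            Isyn += -gmax[j] * s[i][j] * (V[i] - Esyn[j]);
        }
        // Membrane potential update (Eq. (3), first line).
        Vnew[i] = V[i] + dt / p.tau * (-V[i] + p.V_L + p.R * (Isyn + Iext[i]));
        // Gating variable update for every synapse onto neuron i (Eq. (3), last line).
        for (std::size_t j = 0; j < N; ++j)
            s[i][j] += (p.alpha * F[j] * (1.0 - s[i][j]) - p.beta * s[i][j]) * dt;
        // Threshold-and-reset rule of the IAF model (spike times would be recorded here).
        if (Vnew[i] >= p.V_th) Vnew[i] = p.V_reset;
    }
    V.swap(Vnew);
}
```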

    The small E/I network consists of 400 excitatory neurons and 100 inhibitory neurons with all-to-all connections (see Fig. 1). The parameters gmax, Esyn, τ are set to 0.00048, 0 mV, and 5 ms for E-neurons, and to 0.012, -75 mV, and 1 ms for I-neurons, respectively [4, 11]. The parameters VL, R, α, β are set to VL = -65 mV, R = 10 kΩ, α = 0.9, β = 0.003 for both E- and I-neurons. The spiking threshold potential Vth is set to -45 mV for both E- and I-neurons. When the membrane potential Vi reaches the threshold potential, neuron i emits a pulse (action potential) and Vi is then reset to the resting membrane potential Vreset (Vreset = -65 mV for both E- and I-neurons). The synaptic delay is set to 3 ms.

    Figure 1.  The structure of the small E/I network with external stimuli. Wiring among E- and I-neurons is all-to-all; only three cells of the E-neuron population (red dots) and of the I-neuron population (blue dots) are depicted here. Directed wiring is red for excitatory and blue for inhibitory connections. wEE, wII, wEI, and wIE are the synaptic weights (gmax) of the E-to-E, I-to-I, I-to-E, and E-to-I connections, respectively.

    The excitatory and inhibitory neuron populations are assumed to receive different inputs mapped from the external stimulus, owing to their different receptive fields and the non-uniform spatial distribution of the stimulus (for example, the different distributions of the gray values of pixels in a visual stimulus) [8]. Two values, S1 and S2, are therefore taken as the external current inputs to the E- and I-neuron populations, respectively [8] (i.e., IextE = S1 for the excitatory population and IextI = S2 for the inhibitory population). We first simulate the network with typical values of S1 and S2 to determine under which conditions the small network generates gamma oscillations. We then regulate the gamma oscillations by increasing the difference between S1 and S2 in the following two cases (a code sketch of the corresponding input assignment is given below): (i) increase S1 gradually from 0 to 1.0 while keeping S2 unchanged; (ii) increase S2 gradually from 0 to 1.0 while keeping S1 unchanged.
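    As a small sketch of how the two external inputs can be assigned to the populations, the external-input vector used in both regulation cases could be built as follows; case (i) then amounts to rebuilding it with increasing S1, and case (ii) with increasing S2. The index convention (E-neurons stored first) and the helper name make_external_input are assumptions for illustration.

```cpp
// Sketch: build the external-input vector for the small E/I network,
// where the first NE entries are excitatory neurons (IextE = S1) and the
// remaining NI entries are inhibitory neurons (IextI = S2).
#include <vector>

std::vector<double> make_external_input(double S1, double S2,
                                        std::size_t NE = 400, std::size_t NI = 100)
{
    std::vector<double> Iext(NE + NI, S1);   // E-neurons receive S1
    for (std::size_t i = NE; i < NE + NI; ++i)
        Iext[i] = S2;                        // I-neurons receive S2
    return Iext;
}
```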

    The network simulation and data analysis are done with MATLAB 2012a. The biological time of this small E/I network is set to 1 s and the time step is 0.01 ms. Before the simulations, we made the excitatory synaptic current of each neuron equal to its inhibitory synaptic current by adjusting the parameters of the network, so that there is no net synaptic current to the neurons under the initial conditions of the simulations. We first simulated the small network under two typical inputs (S1 = 0.6, S2 = 0 and S1 = 0, S2 = 0.2) and then plotted the spiking times of the neurons, the population spiking activities, and the power spectra, as shown in Fig. 2 and Fig. 3. Fig. 2(b) and Fig. 3(b) show the average population activities, calculated from Fig. 2(a) and Fig. 3(a), respectively, by counting spikes in time bins of width 1 ms and convolving with a Gaussian of width 100 ms. The power spectra of the average population activities in Fig. 2(c) and Fig. 3(c) are calculated with the fast Fourier transform (FFT) [2]. The oscillation frequency is defined as the frequency component whose amplitude is largest among all frequency components of the average population activity, and the oscillation power is defined as the power at the oscillation frequency. The peak of a distinct bump in the power spectrum is therefore the oscillation power, and the frequency at this peak is the oscillation frequency. The peak frequency in Fig. 2(c) is near 48 Hz and that in Fig. 3(c) is near 47 Hz, both within the gamma frequency band. The high peaks of the distinct bumps in Fig. 2(c) and Fig. 3(c) imply that strong gamma oscillations can be caused when E- and I-neurons receive different external inputs.

    Figure 2.  Gamma oscillations caused by the typical input S1 = 0.6, S2 = 0 (50 ms to 450 ms of a 1 s simulation). (a) Raster plot of the spiking times of neurons. (b) Average population activity. (c) Power spectrum of the population activity.
    Figure 3.  Gamma oscillations caused by the typical input S1 = 0, S2 = 0.2 (50 ms to 450 ms of a 1 s simulation). (a) Raster plot of the spiking times of neurons. (b) Average population activity. (c) Power spectrum of the population activity.

    It is worth noting that we use the relative value of the power (called Relative Power here), defined as Relative Power = Power/PowerSUM (Power is the power of each frequency component and PowerSUM is the sum of the power over all frequencies in the power spectrum), to represent the magnitude of power in the power spectrum. The relative power is more reasonable than the absolute power of a frequency component, since the absolute power depends not only on the strength of the oscillation at the corresponding frequency but also on the firing rate of the neurons in the network.
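    As a rough illustration of this normalization (not the authors' MATLAB analysis code), the sketch below computes a one-sided power spectrum of a binned population-activity signal with a naive DFT and divides each component by the total power; the 1 ms binning assumed in the comments follows the description above.

```cpp
// Sketch: power spectrum of a binned population-activity signal and the
// "Relative Power" normalization described in the text (Power / PowerSUM).
// A naive DFT keeps the example self-contained; a real analysis would use an FFT.
// With 1 ms bins, component k corresponds to a frequency of k * 1000 / n Hz.
#include <cmath>
#include <vector>

std::vector<double> relative_power(const std::vector<double>& activity)
{
    const double kPi = 3.14159265358979323846;
    const std::size_t n = activity.size();
    const std::size_t nfreq = n / 2 + 1;                  // one-sided spectrum
    std::vector<double> power(nfreq, 0.0);
    for (std::size_t k = 0; k < nfreq; ++k) {
        double re = 0.0, im = 0.0;
        for (std::size_t t = 0; t < n; ++t) {
            const double ang = -2.0 * kPi * double(k) * double(t) / double(n);
            re += activity[t] * std::cos(ang);
            im += activity[t] * std::sin(ang);
        }
        power[k] = re * re + im * im;                     // absolute power of component k
    }
    double total = 0.0;
    for (double p : power) total += p;                    // PowerSUM over all components
    if (total > 0.0)
        for (double& p : power) p /= total;               // Relative Power = Power / PowerSUM
    return power;
}
```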

    Figure 4 summarizes how the gamma oscillations generated in the small E/I network are regulated by the difference between S1 and S2 in the two regulation cases. In the first case, as S1 (the external input to E-neurons) is increased from 0 to 1.0, i.e., the difference between the inputs to E- and I-neurons is increased from 0 to 1.0, the peak power of the gamma oscillations increases gradually (green curve). In the second case, the power becomes stronger and stronger (blue curve) as S2 (the external input to I-neurons) is increased from 0 to 1.0, i.e., the difference between the inputs to E- and I-neurons is again increased from 0 to 1.0. The steeper upward trend for the increase of S2 (blue curve) indicates that the gamma oscillations of the network are more sensitive to changes of the input to I-neurons.

    Figure 4.  Increase of the peak power of gamma oscillations in the small E/I neural network with increasing difference between S1 and S2. The increase of the peak power in the second case (blue curve) is more evident than that in the first case (green curve).

    In short, the simulation results of the small E/I network show that gamma oscillations can be caused by the difference between the external inputs to excitatory and inhibitory neurons and become stronger as the input difference increases, which agrees well with the existing biological experimental findings. However, a small neural network model cannot fully describe the actual biological nervous system, which contains a large number of neurons and has a complex network structure. Therefore, we next construct a large-scale neural network model with a complicated structure to further explore how the intensity of gamma oscillations changes with the external stimuli.

    To reflect the structural complexity of real biological neural systems in the brain, we first construct a column composed of multiple layers and then use 10 such multi-layer columns to set up a large-scale neural network model [9, 12]. Each column is composed of four network layers: layer 2/3, layer 4, layer 5, and layer 6, as shown in Fig. 5. There are 8 types of neurons in a single column, namely E23, I23, E4, I4, E5, I5, E6, and I6, where "E" and "I" denote excitatory and inhibitory neurons, respectively, and the number indicates the layer. Figure 5 shows the intra-layer and inter-layer neuron connections within a single column according to Table 2 [11]: red circles indicate excitatory neuron populations, with excitatory connections depicted by red directed wiring, and blue circles indicate inhibitory neuron populations, with inhibitory connections depicted by blue directed wiring. IextE and IextI (black wiring in Fig. 5) are the external inputs to E- and I-neurons, respectively, in each column.

    Table 2.  Connection types and parameters of neurons within a column.
    Presynaptic neuron | Postsynaptic neurons | α | β | gmax | d (ms)
    E23 | E23, E4, E5, I23 | 0.9 | 0.003 | 0.004 | 2
    I23 | E23, E5, E6, I23, I5, I6 | 0.9 | 0.003 | 0.05 | 1
    E4 | E4, E5, E6, I4 | 0.9 | 0.003 | 0.004 | 2
    I4 | E4, I4 | 0.9 | 0.003 | 0.05 | 1
    E5 | E23, E4, E5, E6, I5 | 0.9 | 0.003 | 0.004 | 2
    I5 | E23, E5, E6, I23, I5, I6 | 0.9 | 0.003 | 0.05 | 1
    E6 | E6, I6 | 0.9 | 0.003 | 0.004 | 2
    I6 | E23, E5, E6, I5, I6 | 0.9 | 0.003 | 0.05 | 1

    Figure 5.  Intra-layer and inter-layer connections in a single column.

    The excitatory and inhibitory neurons in this large-scale complicated network are both described by the Integrate-and-Fire (IAF) model (see Eq. (1)), and their synapses by the conductance-based synapse model (see Eq. (2)). In the following simulations, the neuron parameters within columns are listed in Table 1 [4]; a single column contains 1,000 neurons, comprising 800 excitatory and 200 inhibitory neurons. Within a column, the connection probability between neurons is 50%, and the connection types and parameters are listed in Table 2. Between columns, the connection probability is 7%, and the connection types and parameters are listed in Table 3 [11].
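    A compact sketch of this probabilistic wiring rule is given below. The flat synapse list, the random-number handling, and the function name build_connections are illustrative assumptions; the per-connection-type values of α, β, gmax, and d from Tables 2 and 3 would be attached to each created connection afterwards.

```cpp
// Sketch: probabilistic wiring of a multi-column network, with a 50% connection
// probability inside a column and a 7% probability between columns.
#include <random>
#include <vector>

struct Synapse { int pre; int post; };

std::vector<Synapse> build_connections(int num_columns, int neurons_per_column,
                                       double p_within = 0.5, double p_between = 0.07,
                                       unsigned seed = 42)
{
    std::mt19937 rng(seed);
    std::uniform_real_distribution<double> uni(0.0, 1.0);
    const int N = num_columns * neurons_per_column;
    std::vector<Synapse> synapses;
    for (int pre = 0; pre < N; ++pre) {
        for (int post = 0; post < N; ++post) {
            if (pre == post) continue;
            const bool same_column =
                (pre / neurons_per_column) == (post / neurons_per_column);
            const double p = same_column ? p_within : p_between;
            if (uni(rng) < p)
                synapses.push_back({pre, post});  // connection-type parameters set later
        }
    }
    return synapses;
}
```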

    Table 1.  Parameters and counts of different types of neurons within a column.
    Type | N | Vth (mV) | Vreset (mV) | Esyn (mV) | τ (ms) | R (kΩ) | VL (mV)
    E23 | 200 | -47 | -65 | 0 | 5 | 10 | -65
    I23 | 50 | -45 | -65 | -75 | 1 | 10 | -65
    E4 | 200 | -47 | -65 | 0 | 5 | 10 | -65
    I4 | 50 | -45 | -65 | -75 | 1 | 10 | -65
    E5 | 200 | -47 | -65 | 0 | 5 | 10 | -65
    I5 | 50 | -45 | -65 | -75 | 1 | 10 | -65
    E6 | 200 | -47 | -65 | 0 | 5 | 10 | -65
    I6 | 50 | -45 | -65 | -75 | 1 | 10 | -65

    Table 3.  Connection types and parameters of neurons between columns.
    Presynaptic neuron | Postsynaptic neuron | α | β | gmax | d (ms)
    E23 | I23 | 0.8 | 0.001 | 0.98 | 1
    E4 | I4 | 0.8 | 0.001 | 0.98 | 1
    E5 | E5 | 0.8 | 0.001 | 0.16 | 1
    E5 | I5 | 0.8 | 0.001 | 0.98 | 1
    E5 | E23 | 0.8 | 0.001 | 0.16 | 1
    E5 | I23 | 0.8 | 0.001 | 0.98 | 1
    E6 | I6 | 0.8 | 0.001 | 0.98 | 1


    Based on the existing CUDA parallel algorithm [17], combined with a synapse optimization algorithm [18], which were proposed to simulate large-scale neural networks with simple structures, we design a novel CUDA-based algorithm to simulate the large-scale complicated neural network with multi-layer columns; the simulation framework is shown in Fig. 6. The red dashed box on the left of Fig. 6 shows the establishment of the complicated column structure of the large-scale network, including the initialization of the connection probabilities between neurons, the initialization of the parameters of each type of neuron according to Table 1, the establishment of neuron connections within and between columns, and the setting of the parameters gmax and d according to Table 2 and Table 3. The pseudo-code for constructing the complicated network structure is shown in Algorithm 1 (see Fig. 7). The flow chart under the red dashed box in Fig. 6 illustrates the computation of the neuron states (the synaptic current Isyn,ij and the membrane potential Vi), whose algorithm is similar to that proposed in [17]. In the novel CUDA-based algorithm, to simulate a large-scale network with N neurons we use a one-dimensional grid (kernel) consisting of N/blockDim (number of neurons divided by the block size) one-dimensional blocks, with 1024 one-dimensional threads in each block (i.e., the block size is 1024), on the GPU [10]. The index of each block is blockIdx.x = 0, 1, ..., m, ..., (N/1024 - 1), and the index of each thread within a block is threadIdx.x = 0, 1, ..., n, ..., 1023. The index of the neuron mapped to a single thread is therefore blockIdx.x * 1024 + threadIdx.x; for example, the state calculations of the (m * 1024 + n)-th neuron at all time steps are executed in the n-th thread of the m-th block. The CUDA-based parallel architecture implementing the state calculations of all neurons in the simulation of the large-scale complicated network with N neurons is shown on the right of Fig. 6.
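    A minimal CUDA sketch of this one-thread-per-neuron mapping is shown below. It is not the authors' implementation: the per-neuron arrays, the precomputed synaptic current Isyn[i], and the scalar neuron parameters are simplifying assumptions; only the index computation blockIdx.x * blockDim.x + threadIdx.x and the Euler update of Eq. (3) follow the description in the text.

```cuda
// Sketch: one-thread-per-neuron mapping used by the CUDA-based simulation.
// Each thread updates the membrane potential of a single IAF neuron for one
// time step; the per-neuron synaptic current Isyn[] is assumed to have been
// computed by a separate kernel.
#include <cuda_runtime.h>

__global__ void update_neuron_states(double* V, const double* Isyn, const double* Iext,
                                     double tau, double V_L, double R,
                                     double V_th, double V_reset,
                                     int* spiked, double dt, int N)
{
    // Neuron index mapped from the block/thread indices (blockDim.x == 1024 in the text).
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i >= N) return;

    // Euler step of the IAF equation (cf. Eq. (3), first line).
    double v = V[i] + dt / tau * (-V[i] + V_L + R * (Isyn[i] + Iext[i]));

    // Threshold-and-reset rule; record whether the neuron fired at this step.
    if (v >= V_th) { spiked[i] = 1; v = V_reset; }
    else           { spiked[i] = 0; }

    V[i] = v;
}
```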

    Figure 6.  The framework of simulation for the large-scale complicated network with multi-layer columns. t0 is the initial time. Δt is the time step.
    Figure 7.  The pseudo-code for constructing the multi-layer column structure of the large-scale complicated network. The variable m_uiNumCell_per_column denotes the number of neurons in each column, the variables idx_column and jdx_column denote the IDs of columns, and the variables ii_idx_column and jj_jdx_column denote the IDs of neurons.

    Therefore, we used a one-dimensional grid (kernel) consisting of 10 one-dimensional blocks, with 1024 one-dimensional threads per block, on the GPU to simulate our large-scale complicated neural network with 10 columns. The biological time of the network is set to 0.5 s and the simulation time step to 0.01 ms. The software environment is 64-bit Windows 10, the VS2015 programming environment, the C++ programming language, and the CUDA parallel computing platform. The hardware environment is an Intel i7-7700HQ CPU with an NVIDIA GeForce GTX 1060 (6 GB) GPU. Similarly to the simulation of the small network in Section 2.2, we first simulate the large-scale network with typical inputs S1 and S2 (IextE = S1, IextI = S2) to observe whether gamma oscillations can be generated in the large-scale complicated neural network. The data analysis is similar to that of the small network, and the simulation results, plotted in MATLAB 2010a, are shown in Fig. 8. Figure 8(a) and Figure 8(b) show the raster plots of the spiking times of neurons in each layer of the columns for S1 = 0.3, S2 = 0 and for S1 = 0, S2 = 0.3, respectively (red dots represent the discharges of inhibitory neurons and blue dots those of excitatory neurons). In both cases, periodic discharges of neurons are generated in the large-scale complicated network. Figure 8(c) shows the power spectra corresponding to Fig. 8(a) (red curve) and Fig. 8(b) (blue curve). Both spectra show distinct peaks, implying that the large-scale network generates obvious oscillations when E- and I-neurons receive different external inputs, and the dominant frequencies at these peaks lie within the gamma frequency band (30 Hz-90 Hz).
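    For completeness, a hedged sketch of the corresponding host-side configuration for the 10-column network (10 blocks of 1024 threads, reusing the update_neuron_states kernel sketched above) might look as follows; the memory handling, the scalar neuron parameters, and the omitted synaptic-current kernel are simplifying assumptions.

```cuda
// Sketch: host-side launch configuration for the 10-column network
// (10 blocks x 1024 threads, one thread per neuron; 10,000 neurons in total).
// The synaptic-current kernel and the result collection are omitted for brevity;
// in a full implementation the per-type neuron parameters of Table 1 would be
// passed as device arrays rather than scalars.
#include <cuda_runtime.h>

int main() {
    const int N = 10 * 1000;                 // 10 columns x 1,000 neurons per column
    const int blockSize = 1024;
    const int gridSize  = (N + blockSize - 1) / blockSize;   // = 10 blocks

    double *d_V, *d_Isyn, *d_Iext;
    int *d_spiked;
    cudaMalloc(&d_V,      N * sizeof(double));
    cudaMalloc(&d_Isyn,   N * sizeof(double));
    cudaMalloc(&d_Iext,   N * sizeof(double));
    cudaMalloc(&d_spiked, N * sizeof(int));
    // ... initialize d_V to Vreset, d_Iext from S1/S2, d_Isyn to 0 (omitted) ...

    const double dt = 0.01;                              // ms, simulation time step
    const int steps = static_cast<int>(500.0 / dt);      // 0.5 s of biological time
    for (int step = 0; step < steps; ++step) {
        // A kernel updating d_Isyn from the spikes of the previous step would run here.
        update_neuron_states<<<gridSize, blockSize>>>(d_V, d_Isyn, d_Iext,
                                                      /*tau=*/5.0, /*V_L=*/-65.0, /*R=*/10.0,
                                                      /*V_th=*/-45.0, /*V_reset=*/-65.0,
                                                      d_spiked, dt, N);
    }
    cudaDeviceSynchronize();

    cudaFree(d_V); cudaFree(d_Isyn); cudaFree(d_Iext); cudaFree(d_spiked);
    return 0;
}
```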

    Figure 8.  Gamma oscillations generated in the large-scale complicated neural network. (a) Raster plots of the spiking times of neurons in each layer of the columns when S1 = 0.3, S2 = 0. (b) Raster plots of the spiking times of neurons in each layer of the columns when S1 = 0, S2 = 0.3. (c) Power spectra of the average population activities for the two input cases.

    Next, we regulate the large-scale complicated network to observe how the peak power of the gamma oscillations changes as the difference between S1 and S2 is increased, using the two regulation cases of Section 2.2. In the first case, the peak power of the gamma oscillations increases gradually as S1 is increased from 0 to 1.0 (i.e., the input difference is increased from 0 to 1.0). In the second case, the peak power also grows gradually as S2 is increased from 0 to 1.0 (i.e., the input difference is again increased from 0 to 1.0). Summarizing the large amount of data obtained from many simulations, the peak power of the gamma oscillations increases as the difference between the inputs to E- and I-neurons grows from 0.1 to 1.0; the results are shown in Fig. 9.

    Figure 9.  Increase of the peak power of gamma oscillations in the large-scale complicated network with increasing difference between S1 and S2.

    In short, the simulation results show that gamma oscillations can also be caused by the difference between the external inputs to excitatory and inhibitory neurons in the large-scale neural network with a complicated structure, and that they become stronger as the input difference increases, which is consistent with the existing biological experimental findings.

    To explain the biological experimental observations that gamma oscillations are enhanced by an increase of the difference between the components of external stimuli (e.g., an increase of the illumination contrast of a grating stimulus), we first constructed a small excitatory/inhibitory (E/I) neural network of IAF neurons with different external inputs to the E- and I-neuron populations (E- and I-neurons have different receptive fields and therefore receive different external inputs when the components of the external stimulus differ). We studied the small E/I network under two regulation cases, and the simulation results show that the greater the difference between the inputs to the E- and I-neuron populations, the stronger the gamma oscillation. Furthermore, a large-scale complicated neural network with multi-layer columns was constructed and simulated with a novel CUDA-based algorithm to explore gamma oscillations. We further found that gamma oscillations can be caused and enhanced by the difference between the external inputs in a large-scale neural network with a complicated structure. The results of this paper agree well with the biological experimental observations, which is helpful for understanding the mechanism of the enhancement of gamma oscillations by external stimuli. In the future, we will increase the number of columns and the types and numbers of neurons to further expand the network scale for studying gamma oscillations. Moreover, we will further explore the cognitive functions of gamma oscillations with our models.

    This work was supported by the National Natural Science Foundation of China (Grant Nos. 11972115 and 11572084), the Shanghai Municipal Science and Technology Major Project (No. 2018SHZDZX01), the Key Laboratory of Computational Neuroscience and Brain-Inspired Intelligence (LCNBI), and ZJLab.



    [1] Induced gamma activity in primary visual cortex is related to luminance and not color contrast: An MEG study. Journal of Vision (2008) 8: 1-7.
    [2] Simultaneous recordings from the primary visual cortex and lateral geniculate nucleus reveal rhythmic interactions and a cortical source for gamma-band oscillations. Journal of Neuroscience (2014) 34: 7639-7644.
    [3] Mechanisms of gamma oscillations. Annu. Rev. Neurosci. (2012) 35: 203-225.
    [4] Theoretical Neuroscience: Computational and Mathematical Modeling of Neural Systems. Cambridge: MIT Press (2001).
    [5] Stability of synchronization under stochastic perturbations in leaky integrate and fire neural networks of finite size. Discrete Contin. Dyn. Syst. Ser. B (2019) 24: 5183-5201.
    [6] LFP power spectra in V1 cortex: The graded effect of stimulus contrast. Journal of Neurophysiology (2005) 94: 479-490.
    [7] Oscillatory synchronization in large-scale cortical networks predicts perception. Neuron (2011) 69: 387-396.
    [8] Cortical oscillations arise from contextual interactions that regulate sparse coding. Proc. Nat. Acad. Sci. USA (2014) 111: 6780-6785.
    [9] The columnar organization of the neocortex. Brain (1997) 120: 701-722.
    [10] A configurable simulation environment for the efficient simulation of large-scale spiking neural networks on graphics processors. Neural Networks (2009) 22: 791-800.
    [11] S. A. Neymotin, H. Lee, E. Park, A. A. Fenton and W. W. Lytton, Emergence of physiological oscillation frequencies in a computer model of neocortex, Front. Comput. Neurosci., 5 (2011), 19. doi: 10.3389/fncom.2011.00019
    [12] Functionally independent columns of rat somatosensory barrel cortex revealed with voltage-sensitive dye imaging. Journal of Neuroscience (2001) 21: 8435-8446.
    [13] W. H. Press, S. A. Teukolsky and W. T. Vetterling, Numerical recipes in C: The art of scientific computing, IEEE Concurrency, 6 (1992), 79.
    [14] L. Sacerdote and M. T. Giraudo, Stochastic Integrate and Fire Models: A review on mathematical methods and their applications, Stochastic biomathematical models, Lecture Notes in Math., Math. Biosci. Subser., Springer, Heidelberg, 2058 (2013), 99–148. doi: 10.1007/978-3-642-32157-3_5
    [15] Subcortical source and modulation of the narrowband gamma oscillation in mouse visual cortex. Neuron (2017) 93: 315-322.
    [16] E. Wallace, M. Benayoun, W. van Drongelen and J. D. Cowan, Emergent oscillations in networks of stochastic spiking neurons, PLOS ONE, 6 (2011). doi: 10.1371/journal.pone.0014804
    [17] A novel parallel clock-driven algorithm for simulation of neuronal networks based on virtual synapse. Simulation (2020) 94: 415-427.
    [18] A novel time-event-driven algorithm for simulating spiking neural networks based on circular array. Neurocomputing (2018) 292: 121-129.
    [19] Delay-induced synchronization transition in small-world Hodgkin-Huxley neuronal networks with channel blocking. Discrete Contin. Dyn. Syst. Ser. B (2011) 16: 607-621.
    [20] B. Zhen, Z. Li and Z. Song, Influence of time delay in signal transmission on synchronization between two coupled FitzHugh-Nagumo neurons, Applied Sciences, 9 (2019), 2159. doi: 10.3390/app9102159
    [21] B. Zhen, D. Zhang and Z. Son, Complexity induced by external stimulations in a neural network system with time delay, Math. Probl. Eng., 2020 (2020), 5472351, 9 pp. doi: 10.1155/2020/5472351
  • © 2021 the Author(s), licensee AIMS Press. This is an open access article distributed under the terms of the Creative Commons Attribution License (http://creativecommons.org/licenses/by/4.0)