2009, Bio-Inspired Systems: …
We present an evolving neural network model in which synapses appear and disappear stochastically according to bio-inspired probabilities. These are in general nonlinear functions of the local fields felt by neurons (akin to electrical stimulation) and of the global average field (representing total energy consumption). We find that initial degree distributions then evolve towards stationary states which can either be fairly homogeneous or highly heterogeneous, depending on parameters. The critical cases, which can result in scale-free distributions, are shown to correspond, under a mean-field approximation, to nonlinear drift-diffusion equations. We show how appropriate choices of parameters yield good quantitative agreement with published experimental data concerning synaptic densities during brain development (synaptic pruning).
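To make the mechanism concrete, here is a minimal Python sketch of such stochastic synapse creation and removal; node degree stands in crudely for the local field, and a fixed removal excess plays the role of the pruning pressure. Parameters and functional forms are illustrative choices, not the paper's:

```python
import numpy as np

rng = np.random.default_rng(0)

N = 200                  # neurons
p0 = 0.1                 # initial connection probability
steps = 2000
n_add, n_rem = 10, 12    # synapses created/removed per step; excess removal -> net pruning

A = (rng.random((N, N)) < p0).astype(int)
np.fill_diagonal(A, 0)

def attachment_prob(k, gamma=1.0):
    """Nonlinear preference in the degree, standing in for the local-field dependence."""
    w = (k + 1.0) ** gamma
    return w / w.sum()

for _ in range(steps):
    k = A.sum(axis=0)                      # in-degree as a crude proxy for the local field
    targets = rng.choice(N, size=n_add, p=attachment_prob(k))
    sources = rng.integers(0, N, size=n_add)
    A[sources, targets] = 1
    np.fill_diagonal(A, 0)                 # no self-synapses
    rows, cols = np.nonzero(A)
    if len(rows) > n_rem:                  # remove synapses uniformly at random
        idx = rng.choice(len(rows), size=n_rem, replace=False)
        A[rows[idx], cols[idx]] = 0

print("mean in-degree after pruning:", A.sum(axis=0).mean())
```

Varying gamma and the add/remove imbalance moves the stationary degree distribution between homogeneous and heterogeneous regimes, in the spirit of the abstract.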
PLOS ONE, 2019
Living neuronal networks in dissociated neuronal cultures are widely known for their ability to generate highly robust spatiotemporal activity patterns in various experimental conditions. These include neuronal avalanches satisfying the power scaling law and thereby exemplifying self-organized criticality in living systems. A crucial question is how these patterns can be explained and modeled in a way that is biologically meaningful, mathematically tractable and yet broad enough to account for neuronal heterogeneity and complexity. Here we propose a simple model which may offer an answer to this question. Our derivations are based on just a few phenomenological observations concerning the input-output behavior of an isolated neuron. A distinctive feature of the model is that at the simplest level of description it comprises only two variables: a network activity variable and an exogenous variable corresponding to the energy needed to sustain the activity and modulate the efficacy of signal transmission. Strikingly, this simple model is already capable of explaining the emergence of network spikes and bursts in developing neuronal cultures. The model's behavior and predictions are supported by empirical observations and published experimental evidence on the behavior of cultured neurons exposed to oxygen and energy deprivation. At the larger, network scale, introduction of the energy-dependent regulatory mechanism enables the network to balance on the edge of the network percolation transition. Network activity in this state shows population bursts satisfying the scaling avalanche conditions. This network state is self-sustainable and represents a balance between global network-wide processes and the spontaneous activity of individual elements.
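A toy version of such a two-variable activity-energy system can be written down directly. The functional forms below are our own illustrative choices (not the paper's equations), picked so that activity consumes energy and energy gates self-excitation, which already yields noise-triggered bursts:

```python
import numpy as np

rng = np.random.default_rng(1)
dt, steps = 0.1, 20000
A, E = 0.01, 1.0                  # network activity, available energy
tau_A, tau_E = 1.0, 50.0          # fast activity, slow energy recovery
E0, cost, gain = 1.0, 2.0, 8.0    # illustrative parameters

trace = []
for _ in range(steps):
    # activity: energy-gated self-excitation plus a weak stochastic drive
    dA = (-A + gain * E * A * (1.0 - A)) / tau_A
    # energy: slow replenishment minus consumption proportional to activity
    dE = (E0 - E) / tau_E - cost * A * E
    A = max(A + dt * dA + 0.01 * np.sqrt(dt) * rng.normal(), 0.0)
    E = min(max(E + dt * dE, 0.0), E0)
    trace.append(A)

trace = np.array(trace)
print("fraction of time in a burst:", (trace > 0.2).mean())
```

A burst ignites when recovered energy makes the quiescent state unstable, rapidly depletes E, and collapses, after which the slow energy variable sets the inter-burst interval.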
Philosophical Transactions of the Royal Society B: Biological Sciences, 2005
Cortical activity is the product of interactions among neuronal populations. Macroscopic electrophysiological phenomena are generated by these interactions. In principle, the mechanisms of these interactions afford constraints on biologically plausible models of electrophysiological responses. In other words, the macroscopic features of cortical activity can be modelled in terms of the microscopic behaviour of neurons. An evoked response potential (ERP) is the mean electrical potential measured from an electrode on the scalp, in response to some event. The purpose of this paper is to outline a population density approach to modelling ERPs.
Physical Review E 90, 032709 (2014)
We investigate a mean-field model of interacting synapses on a directed neural network. Our interest lies in the slow adaptive dynamics of synapses, which are driven by the fast dynamics of the neurons they connect. Cooperation is modeled from the usual Hebbian perspective, while competition is modeled by an original polarity-driven rule. The emergence of a critical manifold culminating in a tricritical point is crucially dependent on the presence of synaptic competition. This leads to a universal 1/t power-law relaxation of the mean synaptic strength along the critical manifold and an equally universal 1/√t relaxation at the tricritical point, to be contrasted with the exponential relaxation that is otherwise generic. In turn, this leads to the natural emergence of long- and short-term memory from different parts of parameter space in a synaptic network, which is the most original and important result of our present investigations.
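The quoted power laws follow from the generic normal form of such a slow relaxation; a sketch of the standard argument, in our notation rather than the paper's:

```latex
% S: mean synaptic strength; r: distance from criticality; g, u: effective couplings.
\begin{aligned}
\dot S &= -rS - gS^{2} - uS^{3}, \qquad r>0 \;\Rightarrow\; S \sim e^{-rt}
  \quad \text{(generic exponential relaxation)},\\
r=0:\quad \dot S &= -gS^{2} \;\Rightarrow\; S(t)=\frac{S_0}{1+gS_0 t}\sim \frac{1}{gt}
  \quad \text{(critical manifold)},\\
r=g=0:\quad \dot S &= -uS^{3} \;\Rightarrow\; S(t)=\frac{S_0}{\sqrt{1+2uS_0^{2}t}}\sim \frac{1}{\sqrt{2ut}}
  \quad \text{(tricritical point)}.
\end{aligned}
```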
Physical Review E, 2010
We consider a noise-driven network of integrate-and-fire neurons. The network evolves as a result of the activities of the neurons, following spike-timing-dependent plasticity rules. We apply a self-consistent mean-field theory to the system to obtain the mean activity level for the system as a function of the mean synaptic weight, which predicts a first-order transition and hysteresis between a noise-dominated regime and a regime of persistent neural activity. Assuming Poisson firing statistics for the neurons, the plasticity dynamics of a synapse under the influence of the mean-field environment can be mapped to the dynamics of an asymmetric random walk in synaptic-weight space. Using a master equation for small steps, we predict a narrow distribution of synaptic weights that scales with the square root of the plasticity rate for the stationary state of the system, given plausible physiological parameter values describing neural transmission and plasticity. The dependence of the distribution on the synaptic weight of the mean-field environment allows us to determine the mean synaptic weight self-consistently. The effects of fluctuations in the total synaptic conductance and in the plasticity step sizes are also considered. Such fluctuations result in a smoothing of the first-order transition for a low number of afferent synapses per neuron, and a broadening of the synaptic weight distribution, respectively.
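A minimal simulation of such an asymmetric random walk in synaptic-weight space reproduces the square-root scaling of the stationary width; the drift toward a preferred weight w_star below is an illustrative stand-in for the mean-field environment:

```python
import numpy as np

rng = np.random.default_rng(2)
n_syn = 5000
w = np.full(n_syn, 0.5)              # synaptic weights, confined to [0, 1]
step = 0.01                          # plasticity step size (small-step regime)
w_star = 0.5                         # weight favoured by the drift (our choice)

for _ in range(20000):
    # asymmetric walk: potentiation probability decreases above w_star
    p_pot = np.clip(0.5 - (w - w_star), 0.0, 1.0)
    up = rng.random(n_syn) < p_pot
    w += np.where(up, step, -step)
    np.clip(w, 0.0, 1.0, out=w)      # reflecting boundaries of weight space

print("mean %.3f, std %.3f (width shrinks like sqrt(step))" % (w.mean(), w.std()))
```

Halving `step` shrinks the stationary standard deviation by roughly 1/sqrt(2), the scaling the abstract refers to.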
Neurocomputing, 2003
In this paper we propose a new nonlinear evolution model of neuronal activities to obtain the average number density, which is used to describe neurocommunication among populations of neurons. The average number density is a function of the amplitude, phase and time. The number density of the diffusion process of neurocommunication is given for the active states of two populations of coupled oscillators under perturbation by both periodic stimulation and random noise. It is emphasized that the oscillatory coupling strengths and initial conditions within and between two populations of neurons are very important for investigating the mechanism of the transmission process. In particular, the model presented in this paper can be used to describe the evolution process of the amplitudes in activities of multiple interactive populations of neurons.
Communications in Mathematical Sciences, 2012
This paper reviews our recent work addressing the role of both synaptic-input and connectivity-architecture fluctuations in coarse-grained descriptions of integrate-and-fire (I&F) point-neuron network models. Beginning with the most basic coarse-grained description, the all-to-all coupled, mean-field model, which ignores all fluctuations, we add the effects of the two types of fluctuations one at a time. To study the effects of synaptic-input fluctuations, we derive a kinetic-theoretic description, first in the form of a Boltzmann equation in (2+1) dimensions, simplifying that to an advection-diffusion equation, and finally reducing the dimension to a system of two (1+1)-dimensional kinetic equations via the maximum entropy principle. In the limit of an infinitely fast conductance relaxation time, we derive a Fokker-Planck equation which captures the bifurcation between a bistable, hysteretic operating regime of the network when the amount of synaptic-input fluctuations is small, and a stable regime when the amount of fluctuations increases. To study the effects of complex neuronal-network architecture, we incorporate the network connectivity statistics in the mean-field description, and investigate the dependence of these statistics on the statistical properties of the neuronal firing rates for three network examples with increasingly complex connectivity architecture.
Long-term, repeated measurements of individual synaptic properties have revealed that synapses can undergo significant directed and spontaneous changes over time scales of minutes to weeks. These changes are presumably driven by a large number of activity-dependent and activity-independent molecular processes, yet how these processes integrate to determine the totality of synaptic size remains unknown. Here we propose, as an alternative to detailed, mechanistic descriptions, a statistical approach to synaptic size dynamics. The basic premise of this approach is that the integrated outcome of the myriad processes that drive synaptic size dynamics is effectively described as a combination of multiplicative and additive processes, both of which are stochastic and taken from distributions parametrically affected by physiological signals. We show that this seemingly simple model, known in probability theory as the Kesten process, can generate rich dynamics which are qualitatively similar to the dynamics of individual glutamatergic synapses recorded in long-term time-lapse experiments in ex-vivo cortical networks. Moreover, we show that this stochastic model, which is insensitive to many of its underlying details, quantitatively captures the distributions of synaptic sizes measured in these experiments, the long-term stability of such distributions and their scaling in response to pharmacological manipulations. Finally, we show that the average kinetics of new postsynaptic density formation measured in such experiments is also faithfully captured by the same model. The model thus provides a useful framework for characterizing synapse size dynamics at steady state, during initial formation of such steady states, and during their convergence to new steady states following perturbations. These findings show the strength of a simple low-dimensional statistical model to quantitatively describe synapse size dynamics as the integrated result of many underlying complex processes.
Citation: Statman A, Kaufman M, Minerbi A, Ziv NE, Brenner N (2014) Synaptic Size Dynamics as an Effectively Stochastic Process. PLoS Comput Biol 10(10): e1003846.
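A minimal sketch of a Kesten process of the kind described above, with illustrative parameter choices satisfying the stationarity condition E[log a] < 0:

```python
import numpy as np

rng = np.random.default_rng(3)
n_syn, steps = 10000, 2000
w = np.ones(n_syn)                     # synaptic sizes (arbitrary units)

for _ in range(steps):
    a = rng.normal(0.95, 0.05, n_syn)  # multiplicative factor; E[log a] < 0 -> stationary
    b = rng.exponential(0.05, n_syn)   # additive, strictly positive fluctuations
    w = np.maximum(a * w + b, 0.0)     # Kesten update: w <- a*w + b

print("mean %.2f  median %.2f  (right-skewed, heavy-tailed)" % (w.mean(), np.median(w)))
```

The multiplicative term produces the characteristic skewed, heavy-tailed stationary size distribution; the additive term keeps small synapses from vanishing.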
2021
This article presents a biological neural network model driven by inhomogeneous Poisson processes accounting for the intrinsic randomness of synapses. The main novelty is the introduction of local interactions: each firing neuron triggers an instantaneous increase in electric potential to a fixed number of randomly chosen neurons. We prove that, as the number of neurons approaches infinity, the finite network converges to a nonlinear mean-field process characterised by a jump-type stochastic differential equation. We show that this process displays a phase transition: the activity of a typical neuron in the infinite network either rapidly dies out, or persists forever, depending on the global parameters describing the intensity of interconnection. This provides a way to understand the emergence of persistent activity triggered by weak input signals in large neural networks.
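A finite-network caricature of this jump dynamics is easy to simulate. The intensity function, leak, and parameters below are illustrative assumptions, chosen so that varying the kick amplitude moves the network across the die-out/persist transition:

```python
import numpy as np

rng = np.random.default_rng(4)
N, K, kick = 1000, 10, 0.35        # neurons, targets per spike, jump amplitude
x = 2.0 * rng.random(N)            # membrane potentials, some above threshold
dt, steps = 0.01, 5000

def intensity(v):
    return np.maximum(v - 1.0, 0.0)   # firing intensity above threshold 1 (our choice)

activity = []
for _ in range(steps):
    x -= dt * x                               # leak toward rest
    fire = rng.random(N) < dt * intensity(x)  # inhomogeneous Poisson firing
    for i in np.nonzero(fire)[0]:
        x[i] = 0.0                            # reset after the spike
        x[rng.choice(N, K, replace=False)] += kick  # kicks to K random neurons
    activity.append(int(fire.sum()))

print("spikes per step: start %.2f, end %.2f"
      % (np.mean(activity[:500]), np.mean(activity[-500:])))
```

With K*kick small the initial transient dies out; above a critical interaction strength the activity is self-sustaining, mirroring the phase transition of the mean-field limit.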
Frontiers in Computational Neuroscience, 2011
Understanding the computational capabilities of the nervous system means to "identify" its emergent multiscale dynamics. For this purpose, we propose a novel model-driven identification procedure and apply it to sparsely connected populations of excitatory integrate-and-fire neurons with spike frequency adaptation (SFA). Our method does not characterize the system from its microscopic elements in a bottom-up fashion, and does not resort to any linearization. We investigate networks as a whole, inferring their properties from the response dynamics of the instantaneous discharge rate to brief and aspecific suprathreshold stimulations. While several available methods assume generic expressions for the system as a black box, we adopt a mean-field theory for the evolution of the network transparently parameterized by identified elements (such as dynamic timescales), which are in turn non-trivially related to single-neuron properties. In particular, from the elicited transient responses, the input-output gain function of the neurons in the network is extracted and direct links to the microscopic level are made available: indeed, we show how to extract the decay time constant of the SFA, the absolute refractory period and the average synaptic efficacy. In addition, and contrary to previous attempts, our method captures the system dynamics across bifurcations separating qualitatively different dynamical regimes. The robustness and the generality of the methodology are tested on controlled simulations, reporting good agreement between theoretically expected and identified values. The assumptions behind the underlying theoretical framework make the method readily applicable to biological preparations like cultured neuron networks and in vitro brain slices.
Frontiers in Systems Neuroscience, 2015
We present numerical simulations of metastable states in heterogeneous neural fields that are connected along heteroclinic orbits. Such trajectories are possible representations of transient neural activity as observed, for example, in the electroencephalogram. Based on previous theoretical findings on learning algorithms for neural fields, we directly construct synaptic weight kernels from Lotka-Volterra neural population dynamics without supervised training approaches. We deliver a MATLAB neural field toolbox validated by two examples of one- and two-dimensional neural fields. We demonstrate trial-to-trial variability and distributed representations in our simulations which might therefore be regarded as a proof-of-concept for more advanced neural field models of metastable dynamics in neurophysiological data.
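As a self-contained illustration of Lotka-Volterra population dynamics connected along a heteroclinic cycle (a classic May-Leonard system rather than the toolbox's neural fields; all parameters are our own):

```python
import numpy as np

# Asymmetric Lotka-Volterra competition (May-Leonard type): with 0 < alpha < 1 < beta
# and alpha + beta > 2, the attractor is a heteroclinic cycle visiting three saddles.
alpha, beta = 0.5, 1.7
rho = np.array([[1.0, alpha, beta],
                [beta, 1.0, alpha],
                [alpha, beta, 1.0]])
x = np.array([0.6, 0.2, 0.2])
dt = 0.01

dominant = []
for _ in range(60000):
    x += dt * x * (1.0 - rho @ x)
    x = np.maximum(x, 1e-9)   # tiny floor (stand-in for noise) keeps passage times finite
    dominant.append(int(np.argmax(x)))

print("saddle-to-saddle transitions:", int(np.sum(np.diff(dominant) != 0)))
```

The trajectory lingers near each saddle and then switches, the same metastable, sequential structure the neural-field simulations exploit.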
2018
The ability to process and store information is considered a characteristic trait of intelligent systems. In biological neural networks, learning is strongly believed to take place at the synaptic level, in terms of modulation of synaptic efficacy. It can thus be interpreted as the expression of a collective phenomenon, emerging when neurons connect to each other to form a complex network of interactions. In this work, we represent learning as an optimization problem, implemented as a local search in synaptic space for specific configurations, known as solutions, which make a neural network able to accomplish a series of different tasks. For instance, we would like the network to adapt the strength of its synaptic connections so as to be capable of classifying a series of objects, by assigning to each object its corresponding class label. Supported by a series of experiments, it has been suggested that synapses may exploit only a very small number of synaptic states...
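A minimal sketch of learning as local search in a synaptic space with very few synaptic states: a perceptron with binary weights trained by single-synapse flips (our illustrative setup, not the paper's protocol):

```python
import numpy as np

rng = np.random.default_rng(9)
n, P = 100, 30                          # synapses, patterns to classify
X = rng.choice([-1, 1], size=(P, n))    # input patterns
y = rng.choice([-1, 1], size=P)         # class labels
w = rng.choice([-1, 1], size=n)         # binary synapses: only two synaptic states

def errors(w):
    return int(np.sum(np.sign(X @ w) != y))

e = errors(w)
for sweep in range(100):                # local search in synaptic space
    for i in rng.permutation(n):
        w[i] = -w[i]                    # propose flipping one synapse
        e_new = errors(w)
        if e_new <= e:
            e = e_new                   # keep flips that do not increase the error
        else:
            w[i] = -w[i]                # otherwise revert
    if e == 0:
        break

print("training errors after local search:", e)
```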
Scientific Reports, 2014
The dynamics of neural networks is often characterized by collective behavior and quasi-synchronous events, where a large fraction of neurons fire in short time intervals, separated by uncorrelated firing activity. These global temporal signals are crucial for brain functioning. They strongly depend on the topology of the network and on the fluctuations of the connectivity. We propose a heterogeneous mean-field approach to neural dynamics on random networks that explicitly preserves the disorder in the topology at growing network sizes, and leads to a set of self-consistent equations. Within this approach, we provide an effective description of microscopic and large-scale temporal signals in a leaky integrate-and-fire model with short-term plasticity, where quasi-synchronous events arise. Our equations provide a clear analytical picture of the dynamics, evidencing the contributions of both periodic (locked) and aperiodic (unlocked) neurons to the measurable average signal. In particular, we formulate and solve a global inverse problem of reconstructing the in-degree distribution from the knowledge of the average activity field. Our method is very general and applies to a large class of dynamical models on dense random networks.
2015
In the present PhD thesis, we study neuronal structures at different scales, from synapses to neural networks. Our goal is to develop mathematical models and their analysis, in order to determine how the properties of synapses at the molecular level shape their activity and propagate to the network level. This change of scale can be formulated and analyzed using several tools such as partial differential equations, stochastic processes and numerical simulations. In the first part, we compute the mean time for a Brownian particle to arrive at a narrow opening defined as the small cylinder joining two tangent spheres. The method relies on a Möbius conformal transformation applied to the Laplace equation. We also estimate, when the particle starts inside a boundary layer near the hole, the splitting probability to reach the hole before leaving the boundary layer, which is also expressed using a mixed boundary-value Laplace equation. Using these results, we develop model equations and the...
Neural Networks, 1996
We consider a randomly connected neural network with linear threshold elements which update in discrete time steps. The two main features of the network are: (1) equally distributed and purely excitatory connections and (2) synaptic depression after repetitive firing. We focus on the time evolution of the expected network activity. Four types of qualitative behavior are investigated: singular excitation, convergence to a constant activity, oscillation, and chaos. Their occurrence is discussed as a function of the average number of connections and the synaptic depression time. Our model relies on experiments with a slice culture of disinhibited embryonic rat spinal cord. The dynamics of these networks essentially depends on the following characteristics: the low non-structured connectivity, the high synaptic depression time, and the large EPSP with respect to the threshold value.
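A minimal re-implementation of this kind of network is straightforward. For brevity, the sketch below models depression as a per-neuron silent period after firing rather than per-synapse depression, and all parameter values are illustrative:

```python
import numpy as np

rng = np.random.default_rng(5)
N, k = 400, 8                     # neurons, average number of excitatory connections
depress_steps = 5                 # synaptic depression time after firing
W = (rng.random((N, N)) < k / N).astype(float)   # purely excitatory, unstructured

active = rng.random(N) < 0.05     # initial excitation
depressed = np.zeros(N, dtype=int)

activity = []
for _ in range(200):              # discrete-time updates
    drive = W @ active
    fires = (depressed == 0) & (drive >= 1.0)    # linear threshold elements
    depressed = np.maximum(depressed - 1, 0)
    depressed[fires] = depress_steps             # depression after repetitive firing
    active = fires
    activity.append(active.mean())

print("late-time mean activity:", float(np.mean(activity[-50:])))
```

Sweeping `k` and `depress_steps` moves the expected activity between the regimes listed above: a single transient (singular excitation), constant activity, oscillation, or irregular dynamics.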
Biological Cybernetics, 1999
The collective behavior of cortical neurons is strongly affected by the presence of noise at the level of individual cells. In order to study these phenomena in large-scale assemblies of neurons, we consider networks of firing-rate neurons with linear intrinsic dynamics and nonlinear coupling, belonging to a few types of cell populations and receiving noisy currents. Asymptotic equations as the number of neurons tends to infinity (mean field equations) are rigorously derived based on a probabilistic approach. These equations are implicit on the probability distribution of the solutions which generally makes their direct analysis difficult. However, in our case, the solutions are Gaussian, and their moments satisfy a closed system of nonlinear ordinary differential equations (ODEs), which are much easier to study than the original stochastic network equations, and the statistics of the empirical process uniformly converge towards the solutions of these ODEs. Based on this description, we analytically and numerically study the influence of noise on the collective behaviors, and compare these asymptotic regimes to simulations of the network. We observe that the mean field equations provide an accurate description of the solutions of the network equations for network sizes as small as a few hundreds of neurons. In particular, we observe that the level of noise in the system qualitatively modifies its collective behavior, producing for instance synchronized oscillations of the whole network, desynchronization of oscillating regimes, and stabilization or destabilization of stationary solutions. These results shed new light on the role of noise in shaping the collective dynamics of neurons, and give us clues for understanding similar phenomena observed in biological networks.
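The Gaussian moment closure can be made concrete with a one-population example. Choosing erf as the coupling nonlinearity gives the Gaussian expectation in closed form, E[erf(X)] = erf(mu / sqrt(1 + 2v)) for X ~ N(mu, v); the setup below is our illustration, not the paper's general system:

```python
import numpy as np
from scipy.special import erf

rng = np.random.default_rng(6)
N, J, sig = 500, 2.0, 0.5         # neurons, coupling strength, noise amplitude
dt, steps = 0.01, 5000
x = rng.normal(0.5, 1.0, N)       # network state (biased start so both flows agree)

mu, v = x.mean(), x.var()         # moment-ODE state under the Gaussian closure
for _ in range(steps):
    # network: linear leak + nonlinear mean-field coupling + independent noise
    x += dt * (-x + J * erf(x).mean()) + sig * np.sqrt(dt) * rng.normal(size=N)
    # closed moment equations for the Gaussian solution
    mu += dt * (-mu + J * erf(mu / np.sqrt(1.0 + 2.0 * v)))
    v += dt * (-2.0 * v + sig**2)

print("network mean %.3f vs moment ODE %.3f" % (x.mean(), mu))
```

Even at N = 500 the two means agree closely, consistent with the abstract's observation that the mean-field description is accurate for networks of a few hundred neurons.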
The European Physical Journal B, 2005
The quintessential property of neuronal systems is their intensive patterns of selective synaptic connections. The current work describes a physics-based approach to neuronal shape modeling and synthesis, and its application to the simulation of neuronal development and the formation of neuronal communities. Starting from images of real neurons, geometrical measurements are obtained and used to construct probabilistic models which can subsequently be sampled in order to produce morphologically realistic neuronal cells. Such cells are progressively grown while monitoring their connections along time, which are analysed in terms of percolation concepts. However, unlike traditional percolation, the critical point is reached along the growth stages rather than by increasing the density of cells, which remains constant throughout the neuronal growth dynamics. It is shown, through simulations, that growing beta cells tend to reach percolation sooner than their alpha counterparts with the same diameter. Also, the percolation becomes more abrupt for higher densities of cells, being markedly sharper for the beta cells. In addition to the importance of the reported concepts and methods to computational neuroscience, the possibility of reaching percolation through morphological growth of a fixed number of objects represents in itself a novel paradigm of great theoretical and practical interest for the areas of statistical physics and critical phenomena.
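A stripped-down version of percolation through morphological growth at fixed cell density, with disks of growing reach instead of realistic arbors (all parameters illustrative):

```python
import numpy as np
from scipy.spatial import distance_matrix
from scipy.sparse import csr_matrix
from scipy.sparse.csgraph import connected_components

rng = np.random.default_rng(8)
N = 300
pts = rng.random((N, 2))                # fixed cell positions: density stays constant
D = distance_matrix(pts, pts)

for r in np.linspace(0.01, 0.08, 8):    # morphological growth: each cell reaches out to r
    A = csr_matrix((D < 2 * r) & (D > 0))
    _, labels = connected_components(A, directed=False)
    giant = np.bincount(labels).max() / N
    print("reach %.3f -> largest connected fraction %.2f" % (r, giant))
```

The largest-cluster fraction jumps as the reach grows past a critical value, with the number and positions of cells held fixed, which is the paradigm the abstract highlights.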
2019
We present a mean-field formalism able to predict the collective dynamics of large networks of conductance-based interacting spiking neurons. We apply this formalism to several neuronal models, from the simplest Adaptive Exponential Integrate-and-Fire model to the more complex Hodgkin-Huxley and Morris-Lecar models. We show that the resulting mean-field models are capable of predicting the correct spontaneous activity of both excitatory and inhibitory neurons in asynchronous irregular regimes, typical of cortical dynamics. Moreover, it is possible to quantitatively predict the population's response to external stimuli in the form of external spike trains. This mean-field formalism therefore provides a paradigm to bridge the scale between population dynamics and the microscopic complexity of individual cell physiology.
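For reference, a minimal Euler integration of a single Adaptive Exponential Integrate-and-Fire neuron, the simplest of the models mentioned; parameter values are standard textbook choices, not taken from the paper:

```python
import numpy as np

# Adaptive Exponential Integrate-and-Fire (Brette & Gerstner 2005), illustrative parameters.
C, gL, EL, VT, DT = 200.0, 10.0, -70.0, -50.0, 2.0   # pF, nS, mV, mV, mV
a, b, tau_w = 2.0, 60.0, 300.0                        # nS, pA, ms
Vr, Vspike, I = -58.0, 0.0, 500.0                     # reset (mV), cutoff (mV), input (pA)

dt, T = 0.1, 1000.0                                   # ms
V, w = EL, 0.0
spikes = []
for step in range(int(T / dt)):
    dV = (gL * (EL - V) + gL * DT * np.exp((V - VT) / DT) - w + I) / C
    dw = (a * (V - EL) - w) / tau_w
    V += dt * dV
    w += dt * dw
    if V >= Vspike:                                   # spike: reset plus adaptation jump
        V = Vr
        w += b
        spikes.append(step * dt)

print("spike count:", len(spikes), "first ISIs (ms):", np.diff(spikes[:5]).round(1))
```

The lengthening inter-spike intervals show the adaptation current at work, the single-cell ingredient the mean-field formalism coarse-grains over.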
Physical Review Letters, 2007
We study the dynamics of a noisy network of spiking neurons with spike-frequency adaptation (SFA), using a mean-field approach, in terms of a two-dimensional Fokker-Planck equation for the membrane potential of the neurons and the calcium concentration gating SFA. The long time scales of SFA allow us to use an adiabatic approximation and to describe the network as an effective nonlinear two-dimensional system. The phase diagram is computed for varying levels of SFA and synaptic coupling. Two different population-bursting regimes emerge, depending on the level of SFA, in networks whose emission rate is noisy due to the finite number of neurons.
Global oscillations and synchronization emerging in large populations of coupled oscillators are widely studied for their relevance in fields ranging from physics to biology. In particular, synaptically coupled networks of neurons spontaneously show both regular and irregular bursts of activity which, besides playing a role in the developmental stages [1], are thought to complement rate-based coding in information transmission [3]. Among the several effects involved in promoting and sustaining bursting [4], a prominent role is played by spike-frequency adaptation (SFA), by which a neuron receiving a sustained stimulation gradually lowers its firing rate. Slow potassium currents are thought to play a major role in SFA, and a first step in modeling SFA involves an additional calcium-gated potassium current I_AHP, temporarily hyperpolarizing the cell upon spike emission, with a recovery time of the order of hundreds of milliseconds. Several theoretical approaches have investigated the collective behavior of populations of integrate-and-fire (IF) neuron models including SFA [6]. Bursting activity can emerge from the competition between the recurrent synaptic excitation and the self-inhibition induced by adaptation, as shown using simulations, phase-space analysis of phenomenological rate equations [7], and mean-field approaches in both fully connected networks [8] and sparsely connected noisy networks. The resulting bursting phenomenology is reminiscent of a relaxation oscillator: a stable high-rate fixed point is destabilized by SFA via a saddle-node bifurcation, bringing the network down to a low fixed point that takes over; with time, the level of SFA decreases, until the low fixed point is in turn destabilized, again via a saddle-node bifurcation, and the cycle restarts. This behavior seems to be coherent with experimental findings from cultured networks of nervous cells.
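The relaxation-oscillator picture described above can be reproduced with a two-variable rate model. The gain function and parameters below are our illustrative choices, arranged so that slow SFA destabilizes the high-rate state and its decay re-ignites the next burst:

```python
import numpy as np

def f(x):
    """Population gain function (an illustrative sigmoid, not the paper's)."""
    return 1.0 / (1.0 + np.exp(-8.0 * (x - 0.3)))

J, g, tau_c = 1.6, 1.2, 50.0      # recurrent excitation, adaptation strength, slow SFA
dt, steps = 0.1, 30000
r, c = 0.0, 0.0                   # population rate, adaptation (calcium-like) variable

trace = []
for _ in range(steps):
    r += dt * (-r + f(J * r - g * c))
    c += dt * (r - c) / tau_c     # slow SFA variable tracks the rate
    trace.append(r)

trace = np.array(trace)
bursts = int(np.sum((trace[1:] > 0.5) & (trace[:-1] <= 0.5)))
print("population bursts:", bursts)
```

High-rate episodes slowly charge c until the high fixed point disappears; the network drops to the low state, c decays over tau_c, and the cycle restarts, the same saddle-node loop sketched in the text.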