1994
One of the main experimental tools in probing the interactions between neurons has been the measurement of the correlations in their activity. In general, however, the interpretation of the observed correlations is difficult, since the correlation between a pair of neurons is influenced not only by the direct interaction between them but also by the dynamic state of the entire network to which they belong. Thus, a comparison between the observed correlations and the predictions from specific model networks is needed. In this paper we develop the theory of neuronal correlation functions in large networks comprising several highly connected subpopulations that obey stochastic dynamic rules. When the networks are in asynchronous states, the cross-correlations are relatively weak, i.e., their amplitude relative to that of the auto-correlations is of order 1/N, N being the size of the interacting populations. Using the weakness of the cross-correlations, general equations which...
1994
Most theoretical investigations of large recurrent networks focus on the properties of macroscopic order parameters such as population-averaged activities or average overlaps with memories. However, the statistics of the fluctuations in the local activities may be an important testing ground for comparison between models and observed cortical dynamics. We evaluated the neuronal correlation functions in a stochastic network comprising excitatory and inhibitory populations. We show that when the network is in a stationary state, the cross-correlations are relatively weak, i.e., their amplitude relative to that of the auto-correlations is of order 1/N, N being the size of the interacting population. This holds except in the neighborhoods of bifurcations to nonstationary states. As a bifurcation point is approached, the amplitude of the cross-correlations grows and becomes of order 1, and the decay time constant diverges. This behavior is analogous to the phenomenon of critical slowing down in systems at thermal equilibrium near a critical point. Near a Hopf bifurcation the cross-correlations exhibit damped oscillations.
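To see the claimed 1/N scaling concretely, here is a minimal sketch (my own toy model, not the paper's network): a linear stochastic rate network with uniform coupling J0/N and independent noise, in which the population-averaged cross-correlation coefficient shrinks roughly in proportion to 1/N while the network stays in the stable, asynchronous regime.

```python
# Toy model: N linear rate units, uniform coupling J0/N, independent noise.
# In the stable regime, the average pairwise correlation scales as O(1/N).
import numpy as np

def avg_cross_corr(N, J0=0.5, dt=0.01, T=200.0, seed=0):
    rng = np.random.default_rng(seed)
    steps = int(T / dt)
    x = np.zeros(N)
    xs = np.empty((steps, N))
    for t in range(steps):
        x += dt * (-x + J0 * x.mean()) + np.sqrt(dt) * rng.standard_normal(N)
        xs[t] = x
    c = np.corrcoef(xs[steps // 10:].T)        # discard the initial transient
    return c[~np.eye(N, dtype=bool)].mean()    # mean off-diagonal correlation

for N in (50, 100, 200, 400):
    print(N, round(avg_cross_corr(N), 4))      # roughly halves as N doubles
```

Pushing J0 toward 1 in this sketch moves the shared mode toward its bifurcation; the measured correlations then grow and decay more slowly, in line with the critical-slowing-down picture described above.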
arXiv (Cornell University), 2013
We quantify the finite-size effects in a stochastic network made up of rate neurons, for several kinds of recurrent connectivity matrices. This analysis is performed by means of a perturbative expansion of the neural equations, where the perturbative parameters are the intensities of the sources of randomness in the system. In detail, these parameters are the variances of the background or input noise, of the initial conditions, and of the distribution of the synaptic weights. The technique developed in this article can be used to study systems which are invariant under the exchange of the neural indices, and it allows us to quantify the correlation structure of the network in terms of pairwise and higher-order correlations between the neurons. We also determine the relation between the correlation and the external input of the network, showing that strong signals coming from the environment significantly reduce the amount of correlation between the neurons. Moreover, we prove that in general the phenomenon of propagation of chaos does not occur, even in the thermodynamic limit, due to the correlation structure of the three sources of randomness considered in the model. Furthermore, we show that the propagation of chaos depends not only on the number of neurons in the network, but also, and mainly, on the number of incoming connections per neuron. To conclude, we prove that for special values of the parameters of the system the neurons become perfectly correlated, a phenomenon that we have called stochastic synchronization. These findings clearly prevent the use of mean-field theory in the description of the neural network.
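The claim that strong external signals decorrelate the network can be illustrated with a hedged sketch (a generic sigmoidal rate network; the parameters below are assumptions, not the paper's): a large input drives the units toward saturation, where the effective gain, and with it the shared recurrent fluctuation, collapses.

```python
# Sigmoidal rate network with mean-field coupling: as the external input
# I_ext grows, tanh saturates, the recurrent gain drops, and the average
# pairwise correlation of the membrane variables shrinks.
import numpy as np

def mean_corr(I_ext, N=100, J0=0.9, dt=0.02, T=400.0, seed=1):
    rng = np.random.default_rng(seed)
    steps = int(T / dt)
    v = np.zeros(N)
    vs = np.empty((steps, N))
    for t in range(steps):
        rec = J0 * np.tanh(v).mean()              # recurrent mean-field drive
        v += dt * (-v + rec + I_ext) + 0.3 * np.sqrt(dt) * rng.standard_normal(N)
        vs[t] = v
    c = np.corrcoef(vs[steps // 10:].T)
    return c[~np.eye(N, dtype=bool)].mean()

for I in (0.0, 1.0, 3.0):
    print(I, round(mean_corr(I), 4))   # correlations shrink as the input grows
```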
arXiv (Cornell University), 2013
Using a perturbative expansion for weak synaptic weights and weak sources of randomness, we calculate the correlation structure of neural networks with generic connectivity matrices. In detail, the perturbative parameters are the mean and the standard deviation of the synaptic weights, together with the standard deviations of the background noise of the membrane potentials and of their initial conditions. We also show how to determine the correlation structure of the system when the synaptic connections have a random topology. This analysis is performed on rate neurons described by the Wilson-Cowan equations, since these allow us to find analytic results. Moreover, the perturbative expansion can be developed at any order and for a generic connectivity matrix. We finally show an example application of this technique to a particular, biologically relevant topology of the synaptic connections.
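A standard way to obtain such a correlation structure at lowest order, consistent with the perturbative picture above, is linear response around a fixed point (a minimal sketch; the connectivity matrix, noise level, and nonlinearity below are assumptions): the stationary covariance of the linearized Wilson-Cowan dynamics solves a Lyapunov equation.

```python
# Linearize dv = (-v + W*phi(v) + I) dt + sigma dB around a fixed point v*.
# The stationary covariance C of the fluctuations solves
#   A C + C A^T + sigma^2 I = 0,  with Jacobian A = -I + W diag(phi'(v*)).
import numpy as np
from scipy.linalg import solve_continuous_lyapunov
from scipy.optimize import fsolve

N, sigma = 4, 0.1
rng = np.random.default_rng(2)
W = 0.3 * rng.standard_normal((N, N))       # generic (assumed weak) connectivity
I = 0.5 * np.ones(N)
phi = np.tanh
dphi = lambda x: 1.0 / np.cosh(x) ** 2

v_star = fsolve(lambda v: -v + W @ phi(v) + I, np.zeros(N))  # fixed point
A = -np.eye(N) + W * dphi(v_star)[None, :]  # Jacobian at the fixed point
C = solve_continuous_lyapunov(A, -sigma**2 * np.eye(N))      # A C + C A^T = -Q
corr = C / np.sqrt(np.outer(np.diag(C), np.diag(C)))
print(np.round(corr, 3))                    # pairwise correlation matrix
```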
Computational Neuroscience, 1998
Stimulus-dependent changes have been observed in the correlations between the spike trains of simultaneously recorded pairs of neurons from the auditory cortex of marmosets, even when there was no change in the average firing rates. A simple neural model can reproduce most of the characteristics of these experimental observations, based on model neurons having leaky integration and fire-and-reset spikes and with Poisson-distributed, balanced input. The source of the synchrony in the model was common sensory input. The outputs of neurons in the model appear noisy (almost Poisson) owing to the stochastic nature of the input signal, but there is nevertheless a strong central peak in the correlation of the output spike trains. The experimental data and this simple model clearly demonstrate how even a noisy-looking spike train can convey basic information about a sensory stimulus in the relative spike timing between neurons.
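A rough sketch of this mechanism (illustrative parameters, and purely excitatory input in place of the paper's balanced input): two leaky integrate-and-fire neurons share part of their Poisson drive; their output trains look irregular, yet the cross-correlogram develops a central peak.

```python
# Two LIF neurons with partly shared Poisson input; the shared drive
# produces a central peak in the cross-correlogram of the output spikes.
import numpy as np

rng = np.random.default_rng(3)
dt, T, tau, vth = 0.1, 100_000.0, 20.0, 1.0      # ms
r_sh, r_pr, w = 0.4, 0.4, 0.05                   # shared/private rates (kHz), PSP
v = np.zeros(2)
spk = [[], []]
for t in range(int(T / dt)):
    shared = rng.poisson(r_sh * dt)              # common input spikes this bin
    for i in range(2):
        v[i] += -v[i] * dt / tau + w * (shared + rng.poisson(r_pr * dt))
        if v[i] >= vth:
            v[i] = 0.0                           # fire and reset
            spk[i].append(t * dt)

s0, s1 = np.asarray(spk[0]), np.asarray(spk[1])
lags = (s1[None, :] - s0[:, None]).ravel()       # all pairwise spike-time lags
hist, _ = np.histogram(lags, bins=np.arange(-50.0, 51.0, 2.0))
print(hist)                                      # expect a peak near zero lag
```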
Network models are routinely downscaled compared to nature in terms of numbers of nodes or edges because of a lack of computational resources, often without explicit mention of the limitations this entails. While reliable methods have long existed to adjust parameters such that the first-order statistics of network dynamics are conserved, here we show that limitations already arise if second-order statistics are also to be maintained. The temporal structure of pairwise averaged correlations in the activity of recurrent networks is determined by the effective population-level connectivity. We first show that in general the converse is also true and explicitly mention degenerate cases when this one-to-one relationship does not hold. The one-to-one correspondence between effective connectivity and the temporal structure of pairwise averaged correlations implies that network scalings should preserve the effective connectivity if pairwise averaged correlations are to be held constant. Changes in effective connectivity can even push a network from a linearly stable to an unstable, oscillatory regime and vice versa. On this basis, we derive conditions for the preservation of both mean population-averaged activities and pairwise averaged correlations under a change in numbers of neurons or synapses in the asynchronous regime typical of cortical networks. We find that mean activities and correlation structure can be maintained by an appropriate scaling of the synaptic weights, but only over a range of numbers of synapses that is limited by the variance of external inputs to the network. Our results therefore show that the reducibility of asynchronous networks is fundamentally limited.
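A common recipe consistent with the first part of this argument, stated as a hedged sketch (assuming fluctuation-driven input statistics; this is not the paper's full condition): when the in-degree is reduced from K to K', keeping the input variance fixed requires scaling the weights as J' = J*sqrt(K/K'), and keeping the input mean fixed requires a compensating DC drive.

```python
# Preserve the mean (K*J*nu + mu_ext) and the variance (~K*J**2*nu) of the
# summed synaptic input when the in-degree is reduced from K to K'.
import numpy as np

def downscale(J, K, K_prime, nu, mu_ext):
    """Return (J', mu_ext') for in-degree K -> K'; nu is the presynaptic rate."""
    J_prime = J * np.sqrt(K / K_prime)                       # fixes the variance
    mu_ext_prime = mu_ext + J * nu * (K - K_prime * np.sqrt(K / K_prime))
    return J_prime, mu_ext_prime                             # fixes the mean

print(downscale(J=0.1, K=10_000, K_prime=2_500, nu=5.0, mu_ext=0.0))
```

The paper's point is that even this recipe has limits: the compensation works only over a range of in-degrees bounded by the variance of the external input.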
Neurocomputing, 2006
Taking into account the variability of coupling strength over time, we present a nonlinear stochastic dynamical model of a neuronal population, in which the average number density is introduced as a distributed coding pattern of the population. In the absence of external stimulus, numerical simulations indicate that synchronized activity of the neuronal population increases the coupling strength among the neuronal oscillators, and that the coding pattern of the average number density is related to the coupling configuration among the oscillators. These studies also show that the coupling strength evolves as a slow learning process under weak noise but exhibits a transient process under strong noise. Numerical simulations confirm that stronger coupling yields stronger synchronization of the neuronal population, which in turn further strengthens the coupling.
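A toy stand-in for this feedback loop (my own minimal model, not the paper's number-density equations): noisy phase oscillators whose global coupling K slowly grows with the Kuramoto order parameter r, so synchrony and coupling strength reinforce each other.

```python
# Kuramoto oscillators with noise; the coupling K grows when the population
# is synchronous (large r), illustrating the synchrony/coupling feedback.
import numpy as np

rng = np.random.default_rng(4)
N, dt, steps = 200, 0.01, 20_000
eps, K_max, D = 0.5, 3.0, 0.05
omega = rng.normal(0.0, 0.3, N)                 # natural frequencies
theta = rng.uniform(0.0, 2 * np.pi, N)
K = 0.1                                         # weak initial coupling
for t in range(steps):
    z = np.exp(1j * theta).mean()               # Kuramoto order parameter
    r, psi = np.abs(z), np.angle(z)
    theta += dt * (omega + K * r * np.sin(psi - theta)) \
        + np.sqrt(2 * D * dt) * rng.standard_normal(N)
    K += dt * eps * r * (K_max - K)             # synchrony-gated coupling growth
print(f"final r = {abs(np.exp(1j * theta).mean()):.2f}, K = {K:.2f}")
```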
Chaos: An Interdisciplinary Journal of Nonlinear Science, 2006
We analyze the dynamics of networks of spiking neural oscillators. First, we present an exact linear stability theory of the synchronous state for networks of arbitrary connectivity. For general neuron rise functions, stability is determined by multiple operators, for which standard analysis is not suitable. We describe a general nonstandard solution to the multioperator problem. Subsequently, we derive a class of neuronal rise functions for which all stability operators become degenerate and standard eigenvalue analysis becomes a suitable tool. Interestingly, this class is found to consist of networks of leaky integrate-and-fire neurons. For random networks of inhibitory integrate-and-fire neurons, we then develop an analytical approach, based on the theory of random matrices, to precisely determine the eigenvalue distributions of the stability operators. This yields the asymptotic relaxation time for perturbations to the synchronous state, which provides the characteristic time scale...
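A simplified stand-in for the flavor of this analysis (hedged: the true stability operators are derived from the integrate-and-fire dynamics; here we only mimic their known row-stochastic structure): for a random inhibitory network the first-order stability matrix has rows summing to one, the trivial eigenvalue 1 corresponds to a uniform phase shift, and the second-largest eigenvalue modulus sets the asymptotic relaxation time.

```python
# Random k-in-regular matrix with equal positive entries summing to 1 per
# row, mimicking the stability matrix; lambda_1 = 1 is trivial, |lambda_2|
# sets the relaxation time of perturbations to the synchronous state.
import numpy as np

rng = np.random.default_rng(5)
N, k = 500, 20                                   # neurons, in-degree
A = np.zeros((N, N))
for i in range(N):
    pre = rng.choice(np.delete(np.arange(N), i), size=k, replace=False)
    A[i, pre] = 1.0 / k                          # equal weights, rows sum to 1
ev = np.sort(np.abs(np.linalg.eigvals(A)))[::-1]
lam2 = ev[1]                                     # second-largest modulus
tau = -1.0 / np.log(lam2)                        # relaxation time (in cycles)
print(f"|lambda_2| = {lam2:.3f}, relaxation time ~ {tau:.1f} firing cycles")
```

For such matrices the bulk of the spectrum has radius roughly 1/sqrt(k), so in this caricature the relaxation time is governed by the number of inputs per neuron rather than the network size.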
Entropy, 2015
We study the asymptotic law of a network of interacting neurons when the number of neurons becomes infinite. Given a completely connected network of neurons in which the synaptic weights are Gaussian correlated random variables, we describe the asymptotic law of the network when the number of neurons goes to infinity. All previous works assumed that the weights were i.i.d. random variables, thereby making the analysis much simpler. This hypothesis is not realistic from the biological viewpoint. In order to cope with this extra complexity we introduce the process-level empirical measure of the trajectories of the solutions to the equations of the finite network of neurons and the averaged law (with respect to the synaptic weights) of the trajectories of the solutions to the equations of the network of neurons. The main result of this article is that the image law through the empirical measure satisfies a large deviation principle with a good rate function which is shown to have a unique global minimum. Finally, our analysis of the rate function allows us also to describe this minimum as a stationary Gaussian measure which completely characterizes the activity of the infinite size network.
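In schematic form (the notation below is an assumption, not copied from the paper), the result reads:

```latex
% Schematic restatement of the stated large deviation principle.
\[
  \hat\mu_N = \frac{1}{N}\sum_{i=1}^{N}\delta_{V^i},
  \qquad
  \mathbb{P}\bigl(\hat\mu_N \in A\bigr)
    \asymp \exp\Bigl(-N \inf_{\mu\in A} H(\mu)\Bigr),
\]
% where \hat\mu_N is the empirical measure of the N trajectories V^i and H
% is a good rate function with a unique global minimum \mu^*, identified as
% a stationary Gaussian measure describing the infinite network.
```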
Acta Applicandae Mathematicae, 1985
Starting from basic physiological evidence, stochastic equations are formulated for the interaction between neurons. The generator potential is formed by deterministic linear spatio-temporal integration of action potentials, while the action potentials are considered as stochastic all-or-none events generated under the influence of the local instantaneous value of generator potential and generator current. Properties of the trajectories in state space are indicated. A neural partition function is defined and shown to be related to a statistical description of the neural activity pattern. The relevance of the mathematical formulation is indicated for the relation of neural correlation and synaptic connectivity. AMS (MOS) subject classifications (1980). 92A15, 94A99, 60G55.
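A minimal sketch of this two-stage description (the kernel, weights, and spike nonlinearity below are assumptions): the generator potential is a leaky linear integration of past action potentials, and spikes are all-or-none events drawn with a probability set by the instantaneous potential.

```python
# Generator potentials u integrate past spikes linearly and deterministically;
# spikes are stochastic all-or-none events with probability depending on u.
import numpy as np

rng = np.random.default_rng(6)
N, dt, steps, tau = 50, 1.0, 5000, 10.0              # ms
W = 0.5 * rng.standard_normal((N, N)) / np.sqrt(N)   # synaptic weights (assumed)
u = np.zeros(N)                                      # generator potentials
rates = []
for t in range(steps):
    p = 0.1 / (1.0 + np.exp(-(u - 2.0)))             # per-bin spike probability
    s = (rng.random(N) < p).astype(float)            # all-or-none spike events
    u += dt / tau * (-u + 2.2) + W @ s               # leaky linear integration
    rates.append(s.mean())
print("mean rate (spikes per ms):", np.mean(rates))
```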
Neural Computation, 2008
The function of cortical networks depends on the collective interplay between neurons and neuronal populations, which is reflected in the correlation of signals that can be recorded at different levels. To correctly interpret these observations it is important to understand the origin of neuronal correlations. Here we study how cells in large recurrent networks of excitatory and inhibitory neurons interact and how the associated correlations affect stationary states of idle network activity. We demonstrate that the structure of the connectivity matrix of such networks induces considerable correlations between synaptic currents as well as between subthreshold membrane potentials, provided Dale's principle is respected. If, in contrast, synaptic weights are randomly distributed, input correlations can vanish, even for densely connected networks. Although correlations are strongly attenuated when proceeding from membrane potentials to action potentials (spikes), the resulting weak correlations in the spike output can cause substantial fluctuations in the population activity, even in highly diluted networks. We show that simple mean-field models that take the structure of the coupling matrix into account can adequately describe the power spectra of the population activity. The consequences of Dale's principle on correlations and rate fluctuations are discussed in the light of recent experimental findings.
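The shared-input effect can be illustrated directly (a hedged sketch with assumed parameters): project uncorrelated source signals through a Dale-conforming matrix, in which each presynaptic neuron's outgoing weights share one sign, and through a matrix with the same magnitudes but random signs, then compare the resulting input correlations.

```python
# Dale-conforming vs random-sign weights: with Dale, shared presynaptic
# sources contribute products of like signs, so input covariances add up;
# with random signs they largely cancel.
import numpy as np

rng = np.random.default_rng(7)
N, p, T = 400, 0.1, 10_000
mask = rng.random((N, N)) < p                    # who connects to whom
col_sign = np.where(np.arange(N) < int(0.8 * N), 1.0, -1.0)  # 80% exc, 20% inh
W_dale = mask * col_sign[None, :]                # sign fixed per presynaptic cell
W_rand = mask * rng.choice([-1.0, 1.0], (N, N))  # same magnitudes, random signs

x = rng.standard_normal((T, N))                  # uncorrelated source activity
for name, W in (("Dale", W_dale), ("random signs", W_rand)):
    inputs = x @ W.T                             # summed synaptic input per cell
    c = np.corrcoef(inputs.T)
    print(name, c[~np.eye(N, dtype=bool)].mean())
```

Under Dale's principle every shared source contributes with a product of like signs, so the input covariances cannot cancel; with random signs they average out to nearly zero.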
2008
Synchronous oscillations are believed to be important for neuronal information processing. We use a stochastic model for parallel point processes to estimate the strength of synchrony in an oscillating network of neurons recorded in cat visual cortex. The model has the surprising ability to predict interactions between the neurons solely on the basis of the individual processes, i.e., the autocorrelograms. The strength of synchronization is defined as the mismatch between the predicted and the observed strength of interaction. This method has the advantage of distinguishing changes in the strength of synchrony from changes in the properties of the underlying processes. Thus, the model provides new approaches for the investigation of dynamical changes in the joint oscillatory activity of neuronal networks.
Physica A: Statistical Mechanics and its Applications, 2001
Recent results on the statistical physics of time series generation and prediction are presented. A neural network is trained on quasi-periodic and chaotic sequences, and the overlaps with the sequence generator as well as the prediction errors are calculated numerically. For each network there exists a sequence for which it completely fails to make predictions. Two interacting networks show a transition to perfect synchronization. A pool of interacting networks shows good coordination in the minority game, a model of competition in a closed market. Finally, as a demonstration, a perceptron predicts bit sequences produced by human beings.
1997
A model is proposed to describe the collective behavior of a biologically plausible neural network, composed of interconnected spiking neurons which separately receive external stationary stimulations. The spiking dynamics of each neuron is represented by an hourglass metaphor. This network model was first studied in a special case where the connections are only inhibitory (Cottrell, 1988, 1992). We study the network dynamics as a function of the parameters which quantify the strengths of both inhibitory and excitatory connections.
Physical Review E, 2010
Perfect spike-to-spike synchrony is studied in all-to-all coupled networks of identical excitatory, current-based, integrate-and-fire neurons with delta-impulse coupling currents and Poisson spiketrain external drive. This synchrony is induced by repeated cascading "total firing events," during which all neurons fire at once. In this regime, the network exhibits nearly periodic dynamics, switching between an effectively uncoupled state and a cascade-coupled total firing state. The probability of cascading total firing events occurring in the network is computed through a combinatorial analysis conditioned upon the random time when the first neuron fires and using the probability distribution of the subthreshold membrane potentials for the remaining neurons in the network. The probability distribution of the former is found from a first-passage-time problem described by a Fokker-Planck equation, which is solved analytically via an eigenfunction expansion. The latter is found using a central limit argument via a calculation of the cumulants of a single neuronal voltage. The influence of additional physiological effects that hinder or eliminate cascade-induced synchrony are also investigated. Conditions for the validity of the approximations made in the analytical derivations are discussed and verified via direct numerical simulations.
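A bare-bones sketch of this regime (parameters are illustrative, leak included, self-coupling ignored): all-to-all integrate-and-fire neurons with delta-impulse coupling and Poisson drive, where each threshold crossing instantaneously kicks the rest of the network and can trigger a cascading total firing event.

```python
# All-to-all IF neurons, delta-impulse coupling g, Poisson kicks f_ext.
# A cascade is resolved instantaneously; if everyone fires, it is a
# "total firing event".
import numpy as np

rng = np.random.default_rng(8)
N, g, f_ext, nu = 100, 0.03, 0.04, 1.0     # coupling, kick size, drive rate (1/ms)
dt, tau, steps = 0.1, 20.0, 20_000         # time step (ms), membrane constant
v = 0.5 * rng.random(N)
total_events = 0
for t in range(steps):
    v += dt / tau * (-v) + f_ext * rng.poisson(nu * dt, N)  # leak + Poisson kicks
    fired = v >= 1.0
    cascade = np.zeros(N, dtype=bool)
    while fired.any():                     # resolve the instantaneous cascade
        cascade |= fired
        v = np.where(fired, -np.inf, v)    # a neuron fires once per cascade
        v += g * fired.sum()               # delta-impulse kick from each spike
        fired = (v >= 1.0) & ~cascade
    v[cascade] = 0.0                       # reset everyone who fired
    if cascade.all():
        total_events += 1
print("total firing events:", total_events)
```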
Physical Review Letters, 1998
We consider stochastic neural networks in which synaptic intensities rapidly fluctuate around means given by a learning rule, competing with neuron activity. Each snapshot of synaptic intensities contains the neuron-neuron correlations in one of the stored patterns, chosen at random. The result is apparently noisy behavior which induces robustness, including improved associative and pattern-recognition processes. The main result here might apply to biological systems that exhibit fluctuating patterns of synapses.
Physical Review E, 2011
In the absence of synaptic coupling, two or more neural oscillators may become synchronized by virtue of the statistical correlations in their noisy input streams. Recent work has shown that the degree of correlation transfer from input currents to output spikes depends not only on intrinsic oscillator dynamics, but also depends on the length of the observation window over which the correlation is calculated. In this paper we use stochastic phase reduction and regular perturbations to derive the correlation of the total phase elapsed over long time scales, a quantity which provides a convenient proxy for the spike count correlation. Over short time scales, we derive the spike count correlation directly using straightforward probabilistic reasoning applied to the density of the phase difference. Our approximations show that output correlation scales with the autocorrelation of the phase resetting curve over long time scales. We also find a concise expression for the influence of the shape of the phase resetting curve on the initial slope of the output correlation over short time scales. These analytic results together with numerical simulations provide new intuitions for the recent counterintuitive finding that type I oscillators transfer correlations more faithfully than do type II over long time scales, while the reverse holds true for the better understood case of short time scales.
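A numerical sketch in the spirit of this setup (the sinusoidal, type-I-like phase resetting curve below is an assumption): two phase oscillators receive partially correlated white noise through their PRC, and the correlation of their spike counts is compared across counting windows of increasing length.

```python
# Two phase oscillators, noise enters through the PRC Z(theta); each
# completed cycle counts as a spike. Count correlations are then computed
# over windows of increasing length.
import numpy as np

rng = np.random.default_rng(9)
c, sigma, omega = 0.5, 0.3, 1.0                  # input correlation, noise, freq
dt, T = 0.002, 2000.0
Z = lambda th: 1.0 - np.cos(th)                  # assumed type-I-like PRC
th = np.zeros(2)
spk = [[], []]
for t in range(int(T / dt)):
    xc = rng.standard_normal()                   # common noise sample
    xp = rng.standard_normal(2)                  # private noise samples
    xi = np.sqrt(c) * xc + np.sqrt(1.0 - c) * xp
    th += omega * dt + sigma * Z(th) * np.sqrt(dt) * xi
    for i in range(2):
        if th[i] >= 2.0 * np.pi:                 # one "spike" per cycle
            th[i] -= 2.0 * np.pi
            spk[i].append(t * dt)

def count_corr(win):
    bins = np.arange(0.0, T + win, win)
    n0, _ = np.histogram(spk[0], bins)
    n1, _ = np.histogram(spk[1], bins)
    return np.corrcoef(n0, n1)[0, 1]

for win in (1.0, 10.0, 100.0):
    print(f"window {win:6.1f}: count correlation {count_corr(win):.3f}")
```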
The effect of correlations in neural networks is investigated by considering biased input and output patterns. Statistical mechanics is applied to study training times and internal potentials of the MINOVER and ADALINE learning algorithms. For the latter, a direct extension to generalization ability is obtained. Comparison with computer simulations shows good agreement with theoretical predictions. With biased patterns, we find...
Physical Review E, 2015
We study the synchronization of a stochastically-driven, current-based, integrate-and-fire neuronal model on a preferential-attachment network with scale-free characteristics and high clustering. The synchrony is induced by cascading total firing events where every neuron in the network fires at the same instant of time. We show that in the regime where the system remains in this highly synchronous state, the firing rate of the network is completely independent of the synaptic coupling, and depends solely on the external drive. On the other hand, the ability for the network to maintain synchrony depends on a balance between the fluctuations of the external input and the synaptic coupling strength. In order to accurately predict the probability of repeated cascading total firing events we go beyond mean-field and tree-like approximations and conduct a detailed second order calculation taking into account local clustering. Our explicit analytical results are shown to give excellent agreement with direct numerical simulations for the particular preferential-attachment network model investigated.
2016
We study the asymptotic law of a network of interacting neurons when the number of neurons becomes infinite. Given a completely connected network of neurons in which the synaptic weights are Gaussian correlated random variables, we describe the asymptotic law of the network when the number of neurons goes to infinity. We introduce the process-level empirical measure of the trajectories of the solutions to the equations of the finite network of neurons and the averaged law (with respect to the synaptic weights) of the trajectories of the solutions to the equations of the network of neurons. The main result of this article is that the image law through the empirical measure satisfies a large deviation principle with a good rate function which is shown to have a unique global minimum. Our analysis of the rate function allows us also to characterize the limit measure as the image of a stationary Gaussian measure defined on a transformed set of trajectories.