Neural Computation, 1998
Transmission across neocortical synapses depends on the frequency of presynaptic activity (Thomson & Deuchars, 1994). Interpyramidal synapses in layer V exhibit fast depression of synaptic transmission, while other types of synapses exhibit facilitation of transmission. To study the role of dynamic synapses in network computation, we propose a unified phenomenological model that allows computation of the postsynaptic current generated by both types of synapses when driven by an arbitrary pattern of action potential (AP) activity in a presynaptic population. Using this formalism, we analyze different regimes of synaptic transmission and demonstrate that dynamic synapses transmit different aspects of the presynaptic activity depending on the average presynaptic frequency. The model also allows for derivation of mean-field equations, which govern the activity of large, interconnected networks. We show that the dynamics of synaptic transmission results in complex sets of regular and irregular regimes of network activity.
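The phenomenological model of dynamic synapses described in this abstract can be sketched in event-driven form: a utilisation variable u captures facilitation and a resource variable x captures depression, with the released fraction u·x setting each postsynaptic current amplitude. The function name and all parameter values below are illustrative, not taken from the paper.

```python
import math

def tm_psc_amplitudes(spike_times, U=0.5, tau_rec=0.8, tau_facil=1.0, A=1.0):
    """Event-driven sketch of a dynamic synapse: depression via resources x,
    facilitation via utilisation u. Constants are illustrative."""
    u, x = 0.0, 1.0
    last_t = None
    amps = []
    for t in spike_times:
        if last_t is not None:
            dt = t - last_t
            u *= math.exp(-dt / tau_facil)                   # facilitation decays
            x = 1.0 - (1.0 - x) * math.exp(-dt / tau_rec)    # resources recover
        u = u + U * (1.0 - u)                                # spike boosts utilisation
        amps.append(A * u * x)                               # released fraction -> PSC
        x *= (1.0 - u)                                       # resources depleted
        last_t = t
    return amps

# depressing regime (negligible facilitation): amplitudes decay along a 50 Hz train
amps = tm_psc_amplitudes([0.02 * k for k in range(10)], U=0.5, tau_facil=0.01)
```

With a long facilitation time constant and small U, the same update instead yields amplitudes that grow over the first spikes of a train, reproducing the two synapse types the abstract contrasts.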
Physical Review E 90, 032709 (2014)
We investigate a mean-field model of interacting synapses on a directed neural network. Our interest lies in the slow adaptive dynamics of synapses, which are driven by the fast dynamics of the neurons they connect. Cooperation is modeled from the usual Hebbian perspective, while competition is modeled by an original polarity-driven rule. The emergence of a critical manifold culminating in a tricritical point is crucially dependent on the presence of synaptic competition. This leads to a universal 1/t power-law relaxation of the mean synaptic strength along the critical manifold and an equally universal 1/√t relaxation at the tricritical point, to be contrasted with the exponential relaxation that is otherwise generic. In turn, this leads to the natural emergence of long- and short-term memory from different parts of parameter space in a synaptic network, which is the most original and important result of our present investigations.
PLoS Computational Biology, 2007
Persistent activity states (attractors), observed in several neocortical areas after the removal of a sensory stimulus, are believed to be the neuronal basis of working memory. One of the possible mechanisms that can underlie persistent activity is recurrent excitation mediated by intracortical synaptic connections. A recent experimental study revealed that connections between pyramidal cells in prefrontal cortex exhibit various degrees of synaptic depression and facilitation. Here we analyze the effect of synaptic dynamics on the emergence and persistence of attractor states in interconnected neural networks. We show that different combinations of synaptic depression and facilitation result in qualitatively different network dynamics with respect to the emergence of the attractor states. This analysis raises the possibility that the framework of attractor neural networks can be extended to represent time-dependent stimuli. Citation: Barak O, Tsodyks M (2007) Persistent activity in neural networks with dynamic synapses. PLoS Comput Biol 3(2): e35.
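The combinations of depression and facilitation analyzed here are commonly summarized by mean-field steady-state expressions for the utilisation u and resources x of a synapse driven at a fixed presynaptic rate. The sketch below uses one standard convention for those expressions (u decaying back to U); parameter values are illustrative and not the paper's.

```python
def steady_state_amplitude(r, U=0.1, tau_d=0.2, tau_f=1.0):
    """Steady-state PSC amplitude (relative to maximum) of a dynamic synapse
    driven at rate r (Hz), under a common mean-field convention in which the
    utilisation u relaxes to U. All constants are illustrative."""
    u = U * (1.0 + tau_f * r) / (1.0 + U * tau_f * r)   # facilitated utilisation
    x = 1.0 / (1.0 + u * tau_d * r)                     # depleted resources
    return u * x
```

For small U and slow facilitation, the steady-state amplitude is non-monotonic in rate: it rises while facilitation dominates and falls once depression takes over, which is the kind of rate dependence that shapes the attractor landscape discussed above.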
Scientific Reports, 2014
The dynamics of neural networks is often characterized by collective behavior and quasi-synchronous events, where a large fraction of neurons fire in short time intervals, separated by uncorrelated firing activity. These global temporal signals are crucial for brain functioning. They strongly depend on the topology of the network and on the fluctuations of the connectivity. We propose a heterogeneous mean-field approach to neural dynamics on random networks that explicitly preserves the disorder in the topology at growing network sizes and leads to a set of self-consistent equations. Within this approach, we provide an effective description of microscopic and large-scale temporal signals in a leaky integrate-and-fire model with short-term plasticity, where quasi-synchronous events arise. Our equations provide a clear analytical picture of the dynamics, evidencing the contributions of both periodic (locked) and aperiodic (unlocked) neurons to the measurable average signal. In particular, we formulate and solve a global inverse problem of reconstructing the in-degree distribution from the knowledge of the average activity field. Our method is very general and applies to a large class of dynamical models on dense random networks.
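A minimal microscopic counterpart of the model class studied here is a fully coupled leaky integrate-and-fire network with depressing synapses, integrated with the Euler method. This is a sketch only: the coupling scheme, the shared synaptic field, and every constant below are illustrative assumptions, not the paper's setup.

```python
import random

def simulate_lif_stp(N=50, steps=5000, dt=0.001, seed=0):
    """Sketch of a fully coupled LIF network with depressing synapses.
    Returns a list of (time, neuron) spike events. Constants illustrative."""
    rng = random.Random(seed)
    tau_m, tau_d, tau_s, U, g = 0.02, 0.5, 0.005, 0.5, 0.3
    drive = [1.05 + 0.3 * rng.random() for _ in range(N)]  # suprathreshold bias
    v = [rng.random() for _ in range(N)]   # membrane potentials, threshold 1
    x = [1.0] * N                          # per-neuron presynaptic resources
    field = 0.0                            # shared synaptic field (mean coupling)
    spikes = []
    for step in range(steps):
        field *= (1.0 - dt / tau_s)        # fast synaptic decay
        for i in range(N):
            if v[i] >= 1.0:                # threshold crossing: spike and reset
                v[i] = 0.0
                field += g * U * x[i] / N  # release scaled by available resources
                x[i] *= (1.0 - U)          # short-term depression
                spikes.append((step * dt, i))
        for i in range(N):
            x[i] += dt * (1.0 - x[i]) / tau_d             # resource recovery
            v[i] += dt * (drive[i] - v[i] + field) / tau_m  # leaky integration
    return spikes

spikes = simulate_lif_stp()
```

Sorting the returned events by time and histogramming them exposes the quasi-synchronous population bursts against a background of uncorrelated firing, the raw material for the average activity field the inverse problem works with.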
Chapman & Hall/CRC Mathematical & Computational Biology, 2003
2009
The problem of the transformation of microscopic information to the macroscopic level is an intriguing challenge in computational neuroscience, but also of general mathematical importance. Here, a phenomenological mathematical model is introduced that simulates the internal information processing of brain compartments. Synaptic potentials are integrated over a small number of realistically coupled neurons to obtain macroscopic quantities. The striatal complex, an ...
Physical Biology, 2007
The tripartite synapse denotes the junction of a pre- and postsynaptic neuron modulated by a synaptic astrocyte. Enhanced transmission probability and frequency of the postsynaptic current events are among the significant effects of the astrocyte on the synapse, as experimentally characterized by several groups. In this paper we provide a mathematical framework for the relevant synaptic interactions between neurons and astrocytes that can account quantitatively for both the astrocytic effects on synaptic transmission and the spontaneous postsynaptic events. Inferred from experiments, the model assumes that glutamate released by the astrocytes in response to synaptic activity regulates store-operated calcium in the presynaptic terminal. This source of calcium is distinct from voltage-gated calcium influx and accounts for the long timescale of facilitation at the synapse seen in correlation with calcium activity in the astrocytes. Our model predicts the inter-event interval distribution of spontaneous current activity mediated by a synaptic astrocyte and provides additional insight into a novel mechanism for plasticity in which a low-fidelity synapse gets transformed into a high-fidelity synapse via astrocytic coupling.
Theoretical Approaches to Complex Systems, 1978
Journal of biological physics, 2000
Stochastic and reduced biophysical models of synaptic transmission are formulated and evaluated. The synaptic transmission involves presynaptic facilitation of neurotransmitter release, depletion and recovery of the presynaptic pool of readily releasable vesicles containing neurotransmitter molecules, and saturation of postsynaptic receptors of both fast non-NMDA and slow NMDA types. The models are shown to display the principal dynamical characteristics experimentally observed of synaptic transmission. The two main types of neural coding, i.e. rate and temporal coding, can be distinguished by means of different dynamical properties of synaptic transmission determined by the initial neurotransmitter release probability and the presynaptic firing rate. From the temporal evolution of the postsynaptic membrane potential response to a train of presynaptic action potentials at a sustained firing rate, in particular the steady-state amplitude and steady-state average level of postsynaptic membrane potentials are determined ...
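The stochastic side of such models — depletion and recovery of a readily releasable vesicle pool — can be sketched as a set of release sites, each holding at most one vesicle, with Bernoulli release at each spike and Bernoulli refill between spikes. The function and all probabilities below are illustrative assumptions, not the paper's fitted values.

```python
import random

def release_train(spike_count, n_sites=10, p_release=0.3, p_refill=0.2, seed=1):
    """Sketch of stochastic vesicle-pool dynamics: returns the number of
    vesicles released at each spike of a regular train. Constants illustrative."""
    rng = random.Random(seed)
    filled = [True] * n_sites          # readily releasable pool starts full
    released = []
    for _ in range(spike_count):
        k = 0
        for i in range(n_sites):
            if filled[i] and rng.random() < p_release:   # Bernoulli release
                filled[i] = False
                k += 1
        released.append(k)
        for i in range(n_sites):
            if not filled[i] and rng.random() < p_refill:  # refill between spikes
                filled[i] = True
    return released

releases = release_train(200)
```

Raising the initial release probability deepens the depression of the trial-averaged response, which is the knob the abstract identifies for separating rate-coding from temporal-coding regimes.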
Frontiers in Computational Neuroscience, 2013
In this paper we review our research on the effect and computational role of dynamical synapses in feed-forward and recurrent neural networks. Among other results, we report on the appearance of a new class of dynamical memories that result from the destabilisation of learned memory attractors. This has important consequences for dynamic information processing, allowing the system to sequentially access the information stored in the memories under changing stimuli. Although the storage capacity of stable memories also decreases, our study demonstrated the positive effect of synaptic facilitation in recovering maximum storage capacity and in enlarging the capacity of the system for memory recall under noisy conditions. Moreover, the dynamical phase described above can be associated with the voltage transitions between up and down states observed in cortical areas of the brain. We then studied the conditions under which the permanence times in the up state are power-law distributed, a signature of criticality, and concluded that the experimentally observed large variability of permanence times could be explained as the result of noisy dynamic synapses with large recovery times. Finally, we also report recent results concerning how short-term synaptic processes can transmit weak signals across more than one frequency range in noisy neural networks, by a kind of stochastic multi-resonance. This is a consequence of the competition between changes in the transmitted signals as neurons vary their firing thresholds and adaptive noise due to activity-dependent fluctuations in the synapses.