When an ambiguous stimulus is observed, our perception undergoes dynamical changes between two states, a situation extensively explored in association with the Necker cube. This phenomenon is known as bistable perception. Here, we present a model neural network composed of forced FitzHugh-Nagumo neurons, also implemented experimentally in an electronic circuit. We show that, under a particular coupling configuration, the neural network exhibits bistability between two configurations of clusters. Each cluster, composed of two neurons, undergoes independent chaotic spiking dynamics. As an appropriate external perturbation is applied to the system, the network switches between cluster configurations, involving different neurons each time. We hypothesize that the winning cluster of neurons, responsible for perception, is the one exhibiting the higher mean frequency. The cluster features may contribute to an increase of the local field potential in the neural network.
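As a minimal sketch of the building block used above, a single periodically forced FitzHugh-Nagumo neuron can be integrated with a simple Euler scheme (the parameter values and forcing below are illustrative assumptions, not those of the paper):

```python
import numpy as np

def fhn_step(v, w, t, dt=0.01, a=0.7, b=0.8, eps=0.08, A=0.5, omega=1.0):
    """One Euler step of a periodically forced FitzHugh-Nagumo neuron:
    dv/dt = v - v^3/3 - w + A*cos(omega*t),  dw/dt = eps*(v + a - b*w)."""
    dv = v - v**3 / 3.0 - w + A * np.cos(omega * t)
    dw = eps * (v + a - b * w)
    return v + dt * dv, w + dt * dw

def simulate(n_steps=20000, dt=0.01):
    v, w = -1.0, 1.0
    vs = np.empty(n_steps)
    for i in range(n_steps):
        v, w = fhn_step(v, w, i * dt, dt)
        vs[i] = v
    return vs

trace = simulate()  # membrane-potential time series of one forced unit
```

In the paper's network, several such units are coupled; this sketch only shows the single-unit dynamics under periodic forcing.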
Physica D, 104, 205 - 211, 1997
The discrete dynamics of a dissipative nonlinear model neuron with self-interaction is discussed. For units with a self-excitatory connection, hysteresis effects, i.e. bistability over certain parameter domains, are observed. Numerical simulations demonstrate that self-inhibitory units with non-zero decay rates exhibit complex dynamics, including period-doubling routes to chaos. These units may be used as basic elements for networks with higher-order information processing capabilities.
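A minimal sketch of such a discrete unit, assuming a standard sigmoidal transfer function and an update of the form a ← γa + wσ(a) + θ (parameter values are illustrative, not the paper's): with a self-excitatory connection, two different initial states settle onto two different fixed points, i.e. bistability.

```python
import math

def sigma(x):
    """Standard sigmoidal transfer function."""
    return 1.0 / (1.0 + math.exp(-x))

def iterate(w, theta, gamma, a0, n=500, keep=100):
    """Discrete neuron with self-connection w, bias theta, decay gamma:
    a <- gamma*a + w*sigma(a) + theta.  Returns the post-transient orbit."""
    a = a0
    orbit = []
    for i in range(n):
        a = gamma * a + w * sigma(a) + theta
        if i >= n - keep:
            orbit.append(a)
    return orbit

# Self-excitation (w > 0): two initial states end on different fixed
# points, i.e. hysteresis / bistability over this parameter domain.
lo = iterate(w=8.0, theta=-4.0, gamma=0.5, a0=-5.0)
hi = iterate(w=8.0, theta=-4.0, gamma=0.5, a0=+5.0)
```

With self-inhibition (w < 0) and suitable decay, the same map can instead undergo period doubling, as described in the abstract.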
Mathematical Modelling of Natural Phenomena
We demonstrate that unidirectional electrical coupling between two periodically spiking Hindmarsh-Rose neurons induces bistability in the system. We find that for certain values of intermediate coupling, the slave neuron exhibits coexistence of two attractors. One of them is a periodic orbit similar to the original attractor without coupling, and the other is a chaotic attractor or a periodic orbit with higher periodicity, depending on the coupling strength. For strong coupling, the slave neuron is monostable at a periodic orbit similar to the attractor of the master neuron. When the master and slave neurons are in a similar attractor they are completely synchronized, whereas when they are in different states they are in generalized synchronization. We also present experimental evidence of this behavior with electronic circuits based on the Hindmarsh-Rose model.
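A minimal sketch of the master-slave configuration, assuming standard Hindmarsh-Rose parameters and simple Euler integration (the coupling strength and external current below are illustrative): for sufficiently strong unidirectional coupling, the slave locks to the master.

```python
import numpy as np

def hr_deriv(x, y, z, I, drive=0.0):
    """Hindmarsh-Rose right-hand side with standard parameters."""
    dx = y - x**3 + 3.0 * x**2 - z + I + drive
    dy = 1.0 - 5.0 * x**2 - y
    dz = 0.006 * (4.0 * (x + 1.6) - z)
    return np.array([dx, dy, dz])

def simulate(k=2.0, I=2.0, dt=0.005, n=200000):
    """Master drives slave through the membrane variable x (unidirectional)."""
    m = np.array([-1.0, 0.0, 0.0])   # master initial state
    s = np.array([1.0, 0.5, 0.2])    # slave initial state (different)
    xm, xs = np.empty(n), np.empty(n)
    for i in range(n):
        dm = hr_deriv(*m, I)
        ds = hr_deriv(*s, I, drive=k * (m[0] - s[0]))
        m = m + dt * dm
        s = s + dt * ds
        xm[i], xs[i] = m[0], s[0]
    return xm, xs

xm, xs = simulate()
sync_err = np.mean(np.abs(xm[-50000:] - xs[-50000:]))  # late-time mismatch
```

The intermediate-coupling bistability reported in the abstract would be probed by scanning k and the slave's initial conditions; the sketch only illustrates the strong-coupling synchronized regime.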
Journal of Neurophysiology, 2007
When a stimulus supports two distinct interpretations, perception alternates in an irregular manner between them. What causes the bistable perceptual switches remains an open question. Most existing models assume that switches arise from a slow fatiguing process, such as adaptation or synaptic depression. We develop a new, attractor-based framework in which alternations are induced by noise and are absent without it. Our model goes beyond previous energy-based conceptualizations of perceptual bistability by constructing a neurally plausible attractor model that is implemented in both firing rate mean-field and spiking cell-based networks. The model accounts for known properties of bistable perceptual phenomena, most notably the increase in alternation rate with stimulation strength observed in binocular rivalry. Furthermore, it makes a novel prediction about the effect of changing stimulus strength on the activity levels of the dominant and suppressed neural populations, a predictio...
The study of balanced networks of excitatory and inhibitory neurons has led to several open questions. On the one hand, it is yet unclear whether the asynchronous state observed in the brain is autonomously generated, or whether it results from the interplay between external driving and internal dynamics. It is also not known which kinds of network variability lead to irregular spiking and which to synchronous firing states. Here we show how isolated networks of purely excitatory neurons generically show asynchronous firing whenever a minimal level of structural variability is present together with a refractory period. Our autonomous networks are composed of excitable units, in the form of leaky integrators spiking only in response to driving currents and remaining otherwise quiet. For a non-uniform network composed exclusively of excitatory neurons, we find a rich repertoire of self-induced dynamical states. We show in particular that asynchronous drifting states may be stabilized in purely excitatory networks whenever a refractory period is present. Other states found are either fully synchronized or mixed, containing both drifting and synchronized components. The individual neurons considered are excitable and hence do not possess intrinsic natural firing frequencies. An effective network-wide distribution of natural frequencies is, however, generated autonomously through self-consistent feedback loops. The asynchronous drifting state is, additionally, amenable to an analytic solution. We find two types of asynchronous activity, with the individual neurons spiking regularly in the pure drifting state, albeit with a continuous distribution of firing frequencies. The activity of the drifting component, however, becomes irregular in the mixed state, due to the periodic driving of the synchronized component.
We propose a new tool for the study of chaos in spiking neural networks, which consists of an analysis of the time series of pairs of consecutive interspike intervals. In this space, we show that a strange attractor with a fractal dimension of about 1.8 is formed in the mentioned mixed state.
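The proposed tool can be sketched as follows: detect spike times as upward threshold crossings of a trace, compute the interspike intervals, and form pairs of consecutive intervals (ISI_n, ISI_{n+1}). The trace and threshold below are illustrative assumptions:

```python
import numpy as np

def spike_times(v, t, threshold=0.0):
    """Times of upward threshold crossings of a trace v(t)."""
    above = v >= threshold
    idx = np.where(~above[:-1] & above[1:])[0] + 1
    return t[idx]

def isi_pairs(v, t, threshold=0.0):
    """Pairs of consecutive interspike intervals (ISI_n, ISI_{n+1})."""
    isi = np.diff(spike_times(v, t, threshold))
    return np.column_stack([isi[:-1], isi[1:]])

# Illustrative periodic trace: every consecutive-ISI pair falls on a
# single point of the return map; a chaotic spike train would instead
# trace out a structured set (the reported strange attractor).
t = np.linspace(0.0, 100.0, 100001)
v = np.sin(2 * np.pi * t)
pairs = isi_pairs(v, t)
```

Estimating the fractal dimension of the resulting point cloud (the paper reports about 1.8 in the mixed state) would require an additional box-counting or correlation-dimension step, not shown here.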
The activity dynamics of recurrent neural networks can exhibit deterministic chaos due to the nonlinear transfer functions. Chaotic attractors are wound around infinitely many unstable periodic orbits, each of which can be stabilized by a feedback control. In the present article we explore the flexibility of the chaotic dynamics of a recurrent neuromodule and construct a neural controller which is able to switch between several periodic patterns in two different ways: either deterministically by external inputs or spontaneously by dynamic noise. The chaotic attractor acts as an intermediate state between successively stabilized dynamic patterns. It has all the possible motions present, like an attentive state between different stimuli.
Proceedings of The National Academy of Sciences, 1990
We consider a randomly diluted higher-order network with noise, consisting of McCulloch-Pitts neurons that interact by Hebbian-type connections. For this model, exact dynamical equations are derived and solved for both parallel and random sequential updating algorithms. For parallel dynamics, we find a rich spectrum of different behaviors including static retrieving and oscillatory and chaotic phenomena in different parts of the parameter space. The bifurcation parameters include first- and second-order neuronal interaction coefficients and a rescaled noise level, which represents the combined effects of random synaptic dilution, interference between stored patterns, and additional background noise. We show that a marked difference in terms of the occurrence of oscillations or chaos exists between neural networks with parallel and random sequential dynamics.
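A minimal first-order sketch of such a network under parallel stochastic dynamics (Hebbian couplings, Glauber-type updates at noise level 1/β); the second-order interactions and random dilution of the actual model are omitted here, and all parameter values are illustrative:

```python
import numpy as np

rng = np.random.default_rng(2)

def hebbian_weights(patterns):
    """First-order Hebbian couplings J_ij = (1/N) sum_mu xi_i^mu xi_j^mu."""
    N = patterns.shape[1]
    J = patterns.T @ patterns / N
    np.fill_diagonal(J, 0.0)
    return J

def parallel_update(s, J, beta=10.0):
    """Parallel stochastic (Glauber-type) update at inverse temperature beta."""
    h = J @ s                                    # local fields
    p_up = 1.0 / (1.0 + np.exp(-2.0 * beta * h))
    return np.where(rng.random(len(s)) < p_up, 1.0, -1.0)

N, P = 200, 3
patterns = rng.choice([-1.0, 1.0], size=(P, N))
J = hebbian_weights(patterns)
s = patterns[0].copy()
s[:20] *= -1.0                                   # corrupt 10% of pattern 0
for _ in range(20):
    s = parallel_update(s, J)
overlap = float(s @ patterns[0]) / N             # retrieval quality
```

At low noise this first-order system converges to static retrieval; the oscillatory and chaotic regimes reported in the abstract arise only once second-order interactions and dilution are included.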
Chaotic dynamics of neural oscillations has been shown at the single-neuron and network levels, both in experimental data and in numerical simulations. Theoretical studies over the last twenty years have demonstrated an underlying role of chaos in neural systems. Nevertheless, whether chaotic neural oscillators make a significant contribution to relevant network behavior and whether the dynamical richness of neural networks is sensitive to the dynamics of isolated neurons still remain open questions. We investigated the transition dynamics of a medium-sized heterogeneous neural network of neurons connected by electrical coupling in a small-world topology. We make use of an oscillatory neuron model (HB+Ih) that exhibits either chaotic or non-chaotic behavior at different combinations of conductance parameters. Using an order parameter as a measure of synchrony, we find that the heterogeneity of firing rates and types of firing patterns makes a greater contribution than chaos to the steepne...
IEEE Transactions on Circuits and Systems I: Fundamental Theory and Applications, 1997
The results of neurobiological studies in both vertebrates and invertebrates lead to a general question: how is a population of neurons, whose individual activity is chaotic and uncorrelated, able to form functional circuits with regular and stable behavior? What circumstances support these regular oscillations? What mechanisms promote this transition? We address these questions using our experimental and modeling studies describing the behavior of groups of spiking-bursting neurons. We show that the role of inhibitory synaptic coupling between neurons is crucial in the self-control of chaos.
Physical Review E
We investigate the onset of collective oscillations in an excitatory pulse-coupled network of leaky integrate-and-fire neurons in the presence of quenched and annealed disorder. We find that the disorder induces a weak form of chaos that is analogous to that arising in the Kuramoto model for a finite number N of oscillators [O. V. Popovych, Phys. Rev. E 71, 065201(R) (2005)]. In fact, the maximum Lyapunov exponent turns out to scale to zero for N → ∞, with an exponent that is different for the two types of disorder. In the thermodynamic limit, the random-network dynamics reduces to that of a fully homogeneous system with a suitably scaled coupling strength. Moreover, we show that the Lyapunov spectrum of the periodic collective state scales to zero as 1/N^2, analogously to the scaling found for the "splay state."
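A minimal sketch of an excitatory pulse-coupled leaky integrate-and-fire network with quenched disorder in the input currents (Euler integration; all parameter values are illustrative assumptions, and Lyapunov analysis is not shown):

```python
import numpy as np

rng = np.random.default_rng(0)

def simulate(N=50, g=0.1, dt=0.001, n_steps=50000):
    """Excitatory pulse-coupled LIF network: dv_i/dt = I_i - v_i, spike at
    v = 1, reset to 0; every spike kicks all other neurons by g/N.
    The currents I_i carry quenched disorder (drawn once, then fixed).
    Neurons pushed over threshold by a kick fire on the next step."""
    I = 1.2 + 0.05 * rng.standard_normal(N)   # quenched disorder
    v = rng.random(N)                         # random initial phases
    total_spikes = 0
    for step in range(n_steps):
        v += dt * (I - v)
        fired = v >= 1.0
        if fired.any():
            total_spikes += int(fired.sum())
            v[fired] = 0.0
            v[~fired] += g * fired.sum() / N  # instantaneous pulse coupling
    return total_spikes

total_spikes = simulate()
```

The weak chaos reported in the abstract would be quantified by evolving tangent-space perturbations alongside this dynamics and measuring how the maximum Lyapunov exponent scales with N.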
The European Physical Journal Special Topics, 2013
We investigate cluster formation in populations of coupled chaotic model neurons under homogeneous global coupling, and distance-dependent coupling, where the coupling weights between neurons depend on their relative distance. Three types of clusters emerge for global coupling: a synchronized cluster, a two-state cluster, and an antiphase cluster. In addition to these, we find a novel three-state cluster for distance-dependent coupling, where the population splits into two synchronized groups and one incoherent group. Lastly, we study a system with random inhomogeneous coupling strengths in order to discern whether the special pattern found in distance-dependent coupling arises from the underlying lattice structure or from the inhomogeneity in coupling.
Physica D: Nonlinear Phenomena, 2014
Highlights: • Derived exact asymptotic dynamics for time-varying networks of theta neurons. • The network exhibits macroscopic chaos, quasiperiodicity, and multistability. • The network exhibits fractal basin boundaries and final-state uncertainty. • Escape and switching behaviors depend on both macroscopic and microscopic initial states. • Such macroscopic states can be redirected with an accessible global parameter. Keywords: theta neuron; time-varying network; Kuramoto system; macroscopic chaos; quasiperiodicity; final-state uncertainty.
PLoS Computational Biology, 2010
Attractor neural networks are thought to underlie working memory functions in the cerebral cortex. Several such models have been proposed that successfully reproduce firing properties of neurons recorded from monkeys performing working memory tasks. However, the regular temporal structure of spike trains in these models is often incompatible with experimental data. Here, we show that the in vivo observations of bistable activity with irregular firing at the single cell level can be achieved in a large-scale network model with a modular structure in terms of several connected hypercolumns. Despite high irregularity of individual spike trains, the model shows population oscillations in the beta and gamma band in ground and active states, respectively. Irregular firing typically emerges in a high-conductance regime of balanced excitation and inhibition. Population oscillations can produce such a regime, but in previous models only a non-coding ground state was oscillatory. Due to the modular structure of our network, the oscillatory and irregular firing was maintained also in the active state without fine-tuning. Our model provides a novel mechanistic view of how irregular firing emerges in cortical populations as they go from beta to gamma oscillations during memory retrieval.
The paper describes a new method allowing the representation of an image through the weights of a network of chaotic neural oscillators. The proposed algorithm uses the permutation entropy of the individual oscillators to form the values associated with the output image. Oscillator interactions produce generalized synchronization, leading to clustering and pattern formation. In a narrow range of inter-oscillator weights, spontaneous clustering decreases the noise; increased values lead to pattern formation and image distortion.
Proceedings of the National Academy of Sciences, 1989
Self-organization of frequencies is studied by using model neurons called VCONs (voltage-controlled oscillator neuron models). These models give direct access to frequency information, in contrast to all-or-none neuron models, and they generate voltage spikes that phase-lock to oscillatory stimulation, similar to phase-locking of action potentials to oscillatory voltage stimulation observed in Hodgkin-Huxley preparations of squid axons. The rotation vector method is described and used to study how networks synchronize, even in the presence of noise or when damaged; the entropy of ratios of phases is used to construct an energy function that characterizes organized behavior. Computer simulations show that rotation numbers (output frequency/input frequency) describe both chaotic and nonchaotic behavior. Learning occurs when synaptic connections strengthen in response to stimulation that is synchronous with cell activity. It is shown that intermittent chaotic firing is suppressed and s...
IJCA, 2014
Chaos provides many interesting properties that can be used to achieve computational tasks, such as sensitivity to initial conditions, space filling, control, and synchronization. Chaotic neural models have been devised to exploit such properties. In this paper, the dynamics of the nonlinear dynamic state (NDS) neuron, a chaotic spiking neuron model, is investigated experimentally in order to understand its dynamic behaviours. The experimental approach reveals quantitative and qualitative properties of the NDS model, such as the control mechanism, the reset mechanism, and the way the model may exhibit dynamic behaviours in phase space. It is shown experimentally that both the reset mechanism and the self-feedback control mechanism are essential for the NDS model to work and to stabilise to one of the large number of available unstable periodic orbits (UPOs) embedded in its attractor. The investigation suggests that the internal dynamics of the NDS neuron provide a rich set of dynamic behaviours that can be controlled and stabilised, and that this wide range of behaviours may be exploited to carry out information processing tasks.
Scientific Reports
Chaotic dynamics has been shown in the dynamics of neurons and neural networks, in experimental data and numerical simulations. Theoretical studies have proposed an underlying role of chaos in neural systems. Nevertheless, whether chaotic neural oscillators make a significant contribution to network behaviour and whether the dynamical richness of neural networks is sensitive to the dynamics of isolated neurons still remain open questions. We investigated synchronization transitions in heterogeneous neural networks of neurons connected by electrical coupling in a small-world topology. The nodes in our model are oscillatory neurons that, when isolated, can exhibit either chaotic or non-chaotic behaviour, depending on conductance parameters. We found that the heterogeneity of firing rates and firing patterns makes a greater contribution than chaos to the steepness of the synchronization transition curve. We also show that chaotic dynamics of the isolated neurons do not always make a visible difference in the transition to full synchrony. Moreover, macroscopic chaos is observed regardless of the dynamical nature of the individual neurons. However, performing a Functional Connectivity Dynamics analysis, we show that chaotic nodes can promote what is known as multi-stable behaviour, where the network dynamically switches between a number of different semi-synchronized, metastable states. Over the past decades, a number of observations of chaos have been reported in the analysis of time series from a variety of neural systems, ranging from the simplest to the most complex 1,2.
It is generally accepted that the inherent instability of chaos in nonlinear dynamical systems facilitates the extraordinary ability of neural systems to respond quickly to changes in their external inputs 3, to make transitions from one pattern of behaviour to another when the environment is altered 4, and to create a rich variety of patterns endowing neuronal circuits with remarkable computational capabilities 5. These features are all suggestive of an underlying role of chaos in neural systems (for reviews, see 5-7); however, these ideas may not have been thoroughly put to the test. Chaotic dynamics in neural networks can emerge in a variety of ways, including intrinsic mechanisms within individual neurons 8-12 or interactions between neurons 3,13-21. The first type of chaotic dynamics in neural systems is typically accompanied by microscopic chaotic dynamics at the level of individual oscillators. Such chaos has been observed in networks of Hindmarsh-Rose neurons 8 and biophysical conductance-based neurons 9-12. The second type of chaotic firing pattern is synchronous chaos, which has been demonstrated in networks of both biophysical and non-biophysical neurons 3,13,15,17,22-24, where neurons display synchronous chaotic firing-rate fluctuations. In the latter cases, the chaotic behaviour is a result of network connectivity, since isolated neurons do not display chaotic dynamics or burst firing. More recently, it has been shown that asynchronous chaos, where neurons exhibit asynchronous chaotic firing-rate fluctuations, emerges generically from balanced networks with multiple time scales in their synaptic dynamics 20.
Different modelling approaches have been used to uncover important conditions for observing these types of chaotic behaviour (in particular, synchronous and asynchronous chaos) in neural networks, such as the synaptic strength 25-27 , heterogeneity of the numbers of synapses and their synaptic strengths 28,29 , and lately the balance of excitation and inhibition 21. The results obtained by Sompolinsky et al. 25 showed that, when the synaptic strength is increased, neural networks display a highly heterogeneous chaotic state via a transition from an inactive state. Other studies demonstrated that chaotic behaviour emerges in the presence of weak and strong heterogeneities, for example a coupled heterogeneous population of neural oscillators with different synaptic strengths 28-30. Recently, Kadmon et al. 21 highlighted the importance of the balance between excitation and inhibition on a
2009
Abstract. We study the dynamics of a simple bistable system driven by multiplicative correlated noise. Such a system mimics the dynamics of classical attractor neural networks with an additional source of noise associated, for instance, with the stochasticity of synaptic transmission. We find that the multiplicative noise, which acts as a fluctuating barrier separating the stable solutions, strongly influences the behaviour of the system, giving rise to complex time series and scale-free distributions of the escape times of the system.
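A minimal sketch of such a system, assuming a double-well drift x - x³ with multiplicative (state-proportional) noise, integrated with the Euler-Maruyama scheme; the noise here is uncorrelated for simplicity, unlike the correlated noise of the paper, and the parameters are illustrative:

```python
import numpy as np

rng = np.random.default_rng(1)

def simulate(n=200000, dt=0.001, sigma=0.8, x0=1.0):
    """Euler-Maruyama for dx = (x - x^3) dt + sigma * x dW: a double-well
    drift whose effective barrier fluctuates because the noise amplitude
    is proportional to the state itself."""
    noise = np.sqrt(dt) * rng.standard_normal(n)
    x = x0
    xs = np.empty(n)
    for i in range(n):
        x += (x - x**3) * dt + sigma * x * noise[i]
        xs[i] = x
    return xs

xs = simulate()
```

Escape-time statistics would be gathered by recording first-passage times between neighbourhoods of the stable solutions over many such realizations.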
Physical Review E, 2000
We introduce a nonlinear dynamical system with self-exciting chaotic dynamics. Its interspike interval return map shows a noisy Poisson-like distribution. Spike sequences from different initial conditions are unrelated but possess the same mean frequency. In the presence of noisy perturbations, sequences started from different initial conditions synchronize. The features of the model are compared with experimental results for irregular spike sequences in neurons. Self-exciting chaos offers a mechanism for temporal coding of complex input signals.
Research Square (Research Square), 2024
This paper investigates the simulation of brain chaos dynamics using a combination of the Chua circuit and diode tunneling mechanisms, aiming to examine chaotic behavior in brain networks. Leveraging the inherent chaotic properties of the Chua circuit, the FitzHugh-Nagumo function, and the nonlinear characteristics of diode tunneling, our model offers a platform to mimic the intricate synaptic interactions observed in the brain. By subjecting the model to various stimuli and perturbations, we analyze the emergence and evolution of chaotic patterns, shedding light on the underlying mechanisms of cerebral chaos. Through numerical simulations and experimental validation, we demonstrate the effectiveness of our approach in replicating key features of brain chaos and highlight its potential implications for understanding neurological disorders and cognitive processes. This research contributes to the broader effort of leveraging computational models to explore the complex dynamics of the brain and their implications for neuroscience and microengineering.