1995, International Journal of Intelligent Systems
Traditionally, associative memory models are based on point attractor dynamics, where a memory state corresponds to a stationary point in state space. However, biological neural systems seem to display a rich and complex dynamics whose function is still largely unknown. We use a neural network model of the olfactory cortex to investigate the functional significance of such dynamics, in particular with regard to learning and associative memory. The model uses simple network units, corresponding to populations of neurons connected according to the structure of the olfactory cortex. All essential dynamical properties of this system are reproduced by the model, especially oscillations at two separate frequency bands and aperiodic behavior similar to chaos. By introducing neuromodulatory control of gain and connection weight strengths, the dynamics can change dramatically, in accordance with the effects of acetylcholine, a neuromodulator known to be involved in attention and learning in animals. With computer simulations we show that these effects can be used for improving associative memory performance by reducing recall time and increasing fidelity. The system is able to learn and recall continuously as the input changes, mimicking a real world situation of an artificial or biological system in a changing environment.
Journal of computational neuroscience, 1998
We discuss the first few stages of olfactory processing in the framework of a layered neural network. Its central component is an oscillatory associative memory, describing the external plexiform layer, that consists of inhibitory and excitatory neurons with dendrodendritic interactions. We explore the computational properties of this neural network and point out its possible functional role in the olfactory bulb. When receiving a complex input that is composed of several odors, the network segments it into its components. This is done in two stages. First, multiple odor input is preprocessed in the glomerular layer via a decorrelation mechanism that relies on temporal independence of odor sources. Second, as the recall process of a pattern consists of associative convergence to an oscillatory attractor, multiple inputs are identified by alternate dominance of memory patterns during different sniff cycles. This could explain how quick analysis of mixed odors is subserved by the rapi...
Advances in neural information processing systems 2, 1990
A generic model of oscillating cortex, which assumes "minimal" coupling justified by known anatomy, is shown to function as an associative memory, using previously developed theory. The network has explicit excitatory neurons with local inhibitory interneuron feedback that forms a set of nonlinear oscillators coupled only by long-range excitatory connections. Using a local Hebb-like learning rule for primary and higher-order synapses at the ends of the long-range connections, the system learns to store the kinds of oscillation amplitude patterns observed in olfactory and visual cortex. This rule is derived from a more general "projection algorithm" for recurrent analog networks that analytically guarantees content addressable memory storage of continuous periodic sequences (capacity: N/2 Fourier components for an N-node network), with no "spurious" attractors.
A new learning algorithm for the storage of static and periodic attractors in biologically inspired recurrent analog neural networks is introduced. For a network of n nodes, n static or n/2 periodic attractors may be stored. The algorithm allows programming of the network vector field independent of the patterns to be stored. Stability of patterns, basin geometry, and rates of convergence may be controlled. For orthonormal patterns, the learning operation reduces to a kind of periodic outer product rule that allows local, additive, commutative, incremental learning. Standing or traveling wave cycles may be stored to mimic the kind of oscillating spatial patterns that appear in the neural activity of the olfactory bulb and prepyriform cortex during inspiration and suffice, in the bulb, to predict the pattern recognition behavior of rabbits in classical conditioning experiments. These attractors arise, during simulated inspiration, through a multiple Hopf bifurcation, which can act as a critical "decision point" for their selection by a very small input pattern.
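For orthonormal static patterns, the outer product rule mentioned above reduces to the classical Hopfield prescription. The following is a minimal sketch of that static limiting case only, not the paper's periodic algorithm; the network size, pattern count, and update scheme are all illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(0)
n, p = 64, 4  # network size, number of stored patterns (illustrative)

# Random bipolar patterns; for p << n they are nearly orthogonal.
patterns = rng.choice([-1.0, 1.0], size=(p, n))

# Outer-product (Hebbian) rule: W = (1/n) sum_mu xi^mu (xi^mu)^T, zero diagonal.
W = patterns.T @ patterns / n
np.fill_diagonal(W, 0.0)

def recall(state, sweeps=10):
    """Sequential sign-threshold updates; the energy decreases, so this converges."""
    state = state.copy()
    for _ in range(sweeps):
        changed = False
        for i in range(n):
            s = 1.0 if W[i] @ state >= 0 else -1.0
            if s != state[i]:
                state[i] = s
                changed = True
        if not changed:
            break
    return state

# Probe with a corrupted copy of the first pattern (6 bits flipped).
probe = patterns[0].copy()
probe[rng.choice(n, size=6, replace=False)] *= -1
restored = recall(probe)  # typically recovers the stored pattern
```

The periodic version described in the abstract stores n/2 oscillatory (Fourier component) patterns rather than n static ones; this sketch only shows the static case the rule degenerates to.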
Physica A: Statistical Mechanics and its Applications, 1992
A number of neural network models, in which fixed-point and limit-cycle attractors of the underlying dynamics are used to store and associatively recall information, are described. In the first class of models, a hierarchical structure is used to store an exponentially large number of strongly correlated memories. The second class of models uses limit cycles to store and retrieve individual memories. A neurobiologically plausible network that generates low-amplitude periodic variations of activity, similar to the oscillations observed in electroencephalographic recordings, is also described. Results obtained from analytic and numerical studies of the properties of these networks are discussed.
Lecture Notes in Computer Science, 2010
Memory is often considered to be embedded in one of the attractors of a neural dynamical system, which provides an appropriate output depending on the initial state specified by an input. However, memory is recalled only in the presence of external inputs. Without such inputs, neural states do not provide the memorized outputs. Hence, each memory does not necessarily correspond to an attractor of the dynamical system without input, but rather to an attractor of the dynamical system with input. With this background, we propose that memory recall occurs when the neural activity changes to an appropriate output activity upon the application of an input. We introduce a neural network model that enables learning of such memories. After the learning process is complete, the neural dynamics is shaped so that it changes to the desired target with each input. This change is analyzed as a bifurcation in a dynamical system. Conditions on timescales for synaptic plasticity are obtained to achieve the maximal memory capacity.
We describe a modified attractor neural network in which neuronal dynamics takes place on a time scale of the absolute refractory period but the mean temporal firing rate of any neuron in the network is lower by an arbitrary factor that characterizes the strength of the effective inhibition. It operates by encoding information on the excitatory neurons only and assuming the inhibitory neurons to be faster and to inhibit the excitatory ones by an effective postsynaptic potential that is expressed in terms of the activity of the excitatory neurons themselves. Retrieval is identified as a nonergodic behavior of the network whose consecutive states have a significantly enhanced activity rate for the neurons that should be active in a stored pattern and a reduced activity rate for the neurons that are inactive in the memorized pattern. In contrast to the Hopfield model, the network operates away from fixed points and under the strong influence of noise. As a consequence, of the neurons that should be active in a pattern, only a small fraction is active in any given time cycle, and those are randomly distributed, leading to reduced temporal rates. We argue that this model brings neural network models much closer to biological reality. We present the results of detailed analysis of the model as well as simulations.
The ability of sensory networks to transiently store information on the scale of seconds can confer many advantages in processing time-varying stimuli. How a network could store information on such intermediate time scales, between typical neurophysiological time scales and those of long-term memory, is typically attributed to persistent neural activity. An alternative mechanism which might allow for such information storage is through temporary modifications to the neural connectivity which decay on the same second-long time scale as the underlying memories. Earlier work that has explored this method has done so by emphasizing one attractor from a limited, pre-defined set. Here, we describe an alternative, a Transient Attractor network, which can learn any pattern presented to it, store several simultaneously, and robustly recall them on demand using targeted probes in a manner reminiscent of Hopfield networks. We hypothesize that such functionality could be usefully embedded within sensory cortex, allowing for a flexibly gated short-term memory and conferring on the network the ability to perform automatic de-noising and separation of input signals into distinct perceptual objects. We demonstrate that the stored information can be refreshed to extend storage time, is not sensitive to noise in the system, and can be turned on or off by simple neuromodulation. The diverse capabilities of transient attractors, as well as their resemblance to many features observed in sensory cortex, suggest the possibility that their actions might underlie neural processing in many sensory areas.
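The temporary-connectivity idea can be caricatured with a Hopfield-style imprint whose weights decay exponentially. This is not the authors' actual model; the network size, time constant, threshold, and update rule are all invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 100
tau = 5.0    # decay time constant of the temporary synapses (invented)
theta = 0.2  # firing threshold; once the trace decays below it, recall fails

def imprint(W, pattern):
    """Fast Hebbian imprint of a 0/1 pattern (covariance-style rule)."""
    x = 2.0 * pattern - 1.0
    dW = np.outer(x, x) / n
    np.fill_diagonal(dW, 0.0)
    return W + dW

def decay(W, dt):
    """Temporary weights relax back toward zero on the memory timescale."""
    return W * np.exp(-dt / tau)

def recall(W, probe, steps=5):
    """Synchronous threshold updates on binary 0/1 units."""
    state = probe.copy()
    for _ in range(steps):
        state = (W @ state > theta).astype(float)
    return state

pattern = (rng.random(n) < 0.5).astype(float)
W = imprint(np.zeros((n, n)), pattern)

probe = pattern.copy()
flip = rng.choice(n, size=10, replace=False)
probe[flip] = 1.0 - probe[flip]   # corrupt 10% of the bits

fresh = recall(W, probe)          # trace is strong: the pattern is restored
W = decay(W, dt=20.0)             # about four time constants later...
faded = recall(W, probe)          # ...the fields fall below threshold: silence
```

Re-imprinting the pattern before the trace fades ("refreshing") would restore the weight amplitude and extend the storage time, matching the behavior described in the abstract.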
2017
The attractor neural network scenario is a popular scenario for memory storage in association cortex, but there is still a large gap between models based on this scenario and experimental data. We study a recurrent network model in which both learning rules and distribution of stored patterns are inferred from distributions of visual responses for novel and familiar images in inferior temporal cortex (ITC). Unlike classical attractor neural network models, our model exhibits graded activity in retrieval states, with distributions of firing rates that are close to lognormal. Inferred learning rules are close to maximizing the number of stored patterns within a family of unsupervised Hebbian learning rules, suggesting learning rules in ITC are optimized to store a large number of attractor states. Finally, we show that there exist two types of retrieval states: one in which firing rates are constant in time, and another in which firing rates fluctuate chaotically.
Brain Research, 2013
Nested oscillations, where the phase of the underlying slow rhythm modulates the power of faster oscillations, have recently attracted considerable research attention as the increased phase-coupling of cross-frequency oscillations has been shown to relate to memory processes. Here we investigate the hypothesis that reactivations of memory patterns, induced by either external stimuli or internal dynamics, are manifested as distributed cell assemblies oscillating at gamma-like frequencies with lifetimes on a theta scale. For this purpose, we study the spatiotemporal oscillatory dynamics of a previously developed meso-scale attractor network model as a correlate of its memory function. The focus is on a hierarchical nested organization of neural oscillations in delta/theta (2-5 Hz) and gamma frequency bands (25-35 Hz), and in some conditions even in lower alpha band (8-12 Hz), which emerge in the synthesized field potentials during attractor memory retrieval. We also examine spiking behavior of the network in close relation to oscillations. Despite highly irregular firing during memory retrieval and random connectivity within each cell assembly, we observe precise spatiotemporal firing patterns that repeat across memory activations at a rate higher than expected from random firing. In contrast to earlier studies aimed at modeling neural oscillations, our attractor memory network allows us to elaborate on the functional context of emerging rhythms and discuss their relevance. We provide support for the hypothesis that the dynamics of coherent delta/theta oscillations constitute an important aspect of the formation and replay of neuronal assemblies.
International Journal of Intelligent Systems, 1995
A mathematical model is given for describing activity dynamics, learning, and associative memory in the olfactory bulb. Numerical bifurcation analysis and the calculation of Lyapunov exponents suggest that chaotic behavior only occurs in the case of strong excitatory coupling in the mitral layer. A Hebbian-type learning rule, supplemented with a nonlinear decay term and a selective decreasing term, is defined and analyzed. Slow learning modifies the bulbar activity dynamics; hence it plays a crucial role in odor information processing. © 1995 John Wiley & Sons, Inc. Supported by the Japan Society for the Promotion of Science, the Hungarian Scientific Research Fund (Grant No. OTKA), and the Academy of Finland.
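The abstract does not spell out the learning rule, but a generic single-synapse sketch of a Hebbian term supplemented with a nonlinear (cubic) decay term and a selective decreasing term might look as follows; every constant here is invented for illustration.

```python
eta, gamma, dt = 1.0, 0.5, 0.01   # learning rate, nonlinear decay, Euler step (invented)

def step(w, pre, post):
    """One Euler step: Hebbian growth for correlated activity, a selective
    decrease when the presynaptic cell fires alone, and a cubic decay term."""
    hebb = eta * pre * post
    selective = 0.5 * pre * (1.0 - post)
    return w + dt * (hebb - selective * w - gamma * w ** 3)

w = 0.0
for _ in range(5000):              # correlated pre/post activity
    w = step(w, pre=1.0, post=1.0)
w_pot = w
print(round(w_pot, 2))             # saturates at (eta/gamma)**(1/3) -> 1.26

for _ in range(2000):              # presynaptic activity alone
    w = step(w, pre=1.0, post=0.0)
print(round(w, 2))                 # selective decrease erases the weight -> 0.0
```

The cubic decay bounds weight growth at a stable fixed point (eta/gamma)**(1/3) instead of letting it diverge, while the selective term implements the "decreasing" component for uncorrelated activity.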
Neural Computing and Applications, 2011
When a certain input-output mapping is memorized, the neural dynamics provide a prescribed neural activity output that depends on the external input. Without such an input, neural states do not provide memorized output. Only upon input, memory is recalled as an attractor, while neural activity without an input need not fall on such attractor but can fall on another attractor distinct from the evoked one. With this background, we propose that memory recall occurs as a bifurcation from the spontaneous attractor to the corresponding attractor matching the requested target output, as the strength of the external input is increased. We introduce a neural network model that enables the learning of such memories as bifurcations. After the learning process is complete, the neural dynamics are shaped to generate a prescribed target in the presence of each input. We find that the capacity of such memory depends on the timescales for the neural activity and synaptic plasticity. The maximal memory capacity is achieved at a certain relationship between the timescales, where the residence time at previous learned targets during the learning process is minimized.
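A one-dimensional caricature can illustrate recall as a bifurcation driven by input strength. This scalar system is unrelated to the authors' actual network; the equation and constants are invented purely to show an attractor disappearing as the input grows.

```python
def simulate(I, x0=-1.0, dt=0.01, T=50.0):
    """Integrate dx/dt = x - x**3 + I with forward Euler. For small I the
    system is bistable; past a saddle-node bifurcation at I = 2/(3*sqrt(3))
    only the high ("target") branch of attractors survives."""
    x = x0
    for _ in range(int(T / dt)):
        x += dt * (x - x ** 3 + I)
    return x

weak = simulate(I=0.1)    # stays near the spontaneous attractor (x < 0)
strong = simulate(I=0.6)  # past the bifurcation: converges to the target branch (x > 1)
```

Below the critical input the state remains on the spontaneous branch; above it, that attractor vanishes and the dynamics must flow to the target, mirroring the recall-as-bifurcation proposal.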
The olfactory system is constantly solving pattern recognition tasks by creating a large space to codify odour representations and optimising their distribution within it. A model of the olfactory bulb was developed by Z. Li and J. J. Hopfield [8] based on anatomy and electrophysiology. They used nonlinear simulations, observing that the collective behavior produces an oscillatory frequency. Here, the subthreshold Hopf bifurcation is assumed as a candidate for modelling the bulb, and the subthreshold subcritical Hopf bifurcation for modelling the olfactory cortex. Network topology analysis of the subcritical regime is presented as a proof of the importance of synaptic plasticity for memory functions in the olfactory cortex. This neuronal feature emerges due to long-range correlations of the network, and a critical coupling constant between neurons under collective dynamics is found.
2015
The electrophysiological data recorded in the glomerular stage of the insect olfactory pathway show both a coherent global oscillating behavior of the neurons of this stage (a carrier waveform?) and a reproducible complex activity pattern (a code?) of some of these neurons, in phase with the global oscillation. We propose a possible interpretation of this type of biological activity pattern, using a simple model of the glomerular stage of the insect olfactory pathway that has been previously designed. This model is analytically tractable, even when synaptic noise, random synaptic weights, inputs or delays are taken into account. The model exhibits the property of coding its inputs through spatio-temporal patterns which are the attractors of its dynamics. These attractors can be long cycles, robust against synaptic noise and input fluctuations, provided that the latter occur within well-defined limits. We give an example of a set of adapted synaptic weights and inputs leading ...
1993
We study an Attractor Neural Network that stores natural concepts, organized in semantic classes. The concepts are represented by distributed patterns over a space of attributes, and are related by both semantic and episodic associations. While semantic relations are expressed through a hierarchical coding over the attribute space, episodic links are realized via specific synaptic projections.
New Ideas in Psychology, 2008
Nonlinearity and dynamics in psychology are found in various domains such as neuroscience, cognitive science, human development, etc. However, the models that have been proposed are mostly of a computational nature and ignore dynamics. In those models that do include dynamic properties, only fixed points are used to store and retrieve information, leaving many principles of nonlinear dynamic systems (NDS) aside; for instance, chaos is often perceived as a nuisance. This paper considers a nonlinear dynamic artificial neural network (NDANN) that implements NDS principles while also complying with general neuroscience constraints. After a theoretical presentation, simulation results will show that the model can exhibit multi-valued, fixed-point, region-constrained attractors and aperiodic (including chaotic) behaviors. Because the capabilities of NDANN include the modeling of spatiotemporal chaotic activities, it may be an efficient tool to help bridge the gap between biological memory neural models and behavioral memory models.
EPL (Europhysics Letters), 2012
We propose a novel associative memory model wherein the neural activity without an input (i.e., spontaneous activity) is modified by an input to generate a target response that is memorized for recall upon the same input. Suitable design of synaptic connections enables the model to memorize input/output (I/O) mappings equaling 70% of the total number of neurons, where the evoked activity distinguishes a target pattern from others. Spontaneous neural activity without an input shows chaotic dynamics but keeps some similarity with evoked activities, as reported in recent experimental studies.
Physica D: Nonlinear Phenomena, 1990
A generic model of oscillating cortex, which assumes "minimal" coupling justified by known anatomy, is shown to function as an associative memory, using previously developed theory. The network has explicit excitatory neurons with local inhibitory interneuron feedback that forms a set of nonlinear oscillators coupled only by long-range excitatory connections. Using a local Hebb-like learning rule for primary and higher-order synapses at the ends of the long-range connections, the system learns to store the kinds of oscillation amplitude patterns observed in olfactory and visual cortex. In olfaction, these patterns "emerge" during respiration by a pattern forming phase transition which we characterize in the model as a multiple Hopf bifurcation. We argue that these bifurcations play an important role in the operation of real digital computers and neural networks, and we use bifurcation theory to derive learning rules which analytically guarantee CAM storage of continuous periodic sequences-capacity: N/2 Fourier components for an N-node network-no "spurious" attractors.
Cognitive Neurodynamics, 2013