2013, Neural Networks
This manuscript considers the learning problem of multi-layer neural networks (MNNs) with an activation function that comes from cellular neural networks. A systematic investigation of the partition of the parameter space is provided. Furthermore, the recursive formula for the transition matrix of an MNN is obtained. By applying well-developed tools from symbolic dynamical systems, the topological entropy of an MNN can be computed explicitly. A novel phenomenon, the asymmetry of a topological diagram that was seen in Ban [J. Differential Equations 246, pp. 552-580, 2009], is revealed.
Journal of Differential Equations, 2012
Let Y ⊆ {−1, 1}^{Z_{∞×n}} be the mosaic solution space of an n-layer cellular neural network. We decouple Y into n subspaces, say Y^(1), …, Y^(n), and give a necessary and sufficient condition for the existence of factor maps between them. In such a case, Y^(i) is a sofic shift for 1 ≤ i ≤ n. This investigation is equivalent to studying the existence of factor maps between two sofic shifts. Moreover, we investigate whether Y^(i) and Y^(j) are topologically conjugate, strongly shift equivalent, shift equivalent, or finitely equivalent via the well-developed theory of symbolic dynamical systems. This clarifies the structure of each layer in a multi-layer cellular neural network. As an extension, we can decouple Y into arbitrary k subspaces, where 2 ≤ k ≤ n, and demonstrate each subspace's structure.
Journal of Differential Equations, 2009
This study investigates the complexity of the global set of output patterns for one-dimensional multi-layer cellular neural networks with input. Applying labeling to the output space produces a sofic shift space. Two invariants, namely spatial entropy and the dynamical zeta function, can be exactly computed by studying the induced sofic shift space. This study gives the sofic shift a realization through a realistic model. Furthermore, a new phenomenon, the breaking of the symmetry of entropy, is discovered in multi-layer cellular neural networks with input.
Neural Computation, 1992
A piecewise linear equation is proposed as a method of analysis of mathematical models of neural networks. A symbolic representation of the dynamics in this equation is given as a directed graph on an N-dimensional hypercube. This provides a formal link with discrete neural networks such as the original Hopfield models. Analytic criteria are given to establish steady states and limit cycle oscillations independent of network dimension. Model networks that display multiple stable limit cycles and chaotic dynamics are discussed. The results show that such equations are a useful and efficient method of investigating the behavior of neural networks.
Proceedings of the 2002 7th IEEE International Workshop on Cellular Neural Networks and Their Applications
The stability and dynamics of a class of Cellular Neural Networks (CNNs) in the central linear part are investigated using a decoupling technique based on discrete spatial transforms, together with Nyquist and root-locus techniques.
Journal of Artificial Intelligence and Soft Computing Research, 2019
A topological property or index of a network is a numeric quantity that characterises the whole structure of the underlying network. It is used to predict certain changes in the biological, chemical, and physical activities of the network. The 4-layered probabilistic neural networks are more general than the 3-layered probabilistic neural networks. Javaid and Cao [Neural Comput. and Applic., DOI 10.1007/s00521-017-2972-1] and Liu et al. [Journal of Artificial Intelligence and Soft Computing Research, 8(2018), 225-266] studied certain degree- and distance-based topological indices (TIs) of the 3-layered probabilistic neural networks. In this paper, we extend this study to the 4-layered probabilistic neural networks and compute certain degree-based TIs. In the end, a comparison between all the computed indices is included, and it is also proved that the TIs of the 4-layered probabilistic neural networks are better, being strictly greater than those of the 3-layered probabilistic neural net...
Neurocomputing, 2006
We study the influence of the topology of a neural network on its learning dynamics. The network topology can be controlled by one parameter p_rw to convert the topology from regular to random in a continuous way [D.J. Watts and S.H. Strogatz, Collective dynamics of small-world networks, Nature 393 (1998) 440-442]. As a test problem, which requires a recurrent network, we choose the problem of timing to be learned by the network, that is, to connect a predefined input neuron with an output neuron in exactly T_f time steps. We analyze the learning dynamics for different parameters numerically by counting the number of paths within the network which are available for solving the problem. Our results show that there are parameter values for which either a regular, small-world, or random network gives the best performance, depending strongly on the choice of the predefined input and output neurons.
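The setup described in this abstract can be sketched in a few lines of Python: build a ring lattice whose edges are rewired with probability p_rw (Watts-Strogatz style), then count the walks of exact length T_f from the input neuron to the output neuron via powers of the adjacency matrix. All function names and parameters here are illustrative assumptions, not taken from the paper:

```python
import random

def watts_strogatz(n, k, p_rw, seed=0):
    """Ring lattice of n nodes, each linked to its k nearest neighbours per side;
    each edge is rewired to a random target with probability p_rw."""
    rng = random.Random(seed)
    adj = [[0] * n for _ in range(n)]
    for i in range(n):
        for j in range(1, k + 1):
            t = (i + j) % n
            if rng.random() < p_rw:          # rewire this edge
                t = rng.randrange(n)
                while t == i or adj[i][t]:   # avoid self-loops and duplicates
                    t = rng.randrange(n)
            adj[i][t] = adj[t][i] = 1
    return adj

def count_walks(adj, src, dst, T):
    """Number of walks of exact length T from src to dst, i.e. (A^T)[src][dst],
    computed by repeated vector-matrix multiplication."""
    n = len(adj)
    vec = [1 if j == src else 0 for j in range(n)]
    for _ in range(T):
        vec = [sum(vec[i] * adj[i][j] for i in range(n)) for j in range(n)]
    return vec[dst]
```

Note that this counts walks (paths that may revisit nodes), which is the natural quantity delivered by adjacency-matrix powers; whether the paper restricts to simple paths is not stated in the abstract.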
ISCAS 2001. The 2001 IEEE International Symposium on Circuits and Systems (Cat. No.01CH37196), 2001
The occurrence of complex dynamic behavior (i.e., bifurcation processes, strange and chaotic attractors) in autonomous space-invariant cellular neural networks (CNNs) is investigated. Firstly, some sufficient conditions for the instability of CNNs are provided; then some classes of unstable templates are identified. Finally, it is shown that unstable CNNs often exhibit complex dynamics, and for a case study the most significant bifurcation processes are described. It is worth noting that most CNN implementations exploit space-invariant templates, and so far no example of complex dynamics has been shown in autonomous space-invariant CNNs.
In this article we relate complex dynamical processes, such as learning, to the structural modifications they induce in the framework of random recurrent neural networks (RRNNs). In these networks, learning progressively modifies the dynamics, reducing the original chaos to a fixed-point attractor specific to the learned modality. Applying a simple Hebbian learning rule, we show that this reduction of the dynamics is accompanied by modifications of the local loops formed by synaptic links. Moreover, Hebbian learning appears to bind active neurons strongly to one another while preserving short connection paths between neurons. As a consequence of these local-loop modifications, we thus observe a reorganization of the strongest synapses into a "small-world" (SW) network. These results shed new light on the structural bases underlying information processing (i.e., pattern recognition) in RRNNs. Furthermore, they open perspectives on the relations between Hebbian learning and small-world architecture.
Advancing Artificial Intelligence Through Biological Process Applications, Chapt. 18, pp. 331-357, 2008
Browne, A. (ed.) Perspectives in Neural Computing. Bristol: Institute of Physics Press. pp. 51-71. , 1997
The functioning of mind and brain is sometimes characterized in terms of symbol manipulation and sometimes by some sort of neural network. Nevertheless, no-one would propose that the brain consists of separate symbolic and neural processors. The brain is clearly fundamentally neural, yet its style of computation often appears to be symbolic. How can we produce symbolic interpretations of neural networks' behaviour, and when might such a level of description become appropriate? In this chapter I use an algorithm developed by Crutchfield and Young to produce symbolic descriptions of the behaviour of a simple neural-network model of the cerebral cortex. I explore some of the conditions under which the computational power of the symbolic interpretation of the network's behaviour is maximized, relating this computational power to the computational demands of cognitive processes. Finally, I consider the implications that the study of this toy problem has for symbolic approaches to artificial intelligence.
2003
A mathematical model of the architecture and learning processes of multilayer artificial neural networks is discussed in the paper. Dynamical systems theory is used to describe the learning process of networks consisting of linear, weakly nonlinear, and nonlinear neurons. A theorem on the conjugacy between a gradient dynamical system with a constant time step and the cascade generated by its Euler method is applied as well.
Computational Problems of Electrical Engineering
The study of the influence of the learning speed (η) on the learning process of a multilayer neural network is carried out. The program for a multilayer neural network was written in Python. The learning speed is considered as a constant value, and its optimal value, at which the best learning is achieved, is determined. To analyze the impact of the learning speed, a logistic function, which describes the learning process, is used. It is shown that the learning-error function is characterized by bifurcation processes that lead to a chaotic state at η > 0.8. The optimal value of the learning speed is determined: it marks the onset of the doubling of the number of local minima, and is η = 0.62 for a three-layer neural network with 4 neurons in each layer. Increasing the number of hidden layers (3–30) and the number of neurons in each layer (4–150) does not lead to a radical change in the diagram of the logistic function (x_n, η), and hence, in the optimal value of t...
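The period-doubling route to chaos invoked in this abstract is the classical behaviour of the logistic map. A minimal Python sketch, assuming the common scaling x_{n+1} = 4η·x_n·(1 − x_n) so that η ∈ (0, 1] covers the full logistic parameter range (the paper's exact parameterization may differ), shows how the attractor grows from a fixed point to a cycle as η increases:

```python
def logistic_orbit(eta, x0=0.5, n_transient=500, n_keep=100):
    """Iterate x_{n+1} = 4*eta*x_n*(1 - x_n), discard the transient,
    and return the distinct attractor values (rounded for comparison)."""
    x = x0
    for _ in range(n_transient):
        x = 4 * eta * x * (1 - x)
    orbit = []
    for _ in range(n_keep):
        x = 4 * eta * x * (1 - x)
        orbit.append(round(x, 6))
    return sorted(set(orbit))

# Below the first period-doubling the orbit settles on a fixed point;
# past it, the attractor splits into a 2-cycle, then cascades toward chaos.
print(len(logistic_orbit(0.60)))   # single fixed point
print(len(logistic_orbit(0.85)))   # period-2 cycle
```

Sweeping η over a fine grid and plotting the returned orbit against η reproduces the familiar bifurcation diagram the abstract refers to.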
Proceedings of the 2002 International Joint Conference on Neural Networks. IJCNN'02 (Cat. No.02CH37290), 2002
Cellular neural networks (CNNs) are analog dynamic processors that have found several applications for the solution of complex computational problems. The mathematical model of a CNN consists of a large set of coupled nonlinear differential equations that have been mainly studied through numerical simulations; knowledge of the dynamic behavior is essential for developing rigorous design methods and for establishing new applications. CNNs can be divided into two classes: stable CNNs, with the property that each trajectory (with the exception of a set of measure zero) converges towards an equilibrium point; and unstable CNNs, with either a periodic or a non-periodic (possibly complex) behavior. The manuscript is devoted to the comparison of the dynamic behavior of two CNN models: the original Chua-Yang model and the Full-Range model, which was exploited for VLSI implementations.
This paper examines the fundamental property of chaotic systems, namely topological transitivity, and considers it as a tool for the construction of neural models of chaotic attractors and for the chaotic analysis of neural networks. The property of topological transitivity can be utilized for the construction of complex neural systems based on neural models of chaotic attractors. In addition, it can serve as a useful tool in the process of detecting chaotic features of neural networks during the recall phase.
Computers, Materials & Continua, 2022
The Cellular Neural Network (CNN) has various parallel-processing applications: image processing, non-linear processing, geometric maps, and high-speed computations. It is an analog paradigm consisting of an array of cells that are interconnected locally. Cells can be arranged in different configurations. Each cell has an input, a state, and an output. The cellular neural network allows cells to communicate with the neighboring cells only. It can be represented graphically: cells are represented by vertices and their interconnections by edges. In chemical graph theory, topological descriptors are used to study graph structure and its biological activities. A descriptor is a single value that characterizes the whole graph. In this article, the vertex-edge topological descriptors have been calculated for the cellular neural network. The results can be used for a cellular neural network of any size. This will enhance the applications of the cellular neural network in image processing, solving partial differential equations, analyzing 3D surfaces, sensory-motor organs, and modeling biological vision.
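To make the notion of a degree-based topological descriptor concrete, the sketch below computes two standard examples for an m × n grid of CNN cells, assuming (as an illustration, not from the article) that each cell is linked to its four nearest neighbours: the first Zagreb index (sum of squared vertex degrees) and the Randić index (an edge-based descriptor):

```python
import math

def grid_indices(m, n):
    """First Zagreb index (vertex-based) and Randic index (edge-based) for the
    m x n grid graph in which each cell talks to its 4 nearest neighbours."""
    def deg(i, j):
        # Count neighbours that fall inside the grid.
        return sum(0 <= i + di < m and 0 <= j + dj < n
                   for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1)))
    zagreb = sum(deg(i, j) ** 2 for i in range(m) for j in range(n))
    randic = 0.0
    for i in range(m):
        for j in range(n):
            for di, dj in ((1, 0), (0, 1)):   # each edge visited exactly once
                if i + di < m and j + dj < n:
                    randic += 1 / math.sqrt(deg(i, j) * deg(i + di, j + dj))
    return zagreb, randic
```

Because the degree of a cell depends only on whether it lies in a corner, on a border, or in the interior, such sums collapse to closed formulas in m and n, which is why results of this kind hold for a cellular neural network of any size.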
2021
Graph theory is a discrete branch of mathematics for designing and predicting a network. Topological invariants are mathematical tools for the analysis of the connection properties of a particular network. The Cellular Neural Network (CNN) is a computing paradigm in the fields of machine learning and computer science. In this article we give closed expressions for dominating invariants computed via the dominating degree for a cellular neural network. Moreover, we also present a 3D comparison between the dominating invariants and classical degree-based indices to show that, in some cases, the dominating invariants give a better correlation on the cellular neural network than the classical indices.
2001
It is shown that first-order autonomous space-invariant cellular neural networks (CNNs) may exhibit a complex dynamic behavior (i.e., equilibrium point and limit cycle bifurcation, strange and chaotic attractors). The most significant limit cycle bifurcation processes, leading to chaos, are investigated through the computation of the corresponding Floquet multipliers and Lyapunov exponents. It is worth noting that most practical CNN implementations exploit first-order cells and space-invariant templates: so far no example of complex dynamics has been shown in first-order autonomous space-invariant CNNs.
Complex Systems, 1991
Abstract. A transform is introduced that maps cellular automata and discrete neural networks to dynamical systems on the unit interval. This transform is a topological conjugacy except at countably many points. In many cases, it gives rise to continuous full conjugates, in which ...
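The simplest transform of this flavour sends a binary state sequence to a point of the unit interval via its binary expansion, under which the shift on sequences corresponds to the doubling map x → 2x mod 1. This is a generic textbook construction, not necessarily the paper's specific transform:

```python
def seq_to_unit(bits):
    """Map a binary sequence (b1, b2, ...) to x = sum_k b_k / 2^k in [0, 1]."""
    return sum(b / 2 ** (k + 1) for k, b in enumerate(bits))

# The left shift on the sequence matches the doubling map on the interval:
bits = (1, 0, 1, 1, 0, 0, 0, 0)
x = seq_to_unit(bits)                 # 0.6875
assert abs((2 * x) % 1 - seq_to_unit(bits[1:])) < 1e-12
```

The conjugacy fails exactly at the dyadic rationals, whose binary expansions are non-unique; this is the countable exceptional set the abstract alludes to.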
IEEE Transactions on Circuits and Systems I: Fundamental Theory and Applications, 2002
It is shown that first-order autonomous space-invariant cellular neural networks (CNNs) may exhibit a complex dynamic behavior (i.e., equilibrium point and limit cycle bifurcation, strange and chaotic attractors). The most significant limit cycle bifurcation processes, leading to chaos, are investigated through the computation of the corresponding Floquet's multipliers and Lyapunov exponents. It is worth noting that most practical CNN implementations exploit first-order cells and space-invariant templates: so far no example of complex dynamics has been shown in first-order autonomous space-invariant CNNs.
2010
The objective of this study is to investigate the correlation between the internal topological organization of a neural network and its learning ability. This study is motivated by the interesting neurophysiological evidence showing the significance of the topographic maps of adult mammals' brains to their learning ability and plasticity. In this study we propose a model of a layered neural network with a Self-Organizing Map in its hidden layer, which is connected to a Perceptron as the learning part. We run several simulations to show the significance of topological order in helping the learning process and the relearning process.