Acoustics, Speech, and Signal Processing, 1988. ICASSP-88., 1988 International Conference on
A new feedforward two-layer linear-in-the-parameters neural network, termed the Functionally Expanded Neural Network (FENN), has been developed specifically for temporal signal processing. This hybrid architecture integrates multiple nonlinear basis functions to enhance the modeling of chaotic time series, outperforming traditional recurrent models in predictive accuracy and computational efficiency. The study highlights the advantages of iterative pruning and retraining strategies, showcasing the application of FENN to various complex nonlinear processes, including chaotic systems and real-world data.
This paper presents an experimental comparison between selected neural architectures for the chaotic time series prediction problem. Several feed-forward architectures (multilayer perceptrons) are compared with partially recurrent nets (Elman, extended Elman, and Jordan) in terms of convergence rate, prediction accuracy, training time requirements, and stability of results.
International Journal of Bifurcation and Chaos, 1992
This paper shows that the dynamics of nonlinear systems that produce complex time series can be captured in a model system. The model system is an artificial neural network, trained with backpropagation in a multi-step prediction framework. Results from the Mackey-Glass equation (delay D=30) are presented to corroborate our claim. Our final intent is to study the applicability of the method to the electroencephalogram, but first several important questions must be answered to guarantee appropriate modeling.
Neural Processing …, 2011
The accuracy of a model to forecast a time series diminishes as the prediction horizon increases, in particular when the prediction is carried out recursively. Such decay is faster when the model is built using data generated by highly dynamic or chaotic systems. This paper presents a topology and training scheme for a novel artificial neural network, named the "Hybrid-connected Complex Neural Network" (HCNN), which is able to capture the dynamics embedded in chaotic time series and to predict long horizons of such series. HCNN is composed of small recurrent neural networks, inserted in a structure made of feed-forward and recurrent connections and trained in several stages using the back-propagation through time (BPTT) algorithm. In experiments using a Mackey-Glass time series and an electrocardiogram (ECG) as training signals, HCNN was able to output stable chaotic signals, oscillating for periods as long as four times the size of the training signals. The largest local Lyapunov exponent (LE) of the predicted signals was positive (evidence of chaos), and similar to the LE calculated over the training signals. The magnitudes of peaks in the ECG signal were not accurately predicted, but the predicted signal was similar to the ECG in the rest of its structure.
This chapter discusses the use of neural networks for signal processing. In particular, it focuses on one of the most interesting and innovative areas: the chaotic time series processing. This includes time series analysis, identification of chaotic behavior, forecasting, and dynamic reconstruction. An overview of chaotic signal processing both by conventional and neural network methods is given.
Neurocomputing, 2003
This work analyses the problems related to reconstructing a dynamical system that exhibits chaotic behaviour from a time series associated with a single observable of the system, using a feedforward neural network model. The starting network architecture is obtained by setting the number of input neurons according to Takens' theorem, and is then improved by slightly increasing the number of inputs. The number of hidden neurons is chosen based on the results obtained by testing different net structures. The effectiveness of the method is demonstrated by applying it to the Brusselator system (Phys. Lett. 91 (1982) 263).
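As a concrete illustration, not drawn from the paper itself, the delay-coordinate construction that Takens' theorem licenses can be sketched as follows; the function name `delay_embed` is an assumption for this example:

```python
import numpy as np

def delay_embed(series, dim, lag):
    """Build delay vectors [x(t), x(t+lag), ..., x(t+(dim-1)*lag)]
    from a scalar observable, per Takens' theorem.
    Returns an array of shape (N, dim), one delay vector per row."""
    series = np.asarray(series)
    n = len(series) - (dim - 1) * lag  # number of complete vectors
    return np.column_stack(
        [series[i * lag : i * lag + n] for i in range(dim)]
    )

# Example: embed a short scalar series with dimension 3 and lag 2.
emb = delay_embed(np.arange(10), dim=3, lag=2)
```

Each row of `emb` would then serve as one input pattern to the feedforward network, with the next sample of the series as the prediction target.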
Chaos: An Interdisciplinary Journal of Nonlinear Science, 2012
Many research works deal with chaotic neural networks for various fields of application. Unfortunately, up to now these networks have usually been claimed to be chaotic without any mathematical proof. The purpose of this paper is to establish, within a rigorous theoretical framework, an equivalence between chaotic iterations in the sense of Devaney and a particular class of neural networks. On the one hand we show how to build such a network; on the other hand we provide a method to check whether a given neural network is chaotic. Finally, the ability of classical feedforward multilayer perceptrons to learn sets of data obtained from a dynamical system is investigated. Various Boolean functions are iterated on finite states, and iterations of some of them are proven to be chaotic in the sense of Devaney. In that context, important differences appear in the training process, establishing across various neural networks that chaotic behaviors are far more difficult to learn.
Journal of Computational Methods in Sciences and Engineering, 2016
In this paper, we use four types of artificial neural network (ANN) to predict the behavior of chaotic time series. Each neural network acts as a global model for predicting the future behavior of the series. The prediction process is based on the embedding theorem and the time delay it determines. The ANNs are applied to a time series generated by the Mackey-Glass equation, which behaves chaotically. Finally, all the neural networks are used to solve this problem and their results are compared and analyzed.
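For reference, the Mackey-Glass series used as a benchmark here and in several of the abstracts above can be generated with a simple delay-differential integration; the sketch below uses a crude Euler scheme and the commonly used chaotic parameter setting, not the papers' own solvers:

```python
import numpy as np

def mackey_glass(n_steps, tau=17, beta=0.2, gamma=0.1, n=10,
                 dt=1.0, x0=1.2):
    """Integrate the Mackey-Glass delay equation
        dx/dt = beta*x(t-tau)/(1 + x(t-tau)**n) - gamma*x(t)
    with a simple Euler scheme and a constant initial history x0.
    A rough sketch for generating benchmark data, not a
    high-accuracy integrator."""
    delay = int(tau / dt)
    x = np.full(delay + n_steps, x0)  # constant history, then trajectory
    for t in range(delay, delay + n_steps - 1):
        xd = x[t - delay]  # delayed state x(t - tau)
        x[t + 1] = x[t] + dt * (beta * xd / (1.0 + xd**n) - gamma * x[t])
    return x[delay:]

series = mackey_glass(1000)
```

Setting `tau=30` gives the harder "D=30" variant mentioned in one of the abstracts.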
2008
This paper examines how efficient neural networks are, relative to linear and polynomial approximations, at forecasting a time series generated by the chaotic Mackey-Glass differential delay equation. The forecasting horizon is one step ahead. A series of regressions with polynomial approximators and a simple two-neuron neural network is carried out, and the multiple correlation coefficients are compared. The neural network, though very simple, is superior to the polynomial expansions and delivers virtually perfect forecasting. Finally, the neural network is much more precise than the other methods across a wide set of realizations.
WSEAS Transactions on Computer Research, 2008
The prediction of chaotic time series with neural networks is a traditional practical problem of dynamic systems. This paper does not propose a new model or methodology; rather, it carefully and thoroughly studies several aspects of a model for which insufficient experimental data have been published, and derives conclusions of general interest. Recurrent neural network (RNN) models are important not only for forecasting time series but also, more generally, for the control of dynamical systems. An RNN with a sufficiently large number of neurons is a nonlinear autoregressive moving average (NARMA) model, with "moving average" referring to the inputs, so prediction can be assimilated to identification of a dynamic process. This paper analyzes an architectural approach to RNNs with embedded memory, the Nonlinear AutoRegressive model with eXogenous inputs (NARX), which shows promising qualities for dynamic system applications. The performance of the NARX model is verified on several types of chaotic or fractal time series applied as network inputs, in relation to the number of neurons, the training algorithms, and the dimension of its embedded memory. In addition, this work attempts to use classic statistical methodologies (R/S rescaled-range analysis and the Hurst exponent) to obtain new methods for improving the efficiency of chaotic time series prediction with NARX.
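The R/S rescaled-range analysis mentioned above can be sketched as follows. This is a generic textbook-style estimator under common conventions, not the paper's own procedure; the helper name `hurst_rs` is an assumption:

```python
import numpy as np

def hurst_rs(series, min_chunk=8):
    """Estimate the Hurst exponent by rescaled-range (R/S) analysis:
    for a range of window sizes n, average R/S over non-overlapping
    windows and fit log(R/S) ~ H * log(n).
    A rough sketch, not a production-grade estimator."""
    x = np.asarray(series, dtype=float)
    sizes, rs_vals = [], []
    n = min_chunk
    while n <= len(x) // 2:
        rs = []
        for start in range(0, len(x) - n + 1, n):
            w = x[start:start + n]
            dev = np.cumsum(w - w.mean())   # cumulative deviations
            r = dev.max() - dev.min()       # range R
            s = w.std()                     # scale S
            if s > 0:
                rs.append(r / s)
        if rs:
            sizes.append(n)
            rs_vals.append(np.mean(rs))
        n *= 2
    # Slope of the log-log fit is the Hurst exponent estimate.
    h, _ = np.polyfit(np.log(sizes), np.log(rs_vals), 1)
    return h
```

For uncorrelated noise the estimate should fall near 0.5, while persistent (trending) series push it toward 1.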
Arxiv preprint chao-dyn/ …, 1994
This paper is the second in a series of two, and describes the current state of the art in modelling and prediction of chaotic time series.
AJIT-e Online Academic Journal of Information Technology, 2019
Artificial neural networks are widely accepted as a very successful tool for global function approximation, and for this reason they are considered a good approach to forecasting chaotic time series in many studies. For a given time series, the Lyapunov exponent is a good parameter for characterizing the series as chaotic or not. In this study, we use three different neural network architectures to test the capabilities of neural networks in forecasting time series generated from different dynamical systems. In addition to forecasting the time series, the Lyapunov exponents of the studied systems are forecasted using a feedforward neural network with a single hidden layer.
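To make the role of the Lyapunov exponent concrete, here is a minimal estimate for a system whose derivative is known in closed form, the logistic map; this is a standard illustration, not the study's own method:

```python
import numpy as np

def logistic_lyapunov(r=4.0, x0=0.2, n=10000, burn=100):
    """Estimate the largest Lyapunov exponent of the logistic map
    x -> r*x*(1-x) by averaging log|f'(x)| = log|r*(1-2x)| along an
    orbit. A positive result characterizes the orbit as chaotic."""
    x = x0
    for _ in range(burn):          # discard the transient
        x = r * x * (1 - x)
    acc = 0.0
    for _ in range(n):
        acc += np.log(abs(r * (1 - 2 * x)))  # local stretching rate
        x = r * x * (1 - x)
    return acc / n
```

At r = 4 the exact value is ln 2 ≈ 0.693; for a measured time series, where no closed-form derivative exists, the exponent must instead be estimated from the data (e.g. from divergence of nearby delay vectors), which is the harder setting these papers address.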
1998 IEEE International Joint Conference on Neural Networks Proceedings. IEEE World Congress on Computational Intelligence (Cat. No.98CH36227)
An algorithm is introduced that trains a neural network to identify chaotic dynamics from a single measured time series. The algorithm has four special features: 1. The state of the system is extracted from the time series using delays, followed by weighted Principal Component Analysis (PCA) data reduction. 2. The prediction model consists of both a linear model and a Multi-Layer Perceptron (MLP). 3. The effective prediction horizon during training is user-adjustable, due to 'error propagation': prediction errors are partially propagated to the next time step. 4. A criterion is monitored during training to select the model whose chaotic attractor is most similar to the real system's attractor. The algorithm is applied to laser data from the Santa Fe time-series competition (set A). The resulting model is not only useful for short-term predictions but also generates time series with chaotic characteristics similar to those of the measured data.
Proceedings of the 9th WSEAS International …, 2008
The problem of chaotic time series prediction is now studied in various disciplines, including engineering, medical, and econometric applications. Chaotic time series are the output of a deterministic system with a positive Lyapunov exponent. Time series prediction is a suitable application for a neural network predictor. The NN approach to time series prediction is non-parametric, in the sense that no information about the process that generates the signal is required. It is shown that a recurrent NN (RNN) with a sufficiently large number of neurons is a realization of the nonlinear ARMA (NARMA) process. In this paper we present the nonlinear autoregressive network with exogenous inputs (NARX): the architecture, the training method, the input data to the network, and the simulation results.
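The NARX input structure described in this and the preceding abstracts amounts to feeding a nonlinear map with tapped delays of both the output and the exogenous input. A minimal sketch of that regressor construction, with the helper name `narx_regressors` assumed for this example:

```python
import numpy as np

def narx_regressors(u, y, nu, ny):
    """Assemble the tapped-delay regressor matrix of a NARX model:
    the target y(t) is predicted from
        [y(t-1), ..., y(t-ny), u(t-1), ..., u(t-nu)].
    Any nonlinear map (e.g. an MLP) can then be fit on (X, target)."""
    u, y = np.asarray(u), np.asarray(y)
    start = max(nu, ny)  # first index with a full delay history
    rows = []
    for t in range(start, len(y)):
        rows.append(np.concatenate([
            y[t - ny:t][::-1],   # most recent outputs first
            u[t - nu:t][::-1],   # most recent inputs first
        ]))
    return np.array(rows), y[start:]

# Example with toy data: ny=2 output taps, nu=1 input tap.
X, target = narx_regressors(np.arange(10, 20), np.arange(10), nu=1, ny=2)
```

Closing the loop, feeding predicted outputs back in place of measured ones, turns this one-step predictor into the recursive multi-step forecaster whose accuracy decay the abstracts discuss.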
Lecture Notes in Computer Science, 2003
Nonlinear modeling with neural networks offers a promising approach for studying the prediction of a chaotic time series. In this paper, we propose a neural net based on the Extended Kalman Filter to examine the nonlinear dynamic properties of some financial time series in order to differentiate between low-dimensional chaos and stochastic behavior. Kalman filtering, because it can deal with varying unobservable states, provides an efficient framework to model these non-stationary exposures. A controlled simulation experiment is used to introduce the issues involved and to present the proposed approach. Measures of forecast accuracy are developed. The pertinence of this model is discussed using the Tunisian Stock Exchange database.
2001
This paper introduces the concept of the dynamic embedding manifold (DEM), which allows the Kohonen self-organizing map (SOM) to learn dynamic, nonlinear input-output mappings. The combination of the DEM concept with the SOM results in a new modelling technique that we call Vector-Quantized Temporal Associative Memory (VQTAM). We use VQTAM to propose an unsupervised neural algorithm called the Self-Organizing NARX (SONARX) network. The SONARX network is evaluated on the problem of modeling and predicting three chaotic time series and compared with MLP, RBF, and autoregressive (AR) models. It is shown that SONARX exhibits performance similar to the MLP and RBF models, while producing much better results than the AR model. The influence of the number of neurons, the memory order, the number of training epochs, and the size of the training set on the final prediction error is also evaluated.
Expert Systems with Applications, 2011
In this paper, two computational intelligence (CI) techniques, namely the single multiplicative neuron (SMN) model and the adaptive neuro-fuzzy inference system (ANFIS), are proposed for time series prediction. A variation of particle swarm optimization (PSO) with cooperative sub-swarms, called COPSO, has been used for estimating the SMN model parameters, leading to COPSO-SMN. The prediction effectiveness of COPSO-SMN and ANFIS is illustrated using commonly used nonlinear, non-stationary, and chaotic benchmark datasets of Mackey-Glass, Box-Jenkins, and biomedical electroencephalogram (EEG) signals. The training and test performances of both hybrid CI techniques are compared on these datasets.
Applied Soft Computing, 2005
This paper investigates the prediction of a Lorenz chaotic attractor with relatively large Lyapunov exponents; the characteristic of this time series is its rich chaotic behavior. For such dynamic reconstruction problems, regularized radial basis function (RBF) neural network (NN) models have been widely employed in the literature. The author instead recommends a two-layer multi-layer perceptron (MLP) NN-based recurrent model. While none of the available linear models were able to learn the dynamics of this attractor, it is shown that the proposed NN-based autoregressive (AR) and autoregressive moving average (ARMA) models with regularization not only learned the true trajectory of the attractor but also performed much better in multi-step-ahead prediction. Equivalent linear models fail to learn the dynamics of the time series, despite low values of Akaike's final prediction error (FPE) estimate. The author proposes employing the recurrent NN-based ARMA model with regularization, which clearly outperforms all other models; it is thus possible to obtain good results for prediction and reconstruction of the dynamics of chaotic time series with NN-based models.
Chaos and Complex Systems, 2012
In this paper a traditional Multi Layer Perceptron with a tapped delay line as input is trained to identify the parameters of the Chua's circuit when fed with a sequence of values of a scalar state variable. The analysis of the a priori identifiability of the system, performed resorting to differential algebra, allows one to choose a suitable observable and the minimum number of taps. The results confirm the appropriateness of the proposed approach.