1994, arXiv preprint chao-dyn/ …
This paper is the second in a series of two, and describes the current state of the art in modelling and prediction of chaotic time series.
This paper presents an experimental comparison of selected neural architectures for the chaotic time series prediction problem. Several feed-forward architectures (multilayer perceptrons) are compared with partially recurrent nets (Elman, extended Elman, and Jordan) in terms of convergence rate, prediction accuracy, training time requirements, and stability of results.
2009 IEEE 12th International Conference on Computer Vision, 2009
We use concepts from chaos theory in order to model nonlinear dynamical systems that exhibit deterministic behavior. Observed time series from such a system can be embedded into a higher dimensional phase space without the knowledge of an exact model of the underlying dynamics. Such an embedding warps the observed data to a strange attractor, in the phase space, which provides precise information about the dynamics involved. We extract this information from the strange attractor and utilize it to predict future observations. Given an initial condition, the predictions in the phase space are computed through kernel regression. This approach has the advantage of modeling dynamics without making any assumptions about the exact form (linear, polynomial, radial basis, etc.) of the mapping function. The predicted points are then warped back to the observed time series. We demonstrate the utility of these predictions for human action synthesis, and dynamic texture synthesis. Our main contributions are: multivariate phase space reconstruction for human actions and dynamic textures, a deterministic approach to model dynamics in contrast to the popular noise-driven approaches for dynamic textures, and video synthesis from kernel regression in the phase space. Experimental results provide qualitative and quantitative analysis of our approach on standard data sets.
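The embedding-plus-kernel-regression pipeline described in this abstract can be sketched for the simplest univariate case. This is a minimal illustration, not the authors' implementation: the Gaussian kernel, the bandwidth, and the embedding parameters below are assumptions.

```python
import numpy as np

def embed(x, dim, tau):
    """Delay-coordinate embedding of a scalar series into dim-dimensional phase space."""
    n = len(x) - (dim - 1) * tau
    return np.array([x[i:i + (dim - 1) * tau + 1:tau] for i in range(n)])

def kernel_predict(x, dim=3, tau=1, bandwidth=0.05):
    """One-step prediction by Nadaraya-Watson kernel regression in the embedded space."""
    X = embed(x, dim, tau)
    inputs, targets = X[:-1], x[(dim - 1) * tau + 1:]   # each delay vector -> its successor
    query = X[-1]                                       # current point on the attractor
    d2 = np.sum((inputs - query) ** 2, axis=1)
    w = np.exp(-d2 / (2.0 * bandwidth ** 2))            # Gaussian kernel weights
    return float(np.sum(w * targets) / (np.sum(w) + 1e-12))

# Example: predict the last sample of a sine wave from the rest of the series.
t = np.linspace(0, 20 * np.pi, 2000)
x = np.sin(t)
pred = kernel_predict(x[:-1])
```

On smooth periodic data the prediction is nearly exact; for chaotic data the bandwidth would have to be tuned to the scale of the attractor.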
International Journal of Bifurcation and Chaos, 1992
This paper shows that the dynamics of nonlinear systems that produce complex time series can be captured in a model system. The model system is an artificial neural network, trained with backpropagation, in a multi-step prediction framework. Results from the Mackey-Glass system (D=30) will be presented to corroborate our claim. Our final intent is to study the applicability of the method to the electroencephalogram, but first several important questions must be answered to guarantee appropriate modeling.
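The Mackey-Glass benchmark used here is generated from the delay differential equation dx/dt = beta*x(t-tau)/(1 + x(t-tau)^10) - gamma*x(t). Below is a crude forward-Euler sketch; the standard parameters beta = 0.2, gamma = 0.1, tau = 30 and the step size are assumptions, and the papers typically use finer integration schemes.

```python
import numpy as np

def mackey_glass(n, tau=30.0, dt=0.1, beta=0.2, gamma=0.1, x0=1.2):
    """Forward-Euler integration of the Mackey-Glass delay differential equation,
    returning n samples taken at unit time spacing."""
    d = int(round(tau / dt))                # delay expressed in Euler steps
    buf = np.full(d, x0)                    # ring buffer: last d values (constant history)
    idx = 0
    x = x0
    steps_per_sample = int(round(1.0 / dt))
    out = []
    for step in range(n * steps_per_sample):
        x_tau = buf[idx]                    # x(t - tau)
        x += dt * (beta * x_tau / (1.0 + x_tau ** 10) - gamma * x)
        buf[idx] = x
        idx = (idx + 1) % d
        if (step + 1) % steps_per_sample == 0:
            out.append(x)
    return np.array(out)

series = mackey_glass(1000)                 # chaotic benchmark series
```

The resulting series oscillates irregularly in a bounded band, which is what makes it the standard test case across these papers.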
Chaos and Complex Systems, 2012
In this paper a traditional multilayer perceptron with a tapped delay line as input is trained to identify the parameters of Chua's circuit when fed with a sequence of values of a scalar state variable. The analysis of the a priori identifiability of the system, performed by resorting to differential algebra, allows one to choose a suitable observable and the minimum number of taps. The results confirm the appropriateness of the proposed approach.
Introduction Predicting the continuation of a time series is an interesting problem, with important applications in almost all fields of human activity. The standard theory views the series as a realization of a random process [1], which is appropriate for systems with many irreducible degrees of freedom. However, for deterministic time series associated with systems with complex chaotic dynamics, only a few degrees of freedom are relevant. Furthermore, even if chaos prevents long-term predictions, the intrinsic determinism in the series offers new possibilities for short-term forecasting [2]. On this basis, many algorithms have recently been devised to reconstruct the underlying dynamics and allow accurate predictions of the next few values in the future [3]. Given the observations of a system {x_i}, i = 0, …, N, the problem is then the reconstruction of the time-series dynamics x_{t+1} = F(X_t), where X_t is a vector of delayed observations.
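A minimal instance of such a reconstruction algorithm is the classical method of analogues: predict the successor of the current state by reusing the observed successor of its nearest past neighbour. The sketch below applies it to the logistic map, which is our choice of example system, not necessarily the one studied in the paper.

```python
import numpy as np

# Generate a chaotic series from the logistic map x_{t+1} = 4 x_t (1 - x_t).
x = np.empty(500)
x[0] = 0.4
for t in range(499):
    x[t + 1] = 4.0 * x[t] * (1.0 - x[t])

def analogue_predict(series):
    """Predict the successor of series[-1] by reusing the observed successor
    of its nearest past neighbour (embedding dimension 1 suffices for this map)."""
    history, query = series[:-1], series[-1]
    j = int(np.argmin(np.abs(history[:-1] - query)))   # nearest past state
    return float(history[j + 1])                       # its observed successor

pred = analogue_predict(x[:400])   # short-term forecast of x[400]
```

With a few hundred stored states the nearest neighbour is close and the one-step forecast is accurate, even though long-term prediction of this map is impossible.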
Applied Mathematics and Computation, 1998
Proceedings of the 9th WSEAS International …, 2008
The problem of chaotic time series prediction is now studied in various disciplines, including engineering, medical and econometric applications. Chaotic time series are the output of a deterministic system with a positive Lyapunov exponent. Time series prediction is a suitable application for a neural network predictor. The NN approach to time series prediction is non-parametric, in the sense that it does not require any information regarding the process that generates the signal. It is shown that a recurrent NN (RNN) with a sufficiently large number of neurons is a realization of the nonlinear ARMA (NARMA) process. In this paper we present the nonlinear autoregressive network with exogenous inputs (NARX): the architecture, the training method, the input data to the network, and the simulation results.
Neurocomputing, 2003
This work analyses the problems related to the reconstruction of a dynamical system which exhibits chaotic behaviour from time series associated with a single observable of the system itself, using a feedforward neural network model. The starting network architecture is obtained by setting the number of input neurons according to Takens' theorem, and is then improved by slightly increasing the number of inputs. The choice of the number of hidden neurons is based on the results obtained by testing different net structures. The effectiveness of the method is demonstrated by applying it to the Brusselator system (Phys. Lett. 91 (1982) 263).
Modeling, Identification and Control: A Norwegian Research Bulletin, 1994
Certain deterministic non-linear systems may show chaotic behaviour. Time series derived from such systems seem stochastic when analyzed with linear techniques. However, uncovering the deterministic structure is important because it allows for construction of more realistic and better models and thus improved predictive capabilities. This paper describes key features of chaotic systems including strange attractors and Lyapunov exponents. The emphasis is on state space reconstruction techniques that are used to estimate these properties, given scalar observations. Data generated from equations known to display chaotic behaviour are used for illustration. A compilation of applications to real data from widely different fields is given. If chaos is found to be present, one may proceed to build non-linear models, which is the topic of the second paper in this series.
Echo State Networks (ESNs) present a novel approach to analysing and training recurrent neural networks (RNNs), leading to a fast, simple and constructive algorithm for supervised training of RNNs. ESNs are a powerful black-box modeling tool for building models that simulate, predict, filter, classify, or control nonlinear dynamical systems; what makes them excel over traditional techniques is that an ESN's "echo" network state can efficiently encode and retain massive information about a long previous history. This makes ESNs excellent approximators of, among other nonlinear dynamical systems, chaotic time series, with obvious applications in prediction tasks such as currency exchange rates. This research proposal aims to further improve upon the best empirical result of predicting a chaotic time series, which was obtained by an ESN, by replacing the linear readout mechanism employed by ESNs with a multilayer perceptron (MLP). This will allow the resulting network to combine the dynamical memory of an ESN with the approximating power of a gradient-descent-trained MLP, yielding a powerful approximator that is smaller in size than an ESN alone and allows for more feasible practical implementations in telecommunications.
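A minimal numerical sketch of the ESN idea follows, with the linear readout that the proposal would replace by an MLP. The reservoir size, spectral radius, input scaling and the sine teacher signal are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(42)
n_res = 100

# Random reservoir rescaled to spectral radius 0.9, a common heuristic
# for the echo state property (exact settings are assumptions).
W = rng.standard_normal((n_res, n_res))
W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))
W_in = rng.uniform(-0.5, 0.5, n_res)

u = np.sin(0.1 * np.arange(1200))             # teacher signal
states = np.zeros((len(u), n_res))
x = np.zeros(n_res)
for t in range(len(u) - 1):
    x = np.tanh(W @ x + W_in * u[t])          # reservoir update
    states[t + 1] = x                         # state after seeing u[0..t]

# Linear readout via ridge regression: state(k), which has seen u[0..k-1],
# predicts u[k]. The proposal would replace exactly this readout with an MLP.
washout = 200                                 # discard the initial transient
S, target = states[washout:], u[washout:]
W_out = np.linalg.solve(S.T @ S + 1e-8 * np.eye(n_res), S.T @ target)
rmse = float(np.sqrt(np.mean((S @ W_out - target) ** 2)))
```

Only the readout weights are trained; the reservoir stays fixed, which is what makes ESN training fast and constructive compared to full RNN training.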
1996
In this work methods for performing time series prediction on complex real world time series are examined. In particular, series exhibiting non-linear or chaotic behaviour are selected for analysis. A range of methodologies based on Takens' embedding theorem are considered and compared with more conventional methods. A novel combination of methods for determining the optimal embedding parameters is employed and tried out with multivariate financial time series data and with a complex series derived from an experiment in biotechnology. The results show that this combination of techniques provides accurate results while dramatically improving the time required to produce predictions and analyses, and eliminating a range of parameters that had hitherto been fixed empirically. The architecture and methodology of the prediction software developed is described along with design decisions and their justification. Sensitivity analyses are employed to justify the use of this combination of methods, and comparisons are made with more conventional predictive techniques and trivial predictors, showing the superiority of the results generated by the work detailed in this thesis.
Physica D: Nonlinear Phenomena, 1998
Local linear prediction, based on the ordinary least squares (OLS) approach, is one of several methods that have been applied to prediction of chaotic time series. Apart from potential linearization errors, a drawback of this approach is the high variance of the predictions under certain conditions. Here, a different set of so-called linear regularization techniques, originally derived to solve ill-posed regression problems, are compared to OLS for chaotic time series corrupted by additive measurement noise. These methods reduce the variance compared to OLS, but introduce more bias. A main tool of analysis is the singular value decomposition (SVD), and a key to successful regularization is to damp the higher order SVD components. Several of the methods achieve improved prediction compared to OLS for synthetic noise-corrupted data from well-known chaotic systems. Similar results were found for real-world data from the R-R intervals of ECG signals. Good results are also obtained for real sunspot data, compared to published predictions using nonlinear techniques.
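The role of SVD damping can be illustrated on a tiny regression problem. Truncated SVD stands in here for the family of regularizers compared in the paper; the toy data and the truncation level are our assumptions.

```python
import numpy as np

def tsvd_solve(A, b, k):
    """Least-squares solution via SVD, keeping only the k largest components.
    Damping the small singular values trades variance for bias."""
    U, s, Vt = np.linalg.svd(A, full_matrices=False)
    s_inv = np.where(np.arange(len(s)) < k, 1.0 / s, 0.0)
    return Vt.T @ (s_inv * (U.T @ b))

# Nearly collinear regressors: plain OLS coefficients become large and
# unstable, while the rank-1 truncated solution stays small and still fits.
rng = np.random.default_rng(3)
x1 = rng.standard_normal(50)
A = np.column_stack([x1, x1 + 1e-4 * rng.standard_normal(50)])
b = x1 + 0.01 * rng.standard_normal(50)       # target is essentially x1
ols = np.linalg.lstsq(A, b, rcond=None)[0]
tsvd = tsvd_solve(A, b, k=1)
```

With k=1 the near-singular direction is discarded, so the coefficients stay O(1) while the fit remains good; OLS typically spreads a large spurious weight of opposite signs across the two collinear columns.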
2001
We address two aspects of chaotic time series analysis, namely the determination of the embedding parameters and of the largest Lyapunov exponent; these are necessary for performing state space reconstruction and for identifying chaotic behavior. For the first aspect, we examine mutual information for determining the time delay and the false nearest neighbors method for choosing an appropriate embedding dimension. For the second aspect we suggest a neural network approach, which is characterized by simplicity and accuracy.
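The mutual-information criterion for the time delay can be sketched with a simple histogram estimator. The bin count and the sine test signal are illustrative assumptions; the standard recipe takes the first local minimum of this curve as the embedding delay.

```python
import numpy as np

def ami(x, lag, bins=16):
    """Average mutual information between x(t) and x(t+lag),
    estimated from a 2-D histogram of the lagged pairs."""
    a, b = x[:-lag], x[lag:]
    pxy, _, _ = np.histogram2d(a, b, bins=bins)
    pxy = pxy / pxy.sum()
    px = pxy.sum(axis=1, keepdims=True)       # marginal of x(t)
    py = pxy.sum(axis=0, keepdims=True)       # marginal of x(t+lag)
    nz = pxy > 0
    return float(np.sum(pxy[nz] * np.log(pxy[nz] / (px @ py)[nz])))

# Sweep the lag for a sine sampled at 200 points per period.
t = np.linspace(0, 40 * np.pi, 4000)
x = np.sin(t)
mis = [ami(x, lag) for lag in range(1, 80)]
```

For the sine, the curve bottoms out near a quarter period, the classic delay choice that makes the delay coordinates maximally independent.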
WSEAS Transactions on Computer Research, 2008
The prediction of chaotic time series with neural networks is a traditional practical problem of dynamic systems. This paper is not intended to propose a new model or a new methodology, but to study carefully and thoroughly several aspects of a model on which there are not enough published experimental data, and to derive conclusions that would be of interest. Recurrent neural network (RNN) models are important not only for the forecasting of time series but also generally for the control of dynamical systems. An RNN with a sufficiently large number of neurons is a nonlinear autoregressive and moving average (NARMA) model, with "moving average" referring to the inputs. The prediction can be assimilated to identification of a dynamic process. An architectural approach to RNNs with embedded memory, the "Nonlinear AutoRegressive model process with eXogenous input" (NARX), showing promising qualities for dynamic system applications, is analyzed in this paper. The performance of the NARX model is verified for several types of chaotic or fractal time series applied as input to the neural network, in relation to the number of neurons, the training algorithms and the dimensions of its embedded memory. In addition, this work attempts to identify a way to use classic statistical methodologies (R/S rescaled range analysis and the Hurst exponent) to obtain new methods for improving the efficiency of chaotic time series prediction with NARX.
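The NARX input structure, tapped delays of both the output and an exogenous input, can be sketched as follows. A linear least-squares fit stands in for the trained neural network, and the toy linear system is our assumption.

```python
import numpy as np

def narx_design(y, u, dy=3, du=2):
    """Build NARX training pairs: regressors are the delayed outputs
    y[t-1..t-dy] and delayed exogenous inputs u[t-1..t-du], target is y[t]."""
    start = max(dy, du)
    rows = [np.concatenate([y[t - dy:t][::-1], u[t - du:t][::-1]])
            for t in range(start, len(y))]
    return np.array(rows), y[start:]

# Toy system y[t] = 0.5*y[t-1] + u[t-1] + noise; the least-squares fit on the
# NARX regressors should recover roughly [0.5, 0, 0, 1, 0].
rng = np.random.default_rng(1)
u = rng.standard_normal(400)
e = rng.standard_normal(400)
y = np.zeros(400)
for t in range(1, 400):
    y[t] = 0.5 * y[t - 1] + u[t - 1] + 0.1 * e[t]
X, z = narx_design(y, u)
coef, *_ = np.linalg.lstsq(X, z, rcond=None)
```

A NARX neural network replaces this linear map with a nonlinear one but consumes exactly the same tapped-delay regressors, which is what gives it its embedded memory.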
Expert Systems with Applications, 2011
In this paper, two CI techniques, namely, single multiplicative neuron (SMN) model and adaptive neurofuzzy inference system (ANFIS), have been proposed for time series prediction. A variation of particle swarm optimization (PSO) with cooperative sub-swarms, called COPSO, has been used for estimation of SMN model parameters leading to COPSO-SMN. The prediction effectiveness of COPSO-SMN and ANFIS has been illustrated using commonly used nonlinear, non-stationary and chaotic benchmark datasets of Mackey-Glass, Box-Jenkins and biomedical signals of electroencephalogram (EEG). The training and test performances of both hybrid CI techniques have been compared for these datasets.
Neural Processing …, 2011
The accuracy of a model to forecast a time series diminishes as the prediction horizon increases, in particular when the prediction is carried out recursively. Such decay is faster when the model is built using data generated by highly dynamic or chaotic systems. This paper presents a topology and training scheme for a novel artificial neural network, named "Hybrid-connected Complex Neural Network" (HCNN), which is able to capture the dynamics embedded in chaotic time series and to predict long horizons of such series. HCNN is composed of small recurrent neural networks, inserted in a structure made of feed-forward and recurrent connections and trained in several stages using the back-propagation through time (BPTT) algorithm. In experiments using a Mackey-Glass time series and an electrocardiogram (ECG) as training signals, HCNN was able to output stable chaotic signals, oscillating for periods as long as four times the size of the training signals. The largest local Lyapunov exponent (LE) of the predicted signals was positive (evidence of chaos) and similar to the LE calculated over the training signals. The magnitudes of peaks in the ECG signal were not accurately predicted, but the predicted signal was similar to the ECG in the rest of its structure.
Chaos, Solitons & Fractals, 2008
Based on the genetic algorithm (GA) and the steepest descent method (SDM), this paper proposes a hybrid algorithm for the learning of neural networks to identify chaotic systems. The systems in question are the logistic map and the Duffing equation. Different identification schemes are used to identify the logistic map and the Duffing equation, respectively. Simulation results show that our hybrid algorithm is more efficient than other methods.
Journal of Computational Methods in Sciences and Engineering, 2016
In this paper, we used four types of artificial neural network (ANN) to predict the behavior of chaotic time series. Each neural network used in this paper acts as a global model to predict the future behavior of the time series. The prediction process is based on the embedding theorem and the time delay determined by this theorem. These ANNs are applied to the time series generated by the Mackey-Glass equation, which has chaotic behavior. In the end, all neural networks are used to solve this problem and their results are compared and analyzed.
2008
This paper examines how efficient neural networks are relative to linear and polynomial approximations in forecasting a time series generated by the chaotic Mackey-Glass differential delay equation. The forecasting horizon is one step ahead. A series of regressions with polynomial approximators and a simple neural network with two neurons is carried out, and the multiple correlation coefficients are compared. The neural network, a very simple one, is superior to the polynomial expansions and delivers virtually perfect forecasts. Finally, the neural network is much more precise, relative to the other methods, across a wide set of realizations.
Technological Forecasting and Social Change, 1991
It is well known that a nonlinear recursive equation can produce a chaotic sequence at certain values of the parameter. Furthermore, in the chaotic regime, extremely small changes in the initial value or in the value of the parameter produce very large changes in the sequence. It is surprising therefore that a short segment of a chaotic sequence can be used to reconstruct large portions of the sequence and to forecast future values of the sequence over short ranges. Furthermore, while the accuracy of the forecasts is dependent on the precision of the data, the relationship is much less sensitive than might have been expected. This is demonstrated by fitting a two-parameter model to two different types of chaotic equations: one a polynomial and the other a trigonometric function. This leads to the expectation that under certain circumstances, it may be possible to forecast values in a chaotic series over limited ranges, even if initial data are somewhat degraded.