1994, Chaos, Solitons & Fractals
A number of different measures of complexity have been described, discussed, and applied to the logistic map. A classification of these measures has been proposed, distinguishing homogeneous and generating partitions in phase space as well as structural and dynamical elements of the considered measure. The specific capabilities of particular measures to detect particular types of behavior of dynamical systems have been investigated and compared with each other. (R. Wackerbauer et al.)
Physical Review E, 2020
We propose a metric to characterize the complex behavior of a dynamical system and to distinguish between organized and disorganized complexity. The approach combines two quantities that separately assess the degree of unpredictability of the dynamics and the lack of describability of the structure in the Poincaré plane constructed from a given time series. As for the former, we use the permutation entropy Sp, while for the latter, we introduce an indicator, the structurality ∆, which accounts for the fraction of visited points in the Poincaré plane. The complexity measure, defined as the sum of those two components, is validated by classifying in the (Sp,∆) space the complexity of several benchmark dissipative and conservative dynamical systems. As an application, we show how the metric can be used as a powerful biomarker for different cardiac pathologies and to distinguish the dynamical complexity of two electrochemical dissolutions.
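The two ingredients of this metric can be sketched in a few lines. The following is a rough illustration, not the authors' code: the permutation entropy is the standard ordinal-pattern (Bandt-Pompe) quantity, and the occupation fraction of a coarse-grained grid over (x_t, x_{t+1}) stands in for the structurality ∆; the embedding order, grid size, and logistic-map test signal are illustrative assumptions.

```python
import math
from itertools import permutations

def permutation_entropy(x, order=3):
    """Normalized permutation entropy of sequence x with embedding order `order`."""
    counts = {p: 0 for p in permutations(range(order))}
    for i in range(len(x) - order + 1):
        window = x[i:i + order]
        # ordinal pattern = argsort of the window
        pattern = tuple(sorted(range(order), key=lambda k: window[k]))
        counts[pattern] += 1
    n = sum(counts.values())
    h = -sum((c / n) * math.log(c / n) for c in counts.values() if c > 0)
    return h / math.log(math.factorial(order))  # normalized to [0, 1]

def structurality(x, bins=50):
    """Fraction of visited cells in a bins x bins grid over (x_t, x_{t+1})."""
    lo, hi = min(x), max(x)
    span = (hi - lo) or 1.0
    cells = {(int((a - lo) / span * (bins - 1)),
              int((b - lo) / span * (bins - 1)))
             for a, b in zip(x, x[1:])}
    return len(cells) / bins ** 2

# Logistic map at r = 4 (chaotic): high entropy, but the Poincare points
# occupy only a thin parabolic curve, so the occupied fraction stays small.
x, xs = 0.4, []
for _ in range(5000):
    x = 4.0 * x * (1.0 - x)
    xs.append(x)
print(permutation_entropy(xs), structurality(xs))
```

A monotone series has a single ordinal pattern and hence zero permutation entropy, which is a quick sanity check on the implementation.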
Observational data of natural systems, as measured in astrophysical, geophysical or physiological experiments, are typically quite different from those obtained in laboratories. Due to the peculiarities of these data, well-known characteristics, such as periodicities or fractal dimensions, often do not provide a suitable description. To study such data, we present here the use of measures of complexity, which are mainly based on symbolic dynamics. We distinguish two types of such quantities: traditional measures (e.g. algorithmic complexity), which are measures of randomness, and alternative measures (e.g. -complexity), which relate highest complexity to some critical points. It is important to note that there is no optimum measure of complexity; its choice should depend on the context. Mostly, a combination of several such quantities is appropriate. Applying this concept to three examples in astrophysics, cardiology and cognitive psychology, we show that it can be helpful also in cases where other tools of data analysis fail.
Statistical complexity measures are used to quantify the degree of complexity of the delayed logistic map, with linear and nonlinear feedback. We employ two methods for calculating the complexity measures, one with the 'histogram-based' probability distribution function and the other one with ordinal patterns. We show that these methods provide complementary information about the complexity of the delay-induced dynamics: there are parameter regions where the histogram-based complexity is zero while the ordinal pattern complexity is not, and vice versa. We also show that the time series generated from the nonlinear delayed logistic map can present zero missing or forbidden patterns, i.e. all possible ordinal patterns are realized in the orbits.
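The missing/forbidden-pattern count mentioned above is easy to check on the ordinary (undelayed) logistic map, used here as an illustrative stand-in rather than the delayed map studied in the paper. At r = 4 a descending triple x_t > x_{t+1} > x_{t+2} cannot occur, because a decrease requires x > 3/4 while the image of (3/4, 1] lies below 3/4; so at least one ordinal pattern of order 3 is forbidden:

```python
import math

def missing_patterns(x, order=3):
    """Number of ordinal patterns of the given order absent from series x."""
    seen = set()
    for i in range(len(x) - order + 1):
        w = x[i:i + order]
        seen.add(tuple(sorted(range(order), key=lambda k: w[k])))
    return math.factorial(order) - len(seen)

# Orbit of the logistic map at r = 4; the descending order-3 pattern
# never appears, so the count of missing patterns is positive.
x, xs = 0.4, []
for _ in range(5000):
    x = 4.0 * x * (1.0 - x)
    xs.append(x)
print(missing_patterns(xs, order=3))
```

For contrast, a strictly increasing series realizes only the identity pattern, so five of the six order-3 patterns are missing.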
The generalized Statistical Complexity Measure (SCM) is a functional that characterizes the probability distribution P associated to the time series generated by a dynamical system under study. It quantifies not only randomness but also the presence of correlational structures. In this seminar several fundamental issues are reviewed: a) selection of the information measure I; b) selection of the probability metric space and its corresponding distance D; c) definition of the generalized disequilibrium Q; d) selection of the probability distribution P associated to a dynamical system or time series under study, which, in fact, is a basic problem. Here we show that improvements can be expected if the underlying probability distribution is "extracted" by appropriate consideration regarding causal effects in the system's dynamics. Several well-known model-generated time series, usually regarded as being of either stochastic or chaotic nature, are analyzed. The main achievement of this approach is the possibility of clearly distinguishing between them in the Entropy-Complexity representation space, something that is rather difficult otherwise.
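A rough sketch of the entropy-times-disequilibrium construction described above, under assumed choices that the abstract leaves open: Shannon entropy for the information measure I, and a Jensen-Shannon divergence from the uniform distribution for the disequilibrium Q, normalized by its value at a fully concentrated distribution:

```python
import math

def shannon(p):
    """Shannon entropy of a probability vector (natural log)."""
    return -sum(q * math.log(q) for q in p if q > 0)

def statistical_complexity(p):
    """C = H[P] * Q[P, P_e]: normalized entropy times normalized JS disequilibrium."""
    n = len(p)
    h = shannon(p) / math.log(n)          # normalized Shannon entropy
    pe = [1.0 / n] * n                    # uniform reference distribution P_e

    def js(a, b):                         # Jensen-Shannon divergence
        m = [(u + v) / 2 for u, v in zip(a, b)]
        return shannon(m) - 0.5 * shannon(a) - 0.5 * shannon(b)

    delta = [1.0] + [0.0] * (n - 1)       # maximally concentrated state
    return h * js(p, pe) / js(delta, pe)  # Q normalized by its maximum

print(statistical_complexity([0.5, 0.25, 0.125, 0.125]))
```

By construction C vanishes both for the uniform distribution (no structure, Q = 0) and for a delta distribution (no entropy, H = 0), and is positive in between, which is the defining feature of this family of measures.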
Open Physics, 2015
We have proposed novel measures based on the Kolmogorov complexity for use in complex system behavior studies and time series analysis. We have considered the background of the Kolmogorov complexity and also discussed the meaning of the physical as well as other complexities. To get better insights into the complexity of complex systems and time series analysis we have introduced three novel measures based on the Kolmogorov complexity: (i) the Kolmogorov complexity spectrum, (ii) the Kolmogorov complexity spectrum highest value and (iii) the overall Kolmogorov complexity. The characteristics of these measures have been tested using a generalized logistic equation. Finally, the proposed measures have been applied to different time series originating from: a model output (the biochemical substance exchange in a multi-cell system), four different geophysical phenomena (dynamics of river flow, long-term precipitation, indoor 222Rn concentration and UV radiation dose) and economy (stock price dynamics). The results obtained offer deeper insights into the complexity of system dynamics behavior and time series analysis when the proposed complexity measures are applied. […] distributed manner; there are many connections between the system's parts [2,3]; (3) it is difficult to model complex systems and to predict their behavior even if one knows to a large extent the parts of such systems and the connections between the parts. The complexity of a system depends on the number of its elements and the connections between the elements (the system's structure). In a review paper Crutchfield [5] has underlined: "Spontaneous organization, as a common phenomenon, reminds us of a more basic, nagging puzzle. If, as Poincaré found, chaos is endemic to dynamics, why is the world not a mass of randomness? The world is, in fact, quite structured, and we now know several of the mechanisms that shape microscopic fluctuations as they are amplified to macroscopic patterns. Critical phenomena in statistical mechanics and pattern formation in dynamics are two arenas that explain in predictive detail how spontaneous organization works. Moreover, everyday experience shows us that nature inherently organizes; it generates pattern. Pattern is as much the fabric of life as life's unpredictability." These sentences are also related to the phenomenon of the complexity of systems in many disciplines, ranging from philosophy and cognitive science to evolutionary and developmental biology and particle astrophysics [5,9 and references therein].
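A minimal sketch of the thresholding idea behind a Kolmogorov complexity spectrum (the paper's exact construction may differ; the LZ76 phrase count as a Kolmogorov-complexity estimator, the threshold sweep, and the logistic-map test signals are all assumptions here): the series is binarized against each threshold in a sweep, and the normalized Lempel-Ziv complexity of each binary string forms the spectrum, whose maximum is the "highest value" measure.

```python
import math

def lz76(s):
    """Count phrases in the Lempel-Ziv (1976) parsing of string s."""
    i, phrases, n = 0, 0, len(s)
    while i < n:
        l = 1
        # extend the phrase while it already occurred earlier in the string
        while i + l <= n and s[i:i + l] in s[:i + l - 1]:
            l += 1
        phrases += 1
        i += l
    return phrases

def kc_spectrum(x, n_thresholds=20):
    """Normalized LZ complexity of the series binarized at a sweep of thresholds."""
    lo, hi = min(x), max(x)
    n = len(x)
    spec = []
    for k in range(1, n_thresholds + 1):
        t = lo + (hi - lo) * k / (n_thresholds + 1)
        s = "".join("1" if v >= t else "0" for v in x)
        spec.append(lz76(s) * math.log2(n) / n)  # normalize by n / log2(n)
    return spec

def logistic_series(r, x0=0.35, n=2000):
    x, out = x0, []
    for _ in range(n):
        x = r * x * (1.0 - x)
        out.append(x)
    return out

# Chaotic orbits yield a much higher spectrum maximum than periodic ones.
spec_chaotic = kc_spectrum(logistic_series(4.0))
spec_periodic = kc_spectrum(logistic_series(3.2))
print(max(spec_chaotic), max(spec_periodic))
```

The constant string "000…0" parses into 2 phrases and the alternating string "0101…" into 3, the classic low-complexity baselines for this parser.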
Physica D: Nonlinear Phenomena, 2004
A statistical measure of complexity utilising the concept of entropy or information is proposed. Our approach in this study is to use a nonextensive entropy instead of an extensive (additive) Shannon entropy in the definition; the measure can be characterised as a difference between the qth-order Rényi entropy and the second-order one. Furthermore, we devise a conditional, joint, and mutual complexity measure as a coherent possibility. The behavior of the measure for the logistic map shows that it is more sensitive to nonextensivity at the transition point a_c ≈ 3.8284… than at any other value when 0 < q < 1.
Physica A: Statistical Mechanics and its Applications, 2005
Open Systems & Information Dynamics, 1992
Measures of complexity and meaning in nonlinear dynamical systems are presented, applied to specific examples, and compared with each other. A basic conceptual and operational equivalence of both kinds of measures is described. This equivalence substantiates earlier indications by Arian and Grassberger for a close relationship between complexity and meaning. Both concepts are suggested as candidates to demonstrate the necessity of certain extensions and modifications required to update several habitual regulative principles of the exact sciences.
2010
A generalized Statistical Complexity Measure (SCM) is a functional that characterizes the probability distribution P associated to the time series generated by a given dynamical system. It quantifies not only randomness but also the presence of correlational structures. We review here several fundamental issues in such a respect, namely, (a) the selection of the information measure I; (b) the choice of the probability metric space and associated distance D; (c) the question of defining the so-called generalized disequilibrium Q; (d) the adequate way of picking up the probability distribution P associated to a dynamical system or time series under study, which is indeed a fundamental problem. In this communication we show (point d) that sensible improvements in the final results can be expected if the underlying probability distribution is "extracted" via appropriate consideration regarding causal effects in the system's dynamics.
Physica A: Statistical Mechanics and its Applications, 2006
We discuss bounds on the values adopted by the generalized statistical complexity measures [M.T. Martin et al., Phys. Lett. A 311 (2003) 126; P.W. Lamberti et al., Physica A 334 (2004) 119] introduced by López-Ruiz et al. [Phys. Lett. A 209 (1995) 321] and Shiner et al. [Phys. Rev. E 59 (1999) 1459]. Several new theorems are proved and illustrated with reference to the celebrated logistic map.
The definition of complexity through Statistical Complexity Measures (SCM) has recently seen major improvements. Mostly, effort is concentrated in measures on time series. We propose a SCM definition for spatial dynamical systems. Our definition is in line with the trend to combine entropy with measures of structure (such as disequilibrium). We study the behaviour of our definition against the vectorial noise model of Collective Motion. From a global perspective, we show how our SCM is minimal at both the microscale and macroscale, while it reaches a maximum at the ranges that define the mesoscale in this model. From a local perspective, the SCM is minimum both in highly ordered and chaotic areas, while it reaches a maximum at the edges between such areas. These characteristics suggest this is a good candidate for detecting the mesoscale of arbitrary dynamical systems as well as regions where the complexity is maximal in such systems.
International Journal of Theoretical Physics, 1997
The relation between chaotic behavior and complexity for one-dimensional maps is discussed. The one-dimensional maps are mapped into a binary string via symbolic dynamics in order to evaluate the complexity. We apply the complexity measure of Lempel and Ziv to these binary strings. To characterize the chaotic behavior, we calculate the Liapunov exponent. We show that the exact normalized complexity for the logistic map f: [0,1] → [0,1], f(x) = 4x(1-x), is given by 1. Since the discovery of chaotic attractors, chaos has become an important concept in nearly all branches of the natural sciences. The difference and differential equations which are believed to govern our natural world and may exhibit chaotic attractors are widely discussed in the literature (see, for example, Steeb 1992a,b, 1996). The simplest systems showing chaotic behavior are one-dimensional maps f: I → I, where I is an interval. A definition for chaos is as follows: Let X be a set. The mapping g: X → X is said to be chaotic on X if (1) g has sensitive dependence on initial conditions, (2) g is topologically transitive, and (3) periodic points are dense in X. Here we use the Liapunov exponent to characterize chaos. The exponent measures the sensitive dependence on initial conditions. Many different definitions of complexity have been proposed in the literature. Among them are: algorithmic complexity (Kolmogorov-Chaitin)
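The Liapunov exponent used above can be estimated directly as the orbit average of log |f'(x)|; for the logistic map at r = 4 the exact value is ln 2, consistent with the normalized complexity of 1. A minimal sketch (initial condition, burn-in, and iteration counts are illustrative choices):

```python
import math

def lyapunov_logistic(r, x0=0.3, n=20000, burn=500):
    """Estimate the Liapunov exponent of x -> r x (1 - x) from an orbit average."""
    x = x0
    for _ in range(burn):           # discard the transient
        x = r * x * (1.0 - x)
    total = 0.0
    for _ in range(n):
        x = r * x * (1.0 - x)
        total += math.log(abs(r * (1.0 - 2.0 * x)))  # log |f'(x)|
    return total / n

print(lyapunov_logistic(4.0))   # close to ln 2, the exact value for r = 4
print(lyapunov_logistic(3.2))   # negative: the period-2 orbit is stable
```

A positive exponent (sensitive dependence on initial conditions) at r = 4 versus a negative one at r = 3.2 separates the chaotic from the periodic regime.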
Complexity, 2007
For many systems characterized as "complex" the patterns exhibited on different scales differ markedly from one another. For example, the biomass distribution in a human body "looks very different" depending on the scale at which one examines it. Conversely, the patterns at different scales in "simple" systems (e.g., gases, mountains, crystals) vary little from one scale to another. Accordingly, the degrees of self-dissimilarity between the patterns of a system at various scales constitute a complexity "signature" of that system. Here we present a novel quantification of self-dissimilarity. This signature can, if desired, incorporate a novel information-theoretic measure of the distance between probability distributions that we derive here. Whatever distance measure is chosen, our quantification of self-dissimilarity can be measured for many kinds of real-world data. This allows comparisons of the complexity signatures of wholly different kinds of systems (e.g., systems involving information density in a digital computer vs. species densities in a rain forest vs. capital density in an economy, etc.). Moreover, in contrast to many other suggested complexity measures, evaluating the self-dissimilarity of a system does not require one to already have a model of the system. These facts may allow self-dissimilarity signatures to be used as the underlying observational variables of an eventual overarching theory relating all complex systems. To illustrate self-dissimilarity, we present several numerical experiments. In particular, we show that the underlying structure of the logistic map is picked out by the self-dissimilarity signature of time series produced by that map.
International Journal of Theoretical Physics, 1986
Quantities are defined operationally which qualify as measures of complexity of patterns arising in physical situations. Their main features, distinguishing them from previously used quantities, are the following: (1) they are measure-theoretic concepts, more closely related to Shannon entropy than to computational complexity; and (2) they are observables related to ensembles of patterns, not to individual patterns. Indeed, they are essentially Shannon information needed to specify not individual patterns, but either measure-theoretic or algebraic properties of ensembles of patterns arising in a priori translationally invariant situations. Numerical estimates of these complexities are given for several examples of patterns created by maps and by cellular automata.
Arxiv preprint adap-org/9909002, 1999
In recent studies, new measures of complexity for nonlinear systems have been proposed on probabilistic grounds, such as the LMC measure (Phys. Lett. A 209 (1995) 321) or the SDL measure (Phys. Rev. E 59 (1999) 1459). All these measures share an intuitive consideration: complexity seems to emerge in nature close to instability points, such as the phase transition points characteristic of critical phenomena. Here we discuss these measures and their reliability for detecting complexity close to critical points in complex systems composed of many interacting units. Both a two-dimensional spatially extended problem (the 2D Ising model) and an ∞-dimensional (random graph) model (random Boolean networks) are analysed. It is shown that the LMC and the SDL measures can be easily generalized to extended systems but fail to detect real complexity.
Physical Review E, 2009
We evaluate information theoretic quantities that quantify complexity in terms of k-th order statistical dependencies that cannot be reduced to interactions among k − 1 random variables. Using symbolic dynamics of coupled maps and cellular automata as model systems, we demonstrate that these measures are able to identify complex dynamical regimes.
Physics Letters A, 2001
We apply a generalized version of the Kolmogorov-Sinai entropy, based on a non-extensive form, to analyzing the dynamics of the logistic map at the chaotic threshold, the paradigm of power-law sensitivity to initial conditions. We make the statistical averages on the distribution of the power indexes β, and we show that the resulting entropy time evolution becomes a linear function of time if we assign to the non-extensive index q the value Q < 1 prescribed by the heuristic arguments of earlier work. We also show that the emerging entropy index Q is determined by the asymptotic mean value of the index β, and that this same mean value determines the strength of the logarithmic time increase of entropy, stemming from the adoption of the ordinary Shannon form.
Arxiv preprint nlin/0307013, 2003
Some aspects of the predictability problem in dynamical systems are reviewed. The deep relation among Lyapunov exponents, Kolmogorov-Sinai entropy, Shannon entropy and algorithmic complexity is discussed. In particular, we emphasize how a characterization of the unpredictability of a system gives a measure of its complexity. Special attention is devoted to finite-resolution effects on predictability, which can be accounted for with suitable generalizations of the standard indicators. The problems involved in systems with intrinsic randomness are discussed, with emphasis on the important problems of distinguishing chaos from noise and of modeling the system. PACS numbers: 45.05.+x, 05.45.-a. "All the simple systems are simple in the same way; each complex system has its own complexity" (freely inspired by Anna Karenina by Lev N. Tolstoy).
Journal of Statistical Physics, 2004
We construct a complexity measure from first principles, as an average over the “obstruction against prediction” of some observable that can be chosen by the observer. Our measure evaluates the variability of the predictability for characteristic system behaviors, which we extract by means of the thermodynamic formalism. Using theoretical and experimental applications, we show that “complex” and “chaotic” are different notions of perception. In comparison to other proposed measures of complexity, our measure is easily computable, non-divergent for the classical 1-d dynamical systems, and has properties of non-overuniversality. The measure can also be computed for higher-dimensional and experimental systems, including systems composed of different attractors. Moreover, the results of the computations made for classical 1-d dynamical systems imply that it is not the nonhyperbolicity, but the existence of a continuum of characteristic system length scales, that is at the heart of complexity.
NATO ASI Series, 1989
This volume serves as a general introduction to the state of the art of quantitatively characterizing chaotic and turbulent behavior. It is the outgrowth of an international workshop on "Quantitative Measures of Dynamical Complexity and Chaos" held at Bryn Mawr College, June 22-24, 1989. The workshop was co-sponsored by the Naval Air Development Center in Warminster, PA and by the NATO Scientific Affairs Programme through its special program on Chaos and Complexity. Meetings on this subject have occurred regularly since the NATO workshop held in June 1983 at Haverford College, only two kilometers distant from the site of this latest in the series. At that first meeting, organized by J. Gollub and H. Swinney, quantitative tests for nonlinear dynamics and chaotic behavior were debated and promoted [1]. In the six years since, the methods for dimension, entropy and Lyapunov exponent calculations have been applied in many disciplines and the procedures have been refined. Since then it has been necessary to demonstrate quantitatively that a signal is chaotic rather than it being acceptable to observe that "it looks chaotic". Other related meetings have included the Pecos River Ranch meeting in September 1985 of G. Mayer-Kress [2] and the reflective and forward-looking gathering near Jerusalem organized by M. Shapiro and I. Procaccia in December 1986 [3]. This meeting was proof that interest in measuring chaotic and turbulent signals is widespread. Those facing limits of precision or length of data sets are hard at work developing new algorithms and refining the accuracy of old ones. Applications to symbolic dynamics and to spatio-temporal dynamics are also now emerging, with "complexity" as the byword for what is an even richer subject than "chaos". The success of the meeting was in large part guaranteed by the enthusiasm of the participants, but without the tireless efforts of a few key persons, the order of the meeting would have fallen victim to the ever looming chaos.
Special thanks go to Ann Daudert, secretary of the physics department at Bryn Mawr College, and her assistant, Linath Lin. We also acknowledge the behind-the-scenes and late-night efforts of the staff of the Bryn Mawr Summer Conference Office under the direction of L. Zernicke. Many others of our colleagues and associates contributed as needed, including M.E. Farrell, G. Alman, H. Lin, and N. Tufillaro. To all of them go our warmest gratitude. With help such as theirs, it will always be more of a pleasure than a burden to organize a meeting. Finally we should acknowledge special efforts that enlivened the meeting. J. Doran and her staff provided excellent meals and refreshments. L. Caruso-Haviland and a small crew of dedicated performers and technical staff enriched one evening with "Chaotic Metamorphoses", an inspired combination of video, cinematography, choreography, and readings. The program notes for the performance are included as part of these proceedings. Their conference T-shirts, "Complexity and Chaos at Bryn Mawr College", were duly earned. Perhaps our principal regret (and pleasure) will be the constant task of explaining the scientific meaning of the T-shirt title in an effort to ride the public relations wave crest.