Physical Review E, 2009
We evaluate information theoretic quantities that quantify complexity in terms of k-th order statistical dependencies that cannot be reduced to interactions among k − 1 random variables. Using symbolic dynamics of coupled maps and cellular automata as model systems, we demonstrate that these measures are able to identify complex dynamical regimes.
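A common way to make "k-th order dependencies that cannot be reduced to interactions among k − 1 variables" precise is the connected-information hierarchy; whether this is the exact functional used in the paper is an assumption of this sketch. Writing P̃(k) for the maximum-entropy distribution consistent with all marginals of order k, the order-k contribution is

```latex
I^{(k)} \;=\; H\!\left[\tilde{P}^{(k-1)}\right] \;-\; H\!\left[\tilde{P}^{(k)}\right] \;\ge\; 0,
```

so that a strictly positive I^{(k)} signals structure that genuinely requires k-body interactions.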
Physical Review E, 2020
We propose a metric to characterize the complex behavior of a dynamical system and to distinguish between organized and disorganized complexity. The approach combines two quantities that separately assess the degree of unpredictability of the dynamics and the lack of describability of the structure in the Poincaré plane constructed from a given time series. For the former we use the permutation entropy Sp, while for the latter we introduce an indicator, the structurality ∆, which accounts for the fraction of visited points in the Poincaré plane. The complexity measure, defined as the sum of these two components, is validated by classifying in the (Sp, ∆) space the complexity of several benchmark dissipative and conservative dynamical systems. As an application, we show how the metric can be used as a powerful biomarker for different cardiac pathologies and to distinguish the dynamical complexity of two electrochemical dissolutions.
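A minimal sketch of the two ingredients, assuming a Bandt-Pompe ordinal-pattern estimate for Sp and an occupied-cell count on a coarse-grained return map for ∆; the embedding dimension, grid size, and normalization below are illustrative choices, not necessarily the paper's:

```python
import numpy as np
from math import factorial, log

def permutation_entropy(x, d=4, tau=1):
    """Normalized permutation entropy S_p from ordinal patterns (Bandt-Pompe)."""
    counts = {}
    for i in range(len(x) - (d - 1) * tau):
        pattern = tuple(np.argsort(x[i:i + (d - 1) * tau + 1:tau]))
        counts[pattern] = counts.get(pattern, 0) + 1
    p = np.array(list(counts.values()), dtype=float) / sum(counts.values())
    return float(-np.sum(p * np.log(p)) / log(factorial(d)))

def structurality(x, bins=50):
    """Delta: fraction of occupied cells in the (x_n, x_{n+1}) Poincare plane."""
    h, _, _ = np.histogram2d(x[:-1], x[1:], bins=bins)
    return np.count_nonzero(h) / h.size

# Example: fully developed chaos in the logistic map
x = np.empty(20_000); x[0] = 0.3
for i in range(1, len(x)):
    x[i] = 4.0 * x[i - 1] * (1.0 - x[i - 1])
Sp, Delta = permutation_entropy(x), structurality(x)
complexity = Sp + Delta   # the combined metric described in the abstract
```

For the logistic map the return map collapses onto a parabola, so ∆ stays small even though Sp is high, which is the kind of separation between unpredictability and structure the abstract exploits.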
Arxiv preprint adap-org/9909002, 1999
In recent studies, new measures of complexity for nonlinear systems have been proposed on probabilistic grounds, such as the LMC measure (Phys. Lett. A 209 (1995) 321) or the SDL measure (Phys. Rev. E 59 (1999) 2). All these measures share an intuitive consideration: complexity seems to emerge in nature close to instability points, for example at the phase-transition points characteristic of critical phenomena. Here we discuss these measures and their reliability for detecting complexity close to critical points in complex systems composed of many interacting units. Both a two-dimensional spatially extended problem (the 2D Ising model) and an infinite-dimensional (random graph) model (random Boolean networks) are analysed. It is shown that the LMC and SDL measures can be easily generalized to extended systems but fail to detect real complexity.
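For reference, the LMC measure cited above combines the Shannon entropy H of a distribution P = {p_i} with the "disequilibrium" D, the squared Euclidean distance of P from the uniform distribution:

```latex
C_{\mathrm{LMC}} \;=\; H \cdot D \;=\; \left(-\sum_{i=1}^{N} p_i \log p_i\right)\sum_{i=1}^{N}\left(p_i - \tfrac{1}{N}\right)^{2}.
```

It vanishes both for perfect order (H = 0) and for full randomness (D = 0), peaking in between, which is what makes its behavior at critical points the natural test case.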
Physica A: Statistical Mechanics and its Applications, 2005
International Journal of Theoretical Physics, 1986
Quantities are defined operationally which qualify as measures of complexity of patterns arising in physical situations. Their main features, distinguishing them from previously used quantities, are the following: (1) they are measure-theoretic concepts, more closely related to Shannon entropy than to computational complexity; and (2) they are observables related to ensembles of patterns, not to individual patterns. Indeed, they are essentially the Shannon information needed to specify not individual patterns, but either measure-theoretic or algebraic properties of ensembles of patterns arising in a priori translationally invariant situations. Numerical estimates of these complexities are given for several examples of patterns created by maps and by cellular automata.
Information and Computation/information and Control, 2008
Cellular Automata can be considered discrete dynamical systems and, at the same time, a model of parallel computation. In this paper we investigate the connections between dynamical and computational properties of Cellular Automata. We propose a classification of Cellular Automata according to the language complexities which arise from the basins of attraction of subshift attractors and investigate the…
Physical Review Letters, 1989
Statistical mechanics is used to describe the observed information processing complexity of nonlinear dynamical systems. We introduce a measure of complexity distinct from and dual to the information-theoretic entropies and dimensions. A technique is presented that directly reconstructs minimal equations of motion from the recursive structure of measurement sequences. Application to the period-doubling cascade demonstrates a form of superuniversality that refers only to the entropy and complexity of a data stream.
Artificial life, 2015
In the past decades many definitions of complexity have been proposed. Most of these definitions are based either on Shannon's information theory or on Kolmogorov complexity; these two are often compared, but very few studies integrate the two ideas. In this article we introduce a new measure of complexity that builds on both of these theories. As a demonstration of the concept, the technique is applied to elementary cellular automata and simulations of the self-organization of porphyrin molecules.
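The paper's combined Shannon/Kolmogorov measure is its own construction and is not reproduced here, but the elementary-cellular-automaton testbed is standard; a minimal update step, assuming Wolfram's rule-number convention and periodic boundaries:

```python
import numpy as np

def eca_step(state, rule):
    """One synchronous step of an elementary CA (periodic boundaries).
    Bit k of the Wolfram rule number gives the output for neighborhood code k."""
    left, right = np.roll(state, 1), np.roll(state, -1)
    code = 4 * left + 2 * state + right     # neighborhood code in 0..7
    table = (rule >> np.arange(8)) & 1      # rule's lookup table
    return table[code]

# Example: 200 steps of rule 110 from a random initial row
rng = np.random.default_rng(0)
rows = [rng.integers(0, 2, size=256)]
for _ in range(200):
    rows.append(eca_step(rows[-1], 110))
```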
Physical Review E, 1996
We introduce a measure of complexity in terms of the average number of bits per time unit necessary to specify the sequence generated by the system. In random dynamical systems, this indicator coincides with the rate K of divergence of nearby trajectories evolving under two different noise realizations. The meaning of K is discussed in the context of information theory, and it is shown that it can be determined from real experimental data. In the presence of strong dynamical intermittency, the value of K is very different from the standard Lyapunov exponent λσ, computed by considering two nearby trajectories evolving under the same randomness. However, the former is much more relevant than the latter from a physical point of view, as illustrated by numerical computations for noisy maps and sandpile models.
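A crude finite-time illustration of the distinction between λσ (same noise) and K (independent noise), assuming a noisy logistic map as the test system; the paper's definition of K is more careful than this simple estimator:

```python
import numpy as np

rng = np.random.default_rng(1)

def noisy_map(x, xi, a=3.8, sigma=1e-6):
    # logistic map with additive noise, folded back into [0, 1]
    return (a * x * (1.0 - x) + sigma * xi) % 1.0

def growth_rate(shared_noise, n=20, trials=2000, d0=1e-12):
    """Mean growth rate of ln(separation) between two close replicas."""
    acc = 0.0
    for _ in range(trials):
        x = rng.random()
        for _ in range(100):                      # relax transients
            x = noisy_map(x, rng.standard_normal())
        y = x + d0
        for _ in range(n):
            xi1 = rng.standard_normal()
            xi2 = xi1 if shared_noise else rng.standard_normal()
            x, y = noisy_map(x, xi1), noisy_map(y, xi2)
        acc += np.log(max(abs(x - y), 1e-300) / d0) / n
    return acc / trials

lam_sigma = growth_rate(shared_noise=True)    # same noise: Lyapunov exponent
K_est     = growth_rate(shared_noise=False)   # distinct noise realizations
```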
Unifying Themes in Complex Systems, 2008
The generalized Statistical Complexity Measure (SCM) is a functional that characterizes the probability distribution P associated with the time series generated by the dynamical system under study. It quantifies not only randomness but also the presence of correlational structures. In this seminar several fundamental issues are reviewed: a) selection of the information measure I; b) selection of the probability metric space and its corresponding distance D; c) definition of the generalized disequilibrium Q; d) selection of the probability distribution P associated with the dynamical system or time series under study, which is, in fact, a basic problem. Here we show that improvements can be expected if the underlying probability distribution is "extracted" with appropriate consideration of causal effects in the system's dynamics. Several well-known model-generated time series, usually regarded as being of either stochastic or chaotic nature, are analyzed. The main achievement of this approach is the possibility of clearly distinguishing between them in the Entropy-Complexity representation space, something that is rather difficult otherwise.
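In the Martin-Plastino-Rosso form this line of work usually adopts (an assumption as to the exact variant meant here), the SCM factorizes as

```latex
C[P] \;=\; \mathcal{Q}_{J}\!\left[P, P_{e}\right]\cdot \mathcal{H}_{S}[P],
\qquad \mathcal{H}_{S}[P] = \frac{S[P]}{S_{\max}},
```

with the disequilibrium Q_J given by the normalized Jensen-Shannon divergence between P and the uniform distribution P_e, and the Bandt-Pompe ordinal-pattern distribution serving as the "causal" choice of P advocated in the abstract.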
Journal of Atomic, Molecular, and Optical Physics, 2012
We suggest a new method for the analysis of experimental time series that can distinguish high-dimensional dynamics from stochastic motion. It is based on the idea of statistical complexity, i.e. the Shannon entropy of the so-called ε-machine (a Markov-type model of the observed time series). This approach has recently been demonstrated to be efficient for making a distinction between a molecular trajectory in water and noise. In this paper we analyse the difference between chaos and noise, using the Chirikov-Taylor Standard map as an example, in order to elucidate the basic mechanism that makes the value of complexity in deterministic systems high. In particular, we show that the value of statistical complexity is high for the case of chaos and attains a value of zero for the case of stochastic noise. We further study the Markov property of the data generated by the Standard map to clarify the role of long-time memory in differentiating the cases of deterministic systems and stochastic motion.
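The Chirikov-Taylor map itself is easy to reproduce; the sketch below generates a binary symbolization of a chaotic orbit, from which an ε-machine would then be reconstructed (e.g., with a CSSR-type algorithm, which is beyond this snippet). The partition at θ = π and the length-2 history proxy are illustrative assumptions, not the paper's exact procedure:

```python
import numpy as np

def standard_map_orbit(theta0, p0, K=5.0, n=100_000):
    """Chirikov-Taylor standard map on the 2*pi torus; returns the angle series."""
    theta, p = theta0, p0
    out = np.empty(n)
    for i in range(n):
        p = (p + K * np.sin(theta)) % (2 * np.pi)
        theta = (theta + p) % (2 * np.pi)
        out[i] = theta
    return out

orbit = standard_map_orbit(0.5, 0.5, K=5.0)   # strongly chaotic regime
symbols = (orbit > np.pi).astype(int)         # binary partition of the angle

# Crude proxy for the machine's state entropy: entropy over length-2 histories
pairs = 2 * symbols[:-1] + symbols[1:]
freq = np.bincount(pairs, minlength=4) / len(pairs)
H2 = -np.sum(freq[freq > 0] * np.log2(freq[freq > 0]))
```

For white noise all histories become equally uninformative about the future, so the reconstructed machine collapses to a single state and the statistical complexity vanishes, which is the mechanism the abstract highlights.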
Complexity, 2004
This work concerns the interaction between two classical problems: forecasting the dynamical behavior of elementary cellular automata (ECA) from their intrinsic mathematical laws, and the conditions that determine the emergence of complex dynamics. To approach these problems, and inspired by the theory of reversible logical gates, we decompose the ECA laws into a "spectrum" of dyadic Boolean gates. Emergent properties due to interactions are captured by generating another spectrum of logical gates. The combined analysis of both spectra shows the existence of characteristic biases in the distribution of Boolean gates for ECA belonging to different dynamical classes. These results suggest the existence of signatures capable of indicating the propensity to develop complex dynamics. The logical gates "exclusive-or" and "equivalence" are among these signatures of complexity. An important conclusion is that, within the ECA space, interactions cannot generate signatures of complexity when these signatures are absent from the intrinsic law of the automaton.
Encyclopedia of Complexity and Systems Science, 2009
Complexity is a multifaceted concept, related to the degree of organization of systems. Patterns of complex organization and behavior are identified in all kinds of systems in nature and technology. Essential for the characterization of complexity is its quantification, the introduction of complexity measures or descriptors, following Lord Kelvin's words that science begins when we can use numbers. Historically, the first attempt to quantify complexity was based on Shannon's information theory [1], and it involved the information content as a measure of molecular complexity [2]. Fifty years later, the complexity of molecules and their interactions is assessed by a variety of methods, with information theory preserving its leading role. This article aims to review the vast area of complexity measures, based on information theory as applied to chemical and biochemical systems. Many of these measures have found application for predicting physicochemical properties and biological activities of chemical compounds, thus contributing to the development of new drugs and chemical products. The expertise accumulated has recently found a new vast area of application, the networks of biomolecules performing the basic functions of life in cells and organisms. The essence of life itself has been reformulated to incorporate the processing of information as an essential component.
The European Physical Journal Special Topics, 2013
The coupling complexity index is an information measure introduced within the framework of ordinal symbolic dynamics. This index is used to characterize the complexity of the relationship between dynamical system components. In this work, we clarify the meaning of the coupling complexity by discussing in detail some cases leading to extreme values, and present examples using synthetic data to describe its properties. We also generalize the coupling complexity index to the multivariate case and derive a number of important properties by exploiting the structure of the symmetric group. The applicability of this index to the multivariate case is demonstrated with a real-world data example. Finally, we define the coupling complexity rate of random and deterministic time series. Some formal results about the multivariate coupling complexity index have been collected in an Appendix.
Journal of Computational Biology, 2014
Context dependence is central to the description of complexity. Keying on the pairwise definition of "set complexity," we use an information theory approach to formulate general measures of systems complexity. We examine the properties of multivariable dependency starting with the concept of interaction information. We then present a new measure for unbiased detection of multivariable dependency, "differential interaction information." This quantity for two variables reduces to the pairwise "set complexity" previously proposed as a context-dependent measure of information in biological systems. We generalize it here to an arbitrary number of variables. Critical limiting properties of the "differential interaction information" are key to the generalization. This measure extends previous ideas about biological information and provides a more sophisticated basis for the study of complexity. The properties of "differential interaction information" also suggest new approaches to data analysis. Given a data set of system measurements, differential interaction information can provide a measure of collective dependence, which can be represented in hypergraphs describing complex system interaction patterns. We investigate this kind of analysis using simulated data sets. The conjoining of a generalized set complexity measure, multivariable dependency analysis, and hypergraphs is our central result. While our focus is on complex biological systems, our results are applicable to any complex system.
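Interaction information, the starting point named in the abstract, has a standard three-variable form (McGill's convention; the paper's differential variant is not reproduced here). A small self-contained sketch:

```python
import numpy as np

def H(p, axes=None):
    """Shannon entropy (bits) of the joint array p, or of a marginal of it."""
    if axes is not None:
        p = p.sum(axis=axes)
    q = p[p > 0]
    return float(-np.sum(q * np.log2(q)))

def interaction_information(p):
    """I(X;Y;Z) = I(X;Y|Z) - I(X;Y) for a 3-D joint probability array p[x, y, z],
    expanded into entropies of the marginals."""
    Hx, Hy, Hz = H(p, (1, 2)), H(p, (0, 2)), H(p, (0, 1))
    Hxy, Hxz, Hyz = H(p, 2), H(p, 1), H(p, 0)
    return Hxy + Hxz + Hyz - Hx - Hy - Hz - H(p)

# XOR example: pairwise independent, purely triple-wise dependent
p = np.zeros((2, 2, 2))
for x in (0, 1):
    for y in (0, 1):
        p[x, y, x ^ y] = 0.25
print(interaction_information(p))   # 1.0 bit: synergy (sign conventions vary)
```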
2020
Recently, it has been argued that entropy can be a direct measure of complexity, where a smaller value of entropy indicates lower system complexity and a larger value indicates higher system complexity. We dispute this view and propose a universal measure of complexity that is based on Gell-Mann's view of complexity. Our universal measure of complexity is based on a non-linear transformation of time-dependent entropy, in which the system state with the highest complexity is the most distant from all the states of the system of lesser or no complexity. We show that the most complex state is the optimally mixed state, consisting of the most regular and the most disordered pure states that the space of states of a given system allows. A parsimonious paradigmatic example of the simplest system, with a small and a large number of degrees of freedom, is shown to support this methodology. Several important features of this universal measure are pointed out, especially its flexibili...
Entropy, 2015
Interdependencies of stochastically interacting units are usually quantified by the Kullback-Leibler divergence of a stationary joint probability distribution on the set of all configurations from the corresponding factorized distribution. This is a spatial approach which does not describe the intrinsically temporal aspects of interaction. In the present paper, the setting is extended to a dynamical version where temporal interdependencies are also captured by using information geometry of Markov chain manifolds.
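The "spatial" quantity in the first sentence is the multi-information; a small sketch of it (the paper's temporal extension over Markov-chain manifolds is more involved and is not reproduced here):

```python
import numpy as np

def multi_information(p):
    """KL divergence (bits) of the joint array p[x1, ..., xn] from the
    product of its single-variable marginals."""
    q = np.ones_like(p)
    for i in range(p.ndim):
        axes = tuple(j for j in range(p.ndim) if j != i)
        shape = [1] * p.ndim
        shape[i] = -1
        q = q * p.sum(axis=axes).reshape(shape)   # broadcast marginal product
    mask = p > 0
    return float(np.sum(p[mask] * np.log2(p[mask] / q[mask])))

# Two perfectly correlated bits: multi-information = 1 bit
p = np.array([[0.5, 0.0], [0.0, 0.5]])
print(multi_information(p))   # 1.0
```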
2013
Exact Complexity: The Spectral Decomposition of Intrinsic Computation (James Crutchfield, University of California, Davis). I'll present exact expressions for a wide family of complexity measures for hidden Markov processes. An eigen-decomposition using the Cauchy integral formula for operator-valued functions leads to closed-form expressions involving the full eigenvalue spectrum of a process's ε-machine causal-state dynamic. The measures include correlation functions, power spectra, past-future mutual information (excess entropy), transient and synchronization informations, and many others.
Local Complexity for Heterogeneous Spatial Systems (David Feldman, College of the Atlantic). I will begin with a quick review of the excess entropy, a well understood and broadly applicable measure of complexity that allows for a comparison of information processing abilities among very different systems. I will then present some relatively new work in which a local form of the two-dimensional...
Chaos, Solitons & Fractals, 1994
A number of different measures of complexity have been described, discussed, and applied to the logistic map. A classification of these measures has been proposed, distinguishing homogeneous and generating partitions in phase space as well as structural and dynamical elements of the considered measure. The specific capabilities of particular measures to detect particular types of behavior of dynamical systems have been investigated and compared with each other.