2002, Journal of Econometrics
The paper discusses Information and Entropy Econometrics (IEE), which integrates the principles of Information Theory and Maximum Entropy for statistical inference in economics. It traces the historical evolution of Maximum Entropy through contributions from Bernoulli, Bayes, and Shannon, and then surveys approaches to estimating probability distributions from partial information in economic data. The analysis highlights the importance of entropy measures for both discrete and continuous random variables.
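As a concrete illustration of estimating a probability distribution from limited information by maximizing entropy, the sketch below recovers die-face probabilities from a single moment constraint. The six-sided die, the target mean of 4.5, and the use of scipy's SLSQP solver are illustrative assumptions, not an example taken from the paper.

```python
import numpy as np
from scipy.optimize import minimize

# Toy maximum entropy inference: find the die-face probabilities that
# maximize Shannon entropy subject only to a known mean (illustrative values).
faces = np.arange(1, 7)
target_mean = 4.5

def neg_entropy(p):
    # Minimizing sum(p log p) maximizes the entropy -sum(p log p).
    return np.sum(p * np.log(p))

constraints = [
    {"type": "eq", "fun": lambda p: p.sum() - 1.0},            # probabilities sum to one
    {"type": "eq", "fun": lambda p: p @ faces - target_mean},  # mean constraint
]
p0 = np.full(6, 1 / 6)
res = minimize(neg_entropy, p0, bounds=[(1e-9, 1)] * 6,
               constraints=constraints, method="SLSQP")
print(np.round(res.x, 4))
```

The solution places exponentially increasing weight on the higher faces, the standard maximum entropy answer to this kind of moment-constrained problem.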
Journal of Quantitative Methods, 2019
Advances in Applied Mathematics, 1981
Friedman and Shimony exhibited an anomaly in Jaynes' maximum entropy prescription: that if a certain unknown parameter is assumed to be characterized a priori by a normalizable probability measure, then the prior and posterior probabilities computed by means of the prescription are consistent with probability theory only if this measure assigns probability 1 to a single value of the parameter and probability 0 to the entire range of other values. We strengthen this result by deriving the same conclusion using only the assumption that the probability measure is σ-finite. We also show that when the hypothesis and evidence to which the prescription is applied are expressed in certain rather simple languages, then the maximum entropy prescription yields probability evaluations in agreement with one of Carnap's λ-continuum of inductive methods, namely λ = ∞. We conclude that the maximum entropy prescription is correct only under special circumstances, which are essentially those in which it is appropriate to use λ = ∞.
2012
We will discuss the maximum entropy production (MaxEP) principle based on Jaynes' information theoretical arguments, as was done by Dewar (2003, 2005). With the help of a simple mathematical model of a non-equilibrium system, we will show how to derive minimum and maximum entropy production. Furthermore, the model will help us to clarify some confusing points and to see differences between some MaxEP studies in the literature.
Journal of Physics A: Mathematical and Theoretical, 2007
We will discuss the maximum entropy production (MaxEP) principle based on Jaynes' information theoretical arguments, as was done by Dewar (2003, 2005). With the help of a simple mathematical model of a non-equilibrium system, we will show how to derive minimum and maximum entropy production. Furthermore, the model will help us to clarify some confusing points and to see differences between some MaxEP studies in the literature.
Entropy
This article is about the profound misuses, misunderstandings, misinterpretations and misapplications of entropy, the Second Law of Thermodynamics and Information Theory. It is the story of the “Greatest Blunder Ever in the History of Science”. It is not about a single blunder admitted by a single person (e.g., Albert Einstein allegedly said, in connection with the cosmological constant, that this was his greatest blunder), but rather a blunder of gargantuan proportions whose claws have permeated all branches of science: thermodynamics, cosmology, biology, psychology, sociology and more.
Physical Review E
There are three ways to conceptualize entropy: entropy as an extensive thermodynamic quantity of physical systems (Clausius, Boltzmann, Gibbs), entropy as a measure of information production of ergodic sources (Shannon), and entropy as a means for statistical inference on multinomial Bernoulli processes (Jaynes' maximum entropy principle). Even though these notions are fundamentally different concepts, the functional form of the entropy for thermodynamic systems in equilibrium, for ergodic sources in information theory, and for independent sampling processes in statistical systems is degenerate, H(p) = −∑_i p_i log p_i. For many complex systems, which are typically history-dependent, non-ergodic and non-multinomial, this is no longer the case. Here we show that for such processes the three entropy concepts lead to different functional forms of entropy. We explicitly compute these entropy functionals for three concrete examples. For Pólya urn processes, which are simple self-reinforcing processes, the source information rate is S_IT = (1/(1−c)) (1/N) log N, the thermodynamical (extensive) entropy is the (c, d)-entropy, S_EXT = S_(c,0), and the entropy in the maxent principle (MEP) is S_MEP(p) = −∑_i log p_i. For sample space reducing (SSR) processes, which are simple path-dependent processes that are associated with power-law statistics, the information rate is S_IT = 1 + (1/2) log W, the extensive entropy is S_EXT = H(p), and the maxent result is S_MEP(p) = H(p/p_1) + H(1 − p/p_1). Finally, for multinomial mixture processes, the information rate is given by the conditional entropy H_f with respect to the mixing kernel f, the extensive entropy is given by H, and the MEP functional corresponds one-to-one to the logarithm of the mixing kernel.
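A small simulation can make the abstract's point tangible: for a self-reinforcing Pólya urn, the entropy of the next-draw (predictive) distribution drifts away from the log W value that an i.i.d. multinomial picture would suggest, so the usual identification of the different entropy notions breaks down. The urn parameters below are arbitrary illustrative choices; the script does not reproduce the paper's exact expressions for S_IT, S_EXT or S_MEP.

```python
import numpy as np

rng = np.random.default_rng(0)

def predictive_entropies(n_colors=10, reinforcement=5, n_steps=2000):
    """Simulate a Pólya urn and record the Shannon entropy (in nats) of the
    predictive distribution for the next draw after each step."""
    counts = np.ones(n_colors)              # start with one ball of each colour
    entropies = []
    for _ in range(n_steps):
        p = counts / counts.sum()           # predictive distribution for the next draw
        entropies.append(-np.sum(p * np.log(p)))
        drawn = rng.choice(n_colors, p=p)
        counts[drawn] += reinforcement      # self-reinforcement: add extra balls of the drawn colour
    return np.array(entropies)

ent = predictive_entropies()
print(f"entropy of first draw   : {ent[0]:.3f}  (log 10 = {np.log(10):.3f})")
print(f"entropy after 2000 draws: {ent[-1]:.3f}")
```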
C. Beisbart and S. Hartmann (eds), Probabilities in Physics. Oxford University Press, 115-142., 2011
Entropy, 2014
The routine definitions of Shannon entropy for discrete and continuous probability laws show inconsistencies that make them not mutually coherent. We propose a few possible modifications of these quantities so that (1) they no longer show these incongruities, and (2) they go over into one another in a suitable limit as the result of a renormalization. The properties of the new quantities would differ slightly from those of the usual entropies in a few other respects. PACS: 02.50.Cw, 05.45.Tp; MSC: 94A17, 54C70
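The mismatch the authors describe can be seen numerically: the Shannon entropy of a finely binned continuous law differs from its differential entropy by roughly −log Δ, where Δ is the bin width, so the discrete quantity diverges as Δ → 0. The Gaussian example, the bin width, and the truncation at ±10σ below are illustrative choices, not the renormalization the paper itself proposes.

```python
import numpy as np

def discrete_entropy_of_binned_gaussian(sigma=1.0, delta=0.01, span=10.0):
    """Shannon entropy (in nats) of an N(0, sigma^2) law discretised into bins of width delta."""
    edges = np.arange(-span * sigma, span * sigma + delta, delta)
    centers = 0.5 * (edges[:-1] + edges[1:])
    pdf = np.exp(-centers**2 / (2 * sigma**2)) / (sigma * np.sqrt(2 * np.pi))
    p = pdf * delta
    p = p[p > 0]
    p /= p.sum()                        # renormalise away the truncated tail mass
    return -np.sum(p * np.log(p))

sigma, delta = 1.0, 0.01
H_disc = discrete_entropy_of_binned_gaussian(sigma, delta)
h_diff = 0.5 * np.log(2 * np.pi * np.e * sigma**2)   # differential entropy of N(0, sigma^2)
print(f"discrete entropy     : {H_disc:.4f} nats")
print(f"differential entropy : {h_diff:.4f} nats")
print(f"difference           : {H_disc - h_diff:.4f}  (compare -log delta = {-np.log(delta):.4f})")
```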
Entropy, 2001
In its modern formulation, the Maximum Entropy Principle was promoted by E.T. Jaynes, starting in the mid-fifties. The principle dictates that one should look for a distribution, consistent with available information, which maximizes the entropy. However, this principle focuses only on distributions, and it appears advantageous to bring information theoretical thinking more prominently into play by also focusing on the "observer" and on coding. This view was brought forward by the second-named author in the late seventies and is the view we will follow up on here. It leads to the consideration of a certain game, the Code Length Game, and, via standard game theoretical thinking, to a principle of Game Theoretical Equilibrium. This principle is more basic than the Maximum Entropy Principle in the sense that the search for one type of optimal strategies in the Code Length Game translates directly into the search for distributions with maximum entropy. In the present paper we offer a self-contained and comprehensive treatment of the fundamentals of both principles mentioned, based on a study of the Code Length Game. Though new concepts and results are presented, the reading should be instructional and accessible to a rather wide audience, at least if certain mathematical details are left aside at a first reading. The most frequently studied instance of entropy maximization pertains to the Mean Energy Model, which involves a moment constraint related to a given function...
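For the Mean Energy Model the maximizing distribution has the familiar Gibbs form p_i ∝ exp(−β E_i), with β fixed by the moment constraint. The sketch below solves a toy instance numerically; the energy levels, the target mean, and the bracketing interval for the root finder are illustrative assumptions rather than an example taken from the paper.

```python
import numpy as np
from scipy.optimize import brentq

# Maximum entropy under a mean-energy constraint <E> = mu on a finite alphabet:
# the solution is the Gibbs distribution p_i ∝ exp(-beta * E_i).
E = np.array([0.0, 1.0, 2.0, 3.0, 5.0])   # illustrative energy levels
mu = 1.2                                   # illustrative target mean energy

def mean_energy(beta):
    w = np.exp(-beta * E)
    p = w / w.sum()
    return p @ E

beta = brentq(lambda b: mean_energy(b) - mu, -50, 50)   # solve <E>(beta) = mu
p = np.exp(-beta * E)
p /= p.sum()
print(f"beta = {beta:.4f}")
print("maxent distribution:", np.round(p, 4))
```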
Synthese 201, 121, 2023
The Maximum Entropy Production Principle (MEPP) stands out as an overarching principle that rules life phenomena in Nature. However, its explanatory power beyond heuristics remains controversial. On the one hand, the MEPP has been successfully applied principally to non-living systems far from thermodynamic equilibrium. On the other hand, the underlying assumptions to lay the MEPP's theoretical foundations and range of applicability increase the possibilities of conflicting interpretations. More interestingly, from a metaphysical stance, the MEPP's philosophical status is hotly debated: does the MEPP passively translate physical information into macroscopic predictions or actively select the physical solution in multistable systems, granting the connection between scientific models and reality? This paper deals directly with this dilemma by discussing natural determination from three angles: (1) Heuristics help natural philosophers to build an ontology. (2) The MEPP's ontological status may stem from its selection of new forms of causation beyond physicalism. (3) The MEPP's ontology ultimately depends on the much-discussed question of the ontology of probabilities in an information-theoretic approach and the ontology of macrostates according to the Boltzmannian definition of entropy.
Entropy
This paper is a review of a particular approach to the method of maximum entropy as a general framework for inference. The discussion emphasizes pragmatic elements in the derivation. An epistemic notion of information is defined in terms of its relation to the Bayesian beliefs of ideally rational agents. The method of updating from a prior to posterior probability distribution is designed through an eliminative induction process. The logarithmic relative entropy is singled out as a unique tool for updating (a) that is of universal applicability, (b) that recognizes the value of prior information, and (c) that recognizes the privileged role played by the notion of independence in science. The resulting framework—the ME method—can handle arbitrary priors and arbitrary constraints. It includes the MaxEnt and Bayes’ rules as special cases and, therefore, unifies entropic and Bayesian methods into a single general inference scheme. The ME method goes beyond the mere selection of a single...
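A minimal sketch of the updating step described here, assuming a discrete toy problem: starting from a non-uniform prior q and a single expectation constraint on f(x), the distribution closest to q in relative entropy is the exponentially tilted prior p_i ∝ q_i exp(λ f_i), with λ chosen so the constraint holds. The prior, the constraint function, and the target value below are placeholders, not taken from the paper.

```python
import numpy as np
from scipy.optimize import brentq

# Relative-entropy (ME) updating of a prior q under the constraint <f(x)> = F.
x = np.arange(5)
q = np.array([0.4, 0.3, 0.15, 0.1, 0.05])   # prior beliefs (illustrative)
f = x.astype(float)                          # constraint function f(x) = x
F = 2.0                                      # required expectation of f

def tilted_mean(lam):
    w = q * np.exp(lam * f)
    p = w / w.sum()
    return p @ f

lam = brentq(lambda l: tilted_mean(l) - F, -50, 50)   # solve <f>(lambda) = F
p = q * np.exp(lam * f)
p /= p.sum()
print(f"lambda = {lam:.4f}")
print("updated distribution:", np.round(p, 4))
print("check <f> =", round(float(p @ f), 4))
```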
Entropy
In the maximum entropy theory of ecology (METE), the form of a function describing the distribution of abundances over species and metabolic rates over individuals in an ecosystem is inferred using the maximum entropy inference procedure. Favretti shows that an alternative maximum entropy model exists that assumes the same prior knowledge and makes predictions that differ from METE's. He shows that both cannot be correct and asserts that his is the correct one because it can be derived from a classic microstate-counting calculation. I clarify here exactly what the core entities and definitions are for METE, and discuss the relevance of two critical issues raised by Favretti: the existence of a counting procedure for microstates and the choices of definition of the core elements of a theory. I emphasize that a theorist controls how the core entities of his or her theory are defined, and that nature is the final arbiter of the validity of a theory.
Entropy, 2020
The aim of this paper is to examine the role of thermodynamics, and in particular, entropy, for the development of economics within the last 150 years. The use of entropy has not only led to a significant increase in economic knowledge, but also to the emergence of such scientific disciplines as econophysics, complexity economics and quantum economics. Nowadays, an interesting phenomenon can be observed; namely, that rapid progress in economics is being made outside the mainstream. The first significant achievement was the emergence of entropy economics in the early 1970s, which introduced the second law of thermodynamics to considerations regarding production processes. In this way, not only was ecological economics born but also an entropy-based econometric approach developed. This paper shows that non-extensive cross-entropy econometrics is a valuable complement to traditional econometrics as it explains phenomena based on power-law probability distribution and enables econometri...
Physica A: Statistical Mechanics and its Applications, 2009
Econophysics is based on the premise that some ideas and methods from physics can be applied to economic situations. We intend to show in this paper how a physics concept such as entropy can be applied to an economic problem. In so doing, we demonstrate how information in the form of observable data and moment constraints is introduced into the method of Maximum relative Entropy (MrE). A general example of updating with data and moments is shown. Two specific econometric examples are solved in detail, which can then be used as templates for real-world problems. A numerical example is compared to a large deviation solution, which illustrates some of the advantages of the MrE method.
International Journal of Modern Physics B, 2010
A review is presented of the relation between information and entropy, focusing on two main issues: the similarity of the formal definitions of physical entropy, according to statistical mechanics, and of information, according to information theory; and the possible subjectivity of entropy considered as missing information. The paper updates the 1983 analysis of Shaw and Davis. The difference in the interpretations of information given respectively by Shannon and by Wiener, significant for the information sciences, receives particular consideration. Analysis of a range of material, from literary theory to thermodynamics, is used to draw out the issues. Emphasis is placed on recourse to the original sources, and on direct quotation, to attempt to overcome some of the misunderstandings and oversimplifications that have occurred with these topics. While it is strongly related to entropy, information is neither identical with it, nor its opposite. Information is related to order and pattern, but also to disorder and randomness. The relations between information and the "interesting complexity," which embodies both patterns and randomness, are worthy of attention.
The variational principles called maximum entropy (MaxEnt) and maximum caliber (MaxCal) are reviewed. MaxEnt originated in the statistical physics of Boltzmann and Gibbs, as a theoretical tool for predicting the equilibrium states of thermal systems. Later, entropy maximization was also applied to matters of information, signal transmission, and image reconstruction. Recently, since the work of Shore and Johnson, MaxEnt has been regarded as a principle that is broader than either physics or information alone. MaxEnt is a procedure that ensures that inferences drawn from stochastic data satisfy basic self-consistency requirements.
Entropy, 2021
Entropy is a concept that emerged in the 19th century. It used to be associated with heat harnessed by a thermal machine to perform work during the Industrial Revolution. However, there was an unprecedented scientific revolution in the 20th century due to one of its most essential innovations, i.e., the information theory, which also encompasses the concept of entropy. Therefore, the following question is naturally raised: “what is the difference, if any, between concepts of entropy in each field of knowledge?” There are misconceptions, as there have been multiple attempts to conciliate the entropy of thermodynamics with that of information theory. Entropy is most commonly defined as “disorder”, although it is not a good analogy since “order” is a subjective human concept, and “disorder” cannot always be obtained from entropy. Therefore, this paper presents a historical background on the evolution of the term “entropy”, and provides mathematical evidence and logical arguments regard...
2016
The routine definitions of Shannon entropy for discrete and continuous probability laws show inconsistencies that make them not mutually coherent. We propose a few possible modifications of these quantities so that (1) they no longer show these incongruities, and (2) they go over into one another in a suitable limit as the result of a renormalization. The properties of the new quantities would differ slightly from those of the usual entropies in a few other respects.