2012
We will discuss the maximum entropy production (MaxEP) principle based on Jaynes' information theoretical arguments, as was done by Dewar (2003, 2005). With the help of a simple mathematical model of a non-equilibrium system, we will show how to derive minimum and maximum entropy production. Furthermore, the model will help us to clarify some confusing points and to see differences between some MaxEP studies in the literature.
Journal of Physics A: Mathematical and Theoretical, 2007
We will discuss the maximum entropy production (MaxEP) principle based on Jaynes' information theoretical arguments, as was done by Dewar (2003, 2005). With the help of a simple mathematical model of a non-equilibrium system, we will show how to derive minimum and maximum entropy production. Furthermore, the model will help us to clarify some confusing points and to see differences between some MaxEP studies in the literature.
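For orientation (generic textbook bookkeeping, not an equation taken from either paper): near equilibrium, the entropy production rate is a sum of fluxes $J_i$ times their conjugate thermodynamic forces $X_i$,

$$ \sigma \;=\; \sum_i J_i X_i \;\geq\; 0, $$

and MaxEP-type arguments select, among the flux configurations compatible with the imposed constraints, the one that maximizes $\sigma$. For example, for steady heat conduction between reservoirs at temperatures $T_1 > T_2$, a single heat flux $J$ pairs with the force $X = 1/T_2 - 1/T_1$.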
2016
The asymptotic convergence of the probability density function (pdf) and the convergence of differential entropy are examined for non-stationary processes that follow the maximum entropy principle (MaxEnt) and the maximum entropy production principle (MEPP). Asymptotic convergence of the pdf provides a new justification of the MEPP, while convergence of the differential entropy is important in the asymptotic analysis of communication systems. A set of equations describing the dynamics of the pdf under mass-conservation and energy-conservation constraints is derived. It is shown that for pdfs with compact support the limit pdf is unique and can be obtained from Jaynes' MaxEnt principle.
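A minimal sketch of the kind of convergence at issue, under assumptions of my own (a finite-state analogue, not the paper's equations): a doubly stochastic, strictly positive Markov transition matrix conserves total probability and drives any initial pdf toward the uniform distribution, the MaxEnt limit under a normalization-only constraint, with Shannon entropy non-decreasing at every step.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 8

# Doubly stochastic matrix: convex mix of permutation matrices
# (Birkhoff-von Neumann), blended with the uniform matrix so the
# chain is irreducible and aperiodic.
perms = sum(w * np.eye(n)[rng.permutation(n)]
            for w in rng.dirichlet(np.ones(5)))
M = 0.1 * np.full((n, n), 1.0 / n) + 0.9 * perms

def shannon(p):
    q = p[p > 0]
    return -np.sum(q * np.log(q))

p = rng.dirichlet(np.ones(n))          # arbitrary initial pdf
for _ in range(300):
    h = shannon(p)
    p = p @ M                          # probability-conserving update
    assert shannon(p) >= h - 1e-12     # entropy never decreases

print(p)                               # ~ uniform, each entry ~ 1/n
print(shannon(p), np.log(n))           # entropy ~ log n, the MaxEnt value
```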
Entropy, 2001
In its modern formulation, the Maximum Entropy Principle was promoted by E.T. Jaynes, starting in the mid-fifties. The principle dictates that one should look for a distribution, consistent with available information, which maximizes the entropy. However, this principle focuses only on distributions, and it appears advantageous to bring information-theoretical thinking more prominently into play by also focusing on the "observer" and on coding. This view was brought forward by the second-named author in the late seventies and is the view we follow up on here. It leads to the consideration of a certain game, the Code Length Game, and, via standard game-theoretical thinking, to a principle of Game Theoretical Equilibrium. This principle is more basic than the Maximum Entropy Principle in the sense that the search for one type of optimal strategy in the Code Length Game translates directly into the search for distributions with maximum entropy. In the present paper we offer a self-contained and comprehensive treatment of the fundamentals of both principles, based on a study of the Code Length Game. Though new concepts and results are presented, the reading should be instructional and accessible to a rather wide audience, at least if certain mathematical details are left aside at a first reading. The most frequently studied instance of entropy maximization pertains to the Mean Energy Model, which involves a moment constraint related to a given function.
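A minimal sketch of the Mean Energy Model referred to above, with generic symbols and made-up numbers rather than the paper's notation: maximize $H(p) = -\sum_i p_i \log p_i$ over finite distributions subject to a moment constraint $\sum_i p_i E_i = \text{target}$. The maximizer has the Gibbs form $p_i \propto e^{-\lambda E_i}$, with the multiplier $\lambda$ tuned to hit the target.

```python
import numpy as np
from scipy.optimize import brentq

E = np.array([0.0, 1.0, 2.0, 5.0])    # hypothetical "energy" values
target = 1.5                          # hypothetical constrained mean

def mean_energy(lam):
    logw = -lam * E
    w = np.exp(logw - logw.max())     # numerically stabilized weights
    return (w @ E) / w.sum()

# mean_energy is monotone in lam, so a bracketing root-finder suffices.
lam = brentq(lambda l: mean_energy(l) - target, -50.0, 50.0)
logw = -lam * E
p = np.exp(logw - logw.max())
p /= p.sum()
print(lam, p, p @ E)                  # p @ E ~= target
```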
Physical Review E
There are three ways to conceptualize entropy: entropy as an extensive thermodynamic quantity of physical systems (Clausius, Boltzmann, Gibbs), entropy as a measure of the information production of ergodic sources (Shannon), and entropy as a means for statistical inference on multinomial Bernoulli processes (Jaynes' maximum entropy principle). Even though these notions are fundamentally different concepts, the functional form of the entropy for thermodynamic systems in equilibrium, for ergodic sources in information theory, and for independent sampling processes in statistical systems is degenerate, $H(p) = -\sum_i p_i \log p_i$. For many complex systems, which are typically history-dependent, non-ergodic and non-multinomial, this is no longer the case. Here we show that for such processes the three entropy concepts lead to different functional forms of entropy. We explicitly compute these entropy functionals for three concrete examples. For Pólya urn processes, which are simple self-reinforcing processes, the source information rate is $S_{\mathrm{IT}} = \frac{1}{1-c}\frac{1}{N}\log N$, the thermodynamic (extensive) entropy is the $(c,d)$-entropy, $S_{\mathrm{EXT}} = S_{(c,0)}$, and the entropy in the maxent principle (MEP) is $S_{\mathrm{MEP}}(p) = -\sum_i \log p_i$. For sample-space-reducing (SSR) processes, which are simple path-dependent processes associated with power-law statistics, the information rate is $S_{\mathrm{IT}} = 1 + \frac{1}{2}\log W$, the extensive entropy is $S_{\mathrm{EXT}} = H(p)$, and the maxent result is $S_{\mathrm{MEP}}(p) = H(p/p_1) + H(1 - p/p_1)$. Finally, for multinomial mixture processes, the information rate is given by the conditional entropy $H_f$ with respect to the mixing kernel $f$, the extensive entropy is given by $H$, and the MEP functional corresponds one-to-one to the logarithm of the mixing kernel.
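An illustrative simulation of the history dependence mentioned above (a standard one-ball-reinforcement Pólya urn; an assumed setup, not code from the paper): unlike a multinomial process, each run's red-ball fraction converges to its own random limit. Starting from one red and one blue ball, the limiting fraction is uniform on (0, 1), so the spread across runs does not shrink with the number of draws.

```python
import numpy as np

rng = np.random.default_rng(1)
runs, steps = 1000, 2000
finals = np.empty(runs)
for r in range(runs):
    red, total = 1, 2
    for u in rng.random(steps):
        red += u < red / total        # draw; add a ball of the drawn color
        total += 1
    finals[r] = red / total

# Uniform limit: mean ~ 0.5, std ~ 1/sqrt(12) ~ 0.289, however long the run.
print(finals.mean(), finals.std())
```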
arXiv preprint arXiv:1011.3989, 2010
This paper develops an analytical and rigorous formulation of the maximum entropy generation principle. The result is suggested as the Fourth Law of Thermodynamics.
Entropy
The term entropy is used with different meanings in different contexts, sometimes in contradictory ways, resulting in misunderstandings and confusion. The root cause of the problem is the close resemblance between the defining mathematical expressions of entropy in statistical thermodynamics and of information in the communications field, also called entropy, which differ only by a constant factor, with the unit 'J/K' in thermodynamics and 'bits' in information theory. The thermodynamic property entropy is closely associated with the physical quantities of thermal energy and temperature, while the entropy used in the communications field is a mathematical abstraction based on probabilities of messages. The terms information and entropy are often used interchangeably in several branches of science. This practice gives rise to the phrase conservation of entropy in the sense of conservation of information, which contradicts the fundamental increase-of-entropy principle in thermodynamics.
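The constant factor mentioned above can be made explicit (standard dimensional bookkeeping, not a claim from the abstract): with information entropy measured in bits, the two expressions differ only by Boltzmann's constant and a change of logarithm base,

$$ S \;=\; k_B \ln 2 \cdot H_{\text{bits}}, \qquad k_B \ln 2 \approx 9.57 \times 10^{-24}\ \mathrm{J/K}\ \text{per bit}, $$

with $k_B \approx 1.380649 \times 10^{-23}\ \mathrm{J/K}$. The conversion relates the units only; the abstract's point is that equating the two concepts outright invites the contradictions it describes.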
A variational principle in which the Lagrange multipliers of a trial distribution are used as variational parameters is discussed as an efficient, practical route to the determination of the distribution of maximal entropy. The theoretical reason is that only linearly independent constraints provide a basis for the unique expansion of $\ln p$.
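A minimal sketch of the variational route described above, with assumed generic notation (not the paper's code): treat the Lagrange multipliers $\lambda$ as the variational parameters and minimize the convex dual $\Gamma(\lambda) = \log Z(\lambda) + \lambda \cdot a$, where $Z(\lambda) = \sum_i \exp(-\lambda \cdot A_{\cdot i})$ and $a$ holds the target moments. At the minimum, $p_i = \exp(-\lambda \cdot A_{\cdot i})/Z$ reproduces the moments; linear independence of the constraint functions is what makes the expansion of $\ln p$ in them unique.

```python
import numpy as np
from scipy.optimize import minimize

A = np.array([[0.0, 1.0, 2.0, 3.0],   # hypothetical constraint functions,
              [1.0, 0.0, 0.0, 1.0]])  # A[r, i] = A_r evaluated at state i
a = np.array([1.2, 0.4])              # hypothetical target moments <A_r>

def dual(lam):
    # Gamma(lam) = log Z(lam) + lam . a, computed stably via log-sum-exp.
    logw = -lam @ A
    m = logw.max()
    return m + np.log(np.exp(logw - m).sum()) + lam @ a

lam = minimize(dual, x0=np.zeros(2), method="BFGS").x
logw = -lam @ A
p = np.exp(logw - logw.max())
p /= p.sum()
print(p, A @ p)                       # A @ p ~= a at the optimum
```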