Frontiers in Entropy Across the Disciplines
This chapter deals with our recent attempt to extend the notion of equilibrium (EQ) entropy to nonequilibrium (NEQ) systems so that it can also capture memory effects. This is done by enlarging the equilibrium state space S to S′ by introducing internal variables, which capture the irreversibility due to internal processes. By a proper choice of the enlarged state space S′, the entropy becomes a state function that shares many properties of the EQ entropy, except for a nonzero irreversible entropy generation. We give both a thermodynamic and a statistical extension of the entropy and prove their equivalence in all cases by taking an appropriate S′. This provides a general nonnegative statistical expression for the entropy in any situation, which we use to prove the second law. We give several examples showing how the required internal variables are determined, and apply the approach to cases of interest to calculate the entropy generation. We also offer a possible explanation for why the entropy of the classical continuum 1-d Tonks gas can become negative, by considering a lattice model for which the entropy is always nonnegative.
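As a toy numerical illustration of the statistical expression mentioned above (not the authors' actual construction), the following sketch evaluates the Gibbs form S = −Σ_i p_i ln p_i on an enlarged set of microstates labelled by a hypothetical two-valued internal variable, and shows that relaxing the internal constraint can only raise this nonnegative entropy; the microstate counts are arbitrary.

```python
import math

def statistical_entropy(p):
    """Nonnegative Gibbs/statistical entropy -sum_i p_i ln p_i (k_B = 1)."""
    return -sum(q * math.log(q) for q in p if q > 0)

# Hypothetical enlarged state space: an internal variable xi = A or B labels two
# groups of microstates (counts chosen arbitrarily for illustration).
n_A, n_B = 40, 60

# Internal variable frozen at xi = A: uniform distribution over the A-microstates.
p_constrained = [1 / n_A] * n_A + [0.0] * n_B

# Internal constraint relaxed: uniform distribution over all accessible microstates.
p_relaxed = [1 / (n_A + n_B)] * (n_A + n_B)

S_constrained = statistical_entropy(p_constrained)   # ln n_A
S_relaxed = statistical_entropy(p_relaxed)            # ln(n_A + n_B)

# The difference plays the role of the irreversible entropy generated when the
# internal variable is allowed to relax; it is nonnegative by construction.
print(f"S_constrained = {S_constrained:.4f}, S_relaxed = {S_relaxed:.4f}, "
      f"Delta_i S = {S_relaxed - S_constrained:.4f}")
```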
Journal of Statistical Physics, 1989
A definition originally proposed by H. S. Green is used to calculate the entropy of nonequilibrium steady states. This definition provides a well-defined coarse graining of the entropy. Although the dimension of the phase space accessible to nonequilibrium steady states is less than the ostensible dimension of that space, the Green entropy is computed from within the accessible phase space, thereby avoiding the divergences inherent in the fine-grained entropy. It is shown that the Green entropy is a maximum at equilibrium and that away from equilibrium, the thermodynamic temperature computed from the Green entropy is different from the kinetic temperature.
Entropy, 2015
We review the concepts of nonequilibrium thermodynamic entropy and of observables and internal variables as state variables, introduced recently by us, and, treating thermodynamics as an experimental science, provide a simple first-principles derivation of the additive statistical entropy applicable to all nonequilibrium states. We establish their numerical equivalence in several cases, including the most important case, when the thermodynamic entropy is a state function. We discuss various interesting aspects of the two entropies and show that the number of microstates in the Boltzmann entropy includes all microstates of nonzero probability, even if the system is trapped in a disjoint component of the microstate space. We show that a negative thermodynamic entropy can appear from a nonnegative statistical entropy.
IMA Journal of Applied Mathematics, 2004
We propose a new way of defining the entropy of a system, which gives a general form that may be nonextensive, like the Tsallis entropy, but is linearly dependent on component entropies, like the Rényi entropy, which is extensive. This entropy has a conceptually novel but simple origin and is mathematically easy to define by a very simple expression, though the probability distribution...
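For orientation, here is a brief sketch of the two entropies named above, using their standard definitions rather than the authors' new one: the Tsallis entropy S_q = (1 − Σ_i p_i^q)/(q − 1), which is only pseudo-additive for independent subsystems, and the Rényi entropy H_q = ln(Σ_i p_i^q)/(1 − q), which is additive; the distributions are arbitrary examples.

```python
import numpy as np

def tsallis(p, q):
    """Tsallis entropy S_q = (1 - sum_i p_i^q) / (q - 1)."""
    return (1.0 - np.sum(p ** q)) / (q - 1.0)

def renyi(p, q):
    """Renyi entropy H_q = ln(sum_i p_i^q) / (1 - q)."""
    return np.log(np.sum(p ** q)) / (1.0 - q)

q = 1.5
pA = np.array([0.6, 0.3, 0.1])          # arbitrary example distributions
pB = np.array([0.5, 0.5])
pAB = np.outer(pA, pB).ravel()          # joint distribution of independent subsystems

# Renyi is additive for independent subsystems ...
print(renyi(pAB, q), renyi(pA, q) + renyi(pB, q))

# ... while Tsallis obeys the pseudo-additivity rule
# S_q(A+B) = S_q(A) + S_q(B) + (1 - q) S_q(A) S_q(B).
lhs = tsallis(pAB, q)
rhs = tsallis(pA, q) + tsallis(pB, q) + (1 - q) * tsallis(pA, q) * tsallis(pB, q)
print(lhs, rhs)
```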
Journal of Statistical Physics, 1981
It is proposed to define entropy for nonequilibrium ensembles using a method of coarse graining which partitions phase space into sets which typically have zero measure. These are chosen by considering the totality of future possibilities for observation on the system. It is shown that this entropy is necessarily a nondecreasing function of the time t. There is no contradiction with the reversibility of the laws of motion because this method of coarse graining is asymmetric under time reversal. Under suitable conditions (which are stated explicitly) this entropy approaches the equilibrium entropy as t → +∞ and the fine-grained entropy as t → −∞. In particular, the conditions can always be satisfied if the system is a K-system, as in the Sinai billiard models. Some theorems are given which give information about whether it is possible to generate the partition used here for coarse graining from time translates of a finite partition, and at the same time elucidate the connection between our concept of entropy and the entropy invariant of Kolmogorov and Sinai.
Entropy, 2014
We present the state of the art on the modern mathematical methods of exploiting the entropy principle in the thermomechanics of continuous media. Recent results and conceptual discussions of this topic in several well-known non-equilibrium theories (classical irreversible thermodynamics, CIT; rational thermodynamics, RT; thermodynamics of irreversible processes, TIP; extended irreversible thermodynamics, EIT; rational extended thermodynamics, RET) are also surveyed.
International Journal of Thermophysics, 1993
The objective of this paper is twofold: first, to examine how the concepts of extended irreversible thermodynamics are related to the notion of accompanying equilibrium state introduced by Kestin; second, to compare the behavior of the classical local-equilibrium entropy and of the entropy used in extended irreversible thermodynamics. Whereas the former does not show a monotonic increase, the latter exhibits a steady increase during the heat-transfer process; it is therefore better suited to describe the approach to equilibrium in the presence of thermal waves.
Physical Review E, 2009
The total entropy-production fluctuations are studied in some exactly solvable models. For these systems, the detailed fluctuation theorem holds even in the transient state, provided the system is initially prepared in thermal equilibrium. The nature of entropy production during the relaxation of a system to equilibrium is analyzed. The entropy production averaged over a finite time interval gives a better bound for the average work performed on the system than that obtained from the well-known Jarzynski equality. Moreover, the average entropy production is discussed as an information-theoretic quantifier of irreversibility for finite-time nonequilibrium processes.
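As a minimal companion example (not one of the paper's solvable models), the sketch below checks the Jarzynski equality ⟨e^{−βW}⟩ = e^{−βΔF} and the resulting bound ⟨W⟩ ≥ ΔF for a sudden quench of a harmonic trap's stiffness from k0 to k1, for which ΔF = ln(k1/k0)/(2β) is known exactly; all parameter values are arbitrary.

```python
import numpy as np

rng = np.random.default_rng(1)

beta, k0, k1 = 1.0, 1.0, 4.0          # arbitrary parameters for the sketch
n_samples = 200_000

# Initial positions sampled from equilibrium in the k0 trap; the stiffness is then
# suddenly switched to k1, so the work done on the particle is W = (k1 - k0) x^2 / 2.
x = rng.normal(0.0, 1.0 / np.sqrt(beta * k0), size=n_samples)
W = 0.5 * (k1 - k0) * x ** 2

dF = 0.5 * np.log(k1 / k0) / beta     # exact free-energy difference of the two traps

print("Jarzynski:  <exp(-beta W)> =", np.mean(np.exp(-beta * W)),
      "  exp(-beta dF) =", np.exp(-beta * dF))
print("Second-law bound:  <W> =", W.mean(), "  >=  dF =", dF)
```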
viXra, 2018
Following worries that the entropy functions of classical thermodynamics and statistical thermodynamics were not equivalent, attention is drawn here to work by Lazar Mayants indicating that such worries are unfounded and that the two are, in fact, equivalent.
Physical Review E, 2010
Recently, Kawai, Parrondo, and Van den Broeck have related dissipation to time-reversal asymmetry. We generalize their result by considering a protocol in which the physical system is driven from an initial thermal equilibrium state at inverse temperature β_0 to a final thermal equilibrium state at a different temperature. We illustrate the result using a model with an exact solution: a particle in a moving one-dimensional harmonic well.
Physical Review E, 2015
Stochastic thermodynamics and the associated fluctuation relations provide means to extend the fundamental laws of thermodynamics to small scales and systems out of equilibrium. The fluctuating thermodynamic variables are usually treated in the context of isolated Hamiltonian evolution, or Markovian dynamics in open systems. In this work we introduce an explicitly non-Markovian model of dynamics of an open system, where the correlations between the system and the environment drive a subset of the environment outside of equilibrium. This allows us to identify the non-Markovian sources of entropy production. We show that the non-Markovian components lead to modifications in the standard fluctuation relations for entropy. As a concrete example, we explicitly derive such modified fluctuation relations for the case of an overheated single electron box.
Journal of Statistical Mechanics: Theory and Experiment, 2012
In nature, stationary nonequilibrium systems cannot exist on their own; rather, they need to be driven from outside in order to keep them away from equilibrium. While the internal mean entropy of such stationary systems is constant, the external drive will on average increase the entropy in the environment. This external entropy production is usually quantified by a simple formula, stating that each microscopic transition of the system between two configurations c → c′ with rate w_{c→c′} changes the entropy in the environment by ΔS_env = ln w_{c→c′} − ln w_{c′→c}. According to this formula, irreversible transitions c → c′ with a vanishing backward rate w_{c′→c} = 0 would produce an infinite amount of entropy. However, in experiments designed to mimic such processes, a divergent entropy production, which would cause an infinite increase of heat in the environment, is not seen. The reason is that in an experimental realization the backward process can be suppressed but its rate always remains slightly positive, resulting in a finite entropy production. The paper discusses how this entropy production can be estimated and specifies a lower bound depending on the observation time.
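The formula quoted above lends itself to a direct simulation. The sketch below uses a hypothetical three-state Markov jump model (the rates are arbitrary) in which one backward rate is tiny rather than zero; the accumulated ΔS_env = Σ_jumps [ln w_{c→c′} − ln w_{c′→c}] then stays finite, each jump contributing at most the log-ratio set by that smallest backward rate.

```python
import math
import random

random.seed(0)

# Toy 3-state model (all numbers are arbitrary assumptions for illustration).
# rates[c][c2] is the transition rate w_{c -> c2}; the backward rate w_{2 -> 1}
# is not zero but tiny, mimicking a "nearly irreversible" transition.
rates = {
    0: {1: 2.0, 2: 0.5},
    1: {0: 1.0, 2: 1.5},
    2: {0: 0.2, 1: 1e-6},
}

def environmental_entropy(n_jumps, start=0):
    """Simulate a jump trajectory and accumulate Delta S_env along it."""
    c, delta_s = start, 0.0
    for _ in range(n_jumps):
        targets = list(rates[c])
        weights = [rates[c][t] for t in targets]
        nxt = random.choices(targets, weights=weights)[0]
        # contribution of the jump c -> nxt (dwell times do not enter this formula)
        delta_s += math.log(rates[c][nxt] / rates[nxt][c])
        c = nxt
    return delta_s

n = 10_000
print(f"entropy produced in the environment over {n} jumps: {environmental_entropy(n):.1f}")
print(f"largest possible per-jump contribution here: ln(1.5 / 1e-6) = {math.log(1.5 / 1e-6):.1f}")
```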
arXiv: Mathematical Physics, 2014
From a new rigorous formulation of the general axiomatic foundations of thermodynamics we derive an operational definition of entropy that responds to the emergent need, in many technological frameworks, to understand and deploy thermodynamic entropy well beyond the traditional realm of equilibrium states of macroscopic systems. The new definition is achieved by avoiding recourse to the traditional concepts of "heat" (which restricts a priori the traditional definitions of entropy to the equilibrium domain) and of "thermal reservoir" (which restricts in practice our previous definitions of non-equilibrium entropy to the many-particle domain). The measurement procedure that defines entropy is free from intrinsic limitations and can be applied, in principle, even to non-equilibrium states of few-particle systems, provided they are separable and uncorrelated. The construction starts from a previously developed set of carefully worded operational definitions...
Stochastic Dynamics Out of Equilibrium, 2019
Thermodynamics makes definite predictions about the thermal behavior of macroscopic systems in and out of equilibrium. Statistical mechanics aims to derive this behavior from the dynamics and statistics of the atoms and molecules making up these systems. A key element in this derivation is the large number of microscopic degrees of freedom of macroscopic systems. Therefore, the extension of thermodynamic concepts, such as entropy, to small (nano) systems raises many questions. Here we shall reexamine various definitions of entropy for nonequilibrium systems, large and small. These include thermodynamic (hydrodynamic), Boltzmann, and Gibbs-Shannon entropies. We shall argue that, despite its common use, the last is not an appropriate physical entropy for such systems, either isolated or in contact with thermal reservoirs: physical entropies should depend on the microstate of the system, not on a subjective probability distribution. To square this point of view with experimental results of Bechhoefer we shall argue that the Gibbs-Shannon entropy of a nano particle in a thermal fluid should be interpreted as the Boltzmann entropy of a dilute gas of Brownian particles in the fluid.
Nature, 2011
Landauer's erasure principle exposes an intrinsic relation between thermodynamics and information theory: the erasure of information stored in a system, S, requires an amount of work proportional to the entropy of that system. This entropy, H(S|O), depends on the information that a given observer, O, has about S, and the work necessary to erase a system may therefore vary for different observers. Here, we consider a general setting where the information held by the observer may be quantum-mechanical, and show that an amount of work proportional to H(S|O) is still sufficient to erase S. Since the entropy H(S|O) can now become negative, erasing a system can result in a net gain of work (and a corresponding cooling of the environment).
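A short sketch (a standard textbook case, not the paper's general argument) of how H(S|O) can be negative when the observer holds quantum information: for a maximally entangled pair, the joint state is pure while the observer's marginal is maximally mixed, so H(S|O) = H(SO) − H(O) = −1 bit.

```python
import numpy as np

def von_neumann_entropy_bits(rho):
    """Von Neumann entropy -Tr[rho log2 rho], computed from the eigenvalues of rho."""
    ev = np.linalg.eigvalsh(rho)
    ev = ev[ev > 1e-12]
    return float(-np.sum(ev * np.log2(ev)))

# Maximally entangled Bell state |Phi+> = (|00> + |11>)/sqrt(2) of the system S
# and the observer's quantum memory O.
phi = np.zeros(4)
phi[0] = phi[3] = 1.0 / np.sqrt(2.0)
rho_SO = np.outer(phi, phi)

# Observer's marginal: partial trace of rho_SO over S (indices ordered s, o, s', o').
rho_O = np.einsum('abad->bd', rho_SO.reshape(2, 2, 2, 2))

H_SO = von_neumann_entropy_bits(rho_SO)   # 0 bits: the joint state is pure
H_O = von_neumann_entropy_bits(rho_O)     # 1 bit: maximally mixed marginal
print("H(S|O) =", H_SO - H_O, "bits")     # -1 bit: negative conditional entropy
```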
Brazilian Journal of Physics, 1998
We consider the question of the existence of a generalized H-theorem in the context of the variational method in the information-theoretical approach that generates the nonequilibrium statistical operator formalism. After briefly reviewing how the latter provides mechano-statistical foundations for phenomenological irreversible thermodynamics, the so-called Informational Statistical Thermodynamics, we discuss how dissipative phenomena are accounted for by the procedure. Such effects are related to a generalized H-theorem and a weak criterion of positive entropy production. These results are a consequence of the definition of a coarse-grained statistical entropy, resulting from the projection of the full nonequilibrium distribution function on the subspace of the slowly relaxing dynamical quantities that are appropriate for the description of the irreversible evolution of the system from the state of initial preparation.
Qualitative Theory of Dynamical Systems, 2021
In Nonequilibrium Thermodynamics and Information Theory, the relative entropy (or KL divergence) plays a very important role. Consider a Hölder Jacobian J and the Ruelle (transfer) operator L_{log J}. Two equilibrium probabilities μ_1 and μ_2 can interact via a discrete-time Thermodynamic Operation given by the action of the dual of the Ruelle operator, L*_{log J}. We argue that the law μ → L*_{log J}(μ), producing nonequilibrium, can be seen as a Thermodynamic Operation, after showing that it is a manifestation of the Second Law of Thermodynamics. We also show that the change of relative entropy satisfies...
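As a rough finite-state analogue of the action μ → L*_{log J}(μ) (an assumption for illustration, not the authors' setting), the sketch below checks that the relative entropy between two probability vectors cannot increase under a column-stochastic matrix, which is the monotonicity property behind the second-law reading; the matrix and vectors are random examples.

```python
import numpy as np

rng = np.random.default_rng(0)

def relative_entropy(mu1, mu2):
    """KL divergence D(mu1 || mu2) for strictly positive probability vectors."""
    return float(np.sum(mu1 * np.log(mu1 / mu2)))

n = 5
# Random column-stochastic matrix: columns sum to 1, so P maps probability
# vectors to probability vectors (a crude finite-state stand-in for L*_{log J}).
P = rng.random((n, n))
P /= P.sum(axis=0, keepdims=True)

mu1 = rng.random(n); mu1 /= mu1.sum()
mu2 = rng.random(n); mu2 /= mu2.sum()

before = relative_entropy(mu1, mu2)
after = relative_entropy(P @ mu1, P @ mu2)   # data-processing: never exceeds 'before'
print(f"D(mu1||mu2) = {before:.4f}  >=  D(P mu1||P mu2) = {after:.4f}")
```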
The status of heat and work in nonequilibrium thermodynamics is quite confusing and nonunique at present, with conflicting interpretations even after a long history of the first law dE(t) = d_eQ(t) − d_eW(t) in terms of exchange heat and work, and is far from settled. Moreover, the exchange quantities lack a certain symmetry (see text). Generalizing the traditional concept to also include their time-dependent irreversible components d_iQ(t) and d_iW(t) allows us to express the first law in a symmetric form dE(t) = dQ(t) − dW(t), in which the heat dQ(t) and the work dW(t) appear on equal footing and possess the symmetry. We prove that d_iQ(t) ≡ d_iW(t); as a consequence, irreversible work turns into irreversible heat. Statistical analysis in terms of microstate probabilities p_i(t) uniquely identifies dW(t) as the isentropic and dQ(t) as the isometric (see text) change in dE(t), a result known in equilibrium. We show that such a clear separation does not occur for d_eQ(t) and d_eW(t). Hence, our new formulation of the first law provides tremendous advantages and results in an extremely useful formulation of non-equilibrium thermodynamics, as we have shown recently [Phys. Rev. E 81, 051130 (2010); ibid. 85, 041128 and 041129 (2012)]. We prove that an adiabatic process does not alter p_i(t). All these results remain valid no matter how far the system is out of equilibrium. When the system is in internal equilibrium, dQ(t) ≡ T(t)dS(t) in terms of the instantaneous temperature T(t) of the system, which is reminiscent of equilibrium, even though neither d_eQ(t) ≡ T(t)d_eS(t) nor d_iQ(t) ≡ T(t)d_iS(t). Indeed, d_iQ(t) and d_iS(t) have very different physics. We express these quantities in terms of d_ep_i(t) and d_ip_i(t), and demonstrate that p_i(t) has a form very different from that in equilibrium. The first and second laws are no longer independent, so we need only one law, which is again reminiscent of equilibrium. The traditional formulas, such as the Clausius inequality ∮ d_eQ(t)/T_0 < 0 and Δ_eW < −Δ[E(t) − T_0S(t)], become equalities ∮ dQ(t)/T(t) ≡ 0 and ΔW = −Δ[E(t) − T(t)S(t)], a quite remarkable but unexpected result in view of the fact that Δ_iS(t) > 0. We identify the uncompensated transformation N(t, τ) during a cycle. We determine the irreversible components in two simple cases to show the usefulness of our approach; here, the traditional formulation is of no use. Our extension brings about a very strong parallel between equilibrium and non-equilibrium thermodynamics, except that one has irreversible entropy generation d_iS(t) > 0 in the latter.
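A hedged numerical sketch of the statistical identification described above, on a hypothetical two-level system (not the paper's own example): writing E(t) = Σ_i p_i(t)E_i(t), a small step that changes both the E_i and the p_i splits into dQ = Σ_i E_i dp_i (probabilities change at fixed energies) and dW = −Σ_i p_i dE_i (energies change at fixed probabilities), so that dE = dQ − dW to first order; the step sizes are arbitrary.

```python
import numpy as np

# Hypothetical two-level toy system: microstate energies E_i and probabilities p_i.
E0 = np.array([0.0, 1.0])
p0 = np.array([0.7, 0.3])

# Small simultaneous change of the energies (mechanical driving) and of the
# probabilities (thermal driving); the step sizes are arbitrary.
dE_levels = np.array([0.00, 0.05])      # dE_i
dp = np.array([-0.02, 0.02])            # dp_i, summing to zero (normalization)

E1, p1 = E0 + dE_levels, p0 + dp

dE_total = np.dot(p1, E1) - np.dot(p0, E0)   # change of the average energy E(t)
dQ = np.dot(E0, dp)                          # "isometric" piece:  sum_i E_i dp_i
dW = -np.dot(p0, dE_levels)                  # "isentropic" piece: -sum_i p_i dE_i

print(f"dE      = {dE_total:.6f}")
print(f"dQ - dW = {dQ - dW:.6f}   (agrees with dE to first order in the step)")

# S = -sum_i p_i ln p_i depends only on the p_i, so the dW piece, which changes
# the E_i at fixed p_i, leaves the entropy unchanged (isentropic).
```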
Physical Review E, 2011
Entropy production is one of the most important characteristics of non-equilibrium steady states.
AAPP Physical, Mathematical, and Natural Sciences, 2008
What is the physical significance of entropy? What is the physical origin of irreversibility? Do entropy and irreversibility exist only for complex and macroscopic systems? Most physicists still accept and teach that the rationalization of these fundamental questions is given by Statistical Mechanics. Indeed, for everyday laboratory physics, the mathematical formalism of Statistical Mechanics (canonical and grand-canonical, Boltzmann, Bose-Einstein and Fermi-Dirac distributions) allows a successful description of the thermodynamic equilibrium properties of matter, including entropy values. However, as already recognized by Schrödinger in 1936, Statistical Mechanics is impaired by conceptual ambiguities and logical inconsistencies, both in its explanation of the meaning of entropy and in its implications on the concept of state of a system. An alternative theory has been developed by Gyftopoulos, Hatsopoulos and the present author to eliminate these stumbling conceptual blocks while maintaining the mathematical formalism so successful in applications. To resolve both the problem of the meaning of entropy and that of the origin of irreversibility we have built entropy and irreversibility into the laws of microscopic physics. The result is a theory, that we call Quantum Thermodynamics, that has all the necessary features to combine Mechanics and Thermodynamics uniting all the successful results of both theories, eliminating the logical inconsistencies of Statistical Mechanics and the paradoxes on irreversibility, and providing an entirely new perspective on the microscopic origin of irreversibility, nonlinearity (therefore including chaotic behavior) and maximal-entropy-generation nonequilibrium dynamics. In this paper we discuss the background and formalism of Quantum Thermodynamics including its nonlinear equation of motion and the main general results. Our objective is to show in a not-too-technical manner that this theory provides indeed a complete and coherent resolution of the century-old dilemma on the meaning of entropy and the origin of irreversibility, including Onsager reciprocity relations and maximal-entropy-generation nonequilibrium dynamics, which we believe provides the microscopic foundations of heat, mass and momentum transfer theories, including all their implications such as Bejan's Constructal Theory of natural phenomena.
Physica D: Nonlinear Phenomena, 2004
Boltzmann defined the entropy of a macroscopic system in a macrostate M as the log of the volume of phase space (number of microstates) corresponding to M. This agrees with the thermodynamic entropy of Clausius when M specifies the locally conserved quantities of a system in local thermal equilibrium (LTE). Here we discuss Boltzmann's entropy, involving an appropriate choice of macro-variables, for systems not in LTE. We generalize the formulas of Boltzmann for dilute gases and of Resibois for hard-sphere fluids and show that, for macro-variables satisfying any deterministic autonomous evolution equation arising from the microscopic dynamics, the corresponding Boltzmann entropy must satisfy an H-theorem.
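As a concrete check of the counting definition (a standard spin example, not taken from the paper), the sketch below evaluates S_B = ln|Γ_M| for the macrostate "n up-spins out of N" and compares it with the Stirling form it approaches for large N; the numbers are arbitrary.

```python
import math

def boltzmann_entropy(N, n):
    """S_B = ln |Gamma_M|, with |Gamma_M| = C(N, n) microstates of N spins, n of them up (k_B = 1)."""
    return math.lgamma(N + 1) - math.lgamma(n + 1) - math.lgamma(N - n + 1)

# Arbitrary example: a large spin system with 30% of the spins up.
N, n = 1_000_000, 300_000
x = n / N

S_B = boltzmann_entropy(N, n)
S_stirling = N * (-x * math.log(x) - (1 - x) * math.log(1 - x))   # Stirling / thermodynamic limit

print(f"S_B = {S_B:.1f}, Stirling estimate = {S_stirling:.1f}, "
      f"relative difference = {(S_stirling - S_B) / S_B:.2e}")
```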