Entropy in Statistical Mechanics. - Thermodynamic Contacts

This document discusses key concepts in statistical mechanics, including: 1. Entropy is defined in statistical mechanics as the natural logarithm of the accessible phase-space volume; this definition shows that entropy is independent of measurement units and additive for combined systems. 2. Thermodynamic equilibrium between two systems occurs when their temperatures, pressures, or chemical potentials are equal, depending on the type of contact (thermal, mechanical, or material-transferring). 3. The canonical ensemble provides an alternative statistical approach that is often easier to apply than the microcanonical ensemble.


Lecture 3

• Entropy in statistical mechanics.
• Thermodynamic contacts:
  i. mechanical contact,
  ii. heat contact,
  iii. diffusion contact.
• Equilibrium.
• Chemical potential.
• The main distributions in statistical mechanics.
• A system in the canonical ensemble. Thermostat.

"Ice melting" - a classic example of entropy increase, described in 1862 by Rudolf Clausius as an increase in the disgregation of the molecules of the body of ice.
Entropy in Statistical Mechanics
From the principles of thermodynamics we learn that the thermodynamic entropy S has the following important properties:
 dS is an exact differential and is equal to δQ/T for a reversible process, where δQ is the quantity of heat added to the system.
 Entropy is additive: S = S1 + S2. The entropy of the combined system is the sum of the entropies of its two separate parts.
 ΔS ≥ 0. If the state of a closed system is given macroscopically at any instant, the most probable state at any other instant is one of equal or greater entropy.
State Function
This statement means that entropy is a state function: the value of the entropy does not depend on the past history of the system, but only on the actual state of the system.
One of the great accomplishments of statistical mechanics is to give us a physical picture of entropy.
The entropy σ of a system (in classical statistical physics) in statistical equilibrium can be defined as

σ = ln Γ    (3.1)

where Γ is the volume of phase space accessible to the system, i.e., the volume corresponding to energies between E − ½δE and E + ½δE.
Let us show first that changes in the entropy are independent of the system of units used to measure Γ. As Γ is a volume in the phase space of N point particles, it has dimensions

(Momentum × Length)^(3N) = (Action)^(3N)    (3.2)

Let ħ denote the unit of action; then Γ/ħ^(3N) is dimensionless. If we were to define

σ′ = ln [Γ/ħ^(3N)] = ln Γ − 3N ln ħ    (3.3)

we see that for changes

Δσ′ = Δ ln Γ = Δσ    (3.4)

independent of the system of units. ħ = Planck's constant is a natural unit of action in phase space.
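A minimal numerical sketch of this cancellation, with made-up phase-space volumes and an arbitrary second unit of action (none of these numbers come from the text):

```python
import math

# Hypothetical phase-space volumes, in (J*s)^(3N), before and after
# some process, for N = 2 particles (so 3N = 6).
N = 2
gamma_before = 1.0e-190
gamma_after = 5.0e-190

hbar_SI = 1.054571817e-34   # reduced Planck constant as the unit of action
unit_other = 1.0e-27        # some other, arbitrary unit of action

def sigma_prime(gamma, unit):
    # sigma' = ln(Gamma / unit^(3N)) = ln Gamma - 3N ln(unit), eq. (3.3)
    return math.log(gamma) - 3 * N * math.log(unit)

d_sigma_SI = sigma_prime(gamma_after, hbar_SI) - sigma_prime(gamma_before, hbar_SI)
d_sigma_other = sigma_prime(gamma_after, unit_other) - sigma_prime(gamma_before, unit_other)

# The 3N ln(unit) terms cancel in the difference, eq. (3.4): the change
# in entropy does not depend on the chosen unit of action.
print(d_sigma_SI, d_sigma_other)  # both equal ln 5
```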

It is obvious that the entropy σ, as defined by (3.1), has a definite value for an ensemble in statistical equilibrium; thus the change in entropy is an exact differential. Once the ensemble is specified in terms of its spread in phase space, the entropy is known.

We see that, if Γ is interpreted as a measure of the imprecision of our knowledge of a system, or as a measure of the "randomness" of a system, then the entropy is also to be interpreted as a measure of that imprecision or randomness.
Entropy is additive
It can easily be shown that σ is additive. Let us consider a system made up of two parts, one with N1 particles and the other with N2 particles. Then

N = N1 + N2    (3.5)

and the phase space of the combined system is the product space of the phase spaces of the individual parts:

Γ = Γ1 Γ2    (3.6)

The additive property of the entropy follows directly:

σ = ln Γ = ln Γ1Γ2 = ln Γ1 + ln Γ2 = σ1 + σ2    (3.7)
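A one-line numerical check of (3.7), with hypothetical phase-space volumes chosen only for illustration:

```python
import math

gamma1 = 2.5e10   # hypothetical accessible phase-space volumes
gamma2 = 4.0e12

sigma1 = math.log(gamma1)
sigma2 = math.log(gamma2)

# Combined phase space is the product space, eq. (3.6), so the
# entropies add, eq. (3.7): ln(Gamma1*Gamma2) = ln Gamma1 + ln Gamma2.
sigma_total = math.log(gamma1 * gamma2)

print(sigma_total, sigma1 + sigma2)  # equal to machine precision
```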
Thermodynamic contacts between two systems
We have supposed that the condition of statistical equilibrium is given by the most probable condition of a closed system, and therefore we may also say that the entropy σ is a maximum when a closed system is in the equilibrium condition.

The value of σ for a system in equilibrium will depend on the energy E (≡ ⟨E⟩) of the system; on the number Ni of each molecular species i in the system; and on external variables, such as volume, strain, magnetization, etc.
Let us consider the condition for equilibrium in a system made up of two interconnected subsystems, as in Fig. 3.1. Initially a rigid, insulating, non-permeable barrier separates the subsystems from each other.

Fig. 3.1. Two subsystems, 1 and 2, separated by a barrier and surrounded by insulation.
Thermal contact
Thermal contact - the systems can exchange energy. In equilibrium, there is no flow of energy between the systems. Let us suppose that the barrier is allowed to transmit energy, the other inhibitions remaining in effect. If the conditions of the two subsystems 1 and 2 do not change, we say they are in thermal equilibrium.

In thermal equilibrium the entropy σ of the total system must be a maximum with respect to small transfers of energy from one subsystem to the other. Writing, by the additive property of the entropy,

σ = σ1 + σ2

we have in equilibrium

δσ = δσ1 + δσ2 = 0    (3.8)

(∂σ1/∂E1) δE1 + (∂σ2/∂E2) δE2 = 0    (3.9)

We know, however, that

δE = δE1 + δE2 = 0    (3.10)

as the total system is thermally closed, the energy in a microcanonical ensemble being constant. Thus

[(∂σ1/∂E1) − (∂σ2/∂E2)] δE1 = 0    (3.11)
As E1 was an arbitrary variation we must have

  1    2 
    (3.12)
 E1   E 2 
in thermal equilibrium. If we define a quantity  by

1 
 (3.13)
 E

then in thermal equilibrium


1   2 (3.14)

Here  is known as the temperature and will shown later to


be related to the absolute temperature T by =kT , where k
is the Boltzmann constant,
constant 1.38010-23 j/deg K. 11
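The equal-temperature condition (3.12) can be sketched numerically. Below, each subsystem is assigned the ideal monatomic gas form σ(E) = (3N/2) ln E + const (an assumption chosen for illustration; the particle numbers and total energy are made up), and the energy split maximizing the total entropy is found by a brute-force scan:

```python
import math

# Two ideal-gas subsystems in thermal contact (illustrative entropy
# sigma(E) = (3N/2) ln E + const; particle numbers and energy made up).
N1, N2 = 100, 300
E_total = 1000.0

def sigma_total(E1):
    E2 = E_total - E1
    return 1.5 * N1 * math.log(E1) + 1.5 * N2 * math.log(E2)

# Scan energy splits and pick the one maximizing the total entropy.
best_E1 = max((E_total * k / 10000 for k in range(1, 10000)),
              key=sigma_total)

# At the maximum, the temperatures (3.13) match:
# (3N1/2)/E1 = (3N2/2)/E2, i.e. E1 = E_total * N1/(N1+N2) = 250.
print(round(best_E1))  # 250
```

The scan lands on E1 = E·N1/(N1+N2), exactly where ∂σ1/∂E1 = ∂σ2/∂E2.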
Mechanical contact
Mechanical contact - the systems are separated by a mobile barrier; equilibrium is reached in this case by equalization of the pressure on both sides of the barrier.

We now imagine that the wall is allowed to move and also passes energy, but does not pass particles. The volumes V1, V2 of the two systems can readjust to maximize the entropy. In mechanical equilibrium

(∂σ1/∂V1) δV1 + (∂σ2/∂V2) δV2 + (∂σ1/∂E1) δE1 + (∂σ2/∂E2) δE2 = 0    (3.15)
After thermal equilibrium has been established, the last two terms on the right add up to zero, so we must have

(∂σ1/∂V1) δV1 + (∂σ2/∂V2) δV2 = 0    (3.16)

Now the total volume V = V1 + V2 is constant, so that

δV = δV1 + δV2 = 0    (3.17)

We have then

[(∂σ1/∂V1) − (∂σ2/∂V2)] δV1 = 0    (3.18)
As V1 was an arbitrary variation we must have
  1    2 
    (3.19)
 V1   V2 

in mechanical equilibrium. If we define a quantity  by


   
   (3.20)
  V  E , N

we see that for a system in thermal equilibrium the


condition for mechanical equilibrium is

1   2 (3.21)

We show now that  has the essential characteristics of


the usual pressure p.
14
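One can support this claim numerically by checking that π = τ(∂σ/∂V)E reproduces the ideal-gas pressure p = Nτ/V for the (assumed, illustrative) volume dependence σ = N ln V + const; the particle number, temperature, and volume below are made up:

```python
import math

# Check that pi = tau * (dsigma/dV)_E matches the ideal-gas pressure
# p = N*tau/V, for the illustrative entropy sigma(V) = N ln V + const.
N = 1000        # particle number (hypothetical)
tau = 4.14e-21  # tau = kT at about 300 K, in joules
V = 1.0e-3      # volume, m^3

def sigma(V):
    return N * math.log(V)   # only the V-dependent part matters here

h = 1e-9
dsigma_dV = (sigma(V + h) - sigma(V - h)) / (2 * h)  # central difference
pi = tau * dsigma_dV

print(pi, N * tau / V)  # the two values agree closely
```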
Material-transferring contact
The systems can exchange particles. Let us suppose that the wall allows diffusion through it of molecules of the i-th chemical species. We have

δNi1 = −δNi2    (3.22)

For equilibrium

[(∂σ1/∂Ni1) − (∂σ2/∂Ni2)] δNi1 = 0    (3.23)

or

∂σ1/∂Ni1 = ∂σ2/∂Ni2    (3.24)
We define a quantity μi by the relation

μi/τ = −(∂σ/∂Ni)E,V    (3.25)

The quantity μi is called the chemical potential of the i-th species. For equilibrium at constant temperature

μi1 = μi2    (3.26)
The Canonical Ensemble
The microcanonical ensemble is a general statistical tool, but it is often very difficult to use in practice because of the difficulty in evaluating the volume of phase space or the number of states accessible to the system.
The canonical ensemble, invented by Gibbs, avoids some of these difficulties and leads us easily to the familiar Boltzmann factor exp(−ΔE/kT) for the ratio of populations of two states differing by ΔE in energy.
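As a quick numerical illustration of the Boltzmann factor (the 0.1 eV gap and 300 K temperature below are arbitrary example values, not from the text):

```python
import math

# The ratio of populations of two states separated by dE is exp(-dE/kT).
k = 1.380649e-23      # Boltzmann constant, J/K
T = 300.0             # temperature, K (illustrative)
dE = 0.1 * 1.602e-19  # energy gap of 0.1 eV, in joules (illustrative)

ratio = math.exp(-dE / (k * T))
print(ratio)  # ~0.02: the upper state is far less populated
```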

We shall see that the canonical ensemble describes systems in thermal contact with a heat reservoir; the microcanonical ensemble describes systems that are perfectly insulated.

In 1901, at the age of 62, Gibbs (1839-1903) published a book called Elementary Principles in Statistical Mechanics (Dover, New York).
We imagine that each system of the ensemble is divided up into a large number of subsystems, which are in mutual thermal contact and can exchange energy with each other. We direct our attention (Fig. 3.2) to one subsystem, denoted s; the rest of the system will be denoted by r and is sometimes referred to as a heat reservoir.

Fig. 3.2. A subsystem (s) inside the total system (t); the rest (r) acts as a heat reservoir.

The total system is denoted by t and has the constant energy Et, as it is a member of a microcanonical ensemble. For each value of the energy we think of an ensemble of systems (and subsystems).
The subsystems will usually, but not necessarily, be themselves of macroscopic dimensions.

The subsystem may be a single molecule if, as in a gas, the interactions between molecules are very weak, thereby permitting us to specify accurately the energy of a molecule. In a solid, a single atom will not be a satisfactory subsystem, as the bond energy is shared with neighbors.

Letting dwt denote the probability that the total system is in an element of volume dΓt of the appropriate phase space, we have for a microcanonical ensemble
dwt = C dΓt if the energy lies in δE at Et
dwt = 0 otherwise    (3.27)

where C is a constant. Then we can write

dwt = C dΓs dΓr if the region of phase space is accessible
dwt = 0 otherwise    (3.28)

We ask now for the probability dws that the subsystem is in dΓs, without specifying the condition of the reservoir, but still requiring that the energy of the total system lie in δE at Et. Then

dws = C dΓs Γr    (3.29)
where r is the volume of phase space of the reservoir
which corresponds to the energy of the total system being
in Et at Et.
Our task is to evaluate r; that is, if we know that the
subsystem is in ds, how much phase space is accessible
to the heat reservoir?
The entropy of the reservoir is

 r  ln  r (3.30)

 r  e  r (3.31)
Note that
Er  Et  E s (3.32)

21
where we may take Es << Et because the subsystem is assumed to be small in comparison with the total system. We expand

σr(Er) = σr(Et − Es) ≅ σr(Et) − (∂σr(Et)/∂Et) Es + …    (3.33)

Thus

Γr = exp[σr(Et)] exp[−(∂σr(Et)/∂Et) Es]    (3.34)

As Et is necessarily close to Er, we can write, using (3.13),

1/τ = 1/kT = ∂σr(Et)/∂Et    (3.35)

Here τ is the temperature characterizing every part of the system, as thermal contact is assumed. Finally, from (3.29), (3.34) and (3.35),
dws = A e^(−Es/τ) dΓs    (3.36)

where

A = C e^(σr(Et))    (3.37)

may be viewed as a quantity which takes care of the normalization:

∫dws = 1 = A ∫e^(−Es/τ) dΓs    (3.38)

Thus for the subsystem the probability density (distribution function) is given by the canonical ensemble:

ρ(E) = A e^(−E/kT)    (3.39)
where here and henceforth the subscript s is dropped. We emphasize that E is the energy of the entire subsystem.

We note that ln ρ is additive for two subsystems in thermal contact:

ln ρ1 = ln A1 − E1/τ
ln ρ2 = ln A2 − E2/τ

ln ρ1ρ2 = ln A1A2 − (E1 + E2)/τ

so that, with ρ = ρ1ρ2, A = A1A2, E = E1 + E2, we have

ln ρ = ln A − E/τ    (3.40)
for the combined systems. This additive property is
central to the use of the canonical ensemble.

The average value of any physical quantity f(p,q) over the canonical distribution is given by

⟨f⟩ = ∫ e^(−E(p,q)/kT) f(p,q) dΓ / ∫ e^(−E(p,q)/kT) dΓ

We have to note that for a subsystem consisting of a large number of particles, the subsystem energy in a canonical ensemble is very well defined.
This is because the density of energy levels, or the volume in phase space, is a strongly varying function of energy, as is also the distribution function (3.39).
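The canonical average can be sketched numerically. For a 1-D harmonic oscillator with E(p,q) = p²/2 + q²/2 (illustrative units with m = ω = τ = 1), the weight e^(−E/τ) factorizes into Gaussians in p and q, so the ratio of integrals reduces to a simple sample average; equipartition predicts ⟨E⟩ = τ:

```python
import random
import math

# Estimate <E> over the canonical distribution for a 1-D harmonic
# oscillator, E(p,q) = p^2/2 + q^2/2, in units where m = omega = 1.
random.seed(0)
tau = 1.0                 # tau = kT (illustrative units)
std = math.sqrt(tau)      # each Gaussian factor has variance tau

samples = [(random.gauss(0, std), random.gauss(0, std))
           for _ in range(200000)]
E_avg = sum(0.5 * p * p + 0.5 * q * q for p, q in samples) / len(samples)

# Equipartition: each quadratic term contributes tau/2, so <E> = tau.
print(E_avg)  # close to 1.0
```

Sampling works here only because the Gaussian weight can be drawn directly; for a general E(p,q) one would need a method such as Metropolis sampling instead.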
