Chapter 5
The Gibbs Statistical Mechanics
In Chapter 3 we developed Boltzmann’s statistical mechanics and in
Chapter 4 we applied it to perfect gases of non-interacting classical atoms and
molecules. Strictly, Boltzmann’s statistical method, the method of the most
probable distribution, addresses a mathematical model. The model is an assembly of NA weakly interacting systems, weakly interacting so that each system
could be regarded as statistically independent. Here we denote the number of
systems in the assembly by N_A explicitly to distinguish it from the number N
of physical particles in a system to be used below. The method gave
\bar{n}_s = e^{-\alpha} e^{-\beta \epsilon_s}    (5.1)

as the expected number of systems in the state of energy ε_s, where α and β are Lagrange multipliers as yet undetermined. The physics entered when we interpreted the assembly as a dilute gas and the systems as the weakly interacting atoms or molecules in the gas. Then we found that β = (kT)^{-1} and α = −βµ, where µ = −kT log(Z_0/N) is the chemical potential of the gas.
Boltzmann’s interpretation of the model severely limited the applicability
of his method. Firstly, it could apply only to dilute gases. Atoms in solids or
liquids interact strongly with one another and simply cannot be regarded as
statistically independent. Secondly, the interpretation strictly applies only to
classical gases. In the counting we had to be able to distinguish the gas atoms
(the systems). With distinguishability the number of permutations of the atoms
among their cells can be used as a measure of how often a given distribution is
likely to occur. Distinguishability is a classical concept. Boltzmann’s interpretation leaves us without a statistical mechanics for solids, liquids, dense gases,
or quantum systems.
There were also problems of rigor. When the assembly was taken as
a gas at STP, we saw that the expected number of atoms, \bar{n}_s, in a cell in phase space (of volume h^3) was one in 10^4 to 10^5. Hence most cells or states
were unoccupied, a few contained an atom and most rarely was ns ≥ 2. Yet
the combinatorial method assumed ns was large. This could be overcome by
combining single cells into larger cells, but the Sackur-Tetrode equation for the gas entropy confirmed the cell size h^3. Also we had to “correct” the Boltzmann
counting in section 3.3 so that the entropy emerged from the method as a
correctly extensive thermodynamic quantity.
5.1 The Gibbs Interpretation
The limitations noted above are elegantly removed in Gibbs’ interpretation. We employ the same mathematical model, an assembly of NA weakly
interacting systems. Now, however, the system is taken as the whole body under study. It is the solid, liquid or gas composed itself of many particles. The
assembly is (NA −1) mental copies of this single system under study. We assume
that each possible state of the many particle system is represented at least once
in the assembly. That is, we imagine a large enough assembly of copies that
in the assembly each state is represented. The copies are assumed to interact
weakly with each other so that heat can be exchanged between them and a
uniform temperature maintained in the assembly. The total assembly itself is
assumed isolated.
The assembly of copies might be constructed as follows. We begin with a
large block of solid, containing, say, 10^25 atoms. We mentally divide the block into 10^10 cubes. Each cube contains 10^15 atoms and is still large enough to
display the macroscopic collective character of the solid. We select one cube
as our system and the remainder form (NA − 1) mental copies. The cubes are
taken as weakly interacting and exchange heat so that the energies of the cubes
fluctuate. Through these fluctuations the cubes sample all possible energy states
available to the solid and, at a given time, each possible state of our system is
represented somewhere in the assembly. If we are considering a gas, the system
is the gas and the assembly is a collection of mental copies of the gas.
Since we have only re-interpreted Boltzmann’s mathematical model, we
can again use Boltzmann’s method of most probable distribution to find the
expected number, \bar{N}_S, of systems in state S. We use capital letters to remind
ourselves that we are considering states of a many-particle system. The energy
of the system is now, quite generally,

E = \sum_i \frac{p_i^2}{2m} + V(r_1 \ldots r_N)    (5.2)
In this case we make no assumption about the interaction potential V (r1 . . . rN )
among the N particles within the system. We denote by ES the possible energy
states of this many-particle system. Since the assembly of these systems is
isolated (its energy, E_A, is constant) and their number N_A is fixed,

E_A = \sum_S N_S E_S ; \qquad N_A = \sum_S N_S    (5.3)
These are the same conditions imposed on the assembly by Boltzmann. Thus
we may follow his method in section 3.1 exactly with no change to find the
expected occupation \bar{N}_S of state having energy E_S. This gives

\bar{N}_S = e^{-\alpha} e^{-\beta E_S} .    (5.4)

We also define the partition function of the many-particle system as

Z = \sum_S e^{-\beta E_S} .    (5.5)

Then

\bar{N}_S = \frac{N_A}{Z}\, e^{-\beta E_S} = -\frac{N_A}{\beta} \frac{\partial}{\partial E_S} \log Z .    (5.6)

and

P_S = \frac{\bar{N}_S}{N_A} = -\beta^{-1} \frac{\partial}{\partial E_S} (\log Z) .
A little care is needed with the thermodynamic internal energy, U, of the system. Since E_A is the energy of the assembly of N_A systems,

U = \frac{E_A}{N_A} = \frac{1}{N_A} \sum_S \bar{N}_S E_S = -\frac{\partial}{\partial \beta} (\log Z)

is the ensemble average of the energy of a single system.
To establish β, as before, we take the system through a change of state
and watch log Z. Since Z is a function of β and ES , the change in log Z is
d(\log Z) = \frac{\partial}{\partial \beta}(\log Z)\, d\beta + \sum_S \frac{\partial}{\partial E_S}(\log Z)\, dE_S

          = -U\, d\beta + \sum_S \left(-\frac{\beta}{N_A} \bar{N}_S\right) dE_S

          = -\,d(U\beta) + \beta\, dU + \beta \left(-\frac{1}{N_A} \sum_S \bar{N}_S\, dE_S\right)

As before in section (3.2), dW_A = -\sum_S \bar{N}_S\, dE_S is the net work done by the assembly in raising the energy levels of the assembly by dE_S. As with U, the average work done by the system is

dW = -\frac{1}{N_A} \sum_S \bar{N}_S\, dE_S

Then

d(\log Z + U\beta) = \beta\,(dU + dW) = \beta\, T\, dS
in a reversible change of state.
Again we identify β as ∝ 1/T and introduce Boltzmann’s constant k so that

\beta = \frac{1}{kT}    (5.7)

and identify the entropy as

S = k \left[ \log Z + \beta U \right]    (5.8)

The free energy is

F = U - TS = U - kT \left[ \log Z + \beta U \right]

or

F = -kT \log Z    (5.9)
These relations are now completely general for any system having an arbitrary interaction among the N particles in it. They are valid for a solid, a liquid or a
gas. We therefore have a general prescription for evaluating the thermodynamic
properties, via the differentiation of F , through the partition function. We need
only evaluate Z.
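A minimal numerical sketch in Python makes this prescription concrete. For a short, assumed list of system energy levels E_S (the values below are purely illustrative), we sum Z directly, obtain U by differentiating log Z with respect to β, and then form S and F from eqs. (5.8) and (5.9); the check S = (U − F)/T follows from F = U − TS.

    import numpy as np

    k = 1.380649e-23      # Boltzmann's constant (J/K)
    T = 300.0             # temperature (K), an illustrative choice
    beta = 1.0 / (k * T)

    # Assumed many-particle energy levels E_S of a small model system (J); any finite list will do.
    E = np.array([0.0, 1.0, 2.0, 3.0, 4.0]) * 1.0e-21

    def logZ(b):
        """log Z = log sum_S exp(-b E_S), eq. (5.5), summed stably."""
        x = -b * E
        return x.max() + np.log(np.exp(x - x.max()).sum())

    # U = -d(log Z)/d(beta), the ensemble average energy of a single system
    h = beta * 1.0e-6
    U = -(logZ(beta + h) - logZ(beta - h)) / (2.0 * h)

    F = -k * T * logZ(beta)            # eq. (5.9)
    S = k * (logZ(beta) + beta * U)    # eq. (5.8)

    print(f"U = {U:.4e} J, F = {F:.4e} J, S = {S:.4e} J/K")
    print("S == (U - F)/T ?", np.isclose(S, (U - F) / T))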
Gibbs’ interpretation is equally valid for quantum or classical particles since we have said nothing about the statistics of the particles within the system. The particles may be Bose, Fermi or classical. By this simple
re-interpretation, Gibbs removed the two limitations to Boltzmann’s statistical
mechanics. Now, ES and Z refer to the whole system rather than the particles
that make it up and since the whole system is large it will always be classical.
We may also take N_A, the number of systems in the assembly, as large as we like so that each N_S is large. Further, we do not have to “correct” the counting to get eq. (5.9) (the equivalent of eq. (3.19)): the relation of F to Z for the N-particle system appears naturally. In this way the two points of rigor
that bothered us in Boltzmann’s statistical mechanics are also removed. From
now on we will use the Gibbs’ method exclusively. Our first application will
be to the perfect classical gas to demonstrate its validity and connection with
Boltzmann’s interpretation.
To complete our connection with thermodynamics we recall that F , an
extensive quantity, is proportional to the average number \bar{N} of particles in the system,

F(T, V, N) = -kT \log Z(T, V, N) .    (5.10)
If T , V , and N are constant, then F is a constant, a minimum at equilibrium. If
the pressure rather than volume is constant, the Gibbs free energy is constant,
G(T, p, N) = F(T, V, N) + pV    (5.11)
In many applications N is not constant but can vary about the mean value, \bar{N}. If V is constant, this means a variation in the number density n = \bar{N}/V.
In this case, as we noted in Chapter 2, neither F nor G will be constant and
it is convenient to introduce a thermodynamic function which is constant. For
example, we may have a liquid in equilibrium with its vapour, in which particles
are exchanged between the vapour and liquid phases. In this case the chemical
potential µ is constant and the same in both phases. A convenient function is
then the thermodynamic potential¹

F - \mu \bar{N} = \Omega(T, V, \mu)    (5.12)

where

\mu = \left( \frac{\partial F}{\partial N} \right)_{T,\, V,\, N = \bar{N}}

Here the derivative is to be evaluated at N equal to its average value, \bar{N}. From its definition, µ is the change in Gibbs (Helmholtz) free energy on adding an additional particle beyond \bar{N} to the system at constant temperature and pressure (volume),

δG = µ δN    (T, p constant)
δF = µ δN    (T, V constant)
We will find it convenient to remove the restriction of constant N when evaluating the partition function Z(T, V, N), because the restriction to a fixed number of particles makes the summations difficult to carry out. Instead we allow N to vary and show later that N never fluctuates far from the mean value \bar{N}.
The partition function for variable N is naturally related to Ω = Ω (T, V, µ).
5.2 Application to a Perfect Boltzmann Gas
A perfect Boltzmann gas is a gas of N non-interacting, classical point particles. The gas energy is then

E_{gas} = \sum_{i=1}^{N} \frac{p_i^2}{2m}
The partition function for the gas, from eq. (5.5), is
¹ Equations (5.11) and (5.12) are examples of Legendre transformations. If we have a function f(x, y) of independent variables x and y and we seek a function (g, say) which depends on x and z as independent variables (where z ≡ ∂f/∂y), then this function is g = f − yz. That is,

dg = df(x, y) - (y\, dz + z\, dy) = \frac{\partial f}{\partial x}\, dx + \frac{\partial f}{\partial y}\, dy - y\, dz - z\, dy = \frac{\partial f}{\partial x}\, dx - y\, dz

so that g = g(x, z).
Z = \sum_{\text{states of gas}} e^{-\beta E_S} = \int dS\, e^{-\beta \sum_i \frac{p_i^2}{2m}}    (5.13)
Now the phase space is 6N dimensional since we must specify the position and
momentum of each particle in the gas to specify the state of the gas. An element
of volume in Γ is

d\Gamma = \prod_{i=1}^{N} d\vec{r}_i\, d\vec{p}_i = d\vec{r}_1 \ldots d\vec{r}_N\, d\vec{p}_1 \ldots d\vec{p}_N

and the density of states in phase space, from section (1.5), is (dS/d\Gamma) = 1/h^{3N}. Hence
Z = \int' \frac{d\Gamma}{h^{3N}} \prod_{i=1}^{N} e^{-\beta \frac{p_i^2}{2m}}    (5.14)
Some care is needed in interpreting the integration over the states of the
gas in eq. (5.14) correctly. Since the macroscopic state of the gas is unchanged
when we interchange atoms among the cells in phase space, we cannot integrate
each d\vec{r}_i\, d\vec{p}_i over all space independently. This would overcount the number
of states of the gas. We must divide by the number of permutations among the
particles which lead to the same macroscopic gas state. Since, from the point
of view of the macroscopic state, the particles are indistinguishable, we must
divide by N!, which is the number of ways we can permute the particles and leave
the state of the gas unchanged.
Then

\int' d\Gamma = \frac{1}{N!} \int d\Gamma    (5.15)

where the integral is now unrestricted.
This gives

Z = \frac{1}{N!} \prod_{i=1}^{N} \left[ \frac{1}{h^3} \int d\vec{r}_i \int d\vec{p}_i\, e^{-\frac{\beta p_i^2}{2m}} \right]
  = \frac{1}{N!} \left[ \frac{1}{h^3} \int d\vec{r} \int d\vec{p}\, e^{-\frac{\beta p^2}{2m}} \right]^N

Z = \frac{1}{N!}\, Z_0^N    (5.16)

where

Z_0 = V \left( \frac{2\pi m k T}{h^2} \right)^{3/2} = \frac{V}{\lambda_T^3}
and again \lambda_T = (h^2/2\pi m k T)^{1/2} is the thermal wavelength of the particle. As in section 3.2, we recover

F = -kT \log Z = -kT \log \left( \frac{Z_0^N}{N!} \right)    (5.17)

and from this we may calculate the thermodynamic properties of the gas.

To get the expected occupation of the single particle states, \bar{n}_s, we note we may write a particular state of the gas as

E_{\text{gas state}} = \sum_{s'} n_{s'} \epsilon_{s'}

Here n_{s'} is the occupation of the single particle state s'. The partition function is

Z = \sum_{\text{gas states } S} e^{-\beta \sum_{s'} n_{s'} \epsilon_{s'}}

The average value of n_s is, by definition,

\bar{n}_s = \frac{1}{Z} \sum_{\text{gas states}} n_s\, e^{-\beta \sum_{s'} n_{s'} \epsilon_{s'}}

or

\bar{n}_s = -\beta^{-1} \frac{\partial}{\partial \epsilon_s} (\log Z)    (5.18)

This holds generally for any gas of non-interacting particles (classical or quantum). We may check that we recover the result eq. (5.1) for a gas of classical particles by substituting Z = Z_0^N/N! from eq. (5.16), where Z_0 = \sum_s e^{-\beta \epsilon_s} and e^{-\beta\mu} = Z_0/N. From eq. (5.18) we can obtain the Maxwell-Boltzmann and other distributions discussed in Chapter 4.
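A short numerical sketch illustrates these formulae for assumed, argon-like numbers (the mass, volume and particle number below are illustrative choices, not taken from the text). It evaluates λ_T and Z_0 from eq. (5.16), F from eq. (5.17) using Stirling's approximation for log N!, µ = −kT log(Z_0/N), and the pressure from p = −∂F/∂V, which should reproduce the perfect gas law.

    import numpy as np

    # Illustrative numbers (argon-like atom in a 1-litre box); not taken from the text.
    h_planck = 6.62607015e-34   # Planck's constant (J s)
    k        = 1.380649e-23     # Boltzmann's constant (J/K)
    m        = 6.63e-26         # atomic mass (kg), roughly argon
    T        = 300.0            # K
    V        = 1.0e-3           # m^3
    N        = 2.5e22           # assumed number of atoms

    lamT = h_planck / np.sqrt(2.0 * np.pi * m * k * T)   # thermal wavelength
    Z0   = V / lamT**3                                   # eq. (5.16)

    def F(vol):
        """F = -kT log(Z0^N / N!), eq. (5.17), with Stirling's log N! = N log N - N."""
        z0 = vol / lamT**3
        return -k * T * (N * np.log(z0) - (N * np.log(N) - N))

    mu = -k * T * np.log(Z0 / N)                         # chemical potential
    p  = -(F(1.001 * V) - F(0.999 * V)) / (0.002 * V)    # p = -dF/dV, by central difference

    print(f"lambda_T = {lamT:.3e} m, Z0 = {Z0:.3e}, mu = {mu:.3e} J")
    print(f"p = {p:.4e} Pa  vs  NkT/V = {N * k * T / V:.4e} Pa")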
5.3 The Perfect Quantum Gases
A perfect quantum gas is a gas of non-interacting quantum particles.
There are two types: (1) the Fermi gas, composed of half-integral spin Fermions which satisfy Fermi-Dirac statistics, and (2) the Bose gas, composed of integral spin Bose particles which satisfy Bose-Einstein statistics. In section
1.6, we saw that for Fermi particles there can be at most one particle per state
(ns = 0, 1) while there can be any number of Bosons per state (ns = 0, 1, 2,
. . . ).
The properties of these gases are set by their partition functions
Z = \sum_{\text{states of gas}} e^{-\beta E_{gas}} = Z(T, V, N)

where a specific state of the gas is given by

E_{\text{gas state}} = \sum_{s=1}^{r} n_s \epsilon_s = n_1 \epsilon_1 + n_2 \epsilon_2 + n_3 \epsilon_3 + \ldots + n_r \epsilon_r
All the possible states of the gas can be reached by summing over all possible
occupations ns of the single particle state s. That is
Z = \sum_{n_1} \sum_{n_2} \ldots \sum_{n_r} e^{-\beta (n_1 \epsilon_1 + n_2 \epsilon_2 + \ldots + n_r \epsilon_r)}    (5.19a)
with the restriction that the total number of particles N in the gas is fixed,

\sum_s n_s = N    (5.19b)
The different values of Z for the Fermi and Bose cases are entirely fixed by the
difference in allowed single particle state occupation noted above.
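The role of the restriction (5.19b) can be seen in a small brute-force sketch. For a toy system (three assumed single-particle levels and N = 2 particles, with β = 1 in units of the level spacing), we enumerate every occupation set {n_s} with Σ n_s = N and sum eq. (5.19a), once with the Fermi restriction n_s = 0, 1 and once allowing Bose occupations.

    import itertools, math

    # Toy system (illustrative): r = 3 single-particle levels, N = 2 particles, beta = 1.
    eps  = [0.0, 1.0, 2.0]   # single-particle energies in units of kT
    beta = 1.0
    N    = 2

    def Z_canonical(max_occ):
        """Z(T,V,N) from eqs. (5.19a)-(5.19b): sum over occupation sets {n_s}
        with sum_s n_s = N; max_occ = 1 for Fermi, N for Bose."""
        Z = 0.0
        for ns in itertools.product(range(max_occ + 1), repeat=len(eps)):
            if sum(ns) == N:                                  # restriction (5.19b)
                E = sum(n * e for n, e in zip(ns, eps))
                Z += math.exp(-beta * E)
        return Z

    print("Fermi-Dirac   Z(N=2) =", Z_canonical(max_occ=1))  # occupations 110, 101, 011
    print("Bose-Einstein Z(N=2) =", Z_canonical(max_occ=N))  # also 200, 020, 002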
We may picture a state of the gas as follows. We imagine a very exclusive
Hilton Hotel having only one room per floor. This Hilton has r floors, one floor
per state of the gas. We invite N identical particles to our Hilton. As manager
we assign the particles to their rooms. If the N particles are Fermions, then we
can assign at most one particle per room. The lowest energy state of the hotel
(gas) is achieved by filling up the lowest N floors (r > N , assumed). Higher
energy states (in terms of work done by the elevator) are created by leaving
some lower floors empty and filling others higher up. The possible occupation
states of the hotel needed in the Partition Function are covered by summing
over the possible occupations of each floor (n = 0, 1). Permutation of Fermions
among the rooms does not lead to a new state of the hotel because we cannot
distinguish one Fermion from another.
If we invite N Bosons, the Bosons are more flexible and allow multiple
occupation of the rooms. For the lowest energy state of the hotel, we condense
all the N Bosons into the ground floor room (Bose-Einstein condensation). The
sum over all possible states of the hotel is again a sum over the possible occupations of each room (n = 0, 1, 2, . . . N ).
Evaluation of the Partition Function
The sum in eq. (5.19a) would be easy to carry out without the restriction
(5.19b) to N particles. With the restriction the sums over ns in each state s
are not independent. We therefore remove the restriction by summing over all
possible values of N . We evaluate, in place of Z, the new partition function
Z(T, V, \mu) \equiv \sum_N e^{\beta \mu N} Z(T, V, N)    (5.20)
Figure 5.1: Partition function terms vs N .
This partition function is much easier to evaluate since the sum over each ns
can now be carried out independently,
Z(T, V, \mu) = \sum_{n_1} \ldots \sum_{n_r} e^{-\beta (n_1 \epsilon_1 + n_2 \epsilon_2 + \ldots)}\, e^{\beta \mu (n_1 + n_2 + \ldots)}

             = \sum_{n_1} e^{-\beta (\epsilon_1 - \mu) n_1} \sum_{n_2} e^{-\beta (\epsilon_2 - \mu) n_2} \ldots

             = \prod_{s=1}^{r} \sum_{n_s} e^{-\beta (\epsilon_s - \mu) n_s}    (5.21)
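The factorization in eq. (5.21) is easy to verify numerically for a toy set of levels (the energies, µ and the truncation below are illustrative assumptions). The brute-force sum over all occupation sets, weighted by e^{βµN}, agrees with the product of independent single-level sums; the Fermi case is exact, while the Bose sums are truncated at a finite maximum occupation and converge because e^{−β(ε_s − µ)} < 1.

    import itertools, math

    # Toy check of eq. (5.21); the energies, mu and truncation are illustrative.
    eps  = [0.0, 0.7, 1.5, 2.2]   # single-particle energies in units of kT
    beta = 1.0
    mu   = -0.5                   # mu < min(eps), so the Bose sums converge

    def grand_Z_bruteforce(max_occ):
        """Unrestricted sum over occupation sets, weighted by e^{beta mu N}, as in eq. (5.20)."""
        Z = 0.0
        for ns in itertools.product(range(max_occ + 1), repeat=len(eps)):
            E = sum(n * e for n, e in zip(ns, eps))
            N = sum(ns)
            Z += math.exp(-beta * (E - mu * N))
        return Z

    def grand_Z_product(max_occ):
        """Product of independent single-level sums, the last line of eq. (5.21)."""
        Z = 1.0
        for e in eps:
            Z *= sum(math.exp(-beta * (e - mu) * n) for n in range(max_occ + 1))
        return Z

    for max_occ in (1, 10):       # 1: Fermi (exact); 10: truncated Bose
        print(max_occ, grand_Z_bruteforce(max_occ), grand_Z_product(max_occ))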
Also, the sum over N removes the dependence on N , much as an integral does,
so that Z (T, V, µ) depends upon the parameter µ. At present µ is arbitrary.
The sum over N clearly extends the class of states of the gas included in
the partition function. It now includes all possible numbers of particles in the
gas at volume V . Physically we are allowing the number of particles in the gas
to vary. There will be some equilibrium or average number of particles, N , in
the volume which occurs most often at temperature T . To emphasize this, we
write eq. (5.20) as
Z(T, V, \mu) = Z(T, V, 1)\, e^{\beta\mu} + \ldots + Z(T, V, \bar{N})\, e^{\beta\mu\bar{N}} + \ldots

Should this mean number of particles occur overwhelmingly often, so that the term Z(T, V, \bar{N})\, e^{\beta\mu\bar{N}} dominates the sum, then

Z(T, V, \mu) = Z(T, V, \bar{N})\, e^{\beta\mu\bar{N}}    (5.22)
and we have a simple relation between Z(T, V, µ) and Z(T, V, \bar{N}). For now we simply assume eq. (5.22) holds, evaluate Z(T, V, µ) and, via eq. (5.22), obtain Z(T, V, \bar{N}), from which we calculate the gas properties. The required dependence of the terms of Z(T, V, µ) on N is sketched in Fig. 5.1. We will verify this form below. There are two cases:
(1) Fermi-Dirac (n_s = 0, 1):

Z_{FD}(T, V, \mu) = \prod_{s=1}^{r} \left[ \sum_{n_s = 0}^{1} e^{-\beta(\epsilon_s - \mu) n_s} \right]
or

Z_{FD}(T, V, \mu) = \prod_{s=1}^{r} \left[ 1 + e^{-\beta(\epsilon_s - \mu)} \right]    (5.23)
(2) Bose-Einstein (n_s = 0, 1, 2, \ldots):

Z_{BE}(T, V, \mu) = \prod_{s=1}^{r} \left[ \sum_{n_s = 0}^{\infty} e^{-\beta(\epsilon_s - \mu) n_s} \right] = \prod_{s=1}^{r} \left[ 1 - e^{-\beta(\epsilon_s - \mu)} \right]^{-1}    (5.24)
The difference between Fermi and Bose occupation leads to quite different partition functions.
The Statistics
The expected occupation of the single particle states is obtained from the
general result eq. (5.18) and
\frac{\partial Z(T, V, \bar{N})}{\partial \epsilon_s} = \frac{\partial}{\partial \epsilon_s} \left( e^{-\beta\mu\bar{N}}\, Z(T, V, \mu) \right) = e^{-\beta\mu\bar{N}}\, \frac{\partial Z(T, V, \mu)}{\partial \epsilon_s}

to give

\bar{n}_s = -\beta^{-1} \frac{\partial}{\partial \epsilon_s} \left( \log Z(T, V, \mu) \right)_{T, \mu}    (5.25)
Differentiating eqs. (5.23) and (5.24) gives
\bar{n}_s = \left[ e^{\beta(\epsilon_s - \mu)} + 1 \right]^{-1}    Fermi-Dirac

and

\bar{n}_s = \left[ e^{\beta(\epsilon_s - \mu)} - 1 \right]^{-1}    Bose-Einstein    (5.26)
respectively. The properties of the Fermi and Bose gases are developed in Chapters 7 and 8.
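As a check of eqs. (5.25) and (5.26), the following sketch (with a handful of assumed single-particle levels, in units of kT) differentiates log Z(T, V, µ) from eqs. (5.23) and (5.24) numerically with respect to ε_s and compares the result with the closed Fermi-Dirac and Bose-Einstein forms.

    import numpy as np

    # Toy check of eqs. (5.25)-(5.26) with a few assumed single-particle levels (units of kT).
    eps  = np.array([0.0, 0.5, 1.0, 2.0])
    beta = 1.0
    mu   = -0.3        # mu < min(eps), so the Bose-Einstein product (5.24) converges

    def logZ(levels, sign):
        """log of the grand partition function: sign = +1 gives eq. (5.23) (Fermi-Dirac),
        sign = -1 gives eq. (5.24) (Bose-Einstein)."""
        return sign * np.sum(np.log(1.0 + sign * np.exp(-beta * (levels - mu))))

    def nbar_numeric(s, sign, h=1.0e-6):
        """n_s = -(1/beta) d(log Z)/d(eps_s), eq. (5.25), by central difference."""
        up, dn = eps.copy(), eps.copy()
        up[s] += h
        dn[s] -= h
        return -(logZ(up, sign) - logZ(dn, sign)) / (2.0 * h * beta)

    for s in range(len(eps)):
        fd = 1.0 / (np.exp(beta * (eps[s] - mu)) + 1.0)   # eq. (5.26), Fermi-Dirac
        be = 1.0 / (np.exp(beta * (eps[s] - mu)) - 1.0)   # eq. (5.26), Bose-Einstein
        print(s, nbar_numeric(s, +1), fd, nbar_numeric(s, -1), be)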
5.4 The Grand Partition Function and Thermodynamics
The partition function
Z(T, V, \mu) = \sum_N e^{\beta\mu N} Z(T, V, N) = \sum_{S, N} e^{-\beta(E_S - \mu N)}    (5.27)
is called the grand (canonical) partition function. We saw that for the quantum gases it was easier to evaluate Z(T, V, µ) than the canonical partition function Z(T, V, N). This was because in Z(T, V, µ) all possible numbers of particles N in the gas could be included in the sum. We now want to verify that the average number, \bar{N}, is found overwhelmingly more often in the gas than the other possible values of N. Then the sum in Z(T, V, µ) will be dominated by the single term in which N = \bar{N}, and eq. (5.27) can be replaced by

Z(T, V, \mu) = e^{\beta\mu\bar{N}}\, Z(T, V, \bar{N}).    (5.28)
We also want to relate Z to thermodynamic functions.
To begin we note that if the term for N = \bar{N} in Z(T, V, µ) is the largest (a maximum) we must have

\frac{\partial}{\partial N} \left( e^{\beta\mu N} Z \right)_{N = \bar{N}} = 0.

That is

\beta\mu\, e^{\beta\mu N} Z + e^{\beta\mu N} \left( \frac{\partial Z}{\partial N} \right)_{N = \bar{N}} = 0

or

\mu = -\beta^{-1} \frac{\partial}{\partial N} \left( \log Z \right)_{N = \bar{N}}    (5.29)
We recall from chapter 2 that the thermodynamic chemical potential is
\mu = \left( \frac{\partial F}{\partial N} \right)_{T,\, V,\, N = \bar{N}} = -\beta^{-1} \frac{\partial}{\partial N} \left( \log Z \right)_{T,\, V,\, N = \bar{N}}

Thus by setting µ in Z(T, V, µ) equal to the thermodynamic chemical potential of the gas we guarantee that the terms in Z(T, V, µ) peak at N = \bar{N}. We must now show that Z(T, V, µ) peaks so sharply at N = \bar{N} that we may use eq. (5.28). To do this we use eq. (5.28) first to relate Z(T, V, µ) to thermodynamic functions and then return to verify that eq. (5.28) is indeed valid.
The average number N of particles in the system corresponds to the
thermodynamic average N appearing in the thermodynamic relations of Chapter
2. For example, in eq. (2.18) we have
\Omega = F - \mu \bar{N}.

The thermodynamic definition of \bar{N} can be taken as

\bar{N} = -\left( \frac{\partial \Omega}{\partial \mu} \right)_{T, V}.

Using eq. (5.28) we find

\Omega = F - \mu\bar{N} = -kT \log Z(T, V, \bar{N}) - \mu\bar{N} = -kT \log Z(T, V, \bar{N}) - kT \log e^{\beta\mu\bar{N}}    (5.30)
or

\Omega = -kT \log Z(T, V, \mu).    (5.31)

Thus the grand canonical partition function Z(T, V, µ) is related to the thermodynamic potential Ω in the same form as F is related to Z (F = −kT log Z).
Also since
G = F + pV = \Omega + \mu\bar{N} + pV

and

G = \mu\bar{N}

we have

pV = kT \log Z(T, V, \mu).    (5.32)
With relations (5.31) and (5.32) we can relate Z directly to thermodynamic
properties without using the canonical partition function at all.
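As a numerical illustration of eqs. (5.27), (5.31) and (5.32), we can take the perfect Boltzmann gas, for which Z(T, V, N) = Z_0^N/N! from eq. (5.16), and sum eq. (5.27) directly for small, illustrative values of Z_0 and e^{βµ}. Then kT log Z(T, V, µ) = pV should equal N̄ kT, with N̄ obtained from eq. (5.35).

    import math

    # Toy check of eqs. (5.27), (5.31)-(5.32) for the perfect Boltzmann gas,
    # Z(T,V,N) = Z0^N / N! (eq. 5.16). Z0 and mu below are illustrative choices.
    Z0   = 50.0
    beta = 1.0
    mu   = math.log(0.4) / beta      # chosen so that e^{beta mu} Z0 = 20

    def log_grand_Z(m, N_max=400):
        """log Z(T,V,mu) = log sum_N e^{beta m N} Z0^N / N!, eq. (5.27), summed in log space."""
        logs = [beta * m * N + N * math.log(Z0) - math.lgamma(N + 1) for N in range(N_max)]
        top = max(logs)
        return top + math.log(sum(math.exp(x - top) for x in logs))

    lz = log_grand_Z(mu)
    h  = 1.0e-6
    Nbar = (log_grand_Z(mu + h) - log_grand_Z(mu - h)) / (2.0 * h * beta)   # eq. (5.35)

    # pV = kT log Z (eq. 5.32) should equal N_bar kT for the perfect gas:
    print("log Z =", lz, "  N_bar =", Nbar)   # both close to e^{beta mu} Z0 = 20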
Using the thermodynamic definition of \bar{N} and eq. (5.31) we have

\bar{N} = -\left( \frac{\partial \Omega}{\partial \mu} \right)_{T, V} = \frac{\sum_N \sum_S N\, e^{-\beta(E_S - \mu N)}}{\sum_N \sum_S e^{-\beta(E_S - \mu N)}}    (5.33)
In this relation we may interpret
P_S(N) = \frac{1}{Z(T, V, \mu)}\, e^{-\beta(E_S - \mu N)}
as the probability of observing the system in state ES when it contains N
particles. (We will derive this more rigorously in Chapter 6, and we could then start the argument that follows at that point.) Clearly also
P(N) = \frac{1}{Z(T, V, \mu)} \sum_S e^{-\beta(E_S - \mu N)}    (5.34)
is the probability of observing the system having N particles, irrespective of
what state it is in, and
\bar{N} = \sum_N P(N)\, N = \beta^{-1} \frac{\partial}{\partial \mu} \left( \log Z \right)_{T, V}    (5.35)
What are the fluctuations around the average value \bar{N}? The mean square deviation from \bar{N} is defined as

\Delta N^2 = \overline{(N - \bar{N})^2} = \overline{N^2 - 2N\bar{N} + \bar{N}^2} = \overline{N^2} - (\bar{N})^2

By differentiating eq. (5.35), we find

\beta^{-1} \frac{\partial \bar{N}}{\partial \mu} = \beta^{-2} \frac{\partial^2}{\partial \mu^2} (\log Z) = \overline{(\Delta N)^2}
Since µ and β are both intensive quantities, this result shows that \overline{(\Delta N)^2} \propto \bar{N}. Thus

\frac{\Delta N}{\bar{N}} \propto \frac{1}{\sqrt{\bar{N}}} \to 0 \quad \text{as } \bar{N} \to \infty

For a macroscopically large gas of many particles (∼ 10^{23}) the relative RMS deviation from the average \bar{N} vanishes. This means the term N = \bar{N} dominates the sum in Z(T, V, µ) of eq. (5.27), so that (5.28) holds.
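The sharp peaking of the terms of Z(T, V, µ) at N = N̄, and the 1/√N̄ decay of the relative fluctuation, can be seen in a small sketch using the perfect gas weights P(N) ∝ e^{βµN} Z_0^N/N! of eq. (5.34) (the values of N̄ below are illustrative; only the product e^{βµ} Z_0 = N̄ enters).

    import math

    # Illustrative check that P(N) of eq. (5.34) peaks sharply at N_bar, with
    # Delta N / N_bar ~ 1/sqrt(N_bar), using the perfect gas weights e^{beta mu N} Z0^N / N!.
    def relative_width(Nbar):
        """Relative RMS fluctuation of N for weights w(N) proportional to (e^{beta mu} Z0)^N / N!.
        Only the product e^{beta mu} Z0 = N_bar enters."""
        N_max = int(10 * Nbar + 50)
        logw  = [N * math.log(Nbar) - math.lgamma(N + 1) for N in range(N_max)]
        top   = max(logw)
        w     = [math.exp(x - top) for x in logw]
        norm  = sum(w)
        mean  = sum(N * wN for N, wN in enumerate(w)) / norm
        msq   = sum(N * N * wN for N, wN in enumerate(w)) / norm
        return math.sqrt(msq - mean * mean) / mean

    for Nbar in (10, 100, 1000):
        print(Nbar, relative_width(Nbar), 1.0 / math.sqrt(Nbar))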
We apply the main results of this chapter, eqs. (5.23), (5.26) and (5.31),
to a number of Bose and Fermi systems in Chapters 7 and 8.