© The scientific sentence. 2010

Statistical Mechanics
Maxwell-Boltzmann distribution
1. Introduction
In the real microscopic world, no particle exists alone in space. The Dirac
relativistic equation or Schrodinger's equation informs us about the behavior of a single electron in an atom.
But when we try to do the same with many electrons, the problem becomes very complicated.
The kinetic theory of gases tells us that the internal energy of a gas is a
function of the temperature (more precisely U = (3/2) NkT, derived from the calculation of the mean
square of the particles' speed) and involves only macroscopic quantities. But what about the energy
of each particle in the gas? The only convenient way is to take the system as a whole,
as an ensemble, and study this system statistically in depth, so as to understand the
structure of matter by knowing how the energy is distributed among the particles. That is the
purpose of Statistical Mechanics.
What, then, do "microscopic" and "macroscopic" stand for?
When one specifies in depth all the parameters of each particle of a system, we have
a microstate of this system. Whereas, when only a "global" situation
of the system is considered, a kind of "snapshot" of the situation, we have
a macrostate of this system.
2. Macrostates & Microstates
Let's consider the following example:
We have 3 particles and 2 energy levels. If the particles are indistinguishable,
we then have 4 configurations.
If the particles are distinguishable, we have 8 configurations.
The number of macrostates is 4:
Ω(a), Ω(b), Ω(c), Ω(d).
The number of microstates is 8:
1 for Ω(a) and for Ω(b), and 3 for Ω(c) and Ω(d).
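These counts can be checked by brute-force enumeration. The sketch below labels the two energy levels 0 and 1 (an arbitrary labeling, not from the text) and lists every arrangement of three distinguishable particles:

```python
from itertools import product
from collections import Counter

# Three distinguishable particles, each occupying one of two energy levels.
# A microstate records which level each particle sits in; a macrostate
# records only how many particles occupy each level.
microstates = list(product([0, 1], repeat=3))
macrostates = Counter(tuple(sorted(m)) for m in microstates)

print(len(microstates))              # 8 microstates (2^3)
print(len(macrostates))              # 4 macrostates
print(sorted(macrostates.values()))  # multiplicities [1, 1, 3, 3]
```

The multiplicities 1, 1, 3, 3 are exactly the microstate counts of the four macrostates above.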
Remark
In the case of indistinguishable particles, every macrostate has exactly one microstate. Furthermore:
Ω(a) = Ω(b) = Ω(c) = Ω(d) = 1. In other words, each macrostate is its own
microstate.
Now, let's fix the energy of this system of three particles to be E. The particles can occupy
the two energy levels i (i = 1, 2), with n_{i} particles in level i.
If E_{i} (i = 1, 2) is the energy associated with level i, then:
E = Σ n_{i} E_{i} [i = 1, 2].
As the total number of particles is already fixed at N (here N = 3), if the number of
levels is n, we have:
E = Σ n_{i}E_{i}
N = Σ n_{i}
[i: 1 → n]
Now, having n_{i} particles in each level "i" determines the macrostate of
the system; how many microstates do we have for this macrostate?
If n_{1} is the number of particles that lodge in level "1", the number of ways
to have this occupation is the number of combinations of n_{1} particles
among N particles; that is:
C_{1} = C(n_{1}, N) = N!/[n_{1}!(N − n_{1})!]
Once this case is set, there remain (N − n_{1}) particles to consider.
If n_{2} is the number of particles that lodge in level "2", the number of ways
to have this occupation is the number of combinations of n_{2} particles
among (N − n_{1}) particles; that is:
C_{2} = C(n_{2}, N − n_{1}) = (N − n_{1})!/[n_{2}!(N − n_{1} − n_{2})!]
...
Finally, if the last level contains n_{n} = N − n_{1} − n_{2} − ... − n_{n−1} particles, then
the number of ways to distribute them in the last level "n" is C_{n} = C(n_{n}, n_{n}) = 1.
In total, the number of ways is the product of all the combinations. That is:
C_{1} × C_{2} × ... × C_{n} = ΠC_{i} [i: 1 → n]
= N!/[n_{1}!(N − n_{1})!] × (N − n_{1})!/[n_{2}!(N − n_{1} − n_{2})!] × ... × 1
= N!/[n_{1}! × n_{2}! × ... × n_{n}!] = N!/Πn_{i}! [i: 1 → n]
The number of ways to distribute N particles over
n levels containing each n_{i} particles is:
Ω = N!/Πn_{i}! [i: 1 → n]
This result is related to one configuration, that is, one set of occupation numbers
(n_{1}, ..., n_{n}). The occupation numbers themselves can take any values compatible with
the constraints, and the number of distinct configurations is:
(N + n − 1)!/[N!(n − 1)!]
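Both formulas can be tried on the 3-particle, 2-level example from the previous section (a minimal sketch; the helper name `omega` is ours, not from the text):

```python
from math import factorial, comb

def omega(occupations):
    """Microstates of one macrostate: N!/prod(n_i!) (no degeneracy)."""
    N = sum(occupations)
    w = factorial(N)
    for n in occupations:
        w //= factorial(n)
    return w

# N = 3 particles over n = 2 levels, as in the example above.
print(omega((2, 1)))            # 3 ways to choose which particle sits alone
print(comb(3 + 2 - 1, 2 - 1))   # 4 distinct configurations (macrostates)
```

The second value reproduces the 4 macrostates counted earlier, and the multiplicities 1 and 3 come back from `omega((3, 0))` and `omega((2, 1))`.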
For instance, there are 3^{2} = 9 ways to place two distinguishable
particles in three sublevels.
In the case where a level "i" has a degeneracy g_{i}, that is, g_{i}
sublevels, we can place its n_{i} particles within it in g_{i}^{n_{i}} ways. Thus:
The number of ways to distribute N distinguishable particles over
n levels containing each n_{i} particles, with a degeneracy g_{i} for
the level "i", is:
Ω = N! Π(g_{i}^{n_{i}}/n_{i}!) [i: 1 → n]
The configuration (n_{1}, n_{2}, ..., n_{i}, ..., n_{n}) represents a macrostate
which has Ω = N! Π(g_{i}^{n_{i}}/n_{i}!) microstates.
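The degenerate-level formula can also be verified by enumeration. In this sketch we pick two levels with made-up degeneracies g = (3, 2) and two distinguishable particles, list every assignment of particles to sublevels, and compare the counts with Ω:

```python
from math import factorial
from itertools import product
from collections import Counter

def omega_deg(occupations, degeneracies):
    """Ω = N! · prod(g_i^{n_i}/n_i!) for distinguishable particles."""
    num = factorial(sum(occupations))
    den = 1
    for n, g in zip(occupations, degeneracies):
        num *= g**n
        den *= factorial(n)
    return num // den

# Brute force: 2 particles, each choosing one of the 3 + 2 = 5 sublevels.
g = (3, 2)
sublevels = [(lvl, s) for lvl, gi in enumerate(g) for s in range(gi)]
counts = Counter()
for assignment in product(sublevels, repeat=2):
    occ = tuple(sum(1 for (lvl, _) in assignment if lvl == i) for i in range(2))
    counts[occ] += 1

for occ, n_micro in sorted(counts.items()):
    assert n_micro == omega_deg(occ, g)   # formula matches the enumeration
print(counts[(1, 1)])  # 2! · (3^1/1!) · (2^1/1!) = 12
```

The 5² = 25 assignments split as 9, 12, and 4 over the macrostates (2, 0), (1, 1), and (0, 2), exactly as the formula predicts.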
3. Maxwell-Boltzmann distribution
Let's suppose that we toss two dimes. The outcomes can be:
(heads, heads); (tails, tails); (heads, tails) or (tails, heads), because we distinguish the coins. The probability of
each outcome is 1/4. If instead we record only the colors of two balls, each red or yellow, the order being
indistinguishable, we have the cases: (red, red);
(yellow, yellow); (red, yellow) = (yellow, red), with probabilities 1/4, 1/4, and 2/4. We say that the last
case is the most probable.
We are interested in the most probable case. To find it from the expression of Ω, we differentiate
Ω with respect to the occupation numbers n_{i} and set the derivative to zero in order to find the extremum that gives the most
probable configuration.
From the expression of Ω itself, we cannot go further directly. Taking its natural logarithm "ln" simplifies
matters greatly.
Ω = N!Π (g_{i}^{n_{i}}/n_{i}!), thus:
ln Ω = ln N! + Σ [n_{i} ln g_{i} − ln n_{i}!]
Using Stirling's approximation
ln x! ≈ x ln x − x, we have:
ln Ω = N ln N − N + Σ [n_{i} ln g_{i} − n_{i} ln n_{i} + n_{i}]
= N ln N + Σ n_{i}[ln g_{i} − ln n_{i}] (since Σ n_{i} = N, the −N and +Σ n_{i} terms cancel)
= N ln N + Σ n_{i} ln(g_{i}/n_{i})
Thus:
∂[ln Ω]/∂n_{i} = ln(g_{i}/n_{i}) − 1 = ln g_{i} − ln n_{i} − 1
Stirling's approximation proof:
ln N! = Σ ln x [x: 1 → N] ≈ ∫ ln x dx [x: 1 → N]
= [x ln x − x] [x: 1 → N] = N ln N − N + 1.
Neglecting the 1, because N is large, we can write: ln N! ≈ N ln N − N.
Example: For N = 100, we have ln N! ≈ 364 while N ln N − N ≈ 360: an error of about 1%!
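The quality of the approximation is easy to check numerically; `math.lgamma(N + 1)` gives the exact ln(N!):

```python
import math

# Compare the exact ln(N!) with Stirling's estimate N ln N - N.
# For N = 100: exact ≈ 363.7, Stirling ≈ 360.5 (an error of about 1%).
for N in (10, 100, 1000):
    exact = math.lgamma(N + 1)           # ln(N!), computed exactly
    stirling = N * math.log(N) - N
    print(N, f"{exact:.1f}", f"{stirling:.1f}")
```

The relative error shrinks as N grows, which is why the approximation is safe for the huge particle numbers of statistical mechanics.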
We have the following constraints:
N = Σ n_{i}
E = Σ n_{i}E_{i}
Or:
N − Σ n_{i} = 0
E − Σ n_{i}E_{i} = 0
If ∂ln Ω/∂n_{i} = 0, a linear combination with the
constraints is also zero (the method of constants called Lagrange multipliers).
Let's then define a new function F as follows:
F(n_{i}) = ln Ω − λ(N − Σ n_{i}) + β(E − Σ n_{i}E_{i})
= N ln N + Σ n_{i} ln g_{i} − Σ n_{i} ln n_{i} − λ(N − Σ n_{i}) + β(E − Σ n_{i}E_{i})
To find its extrema, we set its derivative to zero:
∂F(n_{i})/∂n_{i} = 0
That is:
ln g_{i} − ln n_{i} − 1 + λ − β E_{i} = 0
Let's set λ − 1 = α; then:
ln (g_{i}/n_{i}) + α − β E_{i} = 0.
Or: ln (g_{i}/n_{i}) = −α + β E_{i}
Or: ln (n_{i}/g_{i}) = α − β E_{i}
Thus:
n_{i} = g_{i} exp[α − β E_{i}];
and
N = Σ n_{i} = Σ g_{i} exp[α − β E_{i}]
= exp[α] Σ g_{i} exp[−β E_{i}], which gives:
exp[α] = N/Σ g_{i} exp[−β E_{i}] = N/Z
Where:
Z = Σ g_{i} exp[−β E_{i}] is called the partition function. It follows that:
Maxwell-Boltzmann distribution:
n_{i} = N g_{i} exp(−βE_{i})/Σ g_{i} exp(−βE_{i})
= N g_{i} exp(−βE_{i})/Z
Without degeneracy (g_{i} = 1), we have:
n_{i} = N exp(−βE_{i})/Σ exp(−βE_{i})
= N exp(−βE_{i})/Z
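As a closing sketch, the distribution is straightforward to evaluate numerically. The energies, degeneracies, N, and β below are made-up illustrative values, and the helper name `mb_occupations` is ours:

```python
import math

def mb_occupations(N, energies, degeneracies, beta):
    """Most probable occupations: n_i = N g_i exp(-beta E_i) / Z."""
    weights = [g * math.exp(-beta * E) for E, g in zip(energies, degeneracies)]
    Z = sum(weights)                      # the partition function
    return [N * w / Z for w in weights]

# 1000 particles over three non-degenerate levels at beta = 1.
n = mb_occupations(1000, energies=[0.0, 1.0, 2.0],
                   degeneracies=[1, 1, 1], beta=1.0)
print([round(x, 1) for x in n])
assert abs(sum(n) - 1000) < 1e-9          # occupations always sum to N
```

Note how the populations decrease with energy: the lowest level is always the most occupied, and raising β (lowering the temperature) concentrates the particles further into it.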
