
An ideal gas in a quantum-mechanical microcanonical ensemble

1. What is the physical problem?

We have:

  • N identical, non-interacting particles
  • in a volume V
  • with a fixed total energy E

This is an ideal gas in the microcanonical ensemble (N, V, E all fixed).

The big question in statistical mechanics is:

"If I know only N, V, and E, what is the most likely way the particles are arranged in energy levels?"

Once we know the most likely arrangement, we can get entropy, pressure, etc.

2. Grouping energy levels into "cells"

The real spectrum of single-particle energies is very dense (many closely spaced levels). To make counting easier, they:

  • Group nearby energy levels into cells (think of bins).
  • Cell i has:
      • an average energy \( \varepsilon_i \),
      • \( g_i \) individual levels inside it,
      • \( n_i \) particles sitting in that cell.

So the "macro description" of the gas is basically the list \(\{n_i\}\): how many particles are in each energy cell.

Two constraints must hold:

  1. Total number of particles:
\[ \sum_i n_i = N \quad (1) \]
  2. Total energy:
\[ \sum_i n_i \varepsilon_i = E \quad (2) \]

3. Microstates vs distribution \(\{n_i\}\)

  • A microstate = a specific way each individual particle sits in specific levels.
  • A distribution set \(\{n_i\}\) = just how many are in each cell, ignoring which exact particle is where inside that cell.

For a given distribution \(\{n_i\}\), there are some number of distinct microstates \(W[\{n_i\}]\).

The total number of microstates with energy E is

\[ \Omega(N,V,E) = \sum_{\{n_i\}}' W[\{n_i\}] \quad (3) \]

where the sum runs over all distributions obeying (1) and (2).

Because different cells are independent, the total number of microstates for a given distribution factorizes:

\[ W[\{n_i\}] = \prod_i w(i) \quad (4) \]

where \(w(i)\) is the number of ways to arrange \(n_i\) particles in the \(g_i\) levels of cell i.

So the whole game becomes: find \(w(i)\) for different kinds of particles.
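The constrained sum over distributions in (3) can be made concrete with a tiny toy model. The cell energies, particle number, and total energy below are made-up illustrative values, not from the text; the point is just to enumerate every \(\{n_i\}\) satisfying constraints (1) and (2):

```python
from itertools import product

# Toy model: 3 cells with average energies eps = (0, 1, 2), N = 4 particles,
# total energy E = 4 (in units of the energy spacing). Illustrative values only.
eps = (0, 1, 2)
N, E = 4, 4

# Enumerate every occupation tuple (n_0, n_1, n_2) and keep those obeying
# constraints (1) and (2).
admissible = [n for n in product(range(N + 1), repeat=len(eps))
              if sum(n) == N and sum(ni * e for ni, e in zip(n, eps)) == E]

print(admissible)  # the distributions the primed sum in (3) runs over
```

For these values only three distributions survive the constraints: (0, 4, 0), (1, 2, 1), and (2, 0, 2). The sum in (3) would then add up \(W[\{n_i\}]\) over exactly these three.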

4. Counting ways depending on the statistics

Think of "levels" as boxes and "particles" as balls.

(a) Bose–Einstein (bosons)

  • Bosons can share the same level.
  • Particles are indistinguishable.

Question: In how many ways can we put \(n_i\) identical balls into \(g_i\) boxes, with no limit per box?

Answer (combinatorics):

\[ w_{\text{B.E.}}(i) = \frac{(n_i + g_i - 1)!}{n_i!(g_i - 1)!} \quad (5) \]

So for all cells:

\[ W_{\text{B.E.}}[\{n_i\}] = \prod_i \frac{(n_i + g_i - 1)!}{n_i!(g_i - 1)!} \quad (6) \]
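For small numbers, formula (5) can be checked by brute force: multisets of size \(n_i\) drawn from \(g_i\) levels are exactly what `combinations_with_replacement` enumerates. A minimal sketch with toy values for \(n_i\) and \(g_i\):

```python
from itertools import combinations_with_replacement
from math import comb

def w_be_brute(n, g):
    # Count multisets of n identical particles over g levels
    # (any number per level allowed).
    return sum(1 for _ in combinations_with_replacement(range(g), n))

n_i, g_i = 3, 4  # toy values
# Brute-force count matches the stars-and-bars formula (5):
assert w_be_brute(n_i, g_i) == comb(n_i + g_i - 1, n_i)
```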

(b) Fermi–Dirac (fermions)

  • Fermions obey the Pauli principle: at most one particle per level.
  • Particles are still indistinguishable.

To place \(n_i\) fermions into \(g_i\) levels, each level can be 0 or 1 occupied. We just choose which \(n_i\) levels are occupied:

\[ w_{\text{F.D.}}(i) = \frac{g_i!}{n_i!(g_i - n_i)!} \quad (7) \]

So:

\[ W_{\text{F.D.}}[\{n_i\}] = \prod_i \frac{g_i!}{n_i!(g_i - n_i)!} \quad (8) \]
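Formula (7) can be checked the same way: enumerate all 0/1 occupation patterns of the \(g_i\) levels and keep those with exactly \(n_i\) particles. Toy values again:

```python
from itertools import product
from math import comb

def w_fd_brute(n, g):
    # Each of the g levels holds 0 or 1 particle (Pauli principle);
    # count the occupation patterns with exactly n particles.
    return sum(1 for occ in product((0, 1), repeat=g) if sum(occ) == n)

n_i, g_i = 2, 5  # toy values
# Brute-force count matches the binomial coefficient (7):
assert w_fd_brute(n_i, g_i) == comb(g_i, n_i)
```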

(c) Maxwell–Boltzmann (classical particles)

Here:

  • Particles are effectively distinguishable (classical limit).
  • Any number of particles can occupy any level.
  • But to avoid overcounting, we divide by a Gibbs factor.

First: if particles were distinguishable with no correction, each of the \(n_i\) particles could choose any of the \(g_i\) levels:

\[ (\text{number of states}) = (g_i)^{n_i} \]

But permuting the \(n_i\) identical particles among their levels does not produce a physically distinct state, so this overcounts, and we divide by \(n_i!\) (the Gibbs correction):

\[ w_{\text{M.B.}}(i) = \frac{(g_i)^{n_i}}{n_i!} \quad (11) \]

and

\[ W_{\text{M.B.}}[\{n_i\}] = \prod_i \frac{(g_i)^{n_i}}{n_i!} \]
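A quick sketch of (11), with toy values. Note that \(w_{\text{M.B.}}\) is a statistical weight rather than a literal count (it need not be an integer); the second check below illustrates numerically that for \(g_i \gg n_i\) the Fermi–Dirac count (7) approaches the Maxwell–Boltzmann weight, i.e. the classical expression is the dilute limit:

```python
from itertools import product
from math import comb, factorial

def w_mb(n, g):
    # Count assignments of n labeled (distinguishable) particles to g levels,
    # then apply the Gibbs 1/n! correction; the result need not be an integer.
    assignments = sum(1 for _ in product(range(g), repeat=n))  # equals g**n
    return assignments / factorial(n)

n_i, g_i = 3, 4  # toy values
assert w_mb(n_i, g_i) == g_i**n_i / factorial(n_i)

# For g >> n the Fermi-Dirac count (7) approaches the MB weight (11):
g_big = 10**6
assert abs(comb(g_big, n_i) / (g_big**n_i / factorial(n_i)) - 1) < 1e-5
```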

5. Entropy and "most probable" distribution

Entropy in the microcanonical ensemble is

\[ S(N,V,E) = k \ln \Omega(N,V,E) \quad (12) \]

The sum over all distributions in \(\Omega\) is usually dominated by one overwhelmingly most probable distribution \(\{n_i^*\}\). Then we can approximate:

\[ S \approx k \ln W[\{n_i^*\}] \quad (13) \]

So next step: find the distribution \(\{n_i^*\}\) that maximizes \(W\) (or equivalently \(\ln W\)), with constraints:

  • \(\sum_i n_i = N\)
  • \(\sum_i n_i \varepsilon_i = E\)

6. Maximizing with Lagrange multipliers

They maximize

\[ \ln W[\{n_i\}] - \alpha \sum_i n_i - \beta \sum_i \varepsilon_i n_i \]

where \(\alpha\) and \(\beta\) are Lagrange multipliers enforcing N and E.

The condition for a maximum is

\[ \delta \ln W[\{n_i\}] - \left[ \alpha \sum_i \delta n_i + \beta \sum_i \varepsilon_i \delta n_i \right] = 0 \quad (14) \]

Using Stirling's approximation \(\ln x! \approx x\ln x - x\) and plugging in the formulas for \(W\), they get a unified form

\[ \ln W[\{n_i\}] \approx \sum_i \left[ n_i \ln\left( \frac{g_i}{n_i} - a \right) - \frac{g_i}{a} \ln\left( 1 - a \frac{n_i}{g_i} \right) \right] \quad (15) \]

where the parameter \(a\) encodes the statistics:

  • \(a = -1\) → Bose–Einstein
  • \(a = +1\) → Fermi–Dirac
  • \(a = 0\) → Maxwell–Boltzmann (limit of that expression)
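The unified form (15) can be sanity-checked numerically against the exact per-cell counts (5) and (7), using log-Gamma to avoid huge factorials. The cell sizes below are toy values chosen so Stirling's approximation is good; the \(a = 0\) case is a limit of (15), so only \(a = \pm 1\) are compared:

```python
from math import lgamma, log

def ln_w_exact(n, g, a):
    # Exact per-cell counts: (7) for fermions (a = +1), (5) for bosons (a = -1).
    if a == +1:
        return lgamma(g + 1) - lgamma(n + 1) - lgamma(g - n + 1)
    if a == -1:
        return lgamma(n + g) - lgamma(n + 1) - lgamma(g)
    raise ValueError("a = 0 is a limit of (15), not handled here")

def ln_w_unified(n, g, a):
    # Equation (15) for a single cell.
    return n * log(g / n - a) - (g / a) * log(1 - a * n / g)

n, g = 10_000, 40_000  # toy values: many particles, many levels per cell
for a in (+1, -1):
    exact, approx = ln_w_exact(n, g, a), ln_w_unified(n, g, a)
    assert abs(exact - approx) / exact < 1e-3  # Stirling error is tiny here
```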

Taking the variation and using that the \(\delta n_i\) are arbitrary, they get

\[ \ln\left(\frac{g_i}{n_i^*} - a\right) - \alpha - \beta \varepsilon_i = 0 \quad (17) \]

Solving this for \(n_i^*\):

\[ n_i^* = \frac{g_i}{e^{\alpha + \beta \varepsilon_i} + a} \quad (18) \]

This is the central result: the most probable number of particles in cell i.

It is more revealing to look at the mean occupancy per level in that cell:

\[ \frac{n_i^*}{g_i} = \frac{1}{e^{\alpha + \beta \varepsilon_i} + a} \quad (18a) \]

This becomes the familiar distributions:

  • Bosons (\(a = -1\)): \(\displaystyle \frac{1}{e^{\alpha + \beta \varepsilon_i} - 1}\)
  • Fermions (\(a = +1\)): \(\displaystyle \frac{1}{e^{\alpha + \beta \varepsilon_i} + 1}\)
  • Classical (\(a = 0\)): \(\displaystyle e^{-(\alpha + \beta \varepsilon_i)}\)

So all three cases come out of one formula.
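A minimal sketch of (18a), with toy values for \(\alpha\), \(\beta\), and \(\varepsilon_i\) (not from the text). It also checks the expected ordering: at the same \((\alpha, \beta, \varepsilon_i)\), bosons bunch up most, fermions least, with the classical value in between:

```python
from math import exp

def mean_occupancy(eps, alpha, beta, a):
    # Equation (18a): most probable occupancy per level in a cell.
    if a == 0:  # Maxwell-Boltzmann, the a -> 0 limit
        return exp(-(alpha + beta * eps))
    return 1.0 / (exp(alpha + beta * eps) + a)

# Toy values (illustrative only):
alpha, beta, eps = 1.0, 1.0, 2.0
be, fd, mb = (mean_occupancy(eps, alpha, beta, a) for a in (-1, +1, 0))

# Bose-Einstein > Maxwell-Boltzmann > Fermi-Dirac at equal parameters:
assert be > mb > fd
```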

7. Entropy and pressure

Now we plug the equilibrium distribution \(n_i^*\) back into the expression for \(S\). After some algebra they get an expression for \(S/k\) involving three sums; the first two are simply \(\alpha N\) and \(\beta E\), leaving

\[ \frac{1}{a}\sum_i g_i \ln\left[ 1 + a e^{-\alpha - \beta \varepsilon_i} \right] = \frac{S}{k} - \alpha N - \beta E \quad (20) \]

Next they interpret the multipliers thermodynamically:

\[ \alpha = -\frac{\mu}{kT}, \qquad \beta = \frac{1}{kT} \quad (21) \]

where \(T\) is temperature and \(\mu\) is chemical potential.

Using the thermodynamic identities \(G = \mu N\) and \(G = E - TS + PV\), they show that the right-hand side of (20) equals \(PV/kT\):

\[ \frac{S}{k} + \frac{\mu N}{kT} - \frac{E}{kT} = \frac{G - (E - TS)}{kT} = \frac{PV}{kT} \quad (22) \]

Therefore, the gas pressure obeys

\[ PV = \frac{kT}{a} \sum_i g_i \ln\left[ 1 + a e^{-\alpha - \beta \varepsilon_i} \right] \quad (23) \]

This is the general equation of state for an ideal quantum gas (bosons or fermions).

8. Classical (Maxwell–Boltzmann) limit

If we take \(a \to 0\) (classical limit: low density or high temperature), the logarithm can be expanded to first order in \(a\), and (23) becomes

\[ PV = kT \sum_i g_i e^{-\alpha - \beta \varepsilon_i} = kT \sum_i n_i^* = NkT \quad (24) \]

So we recover the familiar ideal gas law:

\[ PV = NkT \]
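The classical limit can be checked numerically: evaluate (23) for a very small \(a\) and compare it with \(NkT\), where \(N\) is computed from the Maxwell–Boltzmann occupancies. The spectrum and parameter values below are toy choices for illustration only:

```python
from math import exp, log

# Toy single-particle spectrum (illustrative values, not from the text):
eps = [0.1 * i for i in range(50)]
g = [1.0] * len(eps)
alpha, kT = 2.0, 1.0
beta = 1.0 / kT

# Classical particle number: N = sum_i g_i e^{-alpha - beta eps_i}, as in (24).
N = sum(gi * exp(-alpha - beta * e) for gi, e in zip(g, eps))

def pv(a):
    # Equation (23); letting a -> 0 should recover PV = NkT.
    return (kT / a) * sum(gi * log(1 + a * exp(-alpha - beta * e))
                          for gi, e in zip(g, eps))

# For tiny a, (23) agrees with the ideal gas law to high accuracy:
assert abs(pv(1e-6) - N * kT) / (N * kT) < 1e-5
```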

In summary

These pages:

  1. Count how many ways particles can be placed in energy levels (bosons, fermions, classical).
  2. Find the most probable distribution of particles over energy levels using combinatorics + Lagrange multipliers.
  3. Show that this gives the Bose–Einstein, Fermi–Dirac, and Maxwell–Boltzmann distributions as special cases.
  4. Use that distribution to derive entropy and pressure, and show that in the classical limit the gas obeys \(PV = NkT\).