There are many ways to arrive at the Gibbs entropy formula

$$ S = -k_B \sum_i p_i \ln p_i, $$

which is identical, apart from units, to the Shannon entropy formula. Here I document a relatively straightforward derivation of the formula, which might be the easiest route to develop intuition for undergraduate students who already know the formula for the Boltzmann entropy,
$$ S = k_B \ln \Omega, $$

where $\Omega$ is the effective number of configurations of an isolated system with total energy $E$. This derivation is not new, of course, but I decided to write it up anyway because it is a particularly elegant argument and is not always given emphasis in the usual textbooks.
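For intuition, here is a standard textbook illustration of the Boltzmann formula (my addition, not part of the original argument): if an isolated system of $N$ independent two-state units has all $2^N$ configurations available at its energy, then

$$ S = k_B \ln \Omega = k_B \ln 2^N = N k_B \ln 2, $$

so the entropy is extensive, growing linearly with the system size $N$.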

Let us start by considering some general statistical ensemble such that a state $i$ has energy $E_i$ and probability $p_i$, which is the same for all states with the same energy $E_i$. For fixed energy we can assume, from the principle of equal *a priori* probabilities, which is valid in the microcanonical ensemble, that

$$ p_i = \frac{1}{\Omega(E_i)}. $$
Hence we obtain for the Boltzmann entropy of state $i$

$$ S_i = k_B \ln \Omega(E_i) = -k_B \ln p_i. $$
For our general ensemble, the mean entropy is given by the weighted average, over all possible states, of the Boltzmann entropy, according to

$$ S = \sum_i p_i S_i = \sum_i p_i \left( -k_B \ln p_i \right), $$
from which we obtain the famous Gibbs formula for the Boltzmann-Gibbs entropy:

$$ S = -k_B \sum_i p_i \ln p_i. $$
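As a quick numerical sanity check of the argument (my own sketch, not part of the original post; the function name `gibbs_entropy` is mine), one can verify that for $\Omega$ equiprobable states the Gibbs formula collapses back to the Boltzmann entropy $k_B \ln \Omega$:

```python
import math

k_B = 1.380649e-23  # Boltzmann constant in J/K (exact, by SI definition)

def gibbs_entropy(probs):
    """Gibbs entropy S = -k_B * sum_i p_i * ln(p_i); states with p_i = 0 contribute nothing."""
    return -k_B * sum(p * math.log(p) for p in probs if p > 0)

# For Omega equiprobable states, p_i = 1/Omega, so the Gibbs formula
# should reduce to the Boltzmann entropy k_B * ln(Omega).
Omega = 1000
uniform = [1.0 / Omega] * Omega
print(math.isclose(gibbs_entropy(uniform), k_B * math.log(Omega)))  # True
```

A completely certain state (one $p_i = 1$, the rest zero) correctly gives zero entropy, the other limiting case of the formula.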

I agree with what you say, Gandhi: it is elegant and is not always given emphasis in the usual textbooks. Not to mention how important it is!

Cheers

Thank you very much, Marcelo! 🙂

I don't understand why $p_i$ is the same when you change the ensemble (microcanonical → general)?

In general it is not the same. But for fixed energy, assuming thermal equilibrium, it is reasonable to assume equiprobability.