The principle of equal a priori probabilities works even when probabilities are not a priori equal

A priori probabilities are those that can be known solely through reasoning. The principle of equal a priori probabilities holds that, absent information to the contrary, every possible outcome can be taken to be equally likely. The principle is especially important in equilibrium statistical mechanics, where it is used to calculate averages over the energy surface for a classical Hamiltonian system without prior knowledge of the exact trajectory in phase space. It is well known that for ergodic systems the principle is valid. What is less widely known is that the principle can work even for non-ergodic systems. Indeed, an alternative justification of the principle comes from the maximum entropy principle. (My personal understanding of the subject was much influenced by Jaynes.)

In this post I wish to draw attention to an issue that many students find somewhat puzzling or even counterintuitive, namely, that the principle of equal a priori probabilities works even when probabilities are a priori unequal! That’s right, unequal!

Consider a roll of the dice. Let us choose the common 6-faced die. If the die is fair, in other words unbiased, then the probability p of rolling, say, a 3 is exactly p(3)=1/6. And indeed, this is exactly the same result as if we use the principle of equal a priori probabilities: there are 6 faces and if the probabilities are equal then each face must have a probability of 1/6 exactly.
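As a quick numerical illustration (a minimal sketch; the function name, roll count, and seed are my own choices, not part of the original argument), one can check by simulation that the relative frequency of 3s for a fair die approaches 1/6:

```python
import random

def fair_die_frequency_of_3(num_rolls=1_000_000, seed=0):
    """Estimate p(3) for a fair six-faced die by straightforward simulation."""
    rng = random.Random(seed)
    hits = sum(1 for _ in range(num_rolls) if rng.randint(1, 6) == 3)
    return hits / num_rolls

print(fair_die_frequency_of_3())  # roughly 0.1667, i.e. close to 1/6
```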

Now consider “loaded dice.” Let us assume that you know a priori that the die is not fair. Specifically, there is a 90% chance of landing on one particular number among 1,2,\ldots,6, so the probabilities of landing on the other 5 numbers add up to 10%. Let us assume that each of these 5 other numbers has a 2% chance.  Clearly, the probabilities p(n) for n=1,2,\ldots,6 are unequal.  Let us call the privileged number with the 90% chance m.  Then, for  n=1,2,\ldots,6,

p(n) = \left\{\begin{array}{ll} 0.9 & n=m \\ 0.02 & n\neq m~. \end{array}  \right.
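For concreteness, here is a minimal Python sketch of this conditional distribution (the helper name p_given_m and the normalization check are mine, not part of the original argument):

```python
# Loaded-die distribution: p(n) = 0.9 if n equals the privileged face m, else 0.02.
def p_given_m(n, m):
    return 0.9 if n == m else 0.02

# Sanity check: for every choice of m, the six probabilities sum to 1 (0.9 + 5 * 0.02).
for m in range(1, 7):
    assert abs(sum(p_given_m(n, m) for n in range(1, 7)) - 1.0) < 1e-12

print(p_given_m(3, 3), p_given_m(3, 5))  # 0.9 0.02
```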

Now let us say that you wish to calculate p(3), but without knowing the value of m.  Obviously, if you knew m then either p(3)=0.9 or else p(3)=0.02, depending on whether or not m=3.  But let us assume that you do not know the value of m. Then what?

In this case, we must consider all possible values of m and do the calculation over the whole sample space.  Since you have no information singling out any of the six possible values of m, each value is assigned an a priori probability of 1/6.  The calculation is then easily done by splitting the expected value of p(3) into two parts, where the first part is the case m=3 and the second part the case m\neq 3:

p(3) = {1\over 6}\left(0.9+  \sum^6_{m\neq 3} (0.02) \right)~.

Evaluating, we get

p(3) = {1\over 6}\left(0.9+  \sum^6_{m\neq 3} (0.02) \right)  = {1\over 6}\left(0.9+  0.1 \right) = {1\over 6}~.
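The same averaging over the unknown m can be checked numerically. The sketch below (Python; it redefines the hypothetical p_given_m helper so that it is self-contained) first performs the exact sum over the six equally weighted values of m, and then repeats it by simulation: draw m uniformly, roll the loaded die, and count how often a 3 appears. Both agree with 1/6 to within sampling error.

```python
import random

def p_given_m(n, m):
    # Loaded-die distribution: 0.9 on the privileged face m, 0.02 on the others.
    return 0.9 if n == m else 0.02

# Exact marginalization over the unknown loaded face:
# p(3) = sum_m (1/6) * p(3 | m), where each m is equally likely a priori.
p3_exact = sum((1.0 / 6.0) * p_given_m(3, m) for m in range(1, 7))
print(p3_exact)  # 0.1666..., i.e. 1/6

# Monte Carlo version: draw m uniformly, then roll the die loaded on m.
def roll_loaded(rng, m):
    faces = list(range(1, 7))
    weights = [p_given_m(n, m) for n in faces]
    return rng.choices(faces, weights=weights, k=1)[0]

rng = random.Random(0)
num_trials = 200_000
hits = 0
for _ in range(num_trials):
    m = rng.randint(1, 6)              # unknown privileged face, equally likely a priori
    hits += (roll_loaded(rng, m) == 3)
print(hits / num_trials)               # roughly 0.167
```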

Notice that you get the same answer as before!  If you do not know m, the probability of rolling a 3 is the same for the loaded die as it is for a fair die.

This simple example illustrates the power of the principle of equal a priori probabilities. Remarkably, it is an excellent heuristic even when the underlying probabilities are not in fact equal!

 
