In this chapter, we will talk more about conditional probabilities. Recall from our earlier discussion that we can think of a conditional probability as the probability of an event given some additional information. So that you do not have to flip back, let me state the fundamental equation for conditional probabilities once more. The conditional probability of the event B, given the event A, satisfies

P(B given A) = P(A and B) / P(A)
and recall that this is different from P(B) only if A and B are dependent events. If A and B are independent, they have no impact on each other's probabilities; in other words, to condition on an independent event is to incorporate irrelevant information.
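Since the book itself contains no code, here is a minimal Python sketch (a hypothetical two-dice example, not one from the text) that checks the defining equation P(B given A) = P(A and B) / P(A) by brute-force enumeration, and shows that conditioning matters precisely when the events are dependent:

```python
from fractions import Fraction
from itertools import product

# Hypothetical example: roll two fair dice; all 36 ordered outcomes equally likely.
omega = set(product(range(1, 7), repeat=2))

def prob(event):
    """P(event) = (number of favorable outcomes) / (total number of outcomes)."""
    return Fraction(len(event), len(omega))

A = {w for w in omega if w[0] == 6}       # event A: the first die shows a 6
B = {w for w in omega if sum(w) == 8}     # event B: the two dice sum to 8

# The fundamental equation: P(B given A) = P(A and B) / P(A)
p_B_given_A = prob(A & B) / prob(A)

# Conditioning on A changes B's probability, so A and B are dependent:
assert p_B_given_A == Fraction(1, 6)      # P(B given A) = 1/6
assert prob(B) == Fraction(5, 36)         # P(B) = 5/36, a different value
```

Using exact `Fraction` arithmetic rather than floats makes the equality checks exact, which is convenient for verifying small probability identities like this one.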
We can, of course, also compute the conditional probability in the other direction: the conditional probability of A given B. As “B and A” is the same event as “A and B”, and P(A and B) can be computed as P(B given A) × P(A), we get the following expression:

P(A given B) = P(B given A) × P(A) / P(B)
a formula known as Bayes' rule (or Bayes' theorem), named after the Reverend Thomas Bayes (1702–1761). Bayes himself did not publish his rule; it appeared only posthumously, after a friend found it among his papers. This little innocent-looking formula ...
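To make Bayes' rule concrete, here is a small Python check (again a hypothetical two-dice setup, not an example from the text) that the rule really does reverse a conditional probability: computing P(A given B) from P(B given A), P(A), and P(B) agrees with computing it directly from the definition.

```python
from fractions import Fraction
from itertools import product

# Hypothetical example: two fair dice, all 36 ordered outcomes equally likely.
omega = set(product(range(1, 7), repeat=2))

def prob(event):
    """P(event) under equally likely outcomes."""
    return Fraction(len(event), len(omega))

A = {w for w in omega if w[0] == 6}       # the first die shows a 6
B = {w for w in omega if sum(w) == 8}     # the two dice sum to 8

p_A = prob(A)
p_B = prob(B)
p_B_given_A = prob(A & B) / p_A           # P(B given A) from the definition

# Bayes' rule: P(A given B) = P(B given A) * P(A) / P(B)
p_A_given_B = p_B_given_A * p_A / p_B

# It agrees with the direct definition P(A given B) = P(A and B) / P(B):
assert p_A_given_B == prob(A & B) / p_B   # both equal 1/5
```

The point of the example is that Bayes' rule lets you obtain P(A given B) without ever enumerating the event "A and B" yourself, provided the three quantities on the right-hand side are known.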