# Math Insight

### Bayes' Theorem

Math 2241, Spring 2022
Name:
ID #:
Due date: April 6, 2022, 11:59 p.m.
Table/group #:
Group members:
Total points: 1
1. Bayes' theorem simply expresses a relationship between conditional probabilities. If $A$ and $B$ are two events, then the formulas for the conditional probabilities are:
$P(A\,|\,B) =$

$P(B\,|\,A) =$

Both of those equations depend on $P(A,B)$, the probability of $A$ and $B$. (You can write this probability as $P(A,B)$ or $P(B,A)$; the order of $A$ and $B$ doesn't matter here.) Solve them both for $P(A,B)$, obtaining two expressions for the same quantity $P(A,B)$.
$P(A,B) =$

$P(A,B) =$

Now, set those two expressions for $P(A,B)$ equal to each other.

Bayes' Theorem is the result of solving this equation for $P(A\,|\,B)$. It gives an expression for the conditional probability $P(A\,|\,B)$ in terms of $P(B\,|\,A)$, the conditional probability with the events reversed.

Enter Bayes' Theorem.
$P(A \,|\, B) =$

The content of Bayes' Theorem is really no different from what we've already done when calculating conditional probabilities. The power of Bayes' Theorem lies in how we can interpret its components in terms of probabilistic inference. It can be thought of as capturing how additional information (observation of the event $B$) modifies what we can infer about the likelihood of another event $A$.

Before observing the event $B$, our estimate of the likelihood of $A$ is $P(A)$. After observing $B$, the probability of $A$ becomes the conditional probability $P(A\,|\,B)$. Bayes' Theorem captures how the estimate of the probability of $A$ changes based on the likelihood $P(B)$ of the event $B$ as well as information $P(B\,|\,A)$ of how $B$ depends on $A$.

In Bayesian inference, the original probability $P(A)$ is called the prior probability (or simply, the prior). The new probability for $A$ that we obtain after observing the event $B$, i.e., $P(A\,|\,B)$, is called the posterior probability.
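As a quick numerical sanity check of the relationship derived above, the following Python sketch uses a purely hypothetical joint distribution (the numbers here are illustrative, not from the worksheet) to confirm that the two expressions for $P(A,B)$ agree and that Bayes' Theorem recovers $P(A\,|\,B)$:

```python
# Hypothetical probabilities for illustration only
p_a_and_b = 0.12   # P(A, B), the joint probability
p_a = 0.3          # P(A)
p_b = 0.4          # P(B)

# Conditional probabilities from their definitions
p_a_given_b = p_a_and_b / p_b   # P(A|B) = P(A,B) / P(B)
p_b_given_a = p_a_and_b / p_a   # P(B|A) = P(A,B) / P(A)

# Both products recover the same joint probability P(A,B)
assert abs(p_a_given_b * p_b - p_b_given_a * p_a) < 1e-12

# Bayes' Theorem: P(A|B) = P(B|A) P(A) / P(B)
bayes = p_b_given_a * p_a / p_b
print(round(bayes, 4))  # 0.3, matching P(A|B) computed directly
```

The same check works for any valid joint distribution, since both routes are just rearrangements of the definition of conditional probability.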

2. Let's return to the situation of randomly picking an object from a collection of 1 white square, 2 black squares, 3 white triangles, and 4 black triangles. We'll recast our results in terms of Bayes' Theorem.

Let $S$ be the event of picking a square, $T$ be the event of picking a triangle, $W$ the event of picking a white object, and $B$ the event of picking a black object.

Write Bayes' Theorem for a new probability $P(W\,|\,S)$ of picking a white object, given that a friend told you the object was a square. Your answer should involve mathematical expressions, like $P(S)$ and $P(S\,|\,W)$.
$P(W\,|\,S)=$

Since we have already calculated these probabilities, we can quickly verify that Bayes' Theorem works for this case.

With no additional information, what is the probability of a white object? $P(W) =$

What is the probability of picking a square? $P(S)=$

What is the probability of picking a square if you knew the object was white? $P(S\,|\,W)=$

Plug these numbers into Bayes' theorem to see that you got the same answer as before.
$P(W\,|\,S) =$ ＿ $\times$ ＿ $/$ ＿ $=$ ＿
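The counts given in the problem make this easy to check by machine. A short Python sketch (variable names are mine) computes the three probabilities from the object counts and plugs them into Bayes' Theorem:

```python
from fractions import Fraction

# Counts of the ten objects in the collection
white_squares, black_squares = 1, 2
white_triangles, black_triangles = 3, 4
total = white_squares + black_squares + white_triangles + black_triangles  # 10

p_w = Fraction(white_squares + white_triangles, total)  # P(W): any white object
p_s = Fraction(white_squares + black_squares, total)    # P(S): any square
p_s_given_w = Fraction(white_squares, white_squares + white_triangles)  # P(S|W)

# Bayes' Theorem: P(W|S) = P(S|W) P(W) / P(S)
p_w_given_s = p_s_given_w * p_w / p_s
print(p_w_given_s)  # 1/3, matching the direct count: 1 white square among 3 squares
```

Using `Fraction` keeps the arithmetic exact, so the result matches the hand calculation with no rounding.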

3. Bayes' Theorem is the natural tool to use when some conditional probabilities are known but you are interested in the opposite conditional probabilities.

For example, consider the card game of chance introduced earlier. This game is played with four decks of cards, labeled X, A, B, and C. To play this game, a player begins by picking a card from deck X, looking to see if the card is an A, B, or C. The player then picks a card from the deck labeled by the chosen first card. This second card will be either a W (for win) or an L (for lose).

Let the event $A$ indicate an A was chosen from deck X, event $B$ indicate a B was chosen from deck X, and event $C$ indicate a C was chosen from deck X. Let event $W$ indicate a win and event $L$ indicate a loss.

1. For a particular set of decks, you observe that a player wins 30% of the time. Moreover, you observe that, when a player wins, 90% of the time the winning card is drawn from deck A (meaning an A was picked from deck X). On the other hand, when a player loses, 40% of the time the losing card is drawn from deck A (meaning an A was picked from deck X).

Since picking an A from deck X is so much more likely to occur during a win than it is to occur during a loss, you wonder if picking an A from deck X is a good indication that a player will win. In particular, you wonder if picking A indicates that one has a larger than 50% chance of winning.

What conditional probability are you interested in estimating?

What three probabilities are you given in the statement of the problem?

$= 0.3$

$= 0.9$

$= 0.4$

Write Bayes' Theorem for the probability that you want to estimate.
$＿=$

Of the three probabilities needed for Bayes' Theorem, you already have values for two of them. Which of these probabilities is not yet specified?
In many cases, this probability from the denominator of Bayes' Theorem can be the trickiest to obtain. Recall how, when calculating conditional probabilities, we would often divide by a row or column sum of a contingency table. The denominator of Bayes' theorem plays the same role as this row or column sum.

The information to calculate the denominator $＿$ is already specified in the problem statement. It just takes a few steps to calculate.

First, what is the probability of losing the game? $P(L)=$

Second, from $P(A\,|\,W)$ and $P(W)$, calculate $P(A,W)$. From $P(A\,|\,L)$ and $P(L)$, calculate $P(A,L)$. Enter these values into this column of a contingency table.

|       | A |
|-------|---|
| W     |   |
| L     |   |
| Total |   |

Third, the total of the column is the overall probability of picking an $A$ from deck X: $P(A)=P(A,W)+P(A,L)=$ ＿. This is the value for the denominator of Bayes' Theorem.

Now, you can fill in all the numbers for Bayes' Theorem.
$P(W\,|\,A) =$ ＿ $\times$ ＿ $/$ ＿ $=$ ＿

It turns out that picking A from deck X ＿ indicate that one has at least a 50% chance of winning.
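The chain of steps above (complement, joint probabilities, column total, then Bayes' Theorem) can be checked with a short Python sketch using the observed values from the problem statement:

```python
# Observed values from the problem statement
p_w = 0.3          # P(W): a player wins 30% of the time
p_a_given_w = 0.9  # P(A|W): winning cards drawn from deck A
p_a_given_l = 0.4  # P(A|L): losing cards drawn from deck A

p_l = 1 - p_w                  # P(L), the complement of winning
p_a_and_w = p_a_given_w * p_w  # P(A,W) = P(A|W) P(W)
p_a_and_l = p_a_given_l * p_l  # P(A,L) = P(A|L) P(L)
p_a = p_a_and_w + p_a_and_l    # P(A): the column total, Bayes' denominator

# Bayes' Theorem: P(W|A) = P(A|W) P(W) / P(A)
p_w_given_a = p_a_given_w * p_w / p_a
print(round(p_w_given_a, 4))  # 0.4909
```

Note that even though an A is drawn far more often during wins than losses, the conditional probability of winning given an A lands just under 50%, because losses are more common overall.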

2. For a different set of the four decks, you observe the probabilities for each separate deck. For deck X, you observe that $P(A)=0.7$, $P(B)=0.1$, and $P(C)=0.2$. For the three lettered decks, you observe that $P(W\,|\,A) = 0.1$, $P(L\,|\,A) = 0.9$, $P(W\,|\,B) = 0.5$, $P(L\,|\,B) = 0.5$, $P(W\,|\,C) = 0.7$, and $P(L\,|\,C) = 0.3$.

Let's say you wanted to calculate the conditional probability $P(B\,|\,W)$, the probability, given that a winning card was drawn, that the win was obtained from deck B. Write down Bayes' Theorem for $P(B\,|\,W)$.
$P(B\,|\,W) =$

Again, the denominator, the probability of winning $P(W)$, is the probability that is not explicitly given. But you can combine six of the given probabilities to determine $P(W)$. (You calculated this same probability before in the context of conditional probabilities.)
$P(W)=$

Given that a player won, what is the probability that the player had selected a $B$? $P(B\,|\,W)=$
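Combining the six given probabilities for the denominator and applying Bayes' Theorem can be sketched in Python as follows (using the values observed in the problem statement):

```python
# Observed probabilities for deck X
p_a, p_b, p_c = 0.7, 0.1, 0.2
# Observed win probabilities for the three lettered decks
p_w_given_a, p_w_given_b, p_w_given_c = 0.1, 0.5, 0.7

# Denominator P(W) by combining the six probabilities (total probability of winning)
p_w = p_w_given_a * p_a + p_w_given_b * p_b + p_w_given_c * p_c

# Bayes' Theorem: P(B|W) = P(W|B) P(B) / P(W)
p_b_given_w = p_w_given_b * p_b / p_w
print(round(p_w, 2), round(p_b_given_w, 4))  # 0.26 0.1923
```

Deck B is rarely chosen ($P(B)=0.1$), so even though it wins half the time, it accounts for less than a fifth of all wins.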

4. Familial Adenomatous Polyposis (FAP) is a genetic syndrome that is likely to cause colon cancer. If an individual inherits the defective gene, there is about a 95% chance that one will develop potentially cancerous polyps in the colon (and these polyps, if untreated, lead almost inevitably to colon cancer). FAP is fairly rare, with the incidence of inheriting the mutation being approximately one in 10,000 births.

Let $I$ be the event that an individual inherits the genetic mutation underlying FAP. Let $C$ be the event that a person develops the potentially cancerous polyps. Given the above information, what are the following probabilities?
$P(I) =$

$P(C\,|\,I) =$

It is also possible, though rare, for a spontaneous mutation of the gene underlying FAP to occur. Hence, individuals that did not inherit the genetic mutation might also develop the potentially cancerous polyps of FAP. Let's imagine that the probability that a person who did not inherit the mutation develops the polyps is $4.0 \cdot 10^{-5}$.

Let $N$ be the event that a person did not inherit the genetic mutation underlying FAP. What are the following probabilities?
$P(N) =$

$P(C\,|\,N) =$

Given that a person develops the cancerous polyps from FAP, what is the probability that they inherited the mutated gene?
$P(I\,|\,C) =$
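Here the denominator $P(C)$ is again found by combining both routes to developing the polyps (inherited and spontaneous), after which Bayes' Theorem applies directly. A Python sketch with the values from the problem:

```python
# Values from the problem statement
p_i = 1e-4            # P(I): incidence of the inherited mutation (1 in 10,000)
p_c_given_i = 0.95    # P(C|I): polyps given the inherited mutation
p_n = 1 - p_i         # P(N): did not inherit the mutation
p_c_given_n = 4.0e-5  # P(C|N): polyps via spontaneous mutation

# Denominator: total probability of developing the polyps
p_c = p_c_given_i * p_i + p_c_given_n * p_n

# Bayes' Theorem: P(I|C) = P(C|I) P(I) / P(C)
p_i_given_c = p_c_given_i * p_i / p_c
print(round(p_i_given_c, 3))  # 0.704
```

Even though the inherited mutation is very rare, the spontaneous route is rarer still, so most people who develop the polyps did inherit the gene, though far from all of them.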