Ex. Throw two 6-sided dice, independently of each other.
$B = \{\text{second die falls on } 1\}$
$\mathbf{P}[A\cap B] = \frac{1}{36}$ because the events are independent.
Def: If $A$ and $B$ are independent events, then $\mathbf{P}[A \cap B] = \mathbf{P}[A]\,\mathbf{P}[B]$
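A quick sanity check of this definition on the dice example, reading $A$ as $\{\text{first die falls on } 1\}$ (the event $A$ is not spelled out above, so that choice is an assumption):

```python
from fractions import Fraction

# Enumerate the 36 equally likely outcomes of two fair 6-sided dice and check
# P[A ∩ B] = P[A] P[B] for A = {first die shows 1}, B = {second die shows 1}.
outcomes = [(i, j) for i in range(1, 7) for j in range(1, 7)]
p = Fraction(1, len(outcomes))                      # each outcome has probability 1/36

P_A     = sum(p for (i, j) in outcomes if i == 1)   # 1/6
P_B     = sum(p for (i, j) in outcomes if j == 1)   # 1/6
P_AandB = sum(p for (i, j) in outcomes if i == 1 and j == 1)

print(P_AandB, P_A * P_B)   # both are 1/36, so A and B are independent
```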
Ex. A sexually transmitted disease occurs with probability $\mathbf{P}[\text{disease}] = 0.6\%$, and there is a test for it that comes back positive or negative.
$\mathbf{P}[\text{positive}] = \mathbf{P}[\text{positive} \cap \text{disease}] + \mathbf{P}[\text{positive} \cap \text{disease}^c]$; because these two events are disjoint, we can find each term easily as $\mathbf{P}[\text{positive} \mid \text{disease}]\,\mathbf{P}[\text{disease}]$ and $\mathbf{P}[\text{positive} \mid \text{disease}^c]\,\mathbf{P}[\text{disease}^c]$.
Finally: $\mathbf{P}[\text{positive}] = 0.00588 + 0.00994 = 0.01582$
(Question) Next, find $\mathbf{P}[\text{disease} \mid \text{positive}]=?$
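A sketch of the computation in Python. The conditional probabilities used below, $\mathbf{P}[\text{positive}\mid\text{disease}] = 0.98$ and $\mathbf{P}[\text{positive}\mid\text{disease}^c] = 0.01$, are back-solved from the two terms $0.00588$ and $0.00994$ above, so treat them as assumed test characteristics.

```python
# Worked numbers for the screening example above. The test characteristics
# (98% sensitivity, 1% false-positive rate) are inferred from the totals
# 0.00588 and 0.00994 quoted in the notes, not stated there explicitly.
p_disease         = 0.006   # P[disease]
p_pos_given_dis   = 0.98    # P[positive | disease]      (assumed)
p_pos_given_nodis = 0.01    # P[positive | disease^c]    (assumed)

# Law of total probability: split "positive" over the disjoint events
# {disease} and {disease^c}.
p_positive = (p_pos_given_dis * p_disease
              + p_pos_given_nodis * (1 - p_disease))        # 0.01582

# Bayes' rule for the posterior asked for in the question.
p_dis_given_pos = p_pos_given_dis * p_disease / p_positive  # ≈ 0.372

print(p_positive, p_dis_given_pos)
```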
Let $A$ and $B$ be two events with $\mathbf{P}[B] > 0$. Then (Bayes' rule), $$\mathbf{P}[A \vert B] = \frac{\mathbf{P}[A\cap B]}{\mathbf{P}[B\vert A]\,\mathbf{P}[A]+ \mathbf{P}[B\vert A^c]\,\mathbf{P}[A^c]}$$
This is because the denominator is equal to $\mathbf{P}[B]$ by the previous theorem (the Law of Total Probability) applied to $A_1 = A$ and $A_2=A^c$.
Similarly:
Let $A_1, A_2, \ldots, A_k$ be disjoint events with $\bigcup_{i=1}^{k}A_i = \Omega$;
then, for any $i$:
$$\mathbf{P}[A_i\vert B] = \frac{\mathbf{P}[A_i\cap B]}{\mathbf{P}[B\vert A_1]\,\mathbf{P}[A_1] + \cdots + \mathbf{P}[B\vert A_k]\,\mathbf{P}[A_k]}$$
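A minimal sketch of this general form in Python (the function name `posterior` is ours, not from the notes): priors and likelihoods in, posteriors out.

```python
# Bayes' rule over a disjoint partition A_1, ..., A_k of the sample space.
# Inputs: priors P[A_i] and likelihoods P[B | A_i]; output: posteriors P[A_i | B].
def posterior(priors, likelihoods):
    """priors[i] = P[A_i], likelihoods[i] = P[B | A_i]; returns the list P[A_i | B]."""
    joint = [p * l for p, l in zip(priors, likelihoods)]  # P[A_i ∩ B]
    p_b = sum(joint)                                       # law of total probability
    return [j / p_b for j in joint]

# Example: the disease computation above, with the partition {disease, disease^c}.
print(posterior([0.006, 0.994], [0.98, 0.01]))   # ≈ [0.372, 0.628]
```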
### Vocabulary: prior, posterior and likelihood
(Bayesian belief versus the frequentist view)
Ex. Zeke has two coins, a fake coin and a real coin. He tosses one of them $n$ times and gets a head every time.
$\mathbf{P}[A^c] = 0.1$
Observation: $B = \{\text{heads on } n \text{ independent tosses}\}$.
(Question): $\mathbf{P}[A \vert B]= ?$
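A sketch of the posterior as a function of $n$, under the assumption (not stated above) that $A$ is the event that the tossed coin is the real, fair one, with prior $\mathbf{P}[A] = 0.9$, and that the fake coin lands heads on every toss:

```python
# Assumptions (ours, not from the notes): A = {coin is real}, P[A] = 0.9,
# the fake coin is two-headed. Then P[B | A] = (1/2)^n and P[B | A^c] = 1.
def p_real_given_all_heads(n, p_real=0.9):
    p_b_given_real = 0.5 ** n          # fair coin: heads on every one of n tosses
    p_b_given_fake = 1.0               # fake coin: always heads
    p_b = p_b_given_real * p_real + p_b_given_fake * (1 - p_real)
    return p_b_given_real * p_real / p_b

for n in (1, 5, 10):
    print(n, p_real_given_all_heads(n))   # posterior belief drops quickly with n
```

Even with a strong prior belief that the coin is real, the posterior collapses toward 0 after only a handful of all-heads tosses; this is the prior versus posterior (via the likelihood) vocabulary in action.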
Traffic:
Ex. A circuit in which component $C_1$ is in series with the parallel pair $C_2$, $C_3$; let $C_i$ also denote the event that component $i$ works, and assume the components work independently.
$\mathbf{P}[\text{this circuit works}]=?$
$\mathbf{P}[\text{this circuit works}] = \mathbf{P}[C_1 \cap \left(C_2 \cup C_3\right)] = \mathbf{P}[\left(C_1 \cap C_2\right) \cup \left(C_1 \cap C_3\right)]$
Note: even if $C_2$ and $C_3$ were not independent of each other, as long as $C_1$ is independent of $C_2 \cup C_3$ we would still have $\mathbf{P}[\text{this circuit works}] = \mathbf{P}[C_1]\times \mathbf{P}[C_2\cup C_3] = \mathbf{P}[C_1]\times \left(\mathbf{P}[C_2] + \mathbf{P}[C_3] - \mathbf{P}[C_2\cap C_3]\right)$; when $C_2$ and $C_3$ are independent, $\mathbf{P}[C_2\cap C_3] = \mathbf{P}[C_2]\,\mathbf{P}[C_3]$.
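A sketch of the series/parallel computation with illustrative component reliabilities (the values $0.9, 0.8, 0.8$ are not from the notes), assuming $C_1$, $C_2$, $C_3$ are mutually independent:

```python
# Illustrative reliabilities, chosen for the example only.
p1, p2, p3 = 0.9, 0.8, 0.8

p_parallel = p2 + p3 - p2 * p3        # P[C_2 ∪ C_3] by inclusion-exclusion + independence
p_works    = p1 * p_parallel          # C_1 in series with the parallel pair

print(p_works)   # 0.9 * 0.96 = 0.864
```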