Independent events

Ex. Throw two 6-sided dice, independently of each other

  • $A = \{\text{first die falls on } k\}$
  • $B = \{\text{second die falls on } l\}$

    $\mathbf{P}[A\cap B] = \frac{1}{36}$ because the events are independent

Def: Events $A$ and $B$ are independent if $\mathbf{P}[A \cap B] = \mathbf{P}[A]\,\mathbf{P}[B]$

Def: Conditional probability

$$\mathbf{P}[A\vert B] = \frac{\mathbf{P}[A \cap B]}{\mathbf{P}[B]} \qquad (\text{defined when } \mathbf{P}[B] > 0)$$
  • Note: this is how we reason about events that are not independent (illustrated in the simulation below).

(Venn diagram of $A$ and $B$.)
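A minimal simulation sketch (not from the notes; the targets $k=3$, $l=5$ and all names are arbitrary) that checks both definitions on the two-dice example: $\mathbf{P}[A\cap B] \approx \mathbf{P}[A]\,\mathbf{P}[B] = \frac{1}{36}$ and $\mathbf{P}[A\vert B] \approx \mathbf{P}[A]$.

```python
import random

def estimate(trials=200_000, k=3, l=5, seed=0):
    rng = random.Random(seed)
    n_a = n_b = n_ab = 0
    for _ in range(trials):
        # two independent 6-sided dice
        d1, d2 = rng.randint(1, 6), rng.randint(1, 6)
        a, b = d1 == k, d2 == l
        n_a += a
        n_b += b
        n_ab += a and b
    print("P[A]   ~", n_a / trials)   # ~ 1/6
    print("P[B]   ~", n_b / trials)   # ~ 1/6
    print("P[A&B] ~", n_ab / trials)  # ~ 1/36 = P[A] P[B]
    print("P[A|B] ~", n_ab / n_b)     # ~ 1/6 = P[A], by independence

estimate()
```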

Example

A sexually transmitted disease occurs with probability $\mathbf{P}[disease] = 0.6\% = 0.006$

  • $\mathbf{P}[positive \vert disease] = 0.98$
  • $\mathbf{P}[positive \vert not\ disease] = 0.01$
  • Find $\mathbf{P}[positive]=?$

$\mathbf{P}[positive] = \mathbf{P}[positive \cap disease] + \mathbf{P}[positive \cap disease^c]$; because these two events are disjoint, we can find each one easily

  • We know $0.98 = \mathbf{P}[positive\vert disease] = \frac{\mathbf{P}[positive \cap disease]}{\mathbf{P}[disease]}$
  • therefore, $\mathbf{P}[positive \cap disease] = 0.98\times \mathbf{P}[disease] = 0.00588$
  • similarly, $\mathbf{P}[positive \cap disease^c] = 0.01\times 0.994=0.00994$
  • Finally: $\mathbf{P}[positive] = 0.00588 + 0.00994 = 0.01582$

  • (Question) Next, find $\mathbf{P}[disease \vert positive]=?$

  • (Answer) $\mathbf{P}[disease \vert positive] = \frac{\mathbf{P}[disease \cap positive]}{\mathbf{P}[positive]} = \frac{0.00588}{0.01582} \approx 0.37$
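The whole computation fits in a few lines of Python; a sketch (variable names are mine, not from the notes) reproducing the numbers above:

```python
p_disease = 0.006        # prior: P[disease] = 0.6%
p_pos_d = 0.98           # likelihood: P[positive | disease]
p_pos_not_d = 0.01       # likelihood: P[positive | disease^c]

# law of total probability: P[positive] = P[pos & disease] + P[pos & disease^c]
p_pos = p_pos_d * p_disease + p_pos_not_d * (1 - p_disease)
print(p_pos)                         # 0.01582

# Bayes: P[disease | positive] = P[pos & disease] / P[positive]
print(p_pos_d * p_disease / p_pos)   # ~ 0.3717
```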

Rule:

$$\mathbf{P}[A_1\cap A_2 \cap \cdots \cap A_n] = ?$$
  • If the events are independent: $\mathbf{P}[A_1]\,\mathbf{P}[A_2]\cdots\mathbf{P}[A_n]$
  • If not independent: $\displaystyle \mathbf{P}[A_1]\, \mathbf{P}[A_2\vert A_1]\, \mathbf{P}[A_3\vert A_1 \cap A_2] \cdots \mathbf{P}[A_n \vert A_1\cap A_2 \cap \cdots \cap A_{n-1}]$
    • this is like a chain rule for probability (see the sketch below)
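As a worked instance of the chain rule (an example of my own, not from the notes): the chance of drawing three aces in a row from a 52-card deck without replacement is $\mathbf{P}[A_1]\,\mathbf{P}[A_2\vert A_1]\,\mathbf{P}[A_3\vert A_1 \cap A_2] = \frac{4}{52}\cdot\frac{3}{51}\cdot\frac{2}{50}$.

```python
from fractions import Fraction

# chain rule: P[A1] * P[A2|A1] * P[A3|A1 & A2], drawing without replacement
p = Fraction(4, 52) * Fraction(3, 51) * Fraction(2, 50)
print(p, float(p))   # 1/5525, about 0.000181
```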
Example

A gene has two forms:

Theorem: Law of Total Probability

  • Assuming $\Omega = A_1 \cup A_2 \cup \cdots \cup A_k$ and the $A_i$'s are pairwise disjoint, $$\mathbf{P}[B] = \sum_{i=1}^{k}\mathbf{P}[B\vert A_i]\,\mathbf{P}[A_i]$$

Theorem: Bayes Rule

  • Let $A$ and $B$ be two events. Then, $$\mathbf{P}[A \vert B] = \frac{\mathbf{P}[A\cap B]}{\mathbf{P}[B\vert A]\mathbf{P}[A]+ \mathbf{P}[B\vert A^c]\mathbf{P}[A^c]}$$

  • this is because the denominator equals $\mathbf{P}[B]$ by the previous theorem (Law of Total Probability) applied to $A_1 = A$ and $A_2=A^c$

    Similarly:

  • Let $A_1, A_2, \dots, A_k$ be disjoint events with $\bigcup_{i=1}^{k}A_i = \Omega$.
    Then, for any $i$: $$\mathbf{P}[A_i\vert B] = \frac{\mathbf{P}[A_i\cap B]}{\mathbf{P}[B\vert A_1]\mathbf{P}[A_1] + \cdots + \mathbf{P}[B\vert A_k]\mathbf{P}[A_k]}$$

### Vocabulary: prior, posterior and likelihood

  • The $\mathbf{P}[A_i]$'s are called prior probabilities. This is the information given to us before observing anything, which the model starts from.
  • The probabilities we compute, the $\mathbf{P}[A_i\vert B]$, are called posterior probabilities: they are updates on the chance of each $A_i$ given the additional information $B$.
  • The conditional probabilities $\mathbf{P}[B \vert A_i]$ are often called likelihoods.
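Putting the theorem and the vocabulary together, a minimal sketch (the function name and inputs are illustrative, not from the notes) that maps priors and likelihoods to posteriors:

```python
def posteriors(priors, likelihoods):
    # priors[i] = P[A_i], likelihoods[i] = P[B | A_i]
    joint = [l * p for l, p in zip(likelihoods, priors)]  # P[B & A_i]
    p_b = sum(joint)                 # law of total probability: P[B]
    return [j / p_b for j in joint]  # posteriors P[A_i | B]

# sanity check against the disease example: first entry should be ~ 0.37
print(posteriors([0.006, 0.994], [0.98, 0.01]))
```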

(Bayesian belief versus the frequentist interpretation)

Example

Zeke has two coins: a real (fair) coin and a fake coin that lands heads on both sides. Zeke tosses a coin $n$ times and gets a head every time.

  • $\mathbf{P}[\text{Zeke tells the truth}] = 0.9$; let's call this event $A=\text{Zeke tells the truth}$ (i.e. he tossed the real coin).
  • $\mathbf{P}[A^c] = 0.1$

  • Observation = $B = \text{"Heads on n independent tosses"}$.

  • (Question): $\mathbf{P}[A \vert B]= ?$

    • Answer: $$\mathbf{P}[A \vert B]=\frac{\mathbf{P}[A\cap B]}{\mathbf{P}[B\vert A] \mathbf{P}[A] + \mathbf{P}[B\vert A^c]\mathbf{P}[A^c]}$$ $$=\frac{0.9\times \left(\frac{1}{2}\right)^n}{0.9\times \left(\frac{1}{2}\right)^n + 1\times 0.1}=\frac{1}{1 + 0.1\times 2^n \times \frac{1}{0.9}}$$
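Evaluating the closed form for a few values of $n$ (a small sketch, not part of the original notes) shows how fast the posterior belief in Zeke's honesty collapses:

```python
def p_truth_given_heads(n, prior=0.9):
    fair = prior * 0.5 ** n   # P[B|A] P[A]: a fair coin gives n heads in a row
    fake = (1 - prior) * 1.0  # P[B|A^c] P[A^c]: the fake coin always gives heads
    return fair / (fair + fake)

for n in (1, 5, 10):
    print(n, p_truth_given_heads(n))  # ~ 0.818, 0.220, 0.0087
```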
Example:
  • part a: the probability of rolling at least one six in 4 throws of a die: $$1 - \left(\frac{5}{6}\right)^4 \approx 0.518$$
  • part b: the probability of no double six on one throw of two dice is $35/36$, so the probability of at least one double six in 24 throws is $1-\left( \frac{35}{36} \right)^{24} \approx 0.491$
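A quick check of both numbers (a sketch, not from the notes):

```python
print(1 - (5 / 6) ** 4)      # ~ 0.5177
print(1 - (35 / 36) ** 24)   # ~ 0.4914
```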
Example:

Traffic over a network of bridges, modeled as a circuit: $C_1$ in series with the parallel pair $C_2$, $C_3$ (where $C_i$ is the event that bridge $i$ works).

  • Assumption: the bridges are independent of each other
  • $\mathbf{P}[\text{this circuit works}]=?$

    • $\mathbf{P}[\text{this circuit works}] = \mathbf{P}[C_1 \cap \left(C_2 \cup C_3\right)] = \mathbf{P}[\left(C_1 \cap C_2\right) \cup \left(C_1 \cap C_3\right)]$

    • Note: even if $C_2$ and $C_3$ were not independent, we could still write $\mathbf{P}[C_1]\times \mathbf{P}[C_2\cup C_3] = \mathbf{P}[C_1]\times \left(\mathbf{P}[C_2] + \mathbf{P}[C_3] - \mathbf{P}[C_2\cap C_3]\right)$, as long as $C_1$ is independent of the other two; when $C_2$ and $C_3$ are independent, $\mathbf{P}[C_2\cap C_3] = \mathbf{P}[C_2]\,\mathbf{P}[C_3]$.
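A minimal sketch of this series/parallel computation (the probabilities are placeholders, not from the notes), assuming all three components are independent:

```python
def circuit_works(p1, p2, p3):
    p_parallel = p2 + p3 - p2 * p3   # P[C2 | C3 union], with C2, C3 independent
    return p1 * p_parallel           # C1 independent of the parallel pair

print(circuit_works(0.9, 0.8, 0.8))  # 0.9 * 0.96 = 0.864
```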

