Chapter 01

1 Conditional Probability

The probability of $A$ given $B$ is defined as $$P(A \rvert B) = \frac{P(AB)}{P(B)}, \qquad P(B) > 0$$
$\Rightarrow$

Product Rule $$\begin{cases} P(AB) = P(A)P(B \vert A) \\ P(AB) = P(B)P(A \vert B) \end{cases}$$
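
As a quick sanity check, here is a minimal Python sketch (not part of the original notes) that verifies the definition and the product rule on two fair dice; the events $A$ and $B$ are chosen arbitrarily for illustration.

```python
# Verify P(A|B) = P(AB) / P(B) and the product rule on two fair dice.
from itertools import product
from fractions import Fraction

outcomes = list(product(range(1, 7), repeat=2))  # all 36 equally likely rolls

def prob(event):
    """Probability of an event (a predicate over outcomes) in a uniform space."""
    return Fraction(sum(1 for w in outcomes if event(w)), len(outcomes))

A = lambda w: w[0] + w[1] == 8   # event A: the dice sum to 8
B = lambda w: w[0] == 3          # event B: the first die shows 3

p_AB = prob(lambda w: A(w) and B(w))
p_A_given_B = p_AB / prob(B)          # P(A|B) = P(AB) / P(B)
assert p_AB == prob(B) * p_A_given_B  # product rule: P(AB) = P(B) P(A|B)
print(p_A_given_B)                    # 1/6
```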

2 Law of Total Probability

If $\{ B_n: n = 1, 2, 3, \ldots \}$ is a finite or countably infinite partition of a sample space and each event $B_n$ is measurable, then for any event $A$ of the same probability space: $$P(A) = \sum_{n} P(A \vert B_n)P(B_n)$$
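
A minimal numerical sketch, with made-up numbers: suppose the sample space is partitioned by which of three sources produced an item ($B_1, B_2, B_3$), and $A$ is the event that the item is defective.

```python
# Law of total probability: P(A) = sum_n P(A | B_n) P(B_n).
# The partition probabilities and defect rates below are illustrative values.
p_B = [0.5, 0.3, 0.2]             # P(B_n): a partition of the sample space
p_A_given_B = [0.01, 0.02, 0.05]  # P(A | B_n) for each part

p_A = sum(pa * pb for pa, pb in zip(p_A_given_B, p_B))
print(p_A)  # ≈ 0.021
```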

3 Bayes' Theorem

Bayes' theorem describes the probability of an event based on prior knowledge of conditions that might be related to the event.

$$P(B_i|A)=\frac{P(B_iA)}{P(A)}=\frac{P(A|B_i)P(B_i)}{P(A)}=\frac{P(A|B_i)P(B_i)}{ \sum_{n} P(A \vert B_n)P(B_n)}$$
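
Continuing the made-up example above, Bayes' theorem reverses the conditioning: given that $A$ occurred (the item is defective), it yields the probability that each source $B_i$ was responsible.

```python
# Bayes' theorem: P(B_i | A) = P(A | B_i) P(B_i) / P(A),
# with P(A) from the law of total probability (illustrative numbers).
p_B = [0.5, 0.3, 0.2]             # priors P(B_n)
p_A_given_B = [0.01, 0.02, 0.05]  # likelihoods P(A | B_n)

p_A = sum(pa * pb for pa, pb in zip(p_A_given_B, p_B))
posterior = [pa * pb / p_A for pa, pb in zip(p_A_given_B, p_B)]
print(posterior)       # ≈ [0.238, 0.286, 0.476]
print(sum(posterior))  # ≈ 1.0: the posterior is a proper distribution
```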

4 Prior Probability vs. Posterior Probability

4.1 Prior Probability

The prior probability expresses one's beliefs about a quantity before any evidence is taken into account. The unknown quantity may be a parameter of the model or a latent variable rather than an observed variable, and the parameters of a prior distribution are called hyperparameters.
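
For a concrete (illustrative) instance, assuming SciPy is available: a Beta prior over a probability parameter $\theta$, whose shape parameters $\alpha$ and $\beta$ are the hyperparameters.

```python
# A Beta(alpha, beta) prior over theta; alpha and beta are hyperparameters
# (illustrative values, assuming scipy is installed).
from scipy.stats import beta

alpha, b = 2.0, 2.0     # hyperparameters of the prior distribution
prior = beta(alpha, b)  # p(theta): belief about theta before seeing data
print(prior.pdf(0.5))   # prior density at theta = 0.5 (= 1.5 for Beta(2, 2))
```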

4.2 Posterior Probability

The posterior probability of a random event or an uncertain proposition is the conditional probability assigned after the relevant evidence or background is taken into account.
The posterior probability of the parameter $\theta$ given the evidence $X$ is $p(\theta \rvert X)$. By Bayes' theorem, it is defined as $$p(\theta | x)=\frac{p(x|\theta)p(\theta)}{p(x)}$$

$\text{Posterior probability} \propto \text{Likelihood} \times \text{Prior probability}$
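
A minimal sketch of this proportionality by grid approximation, with made-up data (7 heads in 10 coin flips), a flat prior, and NumPy assumed available:

```python
# Posterior ∝ likelihood × prior, evaluated on a grid over theta.
import numpy as np

theta = np.linspace(0.01, 0.99, 99)     # grid over the coin's bias theta
prior = np.ones_like(theta)             # flat prior p(theta) (unnormalized)
likelihood = theta**7 * (1 - theta)**3  # p(x | theta): 7 heads, 3 tails

posterior = likelihood * prior
posterior /= posterior.sum()            # normalize so the grid sums to 1

# ≈ 0.7: with a flat prior, the MAP estimate coincides with the MLE
print(theta[np.argmax(posterior)])
```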

