Chapter 02

1 Discrete random variables

1.1 (0-1) distribution

$P(X=k)=p^k(1-p)^{1-k}, k=0,1 \space (0<p<1)$

1.2 binomial distribution

$P(X=k)=C_n^kp^k(1-p)^{n-k}, \space k=0,1,\ldots,n \space (0<p<1)$
Denoted as $X \sim B(n,p)$
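A minimal sketch of this pmf (not from the text; the values $n=10$, $p=0.3$ are arbitrary illustration parameters):

In [ ]:
import math
import numpy as np
import matplotlib.pyplot as plt

# Binomial pmf: P(X=k) = C(n,k) p^k (1-p)^(n-k), illustrated with n=10, p=0.3
n, p = 10, 0.3
k = np.arange(0, n + 1)
pmf = [math.comb(n, i) * p**i * (1 - p)**(n - i) for i in k]
plt.stem(k, pmf)
plt.xlabel('k')
plt.ylabel(r'$P(X=k)$')
plt.show()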

1.3 Poisson distribution

$P(X=k)=\frac{\lambda^k e^{-\lambda}}{k!}, \space k=0,1,2,\ldots \space (\lambda > 0)$.
Denoted as $X\sim \pi(\lambda)$


If $X \sim B(n,p)$ with $n$ large and $p$ small, then: $$P(X=k)=C_n^kp^k(1-p)^{n-k} \approx \frac{\lambda^k e^{-\lambda}}{k!} $$ where $\lambda=np$ (the Poisson approximation to the binomial distribution).
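A quick numerical check of this approximation (an illustrative sketch; $n=1000$ and $p=0.005$, so $\lambda=np=5$, are arbitrary choices):

In [ ]:
import math

# Compare the exact binomial pmf with the Poisson approximation for small k
n, p = 1000, 0.005
lam = n * p
for k in range(10):
    binom = math.comb(n, k) * p**k * (1 - p)**(n - k)
    poisson = lam**k * math.exp(-lam) / math.factorial(k)
    print(k, round(binom, 5), round(poisson, 5))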

1.4 geometric distribution

$P(X=k)=(1-p)^{k-1}p, \space k=1,2,3,\dots$
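A minimal sketch of this pmf (the value $p=0.3$ and the cutoff at $k=15$ are arbitrary, chosen only for plotting):

In [ ]:
import numpy as np
import matplotlib.pyplot as plt

# Geometric pmf: P(X=k) = (1-p)^(k-1) p, illustrated with p=0.3
p = 0.3
k = np.arange(1, 16)
pmf = (1 - p)**(k - 1) * p
plt.stem(k, pmf)
plt.xlabel('k')
plt.ylabel(r'$P(X=k)$')
plt.show()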

2 Continuous random variables

2.1 uniform distribution

$$f(x)=\begin{cases} \frac{1}{b-a}, a<x<b \\ 0, otherwise \end{cases}$$

In [24]:
import matplotlib.pyplot as plt

# Sketch of the uniform density on (a, b), drawn here with a=1, b=2 so 1/(b-a)=1
plt.plot([1, 2], [1, 1], linewidth=2, c='k')   # f(x) = 1/(b-a) on (a, b)
plt.plot([1, 1], [0, 1], 'k--', linewidth=2)   # jump at x = a
plt.plot([2, 2], [0, 1], 'k--', linewidth=2)   # jump at x = b
plt.plot([0, 1], [1, 1], 'k--')                # guide line to the y-axis
plt.xticks([1, 2], [r'$a$', r'$b$'])
plt.yticks([1], [r'$\frac{1}{b-a}$'])
plt.xlabel('x')
plt.ylabel(r'$f(x)$')
plt.axis([0, 2.5, 0, 2])
gca = plt.gca()
gca.spines['right'].set_color('none')          # hide the top and right spines
gca.spines['top'].set_color('none')
gca.xaxis.set_ticks_position('bottom')
gca.yaxis.set_ticks_position('left')
plt.show()
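As an optional sanity check (not part of the sketch above), a histogram of samples drawn with np.random.uniform should be approximately flat at $\frac{1}{b-a}$ between $a$ and $b$; the sample size below is arbitrary:

In [ ]:
import numpy as np
import matplotlib.pyplot as plt

a, b = 1.0, 2.0
samples = np.random.uniform(a, b, size=10000)
# density=True rescales the histogram so it estimates f(x) = 1/(b-a)
plt.hist(samples, bins=30, density=True, alpha=0.5)
plt.axhline(1.0 / (b - a), color='k', linestyle='--')
plt.xlabel('x')
plt.ylabel(r'$f(x)$')
plt.show()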


2.2 exponential distribution

$$f(x)= \begin{cases} \lambda e^{-\lambda x}, x >0 \\ 0, x \le 0 \end{cases}$$

In [31]:
import numpy as np
import matplotlib.pyplot as plt

# Exponential density f(x) = lam * exp(-lam * x) for x > 0, drawn here with lam = 5
lam = 5
x = np.linspace(0.01, 1, 1000)
f_x = lam * np.exp(-lam * x)
plt.plot(x, f_x)
plt.xlabel('x')
plt.ylabel(r'$f(x)$')
plt.axis([0, 1.1, 0, 5.5])   # the maximum of the curve is lam, so keep it in view
plt.xticks(())
plt.yticks(())
gca = plt.gca()
gca.spines['right'].set_color('none')
gca.spines['top'].set_color('none')
gca.xaxis.set_ticks_position('bottom')
gca.yaxis.set_ticks_position('left')
plt.show()
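Similarly, one can overlay the density on a histogram of samples (note that np.random.exponential takes the scale $1/\lambda$, not $\lambda$; the sample size is arbitrary):

In [ ]:
import numpy as np
import matplotlib.pyplot as plt

lam = 5
samples = np.random.exponential(scale=1.0 / lam, size=10000)
x = np.linspace(0.01, 1, 1000)
plt.hist(samples, bins=50, density=True, alpha=0.5)
plt.plot(x, lam * np.exp(-lam * x), 'k')   # f(x) = lam * exp(-lam * x)
plt.xlabel('x')
plt.ylabel(r'$f(x)$')
plt.show()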


2.3 normal distribution

$$f(x)= \frac{1}{\sqrt{2\pi}\sigma}e^{-\frac{(x-\mu)^{2}}{2\sigma^2}} $$

where: $-\infty < x < \infty$


In [66]:
import numpy as np
import matplotlib.pyplot as plt

def gauss(x, mu, sigma):
    # Normal density f(x) = exp(-(x - mu)^2 / (2*sigma^2)) / (sqrt(2*pi) * sigma)
    return 1.0 / (np.sqrt(2 * np.pi) * sigma) * np.exp(-np.power(x - mu, 2) / (2 * sigma**2))

mu = 1.0
sigma = 1.0
x = np.linspace(-2.0, 4.0, 1000)
y = gauss(x, mu, sigma)
plt.plot(x, y)
plt.xticks([mu], [r'$\mu$'])
plt.yticks(())
plt.xlabel('x')
# dashed guides marking the peak value f(mu) = 1/(sqrt(2*pi)*sigma)
plt.plot([mu, mu], [0.0, gauss(mu, mu, sigma)], 'k--')
plt.plot([0, mu], [gauss(mu, mu, sigma), gauss(mu, mu, sigma)], 'k--')
plt.text(-0.2, gauss(mu, mu, sigma), r'$\frac{1}{\sqrt{2\pi}\sigma}$', ha='right', va='center')
plt.axis([-3, 5, 0, 0.5])
gca = plt.gca()
gca.spines['right'].set_color('none')
gca.spines['top'].set_color('none')
gca.xaxis.set_ticks_position('bottom')
gca.spines['bottom'].set_position(('data', 0))
gca.yaxis.set_ticks_position('left')
gca.spines['left'].set_position(('data', 0))
plt.show()
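The same kind of check for the normal density ($\mu$ and $\sigma$ mirror the values used in the plot above; the sample size is arbitrary):

In [ ]:
import numpy as np
import matplotlib.pyplot as plt

mu, sigma = 1.0, 1.0
samples = np.random.normal(mu, sigma, size=10000)
x = np.linspace(-3, 5, 1000)
f_x = 1.0 / (np.sqrt(2 * np.pi) * sigma) * np.exp(-(x - mu)**2 / (2 * sigma**2))
plt.hist(samples, bins=50, density=True, alpha=0.5)
plt.plot(x, f_x, 'k')
plt.xlabel('x')
plt.ylabel(r'$f(x)$')
plt.show()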


3 Distribution of a function of a random variable

If a random variable $X$ has probability density function $f_X(x), -\infty < x < \infty$, and the function $g(x)$ is differentiable everywhere and strictly monotonic (either $g'(x) > 0$ for all $x$, or $g'(x) < 0$ for all $x$), then the random variable $Y=g(X)$ has probability density function $$f_Y(y)= \begin{cases} f_X(h(y))\lvert h'(y)\rvert, \alpha< y < \beta \\ 0, otherwise \end{cases}$$ where $\alpha=\min(g(-\infty), g(+\infty))$, $\beta=\max(g(-\infty), g(+\infty))$, and $h(y)$ is the inverse function of $y=g(x)$.
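For example (an illustrative case, not taken from the text): let $g(x)=3-2x$, so $g'(x)=-2<0$ everywhere, $h(y)=\frac{3-y}{2}$ and $\lvert h'(y) \rvert = \frac{1}{2}$. If $X \sim N(\mu, \sigma^2)$, then $$f_Y(y)=f_X\left(\frac{3-y}{2}\right)\cdot\frac{1}{2}=\frac{1}{\sqrt{2\pi}\,(2\sigma)}e^{-\frac{(y-(3-2\mu))^{2}}{2(2\sigma)^2}},$$ i.e. $Y=3-2X \sim N(3-2\mu, (2\sigma)^2)$. A short simulation sketch of this example (the sample size is arbitrary):

In [ ]:
import numpy as np
import matplotlib.pyplot as plt

# X ~ N(0, 1), Y = 3 - 2X; the theorem above predicts Y ~ N(3, 4)
mu, sigma = 0.0, 1.0
x_samples = np.random.normal(mu, sigma, size=10000)
y_samples = 3 - 2 * x_samples
y = np.linspace(-5, 11, 1000)
f_y = 1.0 / (np.sqrt(2 * np.pi) * 2 * sigma) * np.exp(-(y - (3 - 2 * mu))**2 / (2 * (2 * sigma)**2))
plt.hist(y_samples, bins=50, density=True, alpha=0.5)
plt.plot(y, f_y, 'k')
plt.xlabel('y')
plt.ylabel(r'$f_Y(y)$')
plt.show()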