A random variable is a number that describes the outcome of an event. We can never be certain what value a random variable will take until after the event happens.
| Event | Random Variable |
|---|---|
| Flipping a coin | 1 if heads, 0 if tails |
| Rolling a die | The number that appears face up |
| Completion of a thesis | The number of years it takes |
| Weighing a dead cat's heart | How much the heart weighs in grams |
| Averaging the weights of 144 dead cats' hearts | What the average weight is in grams |
Although we can never be certain what value a random variable will take on, we often know, or can guess, what's called a probability distribution function, or PDF for short. A PDF tells you the probability that a random variable will take on a certain value; in other words, it tells you the probability of an event going down a certain way. If the event you're concerned with is the completion of a thesis and the random variable is the number of years it takes, then the PDF could tell you that taking 4.5 years has a probability of 0.4, 5 years has a probability of 0.3, 10 years has a probability of 0.05, and so on. If you integrate a PDF over all values that the random variable can take, the result is always one. This makes sense because the probabilities of all the possibilities have to sum to one, i.e. one of the possibilities has to take place.
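As a quick check of that last claim, here is a minimal Python sketch. The thesis probabilities below are made-up numbers chosen only so they sum to one, and the continuous example uses $f(x) = e^{-x}$ for $x \geq 0$ (the exponential PDF) as a stand-in, integrated numerically on a grid.

```python
import numpy as np

# Hypothetical probabilities for a discrete version of "years to finish a thesis".
# The exact numbers are made up for illustration; what matters is that they sum to 1.
thesis_years = {4.5: 0.4, 5.0: 0.3, 6.0: 0.25, 10.0: 0.05}
print(sum(thesis_years.values()))  # 1.0 (up to floating-point rounding)

# For a continuous random variable we integrate the PDF instead of summing.
# Here f(x) = exp(-x) for x >= 0 (the exponential PDF) is used as a stand-in.
x = np.linspace(0.0, 50.0, 200_000)
pdf = np.exp(-x)
print(np.sum(pdf) * (x[1] - x[0]))  # approximately 1.0
```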
Note that continuous random variables, like the number of years it takes to complete a thesis, have PDFs, while discrete random variables, like the number that faces up when you roll a die, have what are called probability mass functions, or PMFs. This small detail isn't important for this presentation, but it's a heads-up for when you go out and read papers.
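For concreteness, here is a small sketch of the die's PMF. Because the die is discrete, we sum the PMF over the six faces instead of integrating, and the sum still comes out to one.

```python
import numpy as np

# PMF of a fair six-sided die: each face has probability 1/6.
faces = np.arange(1, 7)
pmf = np.full(6, 1 / 6)

print(pmf[faces == 3][0])  # P(roll a 3) = 1/6
print(pmf.sum())           # 1.0 (up to floating-point rounding) -- sums to one, like a PDF integrates to one
```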
If X is a random variable, and its PDF is the following

$$f(x) = \frac{1}{\sigma \sqrt{2\pi}} \, e^{-\frac{(x - \mu)^2}{2\sigma^2}}$$
then we say X is normally distributed, or that X follows the normal distribution. This means we can get the probability that X will take on certain values, since it is normally distributed. To find the probability that X will equal 3.23, we simply plug in 3.23 for x in the above equation. There are many other distributions, in fact infinitely many, but the normal distribution is a famous one because many random variables we encounter out in the wild seem to be normally distributed.
I mentioned that we can get the probability that X will equal 3.23, or any number for that matter, just from the above PDF, but to get that probability we also need to know what $\mu$ and $\sigma$ are. $\mu$ and $\sigma$ are called the parameters of the normal distribution; the PDF depends on what values they take on. $\mu$ is called the mean of the normal distribution and $\sigma$ is called the standard deviation of the normal distribution. Below is the normal distribution plotted with several different parameters.
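As a rough illustration of plugging 3.23 into the equation, here is a sketch that codes up the PDF above by hand. The parameter values ($\mu = 3$, $\sigma = 1$ and $\mu = 0$, $\sigma = 2$) are arbitrary choices, picked only to show that the answer depends on the parameters.

```python
import numpy as np

def normal_pdf(x, mu, sigma):
    """The normal PDF from the equation above."""
    return np.exp(-((x - mu) ** 2) / (2 * sigma ** 2)) / (sigma * np.sqrt(2 * np.pi))

# mu and sigma are arbitrary choices for illustration;
# change them and the value of the PDF at 3.23 changes with them.
print(normal_pdf(3.23, mu=3.0, sigma=1.0))  # ~0.389
print(normal_pdf(3.23, mu=0.0, sigma=2.0))  # ~0.054
```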
Pretend that there are many, many alternate universes, and that in each of these universes X takes on a different value. If we averaged the values that X took across all these universes, the resulting number would be called the mean or expected value of X. Every random variable has an expected value, and you can calculate it using the PDF. For the normal distribution the expected value is always $\mu$.
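One way to make the alternate-universes picture concrete is to simulate it: draw a large number of values of X and average them. The parameters $\mu = 5$ and $\sigma = 2$ below are arbitrary; the point is that the average lands near $\mu$.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulate the "alternate universes": a million independent values of X,
# drawn from a normal distribution with mu = 5 and sigma = 2 (arbitrary choices).
mu, sigma = 5.0, 2.0
samples = rng.normal(mu, sigma, size=1_000_000)

# Averaging the values X took across all the universes approximates the expected value,
# which for a normal random variable is mu.
print(samples.mean())  # close to 5.0
```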
If we look at the value X took on most often across all those alternate universes, the result is called the mode of the random variable X. Every random variable has a mode, and it is simply the value at which the PDF is highest. This makes sense because the mode is the number that X takes on most often, and thus with the highest probability. For normal random variables, the mode also happens to always be $\mu$.
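Here is a sketch that finds the mode numerically by evaluating the PDF on a fine grid and taking the point where it is highest; with the same arbitrary parameters as before, the peak lands at $\mu$.

```python
import numpy as np

def normal_pdf(x, mu, sigma):
    return np.exp(-((x - mu) ** 2) / (2 * sigma ** 2)) / (sigma * np.sqrt(2 * np.pi))

# Evaluate the PDF on a fine grid and find where it peaks.
# mu = 5 and sigma = 2 are arbitrary; the peak lands at mu either way.
mu, sigma = 5.0, 2.0
grid = np.linspace(mu - 10, mu + 10, 100_001)
mode = grid[np.argmax(normal_pdf(grid, mu, sigma))]
print(mode)  # 5.0 -- the mode equals mu
```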
Now say we have the expected value of a random variable. In each of those alternate universes we can take the value X took on and find its squared distance from the mean, i.e. subtract the mean from the value and square the result. If we average these squared distances over all the universes, the result is called the variance of the random variable. The variance tells you, on average, how far (in squared distance) a random variable falls from its mean. The square root of the variance is called the standard deviation. For the normal distribution the standard deviation is $\sigma$.
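The same simulation idea works for the variance: average the squared distances from the mean, then take the square root to get the standard deviation. Again, $\mu = 5$ and $\sigma = 2$ are arbitrary choices.

```python
import numpy as np

rng = np.random.default_rng(0)
mu, sigma = 5.0, 2.0  # same arbitrary parameters as before
samples = rng.normal(mu, sigma, size=1_000_000)

# Average squared distance from the mean = variance; its square root = standard deviation.
squared_distances = (samples - samples.mean()) ** 2
variance = squared_distances.mean()
print(variance)           # close to sigma**2 = 4.0
print(np.sqrt(variance))  # close to sigma = 2.0
```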