Continuous random variables: probability density functions, expectation, variance

Probability & Statistics 2 (S2)

This section covers the concepts of continuous random variables, their probability density functions (PDFs), expected values, and variance. These are fundamental concepts in probability and statistics, particularly when dealing with variables that can take on any value within a given range.

1. Probability Density Function (PDF)

A continuous random variable is a variable that can take on any value within a given range. Unlike a discrete random variable, a continuous random variable has zero probability of taking any single exact value. Instead, we use a probability density function (PDF) to describe the probability distribution.

The PDF, denoted by $f(x)$, is defined such that the area under the curve of the PDF over a given interval represents the probability that the random variable falls within that interval.

Mathematically, this is expressed as:

$P(a \le X \le b) = \int_{a}^{b} f(x) dx$

Key properties of a PDF:

  • $f(x) \ge 0$ for all $x$
  • The total area under the curve is equal to 1: $\int_{-\infty}^{\infty} f(x) dx = 1$
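The two defining properties, and the interval-probability integral above, can be checked numerically. The sketch below uses a simple Simpson's-rule integrator (standard library only); the PDF $f(x) = 2x$ on $[0, 1]$ is an illustrative choice, not from any particular source.

```python
def simpson(f, a, b, n=10_000):
    """Approximate the integral of f over [a, b] with Simpson's rule (n even)."""
    h = (b - a) / n
    total = f(a) + f(b)
    for i in range(1, n):
        total += (4 if i % 2 else 2) * f(a + i * h)
    return total * h / 3

def f(x):
    # Candidate PDF: f(x) = 2x on [0, 1], 0 elsewhere
    return 2 * x if 0 <= x <= 1 else 0.0

# Property 1: f(x) >= 0 (spot-check on a grid over the support)
assert all(f(k / 100) >= 0 for k in range(101))

# Property 2: total area under the curve equals 1
assert abs(simpson(f, 0, 1) - 1.0) < 1e-9

# P(0.25 <= X <= 0.75) = integral of 2x from 0.25 to 0.75 = 0.5
print(simpson(f, 0.25, 0.75))
```

Because Simpson's rule is exact for polynomials of degree three or less, both integrals here are computed essentially exactly.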

2. Expectation (Mean)

The expected value (or mean) of a continuous random variable $X$, denoted by $E(X)$ or $\mu$, represents the average value we would expect to observe over many repeated trials.

For a continuous random variable, the expected value is calculated using the following formula:

$E(X) = \int_{-\infty}^{\infty} x f(x) dx$

This integral represents the weighted average of all possible values of $X$, where the weights are given by the PDF $f(x)$.
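The expectation integral can be evaluated numerically in the same way. The sketch below again assumes the hypothetical PDF $f(x) = 2x$ on $[0, 1]$, chosen so the answer is easy to verify by hand.

```python
def simpson(f, a, b, n=10_000):
    """Approximate the integral of f over [a, b] with Simpson's rule (n even)."""
    h = (b - a) / n
    total = f(a) + f(b)
    for i in range(1, n):
        total += (4 if i % 2 else 2) * f(a + i * h)
    return total * h / 3

def pdf(x):
    return 2 * x  # f(x) = 2x on the support [0, 1]

# E(X) = integral of x * f(x) over the support
mean = simpson(lambda x: x * pdf(x), 0, 1)
print(mean)  # 2/3, since the integral of 2x^2 over [0, 1] is 2/3
```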

3. Variance

The variance of a continuous random variable $X$, denoted by $Var(X)$ or $\sigma^2$, measures the spread or dispersion of the distribution around the expected value. It is calculated as the expected value of the squared difference between the random variable and its mean.

The formula for the variance is:

$Var(X) = E[(X - E(X))^2] = \int_{-\infty}^{\infty} (x - \mu)^2 f(x) dx$

Expanding the square gives an equivalent form that is often easier to compute:

$Var(X) = E[X^2] - [E(X)]^2$

The standard deviation, denoted by $\sigma$, is the square root of the variance and provides a measure of spread in the same units as the random variable.

$\sigma = \sqrt{Var(X)}$
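Both the variance integral and the standard deviation can be computed numerically with the same Simpson's-rule approach; the PDF $f(x) = 2x$ on $[0, 1]$ is again an illustrative assumption.

```python
import math

def simpson(f, a, b, n=10_000):
    """Approximate the integral of f over [a, b] with Simpson's rule (n even)."""
    h = (b - a) / n
    total = f(a) + f(b)
    for i in range(1, n):
        total += (4 if i % 2 else 2) * f(a + i * h)
    return total * h / 3

pdf = lambda x: 2 * x  # support [0, 1]

mu = simpson(lambda x: x * pdf(x), 0, 1)               # E(X) = 2/3
var = simpson(lambda x: (x - mu) ** 2 * pdf(x), 0, 1)  # Var(X) by definition
sigma = math.sqrt(var)

print(var, sigma)  # Var(X) = 1/18 ~ 0.0556, sigma ~ 0.236
```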

4. Examples

Consider a continuous random variable $X$ with a PDF given by:

$f(x) = \begin{cases} 2x & \text{for } 0 \le x \le 1 \\ 0 & \text{otherwise} \end{cases}$

a) Finding the expected value:

$E(X) = \int_{0}^{1} x(2x) dx = \int_{0}^{1} 2x^2 dx = \left[ \frac{2x^3}{3} \right]_{0}^{1} = \frac{2}{3}$

b) Finding the variance:

$E[X^2] = \int_{0}^{1} x^2(2x) dx = \int_{0}^{1} 2x^3 dx = \left[ \frac{2x^4}{4} \right]_{0}^{1} = \frac{1}{2}$

$Var(X) = E[X^2] - (E(X))^2 = \frac{1}{2} - \left(\frac{2}{3}\right)^2 = \frac{1}{2} - \frac{4}{9} = \frac{9 - 8}{18} = \frac{1}{18}$
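As a cross-check of this worked example, we can simulate from the distribution and compare the sample mean and variance with the exact answers. For $f(x) = 2x$ on $[0, 1]$ the CDF is $F(x) = x^2$, so inverse-transform sampling gives $X = \sqrt{U}$ with $U$ uniform on $(0, 1)$ — a standard technique, sketched here with the standard library only.

```python
import random
import statistics

random.seed(42)

# Inverse-transform sampling: F(x) = x^2 on [0, 1], so X = sqrt(U)
samples = [random.random() ** 0.5 for _ in range(200_000)]

mean = statistics.fmean(samples)
var = statistics.pvariance(samples, mu=mean)

# Both should be close to the exact values E(X) = 2/3 and Var(X) = 1/18
print(mean, var)
```

With 200,000 samples the estimates typically agree with $2/3 \approx 0.667$ and $1/18 \approx 0.0556$ to two or three decimal places.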

5. Summary

Understanding continuous random variables, their probability density functions, expected values, and variances is crucial for analyzing and interpreting data in many real-world applications. These concepts provide a framework for quantifying uncertainty and making informed decisions based on probabilistic information.

| Concept | Definition | Formula |
| --- | --- | --- |
| Probability Density Function (PDF) | Describes the relative likelihood of a continuous random variable taking on a given value. | $f(x)$ |
| Expected Value (Mean) | The average value of a random variable over many trials. | $E(X) = \int_{-\infty}^{\infty} x f(x) dx$ |
| Variance | A measure of the spread or dispersion of a random variable around its mean. | $Var(X) = \int_{-\infty}^{\infty} (x - \mu)^2 f(x) dx$ |
| Standard Deviation | The square root of the variance, providing a measure of spread in the same units as the random variable. | $\sigma = \sqrt{Var(X)}$ |