Mean, Variance, and Standard Deviation of Random Variables

How do we summarize a random variable with a single number?
What happens to the mean and variance if we shift or scale the variable?
This post explains the mean, variance, and standard deviation for both discrete and continuous random variables, with concrete examples.


📚 This post is part of the "Intro to Statistics" series

🔙 Previously: What Are Random Variables and How Do We Visualize Their Distributions?

🔜 Next: Introduction to the Normal Distribution


📏 What Is the Mean of a Random Variable?

The mean (or expected value) of a random variable \( X \) is the probability-weighted average of all its possible values.


🧮 Mean of a Discrete Random Variable

\[ \mu_X = E(X) = \sum_i x_i P(x_i) \]

This means each value \( x_i \) is weighted by its probability \( P(x_i) \).

Example:

| \( x_i \)    | 1   | 2   | 3   | 4   |
|--------------|-----|-----|-----|-----|
| \( P(x_i) \) | 0.1 | 0.3 | 0.4 | 0.2 |

Calculate:

\[ E(X) = 1 \times 0.1 + 2 \times 0.3 + 3 \times 0.4 + 4 \times 0.2 = 0.1 + 0.6 + 1.2 + 0.8 = 2.7 \]
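
As a quick sanity check, here is a minimal Python sketch of the same probability-weighted average; the `values` and `probs` lists simply mirror the example table above:

```python
# Expected value of a discrete random variable: weight each value by its probability
values = [1, 2, 3, 4]
probs = [0.1, 0.3, 0.4, 0.2]

mean = sum(x * p for x, p in zip(values, probs))
print(mean)  # 2.7 (up to floating-point rounding)
```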


Figure: Mean of a Discrete Random Variable


📏 Mean of a Continuous Random Variable

\[ \mu_X = E(X) = \int_{-\infty}^{\infty} x f(x) \, dx \]

Where \( f(x) \) is the probability density function (PDF).

Example:

If

\[ f(x) = \frac{1}{2} \quad \text{for } 0 \leq x \leq 2, \quad 0 \text{ otherwise} \]

Then

\[ E(X) = \int_0^2 x \times \frac{1}{2} \, dx = \frac{1}{2} \int_0^2 x \, dx = \frac{1}{2} \times \left[ \frac{x^2}{2} \right]_0^2 = \frac{1}{2} \times 2 = 1 \]
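
For a numerical check, the same integral can be evaluated with SciPy's `quad`; this is a sketch assuming SciPy is installed, with `f` being the uniform density from the example:

```python
from scipy.integrate import quad

# Uniform density on [0, 2]: f(x) = 1/2 on the support, 0 elsewhere
def f(x):
    return 0.5

# E(X) = integral of x * f(x) dx over the support [0, 2]
mean, _ = quad(lambda x: x * f(x), 0, 2)
print(mean)  # 1.0
```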


Figure: Mean of a Continuous Random Variable


🔄 Mean Under Linear Transformations

If we transform \( X \) as:

\[ Y = a + bX \]

then

\[ E(Y) = a + b E(X) \]


Example (using the discrete mean above, with \( a = 3 \) and \( b = 2 \)):

\[ E(Y) = 3 + 2 \times 2.7 = 3 + 5.4 = 8.4 \]
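
The shortcut can be checked directly: computing \( E(Y) \) value by value gives the same answer as \( a + bE(X) \). A minimal sketch, reusing the example distribution:

```python
# E(a + bX) = a + b*E(X), checked on the example distribution with a=3, b=2
values = [1, 2, 3, 4]
probs = [0.1, 0.3, 0.4, 0.2]
a, b = 3, 2

mean_X = sum(x * p for x, p in zip(values, probs))                  # 2.7
mean_Y_direct = sum((a + b * x) * p for x, p in zip(values, probs))
print(mean_Y_direct, a + b * mean_X)  # both ~8.4
```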




📊 What Is Variance?

Variance measures the spread or deviation of values around the mean:

\[ \text{Var}(X) = E[(X - \mu)^2] \]


🧮 Variance of a Discrete Random Variable

\[ \text{Var}(X) = \sum_i (x_i - \mu)^2 P(x_i) \]

Using the discrete example above (\( \mu = 2.7 \)):

\[ \text{Var}(X) = (1 - 2.7)^2 \times 0.1 + (2 - 2.7)^2 \times 0.3 + (3 - 2.7)^2 \times 0.4 + (4 - 2.7)^2 \times 0.2 \]

\[ = (2.89)(0.1) + (0.49)(0.3) + (0.09)(0.4) + (1.69)(0.2) \]

\[ = 0.289 + 0.147 + 0.036 + 0.338 \]

\[ = 0.81 \]
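
The same arithmetic in a short Python sketch, again reusing the example distribution from the mean section:

```python
# Var(X) = sum_i (x_i - mu)^2 * P(x_i) for the example distribution
values = [1, 2, 3, 4]
probs = [0.1, 0.3, 0.4, 0.2]

mu = sum(x * p for x, p in zip(values, probs))               # 2.7
var = sum((x - mu) ** 2 * p for x, p in zip(values, probs))
print(var)  # ~0.81
```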




📏 Variance of a Continuous Random Variable

\[ \text{Var}(X) = \int_{-\infty}^\infty (x - \mu)^2 f(x) \, dx \]

For the continuous example above (\( \mu=1 \)):

\[ \text{Var}(X) = \int_0^2 (x - 1)^2 \times \frac{1}{2} \, dx = \frac{1}{2} \int_0^2 (x^2 - 2x + 1) \, dx \]

Calculate:

\[ = \frac{1}{2} \left[ \frac{x^3}{3} - x^2 + x \right]_0^2 = \frac{1}{2} \left( \frac{8}{3} - 4 + 2 \right) = \frac{1}{2} \times \frac{2}{3} = \frac{1}{3} \approx 0.333 \]
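
As with the mean, the integral can be checked numerically; a sketch assuming SciPy is available:

```python
from scipy.integrate import quad

mu = 1.0  # mean of the uniform density on [0, 2], computed earlier

# Var(X) = integral of (x - mu)^2 * f(x) dx over the support [0, 2]
var, _ = quad(lambda x: (x - mu) ** 2 * 0.5, 0, 2)
print(var)  # ~0.3333
```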


🔄 Variance Under Linear Transformations

For \( Y = a + bX \), variance changes as:

\[ \text{Var}(Y) = b^2 \text{Var}(X) \]

Adding or subtracting a constant \( a \) does not affect variance.


✏️ Proof Sketch:

\[ \text{Var}(Y) = E[(Y - E[Y])^2] \]

\[ = E[(a + bX - (a + bE[X]))^2] \]

\[ = E[(b(X - E[X]))^2] \]

\[ = b^2 E[(X - E[X])^2] \]

\[ = b^2 \text{Var}(X) \]


Example:

Using the discrete variance from above (\( \text{Var}(X) = 0.81 \)) with \( b = 2 \):

\[ \text{Var}(Y) = 2^2 \times 0.81 = 4 \times 0.81 = 3.24 \]
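
To see the rule in action, the sketch below computes \( \text{Var}(Y) \) directly from the transformed values and compares it with the \( b^2 \, \text{Var}(X) \) shortcut, using the same \( a = 3 \), \( b = 2 \) as before:

```python
# Var(a + bX) = b^2 * Var(X): direct computation vs. the shortcut (a=3, b=2)
values = [1, 2, 3, 4]
probs = [0.1, 0.3, 0.4, 0.2]
a, b = 3, 2

mu_X = sum(x * p for x, p in zip(values, probs))
var_X = sum((x - mu_X) ** 2 * p for x, p in zip(values, probs))

y_values = [a + b * x for x in values]  # Y takes these values with the same probabilities
mu_Y = sum(y * p for y, p in zip(y_values, probs))
var_Y = sum((y - mu_Y) ** 2 * p for y, p in zip(y_values, probs))

print(var_Y, b ** 2 * var_X)  # both ~3.24
```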


📏 Standard Deviation and Scaling

Standard deviation \( \sigma \) is the square root of variance:

\[ \sigma_X = \sqrt{\text{Var}(X)} \]

For \( Y = a + bX \):

\[ \sigma_Y = \sqrt{\text{Var}(Y)} = \sqrt{b^2 \text{Var}(X)} = |b| \sigma_X \]


Example (continued):

\[ \sigma_X = \sqrt{0.81} = 0.9 \]

\[ \sigma_Y = 2 \times 0.9 = 1.8 \]
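
In code, the scaling rule for the standard deviation is just one square root and one absolute value; a sketch using the numbers from the running example:

```python
import math

var_X = 0.81  # discrete variance from the running example
b = 2

sigma_X = math.sqrt(var_X)   # 0.9
sigma_Y = abs(b) * sigma_X   # 1.8 -- standard deviation scales by |b|
print(sigma_X, sigma_Y)
```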


🔢 Variance of Sum and Difference

For any two random variables \( X \) and \( Y \):

\[ \text{Var}(X \pm Y) = \text{Var}(X) + \text{Var}(Y) \pm 2\,\text{Cov}(X, Y) \]
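
A quick empirical illustration with NumPy: for two deliberately correlated samples, the sample variance of the sum and of the difference matches the formula once the sample covariance is included. The data below are simulated purely for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)

# Two correlated samples, simulated only to illustrate the identity
x = rng.normal(size=100_000)
y = 0.5 * x + rng.normal(size=100_000)

cov_xy = np.cov(x, y, bias=True)[0, 1]  # bias=True matches np.var's 1/n normalization

print(np.var(x + y), np.var(x) + np.var(y) + 2 * cov_xy)  # agree up to rounding
print(np.var(x - y), np.var(x) + np.var(y) - 2 * cov_xy)
```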


🧠 Level Up: Understanding Variance Properties
  • Adding a constant \( a \) to a random variable shifts the mean but leaves the variance unchanged.
  • Multiplying by \( b \) scales the variance by \( b^2 \).
  • Standard deviation scales by \( |b| \), the absolute value of the multiplier.
  • Variance of sums depends on covariance: independent variables have zero covariance, so their variances simply add.


📌 Try It Yourself: Mean, Variance & Linear Transformations

Q1: What is the mean (expected value) of a discrete random variable?

💡 Show Answer

The probability-weighted average of all possible values the random variable can take.

Q2: For a linear transformation \( Y = a + bX \), what is the formula for \( E(Y) \)?

💡 Show Answer

\( E(Y) = a + b \times E(X) \)

Q3: How does adding a constant \( a \) to a random variable affect its variance?

💡 Show Answer

Adding a constant does not change the variance.

Q4: If \( Y = a + bX \), what happens to the variance of \( Y \)?

💡 Show Answer

The variance scales by the square of \( b \), so \( \text{Var}(Y) = b^2 \times \text{Var}(X) \).


✅ Summary

| Concept | Formula / Description |
|---------|-----------------------|
| Mean (Discrete) | \( \mu = \sum x_i P(x_i) \) |
| Mean (Continuous) | \( \mu = \int x f(x) \, dx \) |
| Variance (Discrete) | \( \sigma^2 = \sum (x_i - \mu)^2 P(x_i) \) |
| Variance (Continuous) | \( \sigma^2 = \int (x - \mu)^2 f(x) \, dx \) |
| Linear Transform Mean | \( E(a + bX) = a + b E(X) \) |
| Linear Transform Variance | \( \text{Var}(a + bX) = b^2 \text{Var}(X) \) |
| Variance of Sum/Diff | \( \text{Var}(X \pm Y) = \text{Var}(X) + \text{Var}(Y) \pm 2\text{Cov}(X, Y) \) |
| Std Deviation | \( \sigma = \sqrt{\text{Var}(X)} \) |

🔜 Up Next

Next, we'll explore the Normal Distribution, a fundamental continuous distribution that appears everywhere in statistics and data science.

Stay tuned!

This post is licensed under CC BY 4.0 by the author.