EC484_Econometrics_Analysis

# Chebyshev Inequality

The Chebyshev Inequality provides a bound on the probability that a random variable deviates from its mean by more than a specified number of standard deviations. It is particularly useful because it applies to any probability distribution with a finite mean and variance, regardless of the distribution’s shape.

The statement of the Chebyshev Inequality is:

**Theorem.** For any random variable $X$ with mean $\mu$ and finite variance $\sigma^2 = \text{Var}(X)$, and any constant $\varepsilon > 0$, it holds that

$$P(|X - \mu| \geq \varepsilon) \leq \frac{\sigma^2}{\varepsilon^2}$$

By setting $X = \bar{y} = \frac{1}{n} \sum_{i=1}^{n} y_{i}$, we have $\mathbb{E}[\bar{y}] = \mu$, and

$$P(|\bar{y} - \mu| \geq \varepsilon) \leq \frac{\text{Var}(\bar{y})}{\varepsilon^2} = \frac{\sigma^2}{n \varepsilon^2} \to 0 \quad \text{as } n \to \infty.$$

Why do we have $\text{Var}(\bar{y}) = \frac{\sigma^2}{n}$? Because, with $y_{1}, \dots, y_{n}$ i.i.d. with variance $\sigma^2$, the properties of variance give:

$$\text{Var}(\bar{y}) = \text{Var}\left( \frac{1}{n} \sum_{i=1}^{n} y_{i} \right) = \frac{1}{n^2} \sum_{i=1}^{n} \text{Var}(y_{i}) = \frac{n \sigma^2}{n^2} = \frac{\sigma^2}{n}$$

Thus we can use the Chebyshev Inequality to derive the Weak Law of Large Numbers: for any $\varepsilon > 0$, $P(|\bar{y} - \mu| \geq \varepsilon) \to 0$ as $n \to \infty$, i.e. $\bar{y} \xrightarrow{p} \mu$.
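The WLLN argument above can be checked by simulation. A minimal Python sketch (the Uniform(0, 1) distribution, $\varepsilon = 0.05$, and the Monte Carlo settings are illustrative assumptions, not from the notes): estimate $P(|\bar{y} - \mu| \geq \varepsilon)$ for increasing $n$ and compare it with the Chebyshev bound $\sigma^2 / (n \varepsilon^2)$.

```python
import random

# Empirical check of the WLLN via the Chebyshev bound.
# Illustrative setup: y_i ~ Uniform(0, 1), so mu = 0.5 and sigma^2 = 1/12.
random.seed(0)
mu, sigma2 = 0.5, 1.0 / 12.0
eps = 0.05
reps = 2000  # Monte Carlo replications per sample size

for n in (10, 100, 1000):
    # Estimate P(|ybar - mu| >= eps) by simulation.
    hits = sum(
        abs(sum(random.random() for _ in range(n)) / n - mu) >= eps
        for _ in range(reps)
    )
    empirical = hits / reps
    chebyshev = sigma2 / (n * eps * eps)  # sigma^2 / (n * eps^2)
    print(f"n={n:5d}  empirical={empirical:.3f}  Chebyshev bound={chebyshev:.3f}")
```

For small $n$ the bound can exceed 1 (and is then vacuous), but as $n$ grows both the bound and the empirical frequency shrink toward 0, which is exactly the WLLN statement.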

**Proof.**

Pick arbitrary $\varepsilon > 0$, and let $F$ be the distribution function of $X$; then:

$$P(|X - \mu| \geq \varepsilon) = \int_{|x - \mu| \geq \varepsilon} dF(x)$$

We let $g(x) = \frac{(x - \mu)^2}{\varepsilon^2}$, and note that $g(x) \geq 1$ whenever $|x - \mu| \geq \varepsilon$; then:

$$P(|X - \mu| \geq \varepsilon) = \int_{|x - \mu| \geq \varepsilon} dF(x) \leq \int_{|x - \mu| \geq \varepsilon} \frac{(x - \mu)^2}{\varepsilon^2} \, dF(x) \leq \int_{-\infty}^{\infty} \frac{(x - \mu)^2}{\varepsilon^2} \, dF(x) = \frac{\text{Var}(X)}{\varepsilon^2} = \frac{\sigma^2}{\varepsilon^2}. \qquad \blacksquare$$
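As a quick sanity check of the bound just proved, one can compare it with the exact tail probability of a concrete distribution. A small Python sketch (the standard normal is an illustrative choice, not from the notes), using the identity $P(|X| \geq \varepsilon) = \operatorname{erfc}(\varepsilon / \sqrt{2})$ for $X \sim N(0, 1)$:

```python
import math

# Compare the Chebyshev bound P(|X - mu| >= eps) <= sigma^2 / eps^2 with the
# exact tail probability of a standard normal (illustrative choice: mu = 0, sigma^2 = 1).
mu, sigma2 = 0.0, 1.0

for eps in (1.0, 2.0, 3.0):
    exact = math.erfc(eps / math.sqrt(2.0))  # P(|X| >= eps) for X ~ N(0, 1)
    bound = sigma2 / eps**2
    assert exact <= bound  # the Chebyshev bound must hold
    print(f"eps={eps:.0f}  exact={exact:.4f}  Chebyshev bound={bound:.4f}")
```

The bound is valid but loose: at $\varepsilon = 2$ the exact probability is about $0.046$ while Chebyshev only guarantees $\leq 0.25$, the price paid for the inequality holding for every distribution with finite variance.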