EC484_Econometrics_Analysis

Theorem (Weak Law of Large Numbers). Suppose $\{X_i\}_{i=1}^{n}$ is a sequence of i.i.d. random variables with finite expected value $E(X_i) = \mu < \infty$. Then

$$\bar{X}_n = \frac{1}{n}\sum_{i=1}^{n} X_i \xrightarrow{p} \mu$$

as $n \to \infty$.

The result can be proved using Chebyshev's inequality, under the additional assumption that $\mathrm{Var}(X_i) < \infty$.
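As an illustration (not part of the notes), the following sketch simulates the WLLN for i.i.d. Uniform(0, 1) draws, whose mean is $0.5$ and variance is $1/12$, and reports the Chebyshev bound $P(|\bar{X}_n - \mu| > \varepsilon) \le \sigma^2 / (n\varepsilon^2)$ alongside the realized deviation:

```python
import numpy as np

# Illustrative sketch: i.i.d. Uniform(0, 1) draws, mu = 0.5, sigma^2 = 1/12.
rng = np.random.default_rng(0)
mu, sigma2, eps = 0.5, 1 / 12, 0.05

for n in (100, 10_000, 1_000_000):
    xbar = rng.uniform(0, 1, size=n).mean()
    # Chebyshev bound: P(|Xbar_n - mu| > eps) <= sigma^2 / (n * eps^2)
    bound = sigma2 / (n * eps**2)
    print(f"n={n:>9}: |xbar - mu| = {abs(xbar - mu):.5f}, "
          f"Chebyshev bound = {bound:.6f}")
```

Both the realized deviation and the Chebyshev bound shrink as $n$ grows, which is exactly what the theorem asserts.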


Vector Case

The WLLN can be extended to the case where each $X_i \in \mathbb{R}^k$ is a random vector.

Definition. A sequence of random vectors $\{X_n\}$ converges in probability to $X$ as $n \to \infty$, denoted $X_n \xrightarrow{p} X$, if for every $\varepsilon > 0$,

$$\lim_{n \to \infty} P\left(\|X_n - X\| > \varepsilon\right) = 0.$$
The new ingredient is the Euclidean norm $\|x\| = \sqrt{x'x}$; the quantity $\|x - y\|$ measures the "distance" between two vectors $x$ and $y$.
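A quick numerical sketch (not in the notes) of the Euclidean norm as a distance, using the classic 3-4-5 right triangle as the example:

```python
import numpy as np

# Euclidean norm ||x - y|| = sqrt((x - y)'(x - y)) as a distance measure.
x = np.array([3.0, 4.0])
y = np.array([0.0, 0.0])

# Apply the definition directly: square root of the sum of squared gaps.
dist = np.sqrt((x - y) @ (x - y))
print(dist)  # 5.0 for the 3-4-5 triangle

# Matches NumPy's built-in implementation of the same norm.
assert np.isclose(dist, np.linalg.norm(x - y))
```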

Here we only mention one very famous inequality, the Cauchy–Schwarz inequality:

$$|x'y| \le \|x\| \, \|y\|.$$
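Assuming the inequality intended here is Cauchy–Schwarz, $|x'y| \le \|x\|\,\|y\|$, the following sketch (not from the notes) checks it numerically on many random vector pairs:

```python
import numpy as np

# Numerical check of Cauchy-Schwarz: |x'y| <= ||x|| * ||y||.
rng = np.random.default_rng(1)
for _ in range(1000):
    x, y = rng.normal(size=5), rng.normal(size=5)
    # Small tolerance guards against floating-point rounding.
    assert abs(x @ y) <= np.linalg.norm(x) * np.linalg.norm(y) + 1e-12
print("Cauchy-Schwarz held on all 1000 random draws")
```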