Definition: A sequence of random variables $x_n$ converges almost surely to the random variable $x$, denoted $x_n \xrightarrow{\text{a.s.}} x$, if

$$P\{\lim_{n \to \infty} x_n = x\} = 1.$$

An equivalent definition is given below.

Definition: A sequence of random variables $x_n$ converges almost surely to the random variable $x$, denoted $x_n \xrightarrow{\text{a.s.}} x$, if for every $\epsilon > 0$,

$$P\{|x_k - x| < \epsilon \text{ for all } k \geq n\} \to 1$$

as $n \to \infty$.
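
To see this second definition in action, here is a small Monte Carlo sketch; the choices are our own for illustration: the sequence $x_n = U^n$ with $U \sim \text{Uniform}(0,1)$, which converges almost surely to $0$ since $P\{U < 1\} = 1$, and a finite horizon $K$ standing in for "all $k \geq n$".

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical example: x_n = U**n with U ~ Uniform(0, 1), which
# converges almost surely to 0 because P{U < 1} = 1.
n_paths = 20_000   # number of simulated sample paths
K = 200            # finite horizon standing in for "all k >= n"
eps = 0.05

U = rng.uniform(0.0, 1.0, size=n_paths)
x = U[:, None] ** np.arange(1, K + 1)   # x[i, k-1] = U_i ** k

for n in (1, 5, 20, 50):
    # Empirical estimate of P{ |x_k - 0| < eps for all n <= k <= K }
    inside = np.all(np.abs(x[:, n - 1:]) < eps, axis=1)
    print(f"n = {n:>3}: P ≈ {inside.mean():.3f}")
```

Because $U^k$ decreases in $k$, the truncated event reduces to $\{U^n < \epsilon\}$, whose probability is $\epsilon^{1/n} \to 1$; the printed estimates should climb toward 1 accordingly.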

Comparing this definition to that of convergence in probability, it is clear that almost sure convergence implies convergence in probability.

Theorem: $x_n \xrightarrow{\text{a.s.}} x \implies x_n \xrightarrow{p} x$.
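
The key step is a one-line event containment; the following is a standard sketch of the argument, not taken from the text. For every $\epsilon > 0$ and every $n$,

$$\{|x_k - x| < \epsilon \text{ for all } k \geq n\} \subseteq \{|x_n - x| < \epsilon\},$$

so monotonicity of $P$ gives $P\{|x_n - x| < \epsilon\} \geq P\{|x_k - x| < \epsilon \text{ for all } k \geq n\} \to 1$, which is exactly the definition of $x_n \xrightarrow{p} x$.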

If $\hat{\theta} \xrightarrow{\text{a.s.}} \theta$, then $\hat{\theta}$ is said to be a strongly consistent estimator of $\theta$. The sample mean, for example, is a strongly consistent estimator of $E(x)$; this result is known as the strong law of large numbers.

Theorem (Strong law of large numbers): Let $x, x_1, x_2, \ldots$ be independently and identically distributed random vectors such that $E\|x\| < \infty$. Then $\bar{x}_n \xrightarrow{\text{a.s.}} \mu$, where $\bar{x}_n = n^{-1} \sum_{i=1}^{n} x_i$ and $\mu = E(x)$.
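
As a quick empirical check (our sketch; the distribution and sample sizes are illustrative choices), the following simulates several sample-mean paths for i.i.d. Exponential(1) draws, for which $\mu = 1$; the strong law says each individual path settles near 1.

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical setup: i.i.d. Exponential(1) draws, so mu = E(x) = 1.
n_paths, n_obs = 5, 100_000
draws = rng.exponential(scale=1.0, size=(n_paths, n_obs))

# Running sample means along each path:
# xbar[i, n-1] = (x_1 + ... + x_n) / n
xbar = np.cumsum(draws, axis=1) / np.arange(1, n_obs + 1)

for n in (10, 1_000, 100_000):
    print(f"n = {n:>6}: sample means ≈ {np.round(xbar[:, n - 1], 3)}")
```

This is stronger than convergence in probability, which constrains only the distribution of $\bar{x}_n$ at each fixed $n$: the strong law asserts that, with probability 1, every sample path eventually stays arbitrarily close to $\mu$.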