Definition:
A sequence of random variables $X_1, X_2, \ldots$ converges almost surely to the random variable $X$, denoted $X_n \xrightarrow{\text{a.s.}} X$, if
$$P\left(\lim_{n \to \infty} X_n = X\right) = 1.$$
An equivalent definition is given below.
Definition:
A sequence of random variables $X_1, X_2, \ldots$ converges almost surely to the random variable $X$, denoted $X_n \xrightarrow{\text{a.s.}} X$, if for every $\epsilon > 0$,
$$P\left(\sup_{m \ge n} |X_m - X| > \epsilon\right) \to 0 \quad \text{as } n \to \infty.$$
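For intuition, the following Python sketch (a hypothetical illustration, not from the text; the choice of $X_n$ as the running mean of Bernoulli(0.5) draws, the horizon $M$, and all parameters are assumptions) Monte Carlo-estimates the probability in the second definition. Since a simulation cannot see the infinite tail, the supremum is truncated at a finite horizon $M$; the estimate should shrink toward 0 as $n$ grows.

```python
import numpy as np

# Illustrative sketch (assumed setup, not from the text): estimate
# P( sup_{n <= m <= M} |Xbar_m - 0.5| > eps ) where Xbar_m is the
# running mean of iid Bernoulli(0.5) draws, truncating the sup at M.

rng = np.random.default_rng(0)
eps, M, reps = 0.05, 5000, 2000

exceed = np.zeros(M + 1)            # exceed[n] counts paths with a deviation at some m >= n
for _ in range(reps):
    x = rng.integers(0, 2, size=M)              # one Bernoulli(0.5) sample path
    xbar = np.cumsum(x) / np.arange(1, M + 1)   # running means Xbar_1, ..., Xbar_M
    dev = np.abs(xbar - 0.5) > eps              # deviation indicator at each m
    tail = np.maximum.accumulate(dev[::-1])[::-1]  # tail[m] = 1 iff some m' >= m deviates
    exceed[1:] += tail

for n in (10, 100, 1000):
    print(f"n={n:5d}: est. P(sup_{{m>={n}}} |Xbar_m - 0.5| > {eps}) ≈ {exceed[n]/reps:.3f}")
```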
Comparing this definition to that of convergence in probability, it is clear that almost sure convergence implies convergence in probability: since $|X_n - X| \le \sup_{m \ge n} |X_m - X|$, the event $\{|X_n - X| > \epsilon\}$ is contained in $\{\sup_{m \ge n} |X_m - X| > \epsilon\}$, so $P(|X_n - X| > \epsilon) \to 0$ whenever the latter probability does.
If $\hat{\theta}_n \xrightarrow{\text{a.s.}} \theta$, then $\hat{\theta}_n$ is said to be a strongly consistent estimator of $\theta$. The sample mean, for example, is a strongly consistent estimator of the population mean $\mu$; this result is known as the strong law of large numbers.
Theorem (Strong law of large numbers):
Let $X_1, X_2, \ldots$ be independently and identically distributed random vectors such that $E\|X_1\| < \infty$. Then $\bar{X}_n \xrightarrow{\text{a.s.}} \mu$, where $\mu = E(X_1)$ and $\bar{X}_n = n^{-1} \sum_{i=1}^{n} X_i$.
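To make the theorem concrete, here is a minimal Python simulation sketch (not from the text; the Exponential distribution, seed, and sample sizes are illustrative assumptions) that tracks the running sample means $\bar{X}_n$ along several independent paths; under the strong law, each individual path settles at $\mu$.

```python
import numpy as np

# Minimal SLLN illustration (assumed setup): each row is one sample
# path of running means Xbar_n = (X_1 + ... + X_n)/n for iid
# Exponential draws with mean mu = 2; every path settles near mu.

rng = np.random.default_rng(1)
mu, paths, N = 2.0, 5, 100_000

x = rng.exponential(scale=mu, size=(paths, N))         # iid draws, E(X_1) = mu
running_means = np.cumsum(x, axis=1) / np.arange(1, N + 1)

for n in (100, 10_000, 100_000):
    devs = np.abs(running_means[:, n - 1] - mu)        # per-path deviation at sample size n
    print(f"n={n:6d}: |Xbar_n - mu| per path = {np.round(devs, 4)}")
```

Because almost sure convergence is a statement about individual sample paths, the sketch reports every path's deviation separately rather than averaging across paths.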