Definition: A sequence of random variables \(\x_n\) converges almost surely to the random variable \(\x\), denoted \(\x_n \inAS \x\), if
\[\Pr \{ \lim_{n \to \infty} \x_n = \x \} = 1.\]
An equivalent definition is given below.
Definition: A sequence of random variables \(\x_n\) converges almost surely to the random variable \(\x\), denoted \(\x_n \inAS \x\), if for every \(\eps > 0\),
\[\Pr \{ \norm{\x_k - \x} < \eps \textrm{ for all } k \ge n \} \to 1\]
as \(n \to \infty\).
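To make the second definition concrete, here is a minimal Monte Carlo sketch (assuming NumPy) that estimates the probability above when \(\x_k\) is the running mean of i.i.d. standard normals, so that \(\x = 0\). The condition "for all \(k \ge n\)" is truncated at a finite horizon \(N\) for simulation, and the names `eps`, `N`, and `trials` are illustrative, not from the text.

```python
import numpy as np

rng = np.random.default_rng(0)

# x_k = running mean of i.i.d. N(0, 1) draws, which converges a.s. to x = 0.
# The "for all k >= n" condition is truncated at a finite horizon N.
eps, N, trials = 0.1, 5000, 2000
Z = rng.standard_normal((trials, N))
running_mean = np.cumsum(Z, axis=1) / np.arange(1, N + 1)

for n in (10, 100, 1000):
    # Fraction of sample paths with |x_k - x| < eps for every k in [n, N]:
    # a Monte Carlo estimate of the probability in the definition.
    inside = np.all(np.abs(running_mean[:, n - 1:]) < eps, axis=1)
    print(f"n = {n:5d}: estimated probability = {inside.mean():.3f}")
```

As \(n\) grows, the estimated probability increases toward 1, as the definition requires.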
Comparing this definition to that of convergence in probability, it is clear that almost sure convergence implies convergence in probability: the event \(\{ \norm{\x_k - \x} < \eps \textrm{ for all } k \ge n \}\) is contained in \(\{ \norm{\x_n - \x} < \eps \}\), so \(\Pr \{ \norm{\x_n - \x} < \eps \} \to 1\) as well.
Theorem: \(\x_n \inAS \x \implies \x_n \inP \x\).
If \(\bth \inAS \bt\), then \(\bth\) is said to be a strongly consistent estimator of \(\bt\). The sample mean, for example, is a strongly consistent estimator of \(\Ex(\x)\); this result is known as the strong law of large numbers.
Theorem (Strong law of large numbers): Let \(\x, \x_1, \x_2, \ldots\) be independent and identically distributed random vectors such that \(\Ex\norm{\x} < \infty\). Then \(\bar{\x}_n \inAS \bm\), where \(\bm = \Ex(\x)\).
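As an illustrative aside, the moment condition \(\Ex\norm{\x} < \infty\) matters. The sketch below (again assuming NumPy; the distribution choices and variable names are illustrative) contrasts running means of Exponential(1) draws, which settle at the mean 1, with standard Cauchy draws, which have no finite mean and whose running means never settle.

```python
import numpy as np

rng = np.random.default_rng(1)
n = np.arange(1, 100_001)

# Exponential(1): E||x|| < infinity, so by the SLLN the running mean -> 1.
exp_means = np.cumsum(rng.exponential(1.0, n.size)) / n

# Standard Cauchy: E||x|| = infinity, so the theorem's hypothesis fails;
# the running mean keeps taking large excursions instead of converging.
cauchy_means = np.cumsum(rng.standard_cauchy(n.size)) / n

for i in (10**2, 10**3, 10**4, 10**5):
    print(f"n = {i:6d}: exp mean = {exp_means[i - 1]:+.4f}, "
          f"cauchy mean = {cauchy_means[i - 1]:+.4f}")
```

The exponential running means stabilize near 1, while the Cauchy running means wander even at \(n = 10^5\).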