This page provides a statement of the basic, univariate Lindeberg condition and a discussion of how to prove that it is satisfied, with an example. Other versions of the Lindeberg condition may be found on the page for the Lindeberg-Feller central limit theorem. Throughout both pages, triangular array notation is used.

Definition: The Lindeberg condition is said to be satisfied if, for every \(\eps > 0\), we have

\[\frac{1}{s_n^2} \sum_{i=1}^n \Ex \{ X_{ni}^2 1(\abs{X_{ni}} \ge \eps s_n) \} \to 0\]

as \(n \to \infty\), where \(s_n^2\) denotes the variance of the row sum \(X_{n1} + \cdots + X_{nn}\).
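To see the definition in action, here is a small numerical sketch (our illustration, not part of the original page) for the special case of iid standard normal variables, where \(s_n^2 = n\) and the Lindeberg quantity reduces to \(\Ex\{X^2 1(\abs{X} \ge \eps\sqrt{n})\}\). The closed form used below follows from integration by parts against the normal density; the function name is ours.

```python
import math

def normal_tail_second_moment(c):
    """E[X^2 1(|X| > c)] for X ~ N(0, 1).

    Integration by parts gives the one-sided integral
    c*phi(c) + (1 - Phi(c)); symmetry doubles it.
    """
    phi = math.exp(-c * c / 2) / math.sqrt(2 * math.pi)
    upper_tail = (1 - math.erf(c / math.sqrt(2))) / 2
    return 2 * (c * phi + upper_tail)

eps = 0.5
# With iid N(0,1) variables, s_n^2 = n, so the Lindeberg quantity is
# exactly E[X^2 1(|X| > eps*sqrt(n))] -- the sum of n terms collapses.
lindeberg_terms = [normal_tail_second_moment(eps * math.sqrt(n))
                   for n in (1, 4, 16, 64)]
print(lindeberg_terms)
```

The terms shrink toward zero as \(n\) grows, as the condition requires; the decay is slow at first and then very fast once \(\eps\sqrt{n}\) moves into the normal tail.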

The following theorem provides an example of how to prove that the Lindeberg condition holds.

Theorem: Suppose \(X_1, X_2, \ldots\) are iid with mean zero and finite variance \(\sigma^2 > 0\). Then the Lindeberg condition is satisfied (taking \(X_{ni} = X_i\)).

To prove this result, there are three key steps, and the same steps tend to appear whenever one needs to show that the condition is satisfied:

  1. Collapsing the sum of \(n\) terms into a single quantity \(\propto \Ex T_n\)
  2. Showing \(T_n \inP 0\) (which holds whenever \(s_n \to \infty\))
  3. Concluding \(\Ex T_n \to 0\) by the Dominated Convergence Theorem (this step requires finite variance)
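Steps 2 and 3 can also be watched numerically. The following Monte Carlo sketch is our own illustration (not from the original argument): it takes \(X \sim \text{Uniform}(-1, 1)\), so \(\sigma^2 = 1/3\), and estimates \(\Pr\{T_n \ne 0\}\) and \(\Ex T_n\) with \(T_n = X^2 1(\abs{X} > \eps\sigma\sqrt{n})\). Because this \(X\) is bounded, both quantities are exactly zero once \(\eps\sigma\sqrt{n}\) exceeds 1.

```python
import math
import random

random.seed(0)
eps = 0.5
sigma = math.sqrt(1 / 3)        # standard deviation of Uniform(-1, 1)
xs = [random.uniform(-1, 1) for _ in range(100_000)]

prob_nonzero = []               # Monte Carlo estimates of P(T_n != 0)
mean_T = []                     # Monte Carlo estimates of E[T_n]
for n in (1, 4, 9, 16):
    c = eps * sigma * math.sqrt(n)          # truncation level
    t_vals = [x * x if abs(x) > c else 0.0 for x in xs]
    prob_nonzero.append(sum(1 for t in t_vals if t > 0) / len(xs))
    mean_T.append(sum(t_vals) / len(xs))

print(prob_nonzero)
print(mean_T)
```

Both sequences decrease with \(n\), and at \(n = 16\) the truncation level \(\eps\sigma\sqrt{n} \approx 1.15\) exceeds 1, so every sampled \(T_n\) is identically zero: step 2 in its most vivid form.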

Proof: Let \(\eps > 0\).

\[\begin{alignat*}{2} & s_n^2 = n \sigma^2 &\hspace{4em}& \text{iid} \\ & \frac{1}{n\sigma^2} \sum_i \Ex \{ X_i^2 1(\abs{X_i} \ge \eps\sigma\sqrt{n}) \} = \frac{1}{\sigma^2} \Ex \{ X^2 1(\abs{X} \ge \eps\sigma\sqrt{n}) \} && \text{iid} \\ & \phantom{\frac{1}{n\sigma^2} \sum_i \Ex \{ X_i^2 1(\abs{X_i} \ge \eps\sigma\sqrt{n}) \}} = \frac{1}{\sigma^2} \Ex T_n && \text{Let } T_n = X^2 1(\abs{X} \ge \eps\sigma\sqrt{n}) \\ & T_n \inP 0 && \Pr\{T_n \ne 0\} \le \Pr\{\abs{X} \ge \eps\sigma\sqrt{n}\} \to 0 \\ & \Ex T_n \to 0 && \href{dominated-convergence-theorem.html}{\text{DCT}} \textnormal{ with } Z = \abs{X}^2; \textnormal{ finite variance} \end{alignat*}\]