"Lebesgue integration measures how much mass a function assigns to its values, not just where its graph sits."
Overview
Lebesgue integration is the rigorous language of expectation, population risk, convergence under limits, and almost-everywhere reasoning.
Measure theory is the grammar behind rigorous probability. Earlier probability chapters taught how to compute with random variables and distributions. This chapter explains what those objects are when sample spaces are infinite, events are generated by observations, and densities depend on a base measure.
This section uses LaTeX Markdown throughout: inline mathematics uses `$...$`, and display mathematics uses `$$...$$`. The focus is the foundation needed for ML: expected loss, pushforward distributions, convergence of estimators, likelihood ratios, importance sampling, KL divergence, and support mismatch.
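As a quick numerical preview of several of these themes, the sketch below estimates an expected loss $\mathbb{E}[(X-1)^2]$ for $X \sim \mathcal{N}(0,1)$ two ways: a plain Monte Carlo average, and importance sampling from a shifted proposal reweighted by the likelihood ratio (a Radon-Nikodym derivative). The distributions and loss here are illustrative choices, not taken from the chapter.

```python
import numpy as np

rng = np.random.default_rng(0)

# Expected loss E[(X - 1)^2] for X ~ N(0, 1), viewed as a Lebesgue
# integral against the standard normal measure.
# Closed form: Var(X) + (E[X] - 1)^2 = 1 + 1 = 2.
x = rng.standard_normal(100_000)
mc_estimate = np.mean((x - 1.0) ** 2)  # Monte Carlo average

# Importance sampling: draw from a proposal N(1, 1) and reweight by the
# likelihood ratio of target density over proposal density (both are
# unit-variance normals, so the normalizing constants cancel).
y = rng.normal(loc=1.0, scale=1.0, size=100_000)
weights = np.exp(-0.5 * y**2) / np.exp(-0.5 * (y - 1.0) ** 2)
is_estimate = np.mean(weights * (y - 1.0) ** 2)

print(mc_estimate, is_estimate)  # both should land near 2
```

Both estimators target the same integral; they differ only in the base measure the samples are drawn from, which is exactly the change-of-measure idea the chapter develops.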
Prerequisites
Companion Notebooks
| Notebook | Description |
|---|---|
| theory.ipynb | Executable demonstrations for Lebesgue integration |
| exercises.ipynb | Graded practice for Lebesgue integration |
Learning Objectives
After completing this section, you will be able to:
- Explain the difference between Riemann and Lebesgue integration
- Compute integrals of nonnegative simple functions
- Define the Lebesgue integral for nonnegative measurable functions
- Extend the integral to signed integrable functions
- Use almost-everywhere equality correctly
- State monotone convergence, Fatou's lemma, and dominated convergence
- Apply convergence theorems to interchange limits and expectations
- Interpret expected loss as a Lebesgue integral
- Connect Monte Carlo averages to empirical measures
- Recognize integrability assumptions behind learning objectives
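To make the first few objectives concrete, here is a minimal sketch of the Lebesgue construction for $\int_{[0,1]} x \, dx = 1/2$: approximate $f(x) = x$ from below by simple functions that are constant on dyadic level sets, and integrate each simple function as a sum of value times measure of its level set. The helper name and dyadic scheme are illustrative choices, not the chapter's notation.

```python
# Lebesgue-style integration of f(x) = x on [0, 1]:
# the simple function s_n takes the value k / 2^n on the level set
# {x : k/2^n <= f(x) < (k+1)/2^n}, and its integral is
# sum over k of (value) * (Lebesgue measure of the level set).
def simple_integral(n):
    total = 0.0
    for k in range(2**n):
        lo, hi = k / 2**n, (k + 1) / 2**n
        # For f(x) = x, the level set is the interval [lo, hi),
        # whose Lebesgue measure is hi - lo.
        total += lo * (hi - lo)
    return total

# The simple-function integrals increase monotonically toward 1/2,
# exactly as the monotone convergence theorem predicts.
approximations = [simple_integral(n) for n in range(1, 11)]
print(approximations)
```

Each `simple_integral(n)` equals $1/2 - 2^{-(n+1)}$, so the sequence climbs to the Lebesgue integral from below; this is the pattern the definition for general nonnegative measurable functions formalizes.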
Study Flow
- Read the pages in order and pause after each page to restate the main definition or theorem.
- Run `theory.ipynb` when you want to check the formulas numerically.
- Use `exercises.ipynb` after the reading path, not before it.
- Return to this overview page when you need the chapter-level navigation.