Supervised Machine Learning

Models that learn from labelled data. Start with OLS — the cleanest possible foundation — then work your way up through log-linear models, categorical variables, and joint hypothesis testing.

Prerequisite

Statistics Basics

Not supervised ML itself, but you need this first. Mean, variance, distributions, p-values — the language all models speak.
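As a flavor of that shared language, a minimal numpy sketch of sample mean and variance (the data here is illustrative, not from the course):

```python
import numpy as np

# Hypothetical sample: six exam scores (illustrative data only)
scores = np.array([62.0, 71.0, 68.0, 75.0, 80.0, 66.0])

mean = scores.mean()         # sample mean
var = scores.var(ddof=1)     # sample variance (n - 1 denominator)
std = np.sqrt(var)           # sample standard deviation

print(mean, var, std)
```

Note `ddof=1`: numpy defaults to the population variance (divide by n), while most statistics texts use the sample variance (divide by n - 1).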

Chapter 1

OLS Regression

How to fit a line through data — and why the math behind it is surprisingly elegant. Includes multiple regression, R², t-tests, and the four OLS assumptions.
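To preview the idea, here is a minimal least-squares sketch on synthetic data (the data and seed are illustrative; the chapter derives the math properly):

```python
import numpy as np

# Synthetic data with a known line: intercept 2, slope 3, Gaussian noise
rng = np.random.default_rng(0)
x = rng.uniform(0, 10, size=100)
y = 2.0 + 3.0 * x + rng.normal(0, 1, size=100)

X = np.column_stack([np.ones_like(x), x])     # design matrix with a constant
beta, *_ = np.linalg.lstsq(X, y, rcond=None)  # least-squares solution

resid = y - X @ beta
r2 = 1 - (resid @ resid) / ((y - y.mean()) @ (y - y.mean()))
print(beta, r2)  # beta ≈ [2, 3], R² close to 1
```

The fitted coefficients land near the true intercept and slope, which is exactly what the OLS assumptions are there to guarantee.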

Chapter 2

Log-Linear Regression

When relationships aren't straight lines. Interpret coefficients as elasticities — percentage changes rather than absolute units.
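As a sketch of the elasticity interpretation, a log-log specification on synthetic data (true elasticity 0.8; data and seed are illustrative):

```python
import numpy as np

# y = 5 * x^0.8 * noise  →  log(y) = log(5) + 0.8 * log(x) + error
rng = np.random.default_rng(1)
x = rng.uniform(1, 100, size=200)
y = 5.0 * x ** 0.8 * np.exp(rng.normal(0, 0.05, size=200))

X = np.column_stack([np.ones(x.size), np.log(x)])
beta, *_ = np.linalg.lstsq(X, np.log(y), rcond=None)
print(beta[1])  # ≈ 0.8: a 1% rise in x goes with ~0.8% rise in y
```

The slope on log(x) is read directly as an elasticity, which is the interpretive payoff of logging both sides.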

Chapter 3 — Coming Soon

Dummy Variables

How to bring categorical data into a regression. Group differences, interaction effects, and the dummy variable trap.
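A quick sketch of the encoding step, with a hypothetical three-level category. Keeping only k - 1 dummies avoids the trap: all k dummies plus a constant would be perfectly collinear.

```python
import numpy as np

# Hypothetical categorical variable with three levels
region = np.array(["north", "south", "west", "south", "north", "west"])
levels = ["north", "south", "west"]

# Encode k - 1 dummies; "north" is the omitted baseline category
dummies = np.column_stack(
    [(region == lev).astype(float) for lev in levels[1:]]
)
print(dummies)  # columns: is_south, is_west; north rows are all zeros
```

Coefficients on these dummies are then read as differences relative to the omitted baseline.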

Chapter 4

F-Test & Joint Significance

Testing whether a group of variables matters — together, not one by one. The F-statistic and restricted vs unrestricted models.
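The restricted-vs-unrestricted comparison can be sketched directly from sums of squared residuals (synthetic data; the restriction here is that two coefficients are jointly zero):

```python
import numpy as np

rng = np.random.default_rng(2)
n = 120
x1, x2, x3 = rng.normal(size=(3, n))
y = 1.0 + 0.5 * x1 + 0.4 * x2 + 0.3 * x3 + rng.normal(0, 1, n)

def ssr(X, y):
    """Sum of squared residuals from an OLS fit."""
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    r = y - X @ beta
    return r @ r

ones = np.ones(n)
ssr_u = ssr(np.column_stack([ones, x1, x2, x3]), y)  # unrestricted model
ssr_r = ssr(np.column_stack([ones, x1]), y)          # restricted: x2 = x3 = 0

q, k = 2, 4  # number of restrictions, parameters in the unrestricted model
F = ((ssr_r - ssr_u) / q) / (ssr_u / (n - k))
print(F)  # compare against the F(q, n - k) critical value
```

A large F means dropping x2 and x3 together costs a lot of fit, so they matter jointly even if each t-test alone looks weak.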

Chapter 5

Model Diagnostics

Is your regression actually valid? Test normality, homoskedasticity, and zero-mean errors. Compare models with Adjusted R², AIC, and BIC.
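For the model-comparison side, a sketch of Adjusted R², AIC, and BIC computed from OLS residuals (Gaussian log-likelihood form; data is synthetic and illustrative):

```python
import numpy as np

rng = np.random.default_rng(3)
n = 80
x = rng.normal(size=n)
y = 1.0 + 2.0 * x + rng.normal(0, 1, n)

X = np.column_stack([np.ones(n), x])
k = X.shape[1]
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
resid = y - X @ beta
ssr = resid @ resid
tss = (y - y.mean()) @ (y - y.mean())

r2 = 1 - ssr / tss
adj_r2 = 1 - (1 - r2) * (n - 1) / (n - k)   # penalizes extra regressors
aic = n * np.log(ssr / n) + 2 * k           # up to an additive constant
bic = n * np.log(ssr / n) + k * np.log(n)   # heavier penalty for large n
print(adj_r2, aic, bic)
```

Unlike plain R², all three penalize model size, so adding a useless regressor can make them worse rather than better.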

Chapter 6

Nonlinear Regression

When a straight line isn't enough. Model curves with quadratic terms and capture interaction effects — where one variable's impact depends on another.
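As a preview, a quadratic-term sketch that recovers a U-shaped relationship (synthetic data with a known curve; seed and values are illustrative):

```python
import numpy as np

# True curve: y = 1 + 0.5x + 2x², so the quadratic term dominates
rng = np.random.default_rng(4)
x = rng.uniform(-3, 3, size=150)
y = 1.0 + 0.5 * x + 2.0 * x ** 2 + rng.normal(0, 1, 150)

X = np.column_stack([np.ones(x.size), x, x ** 2])  # x² as an extra regressor
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
print(beta)  # ≈ [1.0, 0.5, 2.0]
```

The model is still linear in the parameters, so OLS machinery applies unchanged; only the regressors are transformed. Interaction terms work the same way, with a column like x1 * x2 added to the design matrix.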