Hands-on reimplementations with annotated notebooks.
Builds — 5 folders
Backpropagation & Autograd
Scalar autograd engine, manual backprop, automated backward pass, and a minimal neural net library.
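The core of such a scalar engine can be sketched in a few dozen lines; this is a minimal illustrative version (class and method names are assumptions, not the repo's actual API): each `Value` records its parents, and `backward()` topologically sorts the graph and applies the chain rule in reverse.

```python
# Minimal sketch of a scalar autograd engine; names are illustrative.
class Value:
    def __init__(self, data, parents=()):
        self.data = data
        self.grad = 0.0
        self._backward = lambda: None
        self._parents = parents

    def __add__(self, other):
        other = other if isinstance(other, Value) else Value(other)
        out = Value(self.data + other.data, (self, other))
        def _backward():
            # d(a+b)/da = d(a+b)/db = 1
            self.grad += out.grad
            other.grad += out.grad
        out._backward = _backward
        return out

    def __mul__(self, other):
        other = other if isinstance(other, Value) else Value(other)
        out = Value(self.data * other.data, (self, other))
        def _backward():
            # d(a*b)/da = b, d(a*b)/db = a
            self.grad += other.data * out.grad
            other.grad += self.data * out.grad
        out._backward = _backward
        return out

    def backward(self):
        # Topologically sort the graph, then run chain rule in reverse order.
        topo, seen = [], set()
        def build(v):
            if v not in seen:
                seen.add(v)
                for p in v._parents:
                    build(p)
                topo.append(v)
        build(self)
        self.grad = 1.0
        for v in reversed(topo):
            v._backward()

a, b = Value(2.0), Value(3.0)
c = a * b + a        # c = ab + a, so dc/da = b + 1, dc/db = a
c.backward()
```

Accumulating with `+=` in each `_backward` (rather than assigning) is what makes gradients correct when a node is used more than once, as `a` is here.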
Bigram Language Model
Character-level counting model, sampling, loss, smoothing, and an equivalent neural net.
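The counting side of the bigram model fits in a short stdlib-only sketch; the toy corpus and helper names below are illustrative, not the repo's. Pair counts plus add-one smoothing give a next-character distribution, which can then be sampled.

```python
# Minimal sketch of a character-level bigram model with add-one smoothing.
import random
from collections import defaultdict

words = ["emma", "olivia", "ava"]          # illustrative toy corpus
counts = defaultdict(lambda: defaultdict(int))
for w in words:
    chars = ['.'] + list(w) + ['.']        # '.' marks word start and end
    for a, b in zip(chars, chars[1:]):
        counts[a][b] += 1

alphabet = sorted({c for w in words for c in w} | {'.'})

def probs(prev):
    # Add-one (Laplace) smoothing keeps unseen bigrams at nonzero probability.
    row = [counts[prev][c] + 1 for c in alphabet]
    total = sum(row)
    return [n / total for n in row]

def sample(rng):
    out, prev = [], '.'
    while True:
        prev = rng.choices(alphabet, weights=probs(prev))[0]
        if prev == '.':                    # end-of-word token terminates sampling
            return ''.join(out)
        out.append(prev)
```

The training loss of such a model is the mean negative log-probability the table assigns to each observed bigram; smoothing trades a slightly higher loss for never assigning probability zero.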
MLP / Makemore
Scaling the bigram model to an MLP with embeddings, train/val/test splits, and experiments.
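The forward pass of that MLP can be sketched in NumPy (shapes and hyperparameters here are assumptions for illustration): look up an embedding per context character, flatten, pass through a tanh hidden layer, and take a softmax over the next character.

```python
# Minimal NumPy sketch of the embedding-MLP forward pass; all sizes illustrative.
import numpy as np

rng = np.random.default_rng(0)
vocab, block, emb_dim, hidden = 27, 3, 2, 16

C  = rng.normal(size=(vocab, emb_dim))           # embedding table
W1 = rng.normal(size=(block * emb_dim, hidden))
b1 = np.zeros(hidden)
W2 = rng.normal(size=(hidden, vocab)) * 0.01     # small init -> near-uniform logits
b2 = np.zeros(vocab)

X = rng.integers(0, vocab, size=(32, block))     # batch of 3-character contexts
y = rng.integers(0, vocab, size=32)              # next-character targets

emb = C[X].reshape(32, -1)                       # (32, block*emb_dim)
h = np.tanh(emb @ W1 + b1)                       # hidden activations
logits = h @ W2 + b2
logits -= logits.max(axis=1, keepdims=True)      # numerically stable softmax
p = np.exp(logits) / np.exp(logits).sum(axis=1, keepdims=True)
loss = -np.log(p[np.arange(32), y]).mean()       # cross-entropy
```

With the small output-layer init, the initial loss sits near the uniform baseline `-log(1/vocab)`, a standard sanity check before training; the train/val/test splits then guard the hyperparameter experiments against overfitting.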
BatchNorm
Training dynamics, activation statistics, gradient flow, initialization, and normalization.
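The normalization step itself is compact; this is a sketch of a 1-D BatchNorm forward pass (parameter names mirror the usual convention, not necessarily the repo's code): standardize each feature over the batch, then rescale with a learnable gain and bias, keeping running statistics for inference.

```python
# Minimal NumPy sketch of BatchNorm1d's forward pass; names are illustrative.
import numpy as np

class BatchNorm1d:
    def __init__(self, dim, eps=1e-5, momentum=0.1):
        self.eps, self.momentum = eps, momentum
        self.gamma = np.ones(dim)        # learnable gain
        self.beta = np.zeros(dim)        # learnable bias
        self.running_mean = np.zeros(dim)
        self.running_var = np.ones(dim)
        self.training = True

    def __call__(self, x):
        if self.training:
            mean = x.mean(axis=0)
            var = x.var(axis=0)
            # Exponential moving average of batch statistics, used at inference.
            self.running_mean = (1 - self.momentum) * self.running_mean + self.momentum * mean
            self.running_var = (1 - self.momentum) * self.running_var + self.momentum * var
        else:
            mean, var = self.running_mean, self.running_var
        xhat = (x - mean) / np.sqrt(var + self.eps)   # zero mean, unit std per feature
        return self.gamma * xhat + self.beta

rng = np.random.default_rng(0)
x = rng.normal(loc=5.0, scale=3.0, size=(64, 8))      # badly scaled activations
out = BatchNorm1d(8)(x)                                # re-standardized per feature
```

Normalizing pre-activations this way is what keeps tanh layers out of their saturated regime, which is the connection to the activation-statistics and gradient-flow analysis above.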
JPEG Compression
Colour conversion, 8x8 DCT blocks, quantization, zig-zag scanning, RLE, and Huffman coding.
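The lossy heart of that pipeline, the 8x8 DCT followed by quantization, can be sketched with an orthonormal DCT-II matrix; the flat quantization table below is illustrative, not the standard JPEG luminance table, and zig-zag/RLE/Huffman are omitted.

```python
# Minimal NumPy sketch of the JPEG 8x8 DCT + quantization step.
import numpy as np

N = 8
# Orthonormal DCT-II basis: row k, column n is sqrt(2/N)*cos(pi*(2n+1)*k/(2N)),
# with the k=0 row scaled by 1/sqrt(2) so that D @ D.T = I.
k = np.arange(N)
D = np.sqrt(2 / N) * np.cos(np.pi * (2 * k[None, :] + 1) * k[:, None] / (2 * N))
D[0] *= 1 / np.sqrt(2)

def dct2(block):
    return D @ block @ D.T       # 2D DCT: transform rows, then columns

def idct2(coeffs):
    return D.T @ coeffs @ D      # inverse, since D is orthogonal

Q = np.full((N, N), 16)          # flat illustrative quantization table
block = np.arange(64).reshape(8, 8).astype(float) - 128   # level-shifted pixels
coeffs = dct2(block)
quantized = np.round(coeffs / Q) # the lossy step: small coefficients become 0
recon = idct2(quantized * Q)     # dequantize and invert
```

Quantization is the only lossy step: the DCT itself is an exact orthogonal change of basis, and the zeroed high-frequency coefficients are what make the subsequent zig-zag scan, RLE, and Huffman coding effective.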