MNIST CNN
Scratch CNN fundamentals, early stopping, confusion matrix analysis, and the first complete PyTorch training pipeline.
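The early stopping mentioned above reduces to a small patience counter around the validation loop. A minimal framework-agnostic sketch; the patience value and loss numbers below are illustrative, not taken from the project:

```python
class EarlyStopping:
    """Stop training when validation loss stops improving."""

    def __init__(self, patience=3, min_delta=0.0):
        self.patience = patience      # epochs to wait after the last improvement
        self.min_delta = min_delta    # minimum change that counts as improvement
        self.best_loss = float("inf")
        self.counter = 0

    def step(self, val_loss):
        """Return True when training should stop."""
        if val_loss < self.best_loss - self.min_delta:
            self.best_loss = val_loss
            self.counter = 0
        else:
            self.counter += 1
        return self.counter >= self.patience


stopper = EarlyStopping(patience=2)
# Made-up validation losses: improvement stalls after epoch 1.
for epoch, loss in enumerate([0.9, 0.7, 0.71, 0.72, 0.73]):
    if stopper.step(loss):
        break  # stops once loss has not improved for `patience` epochs
```

In a real PyTorch loop the same `step` call would sit after each validation pass, usually next to a checkpoint save of the best model.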
This page is the browser-friendly landing page for the projects' visual summaries. Each summary turns a project README into a more visual walkthrough, with results, architecture, and key lessons in one place.
If you are opening this from GitHub Pages, each project card below links to the live summary page.
The source code and training details still live in each project's README.md.
BatchNorm and learning-rate scheduling push a scratch CNN to its practical ceiling on a harder real-image benchmark.
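The learning-rate-scheduling half of that recipe can be sketched in a few lines with PyTorch's `StepLR`; the single toy parameter, step size, and decay factor here are placeholders, not the project's settings:

```python
import torch

# Toy "model": one parameter, just enough to drive an optimizer and scheduler.
param = torch.nn.Parameter(torch.zeros(1))
optimizer = torch.optim.SGD([param], lr=0.1)
# Halve the learning rate every 10 epochs (values are illustrative).
scheduler = torch.optim.lr_scheduler.StepLR(optimizer, step_size=10, gamma=0.5)

lrs = []
for epoch in range(30):
    optimizer.step()        # normally: forward pass, loss.backward(), then step
    scheduler.step()        # the scheduler advances once per epoch
    lrs.append(optimizer.param_groups[0]["lr"])
```

After 30 epochs the learning rate has been halved three times, from 0.1 down to 0.0125; BatchNorm needs no such loop-level code, since `nn.BatchNorm2d` layers are simply inserted into the model definition.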
ResNet18 feature extraction and fine-tuning break through the scratch-model ceiling on a 100-class sports dataset.
Character-level text generation introduces hidden states, BPTT, temperature sampling, and the limits of vanilla RNN memory.
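Temperature sampling, as referenced above, rescales the logits before the softmax: temperatures below 1 sharpen the distribution toward the argmax, temperatures above 1 flatten it. A small NumPy sketch with made-up logit values:

```python
import numpy as np

def sample_with_temperature(logits, temperature, rng):
    """Sample a token index from temperature-scaled logits."""
    scaled = logits / temperature            # T < 1 -> peakier distribution
    probs = np.exp(scaled - scaled.max())    # numerically stable softmax
    probs /= probs.sum()
    return rng.choice(len(logits), p=probs)

rng = np.random.default_rng(0)
logits = np.array([2.0, 1.0, 0.1])           # hypothetical next-character scores
pick = sample_with_temperature(logits, 0.1, rng)   # low T: near-greedy choice
```

In the character-level model the same call runs once per generated character, feeding each sampled character back in as the next input.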
IMDB review classification with an LSTM, packed sequences, and pretrained GloVe embeddings, plus a final frozen-vs-trainable embedding comparison for clean transfer-style analysis.
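The frozen-vs-trainable comparison comes down to a single flag on the embedding layer. A sketch with a small random matrix standing in for the GloVe vectors (the vocabulary size and dimension are placeholders):

```python
import torch

# Random matrix standing in for pretrained GloVe vectors (vocab 10, dim 5).
glove = torch.randn(10, 5)

# freeze=True keeps the vectors fixed; freeze=False lets backprop update them.
frozen = torch.nn.Embedding.from_pretrained(glove, freeze=True)
trainable = torch.nn.Embedding.from_pretrained(glove, freeze=False)
```

Both layers start from identical weights, so any downstream accuracy gap isolates the effect of letting the embeddings adapt to the task.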