This project is an implementation of a Multilayer Perceptron (MLP), a type of feedforward neural network, built from scratch using only NumPy.
An MLP consists of layers of interconnected nodes (neurons), where each layer applies a linear transformation followed by a non-linear activation function (a single layer's forward pass is sketched below).
The goal of this project was to build and train MLPs without any deep learning libraries (TensorFlow, PyTorch) in order to understand the internals of neural networks.
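
As an illustration of what one layer computes, here is a minimal NumPy sketch of a single layer's forward pass. The names (`sigmoid`, `W`, `b`) and shapes are illustrative assumptions, not this project's actual API.

```python
import numpy as np

def sigmoid(z):
    # Logistic activation, applied element-wise.
    return 1.0 / (1.0 + np.exp(-z))

# Illustrative shapes: a batch of 4 samples, 3 input features, 5 hidden units.
rng = np.random.default_rng(0)
X = rng.normal(size=(4, 3))        # input batch
W = rng.normal(size=(3, 5)) * 0.1  # weight matrix
b = np.zeros(5)                    # bias vector

# One layer = linear transformation followed by a non-linear activation.
Z = X @ W + b     # linear part: Z = XW + b
A = sigmoid(Z)    # non-linear part
print(A.shape)    # (4, 5)
```

Stacking several such layers, with the final one producing class scores, gives the full network.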
The MLP (from scratch)
- Hand-built layers, activations, and backpropagation; no deep learning frameworks
- Uses only NumPy, so the math (sigmoid, cross-entropy) maps directly to code; see the sketch after this list
- Efficient enough to train on classic datasets like MNIST and Fashion-MNIST
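
To make the math-to-code claim concrete, here is a small sketch of how the sigmoid derivative and a cross-entropy loss might look in plain NumPy. The function names and signatures are assumptions for illustration, not necessarily the ones this repository uses.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def sigmoid_grad(z):
    # Derivative sigma'(z) = sigma(z) * (1 - sigma(z)), used during backpropagation.
    s = sigmoid(z)
    return s * (1.0 - s)

def cross_entropy(y_true, y_pred, eps=1e-12):
    # Mean cross-entropy over a batch of one-hot targets; eps guards against log(0).
    y_pred = np.clip(y_pred, eps, 1.0 - eps)
    return -np.mean(np.sum(y_true * np.log(y_pred), axis=1))

# Illustrative check: two samples, three classes.
y_true = np.array([[1.0, 0.0, 0.0], [0.0, 1.0, 0.0]])
y_pred = np.array([[0.7, 0.2, 0.1], [0.1, 0.8, 0.1]])
print(cross_entropy(y_true, y_pred))  # ~0.29
```

Backpropagation then chains derivatives like `sigmoid_grad` layer by layer to compute the weight gradients used for training.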
The CLI (on top)
- Interactive walkthrough to configure and launch training runs
- Plots loss curves and decision boundaries so you can watch the model improve
- Saves each run so you can inspect predictions and tweak settings later
Results
- 98.01% accuracy on MNIST digits
- 88.80% accuracy on Fashion-MNIST clothing items