MLP from Scratch (NumPy)

This project is an implementation of a Multilayer Perceptron (MLP), a type of feedforward neural network, from scratch using only NumPy.

An MLP consists of layers of interconnected nodes (neurons); each layer applies a linear transformation followed by a non-linear activation function.
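As a rough illustration (not the code from this repository), a single dense layer in NumPy might look like the sketch below. The names (x, W, b) and the choice of ReLU as the activation are assumptions for the example only.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical single dense layer; names and sizes are illustrative.
x = rng.standard_normal((4, 3))   # batch of 4 inputs with 3 features each
W = rng.standard_normal((3, 5))   # weight matrix: 3 inputs -> 5 neurons
b = np.zeros(5)                   # one bias per neuron

z = x @ W + b                     # linear transformation
a = np.maximum(z, 0.0)            # non-linear activation (ReLU)

print(a.shape)                    # (4, 5): 4 samples, 5 activations each
```

Stacking several such layers, with the output of one feeding the next, gives the full feedforward pass of an MLP.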

The goal of this project was to build and train MLPs without any deep learning libraries (TensorFlow, PyTorch) in order to understand the internals of neural networks.

The MLP (from scratch)

The CLI (on top)

Results

Links