Neural Network from Scratch

Build a neural network step by step — from neurons to backpropagation. No frameworks, just pure math brought to life.

Neuron
A neuron computes a weighted sum of its inputs, adds a bias, and passes the result through an activation function. It is the fundamental unit of a neural network.
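The weighted sum, bias, and activation described above can be sketched in a few lines. The input values, weights, bias, and the choice of ReLU as the activation are illustrative assumptions, not values from the text:

```python
import numpy as np

# Illustrative values (not from the text): 3 inputs into one neuron.
inputs = np.array([1.0, 2.0, 3.0])
weights = np.array([0.2, 0.8, -0.5])
bias = 2.0

# Weighted sum of inputs plus bias...
z = np.dot(weights, inputs) + bias
# ...passed through an activation function (ReLU used here as an example).
output = max(0.0, z)
print(output)  # 2.3
```

Any activation could stand in for ReLU here; the structure (dot product, plus bias, through a nonlinearity) is the part the section is describing.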
Dot Product: Vector × Vector
A single neuron multiplies each input by its weight and sums the results — that's a dot product. Here we compute it manually element-by-element, then show how np.dot does it in one line. • Both vectors must have equal length. • Commutative: np.dot(a, b) = np.dot(b, a).
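A minimal sketch of the element-by-element loop next to the one-line `np.dot` call, using made-up example vectors:

```python
import numpy as np

a = np.array([1.0, 2.0, 3.0])
b = np.array([4.0, 5.0, 6.0])

# Manual dot product: multiply element-by-element, then sum.
manual = 0.0
for a_i, b_i in zip(a, b):
    manual += a_i * b_i

# The same computation in one line.
result = np.dot(a, b)

print(manual, result)  # 32.0 32.0
# For 1-D vectors the operation is commutative.
print(np.dot(a, b) == np.dot(b, a))  # True
```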
Dot Product: Matrix × Vector
A layer of neurons is just multiple dot products stacked: each row of the weight matrix dots with the input vector to produce one neuron's output. Here we compute each neuron manually, then show how np.dot does the whole layer at once. • Matrix columns must equal vector length. • Not commutative: np.dot(W, x) ≠ np.dot(x, W).
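A sketch of one dot product per row versus the whole layer at once. The weight matrix and input vector below are illustrative (3 neurons, 4 inputs); biases are omitted since this section is about the dot product itself:

```python
import numpy as np

# One row of W per neuron: 3 neurons, each with 4 weights.
W = np.array([[0.2, 0.8, -0.5, 1.0],
              [0.5, -0.91, 0.26, -0.5],
              [-0.26, -0.27, 0.17, 0.87]])
x = np.array([1.0, 2.0, 3.0, 2.5])

# Manual: each neuron's output is its row of W dotted with x.
manual = np.array([sum(w_i * x_i for w_i, x_i in zip(row, x))
                   for row in W])

# np.dot computes the whole layer in one call.
layer = np.dot(W, x)

print(layer)  # [ 2.8   -1.79   1.885]
```

Note the shape constraint from the text: `W` has 4 columns, matching the length of `x`; `np.dot(x, W)` would raise a shape error.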
Dot Product: Matrix × Matrix
When you have a batch of samples, each sample needs its own layer output. That's a matrix × matrix multiply: each row of X dots with each column of W^T. Here we compute it with a triple loop, then show how np.dot replaces all of it. • A's columns must equal B's rows. • Not commutative: np.dot(A, B) ≠ np.dot(B, A).
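The triple loop and its one-line replacement can be sketched as follows, with an assumed batch of 2 samples and the same illustrative 3×4 weight matrix:

```python
import numpy as np

# Batch of 2 samples (rows of X), layer of 3 neurons (rows of W).
X = np.array([[1.0, 2.0, 3.0, 2.5],
              [2.0, 5.0, -1.0, 2.0]])
W = np.array([[0.2, 0.8, -0.5, 1.0],
              [0.5, -0.91, 0.26, -0.5],
              [-0.26, -0.27, 0.17, 0.87]])

# Manual triple loop: out[i, j] = row i of X dotted with row j of W
# (equivalently, column j of W.T).
out = np.zeros((X.shape[0], W.shape[0]))
for i in range(X.shape[0]):          # each sample
    for j in range(W.shape[0]):      # each neuron
        for k in range(X.shape[1]):  # each input dimension
            out[i, j] += X[i, k] * W[j, k]

# np.dot replaces all three loops; transposing W makes
# X's columns (4) match W.T's rows (4).
batch = np.dot(X, W.T)
print(batch)  # one row of layer outputs per sample, shape (2, 3)
```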
Dot Product Application
np.dot handles three cases: vector·vector → scalar (single neuron), matrix·vector → vector (layer output), and matrix·matrix → matrix (batch processing). It is the core operation behind every neural network layer.
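The three cases side by side, with small made-up arrays chosen only so the shapes line up (2 features, 3 neurons, 2 samples):

```python
import numpy as np

x = np.array([1.0, 2.0])            # one sample, 2 features
W = np.array([[0.5, -1.0],
              [1.5, 2.0],
              [0.0, 0.3]])          # 3 neurons, 2 weights each
X = np.array([[1.0, 2.0],
              [3.0, -1.0]])         # batch of 2 samples

s = np.dot(x, x)      # vector . vector -> scalar (single neuron, no bias)
v = np.dot(W, x)      # matrix . vector -> vector (one output per neuron)
M = np.dot(X, W.T)    # matrix . matrix -> matrix (samples x neurons)

print(s)        # 5.0
print(v.shape)  # (3,)
print(M.shape)  # (2, 3)
```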