
Backpropagation Visualizer

Step through forward and backward passes in a neural network. See gradients flow, weights update, and the chain rule in action.
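The chain rule the visualizer animates can be sketched for a single sigmoid neuron with MSE loss (a minimal illustration, not the visualizer's actual code — the weights and inputs here are arbitrary):

```javascript
// One sigmoid neuron, MSE loss: dL/dw = dL/da * da/dz * dz/dw
const sigmoid = z => 1 / (1 + Math.exp(-z));

function gradients(w, b, x, target) {
  // forward pass
  const z = w * x + b;                  // pre-activation
  const a = sigmoid(z);                 // activation
  const loss = 0.5 * (a - target) ** 2; // MSE for one output

  // backward pass: the chain rule, factor by factor
  const dL_da = a - target;  // d(0.5*(a-t)^2)/da
  const da_dz = a * (1 - a); // sigmoid'(z)
  const dz_dw = x;           // d(w*x+b)/dw
  return { loss, dw: dL_da * da_dz * dz_dw, db: dL_da * da_dz };
}

// Sanity check: analytic gradient vs. a finite-difference estimate
const { dw } = gradients(0.5, -0.2, 1.0, 1.0);
const lossAt = w => 0.5 * (sigmoid(w * 1.0 - 0.2) - 1.0) ** 2;
const eps = 1e-6;
const numeric = (lossAt(0.5 + eps) - lossAt(0.5 - eps)) / (2 * eps);
console.log(Math.abs(dw - numeric) < 1e-6); // true
```

The same three-factor product repeats layer by layer during the backward pass; each step of the visualizer corresponds to one such factor.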

Backpropagation Visualizerbeta
Epoch: 0

Presets

Classic XOR: requires a hidden layer to solve
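Why XOR needs a hidden layer: no single linear threshold separates its outputs, but two tanh hidden units can. A sketch with hand-picked weights (illustrative only, not the weights the preset learns):

```javascript
// 2-2-1 tanh network that computes XOR: h1 acts like OR, h2 like AND,
// and the output fires when h1 is on but h2 is off.
const t = Math.tanh;

function xorNet(x1, x2) {
  const h1 = t(10 * x1 + 10 * x2 - 5);  // ~OR(x1, x2)
  const h2 = t(10 * x1 + 10 * x2 - 15); // ~AND(x1, x2)
  const out = t(10 * h1 - 10 * h2 - 5); // OR and not AND
  return out > 0 ? 1 : 0;
}

for (const [x1, x2] of [[0, 0], [0, 1], [1, 0], [1, 1]]) {
  console.log(x1, x2, '->', xorNet(x1, x2)); // 0, 1, 1, 0
}
```

Training finds a similar decomposition automatically; the visualizer lets you watch the hidden units take on these roles.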

Architecture

Hidden Layers: 1
Hidden 1: 4 neurons
Activation
f(z) = tanh(z)
f'(z) = 1 - tanh(z)^2
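The derivative above follows from the identity d/dz tanh(z) = 1 - tanh²(z). A quick numerical check at a few points (a sketch, not the visualizer's code):

```javascript
// Verify tanh'(z) = 1 - tanh(z)^2 against a central-difference estimate
const tanhPrime = z => 1 - Math.tanh(z) ** 2;
const eps = 1e-5;

for (const z of [-2, -0.5, 0, 0.5, 2]) {
  const numeric = (Math.tanh(z + eps) - Math.tanh(z - eps)) / (2 * eps);
  console.log(z, Math.abs(tanhPrime(z) - numeric) < 1e-8); // true
}
```

This identity is why tanh is convenient for backprop: the backward pass can reuse the activation computed in the forward pass instead of re-evaluating tanh.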

Training

Learning Rate: 0.1000
Sample: 1/4
in: [0, 0] target: [0, 1]

Step Controls

Legend: +weight / −weight / gradient
Loss: --
Phase: Idle
Epoch: 0
Layers: 2-4-2

Neural network and backpropagation implemented from scratch — no ML libraries. Uses MSE loss with sigmoid output. All computation runs entirely in your browser. Try the "Vanishing Gradients" preset with the gradient magnitude chart enabled to see why deep sigmoid networks are hard to train.
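The vanishing-gradient effect the preset demonstrates follows directly from the chain rule: sigmoid'(z) = s(z)(1 − s(z)) peaks at 0.25, and backprop multiplies one such factor per layer, so the gradient reaching early layers shrinks at least like 0.25^depth. A sketch of the best case, assuming unit weights (not the visualizer's implementation):

```javascript
// Upper bound on the gradient reaching the first layer of a deep sigmoid
// chain, with every pre-activation at z = 0 where sigmoid' is maximal (0.25).
const sigmoid = z => 1 / (1 + Math.exp(-z));

function gradientMagnitude(depth, z = 0) {
  let g = 1;
  for (let i = 0; i < depth; i++) {
    const s = sigmoid(z);
    g *= s * (1 - s); // one chain-rule factor per layer
  }
  return g;
}

for (const d of [1, 4, 8]) {
  console.log(d, gradientMagnitude(d)); // 0.25, ~0.0039, ~1.5e-5
}
```

Away from z = 0 the factors are even smaller, so real deep sigmoid networks decay faster than this bound — which is what the gradient magnitude chart makes visible.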