Backpropagation
The algorithm that trains neural networks by calculating how each weight contributes to prediction error and adjusting weights to reduce that error.
Backpropagation (backward propagation of errors) is the core training algorithm for neural networks. After a forward pass produces a prediction, the algorithm computes the error between the prediction and the true target, then propagates that error backward through the network layer by layer using the chain rule from calculus. This yields the gradient of the error with respect to each weight, revealing how much each weight contributed to the error and enabling precise adjustments. The process repeats over many examples until the network converges on weights that minimize prediction error.
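The loop described above can be sketched in a few lines. The following is a minimal illustration, not a production implementation: a network with one tanh hidden unit trained by stochastic gradient descent, where every gradient is derived by hand with the chain rule. The function and variable names (`train`, `predict`, `w1`, `d_z`, and so on) are illustrative choices, not standard API.

```python
import math
import random

# Tiny network: y_hat = w2 * tanh(w1 * x + b1) + b2.
# Each training step is one forward pass, then backpropagation:
# the chain rule traces the squared error back to every weight.

def train(data, lr=0.1, epochs=2000, seed=0):
    rng = random.Random(seed)
    w1, b1, w2, b2 = (rng.uniform(-1.0, 1.0) for _ in range(4))
    for _ in range(epochs):
        for x, y in data:
            # Forward pass: compute the prediction.
            z = w1 * x + b1              # hidden pre-activation
            h = math.tanh(z)             # hidden activation
            y_hat = w2 * h + b2          # output prediction
            # Backward pass: chain rule, output layer first.
            d_y = 2.0 * (y_hat - y)      # dL/dy_hat for L = (y_hat - y)^2
            d_w2 = d_y * h               # dL/dw2
            d_b2 = d_y                   # dL/db2
            d_h = d_y * w2               # error flowing back into the hidden layer
            d_z = d_h * (1.0 - h * h)    # through tanh'(z) = 1 - tanh(z)^2
            d_w1 = d_z * x               # dL/dw1
            d_b1 = d_z                   # dL/db1
            # Adjust each weight against its gradient.
            w1 -= lr * d_w1
            b1 -= lr * d_b1
            w2 -= lr * d_w2
            b2 -= lr * d_b2
    return w1, b1, w2, b2

def predict(params, x):
    w1, b1, w2, b2 = params
    return w2 * math.tanh(w1 * x + b1) + b2
```

A usage example: fitting the three points (-1, -0.5), (0, 0), (1, 0.5) drives the weights toward a function close to y = 0.5x. Real networks have many layers and weights, so frameworks automate exactly this chain-rule bookkeeping (reverse-mode automatic differentiation), but the per-weight logic is the same.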
Also known as
backprop, backward propagation, error backpropagation