Loss Function

A mathematical function that quantifies how wrong a model's predictions are, providing the signal that drives learning.

The loss function (or cost function) measures the gap between a model's predictions and actual outcomes. Common examples include mean squared error for regression and cross-entropy for classification. The choice of loss function shapes what the model learns: a loss like zero-one error rewards only whether each prediction is correct, while a loss like cross-entropy also rewards well-calibrated probability estimates, penalizing confident wrong answers far more than hesitant ones. Training aims to minimize the loss averaged over the training dataset.
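A minimal sketch of the two losses named above, written with NumPy; the function names and the clipping constant are illustrative choices, not a standard API:

```python
import numpy as np

def mean_squared_error(y_true, y_pred):
    """Average squared gap between targets and predictions (regression)."""
    y_true = np.asarray(y_true, dtype=float)
    y_pred = np.asarray(y_pred, dtype=float)
    return np.mean((y_true - y_pred) ** 2)

def binary_cross_entropy(y_true, p_pred, eps=1e-12):
    """Average negative log-likelihood of the true label (binary classification).

    y_true holds 0/1 labels; p_pred holds predicted probabilities of class 1.
    Probabilities are clipped away from 0 and 1 to keep log() finite.
    """
    y_true = np.asarray(y_true, dtype=float)
    p = np.clip(np.asarray(p_pred, dtype=float), eps, 1 - eps)
    return -np.mean(y_true * np.log(p) + (1 - y_true) * np.log(1 - p))

# Cross-entropy punishes a confident wrong answer much harder than a hesitant one:
print(binary_cross_entropy([1], [0.9]))   # low loss: confident and correct
print(binary_cross_entropy([1], [0.01]))  # high loss: confident and wrong
```

Note the clipping step: without it, a prediction of exactly 0 or 1 on the wrong label would make the loss infinite, which is the mathematical expression of the penalty on overconfidence.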

Also known as

cost function, objective function, error function