Backpropagation

The term backpropagation, short for “backward propagation of errors,” refers to the algorithm used in supervised learning to compute the gradients needed to reduce a neural network's prediction error.

In essence, backpropagation is an application of the chain rule: it computes the gradient of the loss function with respect to each model parameter by propagating derivatives backward through the network.
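As an illustration (the symbols here are generic, not taken from the original text): for a network output $\hat{y} = f(g(x; w))$ and a loss $L(\hat{y}, y)$, the chain rule decomposes the parameter gradient into a product of local derivatives:

```latex
\frac{\partial L}{\partial w}
  = \frac{\partial L}{\partial \hat{y}}
    \cdot \frac{\partial \hat{y}}{\partial g}
    \cdot \frac{\partial g}{\partial w}
```

Each factor is cheap to compute locally at its layer, which is what makes the backward pass efficient.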
The mechanism operates in two main phases: the forward pass and the backward pass.
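The two phases can be sketched in NumPy. This is a minimal, illustrative example, not a production implementation: the network shape, the tanh activation, the mean-squared-error loss, and all variable names are assumptions chosen for clarity.

```python
import numpy as np

# Tiny 2-layer regression network (sizes are illustrative).
rng = np.random.default_rng(0)
X = rng.normal(size=(8, 3))            # 8 samples, 3 features
y = rng.normal(size=(8, 1))            # regression targets

W1 = rng.normal(size=(3, 4)) * 0.1
W2 = rng.normal(size=(4, 1)) * 0.1
lr = 0.1                               # learning rate

losses = []
for _ in range(200):
    # Forward pass: compute activations layer by layer.
    z1 = X @ W1
    a1 = np.tanh(z1)
    y_hat = a1 @ W2
    loss = np.mean((y_hat - y) ** 2)   # mean squared error
    losses.append(loss)

    # Backward pass: apply the chain rule from output to input.
    n = X.shape[0]
    d_yhat = 2 * (y_hat - y) / n       # dL/dy_hat
    dW2 = a1.T @ d_yhat                # dL/dW2
    d_a1 = d_yhat @ W2.T               # dL/da1
    d_z1 = d_a1 * (1 - a1 ** 2)        # tanh'(z) = 1 - tanh(z)^2
    dW1 = X.T @ d_z1                   # dL/dW1

    # Gradient-descent update using the computed gradients.
    W1 -= lr * dW1
    W2 -= lr * dW2
```

Running the loop, the loss should decrease as the weights move along the negative gradient, which is the whole point of computing these derivatives.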
