The backpropagation formula can be expressed as:
```
∂E/∂w = (∂E/∂y) * (∂y/∂w)
```
where:
- E is the error of the network
- y is the output of the network
- w is the weight of a connection in the network
The formula is an application of the chain rule. The first factor, ∂E/∂y, tells the network how much the error changes if the output changes by a small amount. The second factor, ∂y/∂w, tells the network how much the output changes if the weight changes by a small amount. Multiplying them gives ∂E/∂w, the partial derivative of the error with respect to the weight — how much the error will change if that weight is nudged.
The network uses this gradient to calculate the appropriate adjustment for the weight: it moves the weight a small step in the direction that decreases the error (gradient descent). The goal is to adjust all the weights so that the network's error is minimized, which means the network is performing well on the given task.
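The chain-rule computation and weight update above can be sketched for the simplest possible case: a single linear neuron with squared error. The names `x`, `t`, `w`, and `lr` are illustrative assumptions, not part of the original text.

```python
# Minimal sketch of backpropagation for one weight, assuming a single
# linear neuron y = w * x and squared error E = (y - t)^2 / 2.

def train_step(w, x, t, lr=0.1):
    y = w * x                 # forward pass: output of the network
    E = 0.5 * (y - t) ** 2    # error of the network
    dE_dy = y - t             # ∂E/∂y: how the error changes with the output
    dy_dw = x                 # ∂y/∂w: how the output changes with the weight
    dE_dw = dE_dy * dy_dw     # chain rule: ∂E/∂w = (∂E/∂y) * (∂y/∂w)
    w = w - lr * dE_dw        # gradient descent: step against the gradient
    return w, E

# Repeated steps shrink the error and drive w toward the value
# that maps the input x to the target t.
w = 0.0
for _ in range(50):
    w, E = train_step(w, x=2.0, t=4.0)
```

With `x = 2.0` and `t = 4.0`, the weight converges toward 2.0, since that makes the output equal the target and the gradient vanish.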
Backpropagation is a powerful learning algorithm that allows neural networks to learn from their mistakes and gradually improve their performance.