Backpropagation

Whereas Logistic Regression is made of a single layer of weights, Neural Networks are made of many layers with non-linear activations. We need a mechanism to update the weights of the neurons using the gradient of the loss. This mechanism is called Backpropagation: it propagates the gradient from the last layer back to the first one.

Chain rule

The chain rule allows you to decompose the derivative of a composite function into a product of the derivatives of its parts: if $L(x) = f(g(x))$, then $\frac{dL}{dx} = f'(g(x)) \, g'(x)$. Applied layer by layer, it turns the gradient of the loss with respect to any weight into a product of local derivatives.
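To make the mechanics concrete, here is a minimal sketch of backpropagation for a one-hidden-layer network with sigmoid activations and squared-error loss. The names (`W1`, `W2`, `lr`), shapes, and hyperparameters are illustrative assumptions, not taken from the text:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Illustrative setup: 4 samples, 3 features, 5 hidden units, 1 output.
rng = np.random.default_rng(0)
X = rng.normal(size=(4, 3))          # inputs
y = rng.normal(size=(4, 1))          # targets
W1 = rng.normal(size=(3, 5)) * 0.1   # first-layer weights
W2 = rng.normal(size=(5, 1)) * 0.1   # second-layer weights
lr = 0.1                              # learning rate (assumed value)

for step in range(100):
    # Forward pass: keep the intermediate activations, the backward
    # pass needs them to evaluate the local derivatives.
    z1 = X @ W1
    a1 = sigmoid(z1)
    z2 = a1 @ W2
    y_hat = sigmoid(z2)
    loss = np.mean((y_hat - y) ** 2)

    # Backward pass: propagate the gradient from the last layer to the
    # first, multiplying local derivatives via the chain rule.
    dL_dyhat = 2 * (y_hat - y) / y.shape[0]
    dL_dz2 = dL_dyhat * y_hat * (1 - y_hat)   # sigmoid'(z) = s(z)(1 - s(z))
    dL_dW2 = a1.T @ dL_dz2
    dL_da1 = dL_dz2 @ W2.T                    # gradient flowing into layer 1
    dL_dz1 = dL_da1 * a1 * (1 - a1)
    dL_dW1 = X.T @ dL_dz1

    # Gradient-descent update of each layer's weights.
    W1 -= lr * dL_dW1
    W2 -= lr * dL_dW2
```

Note how each `dL_d*` quantity is the product of the previous one and a local derivative: that product structure is exactly the chain rule, and it is what lets the gradient flow backward through arbitrarily many layers.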