From the course: Deep Learning: Getting Started
Back propagation
- (Educator) Once we have estimated the prediction error from forward propagation, we use back propagation to adjust the weights and biases. What is the purpose of back propagation? During forward propagation, we found an overall error based on the entire network. Each node in the neural network contributes to this overall error, and a node's contribution is determined by the values of its weights and biases. Different nodes contribute differently, based on how well their weights and biases model the relationship between the feature and target variables. As we tune the network, the weights and biases of each node need to be adjusted in order to lower the error contributed by that node.

How does back propagation work? It works in the reverse direction of forward propagation. We start from the output layer and compute a delta value for this layer based on the overall error. This delta value is an adjustment that is then applied to all the weights and biases in the layer, resulting in new values of weights and biases. Then we derive a new delta value for the previous layer based on the new values in the current layer, and apply it to the weights and biases in that previous layer. This process of computing deltas, applying them to the weights and biases, and propagating backward continues until we reach the input layer.

This image shows the back propagation process for the same network example. The deltas D1 to D4 are computed at each layer and applied to their weights and biases. They also propagate to the previous layer and influence its deltas. If you are interested in the math behind these computations, I recommend additional readings. Again, the deep learning libraries take care of these computations. At the end of the back propagation process, we will have an updated set of weights and biases that reduces the overall prediction error. How do we continue to reduce error and improve accuracy?
Let's discuss that in the next video.
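The layer-by-layer delta computation described above can be sketched in a few lines of NumPy. This is a minimal toy example (a hypothetical 2-3-1 network with made-up random data, not the network from the course's diagram): deltas are computed at the output layer, propagated back through the hidden layer, and applied to the weights and biases with one small gradient-descent step.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical tiny network: 2 inputs -> 3 hidden units -> 1 output.
X = rng.normal(size=(4, 2))   # 4 samples, 2 features
y = rng.normal(size=(4, 1))   # 4 target values

W1 = rng.normal(size=(2, 3)); b1 = np.zeros(3)
W2 = rng.normal(size=(3, 1)); b2 = np.zeros(1)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# --- Forward propagation ---
a1 = sigmoid(X @ W1 + b1)     # hidden-layer activations
y_hat = a1 @ W2 + b2          # linear output layer
loss_before = np.mean((y_hat - y) ** 2)   # overall prediction error (MSE)

# --- Back propagation: deltas flow from the output layer backward ---
n = X.shape[0]
delta2 = 2.0 * (y_hat - y) / n            # delta at the output layer
dW2 = a1.T @ delta2                       # gradients for this layer's weights
db2 = delta2.sum(axis=0)

delta1 = (delta2 @ W2.T) * a1 * (1 - a1)  # delta propagated to the hidden layer
dW1 = X.T @ delta1
db1 = delta1.sum(axis=0)

# --- Apply the deltas (one small gradient-descent step) ---
lr = 0.01
W2 -= lr * dW2; b2 -= lr * db2
W1 -= lr * dW1; b1 -= lr * db1

# Run forward propagation again: the overall error should now be lower.
loss_after = np.mean((sigmoid(X @ W1 + b1) @ W2 + b2 - y) ** 2)
print(loss_after < loss_before)
```

In a real project the deep learning library computes all of these deltas for you; the sketch only makes visible what happens between the forward pass and the weight update.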