Backpropagation Algorithm: Example
PROF. NAVNEET GOYAL
Source: A Step by Step Backpropagation Example by Matt Mazur
Example: The Network
Example
A single training example: given inputs 0.05 and 0.10, we want the neural network to output 0.01 and 0.99.
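The network diagram from this slide does not survive the text extraction. As a sketch of the setup in Mazur's walkthrough, which this deck follows: two inputs feed two sigmoid hidden neurons (h1, h2), which feed two sigmoid output neurons (o1, o2); each layer has one shared bias. The initial values below are the ones the deck's numbers assume.

    # Setup from Mazur's walkthrough (variable names follow his notation)
    i1, i2 = 0.05, 0.10                        # inputs
    t1, t2 = 0.01, 0.99                        # targets for o1 and o2
    w1, w2, w3, w4 = 0.15, 0.20, 0.25, 0.30    # input -> hidden weights
    w5, w6, w7, w8 = 0.40, 0.45, 0.50, 0.55    # hidden -> output weights
    b1, b2 = 0.35, 0.60                        # hidden and output biases
    lr = 0.5                                   # learning rate (eta)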
Example: The Forward Pass
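The worked numbers on this slide are lost in extraction. A minimal Python sketch, continuing the setup block above, reproduces them (commented values are from the source):

    import math

    def sigmoid(x):
        return 1.0 / (1.0 + math.exp(-x))

    # Hidden layer: net input = weighted sum of the inputs plus the bias
    out_h1 = sigmoid(w1*i1 + w2*i2 + b1)   # net_h1 = 0.3775 -> 0.593269992
    out_h2 = sigmoid(w3*i1 + w4*i2 + b1)   # 0.596884378

    # Output layer: the hidden activations are its inputs
    out_o1 = sigmoid(w5*out_h1 + w6*out_h2 + b2)   # net_o1 = 1.105905967 -> 0.75136507
    out_o2 = sigmoid(w7*out_h1 + w8*out_h2 + b2)   # 0.772928465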
Example: The Error
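The source scores the network with the squared error summed over both outputs, E_total = Σ ½(target − output)². Continuing the sketch above:

    # Squared error per output neuron, then the total
    E_o1 = 0.5 * (t1 - out_o1)**2   # 0.274811083
    E_o2 = 0.5 * (t2 - out_o2)**2   # 0.023560026
    E_total = E_o1 + E_o2           # 0.298371109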
Example: The Backward Pass
Consider w5.
We want to know how much a change in w5 affects the total error!
By applying the chain rule we know that:
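The chain-rule equation on this slide is lost in extraction; from the source it reads:

    ∂E_total/∂w5 = ∂E_total/∂out_o1 × ∂out_o1/∂net_o1 × ∂net_o1/∂w5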
Example: The Backward Pass
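First, how much does the total error change with respect to out_o1? The slide's equation is lost; from the source (only o1's term of E_total depends on out_o1):

    ∂E_total/∂out_o1 = −(target_o1 − out_o1) = −(0.01 − 0.75136507) = 0.74136507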
How much does the output of o1 change with respect to its total net input?
(remember – the derivative of the sigmoid function!)
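For the sigmoid σ(x) = 1/(1 + e^−x), the derivative is σ(x)(1 − σ(x)); the slide's numbers, reconstructed from the source:

    ∂out_o1/∂net_o1 = out_o1 (1 − out_o1) = 0.75136507 × (1 − 0.75136507) = 0.186815602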
Example: The Backward Pass
Finally, how much does the total net input of o1 change with respect to w5?
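Since net_o1 = w5·out_h1 + w6·out_h2 + b2, this partial is just the incoming activation (value from the source):

    ∂net_o1/∂w5 = out_h1 = 0.593269992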
Combining the three components, we get:
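Reconstructed from the source, since the slide's equation is lost:

    ∂E_total/∂w5 = 0.74136507 × 0.186815602 × 0.593269992 = 0.082167041

To reduce the error, w5 moves against the gradient, scaled by the learning rate η = 0.5:

    w5+ = w5 − η × ∂E_total/∂w5 = 0.4 − 0.5 × 0.082167041 = 0.35891648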
Example: The Backward Pass
Note the sign of the derivative with respect to w5 (it is positive), so the gradient-descent update decreases w5.
Example: The Backward Pass
We can repeat this process to update w6, w7, & w8.
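A sketch of those updates in code, continuing the variables above. The quantity d_o = ∂E_total/∂net_o (the "delta") is shared by all weights feeding the same output neuron; commented values are from the source:

    # Delta per output neuron: dE/dnet = -(target - out) * out * (1 - out)
    d_o1 = -(t1 - out_o1) * out_o1 * (1 - out_o1)   # 0.138498562
    d_o2 = -(t2 - out_o2) * out_o2 * (1 - out_o2)   # -0.038098236

    # Gradient of a hidden->output weight = delta * incoming activation
    w5_new = w5 - lr * d_o1 * out_h1   # 0.35891648
    w6_new = w6 - lr * d_o1 * out_h2   # 0.408666186
    w7_new = w7 - lr * d_o2 * out_h1   # 0.511301270
    w8_new = w8 - lr * d_o2 * out_h2   # 0.561370121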
Hidden Layer - updating w1, w2, w3, & w4
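The hidden-layer equations are lost in extraction. From the source, the same chain rule applies, except that out_h1 affects the error through both o1 and o2, so its error term is a sum:

    ∂E_total/∂w1 = ∂E_total/∂out_h1 × ∂out_h1/∂net_h1 × ∂net_h1/∂w1
    ∂E_total/∂out_h1 = ∂E_o1/∂out_h1 + ∂E_o2/∂out_h1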
Following the same process, we get:
Check this out!
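Values reconstructed from the source, reusing the output deltas from before:

    ∂E_o1/∂out_h1 = d_o1 × w5 = 0.138498562 × 0.40 = 0.055399425
    ∂E_o2/∂out_h1 = d_o2 × w7 = −0.038098236 × 0.50 = −0.019049119
    ∂E_total/∂out_h1 = 0.055399425 − 0.019049119 = 0.036350306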
Hidden Layer - updating w1, w2, w3, & w4
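A sketch of the hidden-layer update in code, continuing the variables above (commented values from the source):

    # Backpropagated error at h1 times the local sigmoid derivative
    d_h1 = (d_o1*w5 + d_o2*w7) * out_h1 * (1 - out_h1)   # 0.036350306 * 0.241300709

    # Gradient of an input->hidden weight = hidden delta * input
    w1_new = w1 - lr * d_h1 * i1   # 0.15 - 0.5*0.000438568 = 0.149780716
    w2_new = w2 - lr * d_h1 * i2   # 0.199561432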
Similarly,
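Computing d_h2 the same way (o1's and o2's contributions flow through w6 and w8), the source arrives at:

    w3+ = 0.24975114
    w4+ = 0.29950229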
Final Points
o We’ve updated all of our weights!
o When we fed forward the 0.05 and 0.1 inputs originally,
the error on the network was 0.298371109.
o After this first round of backpropagation, the total error is
now down to 0.291027924.
o After repeating this process 10,000 times, for example, the
error plummets to 0.0000351085.
o At this point, when we feed forward 0.05 and 0.1, the two
output neurons generate 0.015912196 (vs 0.01 target) and
0.984065734 (vs 0.99 target).
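As a closing sketch, the whole procedure in one self-contained loop. It assumes the setup above, holds the biases fixed as the source does, and should land near the quoted numbers:

    import math

    def sigmoid(x):
        return 1.0 / (1.0 + math.exp(-x))

    i1, i2, t1, t2, lr = 0.05, 0.10, 0.01, 0.99, 0.5
    w1, w2, w3, w4 = 0.15, 0.20, 0.25, 0.30
    w5, w6, w7, w8 = 0.40, 0.45, 0.50, 0.55
    b1, b2 = 0.35, 0.60

    for _ in range(10000):
        # Forward pass
        out_h1 = sigmoid(w1*i1 + w2*i2 + b1)
        out_h2 = sigmoid(w3*i1 + w4*i2 + b1)
        out_o1 = sigmoid(w5*out_h1 + w6*out_h2 + b2)
        out_o2 = sigmoid(w7*out_h1 + w8*out_h2 + b2)
        # Backward pass: deltas at the outputs, then at the hidden units
        d_o1 = -(t1 - out_o1) * out_o1 * (1 - out_o1)
        d_o2 = -(t2 - out_o2) * out_o2 * (1 - out_o2)
        d_h1 = (d_o1*w5 + d_o2*w7) * out_h1 * (1 - out_h1)
        d_h2 = (d_o1*w6 + d_o2*w8) * out_h2 * (1 - out_h2)
        # Gradient-descent updates (all gradients use the pre-update weights)
        w5, w6 = w5 - lr*d_o1*out_h1, w6 - lr*d_o1*out_h2
        w7, w8 = w7 - lr*d_o2*out_h1, w8 - lr*d_o2*out_h2
        w1, w2 = w1 - lr*d_h1*i1, w2 - lr*d_h1*i2
        w3, w4 = w3 - lr*d_h2*i1, w4 - lr*d_h2*i2

    # After training: outputs ~0.0159 and ~0.9841, total error ~0.0000351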