BACKPROPAGATION ALGORITHM
Backpropagation is an algorithm used to calculate the gradient of the loss function with
respect to the weights of a neural network. It allows the network to learn from errors by
updating weights in the direction that reduces the loss.
• It involves adjusting the weights of the network to reduce the error rate in predictions.
• The adjustment is based on the error calculated in the previous epoch (iteration).
• The goal is to minimize the loss function by updating weights using gradients.
• Backpropagation computes the gradient of the loss function with respect to each individual
weight by the chain rule (a small sketch follows this list).
• It efficiently computes the gradients one layer at a time, unlike a naive direct computation.
• It computes the gradient, but it does not define how the gradient is used. It generalizes the
computation in the delta rule.
• Proper tuning of weights leads to:
1. Lower error rate
2. Better generalization
3. More reliable model performance
• It uses the chain rule of calculus to propagate the error backward through the
network layers.
• It is a standard and widely-used method in training feedforward neural networks.
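As a concrete illustration of the chain-rule computation for a single weight, the following Python sketch (not part of the original notes; the input, weight, bias, and target values are assumed purely for demonstration) computes the gradient of a squared-error loss for one sigmoid neuron and checks it against a finite-difference estimate.

import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Assumed toy values (purely illustrative)
x, w, b, target = 0.8, 0.3, 0.1, 0.5

# Forward pass: net input -> activation -> squared-error loss
net = w * x + b
out = sigmoid(net)
loss = 0.5 * (out - target) ** 2

# Backward pass via the chain rule:
# dLoss/dw = dLoss/dout * dout/dnet * dnet/dw
dloss_dout = out - target
dout_dnet = out * (1.0 - out)     # derivative of the sigmoid
dnet_dw = x
grad_w = dloss_dout * dout_dnet * dnet_dw

# Finite-difference check of the analytic gradient
eps = 1e-6
loss_plus = 0.5 * (sigmoid((w + eps) * x + b) - target) ** 2
numeric = (loss_plus - loss) / eps

print(grad_w, numeric)   # the two values should agree closely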
Why Do We Need Backpropagation?
The process involves two main phases:
• Forward pass (to compute output)
• Backward pass (to update weights based on errors)
• Calculate the error – how far the model's output is from the actual (target) output.
• Check for minimum error – check whether the error has been minimized.
• Update the parameters – if the error is still large, update the parameters (weights
and biases), then check the error again. Repeat the process until the error becomes
minimal (a minimal loop sketch follows this list).
• Model is ready to make a prediction – once the error is minimal, you can feed
inputs to the model and it will produce the output.
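A minimal Python sketch of this loop, assuming a toy one-parameter linear model, a learning rate of 0.05, and an error threshold of 1e-6 (all illustrative choices, not values from these notes):

import numpy as np

# Assumed toy data: target relation y = 2*x
x = np.array([1.0, 2.0, 3.0])
y = np.array([2.0, 4.0, 6.0])

w = 0.0                 # arbitrarily initialised parameter
learning_rate = 0.05
threshold = 1e-6        # "minimum error" stopping condition

for epoch in range(10000):
    prediction = w * x                        # forward pass
    error = np.mean((prediction - y) ** 2)    # calculate the error (MSE)
    if error < threshold:                     # check whether the error is minimal
        break
    grad = np.mean(2 * (prediction - y) * x)  # gradient of the MSE w.r.t. w
    w -= learning_rate * grad                 # update the parameter

print(w, error)   # w approaches 2.0; the model is ready to predict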
Types of Backpropagation Networks
1. Static backpropagation:
A backpropagation network that produces a mapping from a static input to a static
output. It is useful for solving static classification problems such as optical character
recognition.
2. Recurrent backpropagation:
In recurrent backpropagation, activations are fed forward until a fixed (stable) value is
reached. After that, the error is computed and propagated backward.
Weight Adjustment in Backpropagation
• The backpropagation algorithm looks for the minimum value of the error function in weight
space using a technique called the delta rule or gradient descent. The weights that
minimize the error function are then considered to be a solution to the learning problem.
• We first initialized 'W' with some random value and propagated forward.
• Then we noticed that there was some error. To reduce that error, we propagated backward
and increased the value of 'W'. After that, we noticed that the error had increased, so we
learned that we could not increase the value of 'W'.
• So we propagated backward again and decreased the value of 'W'. Now we noticed that
the error had reduced. We keep updating the weight in that direction until the error
becomes minimal (a small sketch of this update follows).
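The sign of the gradient encodes exactly this trial-and-error behaviour: if increasing 'W' increases the error, the gradient is positive and gradient descent decreases 'W', and vice versa. A minimal sketch, assuming an illustrative one-dimensional error function with its minimum at W = 1.5:

# Assumed 1-D error function with a single minimum at w = 1.5
def error(w):
    return (w - 1.5) ** 2

w = 0.0                 # random starting value of 'W'
learning_rate = 0.1

for step in range(50):
    grad = 2 * (w - 1.5)        # slope of the error curve at the current 'W'
    w -= learning_rate * grad   # move opposite to the slope: increase 'W' if the
                                # error falls to the right, decrease it otherwise

print(w, error(w))   # 'W' settles near 1.5, where the error is minimal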
Backpropagation Algorithm:
  initialize network weights (often small random values)
  do
      for each training example named ex
          prediction = neural-net-output(network, ex)    // forward pass
          actual = target-output(ex)
          compute error (prediction - actual) at the output units
          compute Δwₕ for all weights from hidden layer to output layer    // backward pass
          compute Δwᵢ for all weights from input layer to hidden layer     // backward pass continued
          update network weights    // input layer not modified by error estimate
  until all examples classified correctly or another stopping criterion satisfied
  return the network
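A runnable NumPy translation of this pseudocode is sketched below for a network with one hidden layer of sigmoid units; the XOR-style dataset, layer sizes, learning rate, and stopping threshold are assumptions made only for this example.

import numpy as np

rng = np.random.default_rng(0)
sigmoid = lambda z: 1.0 / (1.0 + np.exp(-z))

# Assumed toy dataset (XOR) and layer sizes; these are illustrative choices
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
T = np.array([[0], [1], [1], [0]], dtype=float)

W_ih = rng.normal(scale=0.5, size=(2, 4))   # input  -> hidden weights
b_h  = np.zeros(4)
W_ho = rng.normal(scale=0.5, size=(4, 1))   # hidden -> output weights
b_o  = np.zeros(1)
lr = 0.5

for epoch in range(20000):
    # forward pass
    hidden = sigmoid(X @ W_ih + b_h)
    prediction = sigmoid(hidden @ W_ho + b_o)

    # error at the output units
    error = prediction - T
    if np.mean(error ** 2) < 1e-3:            # stopping criterion
        break

    # backward pass: deltas for output and hidden layers via the chain rule
    delta_out = error * prediction * (1 - prediction)
    delta_hidden = (delta_out @ W_ho.T) * hidden * (1 - hidden)

    # Δw for hidden->output and input->hidden weights, then update
    W_ho -= lr * hidden.T @ delta_out
    b_o  -= lr * delta_out.sum(axis=0)
    W_ih -= lr * X.T @ delta_hidden
    b_h  -= lr * delta_hidden.sum(axis=0)

print(np.round(prediction, 2))   # approaches [[0], [1], [1], [0]]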
Backpropagation Algorithm Steps
1. Initialize Weights
• Randomly initialize all network weights to small values
• Include weights for:
1. Input layer → Hidden layer
2. Hidden layer → Output layer
2. Repeat Until Stopping Condition is Met:
(e.g., the error is below a threshold OR the maximum number of epochs is completed)
For Each Training Example (ex):
Step 2.1: Forward Pass
Compute the output from the network using the current weights:
prediction = neural-net-output(network, ex)
error = prediction - actual (target output)
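A minimal sketch of Step 2.1 for one training example; the current weights, the inputs, and the target value below are assumed for illustration only:

import numpy as np

sigmoid = lambda z: 1.0 / (1.0 + np.exp(-z))

# Assumed current weights and one training example (illustrative values)
W_ih = np.array([[0.15, 0.25], [0.20, 0.30]])   # input -> hidden
W_ho = np.array([[0.40], [0.45]])               # hidden -> output
x = np.array([0.05, 0.10])                      # inputs of example ex
actual = 0.5                                    # target output of ex

hidden = sigmoid(x @ W_ih)           # hidden-layer activations
prediction = sigmoid(hidden @ W_ho)  # neural-net-output(network, ex)
error = prediction - actual          # error at the output unit

print(prediction, error)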
Advantages of Backpropagation
• Backpropagation is fast, simple and easy to program
• It has no parameters to tune apart from the number of inputs
• It is a flexible method as it does not require prior knowledge about the network
• It is a standard method that generally works well
• It does not need any special mention of the features of the function to be learned
Demerits
• Learning phase requires intensive calculations
• Selection of number of Hidden layer neurons is an issue
• Selection of number of Hidden layers is also an issue
• Network gets trapped in Local Minima
• Temporal Instability
• Network Paralysis
Step 1: Forward Propagation
Step 2: Error Calculation (MSE)
Step 3: Backward Propagation – Update w5
Weight Update Rule: w_new = w_old - η * (∂E/∂w), where η is the learning rate.
Step 4: New Output After Weight Update
The computed output 0.7801 is significantly higher than the target 0.5, showing that the
neuron is overactivating and its weights must be adjusted to minimize the prediction error.
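For completeness, the w5 update can be sketched numerically with the chain rule shown earlier, assuming a squared error E = ½(target − out)² and a sigmoid output unit; the hidden activation h1, the current value of w5, and the learning rate are hypothetical placeholders, since they are not given above:

out, target = 0.7801, 0.5      # values stated above
h1 = 0.6                       # hypothetical hidden activation feeding w5
w5 = 0.4                       # hypothetical current value of w5
lr = 0.5                       # hypothetical learning rate

# Chain rule: dE/dw5 = (out - target) * out * (1 - out) * h1
grad_w5 = (out - target) * out * (1 - out) * h1

# Weight update rule: w5_new = w5 - lr * dE/dw5
w5_new = w5 - lr * grad_w5
print(grad_w5, w5_new)   # the positive gradient pushes w5 down, moving the output toward 0.5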