Artificial Neural Networks -
Back-Propagation Algorithms
Prof. V. Kamakshi Prasad
Professor, Dept. of CSE,
JNTUH College of Engineering Hyderabad
Human Biological Neural Network
• A neuron is a tiny processor.
• According to one estimate, there are about 10^11 (100,000,000,000) neurons in the human nervous system, with on the order of 10^15 (synaptic) connections between these neurons.
Neuron Modelling: McCulloch-Pitts Model
(First neuron model)
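As a minimal sketch (an illustration, not taken from the slides), a McCulloch-Pitts neuron computes a weighted sum of its binary inputs and fires only when the sum reaches a threshold:

```python
def mcculloch_pitts(inputs, weights, threshold):
    """McCulloch-Pitts neuron: hard-threshold, all-or-none output."""
    activation = sum(w * x for w, x in zip(weights, inputs))
    return 1 if activation >= threshold else 0

# With fixed weights the MP neuron realises simple logic gates,
# e.g. AND with unit weights and threshold 2:
print(mcculloch_pitts([1, 1], [1, 1], 2))  # -> 1
print(mcculloch_pitts([1, 0], [1, 1], 2))  # -> 0
```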
Types of output functions (used at neurons)
• Non-linear non-differentiable functions
• Non-linear differentiable functions
• Linear (differentiable) functions
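One common example of each class (the slides do not name specific functions, so these particular choices are assumptions):

```python
import numpy as np

def step(x):
    """Non-linear and non-differentiable (at 0): a hard threshold."""
    return np.where(x >= 0, 1.0, 0.0)

def sigmoid(x):
    """Non-linear and differentiable everywhere."""
    return 1.0 / (1.0 + np.exp(-x))

def linear(x):
    """Linear, trivially differentiable; used e.g. at input neurons."""
    return x
```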
A more sophisticated version of the neuron model
Sensory neurons (input neurons)
• These take input from the external world
• Sensory neurons form the first layer of a feed-forward neural network
• These neurons are designed with linear output functions
• Fan-out behaviour: each input value is distributed to many neurons in the next layer
Supervised learning
• Definition
• How is it different from unsupervised learning?
• How do we determine whether an NN algorithm is supervised or unsupervised?
• The learning law
• The learning law is the heart of an NN algorithm
• Target / desired value
• Learning is an iterative process, repeated until convergence (the target) is achieved; a sketch follows below
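A minimal sketch of that iterative supervised loop. The delta (LMS) learning law for a single linear neuron is used here as one common choice, an assumption for illustration rather than the specific law on these slides:

```python
import numpy as np

def supervised_train(X, targets, eta=0.1, epochs=100):
    """Apply a learning law repeatedly until outputs approach targets.
    Learning law here: delta (LMS) rule for one linear neuron."""
    w = np.zeros(X.shape[1])
    for _ in range(epochs):
        for x, t in zip(X, targets):
            y = np.dot(w, x)           # actual output
            w += eta * (t - y) * x     # nudge weights toward the target
    return w

# Learns to approximate t = 2*x1 + 1*x2 from three labelled examples
X = np.array([[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])
print(supervised_train(X, np.array([2.0, 1.0, 3.0])))  # close to [2, 1]
```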
Artificial Neural Networks – Learning laws
Layered Architectures in ANN
Linearly Separable vs. Non-linearly Separable
Why is a layered architecture assumed?
• Which architecture is more powerful?
• Layered
• Mesh
• Does the human brain follow a layered architecture?
Perceptron Algorithm
• Conceptually simple
• A two-step algorithm!
• Every step in training as well as in testing is tractable
• Supported by a convergence theorem
“If the classes are linearly separable, convergence is guaranteed in a
finite number of iterations”
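A minimal sketch of the perceptron learning rule; the {-1, +1} target coding and the AND example are assumptions for illustration:

```python
import numpy as np

def train_perceptron(X, y, eta=1.0, max_epochs=100):
    """Perceptron learning; targets y must be -1 or +1."""
    w, b = np.zeros(X.shape[1]), 0.0
    for _ in range(max_epochs):
        errors = 0
        for xi, ti in zip(X, y):
            if ti * (np.dot(w, xi) + b) <= 0:  # misclassified point
                w += eta * ti * xi             # move hyperplane toward it
                b += eta * ti
                errors += 1
        if errors == 0:  # linearly separable => reached in finite epochs
            break
    return w, b

# AND is linearly separable, so convergence is guaranteed
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]])
w, b = train_perceptron(X, np.array([-1, -1, -1, 1]))
print(np.sign(X @ w + b))  # -> [-1. -1. -1.  1.]
```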
What is the advantage of a convergence proof in learning algorithms?
• It makes the algorithm deterministic: training is guaranteed to terminate
• What are the advantages of deterministic algorithms?
• A guaranteed, 100% accurate result on the training data (when the classes are linearly separable)
Perceptron Algorithm
Linearly Separable vs. Non-linearly Separable
Perceptron Algorithm
Back-Propagation Algorithm – NN Architecture
Weight Updating in Back-Propagation
Weight updating follows the gradient-descent approach; the standard update rule is sketched below.
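The standard gradient-descent update moves each weight against the gradient of the error $E$, scaled by a learning rate $\eta$:

$$\Delta w_{ij} = -\eta \, \frac{\partial E}{\partial w_{ij}}$$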
Back-Propagation Algorithm – Derivation
Back-Propagation Algorithm – Derivation (Contd.)
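The derivation on these slides is not reproduced here. A standard sketch, assuming squared error $E = \frac{1}{2}\sum_k (t_k - o_k)^2$ and a differentiable activation $f$: local error gradients ($\delta$ terms) are computed at the output layer and propagated backwards,

$$\delta_k = (t_k - o_k)\, f'(\mathrm{net}_k) \qquad \text{(output unit } k\text{)}$$

$$\delta_j = f'(\mathrm{net}_j) \sum_k \delta_k\, w_{kj} \qquad \text{(hidden unit } j\text{)}$$

and every weight is then updated by gradient descent:

$$\Delta w_{ji} = \eta\, \delta_j\, x_i$$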
Back-Propagation Algorithm – relevant
concepts
• Is there a convergence proof?
• No
• But error convergence is guaranteed: the training error keeps decreasing, although possibly only to a local minimum
Error Convergence – BP limitation
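As an assumed toy illustration of this limitation (not taken from the slides), gradient descent on a function with two minima settles in a different one depending on where it starts:

```python
def descend(x, eta=0.01, steps=2000):
    """Gradient descent on f(x) = x**4 - 3*x**2 + x, which has two minima."""
    for _ in range(steps):
        x -= eta * (4 * x**3 - 6 * x + 1)  # f'(x)
    return x

print(descend(+2.0))  # stuck near the poorer local minimum, x ~ +1.14
print(descend(-2.0))  # reaches the global minimum, x ~ -1.30
```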
Comparison of Pattern Classification & Pattern Mapping Tasks
Pattern Classification Task – Example
Comparison of Pattern Classification & Pattern Mapping Tasks (Contd.)
Pattern mapping – An illustration
Forecast / Prediction Problem using the Back-Propagation Algorithm
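A minimal sketch of the usual forecasting setup, assuming a univariate series where the previous p values form the network input and the next value is the target:

```python
import numpy as np

def make_windows(series, p):
    """Slide a window over the series: p past values -> next value."""
    X = np.array([series[i:i + p] for i in range(len(series) - p)])
    y = np.array(series[p:])
    return X, y

# Each (input, target) pair can then train an FFNN with back-propagation
X, y = make_windows([1, 2, 3, 4, 5, 6], p=3)
print(X)  # [[1 2 3] [2 3 4] [3 4 5]]
print(y)  # [4 5 6]
```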
2-Dimensional Input layer of FFNN
Back-Propagation Algorithm - Summary
• Meant for training multi-layer feed-forward neural networks (a minimal worked example follows below)
• Wide range of applications
• Advantage:
• Error convergence during training
• Drawback:
• A non-deterministic solution (training may settle in a local minimum)!
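A minimal end-to-end sketch of back-propagation training on XOR, a non-linearly separable task that needs a hidden layer. All hyperparameters (4 sigmoid hidden units, learning rate 0.5, squared error) are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(0)
sigmoid = lambda x: 1.0 / (1.0 + np.exp(-x))

# XOR inputs and targets
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
T = np.array([[0], [1], [1], [0]], dtype=float)

W1 = rng.normal(0, 1, (2, 4)); b1 = np.zeros(4)   # hidden layer
W2 = rng.normal(0, 1, (4, 1)); b2 = np.zeros(1)   # output layer
eta = 0.5

for _ in range(10000):
    # forward pass
    H = sigmoid(X @ W1 + b1)
    O = sigmoid(H @ W2 + b2)
    # backward pass: deltas for squared error with sigmoid units
    dO = (O - T) * O * (1 - O)
    dH = (dO @ W2.T) * H * (1 - H)
    # gradient-descent weight updates
    W2 -= eta * H.T @ dO;  b2 -= eta * dO.sum(axis=0)
    W1 -= eta * X.T @ dH;  b1 -= eta * dH.sum(axis=0)

print(sigmoid(sigmoid(X @ W1 + b1) @ W2 + b2).round(2))  # near [[0],[1],[1],[0]]
```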
References
• B. Yegnanarayana, Artificial Neural Networks, PHI (2005)
• https://images.app.goo.gl/BoXVCAqG97YXX
• jtsulliv.github.io
• https://images.app.goo.gl/CGdZGgY2H52bzwoj6
• https://towardsdatascience.com/svm-feature-selection-and-kernels-840781cc1a6c
Any Questions, Please?
Thank You