
Using Neural Networks for Pattern Classification Problems

Converting an Image

• Camera captures an image
• Image needs to be converted to a form that can be processed by the Neural Network
Converting an Image

• Image consists of pixels
• Values can be assigned to the color of each pixel
• A vector can represent the pixel values in an image
Converting an Image

• If we let +1 represent black and 0 represent white:
• p = [0 1 0 1 0 1 0 1 0 0 0 1 0 0 0 1 0 …] (a sketch of this conversion follows)
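As a concrete illustration (a MATLAB sketch, not from the slides, assuming a 5 x 5 image), the pixel matrix can be stacked into a vector. The first three rows below match the partial vector above; the last two rows are made up:

img = [0 1 0 1 0;            % 0 = white, 1 = black
       1 0 1 0 0;
       0 1 0 0 0;
       1 0 0 0 1;            % these last two rows are made up
       0 1 0 1 0];
p = reshape(img', [], 1);    % stack the pixels row by row into a column vector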

Neural Network Pattern Classification Problem

Tank image = [0 1 0 0 1 1 0 …]
House image = [1 1 0 0 0 1 0 …]

[Figure: an image vector is fed into the Neural Network, which must decide: tank or house?]

Types of Neural Networks

• Perceptron
• Hebbian
• Adaline
• Multilayer with Backpropagation
• Radial Basis Function Network

2-Input Single Neuron Perceptron: Architecture

A single-neuron perceptron: inputs p1 and p2 enter with weights w1,1 and w1,2, a bias b is added, and the net input n passes through a symmetrical hard limiter to give the output a.

Output:
a = hardlims(Wp + b) = hardlims([w1,1 w1,2][p1; p2] + b) = hardlims(w1,1 p1 + w1,2 p2 + b)
  = -1, if w1,1 p1 + w1,2 p2 + b < 0
  = +1, if w1,1 p1 + w1,2 p2 + b >= 0

2-Input Single Neuron Perceptron: Example

a = hardlims(w1,1 p1 + w1,2 p2 + b)
  = -1, if w1,1 p1 + w1,2 p2 + b < 0
  = +1, if w1,1 p1 + w1,2 p2 + b >= 0

Example: w1,1 = -1, w1,2 = 1, b = -1

a = -1, if -p1 + p2 - 1 < 0, i.e., -p1 + p2 < 1
a = +1, if -p1 + p2 - 1 >= 0, i.e., -p1 + p2 >= 1

This separates the inputs p = [p1, p2]T into two categories separated by the boundary -p1 + p2 = 1 (a numerical check follows).
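The example can be checked with a short MATLAB sketch (not part of the original slides; hardlims is written out inline so no toolbox is needed), using the two test points that appear on the next slide:

W = [-1 1];  b = -1;             % weights and bias from the example
P = [-2  2;                      % test points (-2, 1) and (2, -1) as columns
      1 -1];
n = W*P + b;                     % net inputs: [2 -4]
a = (n >= 0) - (n < 0)           % hardlims inline: a = [1 -1]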

2-Input Single Neuron Perceptron: Decision Boundary

a = -1, if -p1 + p2 - 1 < 0, i.e., -p1 + p2 < 1
a = +1, if -p1 + p2 - 1 >= 0, i.e., -p1 + p2 >= 1

[Figure: the decision boundary -p1 + p2 = 1 in the (p1, p2) plane. Inputs in the region above the boundary, such as (-2, 1), have an output of +1; inputs in the region below it, such as (2, -1), have an output of -1.]

2-Input Single Neuron Perceptron: Weight Vector

W = [-1, 1]

[Figure: the weight vector W = [-1, 1] drawn on the (p1, p2) plane together with the decision boundary -p1 + p2 = 1.]

• The weight vector, W, is orthogonal to the decision boundary

2-Input Single Neuron Perceptron: Weight Vector

[Figure: W drawn from the boundary -p1 + p2 = 1 into the region containing (-2, 1); the point (2, -1) lies on the opposite side.]

• W points towards the class with an output of +1

Simple Perceptron Design

• The design of a simple perceptron is based upon:
– A single neuron divides inputs into two classifications or categories
– The weight vector, W, is orthogonal to the decision boundary
– The weight vector, W, points towards the classification corresponding to the “1” output

Orthogonal Vectors

• For any hyperplane of the form:
a1p1 + a2p2 + a3p3 + . . . + anpn = b
the vector c*[a1, a2, …, an] is orthogonal to the hyperplane (where c is a constant).

• Example: -p1 + p2 = (-1)*p1 + (1)*p2 = 1, so W = [-1, 1] (a numerical check follows)
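A quick numerical check of this fact (a sketch, not in the original slides): any vector lying along the boundary -p1 + p2 = 1 has zero dot product with W.

W = [-1 1];
q1 = [0; 1];  q2 = [1; 2];   % two points satisfying -p1 + p2 = 1
d = q2 - q1;                 % direction vector along the boundary
W*d                          % = 0, so W is orthogonal to the boundary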

AND Gate: Description

• A perceptron can be used to implement most logic functions

• Example: Logical AND truth table:

Inputs    Output
0  0        0
0  1        0
1  0        0
1  1        1

AND Gate: Architecture

Input/target pairs:
{p1 = [0; 0], t1 = 0}   {p2 = [0; 1], t2 = 0}
{p3 = [1; 0], t3 = 0}   {p4 = [1; 1], t4 = 1}

[Figure: a two-input single-neuron perceptron with weights w1,1 and w1,2 and bias b. “hardlim” is used here to provide outputs of 0 and 1.]

AND Gate: Graphical Description

• Graphically:

[Figure: the four inputs plotted in the (p1, p2) plane. (0,0), (0,1), and (1,0) have a zero output; (1,1) has a one output.]

• Where do we place the decision boundary?

AND Gate: Decision Boundary

• There are an infinite number of solutions

[Figure: one possible decision boundary, a line separating (1,1) from the three zero-output points.]

• What is the corresponding value of W?

AND Gate: Weight Vector

• W must be orthogonal to the decision boundary
• W must point towards the class with an output of 1

[Figure: the decision boundary crossing the axes at (1.5, 0) and (0, 1.5), with W drawn perpendicular to it, pointing towards (1,1). One possible value is W = [2 2].]

• Output: a = hardlim([2 2][p1; p2] + b) = hardlim(2 p1 + 2 p2 + b)

AND Gate: Bias

• Decision boundary: 2 p1 + 2 p2 + b = 0

[Figure: the boundary passes through (1.5, 0) and (0, 1.5).]

• At (1.5, 0): 2(1.5) + 2(0) + b = 0, so b = -3

AND Gate: Final Design

• Final design:
a = hardlim([2 2][p1; p2] - 3)

• Test:
a = hardlim([2 2][0; 0] - 3) = hardlim(-3) = 0
a = hardlim([2 2][0; 1] - 3) = hardlim(-1) = 0
a = hardlim([2 2][1; 0] - 3) = hardlim(-1) = 0
a = hardlim([2 2][1; 1] - 3) = hardlim(1) = 1
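The test can also be run in MATLAB in a few lines (a sketch, not from the slides; hardlim is written inline):

W = [2 2];  b = -3;            % final AND-gate perceptron
P = [0 0 1 1;                  % all four input patterns as columns
     0 1 0 1];
a = double(W*P + b >= 0)       % hardlim inline: a = [0 0 0 1]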

Perceptron Learning Rule

• Most real problems involve input vectors, p, that have length greater than three
• Images are described by vectors with 1000s of elements
• The graphical approach is not feasible in dimensions higher than three
• An iterative approach known as the Perceptron Learning Rule is used instead

Character Recognition Problem

• Given: A network has two possible inputs, “x” and “o”. These two characters are described by the 25-pixel (5 x 5) patterns shown below.

[Figure: the 5 x 5 pixel patterns for “x” and “o”.]

• Problem: Design a neural network using the perceptron learning rule to correctly identify these input characters.

Character Recognition Problem: Input Description

• The inputs must be described as column vectors
• Pixel representation: 0 = white, 1 = black

The “x” is represented as: [1 0 0 0 1 0 1 0 1 0 0 0 1 0 0 0 1 0 1 0 1 0 0 0 1]T
The “o” is represented as: [0 1 1 1 0 1 0 0 0 1 1 0 0 0 1 1 0 0 0 1 0 1 1 1 0]T

Character Recognition Problem: Output Description

• The output will indicate whether an “x” or an “o” was received
• Let: 0 = “o” received, 1 = “x” received (a hard limiter will be used)
• The inputs are divided into two classes, requiring a single neuron
• Training set:
p1 = [1 0 0 0 1 0 1 0 1 0 0 0 1 0 0 0 1 0 1 0 1 0 0 0 1]T, t1 = 1
p2 = [0 1 1 1 0 1 0 0 0 1 1 0 0 0 1 1 0 0 0 1 0 1 1 1 0]T, t2 = 0

Character Recognition Problem: Network Architecture

• The input, p, has 25 components p1 … p25, with weights w1,1 … w1,25 and bias b
• a = hardlim(Wp + b)
• “hardlim” is used to provide an output of “0” or “1”

Perceptron Learning Rule: Summary

• Step 1: Initialize W and b (if nonzero) to small random numbers.
• Step 2: Apply the first input vector to the network and find the output, a.
• Step 3: Update W and b based on:
Wnew = Wold + (t - a)pT
bnew = bold + (t - a)
• Repeat steps 2 and 3 for all input vectors repeatedly until the targets are achieved for all inputs (see the sketch after this list)
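A compact MATLAB sketch of this procedure (an illustration of the rule above, not code from the slides; hardlim is written inline, and p1 and p2 are assumed to be the training vectors with targets 1 and 0):

% Perceptron learning rule: loop until an error-free pass over all inputs.
P = [p1 p2];  T = [1 0];             % inputs as columns, targets as a row
W = zeros(1, size(P, 1));  b = 0;    % Step 1: zero initial conditions
done = false;
while ~done
    done = true;
    for k = 1:size(P, 2)
        a = double(W*P(:, k) + b >= 0);   % Step 2: hardlim output
        e = T(k) - a;                     % error t - a
        if e ~= 0
            W = W + e*P(:, k)';           % Step 3: Wnew = Wold + (t-a)p'
            b = b + e;                    %         bnew = bold + (t-a)
            done = false;                 % an error occurred this pass
        end
    end
end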

Character Recognition Problem: Perceptron Learning Rule

• Step 1: Initialize W and b (if nonzero) to small random numbers.
– Assume W = [0 0 . . . 0] (length 25) and b = 0
• Step 2: Apply the first input vector to the network
– p1 = [1 0 0 0 1 0 1 0 1 0 0 0 1 0 0 0 1 0 1 0 1 0 0 0 1]T, t1 = 1
– a = hardlim(W(0)p1 + b(0)) = hardlim(0) = 1
• Step 3: Update W and b based on:
Wnew = Wold + (t - a)p1T = Wold + (1 - 1)p1T = [0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0]
bnew = bold + (t - a) = bold + (1 - 1) = 0

Character Recognition Problem: Perceptron Learning Rule

• Step 2 (repeated): Apply the second input vector to the network
– p2 = [0 1 1 1 0 1 0 0 0 1 1 0 0 0 1 1 0 0 0 1 0 1 1 1 0]T, t2 = 0
– a = hardlim(W(1)p2 + b(1)) = hardlim(0) = 1
• Step 3 (repeated): Update W and b based on:
Wnew = Wold + (t - a)p2T = Wold + (0 - 1)p2T
     = [0 -1 -1 -1 0 -1 0 0 0 -1 -1 0 0 0 -1 -1 0 0 0 -1 0 -1 -1 -1 0]
bnew = bold + (t - a) = bold + (0 - 1) = -1

Character Recognition Problem: Perceptron Learning Rule

W                                                                   b    p    t   a   e
[0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0]                0    p1   1   1   0
[0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0]                0    p2   0   1  -1
[0 -1 -1 -1 0 -1 0 0 0 -1 -1 0 0 0 -1 -1 0 0 0 -1 0 -1 -1 -1 0]   -1    p1   1   0   1
[1 -1 -1 -1 1 -1 1 0 1 -1 -1 0 1 0 -1 -1 1 0 1 -1 1 -1 -1 -1 1]    0    p2   0   0   0
[1 -1 -1 -1 1 -1 1 0 1 -1 -1 0 1 0 -1 -1 1 0 1 -1 1 -1 -1 -1 1]    0    p1   1   1   0

Character Recognition Problem: Results

• After three epochs, W and b converge to:
– W = [1 -1 -1 -1 1 -1 1 0 1 -1 -1 0 1 0 -1 -1 1 0 1 -1 1 -1 -1 -1 1]
– b = 0
• This is one possible solution, based on the initial conditions selected. Other solutions are obtained when the initial values of W and b are changed.
• Check the solution: a = hardlim(W*p + b) for both inputs (sketch below)
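The check can be run directly in MATLAB (a sketch; p1 and p2 are assumed to be the training vectors defined earlier, and hardlim is written inline):

W = [1 -1 -1 -1 1 -1 1 0 1 -1 -1 0 1 0 -1 -1 1 0 1 -1 1 -1 -1 -1 1];
b = 0;
a_x = double(W*p1 + b >= 0)   % = 1: the "x" is classified correctly
a_o = double(W*p2 + b >= 0)   % = 0: the "o" is classified correctly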

Character Recognition Problem: Results

• How does this network perform in the presence of noise?

[Figure: the “x” and “o” patterns with three pixel errors in each.]

• For the “x” with noise:
a = hardlim{W*[1 0 0 0 1 0 1 0 0 0 1 0 1 0 1 0 1 0 1 0 1 0 1 0 1]T + 0} = 1
• For the “o” with noise:
a = hardlim{W*[0 1 1 1 0 1 0 0 0 1 1 1 0 0 0 1 0 1 0 1 0 1 1 1 0]T + 0} = 0
• The network recognizes both the noisy “x” and “o”.

Character Recognition Problem: Simulation

• Use MATLAB to perform the following simulation (a sketch follows this list):
– Apply noisy inputs to the network with pixel errors ranging from 1 to 25 per character and find the network output
– Each type of error (number of pixels) was repeated 1000 times for each character, with the incorrect pixels selected at random
– The network output was compared to the target in each case
– The number of detection errors was tabulated
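One way to code the simulation (a sketch under stated assumptions, not the original script: W, b, p1, p2 are taken from above, and MATLAB's two-argument randperm is used to pick the corrupted pixels):

trials = 1000;
errs = zeros(25, 2);                   % error counts for "x" and "o"
for npix = 1:25                        % number of pixel errors
    for trial = 1:trials
        for c = 1:2
            if c == 1, p = p1; t = 1; else, p = p2; t = 0; end
            idx = randperm(25, npix);  % choose npix distinct pixels
            p(idx) = 1 - p(idx);       % flip them
            a = double(W*p + b >= 0);  % network output
            errs(npix, c) = errs(npix, c) + (a ~= t);
        end
    end
end
errs / trials                          % estimated probability of error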

Character Recognition Problem: Performance Results

No. of Pixel    No. of Character Errors    Probability of Error
Errors             x       o                  x       o
1-9                0       0                  0       0
10                96       0                .10       0
11               399       0                .40       0
12               759      58                .76     .06
13               948     276                .95     .28
14              1000     616                  1     .62
15              1000     885                  1     .89
16-25           1000    1000                  1       1

[Figure: an “o” with 11 pixel errors.]

Perceptrons: Limitations

• Perceptrons only work for inputs that are linearly separable

[Figure: left, x’s and o’s that a straight line can separate (linearly separable); right, interleaved x’s and o’s that no straight line can separate (not linearly separable).]

Other Neural Networks

• How do the other types of neural networks differ from the perceptron?
– Topology
– Function
– Learning Rule

Perceptron Problem: Part 1

• Design a neural network that can identify a tank and a house.
– Find W and b by hand as illustrated with the x-o example.
– Use the Neural Network Toolbox to find W and b.

[Figure: Tank (t = 1) and House (t = 0) images.]

Perceptron Problem: Part 2

• Design a neural network that can find a tank among houses and trees.
– Repeat the previous problem, but now with a tree included.
– Both the house and the tree have targets of zero.

[Figure: Tank (t = 1), House (t = 0), and Tree (t = 0) images.]

Perceptron Problem: Part 3

• Design a neural network that can find a tank among houses, trees, and other items.
– Create other images on the 9 x 9 grid.
– Everything other than a tank will have a target of zero.
– How many items can you introduce before the perceptron learning rule no longer converges?

[Figure: Tank (t = 1), House (t = 0), Tree (t = 0), + ????]

MATLAB: Neural Network Toolbox

• >> nntool

MATLAB: Neural Network Toolbox

• Go to MATLAB Help and review the documentation on the Neural Network Toolbox
• Use the GUI interface (>> nntool) to reproduce the results you obtained for the perceptron (tank vs. house, tree, etc.)
• Data can be imported/exported from the workspace to the NN Tool (a command-line sketch follows)
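As an alternative to the GUI, the same perceptron can be built from the command line (a sketch; perceptron and train are standard toolbox functions, while p_tank and p_house are hypothetical names for the image vectors in the workspace):

net = perceptron;           % create an untrained single-neuron perceptron
P = [p_tank p_house];       % training inputs as columns (hypothetical names)
T = [1 0];                  % targets: tank = 1, house = 0
net = train(net, P, T);     % trains with the perceptron learning rule
a = net(P)                  % should return [1 0]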

