
PERCEPTRON LEARNING

Positive class = 1

Negative class = -1

Net input function: yin = w1.x1 + w2.x2 + … + wm.xm + wb.b   {b = bias input, b = 1}

Threshold function (activation function): y = f(yin) = { 1 if yin ≥ 0, -1 otherwise }

w(new) = w(old) + n (t - y) * x   {n = learning rate}



Wb(new) = Wb(old) + n (t-y)*b
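As a minimal sketch of the formulas above (not part of the original worked example; the function names are illustrative), the net input, activation, and update rule can be written in Python:

```python
# Minimal sketch of the perceptron formulas above; names are illustrative.

def net_input(x1, x2, w1, w2, wb, b=1):
    # yin = w1*x1 + w2*x2 + wb*b
    return w1 * x1 + w2 * x2 + wb * b

def activation(yin):
    # y = f(yin) = 1 if yin >= 0, -1 otherwise
    return 1 if yin >= 0 else -1

def update(w, x, t, y, n=0.15):
    # w(new) = w(old) + n*(t - y)*x; a no-op when t == y, since t - y = 0
    return w + n * (t - y) * x
```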

Training data:

x1    x2    t (class label)    b
-1    -1    -1                 1
-1     1    -1                 1
 1    -1    -1                 1
 1     1     1                 1

Learning rate and initial weights: n = 0.15, w1 = -0.7, w2 = 0.5, wb = 0.2

yin = w1.x1 + w2.x2 + wb.b
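Before stepping through the instances by hand, here is a short, self-contained Python sketch of one pass (epoch) over the data that reproduces the yin values and weight updates worked out below (a sketch under the assumptions above; variable names are illustrative):

```python
# One pass (epoch) over the training data with the initial values above.
data = [(-1, -1, -1), (-1, 1, -1), (1, -1, -1), (1, 1, 1)]  # (x1, x2, t)
n = 0.15                              # learning rate
w1, w2, wb, b = -0.7, 0.5, 0.2, 1     # initial weights and bias input

for x1, x2, t in data:
    yin = w1 * x1 + w2 * x2 + wb * b
    y = 1 if yin >= 0 else -1         # threshold function
    if y != t:                        # misclassified: apply the update rule
        w1 += n * (t - y) * x1
        w2 += n * (t - y) * x2
        wb += n * (t - y) * b
    print(f"yin={yin:+.1f}  y={y:+d}  t={t:+d}  ->  "
          f"w1={w1:+.1f}, w2={w2:+.1f}, wb={wb:+.1f}")
```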

The first instance (-1, -1) is presented to the perceptron:

yin = (-0.7)(-1)+(0.5)(-1)+0.2(1) = 0.4

y = f(yin) = 1 (predicted class label)

t = -1 (actual class label, i.e. the target)

Target t = -1 and predicted label y = 1 differ, so the weights are updated:

w1 (new) = w1(old) + n (t - y) * x1 = -0.7 + 0.15*(-1-1)*(-1) = -0.4

w2 (new) = w2(old) + n (t - y) * x2 =0.5+0.15*(-1-1)*(-1) = 0.8

Wb(new) = Wb(old) + n (t-y)*b = 0.2+0.15*(-1-1)*1 = -0.1

The second instance (-1, 1) is presented:

yin = (-0.4)(-1)+(0.8)(1)-0.1(1) = 1.1



y = f(yin) = 1 (predicted class label)

t = -1 (actual class label, i.e. the target)

Target t = -1 and predicted label y = 1 differ, so the weights are updated:

w1 (new) = w1(old) + n (t - y) * x1 =-0.4+ 0.15*(-1-1)*(-1) = -0.1

w2 (new) = w2(old) + n (t - y) * x2 =0.8+0.15*(-1-1)*(1) = 0.5

Wb(new) = Wb(old) + n (t-y)*b = -0.1+0.15*(-1-1)*1 = -0.4

The third instance (1, -1) is presented:

yin = (-0.1)(1)+(0.5)(-1)-0.4(1) = -1

y = f(yin) = -1 (predicted class label)

t = -1 (actual class label, i.e. the target)

Target t = -1 and predicted label y = -1 are equal, so no update is needed.

The fourth instance (1, 1) is presented:

yin = (-0.1)(1)+(0.5)(1)-0.4(1) = 0

y = f(yin) = 1 (predicted class label)

t = 1 (actual class label, i.e. the target)

Target t = 1 and predicted label y = 1 are equal, so no update is needed.

WE SHOULD REPEAT THE WHOLE PROCESS
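In code, "repeating the whole process" means looping over epochs until a complete pass produces no weight changes; a self-contained sketch under the same assumptions (it converges to the final weights derived below):

```python
# Repeat passes over the data until an epoch makes no updates.
data = [(-1, -1, -1), (-1, 1, -1), (1, -1, -1), (1, 1, 1)]  # (x1, x2, t)
n = 0.15
w1, w2, wb, b = -0.7, 0.5, 0.2, 1

epochs = 0
while True:
    epochs += 1
    updated = False
    for x1, x2, t in data:
        y = 1 if (w1 * x1 + w2 * x2 + wb * b) >= 0 else -1
        if y != t:
            w1 += n * (t - y) * x1
            w2 += n * (t - y) * x2
            wb += n * (t - y) * b
            updated = True
    if not updated:   # every instance classified correctly: converged
        break

print(f"converged after {epochs} epochs: "
      f"w1={w1:+.1f}, w2={w2:+.1f}, wb={wb:+.1f}")
```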

The first instance (-1, -1) is presented:

yin = (-0.1)(-1)+(0.5)(-1)-0.4(1) = -0.8

y = f(yin) = -1 (predicted class label)

t = -1 (actual class label, i.e. the target)

Target t = -1 and predicted label y = -1 are equal, so no update is needed.

The second instance (-1, 1) is presented:

yin = (-0.1)(-1)+(0.5)(1)-0.4(1) = 0.2

y = f(yin) = 1 (predicted class label)

t = -1 (actual class label, i.e. the target)

Target t = -1 and predicted label y = 1 differ, so the weights are updated:

w1 (new) = w1(old) + n (t - y) * x1 =-0.1+ 0.15*(-1-1)*(-1) = 0.2

w2 (new) = w2(old) + n (t - y) * x2 =0.5+0.15*(-1-1)*(1) = 0.2

Wb(new) = Wb(old) + n (t-y)*b = -0.4+0.15*(-1-1)*1 = -0.7

w1=0.2, w2=0.2, wb= -0.7

The third instance (1, -1) is presented:

yin = (0.2)(1)+(0.2)(-1)-0.7(1) = -0.7

y = f(yin) = -1 (predicted class label)

t = -1 (actual class label, i.e. the target)

Target t = -1 and predicted label y = -1 are equal, so no update is needed.

w1=0.2, w2=0.2, wb= -0.7

The fourth instance (1, 1) is presented:

yin = (0.2)(1)+(0.2)(1)-0.7(1) = -0.3

y = f(yin) = -1 (predicted class label)

t = 1 (actual class label, i.e. the target)

Target t = 1 and predicted label y = -1 differ, so the weights are updated:

w1 (new) = w1(old) + n (t - y) * x1 =0.2+ 0.15*(1-(-1))*(1) = 0.5

w2 (new) = w2(old) + n (t - y) * x2 =0.2+0.15*(1-(-1))*(1) = 0.5

Wb(new) = Wb(old) + n (t-y)*b = -0.7+0.15*(1-(-1))*1 = -0.4

w1=0.5, w2=0.5, wb= -0.4

WE SHOULD REPEAT THE WHOLE PROCESS

The first instance (-1, -1) is presented:

yin = (0.5)(-1)+(0.5)(-1)-0.4(1) = -1.4

y = f(yin) = -1 (predicted class label)

t = -1 (actual class label, i.e. the target)

Target t = -1 and predicted label y = -1 are equal, so no update is needed.

w1=0.5, w2=0.5, wb= -0.4



The second instance (-1, 1) is presented:

yin = (0.5)(-1)+(0.5)(1)-0.4(1) = -0.4

y = f(yin) = -1 (predicted class label)

t = -1 (actual class label, i.e. the target)

Target t = -1 and predicted label y = -1 are equal, so no update is needed.

w1=0.5, w2=0.5, wb= -0.4

The third instance (1, -1) is presented:

yin = (0.5)(1)+(0.5)(-1)-0.4(1) = -0.4

y = f(yin) = -1 (predicted class label)

t = -1 (actual class label, i.e. the target)

Target t = -1 and predicted label y = -1 are equal, so no update is needed.

w1=0.5, w2=0.5, wb= -0.4

The fourth instance (1, 1) is presented:

yin = (0.5)(1)+(0.5)(1)-0.4(1) = 0.6

y = f(yin) = 1 (predicted class label)

t = 1 (actual class label, i.e. the target)

Target t = 1 and predicted label y = 1 are equal, so no update is needed. All four training instances are now classified correctly, so training stops.

w1 = 0.5, w2 = 0.5, wb = -0.4



FINAL RESULT: w1 = 0.5, w2 = 0.5, wb = -0.4.

To predict the class of a test instance (5, 6):

yin = (0.5)(5)+(0.5)(6)-0.4(1) = 5.1

y = f(yin) = 1, so the test instance is assigned to the positive class.
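A sketch of this final prediction step in the same style (the weights are taken from the result above):

```python
# Classify the test instance (5, 6) with the final weights.
w1, w2, wb, b = 0.5, 0.5, -0.4, 1
x1, x2 = 5, 6

yin = w1 * x1 + w2 * x2 + wb * b      # 2.5 + 3.0 - 0.4 = 5.1
y = 1 if yin >= 0 else -1             # threshold function
print(f"yin = {yin:.1f}, predicted class = {y}")   # positive class (1)
```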
