B.Tech CSE 8th Sem Pattern Recognition 2013

The document is an examination paper for a Pattern Recognition course, detailing the structure of the exam which includes multiple choice questions, short answer questions, and long answer questions. It covers various topics such as supervised and unsupervised learning, Bayes classifiers, clustering algorithms, and feature selection. The exam is designed for CS/B.TECH (CSE) students and has a total duration of 3 hours with full marks of 70.

Name : ……………………………………………………………
Roll No. : ………………………………………………………..
Invigilator’s Signature : ………………………………………..

CS/B.TECH (CSE)/SEM-8/CS-801F/2013
2013
PATTERN RECOGNITION
Time Allotted : 3 Hours Full Marks : 70

The figures in the margin indicate full marks.


Candidates are required to give their answers in their own words
as far as practicable.

GROUP – A
( Multiple Choice Type Questions )

1. Choose the correct alternatives for the following :

10 × 1 = 10
i) Clustering algorithms usually employ

a) supervised learning

b) unsupervised learning

c) reinforcement learning

d) competitive learning.


ii) The likelihoods of classes w1 and w2 follow the normal distributions N ( – 0·5, 2 ) and N ( 0·5, 2 ), respectively. For equal priors, a pattern X = 1·0 belongs to

a) class w1

b) class w2

c) either class w1 or class w2

d) both the classes.
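A quick numerical check of (ii) can be written in a few lines of Python; this sketch treats the second parameter of N( ·, · ) as the variance (an assumption; the comparison comes out the same if it is read as the standard deviation):

```python
import math

def gaussian_pdf(x, mean, var):
    """Univariate normal density N(mean, var)."""
    return math.exp(-(x - mean) ** 2 / (2 * var)) / math.sqrt(2 * math.pi * var)

x = 1.0
# With equal priors, the decision depends only on the class likelihoods.
p1 = gaussian_pdf(x, -0.5, 2.0)  # class w1 ~ N(-0.5, 2)
p2 = gaussian_pdf(x,  0.5, 2.0)  # class w2 ~ N( 0.5, 2)
print("w1" if p1 > p2 else "w2")  # prints "w2": x = 1.0 lies closer to mean 0.5
```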

iii) If the covariance matrices for all of the classes are identical, then the discriminant functions will be

a) Linear b) Quadratic

c) Polynomial d) None of these.

iv) For a uniform prior, we can estimate the parameters of a density function by using

a) maximum likelihood ( ML )

b) maximum a posteriori ( MAP )

c) either ML or MAP

d) none of these.

v) A K-Nearest Neighbor based classifier is

a) linear and optimal

b) linear and suboptimal

c) nonlinear and optimal

d) nonlinear and suboptimal.


vi) If P_NN is the classification error probability for the Nearest Neighbor rule and P_B is the Bayes error, then

a) P_B ≤ P_NN ≤ 2P_B b) P_NN ≥ 2P_B

c) P_NN ≤ 2P_B d) P_NN = P_B.

vii) Gradient descent search is not applicable to find optima on a

a) rough surface

b) smooth surface

c) surface with single optima

d) surface with multiple optima.

viii) A perceptron is not able to implement

a) OR gate b) AND gate

c) XOR gate d) NOT gate.


ix) Given two fuzzy clusters A1 and A2, a data point x in the two-class case ( fuzzy C-means clustering ) satisfies

a) µ_A1( x ) + µ_A2( x ) = 1

b) µ_A1( x ) + µ_A2( x ) < 1

c) µ_A1( x ) + µ_A2( x ) > 1

d) µ_A1( x ) + µ_A2( x ) ≠ 1.

x) Principal component analysis is one important step in

a) Data dimension reduction

b) Data encryption

c) Noise filtering

d) Data communication.


GROUP – B
( Short Answer Type Questions )
Answer any three of the following. 3 × 5 = 15

2. Compare and contrast supervised and unsupervised learning.

3. Design a Bayes classifier in terms of a set of discriminant functions.

4. A sample from class-A is located at ( X, Y, Z ) = ( 1, 2, 3 ), a sample from class-B is at ( 7, 4, 5 ) and a sample from class-C is at ( 6, 2, 1 ). How would a sample at ( 3, 4, 5 ) be classified using the Nearest Neighbor technique and Euclidean distance ?
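A minimal Python check of question 4 (assuming the standard Euclidean metric) could look like this:

```python
import math

# One labelled sample per class, exactly as given in the question.
samples = {"A": (1, 2, 3), "B": (7, 4, 5), "C": (6, 2, 1)}
query = (3, 4, 5)

def euclidean(p, q):
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(p, q)))

# Nearest-neighbour rule: assign the label of the closest training sample.
label = min(samples, key=lambda c: euclidean(samples[c], query))
print(label, {c: round(euclidean(p, query), 3) for c, p in samples.items()})
```

The distances come out to roughly 3.464 (A), 4.0 (B) and 5.385 (C), so the nearest-neighbour rule assigns the query to class A.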

5. Write a short note on generalized linear discriminant function.

6. Consider the following proximity matrix :


          x1   x2   x3   x4   x5
P =  x1    0    6    8    2    7
     x2         0    1    5    3
     x3              0   10    9
     x4                   0    4
     x5                        0

Draw the resulting dendrogram by applying the single-link clustering algorithm.
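One way to reproduce the single-link merges of question 6 (assuming SciPy and Matplotlib are available) is to pass the upper triangle of P, in condensed form, to SciPy's hierarchical clustering routines:

```python
import numpy as np
import matplotlib.pyplot as plt
from scipy.cluster.hierarchy import linkage, dendrogram

# Upper triangle of the proximity matrix P, row by row (condensed form).
condensed = np.array([6, 8, 2, 7,    # d(x1, x2..x5)
                      1, 5, 3,       # d(x2, x3..x5)
                      10, 9,         # d(x3, x4..x5)
                      4],            # d(x4, x5)
                     dtype=float)

Z = linkage(condensed, method="single")   # single-link (minimum-distance) merging
dendrogram(Z, labels=["x1", "x2", "x3", "x4", "x5"])
plt.show()
```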


GROUP – C
( Long Answer Type Questions )
Answer any three of the following. 3 × 15 = 45

7. a) Describe the basic steps involved in the design of a pattern recognition system.

b) What is maximum likelihood ( ML ) estimation ? Show that if the likelihood function is univariate Gaussian with unknowns the mean µ as well as the variance σ², then the ML estimates are given by

µ = (1/N) Σ_{k=1}^{N} X_k   and   σ² = (1/N) Σ_{k=1}^{N} ( X_k – µ )²,

where X_k is the k-th pattern and N is the total number of training patterns.

c) Compare parametric and non-parametric techniques.

6+5+4
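The closed-form ML estimates asked for in 7(b) can also be checked numerically; below is a small sketch with made-up data (assuming NumPy is available):

```python
import numpy as np

# Illustrative one-dimensional training patterns.
X = np.array([1.2, -0.7, 0.4, 2.1, 0.0])
N = len(X)

mu_ml = X.sum() / N                     # (1/N) * sum_k X_k
var_ml = ((X - mu_ml) ** 2).sum() / N   # (1/N) * sum_k (X_k - mu)^2  (biased)

# These coincide with NumPy's mean and population variance (ddof=0).
assert np.isclose(mu_ml, X.mean()) and np.isclose(var_ml, X.var(ddof=0))
print(mu_ml, var_ml)
```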


8. a) What is a Bayesian classifier ? Prove that it is an optimal classifier.

b) In a two-class problem with a single feature X, the pdfs of both classes are Gaussians with variance σ² = 1/2 and mean values 0 and 1 respectively. If P( w1 ) = P( w2 ) = 1/2, compute the threshold value X_0 for minimum error probability. 4+5+6
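For 8(b), with equal priors and equal variances the minimum-error threshold lies midway between the class means, i.e. X_0 = ( 0 + 1 )/2 = 0.5; a brief, purely illustrative Python cross-check:

```python
import numpy as np

var = 0.5                      # common class variance from the question
mu1, mu2 = 0.0, 1.0            # class means

# Closed form: with equal priors and variances the log-likelihood ratio
# vanishes at the midpoint of the means.
x0 = (mu1 + mu2) / 2.0         # -> 0.5

# Numerical cross-check: find where the two class densities intersect.
def pdf(x, mu):
    return np.exp(-(x - mu) ** 2 / (2 * var)) / np.sqrt(2 * np.pi * var)

xs = np.linspace(-2.0, 3.0, 100001)
crossing = xs[np.argmin(np.abs(pdf(xs, mu1) - pdf(xs, mu2)))]
print(x0, crossing)            # both approximately 0.5
```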

9. a) What is density estimation ? What are the necessary conditions for its convergence ?

b) Compare Parzen Windows and k-Nearest Neighbor density estimation techniques.

c) What is a perceptron ? Discuss briefly the perceptron-based learning algorithm. 4+4+7
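The sample-by-sample perceptron learning rule referred to in 9(c) can be sketched in a few lines of Python; the data set below (an AND gate with labels in {-1, +1}) is only an illustration:

```python
import numpy as np

def perceptron_train(X, y, lr=1.0, epochs=100):
    """Perceptron rule: w <- w + lr * y_i * x_i for every misclassified sample.
    X: (n_samples, n_features), y: labels in {-1, +1}. Returns augmented weights."""
    Xa = np.hstack([X, np.ones((len(X), 1))])    # append a bias component
    w = np.zeros(Xa.shape[1])
    for _ in range(epochs):
        errors = 0
        for xi, yi in zip(Xa, y):
            if yi * np.dot(w, xi) <= 0:          # misclassified (or on the boundary)
                w += lr * yi * xi
                errors += 1
        if errors == 0:                          # converged on separable data
            break
    return w

X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([-1, -1, -1, 1])                    # AND gate, a separable problem
print(perceptron_train(X, y))
```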

10. a) What is clustering ? Categorize the different clustering algorithms of the pattern recognition domain.

b) Explain the Fuzzy C-means clustering algorithm. Write a short note on its criterion function. 6+9
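The alternating updates behind fuzzy C-means in 10(b), which minimise the criterion J_m = Σ_i Σ_k u_ik^m ‖x_k − v_i‖², can be sketched as follows (illustrative only, with fuzzifier m = 2, Euclidean distances and a made-up data set):

```python
import numpy as np

def fuzzy_c_means(X, c=2, m=2.0, iters=100, eps=1e-5):
    """Minimal fuzzy C-means: alternate membership (U) and centre (V) updates."""
    n = len(X)
    rng = np.random.default_rng(0)
    U = rng.random((c, n))
    U /= U.sum(axis=0)                            # memberships of each point sum to 1
    for _ in range(iters):
        V = (U ** m) @ X / (U ** m).sum(axis=1, keepdims=True)       # cluster centres
        d = np.linalg.norm(X[None, :, :] - V[:, None, :], axis=2) + 1e-12
        p = 2.0 / (m - 1.0)
        U_new = 1.0 / (d ** p * (1.0 / d ** p).sum(axis=0))          # u_ik update
        if np.abs(U_new - U).max() < eps:
            U = U_new
            break
        U = U_new
    return U, V

X = np.array([[0.0], [0.1], [0.2], [5.0], [5.1], [5.2]])
U, V = fuzzy_c_means(X, c=2)
print(np.round(V, 2), np.round(U.sum(axis=0), 3))  # memberships sum to 1 per point
```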


11. a) What is feature selection ? What is meant by optimal and suboptimal feature subset selection ?

b) Explain one suboptimal feature subset selection technique.

c) What is feature generation ? Write a short note on principal component analysis. 4+5+6
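Principal component analysis, as asked about in 11(c), reduces to an eigendecomposition of the sample covariance matrix of the centred data; a compact sketch with synthetic data (assuming NumPy):

```python
import numpy as np

def pca(X, k):
    """Project X (n_samples, n_features) onto its k leading principal components."""
    Xc = X - X.mean(axis=0)                    # centre the data
    cov = np.cov(Xc, rowvar=False)             # sample covariance matrix
    eigvals, eigvecs = np.linalg.eigh(cov)     # eigh: covariance is symmetric
    order = np.argsort(eigvals)[::-1]          # sort by decreasing variance
    W = eigvecs[:, order[:k]]                  # top-k principal directions
    return Xc @ W, eigvals[order]

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 3)) @ np.diag([2.0, 1.0, 0.1])   # anisotropic toy data
Y, variances = pca(X, k=2)
print(Y.shape, np.round(variances, 2))         # most variance in the first component
```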
