AD3501 Deep-Learning Model Question

The document is a question bank for a Deep Learning course at Anna University, covering various topics such as deep networks, machine learning basics, and convolutional networks. It includes questions categorized into parts A, B, and C, focusing on understanding, analysis, and application of deep learning concepts. The document is intended for students in the Department of Artificial Intelligence and Data Science for the academic year 2023-2024.




DEPARTMENT OF ARTIFICIAL INTELLIGENCE & DATA SCIENCE

V SEMESTER

DEEP LEARNING

Regulation – 2021

Academic Year 2023 – 2024 (EVEN)

Prepared by

Mr. KALIMUTHAN.C (AP/IT)


DEPARTMENT OF ARTIFICIAL INTELLIGENCE AND DATA SCIENCE

QUESTION BANK

SUBJECT : DEEP LEARNING


SEM/YEAR: V/III
UNIT - I: DEEP NETWORKS BASICS
Linear Algebra: Scalars -- Vectors -- Matrices and tensors; Probability Distributions -- Gradient based Optimization –
Machine Learning Basics: Capacity -- Overfitting and underfitting -- Hyperparameters and validation sets -- Estimators
-- Bias and variance -- Stochastic gradient descent -- Challenges motivating deep learning; Deep Networks: Deep
feedforward networks; Regularization – Optimization
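As an illustrative study aid for the stochastic gradient descent topic listed in this unit (not part of the prescribed syllabus text; the toy data, learning rate and epoch count below are assumed for the sketch):

```python
import numpy as np

# Hypothetical toy data: y = 2x + 1 plus a little noise
rng = np.random.default_rng(0)
X = rng.uniform(-1, 1, size=(100, 1))
y = 2.0 * X[:, 0] + 1.0 + 0.01 * rng.normal(size=100)

w, b = 0.0, 0.0          # model parameters
lr = 0.1                 # learning rate (a hyperparameter)
for epoch in range(50):
    for i in rng.permutation(len(X)):   # one example at a time = "stochastic"
        pred = w * X[i, 0] + b
        err = pred - y[i]
        # gradients of the squared error 0.5 * err**2
        w -= lr * err * X[i, 0]
        b -= lr * err

print(w, b)              # should end up close to 2.0 and 1.0
```

The inner loop updates the parameters after every single example, which is what distinguishes stochastic gradient descent from full-batch gradient descent.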

PART – A

Q.No Question BT Level Competence

1 What is Deep Learning? BTL-1 Remember


2 Differentiate scalar and vector. BTL-2 Understand

3 Compare Deep Learning and rule based system. BTL-4 Analyze


4 What is Auto encoder? BTL-1 Remember
5 List out the special kind of matrices. BTL-1 Remember
6 What is representation learning? BTL-1 Remember
7 Compare Deep Learning with Machine learning. BTL-5 Evaluate
8 State the Bayes rule. BTL-1 Remember
9 List out some supervised learning algorithms. BTL-1 Remember
10 Can principal component analysis be viewed as an unsupervised learning algorithm? Examine. BTL-3 Apply

11 Discuss about supervised learning algorithms. BTL-2 Understand


12 Give Venn diagram for Deep Learning. BTL-2 Understand
13 Specify the formula for conditional probability. BTL-6 Create
14 Illustrate support vector machine. BTL-3 Apply
15 Show the formula for Variance. BTL-3 Apply
16 Differentiate Independence and Conditional Independence. BTL-2 Understand
17 Analyze Eigen Decomposition. BTL-4 Analyze

18 Analyze random variable. BTL-4 Analyze


19 Probability theory is a fundamental tool of many disciplines of science and engineering. Justify. BTL-5 Evaluate
20 Develop a formula for the unbiased sample covariance matrix associated with an m x n dimensional matrix X. Assume E[x] = 0. BTL-6 Create
21 Analyze tensors. BTL-4 Analyze
22 Illustrate the identity matrix. BTL-3 Apply
23 Discuss about matrix inverse. BTL-2 Understand
24 Assess linear dependence of vectors. BTL-5 Evaluate

PART – B
1 i. What is Deep Learning? (3) BTL-1 Remember
ii. Describe how different parts of an Artificial Intelligence system relate to each other within different AI disciplines, in detail with a diagram. (10)
2 Describe how deep learning is a kind of representation learning with the BTL-1 Remember
Venn diagram. (13)
3 List and explain the historical trends in Deep Learning. (13) BTL-1 Remember

4 i. Discuss about scalars. (7) BTL-2 Understand
ii. Give a detailed description of vectors. (6)
5 i. Give the difference between deep learning and machine learning. (7) BTL-2 Understand
ii. Give the various concepts of probability. (6)
6 i. Demonstrate linear dependence and independence of vectors. (7) BTL-3 Apply
ii. Explain span of vectors. (6)
7 Analyze and write short notes on the following. BTL-4 Analyze
i. Vectors. (6)
ii. Matrices.(7)
8 Explain the following in detail. BTL-4 Analyze
i. Eigen Decomposition. (7)
ii. Tensors. (6)
9 Assess the following. BTL-5 Evaluate
i. Expectation .(5)
ii. Variance.(4)
iii. Covariance . (4)
Extrapolate conditional probability and Develop a summary of various
10 BTL-6 Create
common probability distribution. (13)
11 Describe Stochastic Gradient Descent. (13) BTL-1 Remember
12 i. Illustrate the importance of principal components analysis. (6) BTL-3 Apply
ii. Explain support vector machines in detail. (7)
13 Explain supervised learning algorithms. (13) BTL-4 Analyze


14 Discuss unsupervised learning algorithm. (13) BTL-2 Understand


15 Discuss Normal distribution. (13) BTL-2 Understand
16 Explain Probability Mass Function and Probability Density Function. (13) BTL-3 Apply
17 Explain Principal Components Analysis. (13) BTL-5 Evaluate

PART – C
1 Develop short notes on the following with respect to deep learning, with examples. BTL-6 Create
i) Scalars and Vectors. (6)
ii) Matrices. (7)
Assess the following with respect to deep learning examples.
2 i) Random Variables. (6) BTL-5 Evaluate
ii) Probability. (7)
3 Develop a supervised learning algorithm and explain in detail.(15) BTL-6 Create
4 Assess unsupervised learning algorithm.(15) BTL-5 Evaluate
5 Assess the historical developments in deep learning. (15) BTL-5 Evaluate

UNIT – I cont...
Deep Feed Forward Network: Learning XOR – Gradient Based Learning- Hidden Units – Architecture
Design – Back Propagation Algorithms. Regularization for Deep Learning: Parameter Norm Penalties –
Regularization and unconstrained Problems – Dataset Augmentation – Noise Robustness – Semi
supervised Learning – Challenges in Neural Network Optimization.
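As a hedged illustration of the "Learning XOR" and back propagation topics above (a minimal numpy sketch, not a prescribed implementation; the hidden-layer size, learning rate and step count are assumed):

```python
import numpy as np

# XOR truth table: the classic example a linear model cannot fit
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

rng = np.random.default_rng(0)
W1 = rng.normal(size=(2, 4)); b1 = np.zeros(4)   # hidden layer (4 units, assumed)
W2 = rng.normal(size=(4, 1)); b2 = np.zeros(1)   # output layer

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

lr = 2.0
losses = []
for step in range(5000):
    # forward pass
    h = sigmoid(X @ W1 + b1)
    out = sigmoid(h @ W2 + b2)
    losses.append(float(np.mean((out - y) ** 2)))
    # backward pass (chain rule): gradients of the mean squared error
    d_out = (out - y) * out * (1 - out) * (2 / len(X))
    d_h = (d_out @ W2.T) * h * (1 - h)
    W2 -= lr * h.T @ d_out; b2 -= lr * d_out.sum(axis=0)
    W1 -= lr * X.T @ d_h;   b1 -= lr * d_h.sum(axis=0)

print(losses[0], losses[-1])   # the training error should drop sharply
```

The backward pass is exactly the chain rule applied layer by layer, which is the content of the back propagation algorithm asked about in this unit.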

PART – A
Q.No Questions BT Level Competence
1 Point out different set of layers in Feed forward networks. BTL-4 Analyze
2 Point out the default activation function for modern neural networks. BTL-4 Analyze
3 Compare linear models and neural networks. BTL-5 Evaluate
4 Develop three generalizations of rectified linear units based on using a BTL-6 Create
non-zero slope.
5 What are Deep Feed Forward networks? BTL-1 Remember
6 List reasonably common hidden unit types. BTL-1 Remember
7 Give the drawback of rectified linear units. BTL-2 Understand
8 Describe gradient descent. BTL-2 Understand
9 Give example of a feed forward neural network. BTL-2 Understand
10 Define chain rule of calculus. BTL-1 Remember
11 List some classification problems where Data augmentation is used. BTL-1 Remember
12 Define universal approximation theorem for feed forward network. BTL-1 Remember


13 What are critical points or stationary points of a function? BTL-1 Remember
14 Describe regularization for deep learning. BTL-2 Understand
15 Illustrate semi supervised learning. BTL-3 Apply
16 Illustrate the derivative function used in gradient descent algorithm. BTL-3 Apply
17 Explain importance of dataset augmentation. BTL-3 Apply
18 Analyze and write chain rule of calculus. BTL-4 Analyze
19 Justify the reason for calling feedforward neural networks "networks". BTL-5 Evaluate
20 Develop a computational graph for any function. BTL-6 Create
21 Give reason for the term “feed forward” used in the feed forward BTL-2 Understand
networks.
22 Explain XOR operation. BTL-3 Apply
23 Analyze cost function. BTL-4 Analyze
24 Justify the application of Dataset Augmentation in various tasks. BTL-5 Evaluate

PART – B
1 Describe Deep feed forward networks. (13) BTL-1 Remember
2 i. Explain cost function in gradient based learning. (6) BTL-3 Apply
ii. Explain learning conditional distributions with maximum likelihood. (7)
3 i. Describe learning conditional statistics in gradient based learning. (7) BTL-1 Remember
ii. Explain linear units for Gaussian Output Distributions. (6)
4 Explain output units of feed forward networks. (13) BTL-3 Apply
5 i. Explain sigmoid units for Bernoulli Output Distributions.(8) BTL-5 Evaluate
ii. Justify the importance of Rectified linear units in Hidden units. (5)
6 i. Give Softmax units for Multinoulli Output Distributions. (7) BTL-2 Understand
ii. Discuss about Hidden Units. (6)
7 i. Describe Rectified linear units and their generalizations. (7) BTL-2 Understand
ii. Describe Logistic Sigmoid and Hyperbolic Tangent. (6)
8 i. Write short notes on Radial Basis Function, Softplus and Hard tanh. (7) BTL-4 Analyze
ii. Write short notes on Architecture Design. (6)

9 i. Describe Back Propagation algorithm. (7) BTL-1 Remember


ii. Explain regularization for deep learning. (6)
10 Briefly describe Universal Approximation Properties and Depth. (13) BTL-1 Remember
11 Analyze and write short notes on Dataset Augmentation. (13) BTL-4 Analyze

12 Develop a data set and demonstrate Noise Robustness. (13) BTL-6 Create
13 Discuss in detail about chain rule of calculus. (13) BTL-2 Understand


14 Illustrate Computational graphs. (13) BTL-4 Analyze


15 Give the applications of Dataset Augmentation. (13) BTL-2 Understand
16 Explain Multi-Task Learning. (13) BTL-3 Apply
17 Assess Computational graphs with necessary diagrams. (13) BTL-5 Evaluate
PART – C
1 Develop a Deep Feed forward network and explain. (15) BTL-6 Create
Assess the routines to implement forward propagation computation.
2 BTL-5 Evaluate
(15)
3 Assess the difference between linear models and neural networks. (15) BTL-5 Evaluate
4 Develop your own scenarios to demonstrate computational graph. (15) BTL-6 Create
5 Develop Chain Rule of Calculus. (15) BTL-6 Create

UNIT II
Convolution Operation -- Sparse Interactions -- Parameter Sharing -- Equivariance -- Pooling -- Convolution Variants:
Strided -- Tiled -- Transposed and dilated convolutions; CNN Learning: Nonlinearity Functions -- Loss Functions --
Regularization -- Optimizers --Gradient Computation.
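The convolution operation listed in this unit can be illustrated with a short sketch (illustrative only; as in most CNN libraries, the loop below actually computes cross-correlation, and the image and kernel are assumed toy values):

```python
import numpy as np

def conv2d_valid(image, kernel):
    """'Valid' 2-D convolution (implemented as cross-correlation,
    the form most CNN libraries actually use)."""
    H, W = image.shape
    kh, kw = kernel.shape
    out = np.empty((H - kh + 1, W - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            # each output is a weighted sum over a local patch:
            # sparse interactions, with the same kernel reused everywhere
            # (parameter sharing)
            out[i, j] = np.sum(image[i:i + kh, j:j + kw] * kernel)
    return out

image = np.arange(16, dtype=float).reshape(4, 4)
edge = np.array([[1.0, -1.0]])        # horizontal difference kernel
print(conv2d_valid(image, edge))      # every entry is -1.0 here:
                                      # each pixel minus its right neighbour
```

Because every pixel in this toy image exceeds its left neighbour by exactly 1, the difference kernel produces a constant map, which makes the sliding-window behaviour easy to verify by hand.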

Q.No Questions BT Level Competence
PART – A
1 An essential feature of any convolutional network implementation is the ability to implicitly zero-pad the input V. Justify. BTL-5 Evaluate
2 The output layer of a convolutional network is usually relatively inexpensive compared to a learning layer. Justify. BTL-5 Evaluate
3 What are convolutional networks? BTL-1 Remember
4 Create a chart that demonstrates convolution with a stride. BTL-6 Create
5 How does pooling handle inputs of varying size? BTL-4 Analyze
6 What is meant by convolution? BTL-1 Remember
7 List three important ideas that help to improve a machine learning system. BTL-1 Remember
8 What is unshared convolution? BTL-2 Understand
9 Define primary visual cortex. BTL-1 Remember
10 How to reduce the cost of convolutional network training? BTL-2 Understand
11 Simulate the idea behind reverse correlation. BTL-6 Create
12 Discuss about parameter sharing in neural networks. BTL-2 Understand
13 Give three properties of V1 that a convolutional network layer is designed to capture. BTL-2 Understand
14 Explain feature map. BTL-4 Analyze
15 Explain how a convolutional layer has a property called equivariance to translation. BTL-3 Apply


16 List three stages of a convolutional network. BTL-1 Remember


17 List out various formats of data that can be used with convolutional BTL-1 Remember
Networks.
18 Illustrate pooling stage in convolutional network. BTL-3 Apply
19 Differentiate complex layer terminology and simple layer terminology in convolutional networks. BTL-4 Analyze
20 Show three basic strategies for obtaining convolution kernels without supervised training. BTL-3 Apply
21 Give example for convolution. BTL-2 Understand
22 Illustrate reverse correlation. BTL-3 Apply
23 Explain complex layer terminology. BTL-4 Analyze
24 Examine equivariance to translation. BTL-5 Evaluate
PART – B
1 Write an example function for Convolution operation and explain in detail. BTL-1 Remember
(13)
2 Explain the following with suitable diagram. BTL-4 Analyze
i. Sparse interactions. (6)
ii. Parameter sharing. (7)
3 Describe Pooling with suitable example. (13) BTL-1 Remember
4 Write an expression for Unshared convolution with explanation and BTL-1 Remember
explain Tiled convolution.(13)
5 Discuss in detail the variants of the Basic Convolution Function. (13) BTL-2 Understand
6 Construct an architecture that shows complex layer terminology and simple layer terminology in a convolutional neural network. (13) BTL-5 Evaluate
7 Discuss local connections, convolution and full connections with diagrams. (13) BTL-2 Understand
8 Develop table with examples of different formats of data that can be used BTL-6 Create
with convolutional networks. (13)
9 Describe in detail about the following. BTL-1 Remember
i. Parameter Sharing. (7)
ii. Equivariant representation. (6)
10 Differentiate locally connected layers, tiled convolution and standard BTL-4 Analyze
convolution with suitable examples and diagram. (13)
11 i. Write short notes on Max Pooling. (6) BTL-2 Understand
ii. Explain Pooling with downsampling. (7)
12 Explain random or Unsupervised Features.(13) BTL-4 Analyze
13 Illustrate unshared convolution with suitable examples. (13) BTL-3 Apply
14 i. Show three properties of V1 that a convolutional network layer BTL-3 Apply
is designed to capture. (6)
ii. Prove the working of learned invariances with a necessary example and diagram. (7)
15 Discuss parameter sharing. (13) BTL-2 Understand


16 Illustrate Equivariant representation. (13) BTL-3 Apply


17 Evaluate the working of learned invariances with a necessary example and diagram. (13) BTL-5 Evaluate
PART – C
1 Construct graphical demonstration for sparse connectivity and explain it BTL-5 Evaluate
in detail. (15)
2 Create a graphical demonstration for parameter sharing and explain it in BTL-6 Create
Detail. (15)
3 Evaluate variants of the basic convolution function. (15) BTL-5 Evaluate
4 Construct a convolutional network to demonstrate the effect of zero padding on network size, and explain the neuroscientific basis for Convolutional Networks. (15) BTL-6 Create
5 Create a table with examples of different formats of data that can be used BTL-6 Create
with convolutional networks. (15)


UNIT – III : SEQUENCE MODELING: RECURRENT AND RECURSIVE NETS
Unfolding Computational Graphs – Recurrent Neural Networks – Bidirectional RNNs – Encoder Decoder Sequence to
Sequence Architectures – Deep Recurrent Networks – Recursive Neural Networks – The Challenge of Long-Term
Dependencies – Echo State Networks – The Long Short-Term Memory and other Gated RNNs – Optimization for Long-Term
Dependencies – Explicit Memory.
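The unfolding of a recurrent computation listed above can be sketched in a few lines (illustrative only; the input and hidden sizes, weight scales and sequence length are assumed toy values):

```python
import numpy as np

rng = np.random.default_rng(0)
n_in, n_hid = 3, 5                        # assumed toy sizes
Wxh = rng.normal(scale=0.1, size=(n_in, n_hid))   # input-to-hidden weights
Whh = rng.normal(scale=0.1, size=(n_hid, n_hid))  # hidden-to-hidden weights
b = np.zeros(n_hid)

def rnn_forward(xs):
    """Unrolled forward pass: the SAME weights are reused at every
    time step, which is what unfolding the computational graph shows."""
    h = np.zeros(n_hid)
    states = []
    for x in xs:                          # one step per sequence element
        h = np.tanh(x @ Wxh + h @ Whh + b)
        states.append(h)
    return np.stack(states)

seq = rng.normal(size=(7, n_in))          # a length-7 input sequence
H = rnn_forward(seq)
print(H.shape)                            # (7, 5): one hidden state per step
```

Repeated multiplication by `Whh` across many steps is also a direct way to see where the challenge of long-term dependencies (vanishing or exploding gradients) comes from.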
PART – A
Q.No Questions BT Level Competence
1 What are Recurrent Neural Networks? BTL-1 Remember
2 What is Encoder? BTL-1 Remember
3 Give the blocks of decomposition of computation of most Recurrent Neural BTL-2 Understand
Networks.
4 What is Bidirectional Recurrent Neural Networks? BTL-1 Remember
5 Give the advantage of recursive nets over recurrent nets. BTL-2 Understand
6 What is decoder? BTL-1 Remember
7 Describe Recursive Neural Networks. BTL-1 Remember
8 Predict the concept of gated RNNs. BTL-2 Understand


9 Compare echo state network and liquid state machines. BTL-4 Analyze
10 Distinguish content-based addressing and location-based addressing in memory networks. BTL-2 Understand
11 Classify the different strategies for Multiple Time Scales. BTL-3 Apply
12 Develop block diagram for LSTM. BTL-6 Create
13 Illustrate important design patterns for recurrent neural networks. BTL-3 Apply
14 Summarize about echo state networks. BTL-5 Evaluate
15 Point out the advantage of introducing depth in Deep recurrent Networks. BTL-4 Analyze
16 Compare gradient descent with and without gradient clipping using diagrams. BTL-4 Analyze
17 Justify the major advantages of the unfolding process in computational graphs. BTL-5 Evaluate
18 Illustrate block diagram of LSTM recurrent network “cell”. BTL-3 Apply
19 What are leaky units? BTL-1 Remember
20 Develop a schematic diagram of a network with an explicit memory. BTL-6 Create
21 Give a block diagram for Long Short Term Memory. BTL-2 Understand
22 Illustrate echo state networks. BTL-3 Apply
23 Explain liquid state machines. BTL-4 Analyze
24 Assess explicit memory. BTL-5 Evaluate
PART – B
1 i. Describe Unfolding Computational Graphs. (6) BTL-1 Remember
ii. Explain Bidirectional RNNs. (7)
2 Describe the following. BTL-1 Remember
i. Teacher Forcing in Recurrent Neural Networks. (7)
ii. Networks with Output Recurrence. (6)
3 i. Describe Echo State Networks. (7) BTL-1 Remember
ii. Explain challenge of Long-Term Dependencies.(6)
4 Discuss Recurrent Neural Networks in detail.(13) BTL-2 Understand
5 Describe Deep Recurrent Networks in detail.(13) BTL-2 Understand
6 Illustrate Encoder-Decoder sequence-to-sequence Architecture. (13) BTL-3 Apply
7 Explain Leaky Units and Other Strategies for Multiple Time Scales. (13) BTL-4 Analyze

8 Point out various features of Echo state networks. (13) BTL-4 Analyze
9 Explain Optimization for Long-Term Dependencies. (13) BTL-5 Evaluate
10 Compute the gradient in a Recurrent Neural Network. (13) BTL-6 Create


11 i. Illustrate Clipping Gradients. (7) BTL-3 Apply
ii. Illustrate Regularizing to Encourage Information Flow. (6)
12 Describe the following. BTL-1 Remember
i. Long Short-Term Memory. (7)
ii. Other Gated RNNs. (6)
13 Explain in detail about the following. BTL-4 Analyze
i. Adding Skip Connections through Time. (6)
ii. Leaky Units and a Spectrum of Different Time Scales .(7)
14 Describe Explicit memory. (13) BTL-2 Understand
15 Discuss Echo State Networks. (13) BTL-2 Understand
16 Illustrate Bidirectional RNNs. (13) BTL-3 Apply
17 Explain challenge of Long-Term Dependencies. (13) BTL-5 Evaluate
PART – C
1 Develop an example for Unfolding Computational Graphs and describe the major advantages of the unfolding process. (15) BTL-6 Create
2 Explain how to compute the gradient in a Recurrent Neural Network.(15) BTL-5 Evaluate
3 Explain a modeling sequences Conditioned on Context with RNNs. (15) BTL-5 Evaluate
4 Prepare an example of Encoder- Decoder or sequence-to-sequence RNN BTL-6 Create
architecture.(15)
5 Explain various Gated RNNs. (15) BTL-5 Evaluate

UNIT V AUTOENCODERS AND GENERATIVE MODELS
Autoencoders: Undercomplete autoencoders -- Regularized autoencoders -- Stochastic encoders and decoders --
Learning with autoencoders; Deep Generative Models: Variational autoencoders – Generative adversarial networks.

PART – A
Q.No Question BT Level Competence
1 What is Probabilistic PCA and Factor Analysis? BTL-1 Remember
2 Define Linear Factor Model. BTL-1 Remember

3 Give the various generalizations of ICA. BTL-2 Understand


4 What is Independent Component Analysis? BTL-1 Remember
5 Give major advantage of slow feature analysis. BTL-2 Understand
6 Name the various tasks that can be done by probabilistic models. BTL-1 Remember

7 What is Denoising Autoencoders? BTL-1 Remember


8 Predict the primary disadvantage of the non-parametric encoder. BTL-2 Understand
9 Point out the trade-off faced in representation learning problems. BTL-4 Analyze


10 Distinguish between one-shot learning and zero-shot learning. BTL-2 Understand


11 Classify the different Graphical models. BTL-3 Apply
12 Develop distribution equation for energy based model. BTL-6 Create
13 What are undirected models? BTL-3 Apply
14 Summarize Distributed representations. BTL-5 Evaluate
15 Point out the reason why greedy layer-wise pre-training is called greedy. BTL-4 Analyze
16 Compare directed models and undirected models. BTL-4 Analyze
17 Slow Feature Analysis is an efficient application of the slowness principle. Justify. BTL-5 Evaluate
18 How many tasks must the learner perform in transfer learning? BTL-3 Apply
19 List the two different ideas combined by Unsupervised pre-training. BTL-1 Remember
20 Develop an example distribution equation that represents a Boltzmann distribution. BTL-6 Create
21 Give an example of learning algorithm based on non-distributed BTL-2 Understand
representations.
22 Compare distributed representation and a symbolic one. BTL-4 Analyze
23 Illustrate the reasons for which Modeling a rich distribution is not feasible BTL-3 Apply
in unstructured modeling.
24 Evaluate Undirected models. BTL-5 Evaluate
PART – B
1 Describe Sparse Coding. (13) BTL-1 Remember
2 Describe the following. BTL-1 Remember
i. Probabilistic PCA. (6)
ii. Factor Analysis. (7)
3 Describe the following. BTL-1 Remember
i. Independent Component Analysis. (5)
ii. Slow Feature Analysis. (8)
4 Discuss Manifold interpretation of PCA. (13) BTL-2 Understand
5 Discuss Auto encoders. (13) BTL-2 Understand
6 Write in detail about Undercomplete Autoencoders. (13) BTL-3 Apply
7 Explain Regularized Autoencoders. (13) BTL-4 Analyze
8 Compare Structured Probabilistic Model and Unstructured Modeling (13) BTL-4 Analyze
9 Summarize usage of various Graphs to describe Model Structure.(13) BTL-5 Evaluate
10 Develop an example distribution equation for energy-based model and explain in BTL-6 Create
detail. (13)
11 i. Write short notes on Sparse Autoencoders. (7) BTL-4 Analyze
ii. Illustrate Denoising Autoencoders. (6)

12 Describe the following. BTL-1 Remember
i. Representation Learning. (6)
ii. Greedy Layer-Wise Unsupervised Pretraining. (7)

13 Discuss in detail about transfer learning and Domain Adaptation. (13) BTL-3 Apply
14 Describe Distributed Representation.(13) BTL-2 Understand
15 Discuss about Slow Feature Analysis. (13) BTL-2 Understand
16 Write about representation learning. (13) BTL-3 Apply
17 Explain Markov random fields. (13) BTL-5 Evaluate

PART – C
1 Develop a short notes on Separation and D-Separation.(15) BTL-6 Create
2 Explain Monte Carlo methods.(15) BTL-5 Evaluate
3 Explain Autoencoders.(15) BTL-5 Evaluate
4 Develop various graphs to describe Model Structure. (15) BTL-6 Create
5 Assess Independent Component Analysis. (15) BTL-5 Evaluate
