AD3501 Deep Learning Model Question Bank
V SEMESTER
DEEP LEARNING
Regulation – 2021
Prepared by
QUESTION BANK
PART – B
1 i. What is Deep Learning? (3) BTL-1 Remember
  ii. Describe in detail, with a diagram, how different parts of an Artificial Intelligence system relate to each other within different AI disciplines. (10)
2 Describe how deep learning is a kind of representation learning, using a Venn diagram. (13) BTL-1 Remember
3 List and explain the historical trends in Deep Learning. BTL-1 Remember
4 i. Discuss scalars. (7) BTL-2 Understand
  ii. Give a detailed description of vectors. (6)
5 i. Give the difference between deep learning and machine learning. (7) BTL-2 Understand
  ii. Give the various concepts of probability. (6)
6 i. Demonstrate linear dependence and independence of vectors. (7) BTL-3 Apply
  ii. Explain the span of vectors. (6)
7 Analyze and write short notes on the following. BTL-4 Analyze
  i. Vectors. (6)
  ii. Matrices. (7)
8 Explain the following in detail. BTL-4 Analyze
  i. Eigendecomposition. (7)
  ii. Tensors. (6)
9 Assess the following. BTL-5 Evaluate
  i. Expectation. (5)
  ii. Variance. (4)
  iii. Covariance. (4)
10 Extrapolate conditional probability and develop a summary of various common probability distributions. (13) BTL-6 Create
11 Describe Stochastic Gradient Descent. (13) BTL-1 Remember
12 i. Illustrate the importance of principal components analysis. (6) BTL-3 Apply
ii. Explain support vector machines in detail. (7)
13 Explain supervised learning algorithms. (13) BTL-4 Analyze
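As a study aid for the eigendecomposition and principal components analysis questions above, the following is a minimal NumPy sketch; the synthetic data, sizes, and variable names are illustrative assumptions, not part of the syllabus.

```python
import numpy as np

# Principal components analysis via eigendecomposition of the sample
# covariance matrix (synthetic, illustrative data).
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 3)) @ np.array([[3.0, 0.0, 0.0],
                                          [0.0, 1.0, 0.0],
                                          [0.0, 0.0, 0.1]])

Xc = X - X.mean(axis=0)                 # center the data
C = Xc.T @ Xc / (len(X) - 1)            # sample covariance matrix
eigvals, eigvecs = np.linalg.eigh(C)    # eigh returns ascending eigenvalues
order = np.argsort(eigvals)[::-1]       # reorder: largest variance first
components = eigvecs[:, order]          # principal directions (columns)
Z = Xc @ components[:, :2]              # project onto the top 2 components
```

The eigenvectors of the covariance matrix are the principal directions; projecting onto the leading ones preserves the most variance, which is the usual exam justification for PCA as dimensionality reduction.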
PART – C
1 Develop short notes on the following with respect to deep learning, with examples. BTL-6 Create
   i) Scalars and Vectors. (6)
   ii) Matrices. (7)
2 Assess the following with respect to deep learning, with examples. BTL-5 Evaluate
   i) Random Variables. (6)
   ii) Probability. (7)
3 Develop a supervised learning algorithm and explain it in detail. (15) BTL-6 Create
4 Assess unsupervised learning algorithms. (15) BTL-5 Evaluate
5 Assess the historical developments in deep learning. (15) BTL-5 Evaluate
UNIT – I (cont.)
Deep Feed Forward Networks: Learning XOR – Gradient-Based Learning – Hidden Units – Architecture Design – Back-Propagation Algorithms. Regularization for Deep Learning: Parameter Norm Penalties – Regularization and Under-Constrained Problems – Dataset Augmentation – Noise Robustness – Semi-supervised Learning – Challenges in Neural Network Optimization.
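The Learning XOR, gradient-based learning, and back-propagation topics above can be illustrated with a minimal sketch; the network size, activations, learning rate, and iteration count below are illustrative assumptions.

```python
import numpy as np

# A tiny 2-4-1 feed-forward network trained by gradient descent (full
# batch) to learn XOR: tanh hidden units, sigmoid output, cross-entropy
# loss, gradients derived by back-propagation.
rng = np.random.default_rng(0)
X = np.array([[0., 0.], [0., 1.], [1., 0.], [1., 1.]])
y = np.array([[0.], [1.], [1.], [0.]])

W1 = rng.normal(0, 1, (2, 4)); b1 = np.zeros((1, 4))
W2 = rng.normal(0, 1, (4, 1)); b2 = np.zeros((1, 1))

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

lr, losses = 0.5, []
for _ in range(5000):
    # forward pass
    h = np.tanh(X @ W1 + b1)
    p = sigmoid(h @ W2 + b2)
    losses.append(float(np.mean(-y * np.log(p) - (1 - y) * np.log(1 - p))))
    # backward pass (chain rule, layer by layer)
    dz = (p - y) / len(X)                 # dL/d(pre-sigmoid activation)
    dW2 = h.T @ dz; db2 = dz.sum(0, keepdims=True)
    dh = dz @ W2.T * (1 - h ** 2)         # back through tanh
    dW1 = X.T @ dh; db1 = dh.sum(0, keepdims=True)
    # gradient descent step
    W1 -= lr * dW1; b1 -= lr * db1
    W2 -= lr * dW2; b2 -= lr * db2

pred = (sigmoid(np.tanh(X @ W1 + b1) @ W2 + b2) > 0.5).astype(int)
```

The point of the XOR example is that no linear model can fit this dataset, while one hidden layer of nonlinear units can.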
PART – A
Q.No  Question  BT Level  Competence
1 Point out the different sets of layers in feed-forward networks. BTL-4 Analyze
2 Point out the default activation function for modern neural networks. BTL-4 Analyze
3 Compare linear models and neural networks. BTL-5 Evaluate
4 Develop three generalizations of rectified linear units based on using a non-zero slope. BTL-6 Create
5 What are Deep Feed Forward networks? BTL-1 Remember
6 List reasonably common hidden unit types. BTL-1 Remember
7 Give the drawback of rectified linear units. BTL-2 Understand
8 Describe gradient descent. BTL-2 Understand
9 Give example of a feed forward neural network. BTL-2 Understand
10 Define chain rule of calculus. BTL-1 Remember
11 List some classification problems where Data augmentation is used. BTL-1 Remember
12 Define universal approximation theorem for feed forward network. BTL-1 Remember
PART – B
1 Describe Deep feed forward networks. (13) BTL-1 Remember
2 i. Explain the cost function in gradient-based learning. (6) BTL-3 Apply
  ii. Explain learning conditional distributions with maximum likelihood. (7)
3 i. Describe learning conditional statistics in gradient-based learning. (7) BTL-1 Remember
  ii. Explain linear units for Gaussian Output Distributions. (6)
4 Explain output units of feed forward networks. (13) BTL-3 Apply
5 i. Explain sigmoid units for Bernoulli Output Distributions. (8) BTL-5 Evaluate
  ii. Justify the importance of Rectified Linear Units among Hidden Units. (5)
6 i. Give Softmax units for Multinoulli Output Distributions. (7) BTL-2 Understand
  ii. Discuss Hidden Units. (6)
7 i. Describe Rectified linear units and their generalizations. (7) BTL-2 Understand
ii. Describe Logistic Sigmoid and Hyperbolic Tangent. (6)
8 i. Write short notes on Radial Basis Function, Softplus and Hard tanh. (7) BTL-4 Analyze
  ii. Write short notes on Architecture Design. (6)
12 Develop a data set and demonstrate Noise Robustness. (13) BTL-6 Create
13 Discuss in detail about chain rule of calculus. (13) BTL-2 Understand
UNIT – II
Convolution Operation – Sparse Interactions – Parameter Sharing – Equivariance – Pooling – Convolution Variants: Strided – Tiled – Transposed and Dilated Convolutions; CNN Learning: Nonlinearity Functions – Loss Functions – Regularization – Optimizers – Gradient Computation.
9 Compare echo state network and liquid state machines. BTL-4 Analyze
10 Distinguish content-based addressing and location-based addressing in memory networks. BTL-2 Understand
11 Classify the different strategies for Multiple Time Scales. BTL-3 Apply
12 Develop block diagram for LSTM. BTL-6 Create
13 Illustrate important design patterns for recurrent neural networks. BTL-3 Apply
14 Summarize about echo state networks. BTL-5 Evaluate
15 Point out the advantage of introducing depth in Deep recurrent Networks. BTL-4 Analyze
16 Compare gradient descent with and without gradient clipping, using a diagram. BTL-4 Analyze
17 Justify the major advantages of the unfolding process in computational graphs. BTL-5 Evaluate
18 Illustrate block diagram of LSTM recurrent network “cell”. BTL-3 Apply
19 What are leaky units? BTL-1 Remember
20 Develop a schematic diagram of a network with an explicit memory. BTL-6 Create
21 Give a block diagram for Long Short Term Memory. BTL-2 Understand
22 Illustrate echo state networks. BTL-3 Apply
23 Explain liquid state machines. BTL-4 Analyze
24 Assess explicit memory. BTL-5 Evaluate
PART – B
1 i. Describe Unfolding Computational Graphs. (6) BTL-1 Remember
ii. Explain Bidirectional RNNs. (7)
2 Describe the following. BTL-1 Remember
i. Teacher Forcing in Recurrent Neural Networks. (7)
ii. Networks with Output Recurrence. (6)
3 i. Describe Echo State Networks. (7) BTL-1 Remember
  ii. Explain the challenge of Long-Term Dependencies. (6)
4 Discuss Recurrent Neural Networks in detail. (13) BTL-2 Understand
5 Describe Deep Recurrent Networks in detail. (13) BTL-2 Understand
6 Illustrate Encoder-Decoder sequence-to-sequence Architecture. (13) BTL-3 Apply
7 Explain Leaky Units and Other Strategies for Multiple Time Scales. (13) BTL-4 Analyze
8 Point out various features of Echo state networks. (13) BTL-4 Analyze
9 Explain Optimization for Long-Term Dependencies. (13) BTL-5 Evaluate
10 Compute the gradient in a Recurrent Neural Network. (13) BTL-6 Create
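As a study aid for the gradient-computation question above, backpropagation through time (BPTT) for a vanilla RNN can be sketched as follows; the sizes, the tanh hidden unit, and the squared-error loss on the final hidden state are illustrative assumptions.

```python
import numpy as np

# Forward pass and BPTT for a vanilla RNN:
#   h_t = tanh(x_t Wx + h_{t-1} Wh + b),  loss = 0.5 ||h_T - target||^2
rng = np.random.default_rng(0)
T, n_in, n_h = 5, 3, 4
Wx = rng.normal(0, 0.5, (n_in, n_h))
Wh = rng.normal(0, 0.5, (n_h, n_h))
b = np.zeros(n_h)
xs = rng.normal(0, 1, (T, n_in))
target = np.ones(n_h)

# forward: keep every hidden state for the backward pass
hs = [np.zeros(n_h)]
for t in range(T):
    hs.append(np.tanh(xs[t] @ Wx + hs[-1] @ Wh + b))
loss = 0.5 * np.sum((hs[-1] - target) ** 2)

# backward: the gradient flows through every time step (the repeated
# multiplication by Wh is what causes vanishing/exploding gradients)
dWx = np.zeros_like(Wx); dWh = np.zeros_like(Wh); db = np.zeros_like(b)
dh = hs[-1] - target                     # dL/dh_T
for t in reversed(range(T)):
    da = dh * (1 - hs[t + 1] ** 2)       # back through tanh
    dWx += np.outer(xs[t], da)
    dWh += np.outer(hs[t], da)
    db += da
    dh = da @ Wh.T                       # pass gradient to h_{t-1}
```

Because `Wh` is shared across time steps, its gradient accumulates a term from every step, which is the accepted answer to "compute the gradient in an RNN".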
PART – A
Q.No  Question  BT Level  Competence
1 What is Probabilistic PCA and Factor Analysis? BTL-1 Remember
2 Define Linear Factor Model. BTL-1 Remember
13 Discuss transfer learning and Domain Adaptation in detail. (13) BTL-3 Apply
14 Describe Distributed Representation. (13) BTL-2 Understand
15 Discuss Slow Feature Analysis. (13) BTL-2 Understand
16 Write about representation learning. (13) BTL-3 Apply
17 Explain Markov random fields. (13) BTL-5 Evaluate
PART – C
1 Develop short notes on Separation and D-Separation. (15) BTL-6 Create
2 Explain Monte Carlo methods. (15) BTL-5 Evaluate
3 Explain Autoencoders. (15) BTL-5 Evaluate
4 Develop various graphs to describe Model Structure. (15) BTL-6 Create
5 Assess Independent Component Analysis. (15) BTL-5 Evaluate
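As a study aid for the Autoencoders question above, a linear autoencoder trained by gradient descent can be sketched as follows; the synthetic data, bottleneck width, learning rate, and iteration count are illustrative assumptions.

```python
import numpy as np

# A linear autoencoder: encoder We (6 -> 2 bottleneck), decoder Wd
# (2 -> 6), trained by gradient descent on the reconstruction error.
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 6)) @ rng.normal(size=(6, 6)) * 0.5
We = rng.normal(0, 0.1, (6, 2))
Wd = rng.normal(0, 0.1, (2, 6))

lr, losses = 0.01, []
for _ in range(2000):
    Z = X @ We                      # encode to the bottleneck
    Xh = Z @ Wd                     # decode (reconstruction)
    err = Xh - X
    losses.append(float(np.mean(err ** 2)))
    # gradients of sum(err^2)/N with respect to each weight matrix
    dXh = 2 * err / len(X)
    dWd = Z.T @ dXh
    dWe = X.T @ (dXh @ Wd.T)
    Wd -= lr * dWd
    We -= lr * dWe
```

With linear units and squared error, the optimal bottleneck spans the same subspace as the top principal components, which is the standard link between autoencoders and PCA in exam answers.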