Question Bank of Deep Learning

The document outlines important examination questions related to deep learning, covering topics such as biological vs. machine vision, neural network architectures, regularization techniques, and recurrent neural networks. It includes questions on specific models like LeNet-5 and word-vector representations, as well as concepts like dataset augmentation and semi-supervised learning. The content is organized into modules that address various aspects of deep learning and natural language processing.



Important Examination Questions: Deep Learning


Module 1: Deep Learning and NLP
1. Explain the similarities and differences between biological vision and machine vision.
How has biological vision inspired the development of neural networks?

2. Describe the architecture and functioning of the Neocognitron. Why is it considered a
precursor to Convolutional Neural Networks (CNNs)?

3. Explain the architecture of LeNet-5 and ImageNet with a neat diagram. Mention their
roles in the history of deep learning and computer vision.

4. What is TensorFlow Playground? How does it help beginners understand deep learning
networks?

5. Define one-hot representation of words. What are its limitations, and how do word
vectors/embeddings overcome them?
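The limitation asked about in question 5 can be seen directly in code. A minimal NumPy sketch (the vocabulary and the random embedding values are illustrative, not learned): one-hot vectors are all mutually orthogonal, so they carry no similarity information, whereas dense embeddings make cosine similarity meaningful.

```python
import numpy as np

vocab = ["king", "queen", "man", "woman"]

# One-hot: each word is a sparse vector the size of the vocabulary.
# All one-hot vectors are orthogonal, so they encode no similarity.
one_hot = {w: np.eye(len(vocab))[i] for i, w in enumerate(vocab)}

# Dense embeddings (random here; in practice learned, e.g. by word2vec)
# place related words near each other in a low-dimensional space.
rng = np.random.default_rng(0)
embedding = {w: rng.normal(size=8) for w in vocab}

def cosine(a, b):
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

print(cosine(one_hot["king"], one_hot["queen"]))   # 0.0 -- orthogonal
print(cosine(embedding["king"], embedding["queen"]))  # some nonzero value
```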

6. Explain the concept of word-vector arithmetic with examples. How does the word2vec
algorithm learn meaningful word representations?
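Word-vector arithmetic from question 6 can be sketched with hand-crafted toy vectors (the two axes, loosely "royalty" and "gender", are an illustrative assumption, not output of word2vec): the classic king − man + woman ≈ queen analogy falls out of simple vector operations.

```python
import numpy as np

# Hand-crafted 2-D toy vectors: axis 0 ~ "royalty", axis 1 ~ "gender".
vecs = {
    "king":  np.array([0.9,  0.8]),
    "queen": np.array([0.9, -0.8]),
    "man":   np.array([0.1,  0.8]),
    "woman": np.array([0.1, -0.8]),
}

# king - man + woman should land near "queen".
result = vecs["king"] - vecs["man"] + vecs["woman"]

def nearest(v, exclude):
    # Closest vocabulary word to v, excluding the query terms.
    return min((w for w in vecs if w not in exclude),
               key=lambda w: np.linalg.norm(vecs[w] - v))

print(nearest(result, {"king", "man", "woman"}))  # queen
```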

7. Differentiate between localist and distributed representations of language. Provide
examples of how deep learning is applied to NLP tasks like classification, machine
translation, chatbots, and search engines.

Module 2: Regularization for Deep Learning


1. What are parameter norm penalties in deep learning? Explain L1 and L2 regularization
with formulas and their impact on weights.
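For question 1, the penalties can be written out in a short sketch (conventions vary: some texts scale the L2 term by λ/2 rather than λ; this sketch uses plain λ coefficients). L1 adds λ₁·Σ|w| and tends to drive weights to exactly zero (sparsity); L2 adds λ₂·Σw² and shrinks weights smoothly toward zero.

```python
import numpy as np

def penalized_loss(data_loss, w, l1=0.0, l2=0.0):
    # L1 penalty: l1 * sum(|w|)  -> encourages sparse weights.
    # L2 penalty: l2 * sum(w^2)  -> shrinks all weights toward zero.
    return data_loss + l1 * np.sum(np.abs(w)) + l2 * np.sum(w ** 2)

w = np.array([0.5, -2.0, 0.0])
print(penalized_loss(1.0, w, l1=0.1))  # 1.0 + 0.1 * 2.5  = 1.25
print(penalized_loss(1.0, w, l2=0.1))  # 1.0 + 0.1 * 4.25 = 1.425
```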

2. How can norm penalties be viewed as a form of constrained optimization? Give a
mathematical explanation.

3. What are under-constrained problems in deep learning? How does regularization help
in addressing them?

4. Define dataset augmentation. Explain its importance in improving model
generalization with suitable examples (e.g., image recognition).

5. What is noise robustness in deep learning? Explain how adding noise to inputs, hidden
units, or weights helps in regularization.

6. What is semi-supervised learning? Discuss its role in deep learning when labeled data
is limited.

7. What is early stopping? Explain how it works as a regularization technique with the
help of a validation curve.
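The mechanism in question 7 can be sketched as a loop over a validation curve (the curve values and the `patience` parameter are illustrative assumptions): training stops once validation loss fails to improve for a fixed number of epochs, and the best epoch seen is kept.

```python
def train_with_early_stopping(train_step, val_loss_fn, patience=3, max_epochs=100):
    # Stop once validation loss fails to improve for `patience` epochs,
    # remembering the best epoch and loss seen so far.
    best_loss, best_epoch, wait = float("inf"), 0, 0
    for epoch in range(max_epochs):
        train_step(epoch)
        loss = val_loss_fn(epoch)
        if loss < best_loss:
            best_loss, best_epoch, wait = loss, epoch, 0
        else:
            wait += 1
            if wait >= patience:
                break
    return best_epoch, best_loss

# Toy validation curve: improves until epoch 4, then overfitting sets in.
curve = [1.0, 0.7, 0.5, 0.4, 0.35, 0.4, 0.45, 0.5, 0.6]
epoch, loss = train_with_early_stopping(lambda e: None, lambda e: curve[e])
print(epoch, loss)  # 4 0.35
```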

Department of Computer Science and Engineering RYMEC Ballari Page 1



8. Differentiate between parameter tying and parameter sharing. Also explain the concept
of sparse representations in deep learning.

9. Describe parameter initialization strategies and adaptive learning rate algorithms
(SGD, Momentum).

Module 3: Convolutional Neural Networks

1. Explain the concept of the convolution operation in Convolutional Neural
Networks (CNNs).
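A minimal NumPy sketch of the operation in question 1 (strictly speaking this is cross-correlation, which is what most deep-learning libraries implement under the name "convolution"; the input and filter values are toy examples):

```python
import numpy as np

def conv2d(image, kernel):
    # "Valid" convolution: slide the kernel over the image, take the
    # sum of elementwise products at each position.
    kh, kw = kernel.shape
    oh, ow = image.shape[0] - kh + 1, image.shape[1] - kw + 1
    out = np.zeros((oh, ow))
    for i in range(oh):
        for j in range(ow):
            out[i, j] = np.sum(image[i:i + kh, j:j + kw] * kernel)
    return out

image = np.arange(16.0).reshape(4, 4)
edge = np.array([[1.0, -1.0]])  # horizontal difference filter
print(conv2d(image, edge))      # 4x3 output, every entry -1.0
```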
2. What is the motivation behind using convolutional layers in deep learning
architectures?
3. Describe pooling in CNNs.
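Pooling, as asked in question 3, can be sketched in a few lines (non-overlapping max pooling only; the input matrix is a toy example): each window is replaced by its maximum, downsampling the feature map and adding a small amount of translation invariance.

```python
import numpy as np

def max_pool(x, size=2):
    # Non-overlapping max pooling: keep the largest value in each
    # size x size window, shrinking each spatial dimension by `size`.
    h, w = x.shape
    x = x[:h - h % size, :w - w % size]           # trim ragged edges
    return x.reshape(h // size, size, w // size, size).max(axis=(1, 3))

x = np.array([[1, 2, 5, 6],
              [3, 4, 7, 8],
              [9, 1, 2, 3],
              [4, 5, 6, 7]], dtype=float)
print(max_pool(x))  # [[4. 8.]
                    #  [9. 7.]]
```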
4. Explain how convolution and pooling act as an ‘infinitely strong prior’ in deep
learning models.
5. List and explain the variants of the basic convolution function.
6. What are structured outputs in CNNs? Provide suitable examples.
7. Discuss the types of data handled by CNNs and how CNN architectures differ for
1D, 2D, and 3D data.
8. Explain efficient convolution algorithms and their impact on deep learning
performance.

Module 4: Recurrent Neural Networks (Half)

1. Explain the difference between recurrent neural networks (RNNs) and feedforward
neural networks. Why are RNNs particularly suited for sequence data?
2. What is a computational graph in the context of RNNs? Explain the process of
unfolding a recurrent network over time.
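The unfolding in question 2 can be sketched as a forward pass (dimensions and random weights are illustrative assumptions): the same weight matrices are reused at every time step, and the hidden state carries information forward, which is exactly what the unrolled computational graph depicts.

```python
import numpy as np

def rnn_forward(inputs, W_xh, W_hh, b):
    # Unfolding over time: one loop iteration per input, always reusing
    # the same W_xh, W_hh, b; the hidden state h links the steps.
    h = np.zeros(W_hh.shape[0])
    states = []
    for x in inputs:
        h = np.tanh(W_xh @ x + W_hh @ h + b)
        states.append(h)
    return states

rng = np.random.default_rng(0)
inputs = [rng.normal(size=3) for _ in range(5)]   # sequence of length 5
W_xh = rng.normal(size=(4, 3))                    # input-to-hidden
W_hh = rng.normal(size=(4, 4))                    # hidden-to-hidden
b = np.zeros(4)
print(len(rnn_forward(inputs, W_xh, W_hh, b)))    # 5 hidden states
```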
3. Describe the vanishing and exploding gradient problems in RNNs. How do they
affect learning in long sequences?
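The effect in question 3 can be illustrated numerically (a deliberately simplified toy: the recurrent Jacobian is taken to be a constant scaled identity, ignoring the tanh nonlinearity): backpropagated gradients are repeatedly multiplied by the same matrix, so they shrink toward zero when its spectral radius is below 1 and blow up when it is above 1.

```python
import numpy as np

def grad_norm_after(steps, scale):
    # Repeatedly apply the same toy recurrent Jacobian (scale * I),
    # mimicking gradient flow through `steps` unrolled time steps.
    W = scale * np.eye(2)
    g = np.ones(2)
    for _ in range(steps):
        g = W @ g
    return float(np.linalg.norm(g))

print(grad_norm_after(50, 0.9))  # vanishing: shrinks toward zero
print(grad_norm_after(50, 1.1))  # exploding: grows without bound
```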
4. What are bidirectional RNNs (BiRNNs)? How do they differ from standard
RNNs, and in what scenarios are they useful?
5. Explain the encoder-decoder architecture used in sequence-to-sequence tasks.
Provide an example application in natural language processing.
6. Compare and contrast recurrent networks and recursive networks. Give examples
of tasks where recursive networks are more suitable.
7. What are the advantages and limitations of using RNNs for modeling long-term
dependencies in sequential data?
8. How do LSTM (Long Short-Term Memory) and GRU (Gated Recurrent
Unit) architectures help solve the gradient vanishing problem in standard
RNNs?
