Module 1 & 2 Questions


Module 1:

1. Define supervised, unsupervised, and semi-supervised learning with one example each.
2. What is shallow learning? How does it differ from deep learning?
3. List any four real-world applications of deep learning.
4. What is a loss function in deep learning? Give an example.
5. Write short notes on:
a) Exploding and vanishing gradients
b) Ill-conditioning in optimization
6. Explain how deep learning eliminates the need for handcrafted feature extraction.
7. Differentiate between empirical risk minimization and surrogate loss functions.
8. Describe the role of backpropagation in deep learning.
9. Why is early stopping used during training of deep learning models? (See the sketch following this list.)
10. Explain with examples how deep learning surpasses conventional machine learning
techniques in feature learning and generalization.
11. Discuss the challenges faced in training deep neural networks and the strategies to
overcome them.
12. “Optimization in deep learning is different from pure mathematical optimization.”
Justify this statement.
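
The following is a minimal NumPy sketch accompanying Questions 4 and 9: it computes a mean-squared-error loss and applies validation-based early stopping on a toy linear-regression problem. The dataset, the linear model, the learning rate, and the patience setting are illustrative assumptions, not a prescribed solution.

```python
# Illustrative sketch for Questions 4 and 9: an MSE loss function and
# early stopping on a toy linear-regression problem (all data and
# hyperparameters here are assumed for illustration only).
import numpy as np

rng = np.random.default_rng(0)

# Toy dataset: y = 3x + noise, split into training and validation parts.
X = rng.uniform(-1, 1, size=(200, 1))
y = 3.0 * X[:, 0] + 0.1 * rng.normal(size=200)
X_train, y_train = X[:150], y[:150]
X_val, y_val = X[150:], y[150:]

w, b = 0.0, 0.0                 # parameters of a single-feature linear model
lr = 0.1                        # learning rate (assumed)
patience, best_val, wait = 10, np.inf, 0

def mse(pred, target):
    """Mean squared error: an example of a loss function (Question 4)."""
    return np.mean((pred - target) ** 2)

for epoch in range(1000):
    # Gradient step on the training MSE with respect to w and b.
    pred = w * X_train[:, 0] + b
    err = pred - y_train
    w -= lr * np.mean(2 * err * X_train[:, 0])
    b -= lr * np.mean(2 * err)

    # Early stopping (Question 9): stop once validation loss stops improving.
    val_loss = mse(w * X_val[:, 0] + b, y_val)
    if val_loss < best_val - 1e-6:
        best_val, wait = val_loss, 0
    else:
        wait += 1
        if wait >= patience:
            print(f"early stop at epoch {epoch}, val MSE = {val_loss:.4f}")
            break
```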

Module 2:

1. List the key contributions of LeNet-5 and AlexNet in CNN evolution.
2. Define convolution and write its general 1D convolution formula.
3. What is the difference between convolution and cross-correlation in CNNs?
4. Mention any three advantages of ResNet over earlier CNN models.
5. Write short notes on:
i. ZFNet and its improvements over AlexNet
ii. The Inception module in GoogLeNet
6. Explain the role of ReLU activation and dropout in AlexNet.
7. Compare VGGNet and GoogLeNet in terms of architecture and efficiency.
8. Why is depthwise separable convolution (used in Xception) more efficient than standard
convolution?
9. Interpret the significance of feature maps in convolutional operations with an example.
10. Given a 5×5 input image and a 3×3 filter, explain the step-by-step convolution process to obtain a feature map (a worked sketch follows this list). (L3)
11. Apply the concept of residual connections to explain how ResNet solves the vanishing gradient problem (a residual-block sketch follows this list). (L3)
12. Discuss the evolution of CNN models from LeNet to ResNet, highlighting the key
improvements at each stage.
13. Explain with examples how convolution helps in detecting edges, corners, and patterns in
images.
14. Describe the commutative property of convolution with reference to 1D and 2D
operations.
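
For Question 10, here is a worked NumPy sketch, assuming a stride of 1, no padding, and a simple vertical-edge filter: sliding the flipped 3×3 kernel over the 5×5 input and summing the element-wise products at each position yields a (5 − 3 + 1) × (5 − 3 + 1) = 3×3 feature map. The same sliding-window sum without flipping the kernel is cross-correlation (Question 3), and the 1D analogue is the formula (x * w)[n] = Σ_k x[k] · w[n − k] asked for in Question 2. The specific input values and filter are assumptions made for illustration.

```python
# Worked sketch for Question 10: 5x5 input, 3x3 filter, stride 1, no padding,
# producing a 3x3 feature map. The input and the filter (a vertical-edge
# detector) are assumed for illustration.
import numpy as np

image = np.array([[1, 1, 1, 0, 0],
                  [1, 1, 1, 0, 0],
                  [1, 1, 1, 0, 0],
                  [1, 1, 1, 0, 0],
                  [1, 1, 1, 0, 0]], dtype=float)

kernel = np.array([[ 1, 0, -1],
                   [ 1, 0, -1],
                   [ 1, 0, -1]], dtype=float)

def convolve2d_valid(img, k):
    """True 2-D convolution: flip the kernel, then slide and sum products.
    Dropping the flip gives cross-correlation, which is what most CNN
    libraries actually compute (Question 3)."""
    kf = np.flip(k)                                  # flip both axes
    out_h = img.shape[0] - kf.shape[0] + 1
    out_w = img.shape[1] - kf.shape[1] + 1
    out = np.zeros((out_h, out_w))
    for i in range(out_h):
        for j in range(out_w):
            patch = img[i:i + kf.shape[0], j:j + kf.shape[1]]
            out[i, j] = np.sum(patch * kf)           # one sliding-window step
    return out

feature_map = convolve2d_valid(image, kernel)
print(feature_map)   # 3x3 feature map; nonzero responses where the 1->0 edge lies
```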
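
For Question 11, a minimal residual-block sketch, assuming PyTorch, an identity shortcut, and illustrative channel and kernel sizes. Because the block outputs F(x) + x, the gradient with respect to x always contains an identity term from the skip path, so useful gradient signal reaches earlier layers even when the gradient through the convolutional path is small.

```python
# Minimal residual block sketch for Question 11 (assumes PyTorch; the channel
# count, kernel size, and identity shortcut are illustrative assumptions).
import torch
import torch.nn as nn
import torch.nn.functional as F

class ResidualBlock(nn.Module):
    def __init__(self, channels: int = 64):
        super().__init__()
        # Two 3x3 convolutions with padding 1 keep the spatial size unchanged,
        # so the input can be added back without any projection.
        self.conv1 = nn.Conv2d(channels, channels, kernel_size=3, padding=1)
        self.bn1 = nn.BatchNorm2d(channels)
        self.conv2 = nn.Conv2d(channels, channels, kernel_size=3, padding=1)
        self.bn2 = nn.BatchNorm2d(channels)

    def forward(self, x):
        residual = x                          # identity shortcut
        out = F.relu(self.bn1(self.conv1(x)))
        out = self.bn2(self.conv2(out))
        # Skip connection: gradients reach x both through the conv path and
        # directly through this addition, which counters vanishing gradients.
        return F.relu(out + residual)

# Example usage with a random feature map of shape (batch, channels, H, W).
block = ResidualBlock(64)
y = block(torch.randn(1, 64, 32, 32))
print(y.shape)   # torch.Size([1, 64, 32, 32])
```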
