
Deep Learning Lab QP Model

The document outlines a model exam for a Deep Learning Laboratory course, detailing 18 practical tasks for students to implement various neural network architectures. Tasks include solving the XOR problem with a DNN, training CNNs for digit and alphabet recognition, and developing RNNs for text generation and sentiment analysis. Additional projects involve using GANs for image generation and applying transfer learning for image classification on different datasets.


SUBJECT NAME: DEEP LEARNING LABORATORY

SUBJECT CODE: AD23521


SEMESTER/YEAR: V / III

MODEL EXAM

1. Implement a Deep Neural Network (DNN) to solve the XOR problem. Visualize the learned decision
boundary.
2. Train a Convolutional Neural Network (CNN) for handwritten digit recognition using the MNIST
dataset and report the test accuracy.
3. Build a Recurrent Neural Network (RNN) for text generation on a small dataset of your choice.
4. Implement a Multi-Layer Perceptron (MLP) to classify the Iris dataset. Compare training and testing
accuracy.
5. Train a CNN to recognize handwritten alphabets using the EMNIST dataset. Display the confusion matrix.
6. Implement a simple RNN to predict the next word in a given text sequence.
7. Develop a CNN-based face recognition system. Use embeddings to classify faces using cosine
similarity or k-NN.
8. Implement an LSTM-based model for sentiment analysis using the IMDB movie review dataset.
Report the accuracy and confusion matrix.
9. Build a Sequence-to-Sequence model for Parts-of-Speech (POS) tagging. Demonstrate predictions on
at least 3 input sentences.
10. Implement an Encoder–Decoder model for machine translation (English → French). Demonstrate
with test sentences.
11. Build a CNN for object recognition on the CIFAR-10 dataset. Show training/validation accuracy curves.
12. Implement a Bi-directional LSTM for text classification. Compare with a vanilla RNN.
13. Construct a Sequence-to-Sequence model for question answering (input: simple question, output:
answer). Demonstrate at least 3 queries.
14. Train a Variational Autoencoder (VAE) on the MNIST dataset. Display original vs. reconstructed
images.
15. Design and implement a Generative Adversarial Network (GAN) for image augmentation. Display
generated synthetic images alongside originals.
16. Apply Transfer Learning using a pre-trained CNN (VGG16/ResNet) for flower image classification.
Compare results with a simple CNN.
17. Design a GAN for handwritten digit generation. Display at least 5 generated synthetic digits.
18. Apply Transfer Learning using MobileNet/ResNet for classifying medical images (X-ray dataset).
Evaluate with precision, recall, and F1-score.
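For Task 1 (the XOR problem), a minimal pure-NumPy sketch of a small 2-4-1 network trained by plain gradient descent is shown below; the hidden size, learning rate, and epoch count are illustrative choices, and any framework (Keras, PyTorch) would serve equally well in the exam.

```python
import numpy as np

# XOR truth table: inputs and targets
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

rng = np.random.default_rng(0)
W1 = rng.normal(scale=1.0, size=(2, 4))   # input -> hidden (4 units)
b1 = np.zeros(4)
W2 = rng.normal(scale=1.0, size=(4, 1))   # hidden -> output
b2 = np.zeros(1)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

lr, losses = 1.0, []
for epoch in range(5000):
    h = sigmoid(X @ W1 + b1)              # forward pass
    out = sigmoid(h @ W2 + b2)
    losses.append(float(((out - y) ** 2).mean()))
    d_out = (out - y) * out * (1 - out)   # backprop through MSE + sigmoid
    d_h = (d_out @ W2.T) * h * (1 - h)
    W2 -= lr * (h.T @ d_out) / len(X)
    b2 -= lr * d_out.mean(axis=0)
    W1 -= lr * (X.T @ d_h) / len(X)
    b1 -= lr * d_h.mean(axis=0)

preds = (sigmoid(sigmoid(X @ W1 + b1) @ W2 + b2) > 0.5).astype(int)
print(preds.ravel())    # usually [0 1 1 0] after training
```

Plotting `sigmoid(sigmoid(grid @ W1 + b1) @ W2 + b2)` over a 2-D grid gives the required decision-boundary visualization.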
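Tasks 3, 6, and 12 all revolve around recurrent networks. The core recurrence of a vanilla RNN cell can be sketched at the shape level as follows; the vocabulary size, hidden size, and token ids are hypothetical, and training (which the tasks require) is left to the student.

```python
import numpy as np

rng = np.random.default_rng(1)
vocab, hidden = 10, 16          # hypothetical vocabulary and hidden sizes
Wxh = rng.normal(scale=0.1, size=(vocab, hidden))   # input -> hidden
Whh = rng.normal(scale=0.1, size=(hidden, hidden))  # hidden -> hidden
Why = rng.normal(scale=0.1, size=(hidden, vocab))   # hidden -> output logits

def rnn_step(x_onehot, h_prev):
    # h_t = tanh(x_t Wxh + h_{t-1} Whh); logits score the next token
    h = np.tanh(x_onehot @ Wxh + h_prev @ Whh)
    return h, h @ Why

h = np.zeros(hidden)
sequence = [2, 5, 1]            # token ids of the context words
for tok in sequence:
    h, logits = rnn_step(np.eye(vocab)[tok], h)

next_word = int(np.argmax(logits))  # untrained weights, so effectively arbitrary
```

A bidirectional LSTM (Task 12) runs two such recurrences, left-to-right and right-to-left, and concatenates their hidden states before classification.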
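Tasks 5 and 8 both ask for a confusion matrix. A small NumPy helper, independent of whichever model produced the predictions, could look like this (the toy labels are made up for illustration):

```python
import numpy as np

def confusion_matrix(y_true, y_pred, num_classes):
    """Rows = true class, columns = predicted class."""
    cm = np.zeros((num_classes, num_classes), dtype=int)
    for t, p in zip(y_true, y_pred):
        cm[t, p] += 1
    return cm

# toy example with 3 classes
y_true = [0, 0, 1, 1, 2, 2]
y_pred = [0, 1, 1, 1, 2, 0]
print(confusion_matrix(y_true, y_pred, 3))
# [[1 1 0]
#  [0 2 0]
#  [1 0 1]]
```

The diagonal counts correct predictions per class; off-diagonal cells show which classes get confused with which.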
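For Task 7, once a CNN has produced face embeddings, the cosine-similarity classification step is straightforward; the 4-D "embeddings" and names below are placeholders for real embedding vectors.

```python
import numpy as np

def cosine_classify(query, gallery, labels):
    """Label a query embedding by its most cosine-similar gallery embedding."""
    q = query / np.linalg.norm(query)
    g = gallery / np.linalg.norm(gallery, axis=1, keepdims=True)
    sims = g @ q                        # cosine similarity to each enrolled face
    return labels[int(np.argmax(sims))]

# toy embeddings for two enrolled identities (hypothetical data)
gallery = np.array([[1.0, 0.0, 0.0, 0.0],
                    [0.9, 0.1, 0.0, 0.0],
                    [0.0, 0.0, 1.0, 0.1]])
labels = ["alice", "alice", "bob"]
query = np.array([0.95, 0.05, 0.0, 0.0])
print(cosine_classify(query, gallery, labels))   # -> alice
```

A k-NN variant would take the majority label among the k largest similarities instead of the single argmax.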
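The key trick behind the VAE in Task 14 is reparameterized sampling of the latent code, which keeps the sampling step differentiable; a NumPy sketch (batch size and latent dimension are arbitrary) is:

```python
import numpy as np

rng = np.random.default_rng(0)

def reparameterize(mu, log_var):
    """Sample z = mu + sigma * eps with eps ~ N(0, I)."""
    eps = rng.standard_normal(mu.shape)
    return mu + np.exp(0.5 * log_var) * eps

mu = np.zeros((4, 2))        # hypothetical batch of 4 latent means
log_var = np.zeros((4, 2))   # log-variance 0 -> unit variance
z = reparameterize(mu, log_var)
print(z.shape)               # (4, 2)
```

In the full model, an encoder network outputs `mu` and `log_var` per image, and a decoder reconstructs the image from `z`; the loss adds a KL-divergence term to the reconstruction error.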
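Task 18 asks for precision, recall, and F1-score; the definitions can be computed directly from true/false positives and false negatives, as in this sketch with toy binary labels (1 = abnormal X-ray, a hypothetical convention):

```python
import numpy as np

def precision_recall_f1(y_true, y_pred, positive=1):
    y_true, y_pred = np.asarray(y_true), np.asarray(y_pred)
    tp = np.sum((y_pred == positive) & (y_true == positive))
    fp = np.sum((y_pred == positive) & (y_true != positive))
    fn = np.sum((y_pred != positive) & (y_true == positive))
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    f1 = 2 * precision * recall / (precision + recall) if precision + recall else 0.0
    return precision, recall, f1

p, r, f = precision_recall_f1([1, 1, 0, 0, 1], [1, 0, 0, 1, 1])
print(round(p, 3), round(r, 3), round(f, 3))   # 0.667 0.667 0.667
```

For multi-class medical data, the same quantities are computed per class and then macro- or weighted-averaged.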
