
SEMESTER-V

COURSE 14B: Deep Learning Fundamentals


Theory | Credits: 3 | 3 hrs/week

Course Objectives:
1. To understand the theoretical foundations and mathematics underlying deep learning
algorithms.
2. To explore neural network architectures and their optimization techniques.
3. To apply convolutional and recurrent neural networks to real-world data.
4. To understand the role of autoencoders, generative models, and transfer learning.
5. To develop practical deep learning models using open-source frameworks such as
TensorFlow or PyTorch.

Course Outcomes:
After successful completion of this course, students will be able to:
1. Explain the fundamental principles of neural networks and deep learning.
2. Implement and train deep neural networks for various data modalities.
3. Analyze and optimize network performance using gradient-based methods.
4. Apply advanced architectures such as CNNs, RNNs, and GANs to solve complex
problems.
5. Design, evaluate, and deploy end-to-end deep learning systems for practical applications.

Unit I: Introduction to Deep Learning and Neural Networks


Introduction to AI and Machine Learning – Motivation for Deep Learning – Biological Neurons
and Artificial Neurons – Perceptrons and Multilayer Perceptrons – Activation Functions – Loss
Functions – Backpropagation and Gradient Descent
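
To make the Unit I ideas concrete, the following is a minimal NumPy sketch of a perceptron with a step activation trained by the classic perceptron learning rule. The AND dataset, learning rate, and epoch count are assumptions chosen only for illustration; they are not prescribed by the syllabus.

    import numpy as np

    def step(z):
        # Step activation: output 1 when the weighted sum is non-negative.
        return np.where(z >= 0, 1, 0)

    # Toy dataset (an assumption for this sketch): logical AND of two binary inputs.
    X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]])
    y = np.array([0, 0, 0, 1])

    w = np.zeros(2)   # weights
    b = 0.0           # bias
    lr = 0.1          # learning rate

    # Perceptron learning rule: nudge weights and bias toward misclassified examples.
    for epoch in range(20):
        for xi, target in zip(X, y):
            error = target - step(xi @ w + b)
            w += lr * error * xi
            b += lr * error

    print(step(X @ w + b))   # expected output: [0 0 0 1]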
Unit II: Optimization and Training Deep Networks
Challenges in Training Deep Networks – Vanishing and Exploding Gradients – Regularization
(Dropout, Batch Normalization) – Weight Initialization – Optimizers (SGD, Adam, RMSProp) –
Hyperparameter Tuning and Model Evaluation
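
A minimal Keras sketch of the Unit II training machinery: He weight initialization, batch normalization, dropout, and the Adam optimizer. The layer sizes, input width, and random placeholder data are assumptions made only for this example.

    import numpy as np
    from tensorflow import keras
    from tensorflow.keras import layers

    model = keras.Sequential([
        layers.Input(shape=(20,)),                     # assumed 20 input features
        layers.Dense(64, activation="relu", kernel_initializer="he_normal"),
        layers.BatchNormalization(),                   # stabilize activations
        layers.Dropout(0.3),                           # regularization
        layers.Dense(64, activation="relu", kernel_initializer="he_normal"),
        layers.Dropout(0.3),
        layers.Dense(1, activation="sigmoid"),         # binary output
    ])

    model.compile(optimizer=keras.optimizers.Adam(learning_rate=1e-3),
                  loss="binary_crossentropy",
                  metrics=["accuracy"])

    # Random placeholder data, assumed only to show the training and validation call.
    X = np.random.rand(256, 20)
    y = np.random.randint(0, 2, size=(256,))
    model.fit(X, y, validation_split=0.2, epochs=5, batch_size=32, verbose=0)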
Unit III: Convolutional Neural Networks (CNNs)
Convolution and Pooling Operations – CNN Architectures (LeNet, AlexNet, VGG, ResNet,
Inception) – Transfer Learning – Visualization and Interpretation – Applications in Image
Recognition and Computer Vision
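
A minimal convolutional network sketch in Keras for the Unit III topics. The 28x28 grayscale input shape, filter counts, and 10-class softmax output are assumptions for illustration, not a prescribed architecture.

    from tensorflow import keras
    from tensorflow.keras import layers

    cnn = keras.Sequential([
        layers.Input(shape=(28, 28, 1)),           # e.g. grayscale digit images (assumed)
        layers.Conv2D(32, kernel_size=3, activation="relu"),
        layers.MaxPooling2D(pool_size=2),
        layers.Conv2D(64, kernel_size=3, activation="relu"),
        layers.MaxPooling2D(pool_size=2),
        layers.Flatten(),
        layers.Dense(64, activation="relu"),
        layers.Dense(10, activation="softmax"),    # 10 output classes (assumed)
    ])

    cnn.compile(optimizer="adam",
                loss="sparse_categorical_crossentropy",
                metrics=["accuracy"])
    cnn.summary()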
Unit IV: Sequential Models and Recurrent Neural Networks (RNNs)
Sequence Modeling – RNNs and LSTMs – GRUs – Attention Mechanisms – Transformers
Overview – Applications in Natural Language Processing (Text, Speech)
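
A minimal Keras sketch of a recurrent sequence classifier for Unit IV. The vocabulary size, sequence length, embedding width, and binary label are assumptions; an LSTM layer is shown, and a GRU could be swapped into the same position.

    from tensorflow import keras
    from tensorflow.keras import layers

    rnn = keras.Sequential([
        layers.Input(shape=(100,), dtype="int32"),     # sequences of 100 token ids (assumed)
        layers.Embedding(input_dim=10000, output_dim=64),
        layers.LSTM(64),                               # recurrent layer; GRU(64) also works here
        layers.Dense(1, activation="sigmoid"),         # e.g. binary sentiment label (assumed)
    ])

    rnn.compile(optimizer="adam",
                loss="binary_crossentropy",
                metrics=["accuracy"])
    rnn.summary()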
Unit V: Generative and Advanced Deep Learning Models
Autoencoders – Variational Autoencoders (VAEs) – Generative Adversarial Networks (GANs) –
Reinforcement Learning Basics – Ethical and Societal Implications of Deep Learning
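
A minimal Keras autoencoder sketch for Unit V, compressing 784-dimensional inputs to a 32-dimensional code and reconstructing them. The dimensions and loss are assumptions for illustration; variational and adversarial models build on the same encoder–decoder idea.

    from tensorflow import keras
    from tensorflow.keras import layers

    inputs = keras.Input(shape=(784,))                            # e.g. flattened 28x28 images (assumed)
    encoded = layers.Dense(32, activation="relu")(inputs)         # encoder: 784 -> 32
    decoded = layers.Dense(784, activation="sigmoid")(encoded)    # decoder: 32 -> 784

    autoencoder = keras.Model(inputs, decoded)
    autoencoder.compile(optimizer="adam", loss="binary_crossentropy")

    # Training pairs each input with itself, e.g.:
    # autoencoder.fit(x_train, x_train, epochs=10, batch_size=128)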

Text Books:

1. Fundamentals of Deep Learning (2nd Edition) by Nikhil Buduma, Nithin Buduma, and Joe Papa, O’Reilly Media.
2. Programming Neural Networks with Python by Joachim Steinwender and Roland Schwaiger, Rheinwerk Computing.
SEMESTER-V
COURSE 14B: Deep Learning Fundamentals
Practical | Credits: 1 | 2 hrs/week

No. | Practical Title | Objective | Key Concepts / Tools
1 | Implement a Perceptron from Scratch | Understand the basic neuron model and activation functions | NumPy, dot product, step function
2 | Train a Simple Neural Network for AND/OR Gates | Learn supervised learning and error correction | Feedforward, weight updates, loss function
3 | Handwritten Digit Recognition using MNIST | Implement your first deep neural network | Keras/TensorFlow, softmax, dense layers
4 | Visualize Activation Functions | Compare sigmoid, tanh, and ReLU behaviors | Matplotlib visualization, activation study
5 | Build a Multi-Layer Perceptron (MLP) for Classification | Introduce hidden layers and backpropagation | Dense networks, batch training
6 | Implement Gradient Descent from Scratch (see the sketch after this table) | Learn the optimization concept manually | Mean squared error, learning rate tuning
7 | Image Classification with Convolutional Neural Networks | Learn CNN architecture and filters | Conv2D, MaxPooling, Flatten layers
8 | Train a Neural Network with Dropout Regularization | Prevent overfitting and improve generalization | Dropout layers, training/testing accuracy
9 | Predict House Prices Using a Regression Neural Network | Apply deep learning to regression problems | Normalization, linear output layer
10 | Implement a Basic Reinforcement Learning Agent | Understand neural networks in decision-making | Q-learning, reward function, exploration
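
As an illustration of Practical 6 above, here is a minimal sketch of gradient descent from scratch for a one-variable linear model under mean squared error. The toy data, learning rate, and step count are assumptions chosen only for demonstration.

    import numpy as np

    # Toy data (assumed): roughly y = 2x + 1 with a little noise.
    rng = np.random.default_rng(0)
    x = np.linspace(0, 1, 50)
    y = 2 * x + 1 + 0.05 * rng.normal(size=x.shape)

    w, b = 0.0, 0.0      # parameters to learn
    lr = 0.5             # learning rate (tune this and watch the loss)

    for it in range(200):
        y_pred = w * x + b
        error = y_pred - y
        loss = np.mean(error ** 2)            # mean squared error
        grad_w = 2 * np.mean(error * x)       # dL/dw
        grad_b = 2 * np.mean(error)           # dL/db
        w -= lr * grad_w
        b -= lr * grad_b

    print(f"w ~ {w:.2f}, b ~ {b:.2f}, final MSE ~ {loss:.4f}")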
