MACHINE LEARNING - NPTEL
Week 1: Introduction: Basic definitions, types of learning, hypothesis space and
inductive bias, evaluation, cross-validation
Week 2: Linear regression, Decision trees, overfitting
Week 3: Instance-based learning, Feature reduction, Collaborative filtering-based
recommendation
Week 4: Probability and Bayes learning
Week 5: Logistic Regression, Support Vector Machine, Kernel function and Kernel
SVM
Week 6: Neural network: Perceptron, multilayer network, backpropagation,
introduction to deep neural network
Week 7: Computational learning theory, PAC learning model, Sample complexity,
VC Dimension, Ensemble learning
Week 8: Clustering: k-means, adaptive hierarchical clustering, Gaussian mixture
model
SYLLABUS: IIT Madras
Basic Maths : Probability, Linear Algebra, Convex Optimization
Background: Statistical Decision Theory, Bayesian Learning (ML, MAP, Bayes
estimates, Conjugate priors)
Regression : Linear Regression, Ridge Regression, Lasso
Dimensionality Reduction : Principal Component Analysis, Partial Least Squares
Classification : Linear Classification, Logistic Regression, Linear Discriminant
Analysis, Quadratic Discriminant Analysis, Perceptron, Support Vector Machines +
Kernels, Artificial Neural Networks + BackPropagation, Decision Trees, Bayes
Optimal Classifier, Naive Bayes.
Evaluation measures : Hypothesis testing
Ensemble Methods : Bagging, AdaBoost, Gradient Boosting
Clustering : K-means, K-medoids, Density-based, Hierarchical, Spectral
Miscellaneous topics: Expectation Maximization, GMMs, Learning theory, Intro to
Reinforcement Learning
Graphical Models: Bayesian Networks.
SYLLABUS: Jadavpur
Introduction: What is machine learning? Applications of machine learning, types
of machine learning with examples: supervised learning, unsupervised learning,
semi-supervised learning, reinforcement learning, learning as search [2L]
Learning from examples: training data representation, test data, output vector
representation, hypothesis representation, hypothesis space, inductive bias,
problem of generalization, more-specific and more-general hypotheses,
VC dimension, PAC learning, how noise affects learning. [4L]
Decision tree learning: ID3 algorithm with real-life examples, overfitting, handling
continuous attributes and missing attributes [4L]
Bayesian Learning: Bayesian decision theory, Bayesian classification, losses, risks,
discriminant functions [2L]
Linear regression and logistic regression: regression vs. classification, hypothesis
representation, cost function, logistic function, derivation of the gradient descent
algorithm (sketched below), learning multiple classes [4L]
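A hedged sketch of the gradient descent derivation named above, using standard assumed notation (logistic hypothesis h_theta, m training examples, learning rate alpha); this is a common textbook form, not text taken from the syllabus:

\[
h_\theta(x) = \sigma(\theta^\top x) = \frac{1}{1 + e^{-\theta^\top x}}, \qquad
J(\theta) = -\frac{1}{m}\sum_{i=1}^{m}\Big[ y^{(i)}\log h_\theta(x^{(i)}) + \big(1 - y^{(i)}\big)\log\big(1 - h_\theta(x^{(i)})\big) \Big]
\]
\[
\theta_j \leftarrow \theta_j - \frac{\alpha}{m}\sum_{i=1}^{m}\big( h_\theta(x^{(i)}) - y^{(i)} \big)\, x_j^{(i)}
\]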
Support Vector Machines: linear Support Vector Machine and a brief introduction
to kernel machines. Multi-class SVM: one-vs.-all strategy. [4L]
Instance-based learning: k-nearest neighbour (KNN) classifier, curse of
dimensionality, when to use KNN? [2L]
Performance measures for machine learning algorithms: confusion matrix;
evaluation measures: accuracy, error rate, precision, recall, F-measure, etc.;
bootstrapping & cross-validation, ROC curve (formulas sketched below) [2L]
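For quick reference, the measures listed above can be written in terms of the confusion-matrix counts TP, TN, FP and FN; the F-measure is shown in its common F1 form (a standard summary, assumed rather than quoted from the syllabus):

\[
\text{Accuracy} = \frac{TP + TN}{TP + TN + FP + FN}, \qquad
\text{Error rate} = 1 - \text{Accuracy}
\]
\[
\text{Precision} = \frac{TP}{TP + FP}, \qquad
\text{Recall} = \frac{TP}{TP + FN}, \qquad
F_1 = \frac{2\,\text{Precision}\cdot\text{Recall}}{\text{Precision} + \text{Recall}}
\]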
Model selection procedures: overfitting, regularization, model complexity,
bias/variance dilemma [2L]
Artificial neural networks: introduction to artificial neural networks,
backpropagation algorithm, introduction to deep neural networks with real-life
examples [6L]
Unsupervised learning: clustering with distance-based and probabilistic models
[5L]
Ensemble learning: boosting, bagging and random forests [2L]
Introduction to modern machine learning tools and packages such as WEKA under
the Java platform and/or Scikit-learn under the Python platform and/or machine
learning packages under the R platform (a minimal Scikit-learn sketch follows). [
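Since the final item names Scikit-learn, here is a minimal sketch of the kind of end-to-end workflow the course points at; the dataset, estimator and parameter choices are illustrative assumptions, not prescribed by the syllabus:

```python
# Illustrative Scikit-learn sketch: split data, fit a decision tree,
# evaluate with accuracy, a confusion matrix and cross-validation.
from sklearn.datasets import load_iris
from sklearn.model_selection import train_test_split, cross_val_score
from sklearn.tree import DecisionTreeClassifier
from sklearn.metrics import accuracy_score, confusion_matrix

X, y = load_iris(return_X_y=True)  # example dataset (assumption)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, random_state=0)

# Decision-tree classifier (scikit-learn implements CART rather than ID3)
clf = DecisionTreeClassifier(max_depth=3, random_state=0)
clf.fit(X_train, y_train)

y_pred = clf.predict(X_test)
print("accuracy:", accuracy_score(y_test, y_pred))
print("confusion matrix:\n", confusion_matrix(y_test, y_pred))

# 5-fold cross-validation, as covered under evaluation and model selection
print("cv accuracy:", cross_val_score(clf, X, y, cv=5).mean())
```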