
BIRLA INSTITUTE OF TECHNOLOGY & SCIENCE, PILANI

WORK INTEGRATED LEARNING PROGRAMMES – Digital

Part A: Content Design
Course Title        Probabilistic Graphical Model
Course No(s)        AIML*ZG526
Credit Units        4

Credit Model        1 - 0.5 - 1.5
                    1 unit for classroom hours, 0.5 unit for tutorial, 1.5 units
                    for student preparation. 1 unit = 32 hours.

Content Authors     Ms. Seetha Parameswaran

Version             1.0

Date                June 22nd, 2019

Course Objectives

By the end of this course, students will be able to:

No Course Objective

CO1 Apply advanced concepts of probability theory and graph theory to construct and analyze
complex structured probabilistic models, including Bayesian Networks and Markov
Networks.

CO2 Analyze the independence properties and factorization capabilities of directed and
undirected graphical models using techniques such as d-separation and Gibbs distributions.

CO3 Evaluate the effectiveness of various exact and approximate inference algorithms for
probabilistic graphical models, considering factors such as computational complexity and
accuracy.

CO4 Design and implement parameter learning and structure learning algorithms for both
Bayesian Networks and Markov Networks, addressing challenges in different real-world
scenarios.

CO5 Critically assess the strengths and limitations of different structured probabilistic models
and inference techniques, and propose innovative solutions to overcome their limitations
in complex problem domains.

Text Book(s)
T1 Mastering Probabilistic Graphical Models using Python by Ankur Ankan, Abhinash
Panda. Packt Publishing 2015.

T2 Building Probabilistic Graphical Models with Python by Kiran R Karkera. Packt
Publishing 2014.

Reference Book(s) & other resources

R1 Probabilistic Graphical Models: Principles and Techniques by Daphne Koller and
Nir Friedman. MIT Press 2009.

R2 Learning in Graphical Models by Michael I. Jordan. MIT Press 1999.

Content Structure
1. Introduction
1.1. Objective of the course
1.2. Structured Probabilistic Models

2. Mathematical Preliminaries
2.1. Probability Theory
2.2. Graphs

3. Directed Graphical Models
3.1. Bayesian Networks
3.2. D-separation
3.3. I-map

4. Undirected Graphical Models
4.1. Markov Networks
4.2. Gibbs distributions
4.3. Factorization

5. Exact Inference
5.1. Variable Elimination
5.2. Belief Propagation
5.3. MAP using belief propagation

6. Approximate Inference
6.1. Propagation-based approximation algorithm
6.2. Loopy Belief Propagation
6.3. Sampling-based approximate methods
6.4. Markov chain Monte Carlo methods

7. Parameter Learning
7.1. Parameter Estimation in Bayesian Networks
7.2. Maximum Likelihood Estimation
7.3. Parameter Estimation in Markov Networks

8. Structure Learning
8.1. Structure learning in Bayesian Networks
8.2. Constraint-based structure learning
8.3. Score-based structure learning

9. Models
9.1. Naïve Bayes Model
9.2. Hidden Markov Model

Learning Outcomes:

Upon successful completion of this course, students will be able to:


No Learning Outcomes

LO1 Construct and manipulate complex Bayesian and Markov Networks to model real-
world probabilistic relationships, demonstrating proficiency in applying graph
theory and probability concepts.

LO2 Analyze the independence properties of given graphical models using d-
separation and factorization techniques, and interpret their implications for
probabilistic inference.

LO3 Implement and compare the performance of exact and approximate inference
algorithms (such as Belief Propagation and Markov Chain Monte Carlo methods)
on various structured probabilistic models.

LO4 Design and execute parameter and structure learning algorithms for both
directed and undirected graphical models, critically evaluating their effectiveness
on different types of data.

LO5 Develop innovative applications of structured probabilistic models (such as
advanced Naïve Bayes or Hidden Markov Models) to solve complex problems in
areas like pattern recognition and natural language processing.

Part B: Learning Plan

Academic Term
Course Title        Probabilistic Graphical Model
Course No           AIML*ZG526
Lead Instructor

Session Plan (each session lists the topic title, contents, and the study / HW
resource reference):

Session 1: Introduction
Objective of the course, Structured Probabilistic Models, Representation,
Inference, Learning, Application of Probabilistic Graphical Models.
Study / HW Resource: R1 – Ch1

Session 2: Mathematical Preliminaries
Probability theory, Probability Distributions, Random Variables and Joint
Distributions, Independence and Conditional Independence, Expectation and
Variance. Graphs, Nodes and Edges, Subgraphs, Paths and Trails, Cycles and
Loops.
Study / HW Resource: T1 – Ch1, T2 – Ch1

Session 3: Directed Graphical Models
Independence and independent parameters, Bayesian models, Representation,
Factorization of a distribution over a network, Bayesian model representation.
Study / HW Resource: T1 – Ch1

Session 4: Directed Graphical Models (contd.)
D-separation, I-map, I-map to factorization, CPD representations,
Implementing Bayesian networks using pgmpy (see the sketch below).
Study / HW Resource: T1 – Ch1
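A minimal sketch of the kind of pgmpy construction this session works through.
The two-node network, state counts, and probabilities below are hypothetical;
recent pgmpy releases name the class BayesianNetwork, older ones BayesianModel.

    from pgmpy.models import BayesianNetwork
    from pgmpy.factors.discrete import TabularCPD

    # Toy network: Rain -> WetGrass, both binary (structure and numbers made up).
    model = BayesianNetwork([('Rain', 'WetGrass')])

    # P(Rain) and P(WetGrass | Rain); each column of a CPD must sum to 1.
    cpd_rain = TabularCPD('Rain', 2, [[0.8], [0.2]])
    cpd_wet = TabularCPD('WetGrass', 2,
                         [[0.9, 0.2],   # P(WetGrass=0 | Rain=0), P(WetGrass=0 | Rain=1)
                          [0.1, 0.8]],  # P(WetGrass=1 | Rain=0), P(WetGrass=1 | Rain=1)
                         evidence=['Rain'], evidence_card=[2])

    model.add_cpds(cpd_rain, cpd_wet)
    assert model.check_model()  # checks CPD shapes and normalization against the graph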

Session 5: Undirected Graphical Models
Markov network, Parameterizing a Markov network – factor, Factor operations,
Gibbs distributions and Markov networks, Factor graph.
Study / HW Resource: T1 – Ch2

Session 6: Undirected Graphical Models (contd.)
Independencies in Markov networks, Constructing graphs from distributions,
Bayesian and Markov networks (a factor-parameterization sketch follows).
Study / HW Resource: T1 – Ch2
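A hedged sketch of parameterizing a Markov network with factors, in the spirit
of these two sessions. The graph, cardinalities, and potential values are
hypothetical; older pgmpy versions name the class MarkovModel rather than
MarkovNetwork.

    from pgmpy.models import MarkovNetwork
    from pgmpy.factors.discrete import DiscreteFactor

    # Toy pairwise network A - B - C with one factor per edge.
    mn = MarkovNetwork([('A', 'B'), ('B', 'C')])
    phi_ab = DiscreteFactor(['A', 'B'], cardinality=[2, 2], values=[30, 5, 1, 10])
    phi_bc = DiscreteFactor(['B', 'C'], cardinality=[2, 2], values=[100, 1, 1, 100])
    mn.add_factors(phi_ab, phi_bc)
    assert mn.check_model()

    # The Gibbs distribution is the normalized product of all the factors.
    joint = phi_ab.product(phi_bc, inplace=False)
    joint.normalize()
    print(joint)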

Session 7: Exact Inference
Variable elimination, Belief propagation, Constructing a clique tree, MAP using
variable elimination, Factor maximization, MAP using belief propagation,
Finding the most probable assignment, Predictions from the model using pgmpy
(see the sketch below).
Study / HW Resource: T1 – Ch3
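Continuing the toy network from the Session 4 sketch, exact inference in pgmpy
looks roughly like this (the query variable and evidence are hypothetical):

    from pgmpy.inference import VariableElimination, BeliefPropagation

    # Posterior P(Rain | WetGrass = 1) by variable elimination.
    infer = VariableElimination(model)
    print(infer.query(variables=['Rain'], evidence={'WetGrass': 1}))

    # MAP assignment via the clique-tree / belief-propagation machinery.
    bp = BeliefPropagation(model)
    print(bp.map_query(variables=['Rain'], evidence={'WetGrass': 1}))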

Session 8: Review of Sessions 1 to 7
Study / HW Resource: Books, Web references and Slides
Session 9: Approximate Inference
Exact inference as an optimization, Propagation-based approximation
algorithm, Loopy belief propagation, Propagation with approximate messages,
Sampling-based approximate methods, Markov chain Monte Carlo methods,
Using a Markov chain (a sampling sketch follows).
Study / HW Resource: T1 – Ch4, T2 – Ch7
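One of the MCMC methods from this session, Gibbs sampling, ships with pgmpy;
a minimal sketch reusing the toy network built above. The sample size and the
estimated quantity are illustrative only.

    from pgmpy.sampling import GibbsSampling

    gibbs = GibbsSampling(model)
    samples = gibbs.sample(size=1000)  # pandas DataFrame, one row per sample
    print(samples['Rain'].mean())      # Monte Carlo estimate of P(Rain = 1)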

Session 10: Parameter Learning
General ideas in learning, Learning as an optimization, Maximum likelihood
estimation, Parameter Estimation in Bayesian Networks, MLE for Bayesian
networks.
Study / HW Resource: T1 – Ch5, Ch6; T2 – Ch5

Session 11: Parameter Learning (contd.)
Parameter Estimation in Markov Networks, MLE for Markov models (see the
sketch below).
Study / HW Resource: T1 – Ch5, Ch6; T2 – Ch5
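A minimal sketch of maximum likelihood parameter estimation in pgmpy, assuming
fully observed data; the DataFrame contents below are made up.

    import pandas as pd
    from pgmpy.models import BayesianNetwork
    from pgmpy.estimators import MaximumLikelihoodEstimator

    # Hypothetical complete data over the two toy variables.
    data = pd.DataFrame({'Rain':     [0, 0, 1, 1, 0, 1, 0, 0],
                         'WetGrass': [0, 0, 1, 1, 0, 0, 1, 0]})

    model = BayesianNetwork([('Rain', 'WetGrass')])
    model.fit(data, estimator=MaximumLikelihoodEstimator)  # CPDs from relative counts
    for cpd in model.get_cpds():
        print(cpd)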

Session 12: Structure Learning
Structure learning in Bayesian networks, Methods for learning structure,
Constraint-based structure learning, Structure score learning, Bayesian score
for Bayesian networks.
Study / HW Resource: T1 – Ch5, Ch6; T2 – Ch4

Session 13: Structure Learning (contd.)
Structure learning in Markov models, Constraint-based structure learning,
Structure score learning (see the sketch below).
Study / HW Resource: T1 – Ch5, Ch6; T2 – Ch4
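A hedged sketch of score-based structure learning with pgmpy's hill-climbing
search and the BIC score, reusing the hypothetical DataFrame from the MLE
sketch above; note that this part of the pgmpy API has shifted across versions.

    from pgmpy.estimators import HillClimbSearch, BicScore

    hc = HillClimbSearch(data)
    dag = hc.estimate(scoring_method=BicScore(data))
    print(dag.edges())  # edges of the highest-scoring DAG found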

Session 14: Naïve Bayes Model, Implementation
Study / HW Resource: T1 – Ch7

Session 15: Hidden Markov Model, Implementation (a forward-algorithm sketch
follows)
Study / HW Resource: T1 – Ch7
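As a library-independent illustration of the Hidden Markov Model material, a
minimal numpy sketch of the forward recursion for computing the likelihood of
an observation sequence; all states, symbols, and probabilities are hypothetical.

    import numpy as np

    A = np.array([[0.7, 0.3],        # A[i, j] = P(z_t = j | z_{t-1} = i)
                  [0.4, 0.6]])
    B = np.array([[0.5, 0.4, 0.1],   # B[i, k] = P(x_t = k | z_t = i)
                  [0.1, 0.3, 0.6]])
    pi = np.array([0.6, 0.4])        # initial state distribution

    def forward(obs):
        """Return P(x_1, ..., x_T) via the forward variables alpha_t(i)."""
        alpha = pi * B[:, obs[0]]
        for o in obs[1:]:
            alpha = (alpha @ A) * B[:, o]
        return alpha.sum()

    print(forward([0, 1, 2]))  # likelihood of observing symbols 0, 1, 2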

Session 16: Review of Sessions 9 to 15
Study / HW Resource: Books, Web references and Slides

Detailed Plan for Lab Work

Lab No.   Lab Objective                             Session Reference
1         Bayesian model representation             4
2         Markov model representation               6
3         MAP on Bayesian model                     7
4         MLE on Bayesian model                     10
5         MLE on Markov model                       11
6         Learning structure in Bayesian model      12
Evaluation Scheme:
Legend: EC = Evaluation Component; AN = After Noon Session; FN = Fore Noon Session

No     Name                 Type          Duration   Weight   Day, Date, Session, Time
EC-1   Quizzes              Online                   10%
EC-2   Assignments          Take Home                20%
EC-3   Mid-Semester Test    Closed Book   1.5 Hrs    30%
EC-4   Comprehensive Exam   Open Book     2.5 Hrs    40%

Note:
Syllabus for Mid-Semester Test (Closed Book): Topics in Session Nos. 1 to 8
Syllabus for Comprehensive Exam (Open Book): All topics (Session Nos. 1 to 16)

Important links and information:

Elearn portal: https://elearn.bits-pilani.ac.in


Students are expected to visit the Elearn portal on a regular basis and stay up to date with
the latest announcements and deadlines.

Contact sessions: Students should attend the online lectures as per the schedule provided
on the Elearn portal.

Evaluation Guidelines:
1. EC-1 consists of two Quizzes. Students will attempt them through the course pages on
the Elearn portal. Announcements will be made on the portal in a timely manner.
2. EC-2 consists of either one or two Assignments. Students will attempt them through
the course pages on the Elearn portal. Announcements will be made on the portal in
a timely manner.
3. For Closed Book tests: No books or reference material of any kind will be permitted.
4. For Open Book exams: Use of books and any printed / written reference material
(filed or bound) is permitted. However, loose sheets of paper will not be allowed.
Use of calculators is permitted in all exams. Laptops/Mobiles of any kind are not
allowed. Exchange of any material is not allowed.
5. If a student is unable to appear for the Regular Test/Exam due to genuine exigencies,
the student should follow the procedure to apply for the Make-Up Test/Exam which
will be made available on the Elearn portal. The Make-Up Test/Exam will be
conducted only at selected exam centres on the dates to be announced later.

It shall be the responsibility of the individual student to be regular in maintaining the self-
study schedule as given in the course hand-out, attend the online lectures, and take all the
prescribed evaluation components such as Assignment/Quiz, Mid-Semester Test and
Comprehensive Exam according to the evaluation scheme provided in the hand-out.
