
Programming for AI (AI 504)

Fall 2025

Course Aim
Learn and practice essential programming skills for conducting machine learning and deep learning
research. Each week, one class will be dedicated to theory and the other to hands-on practice. We will
cover various topics including:
NumPy, Scikit-Learn, PyTorch Basics, Autoencoders, VAE, GAN, RNN, CNN, Transformer, Language
Models, Diffusion Models.

Lectures
Tuesday, Thursday 10:30am-12:00pm.
Both online & offline offered:
●​ Offline lecture at Kins Tower, Seongnam-si
●​ Online lecture using Zoom (https://kaist.zoom.us/j/7425492761)

Textbook
No textbook. Reading material will be uploaded if necessary.

Instructor
Edward Choi, Associate Professor, Graduate School of AI, College of Engineering.
Email: [email protected]
Webpage: https://mp2893.com

Teaching Assistants
TBA

Grading
Pass/Fail
●​ Project 1: 30%
●​ Project 2: 30%
●​ Project 3: 40%

Weekly Plan
Each week will consist of one theory session and one practice session.
1.​ Intro + Numpy
2.​ Basic Machine Learning + Scikit-learn
3.​ PyTorch (Autograd) + Logistic Regression + Multi-layer Perceptron
4.​ Autoencoders (& Denoising Autoencoders)
5.​ Variational Autoencoders
6.​ Generative Adversarial Networks
7.​ Convolutional Neural Networks
8.​ Project 1: Image synthesis
9.​ Word2Vec + Subword Encoding
10.​ Recurrent Neural Networks & Sequence-to-Sequence
11.​ Transformers
12.​ BERT & GPT
13.​ Project 2: Language Model
14.​ Denoising Diffusion Probabilistic Models
15.​ Image-Text Multi-modal Learning
16.​ Project 3: Vision-Language Model

Weekly Course Plan


●​ Week 1
○​ Lecture
■​ Introduction
■​ Jupyter Notebook Install
○​ Practice (see sketch below)
■​ Numpy
■​ Explain vectors, matrices, tensors
■​ Indexing
■​ Element-wise operation
■​ Dot product
■​ Reshape
■​ Concat
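As a taste of the Week 1 practice above, here is a minimal NumPy sketch covering indexing, element-wise operations, dot products, reshaping, and concatenation; the array shapes are arbitrary and the actual exercises may differ.

```python
import numpy as np

x = np.arange(12).reshape(3, 4)            # reshape a 1-D range into a 3x4 matrix
row, col = x[1], x[:, 2]                   # indexing: second row, third column
doubled = x * 2                            # element-wise operation
prod = x @ np.ones((4, 2))                 # dot product: (3, 4) @ (4, 2) -> (3, 2)
stacked = np.concatenate([x, x], axis=0)   # concat along rows -> (6, 4)
print(row, col, doubled.shape, prod.shape, stacked.shape)
```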
●​ Week 2
○​ Lecture
■​ Supervised, Unsupervised, RL
■​ Optimization
■​ Evaluation metric
■​ Train/Dev/Test
■​ Over-fitting, under-fitting
■​ Regularization
■​ Curse of dimensionality
■​ Classifiers
○​ Practice (see sketch below)
■​ Scikit-learn
■​ Matplotlib
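To illustrate the Week 2 practice above, a minimal scikit-learn sketch with a train/test split, a classifier, and an evaluation metric. The dataset and model choice (Iris, logistic regression) are placeholders, not the course's actual exercise.

```python
from sklearn.datasets import load_iris
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

X, y = load_iris(return_X_y=True)
# Hold out a test set to measure generalization (train/dev/test from the lecture).
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=0)

clf = LogisticRegression(max_iter=1000).fit(X_train, y_train)
print("test accuracy:", accuracy_score(y_test, clf.predict(X_test)))
```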
●​ Week 3
○​ Lecture
■​ PyTorch Intro
■​ Logistic Regression
■​ Multi-layer perceptron
○​ Practice (see sketch below)
■​ Playing with PyTorch
■​ Linear regression with 2-D samples
■​ MNIST classification with Logistic Regression and MLP
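A minimal PyTorch sketch of the Week 3 practice: an MLP classifier trained with autograd. Random tensors stand in for flattened MNIST images, so shapes and hyperparameters are illustrative only.

```python
import torch
import torch.nn as nn

model = nn.Sequential(nn.Linear(784, 128), nn.ReLU(), nn.Linear(128, 10))  # MLP
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.CrossEntropyLoss()

x = torch.randn(64, 784)             # stand-in for a batch of flattened 28x28 images
y = torch.randint(0, 10, (64,))      # stand-in class labels

for step in range(5):
    loss = loss_fn(model(x), y)
    optimizer.zero_grad()
    loss.backward()                  # autograd computes all gradients
    optimizer.step()
    print(step, loss.item())
```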
●​ Week 4
○​ Lecture
■​ Autoencoders
■​ Dimension reduction
■​ Denoising autoencoder
○​ Practice (see sketch below)
■​ Build Autoencoder & Denoising Autoencoder
■​ Dimension reduction with MNIST ⇒ t-SNE
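A minimal denoising-autoencoder step in the spirit of the Week 4 practice, using random tensors in place of MNIST; the layer sizes and noise level are placeholders.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

encoder = nn.Sequential(nn.Linear(784, 64), nn.ReLU())     # compress to 64 dims
decoder = nn.Sequential(nn.Linear(64, 784), nn.Sigmoid())
opt = torch.optim.Adam([*encoder.parameters(), *decoder.parameters()], lr=1e-3)

x = torch.rand(32, 784)                                    # stand-in batch in [0, 1]
noisy = (x + 0.3 * torch.randn_like(x)).clamp(0, 1)        # corrupt the input

recon = decoder(encoder(noisy))                            # reconstruct from the corrupted input
loss = F.mse_loss(recon, x)                                # target is the clean input
opt.zero_grad(); loss.backward(); opt.step()
print("reconstruction loss:", loss.item())
```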
●​ Week 5
○​ Lecture
■​ Variational Autoencoders
■​ Variational inference
■​ Image generation
○​ Practice (see sketch below)
■​ Implement VAE
■​ Image reconstruction & generation
■​ Evaluate likelihood
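A minimal VAE training step matching the Week 5 practice: the encoder outputs a mean and log-variance, the reparameterization trick samples a latent, and the loss is the negative ELBO (reconstruction + KL). Single linear layers and random data are placeholders for the real model.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

enc = nn.Linear(784, 2 * 16)                 # outputs mean and log-variance of a 16-dim latent
dec = nn.Linear(16, 784)
opt = torch.optim.Adam([*enc.parameters(), *dec.parameters()], lr=1e-3)

x = torch.rand(32, 784)                      # stand-in batch in [0, 1]
mu, logvar = enc(x).chunk(2, dim=-1)
z = mu + torch.exp(0.5 * logvar) * torch.randn_like(mu)   # reparameterization trick
recon = torch.sigmoid(dec(z))

recon_loss = F.binary_cross_entropy(recon, x, reduction="sum")
kl = -0.5 * torch.sum(1 + logvar - mu.pow(2) - logvar.exp())
loss = recon_loss + kl                       # negative ELBO
opt.zero_grad(); loss.backward(); opt.step()
print("negative ELBO:", loss.item())
```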
●​ Week 6
○​ Lecture
■​ Generative Adversarial Networks
■​ Min-max game
■​ Image generation
■​ Evaluation: Inception score, FID, SSIM, LPIPS, PSNR
■​ Using different discriminators
■​ VAE vs GAN
○​ Practice (see sketch below)
■​ Implement GAN
■​ Image generation
■​ Evaluate FID
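A minimal sketch of the GAN min-max game from Week 6: one discriminator update followed by one generator update, on random stand-in data. Architectures and learning rates are placeholders, and FID evaluation is omitted.

```python
import torch
import torch.nn as nn

G = nn.Sequential(nn.Linear(16, 784), nn.Sigmoid())     # noise -> fake "image"
D = nn.Sequential(nn.Linear(784, 1))                    # image -> real/fake logit
opt_g = torch.optim.Adam(G.parameters(), lr=2e-4)
opt_d = torch.optim.Adam(D.parameters(), lr=2e-4)
bce = nn.BCEWithLogitsLoss()

real = torch.rand(32, 784)                              # stand-in real batch
z = torch.randn(32, 16)

# Discriminator step: push real toward 1 and (detached) fakes toward 0.
fake = G(z).detach()
d_loss = bce(D(real), torch.ones(32, 1)) + bce(D(fake), torch.zeros(32, 1))
opt_d.zero_grad(); d_loss.backward(); opt_d.step()

# Generator step: try to make the discriminator label fakes as real.
g_loss = bce(D(G(z)), torch.ones(32, 1))
opt_g.zero_grad(); g_loss.backward(); opt_g.step()
print(d_loss.item(), g_loss.item())
```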
●​ Week 7
○​ Lecture
■​ Convolutional Neural Networks
■​ Kernel
■​ Pooling
■​ Stride
■​ 1D conv, 3D conv
■​ ResNet
■​ ImageNet Classification
○​ Practice (see sketch below)
■​ Implement ResNet
■​ MNIST classification
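A minimal residual block in the spirit of the Week 7 ResNet practice, with a shape check on a random batch; the channel count and input size are placeholders.

```python
import torch
import torch.nn as nn

class ResidualBlock(nn.Module):
    def __init__(self, channels):
        super().__init__()
        self.conv1 = nn.Conv2d(channels, channels, kernel_size=3, padding=1)
        self.conv2 = nn.Conv2d(channels, channels, kernel_size=3, padding=1)
        self.relu = nn.ReLU()

    def forward(self, x):
        out = self.conv2(self.relu(self.conv1(x)))
        return self.relu(out + x)            # skip connection keeps the input shape

block = ResidualBlock(16)
x = torch.randn(8, 16, 28, 28)               # stand-in batch of 28x28 feature maps
print(block(x).shape)                        # torch.Size([8, 16, 28, 28])
```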
●​ Week 8
○​ Project 1
●​ Week 9
○​ Lecture
■​ Word2Vec
■​ GloVe
■​ Subword Encoding
●​ BPE
●​ WordPiece
●​ SentencePiece
○​ Practice (see sketch below)
■​ Implement Word2Vec
■​ Apply Word2Vec to SentencePiece-tokenized text
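A toy skip-gram (Word2Vec) step for the Week 9 practice: predict a context token from a center token using two embedding tables and a full softmax. Random token ids stand in for a SentencePiece-tokenized corpus, and negative sampling is omitted for brevity.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

vocab_size, dim = 1000, 32
center_emb = nn.Embedding(vocab_size, dim)       # "input" vectors
context_emb = nn.Embedding(vocab_size, dim)      # "output" vectors
opt = torch.optim.Adam([*center_emb.parameters(), *context_emb.parameters()], lr=1e-2)

center = torch.randint(0, vocab_size, (64,))     # stand-in (center, context) id pairs
context = torch.randint(0, vocab_size, (64,))

scores = center_emb(center) @ context_emb.weight.T   # (64, vocab_size) similarity scores
loss = F.cross_entropy(scores, context)              # full-softmax skip-gram objective
opt.zero_grad(); loss.backward(); opt.step()
print(loss.item())
```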
●​ Week 10
○​ Lecture
■​ RNN & GRU & LSTM
■​ Sequence-to-Sequence
■​ Attention
○​ Practice (see sketch below)
■​ Implement RNN, GRU
■​ Implement Fr-En translator with GRU+Attention
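A minimal sketch of the Week 10 practice: a GRU encoder plus dot-product attention over its hidden states, which is the core of the GRU + attention translator. Dimensions and the random input are placeholders.

```python
import torch
import torch.nn as nn

enc = nn.GRU(input_size=32, hidden_size=64, batch_first=True)
x = torch.randn(8, 20, 32)                           # batch of 20-step input sequences
enc_out, h = enc(x)                                  # enc_out: (8, 20, 64), h: (1, 8, 64)

query = h[-1].unsqueeze(1)                           # decoder query: (8, 1, 64)
scores = torch.bmm(query, enc_out.transpose(1, 2))   # (8, 1, 20) attention scores
weights = torch.softmax(scores, dim=-1)              # normalize over source positions
context = torch.bmm(weights, enc_out)                # (8, 1, 64) context vector
print(context.shape)
```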
●​ Week 11
○​ Lecture
■​ Transformers
■​ QKV operation
■​ Seq2Seq
■​ Masked attention
○​ Practice (see sketch below)
■​ Implement a neural machine translation model
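A minimal scaled dot-product attention with an optional causal mask, i.e., the QKV and masked-attention pieces from Week 11; multi-head projections and the full encoder-decoder are omitted.

```python
import math
import torch

def attention(q, k, v, causal=False):
    scores = q @ k.transpose(-2, -1) / math.sqrt(q.size(-1))      # QK^T / sqrt(d)
    if causal:
        T = q.size(-2)
        mask = torch.triu(torch.ones(T, T, dtype=torch.bool), diagonal=1)
        scores = scores.masked_fill(mask, float("-inf"))          # hide future positions
    return torch.softmax(scores, dim=-1) @ v

q = k = v = torch.randn(2, 10, 64)                  # (batch, sequence, dim)
print(attention(q, k, v, causal=True).shape)        # torch.Size([2, 10, 64])
```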
●​ Week 12
○​ Lecture
■​ BERT
■​ GPT 1, 2, 3
○​ Practice (see sketch below)
■​ Implement and train language models
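A tiny GPT-style training step for Week 12: a single Transformer encoder layer with a causal mask, trained to predict the next token. Vocabulary size, dimensions, and the random token ids are placeholders; a BERT-style variant would instead mask random input tokens and predict them.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

vocab, dim, T = 1000, 64, 16
embed = nn.Embedding(vocab, dim)
block = nn.TransformerEncoderLayer(d_model=dim, nhead=4, batch_first=True)
head = nn.Linear(dim, vocab)
opt = torch.optim.Adam([*embed.parameters(), *block.parameters(), *head.parameters()], lr=1e-3)

tokens = torch.randint(0, vocab, (8, T))                      # stand-in token ids
causal_mask = torch.triu(torch.full((T, T), float("-inf")), diagonal=1)

hidden = block(embed(tokens), src_mask=causal_mask)           # each position sees only its past
logits = head(hidden)
loss = F.cross_entropy(logits[:, :-1].reshape(-1, vocab),     # predict the next token
                       tokens[:, 1:].reshape(-1))
opt.zero_grad(); loss.backward(); opt.step()
print(loss.item())
```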
●​ Week 13
○​ Project 2
●​ Week 14
○​ Lecture
■​ DDPM
■​ DDIM
■​ Guided diffusion
■​ DALL-E 2 & Stable Diffusion
○​ Practice (see sketch below)
■​ Implement DDPM
■​ Image generation
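The core DDPM training step from Week 14: sample a timestep, add the corresponding Gaussian noise to a clean example, and train a network to predict that noise. A small MLP stands in for the usual U-Net, and the noise schedule and shapes are placeholders.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

T = 1000
betas = torch.linspace(1e-4, 0.02, T)                 # linear noise schedule
alphas_bar = torch.cumprod(1.0 - betas, dim=0)

eps_model = nn.Sequential(nn.Linear(784 + 1, 256), nn.ReLU(), nn.Linear(256, 784))
opt = torch.optim.Adam(eps_model.parameters(), lr=1e-3)

x0 = torch.rand(32, 784)                              # stand-in clean images
t = torch.randint(0, T, (32,))
noise = torch.randn_like(x0)
a = alphas_bar[t].unsqueeze(1)
xt = a.sqrt() * x0 + (1 - a).sqrt() * noise           # forward process q(x_t | x_0)

inp = torch.cat([xt, t.float().unsqueeze(1) / T], dim=1)   # crude timestep conditioning
loss = F.mse_loss(eps_model(inp), noise)              # predict the added noise
opt.zero_grad(); loss.backward(); opt.step()
print(loss.item())
```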
●​ Week 15
○​ Lecture
■​ Image-to-Text
■​ Text-to-Image
■​ Multi-modal pretraining
■​ CLIP
○​ Practice (see sketch below)
■​ Latent DDPM
■​ Conditional image generation
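A CLIP-style contrastive step for Week 15: embed paired images and texts, compute cosine similarities, and use a symmetric cross-entropy so that matching pairs score highest. The linear "encoders", random features, and temperature are placeholders.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

img_enc = nn.Linear(784, 128)          # stand-in image encoder
txt_enc = nn.Linear(300, 128)          # stand-in text encoder

images = torch.rand(16, 784)           # row i of images is paired with row i of texts
texts = torch.rand(16, 300)

img_z = F.normalize(img_enc(images), dim=-1)
txt_z = F.normalize(txt_enc(texts), dim=-1)
logits = img_z @ txt_z.T / 0.07        # cosine similarities scaled by a temperature

labels = torch.arange(16)              # matching pairs sit on the diagonal
loss = (F.cross_entropy(logits, labels) + F.cross_entropy(logits.T, labels)) / 2
print(loss.item())
```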
●​ Week 16
○​ Project 3
