CSL-537: Deep Learning
Autumn Semester 2025-26
Make sure you have a fundamental understanding of Linear Algebra and Probability. Below is reference material for revising the basics:
Python Programming
Foundational course in AI or ML
Loss functions: Mean Square Error, Negative Log Likelihood, Cross Entropy
Optimization techniques: Gradient Descent, Stochastic Gradient Descent, Constrained convex optimization (Lagrangians)
Supervised Techniques: Linear Regression, Logistic Regression, Perceptron Classification, Decision Trees, Random Forests, SVMs, Naive Bayes, k-NN
Unsupervised Techniques: K-means Clustering, Gaussian Mixture Modeling (GMMs), Expectation-Maximization (EM), Principal Component Analysis (PCA)
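As a quick self-check on the loss-function and optimization prerequisites above, here is a minimal NumPy sketch (the data and learning rate are illustrative choices, not course material) of mean squared error, binary cross-entropy (negative log likelihood for a Bernoulli model), and a gradient-descent step for linear regression:

```python
import numpy as np

# Mean squared error between predictions and targets.
def mse(y_pred, y_true):
    return np.mean((y_pred - y_true) ** 2)

# Binary cross-entropy: negative log likelihood of a Bernoulli model.
def binary_cross_entropy(p, y, eps=1e-12):
    p = np.clip(p, eps, 1 - eps)  # avoid log(0)
    return -np.mean(y * np.log(p) + (1 - y) * np.log(1 - p))

# One gradient-descent step for linear regression: w <- w - lr * dL/dw,
# where L = mean((Xw - y)^2) and dL/dw = 2 X^T (Xw - y) / n.
def gd_step(w, X, y, lr=0.1):
    grad = 2 * X.T @ (X @ w - y) / len(y)
    return w - lr * grad

# Fit y = 2x with repeated gradient-descent steps.
X = np.array([[1.0], [2.0], [3.0]])
y = np.array([2.0, 4.0, 6.0])
w = np.zeros(1)
for _ in range(200):
    w = gd_step(w, X, y)
```

If you can derive the gradient used in gd_step by hand, you are comfortable enough with the optimization prerequisite.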
Chapters 1-5 of the book Deep Learning by Ian Goodfellow
Microsoft Teams
To be updated ...
Introduction
Multi-layered Perceptrons
Backpropagation
Regularization: L1 and L2 Norms
Dropout
Optimization: Challenges; Stochastic Gradient Descent
Advanced Optimization Algorithms
Batch and Layer Normalization
Convolutional Networks (CNNs)
...
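To preview how the first few topics above (multi-layered perceptrons, backpropagation, gradient-descent training) fit together, here is a minimal NumPy sketch of a two-layer MLP trained on XOR with hand-derived backpropagation. The architecture, seed, and learning rate are illustrative choices, not course-provided code:

```python
import numpy as np

rng = np.random.default_rng(0)

# XOR data: not linearly separable, so a hidden layer is required.
X = np.array([[0., 0.], [0., 1.], [1., 0.], [1., 1.]])
y = np.array([[0.], [1.], [1.], [0.]])

# MLP with layout 2 -> 4 (tanh) -> 1 (sigmoid).
W1 = rng.normal(0.0, 1.0, (2, 4)); b1 = np.zeros(4)
W2 = rng.normal(0.0, 1.0, (4, 1)); b2 = np.zeros(1)

def forward(X):
    h = np.tanh(X @ W1 + b1)                       # hidden activations
    p = 1.0 / (1.0 + np.exp(-(h @ W2 + b2)))       # sigmoid output
    return h, p

def loss(p, y, eps=1e-12):
    p = np.clip(p, eps, 1 - eps)
    return -np.mean(y * np.log(p) + (1 - y) * np.log(1 - p))

lr = 0.5
_, p0 = forward(X)
initial_loss = loss(p0, y)

for _ in range(5000):
    h, p = forward(X)
    # Backward pass: gradient of cross-entropy w.r.t. pre-sigmoid logits
    # simplifies to (p - y); then apply the chain rule layer by layer.
    dlogits = (p - y) / len(X)
    dW2 = h.T @ dlogits; db2 = dlogits.sum(axis=0)
    dh = dlogits @ W2.T * (1.0 - h ** 2)           # through tanh: 1 - tanh^2
    dW1 = X.T @ dh; db1 = dh.sum(axis=0)
    # Gradient-descent update (full-batch here for simplicity).
    W1 -= lr * dW1; b1 -= lr * db1
    W2 -= lr * dW2; b2 -= lr * db2

_, p = forward(X)
final_loss = loss(p, y)
```

The course covers each step of this sketch in detail, including why stochastic (mini-batch) updates usually replace the full-batch update shown here.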
Deep Learning by Ian Goodfellow
To be updated ...
CWS: 30%
MTE: 30%
ETE: 40%