CSL-537: Deep Learning
Autumn Semester 2025-26
विद्या नाम नरस्य रूपमधिकं - Knowledge is the greatest attribute of a person.
Instructor: Jitin Singla (email: jsingla AT bt.iitr.ac.in)
Lectures:
Tue • 11:05 AM - 12:00 PM
Wed • 12:05 - 1:00 PM
Fri • 12:05 - 1:00 PM
Venue: APJ AKB - 501
Tutorial:
Wed (3:00-3:55 PM)
Venue: APJ AKB - 504
Office Hours:
Tue • 4:00–5:00 PM
Venue: 211, BSBE Dept.
15-July-2025: Course Announcements will be posted here regularly. Email notifications will only be sent if information is urgent.
This course serves as an introduction to deep learning, providing students with a solid foundation in one of the most transformative areas of machine learning.
Establish Mathematical Foundations: Gain a comprehensive understanding of the core mathematical principles that underpin deep learning models and algorithms.
Explore Key Architectures: Study widely adopted techniques and neural network architectures that have become standard in applications such as image analysis, language modeling, and beyond.
Survey Recent Advances: Examine some of the latest breakthroughs and trends in deep learning, setting the stage for further study and research in advanced topics.
Make sure you have a fundamental understanding of Linear Algebra and Probability. Below is reference material for revising the basics. Review Content: Linear Algebra and Probability
Python Programming
Foundational course in AI or ML
Loss functions: Mean Square Error, Negative Log Likelihood, Cross Entropy
Optimization techniques: Gradient Descent, Stochastic Gradient Descent, Constrained convex optimization (Lagrangians)
Supervised Techniques: Linear Regression, Logistic Regression, Perceptron Classification, Decision Trees, Random Forests, SVMs, Naive Bayes, k-NN
Unsupervised Techniques: K-means Clustering, Gaussian Mixture Modeling (GMMs), Expectation-Maximization (EM), Principal Component Analysis (PCA)
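To gauge whether you are comfortable with the prerequisites above, here is a minimal NumPy sketch combining two of them: the Mean Square Error loss and a full-batch gradient descent loop, fit to a toy one-dimensional linear regression. The data, seed, and learning rate are illustrative choices, not part of the course material.

```python
import numpy as np

# Toy 1-D regression data: y ≈ 3x plus a little noise (illustrative only)
rng = np.random.default_rng(0)
X = rng.normal(size=100)
y = 3.0 * X + rng.normal(scale=0.1, size=100)

w, b = 0.0, 0.0   # model parameters
lr = 0.1          # learning rate (arbitrary choice)

for _ in range(200):
    y_hat = w * X + b
    # MSE loss: L = mean((y_hat - y)^2); its gradients w.r.t. w and b:
    grad_w = 2.0 * np.mean((y_hat - y) * X)
    grad_b = 2.0 * np.mean(y_hat - y)
    # Gradient descent update
    w -= lr * grad_w
    b -= lr * grad_b

# After training, w should be close to the true slope of 3.0
```

If the loss gradients and the update rule here look unfamiliar, revisit the review content before the course starts.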
Chapters 1-5 of the book Deep Learning by Ian Goodfellow
Introduction and Motivation
Multi-layered Perceptrons, Neural Networks, Backpropagation
Loss Functions, Regularization: L1-L2 Norms
Optimization (Stochastic Gradient Descent, RMSProp, Adam, Adagrad)
Dropouts, Batch and Layer Normalization
Convolutional Networks (CNNs), Residual Networks
Recurrent Neural Networks (RNNs), LSTMs
Attention: Bahdanau Attention, Transformers
Word2Vec Embeddings
Understanding Language Modeling, BERT
Improving Language Understanding by Generative Pre-Training (GPT 1)
Language Models as Unsupervised Multitask Learners (GPT 2)
Language Models are Few-Shot Learners (GPT 3)
Other LLMs: LLaMA, Mistral, Gemini, DeepSeek
Vision Language Models: ViT, CLIP
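As a preview of the first core topics in the list above (multi-layered perceptrons and backpropagation), here is a minimal NumPy sketch of a two-layer MLP trained by hand-derived backpropagation on the XOR problem. The architecture, seed, and hyperparameters are illustrative assumptions, not prescribed by the course.

```python
import numpy as np

# XOR: the classic problem a single perceptron cannot solve
X = np.array([[0., 0.], [0., 1.], [1., 0.], [1., 1.]])
y = np.array([[0.], [1.], [1.], [0.]])

rng = np.random.default_rng(0)
W1 = rng.normal(scale=1.0, size=(2, 8)); b1 = np.zeros(8)   # hidden layer
W2 = rng.normal(scale=1.0, size=(8, 1)); b2 = np.zeros(1)   # output layer
lr = 0.5  # learning rate (arbitrary choice)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def forward(x):
    h = np.tanh(x @ W1 + b1)         # hidden activations
    return h, sigmoid(h @ W2 + b2)   # output probability

for _ in range(5000):
    h, p = forward(X)
    # Backward pass: gradient of mean binary cross-entropy
    # w.r.t. the pre-sigmoid logits simplifies to (p - y) / n
    dz2 = (p - y) / len(X)
    dW2, db2 = h.T @ dz2, dz2.sum(axis=0)
    dz1 = (dz2 @ W2.T) * (1.0 - h ** 2)   # chain rule through tanh
    dW1, db1 = X.T @ dz1, dz1.sum(axis=0)
    # Plain gradient descent updates
    W2 -= lr * dW2; b2 -= lr * db2
    W1 -= lr * dW1; b1 -= lr * db1

_, p = forward(X)
preds = (p > 0.5).astype(float)
```

Every later topic in the list, from CNNs to Transformers, builds on exactly this forward-pass/backward-pass/update loop, with richer architectures and optimizers replacing the pieces shown here.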
Python is the default programming language for the course.
Submit via Moodle or GitHub, as specified in each assignment.
Honor Code: Any case of copying will receive a zero on the assignment; more severe penalties may follow.
Late submissions will incur penalties, as announced with each assignment.
Book: Deep Learning by Ian Goodfellow
Chapter 6: Multi-layered Perceptrons, Backpropagation
Chapter 7: Regularization - L1/L2, Other Techniques
Chapter 8: Optimization Techniques, Normalization
Chapter 9: Deep Learning for Vision - Basic Models (CNNs)
Chapter 10: Deep Learning for NLP - Basic Models (RNNs, LSTM)
Papers:
Continuous Assessment (CWS): 30%
Announced & Surprise Quizzes
Assignments
Mid-Term Exam (MTE): 30%
End-Term Exam (ETE): 40%