NPTEL - Intro to Machine Learning
TA for NPTEL's Intro to ML course (NOC22-CS73) (Jul - Oct 2022)
TA Duties
As part of the NPTEL PMRF TA scheme, I got the opportunity to serve as a TA for the Introduction to Machine Learning course taught by Dr. Balaraman Ravindran. As part of the TA duties, we held one-hour sessions with the students every week (12 weeks in total) in which we could either teach a new concept or clarify students’ doubts. A day before my session, I would generally ask the students on the chat forum whether they wanted any specific doubts clarified. In the session, I would either solve those questions, teach a related concept not covered by the professor that week, or cover a concept in more detail than the lectures did. We were required to record the sessions and upload them to YouTube so that students who could not attend live could watch them later.
Experience
TAing for this course made the “When one teaches, two learn” quote ring true for me! The ML field is challenging, and TAing for this course helped me get back in touch with the basics. Handling diverse doubts from students also helped me learn and strengthen my own concepts. Since the sessions were live and interactive, I had to think on the spot, have the humility to admit when I did not know something, and come back later with rigorously argued answers.
Teaching Content
All the lectures are uploaded to the Intro to Machine Learning - Live Sessions playlist on my YouTube teaching channel.
- Bayes' theorem applied to a medical test; this session was also meant as a pilot run for system checks (a minimal Bayes' rule sketch follows the list).
- We solve Linear Algebra and Probability problems from the Intro to ML NPTEL course practice problems. We touch upon concepts like Bayes' theorem, conditional probability, eigenvalues and eigenvectors, and joint probability distributions. Lastly, we brainstorm ideas on building an ML system for predicting the weather!
- We discuss the total probability formula a bit more and solve a simple probability problem. We then discuss linear regression and see how to solve it using NumPy (a least-squares sketch follows the list)!
- We discuss some properties of Positive Semi-Definite (PSD) and Positive Definite (PD) matrices and connect these to Ridge Regression. We see an example of how to use LDA from the scikit-learn package (a PSD/Ridge and LDA sketch follows the list). We also discuss a real-world dataset, the ABIDEII dataset - its description and simple usage.
- Week 1, 2 & 3 practice questions - solutions & discussions. We also have a discussion of the PCA algorithm (a PCA-from-scratch sketch follows the list).
- SVM, i.e., Week 4 practice and quiz questions - solutions & discussions. Problems like how to determine convexity were dealt with in great detail (a Hessian-based convexity check follows the list).
- We discuss convex optimization in the context of SVMs - why the Lagrangian formulation and how solving it helps! We also discuss the weight initialisation problem in neural networks and the effect of learning rates, batch sizes, etc., using visuals developed by the deeplearning.ai team.
- We discuss the problem of reproducibility that plagues the ML field right now. We also discuss ideas like Parameter Sharing (in the context of CNNs and RNNs) & Probabilistic NNs.
- We have a detailed discussion on the Class Imbalance Problem - its definition & the general solutions in the ML literature (a class-weighting sketch follows the list).
- We discuss MultiClass and MultiLabel Classification - how they are different - and give a soft intro to why problems like one-shot or few-shot learning matter (a multilabel sketch follows the list). We also discuss some properties of Bayesian Networks.
- We discuss Bayesian Networks, Continuous Probability Distributions, Hidden Markov Models & Graph Neural Networks - GCNs and Graph RNNs.
- We discuss KMeans++ and Clustering with Neural Networks (a KMeans++ seeding sketch follows the list).
- We discuss Gaussian Mixture Models and work through some problems in that regard. To end, we revisit some previous topics as a sort of revision (a GMM fitting sketch follows the list).
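
For the medical-test session, here is a minimal sketch of Bayes' theorem in Python. The prevalence, sensitivity, and specificity values are made up for illustration and are not the numbers used in the session.

```python
# Bayes' theorem for a medical test: P(disease | positive test).
# The numbers below are illustrative, not the ones used in the session.
prevalence = 0.01        # P(disease)
sensitivity = 0.95       # P(positive | disease)
specificity = 0.90       # P(negative | no disease)

# Total probability of testing positive: true positives + false positives.
p_positive = sensitivity * prevalence + (1 - specificity) * (1 - prevalence)
p_disease_given_positive = sensitivity * prevalence / p_positive

print(f"P(disease | positive) = {p_disease_given_positive:.3f}")  # ~0.088
```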
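For the linear-regression-with-NumPy session, a minimal least-squares sketch on synthetic data (not the session's exact example):

```python
import numpy as np

# Ordinary least squares: find w minimising ||Xw - y||^2.
# Synthetic data, purely illustrative.
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 3))
true_w = np.array([2.0, -1.0, 0.5])
y = X @ true_w + 0.1 * rng.normal(size=100)

# np.linalg.lstsq is the numerically stable way to solve least squares.
w_hat, *_ = np.linalg.lstsq(X, y, rcond=None)
print(w_hat)  # should be close to [2.0, -1.0, 0.5]
```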
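For the PSD/Ridge and LDA session, a small sketch: the eigenvalue check illustrates why adding the ridge term makes X^T X + lambda*I positive definite (and hence invertible), and the LDA part uses the Iris dataset as a stand-in for the dataset used in class.

```python
import numpy as np
from sklearn.datasets import load_iris
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

# X^T X is positive semi-definite (eigenvalues >= 0); adding lambda * I
# makes it positive definite, which is why the ridge solution
# (X^T X + lambda I)^{-1} X^T y always exists.
rng = np.random.default_rng(0)
X = rng.normal(size=(20, 5))
lam = 0.1
print(np.linalg.eigvalsh(X.T @ X).min())                     # >= 0 (up to round-off)
print(np.linalg.eigvalsh(X.T @ X + lam * np.eye(5)).min())   # strictly > 0

# LDA in scikit-learn, on Iris as a stand-in dataset.
X_iris, y_iris = load_iris(return_X_y=True)
lda = LinearDiscriminantAnalysis().fit(X_iris, y_iris)
print("Training accuracy:", lda.score(X_iris, y_iris))
```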
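For the PCA discussion, a minimal from-scratch sketch via the eigendecomposition of the covariance matrix (synthetic data, purely illustrative):

```python
import numpy as np

# PCA from scratch: centre the data, eigendecompose the covariance matrix,
# and project onto the top-k eigenvectors.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 5))

X_centered = X - X.mean(axis=0)
cov = np.cov(X_centered, rowvar=False)
eigvals, eigvecs = np.linalg.eigh(cov)   # eigh returns ascending order
order = np.argsort(eigvals)[::-1]        # sort descending by explained variance
components = eigvecs[:, order[:2]]       # top-2 principal directions
X_reduced = X_centered @ components      # shape (200, 2)
print(X_reduced.shape)
```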
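For the convexity questions in the SVM week, one standard check is via the Hessian: a quadratic f(x) = 0.5 x^T Q x + b^T x is convex exactly when Q is PSD. The matrices below are illustrative, not the quiz problems themselves.

```python
import numpy as np

# For f(x) = 0.5 * x^T Q x + b^T x, the Hessian is Q, so f is convex
# exactly when Q is positive semi-definite (all eigenvalues >= 0).
def is_convex_quadratic(Q, tol=1e-10):
    Q = 0.5 * (Q + Q.T)                      # symmetrise first
    return np.linalg.eigvalsh(Q).min() >= -tol

print(is_convex_quadratic(np.array([[2.0, 0.0], [0.0, 1.0]])))   # True
print(is_convex_quadratic(np.array([[1.0, 0.0], [0.0, -1.0]])))  # False
```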
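For the class-imbalance discussion, one of the general remedies is reweighting classes in the loss. A minimal sketch using scikit-learn's class_weight option on a toy imbalanced problem (not data from the session):

```python
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import classification_report

# An imbalanced toy problem: roughly 95% negatives, 5% positives.
X, y = make_classification(n_samples=2000, weights=[0.95, 0.05], random_state=0)

# class_weight="balanced" reweights each class inversely to its frequency,
# one of the general solutions discussed for class imbalance.
clf = LogisticRegression(class_weight="balanced", max_iter=1000).fit(X, y)
print(classification_report(y, clf.predict(X)))
```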
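For the multiclass vs. multilabel distinction, a small sketch: a multiclass target has exactly one label per sample, while a multilabel target is a binary indicator matrix where several labels can be on at once; one-vs-rest trains one binary classifier per label. Toy data, illustrative only.

```python
from sklearn.datasets import make_multilabel_classification
from sklearn.linear_model import LogisticRegression
from sklearn.multiclass import OneVsRestClassifier

# Multilabel data: Y is a binary indicator matrix, so a sample may carry
# several labels (unlike multiclass, where exactly one label applies).
X, Y = make_multilabel_classification(n_samples=200, n_classes=4, random_state=0)
print(Y.shape, Y[:3])          # (200, 4); rows can contain multiple 1s

# One-vs-rest fits one binary classifier per label column.
clf = OneVsRestClassifier(LogisticRegression(max_iter=1000)).fit(X, Y)
print(clf.predict(X[:3]))      # one binary decision per label, per sample
```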
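For KMeans++, the essential idea is the seeding step: the first centre is chosen uniformly at random, and each subsequent centre is sampled with probability proportional to its squared distance from the nearest centre picked so far. A minimal sketch of that seeding only (not the full Lloyd iterations):

```python
import numpy as np

def kmeans_pp_init(X, k, seed=None):
    """KMeans++ seeding: spread the initial centres apart."""
    rng = np.random.default_rng(seed)
    n = X.shape[0]
    centres = [X[rng.integers(n)]]                 # first centre: uniform at random
    for _ in range(k - 1):
        # Squared distance of each point to its nearest chosen centre.
        diffs = X[:, None, :] - np.array(centres)[None, :, :]
        d2 = np.min((diffs ** 2).sum(axis=-1), axis=1)
        probs = d2 / d2.sum()
        centres.append(X[rng.choice(n, p=probs)])  # sample proportionally to D^2
    return np.array(centres)

# Three well-separated synthetic blobs, purely illustrative.
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(loc=c, size=(50, 2)) for c in ((0, 0), (5, 5), (0, 5))])
print(kmeans_pp_init(X, k=3, seed=0))
```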
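For the GMM session, a minimal sketch of fitting a two-component mixture with scikit-learn on synthetic blobs (not the session's problems):

```python
import numpy as np
from sklearn.mixture import GaussianMixture

# Two well-separated Gaussian blobs; purely illustrative.
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0.0, 1.0, size=(200, 2)),
               rng.normal(6.0, 1.0, size=(200, 2))])

# EM fitting of a 2-component Gaussian mixture.
gmm = GaussianMixture(n_components=2, random_state=0).fit(X)
print("Means:\n", gmm.means_)
print("Mixing weights:", gmm.weights_)
```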