Teaching
- Spring 2024: Algorithms, Geometry, and Optimization. (CS 4960/6960) The course covers some of the modern paradigms in algorithm design, motivated by applications like data streaming, dimensionality reduction, and graph analysis. It gives a flavor of different techniques, covering areas like randomized algorithms, convex relaxations, spectral graph theory, and optimization-based methods. [Course webpage]
- Spring 2022: Theory of Machine Learning. (CS 5966/6966) The course covers the theoretical foundations of machine learning in depth. We study concepts like generalization (why ML algorithms work on unseen data), analyze optimization algorithms (GD, SGD), and look at some recent developments in ML theory. [Course webpage]
- Fall 2021: Graduate Algorithms. (CS 5150/6150) This is an introductory algorithms class for graduate students. Students are expected to have taken an undergraduate-level data structures/algorithms class and to be comfortable with probability and discrete mathematics. [Course webpage]
Here is an older offering with course content, lecture notes, and lecture videos: Fall 2019 homepage.
- Spring 2021: Probability and Statistics for Engineers. (CS 3130) The course introduces the basic concepts of probability theory and statistics, such as conditional probability, random variables, the central limit theorem, statistical inference, and hypothesis testing. [Course webpage]
- Fall 2016-2020: Graduate Algorithms (CS 5150/6150)
- Spring 2020: Probability and Statistics for Engineers (CS 3130)
- Spring 2017: Theory of Machine Learning (CS 5966/6966) [Course webpage]
- Spring 2016: Techniques in Algorithms and Approximation. (CS 5968/6968) This is a graduate-level course covering approximation algorithms, spectral methods, hardness, etc. It is accessible to students comfortable with the material in Graduate Algorithms (CS 6150). [Course webpage]
- Fall 2016: ML Seminar: Large-scale machine learning. This seminar covers some recent developments in large-scale machine learning, including distributed optimization, lock-free methods and their analysis, ADMM, and the like. [Course webpage]