AI2101/MA2101 Convex Optimization

Welcome to the official webpage of the course AI2101/MA2101 (Convex Optimization). This introductory course on convex optimization presents the mathematical background and the algorithms needed to solve optimization problems. Although the primary focus of the course is on elementary algorithms, equal emphasis is placed on the related mathematical concepts. The course is primarily targeted at undergraduate students.

Course Contents: Algorithms for single-variable optimization - Golden Section Search, Bisection Search, Newton's Method, and the Secant Method; Subspaces, Affine Sets, and Convex Sets; Affine and Convex Functions; Limits and Differentiation of Functions of Multiple Variables; Chain Rule; Gradient, Tangent, and Normal Vectors; Directional Derivatives; First- and Second-Order Conditions for Convexity; First- and Second-Order Conditions for Optimality; Newton's Method for Multi-Variable Optimization; State Estimation; Descent Algorithms for Multi-Variable Optimization - Gradient Descent and Steepest Descent; Lagrange Conditions for Optimization Problems with Equality Constraints; KKT Conditions for Optimization Problems with Inequality Constraints; Duality; Discrete and Mixed-Integer Programming.

Students taking this course are expected to be familiar with linear algebra/matrix theory and vector calculus. The course will include a good number of programming exercises (some of which will be based on the package CVXPY); a small illustrative CVXPY sketch is included at the end of this page.

Instructor
Class Timings
Evaluation Pattern
References
Teaching Assistants

Credit to the following undergraduate students, who have volunteered to serve as Teaching Assistants for the course.
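A Minimal CVXPY Example

The sketch below is illustrative only and is not part of the official course material. It shows how a small constrained least-squares problem can be posed and solved with CVXPY; the matrix A, the vector b, and the particular constraints are arbitrary placeholders chosen for demonstration.

import cvxpy as cp
import numpy as np

# Placeholder data for the example (not taken from the course)
np.random.seed(0)
A = np.random.randn(20, 5)
b = np.random.randn(20)

# Decision variable
x = cp.Variable(5)

# Convex objective: least-squares residual
objective = cp.Minimize(cp.sum_squares(A @ x - b))

# Simple convex constraints: nonnegativity and a normalization
constraints = [x >= 0, cp.sum(x) == 1]

# Form and solve the problem
problem = cp.Problem(objective, constraints)
problem.solve()

print("Optimal value:", problem.value)
print("Optimal x:", x.value)

Running this script prints the optimal objective value and the minimizer found by the solver; the same pattern (declare variables, state a convex objective and constraints, call solve) carries over to the other convex problems encountered in the course.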