IEOR 262B: Mathematical Programming II
Instructor: Javad Lavaei
Time: Tuesdays and Thursdays, 12:30-2pm
Location: Online
Instructor's Office Hours: Thursdays, 2-3pm
Grading Policy:
Description
This course provides a fundamental understanding of general nonlinear optimization theory, convex optimization, conic optimization, low-rank optimization, numerical algorithms, and distributed computation. Some of the topics covered in this course are as follows:
Nonlinear optimization: First- and second-order conditions, Fritz John optimality conditions, Lagrangian, duality, augmented Lagrangian, etc.
Convexity: Convex sets, convex functions, convex optimization, conic optimization, convex reformulations, etc.
Convexification: Low-rank optimization, conic relaxations, sum-of-squares, etc.
Algorithms: Descent algorithms, conjugate gradient methods, gradient projection & conditional gradient methods, block coordinate methods, proximal methods, second-order methods, accelerated methods, decomposition and distributed algorithms, convergence analysis, etc.
Applications: Machine learning, data science, etc.
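As a small illustration of the descent algorithms listed above, the sketch below runs fixed-step gradient descent on a simple convex quadratic. This is a hypothetical warm-up example for intuition, not material taken from the course notes.

```python
# Illustrative sketch (not from the course): gradient descent on
# f(x) = (x - 3)^2, a convex quadratic with unique minimizer x* = 3.

def grad_descent(grad, x0, step=0.1, iters=100):
    """Fixed-step gradient descent: x_{k+1} = x_k - step * grad(x_k)."""
    x = x0
    for _ in range(iters):
        x -= step * grad(x)
    return x

# Gradient of f(x) = (x - 3)^2 is f'(x) = 2 * (x - 3).
x_final = grad_descent(lambda x: 2 * (x - 3), x0=0.0)
print(x_final)  # approaches the minimizer 3
```

For this quadratic the iteration contracts the error by a factor of 0.8 per step, so the iterates converge linearly to the minimizer, the kind of convergence analysis the algorithms portion of the course develops in general.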
Textbook
"Low-Rank Semidefinite Programming: Theory and Applications" by Alex Lemon, Anthony Man-Cho So, and Yinyu Ye, Foundations and Trends in Optimization, 2015.
Lecture notes
Homework