IEOR 262B: Mathematical Programming II
Instructor: Javad Lavaei
Time: Tuesdays and Thursdays, 12:30–2pm
Location: Online
Instructor's Office Hours: Thursdays, 2–3pm
Grading Policy:
Description
This course provides a fundamental understanding of general nonlinear optimization theory, convex optimization, conic optimization, low-rank optimization, numerical algorithms, and distributed computation. Some of the topics covered in this course are as follows:
Nonlinear optimization: First- and second-order conditions, Fritz John optimality conditions, Lagrangian, duality, augmented Lagrangian, etc.
Convexity: Convex sets, convex functions, convex optimization, conic optimization, convex reformulations, etc.
Convexification: Low-rank optimization, conic relaxations, sum-of-squares, etc.
Algorithms: Descent algorithms, conjugate gradient methods, gradient projection & conditional gradient methods, block coordinate methods, proximal methods, second-order methods, accelerated methods, decomposition and distributed algorithms, convergence analysis, etc.
Applications: Machine learning, data science, etc.
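As a flavor of the descent algorithms listed above, the following is a minimal sketch of fixed-step gradient descent on a convex quadratic; the matrix, vector, and step size are illustrative choices, not taken from the course materials.

```python
# Minimal gradient-descent sketch for the convex quadratic
# f(x) = 0.5 * x^T A x - b^T x, whose unique minimizer solves A x = b.
# A, b, and the step size below are illustrative, not from the course.
import numpy as np

def gradient_descent(A, b, x0, step=0.1, iters=500):
    """Minimize f(x) = 0.5 x^T A x - b^T x with a fixed step size."""
    x = x0.astype(float)
    for _ in range(iters):
        grad = A @ x - b      # gradient of the quadratic objective
        x = x - step * grad   # descent step
    return x

A = np.array([[3.0, 1.0], [1.0, 2.0]])  # symmetric positive definite
b = np.array([1.0, 1.0])
x_star = gradient_descent(A, b, np.zeros(2))
```

For a positive definite A, this iteration converges linearly whenever the step size is below 2 divided by the largest eigenvalue of A.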
Textbook
"Low-Rank Semidefinite Programming: Theory and Applications" by Alex Lemon, Anthony Man-Cho So, and Yinyu Ye, Foundations and Trends in Optimization, 2015.
Lecture notes
Homework
Homework 1
Homework 2
Homework 3
Homework 4
Homework 5
Homework 6
Homework 7 (due April 30 at 5pm)
