14:00 - 16:00 | Tue 15 Oct | Pacífico | Tu3-1
This workshop will be organized around two main components. The first component will give a detailed description of the gradient descent algorithm and of the fast gradient algorithm developed by Nesterov in a landmark paper. We will emphasize the central ideas underlying these two algorithms: the choice of stepsize, the choice of momentum (for fast gradient), and a novel proof of their convergence properties based on the Fenchel conjugate. The second component will give an overview of nonlinear programming. We will discuss the Karush-Kuhn-Tucker optimality conditions and describe the two main algorithmic approaches to nonlinear programming: sequential quadratic programming and interior-point methods. We will illustrate the techniques from both components via several numerical examples in Python.