14:00 - 15:00 | Wed 16 Oct | Pacífico | W3-P4-1
We present an overview and unified analysis of Bregman proximal first-order algorithms for convex minimization. Bregman proximal methods are a flexible and
versatile class of algorithms that generalizes gradient descent, projected gradient, and proximal gradient algorithms. This class of algorithms offers enormous potential for tackling large-scale optimization problems arising in a variety of disciplines. Our approach highlights the fundamental but somewhat overlooked role that the Fenchel conjugate plays in this important class of algorithms. It yields novel proofs of the convergence rates of the Bregman proximal subgradient and Bregman proximal gradient algorithms, as well as a new accelerated Bregman proximal gradient algorithm. We illustrate the effectiveness of Bregman proximal methods on two problems of great interest in data science, namely D-optimal design and Poisson linear inverse problems.
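To make the generalization concrete, here is a minimal sketch (not the authors' code) of a Bregman proximal gradient step with the negative-entropy kernel h(x) = Σᵢ xᵢ log xᵢ on the probability simplex. The Bregman step x⁺ = argminₓ ⟨∇f(xₖ), x⟩ + (1/t)·D_h(x, xₖ) then has a closed multiplicative form (exponentiated gradient); choosing h(x) = ½‖x‖² instead recovers ordinary projected gradient. The objective, step size, and iteration count below are illustrative assumptions.

```python
import numpy as np

def bregman_proximal_gradient(grad_f, x0, step, iters):
    """Bregman proximal gradient under the negative-entropy kernel.

    Each iteration solves the Bregman step in closed form:
    a multiplicative (exponentiated-gradient) update followed by
    normalization, which keeps the iterate on the simplex.
    """
    x = np.asarray(x0, dtype=float)
    for _ in range(iters):
        x = x * np.exp(-step * grad_f(x))  # mirror step under neg-entropy
        x = x / x.sum()                    # renormalize onto the simplex
    return x

# Toy example (an assumption for illustration):
# minimize f(x) = 0.5 * ||x - c||^2 over the simplex,
# where c already lies on the simplex, so the minimizer is c itself.
c = np.array([0.1, 0.2, 0.7])
grad_f = lambda x: x - c
x_star = bregman_proximal_gradient(grad_f, np.ones(3) / 3, step=0.5, iters=2000)
```

Note that the step size here is governed by *relative* smoothness of f with respect to h rather than an ordinary Lipschitz-gradient constant, which is precisely what makes Bregman methods applicable to problems such as Poisson inverse problems where the classical constant is unavailable.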