Convex Optimization with Abstract Linear Operators

Stephen Boyd

10:20 - 11:10 | Tuesday 22 March 2016 | Grand Ballroom



Domain-specific languages (DSLs) for convex optimization, such as CVX and YALMIP and the more recent CVXPY and Convex.jl, are very widely used to rapidly develop, prototype, and solve convex optimization problems of modest size, say, tens of thousands of variables, with linear operators described as sparse matrices. These systems allow a user to specify a convex optimization problem in a succinct and natural way, and then solve it reliably, with no algorithm parameter tuning and only a modest performance loss compared to a custom solver hand-designed and tuned for the problem. In this talk we describe recent progress toward extending these DSLs to handle large-scale problems in which the linear operators are given as abstract operators with fast transforms, such as those arising in image processing, computer vision, medical imaging, and other application areas. This requires re-thinking the entire stack, from the high-level DSL design down to the low-level solvers.
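To make the idea concrete, here is a minimal NumPy sketch (not from the talk; the operator, kernel, and solver parameters are all illustrative assumptions) of a linear operator supplied only through fast forward and adjoint transforms. The matrix is never formed; a solver that touches the operator only through matrix-vector products, here conjugate gradients on regularized normal equations, can exploit the O(n log n) transform.

```python
import numpy as np

# Illustrative abstract linear operator: circular convolution with a
# Gaussian kernel, implemented via the FFT. Only forward/adjoint
# matvecs are exposed; the n-by-n matrix is never materialized.
n = 256
t = np.arange(n)
kernel = np.exp(-0.5 * (np.minimum(t, n - t) / 2.0) ** 2)  # centered blur
kernel /= kernel.sum()
K = np.fft.rfft(kernel)  # kernel spectrum

def forward(x):
    """A @ x via the FFT, in O(n log n)."""
    return np.fft.irfft(K * np.fft.rfft(x), n)

def adjoint(y):
    """A.T @ y: convolution with the time-reversed kernel."""
    return np.fft.irfft(np.conj(K) * np.fft.rfft(y), n)

# Synthetic measurements: blur a sparse spike train.
x_true = np.zeros(n)
x_true[[60, 180]] = [1.0, -0.5]
b = forward(x_true)

# Solve min ||A x - b||^2 + lam ||x||^2, i.e. (A'A + lam I) x = A'b,
# by conjugate gradients, using the operator only through matvecs.
lam = 1e-2  # illustrative regularization weight
def normal_op(v):
    return adjoint(forward(v)) + lam * v

x = np.zeros(n)
r = adjoint(b) - normal_op(x)
p = r.copy()
rs = r @ r
for _ in range(300):
    Ap = normal_op(p)
    alpha = rs / (p @ Ap)
    x += alpha * p
    r -= alpha * Ap
    rs_new = r @ r
    if np.sqrt(rs_new) < 1e-12:
        break
    p = r + (rs_new / rs) * p
    rs = rs_new
```

In a DSL built for such operators, the canonicalized problem would hand the solver forward/adjoint callbacks like these instead of a sparse matrix, which is what lets problems such as image deblurring scale to millions of variables.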