Convex optimization problems appear in a huge range of applications in engineering, operations research, finance and other areas. Although numerical methods for convex optimization have been the subject of extensive research for decades, modern applications call for new techniques and software tools.
This talk will present two general-purpose solution methods for convex optimization problems with quadratic objectives. The first, implemented in the open-source packages OSQP and COSMO, is ADMM-based and employs a novel operator-splitting technique that requires solving, at each iteration, a quasi-definite linear system with the same coefficient matrix, so a single factorization can be reused across iterations. Our algorithm is robust, highly scalable, and places no conditions on the problem data, such as positive definiteness of the objective or linear independence of the constraints. For large-scale semidefinite programs (SDPs), we also employ decomposition strategies that exploit chordal sparsity in the problem data to split large constraints into multiple smaller ones.
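As a concrete illustration of the problem class the first method targets, the short Python sketch below sets up and solves a small quadratic program through OSQP's documented setup/solve interface; the data are illustrative and not taken from the talk.

import numpy as np
from scipy import sparse
import osqp

# Minimize 0.5 x'Px + q'x  subject to  l <= Ax <= u  (small illustrative QP)
P = sparse.csc_matrix([[4.0, 1.0], [1.0, 2.0]])
q = np.array([1.0, 1.0])
A = sparse.csc_matrix([[1.0, 1.0], [1.0, 0.0], [0.0, 1.0]])
l = np.array([1.0, 0.0, 0.0])
u = np.array([1.0, 0.7, 0.7])

solver = osqp.OSQP()
solver.setup(P, q, A, l, u, verbose=False)  # with the default direct linear solver, the quasi-definite KKT matrix is factored once here
result = solver.solve()
print(result.x, result.info.status)  # approximate primal solution and solver status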
The second, implemented in the open-source package Clarabel, is interior-point based and uses a novel homogeneous embedding technique that offers substantially faster solve times than existing open-source and commercial solvers on some problem types. The improvement comes both from a reduction in the number of interior-point iterations required and from the smaller, sparser linear system that must be solved at each iteration. The talk will describe details of this embedding and show performance results relative to solvers based on the standard homogeneous self-dual embedding, including ECOS and MOSEK.
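For comparison, the same kind of small problem can be written in conic form (Ax + s = b with s in a cone) and passed to Clarabel; the sketch below is a minimal example assuming Clarabel's Python interface with its DefaultSolver, DefaultSettings, and ZeroConeT/NonnegativeConeT cone types, again with illustrative data rather than anything from the talk.

import numpy as np
from scipy import sparse
import clarabel

# Minimize 0.5 x'Px + q'x  subject to  x1 + x2 = 1,  x1 <= 0.7,  x2 <= 0.7
P = sparse.csc_matrix([[4.0, 1.0], [1.0, 2.0]])
q = np.array([1.0, 1.0])
A = sparse.csc_matrix([[1.0, 1.0], [1.0, 0.0], [0.0, 1.0]])
b = np.array([1.0, 0.7, 0.7])
cones = [clarabel.ZeroConeT(1), clarabel.NonnegativeConeT(2)]  # one equality row, two inequality rows

settings = clarabel.DefaultSettings()
settings.verbose = False
solver = clarabel.DefaultSolver(P, q, A, b, cones, settings)
solution = solver.solve()
print(solution.x, solution.status)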
Join at: imt.lu/seminar