Robust optimization (RO) has emerged as one of the leading paradigms to efficiently model parameter uncertainty. The recent connections between RO and problems in the statistics and machine learning domains demand solving RO problems at ever larger scales. However, the traditional approaches for solving RO formulations, based on building and solving robust counterparts or on iterative approaches utilizing nominal feasibility oracles, can be prohibitively expensive and thus significantly hinder the scalability of the RO paradigm.
We present a general and flexible iterative framework to solve robust convex optimization problems that is built on a fully online first-order paradigm. In contrast to the existing literature, a key distinguishing feature of our approach is that it requires access only to first-order oracles for the original problem, which are remarkably cheaper than pessimization or nominal feasibility oracles, while maintaining the same convergence rates. This, in particular, makes our approach much more scalable and hence preferable in large-scale applications. In the case of strongly convex functions, we also establish a new, improved iteration complexity, addressing an open question in the literature. Motivated by a robust portfolio optimization problem, we demonstrate our approach on robust quadratic programs with ellipsoidal uncertainty.
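To illustrate the flavor of such an online first-order scheme (this is only an illustrative sketch, not the speaker's algorithm), consider the bilinear robust problem of choosing portfolio weights x on the simplex against a worst-case perturbation u in a unit ball: min_x max_{||u|| <= 1} <x, c + P u>. Alternating projected online gradient steps, using only gradient (first-order) oracles, drive the averaged iterates toward a saddle point; all names and parameters below are hypothetical.

```python
import numpy as np

def proj_simplex(v):
    """Euclidean projection of v onto the probability simplex."""
    u = np.sort(v)[::-1]
    css = np.cumsum(u)
    rho = np.nonzero(u * np.arange(1, len(v) + 1) > (css - 1))[0][-1]
    theta = (css[rho] - 1) / (rho + 1)
    return np.maximum(v - theta, 0.0)

def proj_ball(u, r=1.0):
    """Euclidean projection of u onto the ball of radius r."""
    n = np.linalg.norm(u)
    return u if n <= r else u * (r / n)

def robust_saddle(c, P, T=2000, eta=0.1):
    """Online gradient descent-ascent for min_x max_{||u||<=1} <x, c + P u>,
    with x on the simplex. Uses only first-order (gradient) information."""
    n, m = P.shape
    x = np.ones(n) / n          # uniform starting portfolio
    u = np.zeros(m)             # no perturbation initially
    x_avg = np.zeros(n)
    for t in range(1, T + 1):
        g_x = c + P @ u         # gradient of the payoff in x
        g_u = P.T @ x           # gradient of the payoff in u
        step = eta / np.sqrt(t) # diminishing step size
        x = proj_simplex(x - step * g_x)  # decision player descends
        u = proj_ball(u + step * g_u)     # adversary ascends
        x_avg += x
    return x_avg / T, u         # averaged decision, final perturbation
```

Each iteration costs only two matrix-vector products and two projections, in contrast to a pessimization oracle that would solve the inner maximization to optimality at every step.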
If time permits, we will discuss connections and applicability of our online framework in the context of joint estimation and optimization problems.
This is joint work with Nam Ho-Nguyen (Carnegie Mellon University, USA).
Fatma is an Associate Professor of Operations Research at the Tepper School of Business, Carnegie Mellon University. She is also affiliated with the Algorithms, Combinatorics and Optimization (ACO) PhD Program.