Non-convex and discrete optimization problems are at the heart of many algorithmic tasks that arise in machine learning and other computing applications, for example, learning mixtures of Gaussians and sparse dictionaries.
In a sequence of recent works, we show that a concrete polynomial-time meta-algorithm called sum-of-squares (SOS) achieves the strongest known provable guarantees for a wide range of these problems.
Remarkably, SOS achieves these guarantees without being tailored to problem specifics.
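To make the meta-algorithm concrete, the following is a minimal sketch of the standard degree-d SOS (Lasserre–Parrilo) relaxation for minimizing a polynomial p over a set defined by polynomial inequalities g_1 >= 0, ..., g_m >= 0; the formulations used in the works summarized here may differ in details, so this should be read as an illustration rather than the precise algorithm studied.

\[
\min_{x \,:\, g_1(x)\ge 0,\dots,g_m(x)\ge 0} p(x)
\;\ge\;
\max\Bigl\{\lambda \;:\; p - \lambda = \sigma_0 + \sum_{i=1}^{m} \sigma_i\, g_i,\;
\sigma_0,\dots,\sigma_m \text{ sums of squares},\; \deg(\sigma_i g_i)\le d \Bigr\}.
\]

The right-hand side is a lower bound certified by a degree-d SOS proof, and it can be computed (up to standard numerical caveats) in time roughly \(n^{O(d)}\) by semidefinite programming; raising the degree d gives a hierarchy of increasingly tight relaxations.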
We also show that, for a rich class of problems, the guarantees that SOS achieves are optimal within a restricted but very powerful model of computation.
This result leads to the strongest known unconditional lower bounds for NP-complete problems.
Taken together, these results point to a unified theory of efficient optimization centered around SOS, one that could change how we think about efficient computation in general and bring a conceptual clarity to the design of efficient algorithms that we had never anticipated.