Prof. Dr. Sebastian Pokutta (Georgia Institute of Technology)

Lazy Conditional Gradients through Simpler Oracles

Program / Abstract:
Conditional Gradient Descent methods are popular first-order methods for (smooth) constrained convex minimization and are heavily used in Machine Learning. Relying on a linear programming oracle, these methods often outperform projected gradient descent methods whenever projections onto the feasible region are expensive. Unfortunately, even these methods can suffer from prohibitive running times if the linear programming oracle itself is expensive, e.g., when the feasible region corresponds to a hard combinatorial optimization problem. In this talk, we will explore a general method to significantly speed up conditional gradient descent methods by replacing the linear programming oracle with a much easier and cheaper oracle, leading to real-world speedups of several orders of magnitude while maintaining identical theoretical convergence rates modulo (small!) constant factors. Moreover, we will further outline a conditionally accelerated lazy stochastic gradient descent method (CAL-SGD) that achieves optimal bounds in terms of required gradient evaluations and calls to the new oracle, matching those for the more complex linear programming oracle. (Based on joint works with Gábor Braun, George Lan, Yi Zhou, and Daniel Zink.)
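To make the oracle-replacement idea concrete, here is a minimal, hypothetical sketch of "lazifying" conditional gradients on a toy problem. It is not the authors' implementation: the feasible region (the probability simplex), all names (weak_separation, lazy_cg, phi, K), and the caching scheme are illustrative assumptions. The point is only that the expensive linear programming oracle is consulted rarely; most iterations are served by a cheap scan over previously seen vertices.

```python
# Sketch only: lazy conditional gradients on the probability simplex.
# Assumed/illustrative names: lp_oracle, weak_separation, lazy_cg, phi, K.
import numpy as np

def lp_oracle(grad):
    """True linear minimization oracle over the simplex: a vertex e_i."""
    v = np.zeros_like(grad)
    v[np.argmin(grad)] = 1.0
    return v

def weak_separation(grad, x, phi, cache, K=1.0):
    """Return a vertex improving the objective by at least phi/K, preferring
    cached vertices; return None as a certificate that none improves by phi."""
    for v in cache:                      # cheap: scan cached vertices first
        if grad @ (x - v) >= phi / K:
            return v
    v = lp_oracle(grad)                  # expensive call, made only rarely
    if grad @ (x - v) >= phi / K:
        cache.append(v)
        return v
    return None                          # no sufficiently improving vertex

def lazy_cg(f_grad, x0, iters=200, phi0=1.0):
    x, phi, cache = x0.copy(), phi0, []
    for t in range(iters):
        g = f_grad(x)
        v = weak_separation(g, x, phi, cache)
        if v is None:
            phi /= 2.0                   # tighten the dual gap estimate
            continue
        gamma = 2.0 / (t + 2.0)          # standard Frank-Wolfe step size
        x = (1.0 - gamma) * x + gamma * v
    return x

# Toy usage: minimize ||x - b||^2 over the simplex.
b = np.array([0.1, 0.7, 0.2])
x = lazy_cg(lambda x: 2.0 * (x - b), np.ones(3) / 3.0)
print(x)
```

In this sketch, replacing the exact minimizer with any vertex that improves "enough" (by phi/K) is what preserves the convergence rate up to a constant factor, while the vertex cache is what turns most oracle calls into cheap lookups.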

Time:
Monday, March 19, 2018, at 14:00

Location:
Technische Universität Berlin
Straße des 17. Juni 136
10623 Berlin
MA 041

Entered by Dorothea Kiefer (kiefer@math.tu-berlin.de)
