Abstract:

In this lecture series, we will discuss the main ideas of multilevel optimization techniques and their relation to classical multigrid theory. We will show how multilevel optimization methods for convex and non-convex minimization problems can be constructed and analyzed, and we will study the significant gains in convergence speed that can be achieved by multilevel minimization techniques.

Multilevel optimization techniques are also intimately linked to non-linear preconditioning. As it turns out, the minimization-based view on non-linear problems not only helps to design efficient preconditioners, but is also useful for the construction of globalization strategies.

In the last part of the lecture series, we will employ multilevel optimization techniques in the context of machine learning and discuss their benefits for the training of neural networks. Various numerical examples from phase-field models for fracture, non-linear elasticity, cardiac simulation, and deep learning will illustrate our findings.

Online lectures via Zoom. Registration by email to @email.