Ekaterina Lobacheva (HSE University); Maxim Kodryan (HSE University)
Thursday, December 2, 2021 - 17:00
Virtual event (Videobroadcast) - link for registration
Max-Planck-Institut für Mathematik in den Naturwissenschaften, 04103 Leipzig
This talk will discuss certain peculiarities of training neural networks with batch normalization and weight decay, a combination that has become common practice in recent years. It turns out that their combined use may result in surprising periodic behavior of the optimization dynamics: the training process regularly destabilizes, yet these destabilizations do not lead to complete divergence; instead, each one initiates a new period of training. We will examine the mechanism underlying this periodic behavior, both empirically and theoretically, and analyze the conditions under which it occurs in practice. We will also argue that the periodic behavior unifies two previously opposing perspectives on training with batch normalization and weight decay, namely the equilibrium presumption and the instability presumption.
submitted by Saskia Gutzschebauch (Saskia.Gutzschebauch@mis.mpg.de, 0341 9959 50)