Wednesday, December 1, 2021 - 10:00
Online Event
The talk takes place via Zoom: https://zoom.us/j/159082384
Forschungsseminar Mathematische Statistik
We undertake a thorough study of the non-asymptotic properties of vanilla generative adversarial networks (GANs). We derive theoretical guarantees for density estimation with GANs under a proper choice of the deep neural network classes representing generators and discriminators. In particular, we prove that the resulting estimate converges to the true density p* in terms of Jensen-Shannon (JS) divergence at the rate (log n / n)^(2β/(2β+d)), where n is the sample size and β determines the smoothness of p*. Moreover, we show that the obtained rate is minimax optimal (up to logarithmic factors) for the considered class of densities.
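The stated rate (log n / n)^(2β/(2β+d)) can be evaluated numerically to see how smoothness β and dimension d affect convergence. A minimal sketch; the function name `js_rate` and the sample values are illustrative, not from the talk:

```python
import math

def js_rate(n: int, beta: float, d: int) -> float:
    """Evaluate the rate (log n / n)^(2*beta/(2*beta + d))
    for JS-divergence density estimation, as stated in the abstract."""
    return (math.log(n) / n) ** (2 * beta / (2 * beta + d))

# Illustration (hypothetical values): smoother densities (larger beta)
# give faster rates; higher dimension d slows convergence.
for beta in (1.0, 2.0):
    for d in (1, 10):
        print(f"beta={beta}, d={d}, n=10^6: {js_rate(10**6, beta, d):.3e}")
```

The exponent 2β/(2β+d) approaches 1 as β grows (near-parametric rate) and shrinks toward 0 as d grows, the usual curse of dimensionality in nonparametric estimation.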
submitted by chschnei (christine.schneider@wias-berlin.de, 030 20372574)