Ivan Oseledets (Skoltech Moscow)
Thursday, March 15, 2018 - 10:00
MPI für Mathematik in den Naturwissenschaften Leipzig
Inselstr. 22, 04103 Leipzig, E1 05 (Leibniz-Saal), 1st floor
Deep neural networks and tensors are different forms of approximation of multivariate functions. In this talk, I will give an overview of our recent results on tensor and matrix analysis, deep learning, and their connections:
1) Desingularization of low-rank matrix manifolds (joint with V. Khrulkov)
2) The expressive power of recurrent neural networks (joint with V. Khrulkov and A. Novikov)
3) Universal adversarial examples and singular vectors (joint with V. Khrulkov)
4) Geometry score: a way to compare generative adversarial networks (joint with V. Khrulkov)
submitted by Saskia Gutzschebauch (Saskia.Gutzschebauch@mis.mpg.de, 0341 9959 50)