Dino Sejdinovic (University of Oxford)
Friday, May 25, 2018 - 10:15
SFB 1294 Colloquium, University of Potsdam
Karl-Liebknecht-Str. 24-25, 14476 Potsdam OT Golm, House 28, Room 0.108
Kernel embeddings of distributions, and the Maximum Mean Discrepancy (MMD) as the resulting probability metric, are useful tools for fully nonparametric hypothesis testing and for learning on distributional inputs, i.e., where labels are only observed at an aggregate level. I will give an overview of this framework and describe the use of large-scale approximations to kernel embeddings in the context of Bayesian approaches to learning on distributions and in the context of distributional covariate shift, e.g., where measurement noise on the training inputs differs from that on the testing inputs.
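As an illustration of the MMD mentioned in the abstract, the following is a minimal sketch (not from the talk) of the standard unbiased estimator of the squared MMD with a Gaussian kernel; the function names and bandwidth choice are purely illustrative.

```python
import numpy as np

def gaussian_kernel(x, y, sigma=1.0):
    # Gaussian (RBF) kernel matrix between the rows of x and y
    d2 = np.sum(x**2, 1)[:, None] + np.sum(y**2, 1)[None, :] - 2 * x @ y.T
    return np.exp(-d2 / (2 * sigma**2))

def mmd2_unbiased(x, y, sigma=1.0):
    # Unbiased estimate of the squared MMD between samples x and y
    kxx = gaussian_kernel(x, x, sigma)
    kyy = gaussian_kernel(y, y, sigma)
    kxy = gaussian_kernel(x, y, sigma)
    m, n = len(x), len(y)
    # Exclude diagonal terms to obtain the unbiased estimator
    term_xx = (kxx.sum() - np.trace(kxx)) / (m * (m - 1))
    term_yy = (kyy.sum() - np.trace(kyy)) / (n * (n - 1))
    return term_xx + term_yy - 2 * kxy.mean()

rng = np.random.default_rng(0)
same = mmd2_unbiased(rng.normal(size=(200, 2)), rng.normal(size=(200, 2)))
diff = mmd2_unbiased(rng.normal(size=(200, 2)),
                     rng.normal(2.0, 1.0, size=(200, 2)))
# Samples from the same distribution yield an MMD estimate near zero;
# a mean shift produces a clearly larger value.
print(same < diff)
```

In practice, the large-scale approximations discussed in the talk (e.g., random Fourier features) replace the exact kernel matrices above with low-dimensional explicit feature maps.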
submitted by Liv Heinecke (liv.heinecke@uni-potsdam.de, 0331-977-203137)