Adam Oberman: “Generalization Theory in Machine Learning” (Part 1/2)

Watch part 2/2 here: High Dimensional Hamilton-Jacobi PDEs Tutorials 2020

Adam Oberman, McGill University

Abstract: Statistical learning theory addresses the following question. Given a sample of data points and values of an unknown target function f, and a parameterized function (hypothesis) class H, can we find a function in H which best approximates f? Statistical learning theory has superficial similarities to classical approximation theory, …
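The question in the abstract is commonly formalized as empirical risk minimization: choose the hypothesis in H that minimizes the average loss on the sample. Below is a minimal sketch in Python, assuming a polynomial hypothesis class and squared loss; the target f, the noise level, the sample size, and the degree are illustrative choices, not taken from the talk.

import numpy as np

rng = np.random.default_rng(0)

# Sample: points x_i and noisy values y_i = f(x_i) + noise of an
# unknown target f (here f(x) = sin(x), chosen for illustration).
x = rng.uniform(-3, 3, size=50)
y = np.sin(x) + 0.1 * rng.normal(size=50)

# Hypothesis class H: polynomials of degree <= 5, parameterized by
# their coefficients. np.polyfit returns the least-squares fit, i.e.
# the empirical risk minimizer over H for squared loss.
coeffs = np.polyfit(x, y, deg=5)
h = np.poly1d(coeffs)

# Empirical risk: average squared loss of the chosen hypothesis
# on the sample it was fit to.
empirical_risk = np.mean((h(x) - y) ** 2)
print(f"empirical risk: {empirical_risk:.4f}")

How well h approximates f away from the sample, rather than on it, is exactly the generalization question the talk addresses.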