Bachelor
2025/2026
Statistical Learning Theory
Type:
Elective course (Applied Mathematics and Information Science)
Delivered by:
Big Data and Information Retrieval School
When:
3rd year, modules 1 and 2
Open to:
students of all HSE University campuses
Instructors:
Bruno Frederik Bauwens
Language:
English
Course Syllabus
Abstract
This course studies mathematical explanations for the learning ability of important machine learning algorithms such as support vector machines, AdaBoost, and overparameterized neural networks. The course has three parts. The first is about online learning: important ideas and techniques, such as margins and the bias-complexity trade-off, are introduced in a simpler, probability-free setting. In the lectures on multi-armed bandits, probability theory is reviewed (this is also important in reinforcement learning). The second part presents the main definitions (VC dimension and Rademacher complexity) and proves the main risk bounds, using various measure concentration results. In the third part, the theory is used to derive margin risk bounds. These are applied to SVM classification, AdaBoost, and implicit regularization in overparameterized neural networks. Neural tangent kernels are introduced, and the connection with kernel SVMs is explained. See http://wiki.cs.hse.ru/Statistical_learning_theory_2023 for the full lecture notes.