Bachelor's Programme 2025/2026

Statistical Learning Theory

When taught: 3rd year, modules 1 and 2
Audience: open to all HSE University campuses
Language: English

Course Syllabus

Abstract

This course studies mathematical explanations for the learning ability of important machine learning algorithms such as support vector machines, AdaBoost, and overparameterized neural networks. The course has three parts. The first is about online learning: important ideas and techniques, such as margins and the bias-complexity trade-off, are introduced in a simpler, probability-free setting. In the lectures on multi-armed bandits, probability theory is reviewed (this is also important in reinforcement learning). The second part presents the main definitions (VC-dimension and Rademacher complexity) and proves the main risk bounds using various measure-concentration results. In the third part, the theory is used to derive margin risk bounds, which are applied to SVM classification, AdaBoost, and implicit regularization in overparameterized neural networks. Neural tangent kernels are introduced, and their connection with kernel SVM is explained. See http://wiki.cs.hse.ru/Statistical_learning_theory_2023 for the full lecture notes.
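
For orientation, here is a sketch of a central definition and bound from the second part, in standard textbook notation (e.g., as in Mohri, Rostamizadeh, and Talwalkar); the exact symbols and constants in the course's lecture notes may differ.

% Empirical Rademacher complexity of a class H on a sample S = (z_1, ..., z_m),
% where the sigma_i are independent uniform {-1, +1} (Rademacher) variables:
\[
  \widehat{\mathfrak{R}}_S(H) \;=\; \mathbb{E}_{\sigma}\!\left[\, \sup_{h \in H} \frac{1}{m} \sum_{i=1}^{m} \sigma_i\, h(z_i) \right].
\]
% One standard risk bound of the kind proved in part two: for a loss taking
% values in [0, 1], with probability at least 1 - delta over the draw of S,
% uniformly over all h in H,
\[
  R(h) \;\le\; \widehat{R}_S(h) \;+\; 2\,\mathfrak{R}_m(H) \;+\; \sqrt{\frac{\ln(1/\delta)}{2m}},
\]
% where R(h) is the true risk, \widehat{R}_S(h) the empirical risk on S, and
% \mathfrak{R}_m(H) = \mathbb{E}_S[\widehat{\mathfrak{R}}_S(H)] the expected
% Rademacher complexity. The margin bounds of part three refine this by
% replacing \widehat{R}_S(h) with a margin loss and rescaling the complexity term.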