Source: https://mostafa-samir.github.io/
Author: Mostafa Samir
Machine Learning Theory – Part 1: Introduction
- Motivation
- Who is this Series for?
- Prerequisites
- Cautions
- Formalizing the Learning Problem
- The Target Function
- The Hypothesis
- The Loss Function
- The Generalization Error
- Is the Learning Problem Solvable?
- References and Additional Readings
Machine Learning Theory – Part 2: Generalization Bounds
- Independently, and Identically Distributed
- The Law of Large Numbers
- Hoeffding’s Inequality
- Generalization Bound: 1st Attempt
- Examining the Independence Assumption
- The Symmetrization Lemma
- The Growth Function
- The VC-Dimension
- The VC Generalization Bound
- Distribution-Based Bounds
- One Inequality to Rule Them All
- References and Additional Readings
Machine Learning Theory – Part 3: Regularization and the Bias-variance Trade-off
- Why are rich hypotheses bad?
- The Bias-variance Decomposition
- Taming the Rich
- References and Additional Readings
Original article: https://www.52ml.net/21357.html