Some (fun) facts about AdaBoost
→ The weak learners in AdaBoost are decision trees with a single split, called decision stumps (see the sketch after this list).
→ AdaBoost works by putting more weight on difficult-to-classify instances and less on those the ensemble already handles well.
→ AdaBoost algorithms can be used for both classification and regression problems.
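
As a quick illustration (not part of the original card), here is a minimal sketch using scikit-learn, assuming a recent version where the estimator keyword is `estimator` (older releases call it `base_estimator`); the toy dataset and hyperparameter values are illustrative only:

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import AdaBoostClassifier
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

# Toy binary-classification data (illustrative, not from the original card).
X, y = make_classification(n_samples=500, n_features=10, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# Each weak learner is a decision stump: a tree with a single split.
stump = DecisionTreeClassifier(max_depth=1)

# AdaBoost fits the stumps in sequence; after each round it increases the
# weights of the training instances that are still misclassified, so later
# stumps focus on the hard cases.
clf = AdaBoostClassifier(estimator=stump, n_estimators=100, random_state=0)
clf.fit(X_train, y_train)
print("test accuracy:", clf.score(X_test, y_test))
```

For the regression case, scikit-learn provides `AdaBoostRegressor`, which applies an analogous boosting scheme to regression losses.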
Updated 2021-02-20
Tags: Data Science