Learn Before
Concept
Tuning Parameters in Boosting
There are 3 tuning parameters when using Boosting:
- The first is the number of trees, denoted by B. Cross-validation is used to select B. If B is too large, boosting can overfit the data, albeit slowly.
- The second tuning parameter is the shrinkage parameter λ, which controls the rate at which boosting learns. λ is typically a small positive number; if it is very small, a large value of B may be required for the model to perform well.
- The third tuning parameter is d, the number of splits in each tree, which controls the complexity of the boosted ensemble. If d = 1, each tree consists of a single split, so each model uses at most one variable (such a tree is known as a stump). d is also called the interaction depth, since it controls the interaction order of the model.
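The three parameters can be illustrated with a minimal boosting sketch for regression. The code below is an assumption-laden toy implementation, not a production library: it uses stumps (d = 1) as the base learners, fits each stump to the current residuals, and accumulates predictions scaled by the shrinkage λ over B rounds. The function names `fit_stump` and `boost` are made up for this example.

```python
import numpy as np

def fit_stump(X, y):
    """Fit a single-split regression tree (d = 1) by minimizing squared error."""
    best = None
    for j in range(X.shape[1]):
        for t in np.unique(X[:, j]):
            left = X[:, j] <= t
            if left.all() or (~left).all():
                continue  # split must separate the data
            pred = np.where(left, y[left].mean(), y[~left].mean())
            err = ((y - pred) ** 2).sum()
            if best is None or err < best[0]:
                best = (err, j, t, y[left].mean(), y[~left].mean())
    _, j, t, lval, rval = best
    return lambda Z: np.where(Z[:, j] <= t, lval, rval)

def boost(X, y, B=100, lam=0.1):
    """Boost B stumps with shrinkage lam; each stump fits the residuals."""
    f = np.zeros(len(y))
    stumps = []
    for _ in range(B):
        s = fit_stump(X, y - f)   # learn from what the ensemble still gets wrong
        f += lam * s(X)           # shrink each stump's contribution by lam
        stumps.append(s)
    return lambda Z: lam * sum(s(Z) for s in stumps)
```

With a very small λ, the residuals shrink by only a small factor per round, which is why B must grow as λ decreases. In a library such as scikit-learn these three parameters correspond (roughly) to `n_estimators`, `learning_rate`, and `max_depth`.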
Updated 2020-02-28
Tags
Data Science