Learn Before
Concept

Tuning Parameters in Boosting

There are 3 tuning parameters when using Boosting:

  1. The first is the number of trees, denoted by B. Cross-validation is used to select B; if B is too large, boosting can overfit the data, although this overfitting tends to occur slowly.
  2. The second tuning parameter is the shrinkage parameter λ, which controls the rate at which boosting learns. It is typically a small positive value; if λ is very small, a very large B may be required for good performance.
  3. The third tuning parameter is d, the number of splits in each tree, which controls the complexity of the boosted model. If d = 1, each tree consists of a single split and involves at most one variable (such a tree is known as a stump). More generally, d is called the ‘interaction depth’ and controls the interaction order of the model, since d splits can involve at most d variables.
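The three parameters above can be made concrete with a small sketch of gradient boosting for regression using stumps (d = 1). This is a minimal, from-scratch illustration in pure Python, not a production implementation; the function names, toy data, and default values (B = 200, λ = 0.1) are illustrative assumptions.

```python
def fit_stump(x, residuals):
    """Find the single split on x (d = 1, a stump) that best fits the
    residuals in a least-squares sense. Returns (threshold, left_mean,
    right_mean), or None if no valid split exists."""
    best, best_err = None, float("inf")
    for t in sorted(set(x)):
        left = [r for xi, r in zip(x, residuals) if xi <= t]
        right = [r for xi, r in zip(x, residuals) if xi > t]
        if not left or not right:
            continue
        lm, rm = sum(left) / len(left), sum(right) / len(right)
        err = (sum((r - lm) ** 2 for r in left)
               + sum((r - rm) ** 2 for r in right))
        if err < best_err:
            best_err, best = err, (t, lm, rm)
    return best

def boost(x, y, B=200, lam=0.1):
    """Fit B stumps sequentially. Each stump is fit to the current
    residuals, and its contribution is shrunk by lam before the
    residuals are updated -- so a smaller lam learns more slowly
    and typically needs a larger B."""
    residuals = list(y)
    stumps = []
    for _ in range(B):
        stump = fit_stump(x, residuals)
        if stump is None:
            break
        t, lm, rm = stump
        stumps.append(stump)
        # Subtract the shrunken stump prediction from each residual.
        residuals = [r - lam * (lm if xi <= t else rm)
                     for xi, r in zip(x, residuals)]
    return stumps

def predict(stumps, xi, lam=0.1):
    """The boosted prediction is the shrunken sum of all B stumps."""
    return sum(lam * (lm if xi <= t else rm) for t, lm, rm in stumps)

# Toy data: y is roughly a step function of x.
x = [1, 2, 3, 4, 5, 6, 7, 8]
y = [1.0, 1.1, 0.9, 1.0, 3.0, 3.1, 2.9, 3.0]
stumps = boost(x, y, B=200, lam=0.1)
```

With enough rounds the combined stumps recover the step structure, so `predict(stumps, 1)` lands near 1.0 and `predict(stumps, 8)` near 3.0; shrinking λ toward zero slows this convergence, which is why a small λ pairs with a large B.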


Updated 2020-02-28

Tags

Data Science