Concept

Boosting

Boosting is another approach that can be used for both classification and regression. Boosting improves the predictions of a decision tree by learning slowly. Unlike bagging, boosting works sequentially: each tree is grown using information from the trees created before it, and is fit to a modified version of the original data set. Boosting combines a large number of trees (\hat{f}^{1}, ..., \hat{f}^{B}). Each tree is fitted to the current residuals of the model rather than to the response (Y); the fitted tree is then added (with shrinkage) to the running function \hat{f}, and the residuals are updated by subtracting its contribution. Each new tree therefore slowly reduces the error left by the previous ones, improving \hat{f} particularly in regions where it performs poorly. Trees used in boosting tend to be very small.
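The sequential fit-to-residuals loop described above can be sketched in NumPy. This is a minimal illustration, not a production implementation: it assumes one-dimensional inputs, uses depth-1 trees (stumps) as the small trees, and the shrinkage parameter `lam` and number of trees `B` are illustrative defaults, not values from the text.

```python
import numpy as np

def fit_stump(x, r):
    """Fit a depth-1 regression tree (stump) to residuals r: pick the
    single split threshold that minimizes the sum of squared errors."""
    xs = np.unique(x)
    thresholds = (xs[:-1] + xs[1:]) / 2  # midpoints between observed values
    best = None
    for t in thresholds:
        left, right = r[x <= t], r[x > t]
        sse = ((left - left.mean()) ** 2).sum() + ((right - right.mean()) ** 2).sum()
        if best is None or sse < best[0]:
            best = (sse, t, left.mean(), right.mean())
    _, t, left_mean, right_mean = best
    return t, left_mean, right_mean

def predict_stump(stump, x):
    t, left_mean, right_mean = stump
    return np.where(x <= t, left_mean, right_mean)

def boost(x, y, B=200, lam=0.1):
    """Boosting for regression: each stump is fit to the current residuals,
    then the residuals are updated by subtracting the shrunk fit."""
    r = y.copy()          # residuals start as the response itself
    stumps = []
    for _ in range(B):
        s = fit_stump(x, r)
        r = r - lam * predict_stump(s, x)  # slow learning via shrinkage lam
        stumps.append(s)
    return stumps

def predict(stumps, x, lam=0.1):
    """The boosted model is the shrunken sum of all fitted stumps."""
    return sum(lam * predict_stump(s, x) for s in stumps)
```

For example, fitting `boost(x, y)` on a smooth curve such as `y = x**2` and comparing `predict(stumps, x)` against `y` shows the training error shrinking as more stumps are added, since each stump targets what the earlier ones left unexplained.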

Updated 2021-04-14

Tags

Data Science
