
Learning Decision Trees

The standard greedy algorithm for building a decision tree involves three steps:

  1. Start with an empty tree

  2. Split on the feature with the highest information gain

  3. Recurse with step 2 on each resulting subset, stopping when a node is pure or no features remain

Finding a minimally sized tree that minimizes training error is an extremely hard (NP-hard) problem. To approximate the optimal solution, we build the tree greedily: at each node we split on the feature that yields the highest information gain, i.e., the feature whose split most reduces our uncertainty about the classification.
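The greedy procedure above can be sketched in Python. This is a minimal ID3-style illustration, not the exact algorithm from any particular library; the function names (`entropy`, `information_gain`, `build_tree`) and the dictionary-based tree representation are assumptions made for this example.

```python
from collections import Counter
from math import log2

def entropy(labels):
    """Shannon entropy of a list of class labels."""
    n = len(labels)
    return -sum((c / n) * log2(c / n) for c in Counter(labels).values())

def information_gain(rows, labels, feature):
    """Reduction in label entropy from splitting on `feature`."""
    n = len(labels)
    # Partition the labels by the feature's value.
    groups = {}
    for row, label in zip(rows, labels):
        groups.setdefault(row[feature], []).append(label)
    # Remaining entropy is the size-weighted entropy of the children.
    remainder = sum(len(g) / n * entropy(g) for g in groups.values())
    return entropy(labels) - remainder

def build_tree(rows, labels, features):
    """Greedily split on the highest-gain feature, then recurse (steps 2-3)."""
    # Base cases: pure node, or no features left -> majority-label leaf.
    if len(set(labels)) == 1 or not features:
        return Counter(labels).most_common(1)[0][0]
    best = max(features, key=lambda f: information_gain(rows, labels, f))
    remaining = [f for f in features if f != best]
    # Group the examples by the chosen feature's value.
    partitions = {}
    for row, label in zip(rows, labels):
        sub_rows, sub_labels = partitions.setdefault(row[best], ([], []))
        sub_rows.append(row)
        sub_labels.append(label)
    return {best: {value: build_tree(sub_rows, sub_labels, remaining)
                   for value, (sub_rows, sub_labels) in partitions.items()}}
```

On a toy dataset where `windy` perfectly predicts the label while `outlook` carries no information, the greedy split correctly picks `windy`:

```python
rows = [{"outlook": "sunny", "windy": False},
        {"outlook": "sunny", "windy": True},
        {"outlook": "rain",  "windy": False},
        {"outlook": "rain",  "windy": True}]
labels = ["yes", "no", "yes", "no"]
tree = build_tree(rows, labels, ["outlook", "windy"])
# tree == {"windy": {False: "yes", True: "no"}}
```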


Updated 2021-02-26

Tags

Data Science
