Learning Decision Trees
The algorithm for building a decision tree involves 3 steps:
1. Start with an empty tree
2. Split on the feature with the highest information gain
3. Recurse from step 2 on each resulting subset
Creating a minimally sized tree that minimizes training error is an extremely hard (NP-hard) problem. To approximate the optimal solution, we build the tree greedily, splitting at each node on the feature that yields the highest information gain, i.e., the feature whose split most reduces uncertainty about the classification.
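The greedy procedure above can be sketched in plain Python. This is an illustrative ID3-style sketch for categorical features, not a production implementation; the function and variable names are chosen here for clarity and are not from the original text:

```python
import math
from collections import Counter

def entropy(labels):
    """Shannon entropy of a label sequence, in bits."""
    n = len(labels)
    return -sum((c / n) * math.log2(c / n) for c in Counter(labels).values())

def information_gain(rows, labels, feature):
    """Reduction in entropy from splitting on `feature` (a column index)."""
    n = len(labels)
    groups = {}
    for row, label in zip(rows, labels):
        groups.setdefault(row[feature], []).append(label)
    weighted = sum(len(g) / n * entropy(g) for g in groups.values())
    return entropy(labels) - weighted

def build_tree(rows, labels, features):
    """Greedy recursion: at each node, split on the feature with the
    highest information gain, then recurse on each subset."""
    # Leaf: all labels agree, or no features left -> majority class
    if len(set(labels)) == 1 or not features:
        return Counter(labels).most_common(1)[0][0]
    best = max(features, key=lambda f: information_gain(rows, labels, f))
    tree = {"feature": best, "children": {}}
    remaining = [f for f in features if f != best]
    for value in set(row[best] for row in rows):
        subset = [(r, l) for r, l in zip(rows, labels) if r[best] == value]
        sub_rows = [r for r, _ in subset]
        sub_labels = [l for _, l in subset]
        tree["children"][value] = build_tree(sub_rows, sub_labels, remaining)
    return tree
```

For example, on a toy dataset where a single feature perfectly separates the two classes, `information_gain` equals the parent entropy (1 bit), and the tree is a single split with two pure leaves.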
Tags
Data Science
Related
Trees vs. Linear Models
Advantages to Using Decision Trees
Disadvantages to Using Decision Trees
Types of decision trees
Learning Decision Trees
Approaches for improving decision trees' predictions
Decision trees applied to regression and classification problems
Decision Tree Terms
Post pruning decision trees with cost complexity pruning
Scikit learn key decision tree parameters
Decision tree key parameters
Gradient Boosted Decision Trees
Find the Accuracy Score of a Decision Tree