Decision Tree Selection and Pruning Based on Complexity


Halfway down the page at the site linked below are the code and the example. The author selected 0.011 as the complexity penalty for their decision tree. I don't understand why they didn't use 0.29 instead, which gives a more parsimonious tree with a comparable error rate. I thought you were supposed to pick the simpler of two models with comparable error, since it should generalize better out of sample. I'm guessing this is just an oversight in the example and the author didn't realize what they did.
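
To make the selection rule I have in mind concrete, here is a rough Python/scikit-learn sketch (this is not the linked author's code, and the dataset and the one-standard-error rule are my own assumptions): prune over a range of complexity penalties and, among all penalties whose cross-validated error is within one standard error of the minimum, take the largest one, i.e. the simplest tree.

```python
# Minimal sketch (not the linked author's code): cost-complexity pruning with
# scikit-learn, choosing the largest penalty whose cross-validated error is
# within one standard error of the minimum. The dataset is an illustrative
# assumption.
import numpy as np
from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import cross_val_score
from sklearn.tree import DecisionTreeClassifier

X, y = load_breast_cancer(return_X_y=True)

# Candidate complexity penalties (alphas) from the pruning path of a full tree.
path = DecisionTreeClassifier(random_state=0).cost_complexity_pruning_path(X, y)
alphas = path.ccp_alphas[:-1]  # drop the last alpha, which prunes down to the root

# Cross-validated error for each candidate penalty.
mean_err, se_err = [], []
for a in alphas:
    scores = cross_val_score(
        DecisionTreeClassifier(ccp_alpha=a, random_state=0), X, y, cv=10
    )
    err = 1.0 - scores
    mean_err.append(err.mean())
    se_err.append(err.std(ddof=1) / np.sqrt(len(err)))

mean_err, se_err = np.array(mean_err), np.array(se_err)

# One-standard-error rule: among penalties whose error is within one SE of the
# minimum, take the largest penalty, i.e. the most parsimonious tree.
best = mean_err.argmin()
threshold = mean_err[best] + se_err[best]
chosen_alpha = alphas[mean_err <= threshold].max()

print(f"min-error alpha: {alphas[best]:.4f}, 1-SE alpha: {chosen_alpha:.4f}")
```

That rule is what I expected the author to follow, which is why 0.011 rather than the larger penalty with comparable error surprised me.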