It is important to note that although this paper calls into question the value of learning biases that penalize complexity, in no way does it provide support for learning biases that encourage complexity for its own sake. C4.5X only grafts new nodes onto a decision tree when there is empirical support for doing so.
Nor do the results in any way argue against the appropriate use of decision tree pruning. To generate its pruned trees, C4.5 removes a branch when a statistical estimate of the upper bound on its error rate indicates that the error rate will not increase if the branch is replaced by a leaf. It could be argued that C4.5 thus only reduces complexity when there is empirical support for doing so. It is interesting to note that for eight of the thirteen data sets examined, C4.5X's post-processing of the pruned trees resulted in higher average predictive accuracy than its post-processing of the unpruned trees. These results suggest that both pruning and grafting can play a valuable role when applied appropriately.
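To make the pruning criterion concrete, the following sketch illustrates an error-based pruning decision of the kind C4.5 uses. The function names are hypothetical, and the Wilson-score formula is a standard normal approximation standing in for C4.5's exact inversion of the binomial distribution; z = 0.6745 corresponds to C4.5's default 25% confidence level.

```python
import math

def upper_error_bound(errors: int, n: int, z: float = 0.6745) -> float:
    """One-sided upper confidence limit on the true error rate of a node
    that misclassified `errors` of its `n` training cases.
    Wilson-score approximation; C4.5 itself inverts the exact binomial."""
    if n == 0:
        return 1.0
    f = errors / n
    adjusted = (f + z * z / (2 * n)
                + z * math.sqrt(f * (1 - f) / n + z * z / (4 * n * n)))
    return adjusted / (1 + z * z / n)

def should_prune(leaf_errors: int, n: int,
                 branches: list[tuple[int, int]]) -> bool:
    """Replace a subtree with a leaf when the predicted errors of the
    single leaf do not exceed the summed predicted errors of the
    subtree's branches. `branches` holds (errors, cases) per branch."""
    leaf_predicted = n * upper_error_bound(leaf_errors, n)
    subtree_predicted = sum(cases * upper_error_bound(errs, cases)
                            for errs, cases in branches)
    return leaf_predicted <= subtree_predicted
```

Because the upper bound penalizes small samples, a subtree whose branches each cover few cases can be assigned more predicted errors in total than a single leaf over the pooled cases, triggering a prune even when the subtree makes the same number of training errors.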