The theoretical and experimental objections to the Occam thesis do not appear to have greatly diminished the machine learning community's use of Occam's razor. This paper seeks to reinforce those objections with robust and general experimental counter-evidence. To this end it presents a systematic procedure for increasing the complexity of inferred decision trees without altering their performance on the training data. The procedure takes the form of a post-processor for decision trees produced by C4.5 [Quinlan, 1993]. Applying this procedure to a range of learning tasks from the UCI repository [Murphy and Aha, 1993] is shown to result, on average, in increased predictive accuracy when the inferred decision trees are applied to previously unseen data.
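To make the central idea concrete, the following is a minimal sketch of how a tree's complexity can be increased without changing its behaviour on any input: every leaf is replaced by a redundant test node whose two children carry the same class label. The tree representation and the `redundant_split` helper are illustrative assumptions for this sketch, not the paper's actual post-processing procedure.

```python
def predict(tree, x):
    """Evaluate a tree: leaves are class labels; internal nodes are
    (feature_index, threshold, left_subtree, right_subtree) tuples."""
    while isinstance(tree, tuple):
        feature, threshold, left, right = tree
        tree = left if x[feature] <= threshold else right
    return tree

def redundant_split(tree, feature=0, threshold=0.0):
    """Replace every leaf with a test node whose two children hold the
    same label: predictions are unchanged, but the tree grows."""
    if not isinstance(tree, tuple):
        return (feature, threshold, tree, tree)
    f, t, left, right = tree
    return (f, t,
            redundant_split(left, feature, threshold),
            redundant_split(right, feature, threshold))

def node_count(tree):
    """Count all nodes (internal nodes and leaves) in the tree."""
    if not isinstance(tree, tuple):
        return 1
    return 1 + node_count(tree[2]) + node_count(tree[3])

# Example: a small tree, and a strictly larger but behaviourally
# identical tree produced from it.
tree = (0, 0.5, "a", (1, 1.0, "b", "c"))
bigger = redundant_split(tree)
assert all(predict(tree, x) == predict(bigger, x)
           for x in ([0.2, 0.5], [0.9, 0.5], [0.9, 2.0]))
assert node_count(bigger) > node_count(tree)
```

Because training-set accuracy is identical for the two trees, any difference in accuracy on unseen data between such pairs bears directly on the Occam thesis.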