An algorithm, IDC, was previously proposed [Nock Gascuel1995] for building decision committees. It proceeds in two stages. The first stage builds a potentially large set of different rules, each of which is itself a DC with a single rule. The second stage gradually clusters these decision committees, using the property that the union of two DCs with different rules is still a DC. At the end of this procedure, the user obtains a set of DCs, from which the most accurate one is chosen and returned. Experimental results display the ability of IDC to build small DCs. In this paper, we provide an algorithm for learning decision committees with a different structure, since it builds only one DC. More precisely, WIDC is a three-stage algorithm. It first builds a set of rules derived from results on boosting decision trees [Schapire Singer1998]. It then calculates the vectors using a scheme derived from Ranking loss boosting [Schapire Singer1998]. It finally prunes the resulting DC using one of two possible schemes: a natural pruning which we call ``pessimistic pruning'', and a pruning based on local convergence results [Kearns Mansour1998], which we call ``optimistic pruning''. The default vector is always chosen to be the observed distribution of the ambiguously classified examples.
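To make the DC representation concrete, the following is a minimal sketch of how a decision committee could classify an example: each rule pairs a condition with a vote vector, the vectors of all firing rules are summed, and the default vector is used when no rule fires. The data structures and names here are illustrative assumptions, not the actual representation used by IDC or WIDC.

```python
# Illustrative sketch of classification with a decision committee (DC).
# A DC is assumed here to be a list of (condition, vote_vector) pairs.

def classify(dc, default_vector, example):
    """Sum the vote vectors of all rules whose condition fires on the
    example; fall back to the default vector when no rule fires."""
    n_classes = len(default_vector)
    totals = [0.0] * n_classes
    fired = False
    for condition, vector in dc:
        if condition(example):
            fired = True
            totals = [t + v for t, v in zip(totals, vector)]
    if not fired:
        # No rule matched: use the default vector (e.g. the observed
        # distribution of the ambiguously classified examples).
        totals = list(default_vector)
    # The predicted class is the index with the largest total vote.
    return max(range(n_classes), key=lambda c: totals[c])

# Toy usage: two rules over one numeric feature, two classes.
dc = [
    (lambda x: x[0] > 0.5, [1.0, -1.0]),   # votes mainly for class 0
    (lambda x: x[0] > 0.8, [-0.5, 2.0]),   # votes mainly for class 1
]
default = [0.2, 0.8]

print(classify(dc, default, [0.6]))  # only the first rule fires -> 0
print(classify(dc, default, [0.9]))  # both rules fire -> 1
print(classify(dc, default, [0.1]))  # no rule fires, default decides -> 1
```

This also illustrates the clustering property exploited by IDC: concatenating two such rule lists with different conditions yields another object of the same form, i.e. the union of two DCs is still a DC.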