ISIPTA'11, Compiègne

Lev V. Utkin, Andrea Wiencierz

An imprecise boosting-like approach to regression


This paper presents a generalization of ensemble methods for regression that are based on variants of the basic AdaBoost algorithm. The generalization consists of restricting the unit simplex for the instance weights to a smaller set of weighting probabilities. The proposed algorithms cover the standard AdaBoost-based regression algorithms and standard regression as special cases. Various imprecise statistical models can be used to obtain the restricted set of probabilities. One advantage of the proposed algorithms over the basic AdaBoost-based regression methods is that they are less prone to overfitting, because the weights of the hard instances are restricted. Simulations and applications also indicate better performance of the proposed generalized methods.
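One plausible reading of the weight restriction is a linear-vacuous (ε-contamination) shrinkage of the AdaBoost.R2-style updated weights toward the uniform distribution: with ε = 1 the update is the plain AdaBoost one, and with ε = 0 the weights stay uniform (standard regression). The sketch below illustrates this idea only; the function name, the choice of the uniform distribution as the center, and the parameter `eps` are assumptions, not the authors' exact construction.

```python
import numpy as np

def boost_weights_linear_vacuous(losses, weights, eps=0.5):
    """One AdaBoost.R2-style reweighting step, with the updated weights
    shrunk toward the uniform distribution to mimic restricting the unit
    simplex to a linear-vacuous mixture set (illustrative sketch only).
    eps=1 recovers the plain AdaBoost update; eps=0 keeps uniform weights."""
    losses = np.asarray(losses, dtype=float)
    losses = losses / losses.max()          # scale losses to [0, 1]
    avg_loss = np.dot(weights, losses)      # weighted average loss
    beta = avg_loss / (1.0 - avg_loss)      # AdaBoost.R2 confidence term
    q = weights * beta ** (1.0 - losses)    # standard multiplicative update
    q /= q.sum()                            # back onto the unit simplex
    n = len(weights)
    # Restrict to the linear-vacuous set around the uniform distribution,
    # which caps how extreme the weights of hard instances can become:
    return (1.0 - eps) / n + eps * q
```

With small ε, even instances with persistently large losses cannot dominate the weight vector, which is the stated mechanism for the reduced tendency to overfit.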


Keywords: regression, AdaBoost, algorithm, linear-vacuous mixture model, Kolmogorov-Smirnov bounds

