G. Tsoumakas, I. Vlahavas, “Distributed Data Mining of Large Classifier Ensembles”, Proc. (companion volume) 2nd Hellenic Conference on AI (SETN '02), I. Vlahavas, C. Spyropoulos (Eds.), pp. 249-255, Thessaloniki, Greece, 2002.
Author(s): Grigorios Tsoumakas, I. Vlahavas
Abstract: Nowadays, classifier ensembles are often used for distributed data mining in order to discover knowledge from inherently distributed information sources and to scale up learning to very large databases. One of the most successful methods for combining multiple classifiers is Stacking. However, this method suffers from very high computational cost when the number of distributed nodes is large. This paper presents a new classifier combination strategy that scales up efficiently and achieves both high predictive accuracy and tractability on problems of high complexity. It induces a global model by learning from the averages of the local classifiers' output. In this way, fast and effective combination of a large number of classifiers is achieved.
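The combination strategy the abstract describes can be sketched as follows. This is a minimal illustration, not the paper's implementation: the data set, partitioning, and choice of base and meta-level learners are all assumptions. Each hypothetical node trains a local classifier on its own partition; the global (meta-level) model then learns from the per-instance average of the local classifiers' class-probability outputs, so the meta-level feature size stays fixed rather than growing with the number of nodes as in plain Stacking.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.tree import DecisionTreeClassifier
from sklearn.linear_model import LogisticRegression

rng = np.random.RandomState(0)
X, y = make_classification(n_samples=600, n_features=10, random_state=0)

# Hold out some data for training the global (meta-level) model
X_eval, y_eval = X[-100:], y[-100:]

# Split the remaining training data across hypothetical distributed nodes
n_nodes = 5
parts = np.array_split(rng.permutation(len(X) - 100), n_nodes)

# Each node trains a local classifier on its own partition
local_models = [DecisionTreeClassifier(random_state=0).fit(X[idx], y[idx])
                for idx in parts]

# Average the local classifiers' class-probability outputs on the
# held-out set: one fixed-size averaged vector per instance, regardless
# of how many nodes contributed
avg_probs = np.mean([m.predict_proba(X_eval) for m in local_models], axis=0)

# The global model learns from the averages
global_model = LogisticRegression().fit(avg_probs, y_eval)

# To classify a new instance: average the local outputs, then apply
# the global model
x_new = X[:1]
avg_new = np.mean([m.predict_proba(x_new) for m in local_models], axis=0)
prediction = global_model.predict(avg_new)
```

Because the averaged vector has one entry per class, not one per classifier, adding more nodes changes only the number of `predict_proba` calls to average, not the dimensionality of the meta-level learning problem.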