This paper presents a thorough investigation of boosting methods applied to associative classifiers, focusing on three innovative voting strategies: AdaBoost, evidence weighting, and a hybrid approach. The research underscores the motivation for improving classification systems' performance and examines the effectiveness of these techniques through extensive experiments. Key insights into their advantages, limitations, and potential applications in various classification problems are discussed. The study aims to pave the way for future enhancements in associative classification methodologies.
Boosting an Associative Classifier. Presenter: Chien-Shing Chen. Authors: Yanmin Sun, Yang Wang, Andrew K.C. Wong. 2006, TKDE
Outline • Motivation • Objective • Introduction • Weight Strategies for Voting • Experiments • Conclusions • Personal Opinion
Motivation • Boosting is a general method for improving the performance of any learning algorithm. • There is no reported work on boosting associative classifiers.
Objective • Describe three strategies for voting among multiple classifiers when boosting an HPWR classification system: • AdaBoost • evidence weight • hybrid • Analyze the features of these three strategies
Weighting Strategies for Voting • Let ε denote the weighted training error at each iteration. • The weight of evidence provided by x in favor of yi, as opposed to the other class values, is based on P(x ∩ yi) / P(yi), i.e., the conditional probability P(x | yi).
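The two weighting schemes above can be sketched as follows. This is a minimal illustration, not the paper's exact formulation: the function names are invented here, the AdaBoost vote weight uses the standard α = ½ ln((1 − ε)/ε) rule, and the weight of evidence is taken as the log-odds ratio log[P(x | yi) / P(x | ¬yi)], derived from the joint and marginal probabilities.

```python
import math

def adaboost_vote_weight(eps):
    """Standard AdaBoost weight for a classifier whose weighted training
    error is eps: alpha = 0.5 * ln((1 - eps) / eps).
    More accurate classifiers (smaller eps) get larger votes."""
    return 0.5 * math.log((1.0 - eps) / eps)

def weight_of_evidence(p_x_and_yi, p_yi, p_x):
    """Weight of evidence of observation x in favor of class yi versus
    all other classes, as a log-odds ratio:
        W = log[ P(x | yi) / P(x | not yi) ]
    using P(x | yi) = P(x ∩ yi) / P(yi) and
    P(x | not yi) = (P(x) - P(x ∩ yi)) / (1 - P(yi))."""
    p_x_given_yi = p_x_and_yi / p_yi
    p_x_given_not_yi = (p_x - p_x_and_yi) / (1.0 - p_yi)
    return math.log(p_x_given_yi / p_x_given_not_yi)
```

A classifier with error 0.5 (coin-flipping) gets vote weight 0, and a positive weight of evidence means x makes class yi more likely than it was a priori.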
[Figure: boosting iterations t = 1 and t = 2, highlighting examples x1 and x4]
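The iteration figure can be illustrated with a short sketch of how boosting reweights training examples between rounds. The update and normalization below follow standard AdaBoost, not necessarily the paper's exact variant, and the weights and error value are made up for the example:

```python
import math

def reweight(weights, correct, eps):
    """One AdaBoost round: misclassified examples are multiplied by
    exp(alpha), correctly classified ones by exp(-alpha), then all
    weights are renormalized to sum to 1.  `correct[i]` is True if
    example i was classified correctly; `eps` is the weighted error
    of this round's classifier."""
    alpha = 0.5 * math.log((1.0 - eps) / eps)
    new = [w * math.exp(-alpha if ok else alpha)
           for w, ok in zip(weights, correct)]
    z = sum(new)  # normalization constant
    return [w / z for w in new]

# Four examples; x1 and x4 (indices 0 and 3) are misclassified, so
# eps = 0.1 + 0.1 = 0.2 (the total weight of the misclassified examples).
w0 = [0.1, 0.4, 0.4, 0.1]
w1 = reweight(w0, [False, True, True, False], eps=0.2)
# w1 ≈ [0.25, 0.25, 0.25, 0.25]: the misclassified x1 and x4 now carry
# more weight, so the next round's classifier focuses on them.
```

A standard property of this update is that, after renormalization, the misclassified examples together carry exactly half of the total weight.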
Opinion • Drawback • lacks handling of qualification at the class level (predicting attributes) • Application • any classification problem • Future Work • a fuller description of the weight of evidence • a fourth strategy