
Boosting an Associative Classifier

This paper presents a thorough investigation of boosting methods applied to associative classifiers, focusing on three weighting strategies for voting multiple classifiers: AdaBoost weighting, evidence weighting, and a hybrid of the two. The research underscores the motivation for improving the performance of classification systems and examines the effectiveness of these techniques through extensive experiments. Key insights into their advantages, limitations, and potential applications across classification problems are discussed. The study aims to pave the way for future enhancements in associative classification methodologies.

Presentation Transcript


  1. Boosting an Associative Classifier Presenter: Chien-Shing Chen Authors: Yanmin Sun, Yang Wang, and Andrew K.C. Wong (TKDE, 2006)

  2. Outline • Motivation • Objective • Introduction • Weight Strategies for Voting • Experiments • Conclusions • Personal Opinion

  3. Motivation • Boosting is a general method for improving the performance of any learning algorithm. • There is no reported work on boosting associative classifiers.

  4. Objective • Describe three strategies for voting multiple classifiers in boosting an HPWR (High-order Pattern and Weight-of-evidence Rule) classification system • AdaBoost • Evidence weight • Hybrid • Analyze the features of these three strategies (a vote-combination sketch follows below)
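
To make the contrast among the three strategies concrete, here is a minimal sketch of how a committee of boosted classifiers might combine votes under each scheme. It illustrates the general ideas only and is not the authors' HPWR implementation; the names adaboost_vote, evidence_vote, hybrid_vote, alphas and woes are ours, and the hybrid shown (scaling each round's weight of evidence by its AdaBoost voting weight) is just one plausible combination.

    from collections import defaultdict

    def adaboost_vote(predictions, alphas):
        # AdaBoost weighting: round t casts one vote for its predicted class,
        # weighted by alpha_t = 0.5 * ln((1 - eps_t) / eps_t).
        scores = defaultdict(float)
        for y_t, alpha_t in zip(predictions, alphas):
            scores[y_t] += alpha_t
        return max(scores, key=scores.get)

    def evidence_vote(woes):
        # Evidence weighting: round t reports a weight of evidence for every
        # class; the per-class evidence is summed and the largest total wins.
        scores = defaultdict(float)
        for per_class in woes:                 # e.g. {"yes": 1.2, "no": -0.4}
            for label, w in per_class.items():
                scores[label] += w
        return max(scores, key=scores.get)

    def hybrid_vote(alphas, woes):
        # Hybrid: each round's weight of evidence is scaled by its AdaBoost
        # voting weight before the per-class totals are compared.
        scores = defaultdict(float)
        for alpha_t, per_class in zip(alphas, woes):
            for label, w in per_class.items():
                scores[label] += alpha_t * w
        return max(scores, key=scores.get)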

  5. Weighting Strategies for Voting • Let ε denote the weighted training error at each iteration. • The weight of evidence provided by x in favor of y_i, as opposed to the other class values, builds on the conditional probability P(x ∩ y_i) / P(y_i) = P(x | y_i).
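
For reference, the usual AdaBoost voting weight computed from the weighted training error, the conditional probability recoverable from the slide's fragment, and the standard log-likelihood-ratio form of the weight of evidence can be written as follows; the notation is ours and follows common AdaBoost and weight-of-evidence conventions rather than the slide verbatim.

    \alpha_t = \frac{1}{2}\,\ln\frac{1-\varepsilon_t}{\varepsilon_t},
    \qquad
    P(x \mid y_i) = \frac{P(x \cap y_i)}{P(y_i)},
    \qquad
    W\!\left(y_i / \bar{y}_i \mid x\right) = \log\frac{P(x \mid y_i)}{P(x \mid \bar{y}_i)}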

  6. [Slide figure: sample weights across boosting iterations t = 1 and t = 2, with samples x1 through x4]
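
The figure on this slide shows how sample weights shift between rounds. Below is a minimal sketch of that reweighting step using the standard AdaBoost update; the assumption that x1 is the sample misclassified in round t = 1 is ours, since the figure's actual values are not recoverable.

    import math

    def reweight(weights, correct, eps):
        # Standard AdaBoost update: misclassified samples gain weight,
        # correctly classified samples lose weight, then renormalise.
        alpha = 0.5 * math.log((1 - eps) / eps)
        updated = [w * math.exp(-alpha if ok else alpha)
                   for w, ok in zip(weights, correct)]
        z = sum(updated)
        return [w / z for w in updated]

    # Round t = 1: uniform weights over x1..x4; suppose only x1 is misclassified.
    w1 = [0.25, 0.25, 0.25, 0.25]
    correct = [False, True, True, True]
    eps = sum(w for w, ok in zip(w1, correct) if not ok)   # weighted error = 0.25
    w2 = reweight(w1, correct, eps)
    # w2 ≈ [0.5, 0.167, 0.167, 0.167]: x1 carries half the weight in round t = 2.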

  7. Experiments

  8. Experiments

  9. Experiments

  10. Opinion • Drawback: lacks handling of class-level (predicting-attribute) qualification • Application: any classification problem • Future work: a fuller description of the weight of evidence, and a fourth voting strategy
