
Classification based on Association Rules



  1. Classification based on Association Rules

  2. Introduction
  • Association rules were originally designed for finding correlated items in transaction data.
  • However, they can easily be adapted for classification.
  • How?

  3. Example
  • Attributes: Sepal Length (SL); Sepal Width (SW); Petal Length (PL); Petal Width (PW)
  • Discretize the numeric attributes into “Large” (L), “Medium” (M), and “Small” (S) values.
  • Now apply association rule mining to find patterns of the form: <feature-set> → class label
  • Rank the rules first by confidence and then by support.
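The mining-and-ranking step above can be sketched in plain Python. The discretized records below are hypothetical stand-ins for binned Iris data (the slide does not give concrete rows), and `mine_class_rules` with its `min_support` and `max_len` parameters is an illustrative helper, not a named algorithm from the slides:

```python
from itertools import combinations

# Hypothetical discretized records: (feature set, class label).
# SL/SW/PL/PW take values L (large), M (medium), S (small) after binning.
data = [
    ({"SL=S", "SW=M", "PL=S", "PW=S"}, "Setosa"),
    ({"SL=S", "SW=L", "PL=S", "PW=S"}, "Setosa"),
    ({"SL=M", "SW=M", "PL=M", "PW=M"}, "Versicolor"),
    ({"SL=M", "SW=S", "PL=M", "PW=M"}, "Versicolor"),
    ({"SL=L", "SW=M", "PL=L", "PW=L"}, "Virginica"),
    ({"SL=L", "SW=M", "PL=L", "PW=M"}, "Virginica"),
]

def mine_class_rules(data, min_support=2, max_len=2):
    """Enumerate <feature-set> -> class-label rules and score them."""
    rules = []
    items = sorted({f for feats, _ in data for f in feats})
    labels = sorted({c for _, c in data})
    for k in range(1, max_len + 1):
        for itemset in combinations(items, k):
            body = set(itemset)
            n_body = sum(1 for feats, _ in data if body <= feats)
            if n_body < min_support:        # prune infrequent bodies
                continue
            for label in labels:
                n_both = sum(1 for feats, c in data
                             if body <= feats and c == label)
                if n_both == 0:
                    continue
                rules.append((itemset, label,
                              n_both / n_body,      # confidence
                              n_both / len(data)))  # support
    # Rank by confidence first, then support, as on the slide.
    rules.sort(key=lambda r: (r[2], r[3]), reverse=True)
    return rules

for body, label, conf, sup in mine_class_rules(data)[:5]:
    print(f"{set(body)} -> {label}  conf={conf:.2f} sup={sup:.2f}")
```

On this toy data, a rule such as <SL=L> → Virginica surfaces with confidence 1.0, which is exactly the kind of high-confidence pattern the ranking step is meant to promote.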

  4. Integration with Bayes Classifier
  • The frequent itemsets generated by the frequent-itemset mining algorithm can be used as features and integrated into a Bayes classifier.
  • Suppose <f1,f2> is a frequent itemset in all transactions projected on class 1 (C1).
  • E.g., <f1,f2> appears in 20% of the transactions of C1 but only 5% of the transactions of C2.
  • Then <f1,f2> is a good candidate feature to try out in the Bayes classifier.
  • [This is part of the assignment]
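A minimal sketch of the Bayes update this slide describes, using its 20%/5% figures. The equal class priors are an assumption not stated on the slide, and `posterior` is an illustrative helper for a single binary "contains <f1,f2>" feature, not a full classifier:

```python
def posterior(p_feat_given, priors, observed=True):
    """P(class | feature) by Bayes' rule for one binary feature."""
    likelihood = {c: (p if observed else 1 - p)
                  for c, p in p_feat_given.items()}
    unnorm = {c: likelihood[c] * priors[c] for c in priors}
    z = sum(unnorm.values())
    return {c: v / z for c, v in unnorm.items()}

p_feat_given = {"C1": 0.20, "C2": 0.05}  # itemset frequency per class
priors = {"C1": 0.5, "C2": 0.5}          # assumed equal class priors

post = posterior(p_feat_given, priors, observed=True)
print(post)  # observing <f1,f2> makes C1 four times as likely as C2
```

With equal priors the posterior works out to P(C1|feature) = 0.8 versus P(C2|feature) = 0.2, illustrating why such an itemset is a good candidate feature.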

  5. Integration with Bayesian Classifier
  • Suppose we have <SL=L, PW=M> as a frequent feature for Virginica.
  • Should we also have <SL=L> and <PW=M> as separate features?
  • What are the pros and cons?
