
Bayes Net Classifiers: The Naïve Bayes Model


Presentation Transcript


  1. Bayes Net Classifiers: The Naïve Bayes Model. Oliver Schulte, Machine Learning 726

  2. Classification • Suppose we have a target node V such that all queries of interest are of the form P(V = v | values for all other variables). • Example: predict whether a patient has bronchitis given values for all other nodes. • Because we know the form of the query, we can optimize the Bayes net. • V is called the class variable. • v is called the class label. • The other variables are called features.

  3. Optimizing the Structure • Some nodes are irrelevant to a target node, given the others. • Examples: can you guess the pattern? • Answer: the nodes that stay relevant are exactly the target's Markov blanket, defined on the next slide.

  4. The Markov Blanket • The Markov blanket of a node contains: • The neighbors (the node's parents and children). • The spouses (co-parents: the other parents of the node's children).
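
As an illustration of the definition, here is a minimal sketch of extracting a node's Markov blanket, assuming a hypothetical representation of the DAG as a dict mapping each node to the list of its parents (the graph and the helper name below are invented for illustration, not from the slides):

    def markov_blanket(node, parents):
        """Return the Markov blanket of `node` in a DAG.

        `parents` maps each node to a list of its parent nodes.
        The blanket = the node's parents + its children + the other
        parents of its children (the spouses / co-parents).
        """
        blanket = set(parents[node])                        # parents
        children = [c for c, ps in parents.items() if node in ps]
        blanket.update(children)                            # children
        for child in children:                              # co-parents
            blanket.update(p for p in parents[child] if p != node)
        return blanket

    # Hypothetical toy graph in the spirit of the bronchitis example:
    g = {"Smoking": [], "LungCancer": ["Smoking"],
         "Bronchitis": ["Smoking"], "Dyspnea": ["Bronchitis", "LungCancer"]}
    print(markov_blanket("Bronchitis", g))
    # {'Smoking', 'Dyspnea', 'LungCancer'} (in some order)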

  5. How to Build a Bayes Net Classifier • Eliminate nodes that are not in the Markov blanket of the class variable (this is a form of feature selection). • Learn the parameters. • Result: fewer dimensions!

  6. The Naïve Bayes Model

  7. Classification Models • A Bayes net is a very general probability model. • Sometimes we want to use more specific models: • More intelligible for some users. • Models make assumptions: if the assumptions are correct, learning works better. • A widely used Bayes net-type classifier: Naïve Bayes.

  8. The Naïve Bayes Model • Given the class label, the features are independent. • Intuition: the only way in which features interact is through the class label. • Also: we don't care about correlations among features. [Diagram: class node PlayTennis with feature nodes Outlook, Temperature, Humidity, Wind]
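
In symbols, this is just the slide's assumption written out, with C the class and X_1, ..., X_n the features:

    P(X_1, ..., X_n | C) = P(X_1 | C) × P(X_2 | C) × ... × P(X_n | C)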

  9. The Naive Bayes Classification Model • Exercise: use the Naive Bayes assumption to find a simple expression for P(PlayTennis = yes | o, t, w, h). • Solution: multiply the relevant conditional probabilities (the numbers in each column of the model's tables), then divide by P(o, t, w, h).
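
Written out, the solution the slide describes is Bayes' rule plus the independence assumption:

    P(yes | o, t, w, h) = P(yes) × P(o | yes) × P(t | yes) × P(w | yes) × P(h | yes) / P(o, t, w, h)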

  10. Example Normalization: P(PT = yes | features) = 0.0053 / (0.0053 + 0.0206) ≈ 20.5%, where 0.0053 and 0.0206 are the unnormalized scores for yes and no respectively.
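
A quick check of this arithmetic (the two scores are taken from the slide):

    # Unnormalized joint scores for PlayTennis = yes / no.
    score_yes, score_no = 0.0053, 0.0206
    p_yes = score_yes / (score_yes + score_no)   # normalize so the posteriors sum to 1
    print(f"P(PlayTennis = yes | features) = {p_yes:.1%}")   # -> 20.5%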

  11. Naive Bayes Learning • Use maximum likelihood estimates, i.e. observed frequencies. • Only a linear number of parameters! • Example: see the previous slide. • Weka.NaiveBayesSimple uses Laplace estimation. • As another refinement, feature selection can be performed first. • Boosting can also be applied to Naive Bayes learning; the result is very competitive. [Diagram: class node PlayTennis with feature nodes Outlook, Temperature, Humidity, Wind]
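
A minimal sketch of this learning step, assuming hypothetical training data given as (feature-dict, label) pairs; setting laplace=0 gives the plain observed-frequency (maximum likelihood) estimates, while laplace=1 gives Laplace smoothing in the style the slide attributes to Weka's NaiveBayesSimple:

    from collections import Counter, defaultdict

    def train_naive_bayes(examples, laplace=1.0):
        """Estimate naive Bayes parameters from (features_dict, label) pairs."""
        class_counts = Counter(label for _, label in examples)
        counts = defaultdict(Counter)   # counts[f][(v, c)] = freq of value v with class c
        values = defaultdict(set)       # values[f] = set of values seen for feature f
        for feats, label in examples:
            for f, v in feats.items():
                counts[f][(v, label)] += 1
                values[f].add(v)

        n = len(examples)
        prior = {c: k / n for c, k in class_counts.items()}   # class priors
        cond = {}                                             # P(feature = value | class)
        for f in counts:
            for v in values[f]:
                for c in class_counts:
                    cond[(f, v, c)] = ((counts[f][(v, c)] + laplace)
                                       / (class_counts[c] + laplace * len(values[f])))
        return prior, cond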

  12. Ratio/Odds Classification Formula • If we only care about classification, we can ignore the normalization constant. • Ratios of feature probabilities give more numeric stability. • Exercise: use the Naive Bayes assumption to find a simple expression for the posterior odds P(class = yes | features) / P(class = no | features). • Product = 0.26, see examples.xlsx. • Positive or negative? The odds are below 1, so the negative class wins.
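
The expression the exercise asks for: by Bayes' rule the evidence term P(features) cancels from the ratio, leaving

    P(yes | features) / P(no | features) = P(yes) / P(no) × Π_i P(x_i | yes) / P(x_i | no)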

  13. Log-Odds Formula • For even more numeric stability, use logs. • Intuitive interpretation: each feature “votes” for a class, then we add up the votes. • Sum = -1.36, see examples.xlsx. • Positive or negative? The sum is negative, so the negative class wins (consistent with the odds of 0.26 on the previous slide). • Linear discriminant: add up the feature terms, accept the positive class if the sum is > 0.
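
A sketch of the resulting linear discriminant, reusing the hypothetical prior and cond tables returned by train_naive_bayes in the earlier sketch (it assumes every feature value was seen in training):

    import math

    def log_odds(feats, prior, cond, pos="yes", neg="no"):
        """Sum of log 'votes': one prior term plus one term per feature."""
        total = math.log(prior[pos] / prior[neg])
        for f, v in feats.items():
            total += math.log(cond[(f, v, pos)] / cond[(f, v, neg)])
        return total

    def classify(feats, prior, cond):
        # Linear discriminant: accept the positive class iff the log-odds sum > 0.
        return "yes" if log_odds(feats, prior, cond) > 0 else "no"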
