
CS498-EA Reasoning in AI Lecture #9






Presentation Transcript


  1. CS498-EA Reasoning in AI, Lecture #9 Instructor: Eyal Amir Fall Semester 2011

  2. Previously • First-Order Logic • Syntax: Well-Formed Formulas • Semantics: Models, Satisfaction, Entailment • Models of FOL: how many, sometimes unexpected • Resolution in FOL • Resolution rule • Unification • Clausal form • Completeness, Soundness

  3. Pop Quiz (5 min) • Question: how many non-equivalent propositional theories of n variables are there? • After you’re done: Discussion
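One standard way to arrive at the answer after the discussion: a theory over n variables is determined, up to logical equivalence, by its set of models, and there are 2^n truth assignments, so the count is 2^(2^n). A minimal Python sketch of this counting argument (the function name is illustrative):

```python
def count_nonequivalent_theories(n):
    """Number of propositional theories over n variables, up to equivalence.

    A theory is determined (up to logical equivalence) by its set of models,
    i.e. by a subset of the 2**n truth assignments, so the count is 2**(2**n).
    """
    num_models = 2 ** n
    return 2 ** num_models

for n in range(4):
    print(n, count_nonequivalent_theories(n))   # 0 -> 2, 1 -> 4, 2 -> 16, 3 -> 256
```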

  4. Today • Probabilistic graphical models • Bayesian Networks • Markov Random Fields • Next week: Treewidth methods: • Variable elimination • Clique tree algorithm

  5. Summary of Our Goal

  6. Probability • A sample space Ω is the set of outcomes of a random experiment • A probability P is a function from a sigma-field A on Ω (the events, e.g., all measurable subsets) to [0,1] • A random variable X is a function X: Ω → R such that for every Borel set B ⊆ R, the preimage X⁻¹(B) is in A
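A minimal concrete instance of these definitions, as a Python sketch (the die-roll example and names are illustrative, not from the slides): the sample space is the six outcomes, every subset is an event, and a random variable is just a function on outcomes.

```python
from fractions import Fraction

# Sample space for one fair die roll; the event algebra is the power set.
Omega = {1, 2, 3, 4, 5, 6}

def P(event):
    """Probability of an event (a subset of Omega) under the uniform measure."""
    assert event <= Omega
    return Fraction(len(event), len(Omega))

# A random variable is a function X: Omega -> R, e.g. "is the roll even?"
def X(outcome):
    return 1 if outcome % 2 == 0 else 0

# P(X = 1) is the probability of the preimage X^{-1}({1}).
print(P({w for w in Omega if X(w) == 1}))   # 1/2
```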

  7. Independent Random Variables • Two variables X and Y are independent if • P(X = x | Y = y) = P(X = x) for all values x, y • That is, learning the value of Y does not change the prediction of X • If X and Y are independent then • P(X,Y) = P(X|Y)P(Y) = P(X)P(Y) • In general, if X1,…,Xp are independent, then P(X1,…,Xp) = P(X1)···P(Xp) • Requires only O(p) parameters (rather than exponentially many for a full joint)
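A quick illustration of the parameter savings, as a sketch (the numbers and names below are made up for the example): a full joint over p binary variables needs 2^p − 1 free parameters, while fully independent variables need only one P(Xi = 1) each.

```python
from itertools import product

import numpy as np

p = 3
marginals = np.array([0.2, 0.5, 0.9])            # P(Xi = 1) for each variable

def joint_prob(assignment):
    """P(X1=x1, ..., Xp=xp) = prod_i P(Xi = xi) under full independence."""
    xs = np.array(assignment)
    return np.where(xs == 1, marginals, 1 - marginals).prod()

# Sanity check: the factored probabilities over all 2**p assignments sum to 1.
print(sum(joint_prob(a) for a in product([0, 1], repeat=p)))   # 1.0 (up to float error)
```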

  8. Conditional Independence • Unfortunately, most random variables of interest are not independent of each other • A more suitable notion is that of conditional independence • Two variables X and Y are conditionally independent given Z if • P(X = x | Y = y, Z = z) = P(X = x | Z = z) for all values x, y, z • That is, learning the value of Y does not change the prediction of X once we know the value of Z • Notation: I(X, Y | Z)
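The definition can be checked numerically on a small joint table. The sketch below builds a table that satisfies I(X, Y | Z) by construction, via P(x, y, z) = P(z) P(x|z) P(y|z), and then verifies the defining equation (the table values are made up for illustration):

```python
import itertools

import numpy as np

P_Z = np.array([0.3, 0.7])
P_X_given_Z = np.array([[0.9, 0.1], [0.2, 0.8]])   # rows indexed by z
P_Y_given_Z = np.array([[0.5, 0.5], [0.6, 0.4]])

joint = np.zeros((2, 2, 2))                        # axes: x, y, z
for x, y, z in itertools.product(range(2), repeat=3):
    joint[x, y, z] = P_Z[z] * P_X_given_Z[z, x] * P_Y_given_Z[z, y]

# Verify P(X = x | Y = y, Z = z) == P(X = x | Z = z) for all x, y, z.
P_XZ = joint.sum(axis=1)                           # P(x, z)
for x, y, z in itertools.product(range(2), repeat=3):
    lhs = joint[x, y, z] / joint[:, y, z].sum()    # P(x | y, z)
    rhs = P_XZ[x, z] / P_XZ[:, z].sum()            # P(x | z)
    assert abs(lhs - rhs) < 1e-9
print("I(X, Y | Z) holds in this table")
```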

  9. Example: Family Trees (Pedigree) • Noisy stochastic process (figure: Homer, Marge, Bart, Lisa, Maggie) • A node represents an individual's genotype • Modeling assumption: • Ancestors can affect descendants' genotypes only by passing genetic material through intermediate generations

  10. Markov Assumption (figure: an ancestor, parents Y1 and Y2, node X, non-descendants, a descendant) • We now make this independence assumption more precise for directed acyclic graphs (DAGs) • Each random variable X is independent of its non-descendants, given its parents Pa(X) • Formally, I(X, NonDesc(X) | Pa(X))

  11. Markov Assumption Example (network over Burglary, Earthquake, Radio, Alarm, Call) • In this example: • I(E, B) • I(B, {E, R}) • I(R, {A, B, C} | E) • I(A, R | B, E) • I(C, {B, E, R} | A)
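The statements listed above are exactly the local Markov statements I(X, NonDesc(X) | Pa(X)) for each node. The sketch below derives them from the graph structure; the edge list is the standard Burglary/Earthquake network (Burglary → Alarm, Earthquake → Alarm, Earthquake → Radio, Alarm → Call) and is assumed to match the figure. Here non-descendants exclude the node itself and its parents, as in the slide's list.

```python
parents = {
    "B": set(), "E": set(),
    "R": {"E"},
    "A": {"B", "E"},
    "C": {"A"},
}

def descendants(node):
    """All nodes reachable from `node` along directed edges."""
    children = {n: {c for c, ps in parents.items() if n in ps} for n in parents}
    stack, seen = list(children[node]), set()
    while stack:
        c = stack.pop()
        if c not in seen:
            seen.add(c)
            stack.extend(children[c])
    return seen

for x in parents:
    nondesc = set(parents) - {x} - descendants(x) - parents[x]
    print(f"I({x}, {sorted(nondesc)} | {sorted(parents[x])})")
```

Running this prints, e.g., I(E, ['B'] | []) and I(C, ['B', 'E', 'R'] | ['A']), matching the slide.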

  12. I-Maps • A DAG G is an I-Map of a distribution P if all Markov assumptions implied by G are satisfied by P (assuming G and P are over the same set of random variables) • Examples (figure: two graphs over X and Y)
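A small numeric illustration of the definition (the joint tables below are made up): over two variables, the graph with no edge asserts I(X, Y), so it is an I-Map only of distributions that satisfy that independence, while a graph with an edge between X and Y asserts nothing and is trivially an I-Map of any distribution over X and Y.

```python
import numpy as np

def is_independent(joint, tol=1e-12):
    """True iff the normalized 2-D joint table satisfies P(X, Y) = P(X) P(Y)."""
    px = joint.sum(axis=1, keepdims=True)
    py = joint.sum(axis=0, keepdims=True)
    return np.allclose(joint, px * py, atol=tol)

P_indep = np.outer([0.4, 0.6], [0.1, 0.9])          # X and Y independent
P_dep = np.array([[0.4, 0.1],
                  [0.1, 0.4]])                       # X and Y dependent

# Empty graph (asserts I(X, Y)): I-Map of P_indep but not of P_dep.
print(is_independent(P_indep), is_independent(P_dep))   # True False
# Graph with edge X -> Y (asserts no independencies): I-Map of both.
```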

  13. Factorization • Given that G is an I-Map of P, can we simplify the representation of P? • Example (figure: X and Y): • Since I(X, Y), we have that P(X|Y) = P(X) • Applying the chain rule, P(X,Y) = P(X|Y) P(Y) = P(X) P(Y) • Thus, we have a simpler representation of P(X,Y)
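The same simplification can be seen on a concrete table: when I(X, Y) holds, the joint is the outer product of the marginals, which takes fewer parameters to store (the numbers below are illustrative).

```python
import numpy as np

P_X = np.array([0.3, 0.7])
P_Y = np.array([0.25, 0.25, 0.5])

joint = np.outer(P_X, P_Y)                   # P(X, Y) = P(X) P(Y)

# Storing the joint directly needs 2*3 - 1 = 5 free parameters;
# the factored form needs (2 - 1) + (3 - 1) = 3.
assert np.allclose(joint.sum(axis=1), P_X)   # marginalizing recovers P(X)
assert np.allclose(joint.sum(axis=0), P_Y)   # and P(Y)
print(joint)
```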

  14. THE END
