
Markov Logic



Presentation Transcript


  1. Markov Logic Parag Singla Dept. of Computer Science University of Texas, Austin

  2. Overview • Motivation • Background • Markov logic • Inference • Learning • Applications

  3. The Interface Layer [Diagram: layer stack, top to bottom: Applications, Interface Layer, Infrastructure]

  4. Networking [Diagram: Applications: WWW, Email; Interface Layer: Internet; Infrastructure: Protocols, Routers]

  5. Databases [Diagram: Applications: ERP, CRM, OLTP; Interface Layer: Relational Model; Infrastructure: Transaction Management, Query Optimization]

  6. Artificial Intelligence [Diagram: Applications: Planning, Robotics, NLP, Multi-Agent Systems, Vision; Interface Layer: (open question); Infrastructure: Representation, Inference, Learning]

  7. Artificial Intelligence [Diagram: Applications: Planning, Robotics, NLP, Multi-Agent Systems, Vision; Interface Layer: First-Order Logic?; Infrastructure: Representation, Inference, Learning]

  8. Artificial Intelligence [Diagram: Applications: Planning, Robotics, NLP, Multi-Agent Systems, Vision; Interface Layer: Graphical Models?; Infrastructure: Representation, Inference, Learning]

  9. Artificial Intelligence [Diagram: Applications: Planning, Robotics, NLP, Multi-Agent Systems, Vision; Interface Layer: Statistical + Logical AI; Infrastructure: Representation, Inference, Learning]

  10. Artificial Intelligence [Diagram: Applications: Planning, Robotics, NLP, Multi-Agent Systems, Vision; Interface Layer: Markov Logic; Infrastructure: Representation, Inference, Learning]

  11. Overview • Motivation • Background • Markov logic • Inference • Learning • Applications

  12. Markov Networks • Undirected graphical models • Potential functions defined over cliques [Figure: network over Smoking, Cancer, Asthma, Cough]

  13. Markov Networks • Undirected graphical models • Log-linear model: P(x) = (1/Z) exp(Σ_i w_i f_i(x)), where w_i is the weight of feature i and f_i(x) is feature i [Figure: network over Smoking, Cancer, Asthma, Cough]
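
The log-linear form on this slide can be made concrete with a small brute-force sketch over the four variables shown. The two features and their weights below are illustrative assumptions, not values from the slides.

```python
import math
from itertools import product

# Each feature f_i maps a world (dict of variable -> 0/1) to 0 or 1.
# These clique features and weights are hypothetical, for illustration.
features = [
    lambda x: int(x["Smoking"] and x["Cancer"]),  # clique {Smoking, Cancer}
    lambda x: int(x["Asthma"] and x["Cough"]),    # clique {Asthma, Cough}
]
weights = [1.5, 1.1]  # illustrative weights w_i

variables = ["Smoking", "Cancer", "Asthma", "Cough"]

def score(x):
    """Unnormalized log-probability: sum_i w_i * f_i(x)."""
    return sum(w * f(x) for w, f in zip(weights, features))

# The partition function Z sums exp(score) over all 2^4 worlds.
worlds = [dict(zip(variables, bits)) for bits in product([0, 1], repeat=4)]
Z = sum(math.exp(score(x)) for x in worlds)

def prob(x):
    """P(x) = (1/Z) exp(sum_i w_i f_i(x))."""
    return math.exp(score(x)) / Z
```

Worlds that activate more (or heavier) features get proportionally higher probability; the exponential enumeration for Z is what later slides on inference work around.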

  14. First-Order Logic • Constants, variables, functions, predicates • Anna, x, MotherOf(x), Friends(x,y) • Grounding: Replace all variables by constants • Friends(Anna, Bob) • Formula: Predicates connected by logical operators • Smokes(x) ⇒ Cancer(x) • Knowledge Base (KB): A set of formulas • Can be equivalently converted into clausal form • World: Assignment of truth values to all ground predicates
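
Grounding is mechanical enough to sketch in a few lines. The tuple representation of ground atoms is an assumption made for illustration; predicate and constant names follow the slide.

```python
from itertools import product

constants = ["Anna", "Bob"]

def ground(predicate, arity):
    """All groundings of a predicate: replace each variable by every constant."""
    return [(predicate,) + args for args in product(constants, repeat=arity)]

# A binary predicate over 2 constants has 2^2 = 4 groundings.
friends_groundings = ground("Friends", 2)
smokes_groundings = ground("Smokes", 1)
```

A world is then simply a truth assignment to every tuple these calls produce.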

  15. Overview • Motivation • Background • Markov logic • Inference • Learning • Applications

  16. Markov Logic • A logical KB is a set of hard constraints on the set of possible worlds • Let's make them soft constraints: when a world violates a formula, it becomes less probable, not impossible • Give each formula a weight (higher weight ⇒ stronger constraint)

  17. Definition • A Markov Logic Network (MLN) is a set of pairs (F, w) where • F is a formula in first-order logic • w is a real number • Together with a finite set of constants, it defines a Markov network with • One node for each grounding of each predicate in the MLN • One feature for each grounding of each formula F in the MLN, with the corresponding weight w
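
A minimal sketch of this definition, assuming a toy encoding of formulas as strings with a single variable x; the weight 1.5 is illustrative.

```python
from itertools import product

constants = ["A", "B"]  # Ana, Bob
predicates = {"Smokes": 1, "Cancer": 1, "Friends": 2}

# One node per grounding of each predicate.
nodes = [(p,) + args
         for p, arity in predicates.items()
         for args in product(constants, repeat=arity)]

# MLN: (formula, weight) pairs; the weight is illustrative.
mln = [("Smokes(x) => Cancer(x)", 1.5)]

# One weighted feature per grounding of each formula,
# carrying the formula's weight.
features = [(formula.replace("x", c), w)
            for formula, w in mln
            for c in constants]
```

With two constants this yields 8 nodes (2 Smokes + 2 Cancer + 4 Friends) and 2 ground features, matching the networks drawn on the next slides.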

  18. Example: Friends & Smokers • Smoking causes cancer: ∀x Smokes(x) ⇒ Cancer(x) • Friends have similar smoking habits: ∀x,y Friends(x,y) ⇒ (Smokes(x) ⇔ Smokes(y))

  19. Example: Friends & Smokers

  20. Example: Friends & Smokers Two constants: Ana (A) and Bob (B)

  21. Example: Friends & Smokers Two constants: Ana (A) and Bob (B) Smokes(A) Smokes(B) Cancer(A) Cancer(B)

  22. Example: Friends & Smokers Two constants: Ana (A) and Bob (B) Friends(A,B) Friends(A,A) Smokes(A) Smokes(B) Friends(B,B) Cancer(A) Cancer(B) Friends(B,A)

  23. Example: Friends & Smokers Two constants: Ana (A) and Bob (B) Friends(A,B) Friends(A,A) Smokes(A) Smokes(B) Friends(B,B) Cancer(A) Cancer(B) Friends(B,A)

  24. Example: Friends & Smokers Two constants: Ana (A) and Bob (B) Friends(A,B) Friends(A,A) Smokes(A) Smokes(B) Friends(B,B) Cancer(A) Cancer(B) Friends(B,A)

  25. Example: Friends & Smokers • State of the World: a {0,1} assignment to the nodes • Two constants: Ana (A) and Bob (B) Friends(A,B) Friends(A,A) Smokes(A) Smokes(B) Friends(B,B) Cancer(A) Cancer(B) Friends(B,A)

  26. Markov Logic Networks • MLN is a template for ground Markov networks • Probability of a world x: P(x) = (1/Z) exp(Σ_i w_i n_i(x)) • One feature for each ground formula

  27. Markov Logic Networks • MLN is a template for ground Markov nets • Probability of a world x: P(x) = (1/Z) exp(Σ_i w_i n_i(x)), where w_i is the weight of formula i and n_i(x) is the number of true groundings of formula i in x
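
The world probability can be checked by brute force on a toy domain. The single rule Smokes(x) ⇒ Cancer(x) and its weight 1.5 are illustrative; the Friends atoms are omitted to keep the enumeration to 2^4 worlds.

```python
import math
from itertools import product

constants = ["A", "B"]
atoms = [f"{p}({c})" for p in ("Smokes", "Cancer") for c in constants]
w = 1.5  # illustrative weight for Smokes(x) => Cancer(x)

def n(world):
    """n_i(x): number of true groundings of Smokes(x) => Cancer(x)."""
    return sum(1 for c in constants
               if (not world[f"Smokes({c})"]) or world[f"Cancer({c})"])

worlds = [dict(zip(atoms, bits)) for bits in product([0, 1], repeat=4)]
Z = sum(math.exp(w * n(x)) for x in worlds)

def prob(world):
    """P(x) = (1/Z) exp(w * n(x))."""
    return math.exp(w * n(world)) / Z
```

A world that violates a grounding of the rule is less probable, not impossible, exactly as slide 16 promised.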

  28. Relation to Statistical Models • Special cases: Markov networks, Markov random fields, Bayesian networks, log-linear models, exponential models, max. entropy models, Gibbs distributions, Boltzmann machines, logistic regression, hidden Markov models, conditional random fields • Obtained by making all predicates zero-arity • Markov logic allows objects to be interdependent (non-i.i.d.)

  29. Relation to First-Order Logic • Infinite weights ⇒ first-order logic • Satisfiable KB, positive weights ⇒ satisfying assignments = modes of distribution • Markov logic allows contradictions between formulas • Markov logic in infinite domains

  30. Overview • Motivation • Background • Markov logic • Inference • Learning • Applications

  31. Inference Friends(A,B) Smokes(A)? Smokes(B)? Friends(A,A) Friends(B,B) Cancer(A) Cancer(B)? Friends(B,A)

  32. Most Probable Explanation (MPE) Inference Friends(A,B) Smokes(A)? Smokes(B)? Friends(A,A) Friends(B,B) Cancer(A) Cancer(B)? Friends(B,A) What is the most likely state of Smokes(A), Smokes(B), Cancer(B)?

  33. MPE Inference • Problem: Find most likely state of world given evidence: arg max_y P(y | x), where y is the query and x is the evidence

  34. MPE Inference • Problem: Find most likely state of world given evidence: arg max_y (1/Z_x) exp(Σ_i w_i n_i(x, y))

  35. MPE Inference • Problem: Find most likely state of world given evidence: arg max_y Σ_i w_i n_i(x, y)

  36. MPE Inference • Problem: Find most likely state of world given evidence • This is just the weighted MaxSAT problem • Use a weighted SAT solver (e.g., MaxWalkSAT [Kautz et al. 97])
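
A compact sketch of MaxWalkSAT-style local search, assuming clauses encoded as weighted lists of (variable, sign) literals. This illustrates the algorithm's structure (random walk mixed with greedy flips), not the actual MaxWalkSAT implementation of Kautz et al.; the example clauses and weights below are hypothetical.

```python
import random

def cost(assign, clauses):
    """Total weight of unsatisfied clauses."""
    return sum(w for lits, w in clauses
               if not any(assign[v] == s for v, s in lits))

def maxwalksat(variables, clauses, max_flips=1000, p=0.5, seed=0):
    """Local search for weighted MaxSAT (MaxWalkSAT-style sketch)."""
    rng = random.Random(seed)
    assign = {v: rng.choice([True, False]) for v in variables}
    best = dict(assign)
    for _ in range(max_flips):
        unsat = [(lits, w) for lits, w in clauses
                 if not any(assign[v] == s for v, s in lits)]
        if not unsat:
            return assign                  # all clauses satisfied
        lits, _ = rng.choice(unsat)        # pick a random unsatisfied clause
        if rng.random() < p:               # random step: flip a random literal
            var = rng.choice(lits)[0]
        else:                              # greedy step: flip that most lowers cost
            var = min((v for v, _ in lits),
                      key=lambda v: cost({**assign, v: not assign[v]}, clauses))
        assign[var] = not assign[var]
        if cost(assign, clauses) < cost(best, clauses):
            best = dict(assign)
    return best

# Hypothetical instance: a heavy unit clause plus Smokes(A) => Cancer(A).
clauses = [([("Smokes(A)", True)], 5.0),
           ([("Smokes(A)", False), ("Cancer(A)", True)], 1.5)]
state = maxwalksat(["Smokes(A)", "Cancer(A)"], clauses)
```

The MPE state here sets both atoms true, satisfying all weighted clauses at zero cost.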

  37. Computing Probabilities: Marginal Inference Friends(A,B) Smokes(A)? Smokes(B)? Friends(A,A) Friends(B,B) Cancer(A) Cancer(B)? Friends(B,A) What is the probability that Smokes(B) = 1?

  38. Marginal Inference • Problem: Find the probability of query atoms given evidence: P(y | x), where y is the query and x is the evidence

  39. Marginal Inference • Problem: Find the probability of query atoms given evidence: P(y | x) = (1/Z_x) exp(Σ_i w_i n_i(x, y)), where Z_x = Σ_y exp(Σ_i w_i n_i(x, y)) • Computing Z_x takes exponential time!
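
The exponential cost of Z_x is easy to see in a brute-force sketch: the sum ranges over every assignment to the query atoms, 2^3 here and 2^n in general. The rule Smokes(x) ⇒ Cancer(x) and its weight are illustrative.

```python
import math
from itertools import product

w = 1.5  # illustrative weight for Smokes(x) => Cancer(x)
query_atoms = ["Smokes(A)", "Smokes(B)", "Cancer(B)"]
evidence = {"Cancer(A)": 1, "Friends(A,B)": 1}

def n_true(world):
    """True groundings of Smokes(x) => Cancer(x) over x in {A, B}."""
    return sum(1 for c in ("A", "B")
               if (not world[f"Smokes({c})"]) or world[f"Cancer({c})"])

# Z_x: one term per assignment to the query atoms (2^3 worlds).
worlds = [dict(evidence, **dict(zip(query_atoms, bits)))
          for bits in product([0, 1], repeat=len(query_atoms))]
Z_x = sum(math.exp(w * n_true(x)) for x in worlds)

# P(Smokes(B) = 1 | evidence): sum over the matching worlds, normalized.
p_smokes_b = sum(math.exp(w * n_true(x))
                 for x in worlds if x["Smokes(B)"]) / Z_x
```

This exact enumeration is what belief propagation, introduced next, approximates without visiting every world.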

  40. Belief Propagation • Bipartite network of nodes (variables) and features • In Markov logic: • Nodes = Ground atoms • Features = Ground clauses • Exchange messages until convergence • Messages: current approximation to node marginals
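
On a graph with a single ground clause, the scheme reduces to one round of sum-product messages, which is enough to sketch. The soft clause Smokes(A) ⇒ Smokes(B) and its weight are assumptions for illustration; with one feature the bipartite graph is a tree, so the marginals come out exact.

```python
import math

w = 1.0  # illustrative weight on the soft clause Smokes(A) => Smokes(B)

def f(a, b):
    """Feature potential exp(w * clause) for !Smokes(A) v Smokes(B)."""
    return math.exp(w * int((not a) or b))

def normalize(m):
    s = m[0] + m[1]
    return (m[0] / s, m[1] / s)

# Node -> feature messages: each atom touches only this one feature,
# so its outgoing message is uniform.
msg_a, msg_b = (1.0, 1.0), (1.0, 1.0)

# Feature -> node messages: sum out the other variable (sum-product rule).
to_a = tuple(sum(f(a, b) * msg_b[b] for b in (0, 1)) for a in (0, 1))
to_b = tuple(sum(f(a, b) * msg_a[a] for a in (0, 1)) for b in (0, 1))

# Node marginals: product of incoming messages, normalized.
marg_a = normalize(to_a)  # (P(Smokes(A)=0), P(Smokes(A)=1))
marg_b = normalize(to_b)
```

On loopy ground networks the same updates are iterated until the marginals stop changing, as the slide describes.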

  41. Belief Propagation • Example ground clause: Smokes(Ana) ∧ Friends(Ana, Bob) ⇒ Smokes(Bob) [Figure: bipartite graph of Features (f) and Nodes (x)]

  42. Belief Propagation [Figure: message passing between Features (f) and Nodes (x)]

  43. Belief Propagation [Figure: message passing between Features (f) and Nodes (x)]

  44. Lifted Belief Propagation [Figure: Features (f) and Nodes (x)]

  45. Lifted Belief Propagation [Figure: Features (f) and Nodes (x)]

  46. Lifted Belief Propagation • Lifted message updates are parameterized by functions of edge counts [Figure: Features (f) and Nodes (x)]

  47. Overview • Motivation • Background • Markov logic • Inference • Learning • Applications

  48. Learning Parameters Three constants: Ana, Bob, John

  49. Learning Parameters Three constants: Ana, Bob, John

  50. Learning Parameters Three constants: Ana, Bob, John Closed World Assumption: Anything not in the database is assumed false.
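
Generative weight learning uses the gradient of the log-likelihood, which for formula i is the observed count n_i(x) minus its expectation under the current weights. A brute-force sketch over the slide's three constants, with a hypothetical training database (closed-world assumption: atoms absent from the database are false); the step size is illustrative.

```python
import math
from itertools import product

constants = ["Ana", "Bob", "John"]
atoms = [f"{p}({c})" for p in ("Smokes", "Cancer") for c in constants]

# Hypothetical database; by the closed-world assumption,
# anything not listed is false.
db = {"Smokes(Ana)", "Cancer(Ana)", "Smokes(Bob)"}
observed = {a: int(a in db) for a in atoms}

def n_true(world):
    """Number of true groundings of Smokes(x) => Cancer(x)."""
    return sum(1 for c in constants
               if (not world[f"Smokes({c})"]) or world[f"Cancer({c})"])

def expected_count(w):
    """E_w[n(X)] by brute force over all 2^6 worlds."""
    worlds = [dict(zip(atoms, bits))
              for bits in product([0, 1], repeat=len(atoms))]
    Z = sum(math.exp(w * n_true(x)) for x in worlds)
    return sum(n_true(x) * math.exp(w * n_true(x)) for x in worlds) / Z

# One gradient-ascent step: observed count minus expected count.
w, lr = 0.0, 0.1
grad = n_true(observed) - expected_count(w)
w += lr * grad
```

In practice the expectation is intractable and is approximated (e.g., by sampling or by pseudo-likelihood); the brute-force sum here just makes the gradient's form concrete.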
