
CSM6120 Introduction to Intelligent Systems




Presentation Transcript


  1. CSM6120 Introduction to Intelligent Systems Propositional and Predicate Logic

  2. Logic and AI • Would like our AI to have knowledge about the world, and logically draw conclusions from it • Search algorithms generate successors and evaluate them, but do not “understand” much about the setting • Example question: is it possible for a chess player to have 8 pawns and 2 queens? • Search algorithm could search through tons of states to see if this ever happens…

  3. Types of knowledge • There are four basic sorts of knowledge (different sources may vary as to number and definitions) • Declarative • Facts • E.g. Joe has a car • Procedural • Knowledge exercised in the performance of some task • E.g. What to check if car won't start • First check a, then b, then c, until car starts

  4. Types of knowledge • Meta • Knowledge about knowledge • E.g. knowing how an expert system came to its conclusion • Heuristic • Rules of thumb, empirical • Knowledge gained from experience, not proven • i.e. could be wrong!

  5. Methods of representation • Object-attribute-value • Say you have an oak tree. You want to encode the fact that it is a tree, and that its species is oak • There are three pieces of information in there: • It's a tree • It has (belongs to) a species • The species is oak • The fact that all trees have a species might be obvious to you, but it's not obvious to a computer • We have to encode it

  6. How to encode? • Maybe something like this: • (species tree oak) • species(tree, oak) • tree(species, oak) • Or some other similar method that your program can parse and understand

  7. What if you’re not sure? • You could include an uncertainty factor • The uncertainty factor is a number which can be taken into account by your program when making its decisions • The final conclusion of any program where uncertainty was used in the input is likely to also have an uncertainty factor • If you're not sure of the facts, how can the program be sure of the result?

  8. Encoding uncertainty • To encode uncertainty, we may do something like this: • species(tree, oak, 0.8) • This would mean we were only 80% sure that the tree concerned was an oak tree
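As a minimal sketch (not from the slides), the object-attribute-value facts with a certainty factor can be stored as plain Python tuples; the names `facts` and `certainty` are hypothetical:

```python
# Hypothetical sketch: object-attribute-value facts with a certainty factor.
facts = [
    ("tree", "species", "oak", 0.8),   # only 80% sure the tree is an oak
    ("joe", "owns", "car", 1.0),       # a fully certain fact
]

def certainty(obj, attr, value, kb):
    """Return the certainty factor of a fact, or 0.0 if it is not in the KB."""
    for o, a, v, cf in kb:
        if (o, a, v) == (obj, attr, value):
            return cf
    return 0.0

print(certainty("tree", "species", "oak", facts))  # 0.8
```

A program consuming these facts can then propagate the certainty factor into its own conclusions, as the previous slide suggests.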

  9. Rules • A knowledge base (KB) may have rules • IF premises THEN conclusion • May be more than one premise • May contain logical functions • AND, OR, NOT • If premises evaluate to TRUE, rule fires • IF tree (species, oak) THEN tree (type, deciduous)

  10. More complex rules • IF tree is a beech AND the season is summer THEN tree is green • IF tree is a beech AND the season is summer AND tree-type is red beech THEN tree is red • This example illustrates a problem whereby two rules could both fire, but lead to different conclusions • Strategies to resolve this conflict include firing the most specific rule first • The second rule is more specific
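One way to sketch the "fire the most specific rule first" strategy in Python (the representation and the `fire` helper are illustrative assumptions, not the course's notation):

```python
# Hypothetical sketch: each rule is (set_of_premises, conclusion);
# specificity is measured by the number of premises.
rules = [
    ({"tree is a beech", "season is summer"}, "tree is green"),
    ({"tree is a beech", "season is summer", "tree-type is red beech"},
     "tree is red"),
]

def fire(working_memory, rules):
    """Return the conclusion of the most specific rule whose premises
    all hold in working memory, or None if no rule fires."""
    applicable = [r for r in rules if r[0] <= working_memory]  # subset test
    if not applicable:
        return None
    return max(applicable, key=lambda r: len(r[0]))[1]

wm = {"tree is a beech", "season is summer", "tree-type is red beech"}
print(fire(wm, rules))  # the more specific rule wins: "tree is red"
```

With only the first two facts in working memory, just the less specific rule applies and the tree is concluded to be green.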

  11. What is a Logic? • A language with concrete rules • No ambiguity in representation (may be other errors!) • Allows unambiguous communication and processing • Very unlike natural languages e.g. English • Many ways to translate between languages • A statement can be represented in different logics • And perhaps differently in same logic • Expressiveness of a logic • How much can we say in this language? • Not to be confused with logical reasoning • Logics are languages, reasoning is a process (may use logic)

  12. Syntax and semantics • Syntax • Rules for constructing legal sentences in the logic • Which symbols we can use (English: letters, punctuation) • How we are allowed to combine symbols • Semantics • How we interpret (read) sentences in the logic • Assigns a meaning to each sentence • Example: “All lecturers are seven foot tall” • A valid sentence (syntax) • And we can understand the meaning (semantics) • This sentence happens to be false (there is a counterexample)

  13. Propositional logic • Syntax • Propositions, e.g. “it is wet” • Connectives: and, or, not, implies, iff (equivalent) • Brackets (), T (true) and F (false) • Semantics (Classical/Boolean) • Define how connectives affect truth • “P and Q” is true if and only if P is true and Q is true • Use truth tables to work out the truth of statements

  14. A story • Your Friend comes home; he is completely wet • You know the following things: • Your Friend is wet • If your Friend is wet, it is because of rain, sprinklers, or both • If your Friend is wet because of sprinklers, the sprinklers must be on • If your Friend is wet because of rain, your Friend must not be carrying the umbrella • The umbrella is not in the umbrella holder • If the umbrella is not in the umbrella holder, either you must be carrying the umbrella, or your Friend must be carrying the umbrella • You are not carrying the umbrella • Can you conclude that the sprinklers are on? • Can AI conclude that the sprinklers are on?

  15. Knowledge base for the story • FriendWet • FriendWet → (FriendWetBecauseOfRain OR FriendWetBecauseOfSprinklers) • FriendWetBecauseOfSprinklers → SprinklersOn • FriendWetBecauseOfRain → NOT(FriendCarryingUmbrella) • UmbrellaGone • UmbrellaGone → (YouCarryingUmbrella OR FriendCarryingUmbrella) • NOT(YouCarryingUmbrella)

  16. Syntax • What do well-formed sentences in the knowledge base look like? • A BNF grammar: • Symbol := P, Q, R, …, FriendWet, … • Sentence := True | False | Symbol | NOT(Sentence) | (Sentence AND Sentence) | (Sentence OR Sentence) | (Sentence → Sentence) • We will drop parentheses sometimes, but formally they really should always be there

  17. Semantics • A model specifies which of the propositional symbols are true and which are false • Given a model, I should be able to tell you whether a sentence is true or false • Truth table defines semantics of operators: • Given a model, can compute truth of sentence recursively with these
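The recursive truth computation described above can be sketched in Python; representing sentences as nested tuples and the name `holds` are assumptions for illustration:

```python
# Sketch of recursive semantics. Sentences are nested tuples:
# ("NOT", s), ("AND", s1, s2), ("OR", s1, s2), ("IMPLIES", s1, s2),
# or a bare symbol name; a model maps symbol names to True/False.
def holds(sentence, model):
    if isinstance(sentence, str):
        return model[sentence]
    op = sentence[0]
    if op == "NOT":
        return not holds(sentence[1], model)
    if op == "AND":
        return holds(sentence[1], model) and holds(sentence[2], model)
    if op == "OR":
        return holds(sentence[1], model) or holds(sentence[2], model)
    if op == "IMPLIES":  # a -> b is NOT(a) OR b
        return (not holds(sentence[1], model)) or holds(sentence[2], model)
    raise ValueError("unknown connective: " + op)

model = {"P": True, "Q": False}
print(holds(("OR", "Q", ("NOT", "P")), model))  # False
```

Each connective's case is just its truth-table row, applied recursively to the subsentences.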

  18. Tautologies • A sentence is a tautology if it is true for any setting of its propositional symbols • (P OR Q) OR (NOT(P) AND NOT(Q)) is a tautology
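The definition can be checked by brute force: enumerate every assignment and test the sentence. A small sketch (representing a sentence as a Boolean function is an assumption for brevity):

```python
from itertools import product

def is_tautology(f, n_symbols):
    """True iff f evaluates to True under every truth assignment."""
    return all(f(*vals) for vals in product([True, False], repeat=n_symbols))

# The slide's example, (P OR Q) OR (NOT(P) AND NOT(Q)):
print(is_tautology(lambda p, q: (p or q) or ((not p) and (not q)), 2))  # True
```

Note the cost: the enumeration visits 2^n assignments for n symbols.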

  19. Is this a tautology? • (P → Q) OR (Q → P)

  20. Logical equivalences • Two sentences are logically equivalent if they have the same truth value for every setting of their propositional variables • P OR Q and NOT(NOT(P) AND NOT(Q)) are logically equivalent. Tautology = logically equivalent to True
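Equivalence can be checked the same way as tautology: two sentences are equivalent iff they agree on every model. A sketch (function representation assumed as before):

```python
from itertools import product

def equivalent(f, g, n_symbols):
    """True iff f and g have the same truth value under every assignment."""
    return all(f(*v) == g(*v) for v in product([True, False], repeat=n_symbols))

# The slide's example: P OR Q vs NOT(NOT(P) AND NOT(Q)) (De Morgan):
print(equivalent(lambda p, q: p or q,
                 lambda p, q: not ((not p) and (not q)), 2))  # True
```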

  21. Famous logical equivalences • (a OR b) ≡ (b OR a) commutativity • (a AND b) ≡ (b AND a) commutativity • ((a AND b) AND c) ≡ (a AND (b AND c)) associativity • ((a OR b) OR c) ≡ (a OR (b OR c)) associativity • NOT(NOT(a)) ≡ a double-negation elimination • (a → b) ≡ (NOT(b) → NOT(a)) contraposition • (a → b) ≡ (NOT(a) OR b) implication elimination • NOT(a AND b) ≡ (NOT(a) OR NOT(b)) De Morgan • NOT(a OR b) ≡ (NOT(a) AND NOT(b)) De Morgan • (a AND (b OR c)) ≡ ((a AND b) OR (a AND c)) distributivity • (a OR (b AND c)) ≡ ((a OR b) AND (a OR c)) distributivity

  22. Entailment • A set of premises A logically entails a conclusion B (written as A |= B) if and only if every model/interpretation that satisfies the premises also satisfies the conclusion, i.e. A → B is true in every model • i.e. B is a valid consequence of A • {p} |= (p ∨ q) • {p} ⊭ (p ∧ q) • (e.g. p=T, q=F) • {p, q} |= (p ∧ q)

  23. Entailment • We have a knowledge base (KB) of things that we know are true • FriendWetBecauseOfSprinklers • FriendWetBecauseOfSprinklers → SprinklersOn • Can we conclude that SprinklersOn? • We say SprinklersOn is entailed by the KB if, for every setting of the propositional variables for which the KB is true, SprinklersOn is also true • SprinklersOn is entailed! (KB |= SprinklersOn)

  24. Inconsistent knowledge bases • Suppose we were careless in how we specified our knowledge base: PetOfFriendIsABird → PetOfFriendCanFly PetOfFriendIsAPenguin → PetOfFriendIsABird PetOfFriendIsAPenguin → NOT(PetOfFriendCanFly) PetOfFriendIsAPenguin • No setting of the propositional variables makes all of these true • Therefore, technically, this knowledge base implies/entails anything… • TheMoonIsMadeOfCheese

  25. Satisfiability • A sentence is satisfiable if it is true in some model • If a sentence α is true in model m, then m satisfies α, or we say m is a model of α • A KB is satisfiable if a model m satisfies all clauses in it • How do you verify if a sentence is satisfiable? • Enumerate the possible models, until one is found to satisfy • Can order the models so as to evaluate the most promising ones first • Many problems in Computer Science can be reduced to a satisfiability problem • E.g. CSP is all about verifying if the constraints are satisfiable by some assignments
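The enumerate-until-satisfied check can be sketched directly (the name `satisfying_model` and the model-as-dict representation are assumptions):

```python
from itertools import product

def satisfying_model(f, symbols):
    """Return some model (dict) making f true, or None if unsatisfiable."""
    for vals in product([True, False], repeat=len(symbols)):
        model = dict(zip(symbols, vals))
        if f(model):
            return model
    return None

# P AND NOT(Q) is satisfiable:
m = satisfying_model(lambda m: m["P"] and not m["Q"], ["P", "Q"])
print(m)  # {'P': True, 'Q': False}
```

Reordering the enumeration so that promising assignments come first, as the slide notes, changes only which satisfying model is found first, not whether one exists.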

  26. Inference • Inference is the process of deriving a specific sentence from a KB (where the sentence must be entailed by the KB) • KB |-i a = sentence a can be derived from KB by procedure i • “KBs are a haystack” • Entailment = needle in haystack • Inference = finding it • If KB is true in the real world, then any sentence derived from KB by a sound inference procedure is also true in the real world

  27. Simple algorithm for inference • Want to find out if sentence a is entailed by knowledge base… • For every possible setting of the propositional variables, • If knowledge base is true and a is false, return false • Return true • Not very efficient: 2^(#propositional variables) settings • There can be many propositional variables among premises that are irrelevant to the conclusion. Much wasted work!
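The algorithm above can be written in a few lines of Python; the name `entails` and the sentence-as-function representation are illustrative assumptions:

```python
from itertools import product

def entails(kb, a, symbols):
    """KB |= a iff every model making all KB sentences true also makes a true.
    kb is a list of functions model -> bool; a is one such function."""
    for vals in product([True, False], repeat=len(symbols)):
        model = dict(zip(symbols, vals))
        if all(s(model) for s in kb) and not a(model):
            return False  # counterexample: KB true but a false
    return True

kb = [lambda m: m["FWS"],                   # FriendWetBecauseOfSprinklers
      lambda m: (not m["FWS"]) or m["SO"]]  # FWS -> SprinklersOn
print(entails(kb, lambda m: m["SO"], ["FWS", "SO"]))  # True
```

This makes the inefficiency concrete: the loop runs 2^n times regardless of how few symbols actually matter to the conclusion.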

  28. Inference by enumeration

  29. Proof methods • Proof methods divide into (roughly) two kinds: • Model checking (inference by enumeration) • Truth table enumeration (always exponential in n) • Improved backtracking, e.g., Davis-Putnam-Logemann-Loveland (DPLL) • Heuristic search in model space (sound but incomplete) • Application of inference rules/reasoning patterns • Legitimate (sound) generation of new sentences from old • Proof = a sequence of inference rule applications • Can use inference rules as operators in a standard search algorithm • Typically require transformation of sentences into a normal form

  30. Reasoning patterns • Obtain new sentences directly from some other sentences in knowledge base according to reasoning patterns • E.g., if we have sentences a and a → b, we can correctly conclude the new sentence b • This is called modus ponens • E.g., if we have a AND b, we can correctly conclude a • All of the logical equivalences from before also give reasoning patterns

  31. Formal proof that the sprinklers are on • Knowledge base: • FriendWet • FriendWet → (FriendWetBecauseOfRain OR FriendWetBecauseOfSprinklers) • FriendWetBecauseOfSprinklers → SprinklersOn • FriendWetBecauseOfRain → NOT(FriendCarryingUmbrella) • UmbrellaGone • UmbrellaGone → (YouCarryingUmbrella OR FriendCarryingUmbrella) • NOT(YouCarryingUmbrella)

  32. Formal proof that the sprinklers are on • YouCarryingUmbrella OR FriendCarryingUmbrella(modus ponens on 5 and 6) • NOT(YouCarryingUmbrella) → FriendCarryingUmbrella(equivalent to 8) • FriendCarryingUmbrella(modus ponens on 7 and 9) • NOT(NOT(FriendCarryingUmbrella)) (equivalent to 10) • NOT(NOT(FriendCarryingUmbrella)) → NOT(FriendWetBecauseOfRain) (equivalent to 4 by contraposition) • NOT(FriendWetBecauseOfRain) (modus ponens on 11 and 12) • FriendWetBecauseOfRain OR FriendWetBecauseOfSprinklers(modus ponens on 1 and 2) • NOT(FriendWetBecauseOfRain) → FriendWetBecauseOfSprinklers(equivalent to 14) • FriendWetBecauseOfSprinklers(modus ponens on 13 and 15) • SprinklersOn(modus ponens on 16 and 3)

  33. Reasoning about penguins • Knowledge base: • PetOfFriendIsABird → PetOfFriendCanFly • PetOfFriendIsAPenguin → PetOfFriendIsABird • PetOfFriendIsAPenguin → NOT(PetOfFriendCanFly) • PetOfFriendIsAPenguin • PetOfFriendIsABird(modus ponens on 4 and 2) • PetOfFriendCanFly(modus ponens on 5 and 1) • NOT(PetOfFriendCanFly) (modus ponens on 4 and 3) • NOT(PetOfFriendCanFly) → FALSE (equivalent to 6) • FALSE (modus ponens on 7 and 8) • FALSE → TheMoonIsMadeOfCheese(tautology) • TheMoonIsMadeOfCheese(modus ponens on 9 and 10)

  34. Evaluation of deductive inference • Sound • Yes, because the inference rules themselves are sound • (This can be proven using a truth table argument) • Complete • If we allow all possible inference rules, we’re searching in an infinite space, hence not complete • If we limit inference rules, we run the risk of leaving out the necessary one… • Monotonic • If we have a proof, adding information to the KB will not invalidate the proof

  35. Getting more systematic • Any knowledge base can be written as a single formula in conjunctive normal form (CNF) • CNF formula: (… OR … OR …) AND (… OR …) AND … • … can be a symbol x, or NOT(x) (these are called literals) • Multiple facts in knowledge base are effectively ANDed together • FriendWet → (FriendWetBecauseOfRain OR FriendWetBecauseOfSprinklers) becomes (NOT(FriendWet) OR FriendWetBecauseOfRain OR FriendWetBecauseOfSprinklers)
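The conversion on this slide, p → (q1 OR … OR qk) becoming the single clause NOT(p) OR q1 OR … OR qk, can be sketched with clauses as frozensets of literals (the (symbol, is_positive) literal encoding and the helper name are assumptions):

```python
# Hypothetical sketch: a clause is a frozenset of literals, where a
# literal is (symbol, is_positive). An implication with one premise
# and a disjunctive conclusion becomes one clause.
def implication_to_clause(premise, conclusions):
    return frozenset([(premise, False)] + [(c, True) for c in conclusions])

clause = implication_to_clause(
    "FriendWet",
    ["FriendWetBecauseOfRain", "FriendWetBecauseOfSprinklers"])
print(sorted(clause))
```

Using frozensets means duplicate literals merge automatically and clause identity ignores literal order, which is convenient for the resolution steps that follow.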

  36. Converting story to CNF • FriendWet stays FriendWet • FriendWet → (FriendWetBecauseOfRain OR FriendWetBecauseOfSprinklers) becomes NOT(FriendWet) OR FriendWetBecauseOfRain OR FriendWetBecauseOfSprinklers • FriendWetBecauseOfSprinklers → SprinklersOn becomes NOT(FriendWetBecauseOfSprinklers) OR SprinklersOn • FriendWetBecauseOfRain → NOT(FriendCarryingUmbrella) becomes NOT(FriendWetBecauseOfRain) OR NOT(FriendCarryingUmbrella) • UmbrellaGone stays UmbrellaGone • UmbrellaGone → (YouCarryingUmbrella OR FriendCarryingUmbrella) becomes NOT(UmbrellaGone) OR YouCarryingUmbrella OR FriendCarryingUmbrella • NOT(YouCarryingUmbrella) stays NOT(YouCarryingUmbrella)

  37. Unit resolution • Unit resolution: if we have l1 OR l2 OR … OR li OR … OR lk and NOT(li), we can conclude l1 OR l2 OR … OR l(i-1) OR l(i+1) OR … OR lk • Basically modus ponens

  38. Applying resolution to story problem • FriendWet • NOT(FriendWet) OR FriendWetBecauseOfRain OR FriendWetBecauseOfSprinklers • NOT(FriendWetBecauseOfSprinklers) OR SprinklersOn • NOT(FriendWetBecauseOfRain) OR NOT(FriendCarryingUmbrella) • UmbrellaGone • NOT(UmbrellaGone) OR YouCarryingUmbrella OR FriendCarryingUmbrella • NOT(YouCarryingUmbrella) • NOT(UmbrellaGone) OR FriendCarryingUmbrella (6,7) • FriendCarryingUmbrella (5,8) • NOT(FriendWetBecauseOfRain) (4,9) • NOT(FriendWet) OR FriendWetBecauseOfSprinklers (2,10) • FriendWetBecauseOfSprinklers (1,11) • SprinklersOn (3,12)

  39. Resolution (general) • General resolution allows a complete inference mechanism (search-based) using only one rule of inference • Resolution rule: • Given: P1 OR P2 OR P3 OR … OR Pn and NOT(P1) OR Q1 OR … OR Qm • Conclude: P2 OR P3 OR … OR Pn OR Q1 OR … OR Qm • Complementary literals P1 and NOT(P1) “cancel out” • Why it works: • Consider 2 cases: P1 is true, and P1 is false
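The rule itself is a one-line set operation once clauses are frozensets of (symbol, is_positive) literals; this encoding and the `resolve` name are assumptions for illustration:

```python
# Sketch of the general resolution rule on two clauses that contain
# complementary literals on `symbol`: drop the complementary pair and
# take the union of what remains.
def resolve(c1, c2, symbol):
    assert (symbol, True) in c1 and (symbol, False) in c2
    return (c1 - {(symbol, True)}) | (c2 - {(symbol, False)})

c1 = frozenset({("P1", True), ("P2", True)})   # P1 OR P2
c2 = frozenset({("P1", False), ("Q1", True)})  # NOT(P1) OR Q1
print(sorted(resolve(c1, c2, "P1")))           # resolvent: P2 OR Q1
```

The "why it works" case split is visible here: if P1 is true, c2 forces Q1; if P1 is false, c1 forces P2; either way the resolvent holds.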

  40. Applying resolution • FriendWet • NOT(FriendWet) OR FriendWetBecauseOfRain OR FriendWetBecauseOfSprinklers • NOT(FriendWetBecauseOfSprinklers) OR SprinklersOn • NOT(FriendWetBecauseOfRain) OR NOT(FriendCarryingUmbrella) • UmbrellaGone • NOT(UmbrellaGone) OR YouCarryingUmbrella OR FriendCarryingUmbrella • NOT(YouCarryingUmbrella) • NOT(FriendWet) OR FriendWetBecauseOfRain OR SprinklersOn (2,3) • NOT(FriendCarryingUmbrella) OR NOT(FriendWet) OR SprinklersOn (4,8) • NOT(UmbrellaGone) OR YouCarryingUmbrella OR NOT(FriendWet) OR SprinklersOn (6,9) • YouCarryingUmbrella OR NOT(FriendWet) OR SprinklersOn (5,10) • NOT(FriendWet) OR SprinklersOn (7,11) • SprinklersOn (1,12)

  41. Systematic inference… • General strategy: if we want to see if sentence a is entailed, add NOT(a) to the knowledge base and see if it becomes inconsistent (we can derive a contradiction) • = proof by contradiction • CNF formula for modified knowledge base is satisfiable if and only if sentence a is not entailed • Satisfiable = there exists a model that makes the modified knowledge base true = modified knowledge base is consistent

  42. Resolution algorithm • Given formula in conjunctive normal form, repeat: • Find two clauses with complementary literals, • Apply resolution, • Add resulting clause (if not already there) • If the empty clause results, formula is not satisfiable • Must have been obtained from P and NOT(P) • Otherwise, if we get stuck (and we will eventually), the formula is guaranteed to be satisfiable
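The loop on this slide, combined with the proof-by-contradiction strategy of slide 41, can be sketched as follows; the clause encoding and the names `resolvents`/`satisfiable` are assumptions:

```python
from itertools import combinations

# Clauses are frozensets of literals (symbol, is_positive).
def resolvents(c1, c2):
    """All resolvents of two clauses (one per complementary pair)."""
    out = []
    for (sym, pos) in c1:
        if (sym, not pos) in c2:
            out.append((c1 - {(sym, pos)}) | (c2 - {(sym, not pos)}))
    return out

def satisfiable(clauses):
    clauses = set(clauses)
    while True:
        new = set()
        for c1, c2 in combinations(clauses, 2):
            for r in resolvents(c1, c2):
                if not r:
                    return False  # empty clause derived: contradiction
                new.add(r)
        if new <= clauses:
            return True           # stuck, nothing new: satisfiable
        clauses |= new

# Slide 43's example: the KB plus NOT(SprinklersOn) is unsatisfiable,
# hence SprinklersOn is entailed.
kb = [frozenset({("FWS", True)}),                  # FriendWetBecauseOfSprinklers
      frozenset({("FWS", False), ("SO", True)}),   # NOT(FWS) OR SprinklersOn
      frozenset({("SO", False)})]                  # negated query
print(satisfiable(kb))  # False
```

Termination is guaranteed because only finitely many clauses can be built from a finite set of symbols, so the "stuck" branch is eventually reached if no empty clause appears.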

  43. Example • Our knowledge base: 1) FriendWetBecauseOfSprinklers 2) NOT(FriendWetBecauseOfSprinklers) OR SprinklersOn • Can we infer SprinklersOn? We add: 3) NOT(SprinklersOn) • From 2) and 3), get 4) NOT(FriendWetBecauseOfSprinklers) • From 4) and 1), get empty clause = SprinklersOn entailed

  44. Exercises • In twos/threes, please!
