
Text Understanding through Probabilistic Reasoning about Actions


Presentation Transcript


  1. Text Understanding through Probabilistic Reasoning about Actions
  • Hannaneh Hajishirzi
  • Erik T. Mueller

  2. Problem
  • Understanding a text and answering questions about it
  • A fundamental problem in Natural Language Processing and Linguistics
  • Very hard to solve, especially for machines
  • Imagine if computers could understand text

  3. Help Desk
  • User text about the problem: "I'm having trouble installing Notes. I got error message 1. How do I solve it?"
  • Text Understanding System + Commonsense Reasoning
  • Answer: "Yes, you will get error message 1 if another copy of Notes is already installed."
  • Solution: "You must first uninstall Notes. Then, when you run setup, Notes will be installed."

  4. Commanding a Robot
  • "Go one block ahead. Then, turn right. Take the keys. Open the door."
  • Initial states
  • Query: Where is the robot? Is the door open?

  5. Question Answering Systems
  • Question: Where was President Bush two years ago?
  • Ask.com: President Bush said, "Two hundred thirty-one years ago, …"
  • Applications at IBM: playing Jeopardy!; natural language input to a semantic engine

  6. General Solution to the Text Understanding Task
  • (1) A framework for representing the content of the text
  • (2) Algorithms for reasoning (based on the representation)

  7. Approaches to Natural Language Processing
  • Machine learning and statistical approaches (Manning & Schütze, 1999). Disadvantages: weak on semantics; require training data
  • Logical approaches (Alshawi, 1992; Hobbs, 1993). Disadvantages: unable to represent uncertainties in text and knowledge
  • Our approach draws on both

  8. Our Approach
  • Represent sentences using a logical framework + probabilities
  • Each sentence states properties or actions:
  • Property: a statement about the world
  • Action: a change in the world
  • Probabilities: uncertainty and ambiguity
  • Algorithms for stochastic inference

  9. Potential Open Problems
  • Translating text to a logical representation
  • Representing sentences with actions
  • Disambiguating sentences using probabilities
  • Representing prior knowledge
  • Answering queries using probabilistic reasoning in a logical framework
  • Efficient algorithms
  • Filling in missing actions

  10. Representation
  • Text level: "…John woke up. He flipped the light switch. He had his breakfast. He went to work…"
  • Translation to actions
  • Action level: WakeUp(John, Bed). SwitchLight(John). Eat(John, Food). Move(John, Work)

  11. Elements in Our Representation
  • WakeUp(John, Bed). SwitchLight(John). Eat(John, Food). Move(John, Work)
  • Predicates: At(agent, location), Hungry(agent), Awake(agent), LightOn(room), …
  • Constants: John: agent; Bedroom: room; HotelRoom: room; Work: location; …
  • Variables: object; agent: object; physobj: object; location; …
  • World state, for example: At(John, Work), ¬LyingOn(John, Bed), Hungry(John), ¬LightOn(HotelRoom), …
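To make the representation concrete, here is a minimal Python sketch of a world state as a partial truth assignment over ground predicates. The predicate and constant names come from the slide; the encoding itself (tuples as literals, a dict as the state) is our illustrative assumption, not the authors' implementation.

```python
# World state: a (partial) truth assignment to ground predicate instances.
# A literal key is (predicate_name, argument_tuple); the value is its polarity.
state = {
    ("At", ("John", "Work")): True,
    ("LyingOn", ("John", "Bed")): False,
    ("Hungry", ("John",)): True,
    ("LightOn", ("HotelRoom",)): False,
}

def holds(state, pred, *args):
    """True/False if the literal is assigned; None if unknown (partial state)."""
    return state.get((pred, args))

print(holds(state, "At", "John", "Work"))     # True
print(holds(state, "At", "John", "Bedroom"))  # None: not asserted either way
```

Leaving unlisted literals at None is what later makes partial states cheap: only predicates mentioned by the text need a truth value.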

  12. Text Representation
  • [Figure: states s0, s1, s2, s3 connected by actions (move, make, memorize), each action branching into disambiguated actions (walk, drive, run, throw, cook, build, study, review, absorb, memorize)]
  • Transition: stochastic choice of deterministic execution

  13. Action Declarations
  • Deterministic actions: preconditions and effects
  • Example deterministic action: WakeUp(John, Bed) — Pre: ¬Awake(John), LyingOn(John, Bed); Eff: Awake(John), ¬LyingOn(John, Bed)
  • Probabilistic actions:
  • Assumption: the description (preconditions and effects) of each basic primitive (e.g., walk) is known
  • Goal: find the primitives related to a probabilistic action and disambiguate via transition probabilities
  • Example probabilistic action: Move(John, Location1, Location2) (simplified) — 1. Walk(John, Location1, Location2); 2. Drive(John, Location1, Location2)
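A sketch of how a deterministic action declaration might be executed: test the preconditions against the current state, then apply the effects (progression). The WakeUp literals are from the slide; the Action class and state encoding are our assumptions, continuing the sketch above.

```python
from dataclasses import dataclass

# A literal is (predicate, args, polarity), e.g. ("Awake", ("John",), False).

@dataclass
class Action:
    name: str
    pre: list   # literals that must hold before execution
    eff: list   # literals forced to hold afterwards

    def applicable(self, state):
        # Every precondition literal must be assigned with the stated polarity.
        return all(state.get((p, a)) == pol for p, a, pol in self.pre)

    def progress(self, state):
        # Progression: copy the state, then overwrite the effect literals.
        nxt = dict(state)
        for p, a, pol in self.eff:
            nxt[(p, a)] = pol
        return nxt

wake_up = Action(
    "WakeUp(John, Bed)",
    pre=[("Awake", ("John",), False), ("LyingOn", ("John", "Bed"), True)],
    eff=[("Awake", ("John",), True), ("LyingOn", ("John", "Bed"), False)],
)

s0 = {("Awake", ("John",)): False, ("LyingOn", ("John", "Bed")): True}
if wake_up.applicable(s0):
    print(wake_up.progress(s0))
    # {('Awake', ('John',)): True, ('LyingOn', ('John', 'Bed')): False}
```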

  14. Probabilistic Action
  • Determine each transition using WordNet (Fellbaum, 1998)
  • [WordNet fragment: go, move → drive (test drive); fly (soar, billow, hover); run (skitter, rush); walk (march, countermarch, step)]
  • Example: Go(John, Work) → Walk(John, Work) or Drive(John, Work)
  • Assign transition probabilities by calculating the probability of each primitive using the context of the sentence
  • Compute P(Walk | work), P(Drive | work)
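The slide's WordNet tree can be reproduced with NLTK's WordNet interface. This is our reconstruction of the lookup, not the authors' code; it assumes nltk is installed and the wordnet corpus downloaded.

```python
# pip install nltk; then: import nltk; nltk.download("wordnet")
from nltk.corpus import wordnet as wn

def candidate_primitives(verb):
    """Collect more specific verbs beneath each WordNet sense of `verb`,
    e.g. go/move -> walk, drive, fly, run, ... (cf. the tree on the slide)."""
    primitives = set()
    for synset in wn.synsets(verb, pos=wn.VERB):
        for hyponym in synset.hyponyms():
            primitives.update(hyponym.lemma_names())
    return primitives

print(sorted(candidate_primitives("go"))[:10])
```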

  15. Disambiguation Algorithm
  • Training set (Lillian Lee): (noun, verb, frequency) triples for the 1000 most popular nouns
  • Test sets (SemCor, Senseval): (object, siblings of the verb); label: the verb in the sentence
  • Goal: P(sibling verb | object) for each sentence
  • Compute freq(sibling verb, object) / freq(object)
  • If "object" is not in the training set: replace the object with hypernym(object), e.g., replace "lady" with "woman"
  • If (object, candidate verb) is not in the training set: find nouns similar to "object"
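A sketch of the scoring rule freq(sibling verb, object) / freq(object) with the hypernym backoff described on the slide. The count tables here are hypothetical stand-ins for the (noun, verb, frequency) training data; the hypernym lookup uses NLTK's WordNet.

```python
from collections import Counter
from nltk.corpus import wordnet as wn

# Hypothetical counts standing in for the (noun, verb, frequency) training set.
pair_freq = Counter({("work", "drive"): 30, ("work", "walk"): 12})
noun_freq = Counter({"work": 100})

def hypernym(noun):
    """First WordNet hypernym lemma of the noun, e.g. 'lady' -> a more
    general noun such as 'woman' (per the slide's example)."""
    for synset in wn.synsets(noun, pos=wn.NOUN):
        for hyper in synset.hypernyms():
            return hyper.lemma_names()[0]
    return None

def p_verb_given_object(verb, obj):
    if noun_freq[obj] == 0:          # unseen object: back off to its hypernym
        parent = hypernym(obj)
        return p_verb_given_object(verb, parent) if parent else 0.0
    return pair_freq[(obj, verb)] / noun_freq[obj]

print(p_verb_given_object("drive", "work"))  # 0.3
print(p_verb_given_object("walk", "work"))   # 0.12
```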

  16. Some Results
  • [Figure: probability distributions over candidate verbs, and accuracy, on Test set 1 and Test set 2]

  17. Prior Knowledge
  • Knowledge base of state constraints:
  • At(agent, location1) ∧ location1 ≠ location2 → ¬At(agent, location2)
  • AtHand(agent, physobj) → ¬OnFloor(physobj)
  • Bayes net or Markov network to represent dependencies:
  • P(Hungry(agent) | Eat(agent, food)) = 0.8
  • P(Drive(agent, loc1, loc2) | distance(loc1, loc2) > 1m) = 0.7
  • Probabilistic Open Mind (Singh et al., 2002): "You can often find 'Object' in 'Location'"
  • "You can often find a bed in a bedroom"
  • "You can often find a bed in a hotel room"
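A sketch of enforcing the first state constraint above against a partial state encoded as in the earlier sketches; the checker itself is our illustration, not part of the system described.

```python
def violates_unique_location(state):
    """At(agent, l1) and l1 != l2 implies not At(agent, l2):
    flag any agent asserted to be at two different locations."""
    seen = {}
    for (pred, args), value in state.items():
        if pred == "At" and value:
            agent, loc = args
            if seen.setdefault(agent, loc) != loc:
                return True
    return False

s = {("At", ("John", "Work")): True, ("At", ("John", "Bedroom")): True}
print(violates_unique_location(s))  # True: John cannot be in two places
```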

  18. Acquisition of Object Location Probabilities
  • Open Mind: "You often find 'Object' in 'Location'"
  • Goal: P(object in location), e.g., P(bed in bedroom) > P(bed in hotel room) > P(bed in hospital)
  • Method:
  • Extract an objects list (1600 objects) and a locations list (2675 locations)
  • Use a corpus of American literature stories (downloaded from Project Gutenberg)
  • Compute correlations between objects and locations; we used the probability P(Near(object, location) | object)
  • Cross-reference probabilities with Open Mind and normalize
  • (Some) results: P(bed in bedroom) = 0.5; P(bed in hotel room) = 0.33; P(bed in hospital) = 0.17
  • Add missing assertions suggested by the corpus to Open Mind
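A sketch of the correlation step, estimating P(Near(object, location) | object) from co-occurrence. Treating "near" as same-sentence co-occurrence, and the toy corpus, are our assumptions; the actual corpus was American literature from Project Gutenberg.

```python
from collections import Counter

def location_distribution(obj, sentences, locations):
    """P(Near(obj, location) | obj), with 'near' approximated as
    co-occurrence of obj and a known location in the same sentence."""
    near = Counter()
    for tokens in sentences:
        if obj in tokens:
            near.update(loc for loc in locations if loc in tokens)
    total = sum(near.values())
    return {loc: n / total for loc, n in near.items()} if total else {}

corpus = [
    ["the", "bed", "stood", "in", "the", "bedroom"],
    ["a", "bed", "in", "the", "hotelroom"],
    ["the", "bedroom", "was", "dark"],
]
print(location_distribution("bed", corpus, {"bedroom", "hotelroom", "hospital"}))
# {'bedroom': 0.5, 'hotelroom': 0.5}
```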

  19. Our Approach
  • Represent sentences using the event calculus (a logical framework) + probabilities
  • Each sentence states properties or actions:
  • Property: a statement about the world
  • Action: a change in the world
  • Probabilities: uncertainty and ambiguity
  • Algorithms for stochastic inference

  20. Inference Algorithm
  • [Figure: tree of disambiguated actions da11, da12, da13; da21, da22, da23; da31, da32, da33 — one branching level per sentence]
  • Goal: answer a question related to the text
  • Question format: a query formula at a given time
  • Algorithm: consider all possible paths from root to leaves; for each path:
  • 1. Compute the probability of the path (the product of its transition probabilities)
  • 2. Compute the truth of the query's logical formula along the path
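A one-function sketch of step 1: under the tree semantics described here, the probability of a root-to-leaf path is the product of the transition probabilities of the disambiguated actions chosen along it. The probability values below are illustrative only.

```python
from math import prod

def path_probability(transition_probs):
    """Probability of one root-to-leaf path through the tree:
    the product of the transition probabilities along it."""
    return prod(transition_probs)

# e.g. Move -> Drive with prob 0.7, then Eat -> choice 1 with prob 0.8
print(path_probability([0.7, 0.8]))  # 0.56
```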

  21. Path 1: WakeUp(John, Bed). SwitchLight(John).
  • [Figure: state spaces at time 0, time 1, time 2; world states updated forward, query information propagated back]
  • Query 1: certain answer
  • Updating world states
  • Propagating information back
  • Check for conflicts at each time

  22. Query 2: Regression
  • [Figure: state spaces over time; world states updated forward, Query 1 and Query 2 propagated back]
  • Regress Query 2 to time 0; use prior knowledge
  • Example: P(At(John, Bedroom) at time 0) = P(In(Bed, Bedroom)) = … from prior knowledge

  23. Efficiency of the Algorithm
  • Naïve algorithm (complete state): a truth assignment to all possible predicates
  • Our algorithm (partial state): a truth assignment to only those predicates useful for understanding the text

  24. Free Variables and Quantifiers
  • No need to enumerate all possible cases
  • Example: MvWithObj(B, l1, l2) — Precondition: At(B, l1), ¬At(B, l2), ∀o: In(o); Effect: ¬At(B, l1), At(B, l2), At(o, l2)
  • No need to enumerate all possible permutations of objects inside the briefcase

  25. Handling Free Variables
  • Store all the possible values that variable o can take
  • Add constraints when new information is received
  • P(o); new claim P(K) → remove K from the possible values of o
  • P(o); new claim ¬P(K) → add (K ≠ o) to the knowledge
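A small sketch of such a free-variable store, applying the two update rules from the slide as stated; the class and names are our illustration.

```python
class FreeVar:
    """Possible values for a free variable o, narrowed by new claims."""

    def __init__(self, candidates):
        self.candidates = set(candidates)
        self.inequalities = set()  # accumulated (K != o) constraints

    def claim_positive(self, k):
        # State has P(o); new claim P(K): remove K from o's possible values.
        self.candidates.discard(k)

    def claim_negative(self, k):
        # State has P(o); new claim ~P(K): add the constraint K != o.
        self.inequalities.add(k)
        self.candidates.discard(k)

o = FreeVar({"Kitchen", "Bedroom", "Bathroom"})
o.claim_negative("Kitchen")        # ~P(Kitchen)  =>  o != Kitchen
print(sorted(o.candidates))        # ['Bathroom', 'Bedroom']
```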

  26. Filling in Missing Actions
  • "Bob woke up. Bob took a shower." Solution: Bob went to the bathroom.
  • Build the tree representing the text before the missing action
  • Build the tree representing the text after the missing action
  • If the state from the left tree conflicts with the initial state of the right tree: find actions that do not introduce a contradiction
  • Future work: rank the candidate actions
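A sketch of the conflict-driven search, reusing the Action class from the sketch after slide 13: keep any candidate action whose progression from the left tree's final state is consistent with the initial state the right tree requires. Ranking the survivors is, per the slide, future work.

```python
def fill_missing_action(left_state, right_requirements, candidates):
    """Candidate actions bridging the two trees without contradiction.
    left_state: state after the text before the gap.
    right_requirements: literals the text after the gap needs initially.
    candidates: Action objects (see the sketch after slide 13)."""
    consistent = []
    for action in candidates:
        if not action.applicable(left_state):
            continue
        nxt = action.progress(left_state)
        # Unassigned literals are compatible; assigned ones must agree.
        if all(nxt.get(lit, val) == val for lit, val in right_requirements.items()):
            consistent.append(action)
    return consistent
```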

  27. Conclusions and Future Work
  • Done: a framework for representing a text, and a reasoning algorithm for answering queries
  • Done: an approximate reasoning algorithm (sampling) (Hajishirzi & Amir, AAAI 07, UAI 08)
  • Future work: comparing the performance of the whole system with other approaches
  • Future work: definition of deterministic actions (preconditions and effects)
  • Future work: more accurate disambiguation techniques

  28. Thank You
  • Questions?

  29. References
  • Alshawi, H. (1992). The Core Language Engine. Cambridge, MA: MIT Press.
  • Fellbaum, C. (1998). WordNet: An Electronic Lexical Database. Cambridge, MA: MIT Press.
  • Hobbs, J. R., Stickel, M. E., Appelt, D. E., & Martin, P. (1993). Interpretation as abduction. Artificial Intelligence, 63, 69-142.
  • Manning, C. D., & Schütze, H. (1999). Foundations of Statistical Natural Language Processing. Cambridge, MA: MIT Press.
  • Singh, P., Lin, T., Mueller, E. T., Lim, G., Perkins, T., & Zhu, W. L. (2002). Open Mind Common Sense: Knowledge acquisition from the general public. In Lecture Notes in Computer Science: Vol. 2519. On the Move to Meaningful Internet Systems. Berlin: Springer.

  30. Action Declarations
  • WakeUp(John, Bed) — Pre: ¬Awake(John), LyingOn(John, Bed); Eff: Awake(John), ¬LyingOn(John, Bed)
  • SwitchLight(John) — 1. Pre: ¬LightOn(room), At(John, room); Eff: LightOn(room). 2. Pre: LightOn(room), At(John, room); Eff: ¬LightOn(room)
  • Eat(John, Food) — 1. Pre: Hungry(John), At(John, room), At(Food, room); Eff: ¬Hungry(John), ¬At(Food, room). 2. Pre: ¬Hungry(John), At(John, room), At(Food, room); Eff: ¬At(Food, room)
  • Move(John, Location1, Location2) (simplified) — 1. Walk(John, Location1, Location2); 2. Drive(John, Location1, Location2)

  31. Path 1: WakeUp(John, Bed). SwitchLight(John). Eat(John, Food). — Regression and Progression
  • Time 0 (regressed): ¬Awake(John), LyingOn(John, Bed)
  • Time 0 (progressed): ¬Awake(John), LyingOn(John, Bed), At(John, room), ¬LightOn(room), At(Food, room), Hungry(John)
  • WakeUp(John, Bed) — Pre: ¬Awake(John), LyingOn(John, Bed); Eff: Awake(John), ¬LyingOn(John, Bed)
  • Time 1 (progressed): Awake(John), ¬LyingOn(John, Bed), At(John, room), ¬LightOn(room), At(Food, room), Hungry(John)
  • Time 1 (regressed): Awake(John), ¬LyingOn(John, Bed), At(John, room), ¬LightOn(room)
  • SwitchLight(John), choice 1 — Pre: ¬LightOn(room), At(John, room); Eff: LightOn(room)
  • Time 2: Awake(John), ¬LyingOn(John, Bed), At(John, room), LightOn(room), At(Food, room), Hungry(John)
  • Eat(John, Food), choice 1 — Pre: Hungry(John), At(John, room), At(Food, room); Eff: ¬Hungry(John), ¬At(Food, room)
  • Time 3: Awake(John), ¬LyingOn(John, Bed), At(John, room), LightOn(room), ¬At(Food, room), ¬Hungry(John)

  32. Partial Grounding
  • Text: AtHand(John, Glass). Move(John, Kitchen, Bedroom).
  • Ground free variables only when necessary
  • Move(John, Kitchen, Bedroom) — Pre: At(John, Kitchen), AtHand(John, physobj), At(physobj, Kitchen); Eff: At(John, Bedroom), ¬At(John, Kitchen), At(physobj, Bedroom), ¬At(physobj, Kitchen)
  • State: AtHand(John, Glass), At(John, Kitchen), At(Glass, Kitchen), AtHand(John, ?), counter = ∞
  • If followed by "He took his wallet out of his pocket": ? == Wallet, counter = counter − 1
  • Remove "?" when counter = 0

  33. Our Specific Contributions
  • Understanding spatial texts
  • Understanding texts by combining logical and probabilistic representations of commonsense knowledge
  • Representation of ambiguities and uncertainties in text
  • Efficient path-based algorithm
  • Acquisition of object location probabilities

  34. Path 1: WakeUp(John, Bed). SwitchLight(John). — Updating world states and propagating back
  • Time 0 (propagated back): ¬Awake(John), LyingOn(John, Bed)
  • Time 0 (updated): ¬Awake(John), LyingOn(John, Bed), At(John, room), ¬LightOn(room) — room is a free variable
  • WakeUp(John, Bed) — Pre: ¬Awake(John), LyingOn(John, Bed); Eff: Awake(John), ¬LyingOn(John, Bed)
  • Time 1: Awake(John), ¬LyingOn(John, Bed), At(John, room), ¬LightOn(room)
  • SwitchLight(John), choice 1 — Pre: ¬LightOn(room), At(John, room); Eff: LightOn(room)
  • Time 2: Awake(John), ¬LyingOn(John, Bed), At(John, room), LightOn(room)
  • Efficient: 1. partial representation of states; 2. partial groundings of actions

  35. Inference Algorithm
  • [Figure: tree of disambiguated actions da11 … da33, one branching level per sentence]
  • Goal: answer a question related to the text
  • Question format: a query formula at a given time
  • Algorithm: consider all possible paths from root to leaves; for each path:
  • 1. Compute the probability of the path (the product of its transition probabilities)
  • 2. Compute the truth of the query's logical formula along the path

  36. Compute Probability of Each Transition
  • [Figure: states s0 → s1 → s2 → s3 connected by actions a1, a2, a3, where each action ai branches into disambiguated actions dai1, dai2, dai3]
