
Inducing Relations






Presentation Transcript


  1. Inducing Relations • Goal: Discover types of information salient to a domain and extract short phrases representing them • Document 1: "... Boston was founded on November 17, 1630, by Puritan colonists from England ..." • Document 2: "... New York City was settled by Europeans from The Netherlands in 1624 ..." • Document 3: "... San Francisco was founded in 1776 by the Spanish conquerors ..."

  2. Application: Creating Metadata • Machine-readable access mechanism for searching, browsing, and retrieving text • [Figure: example metadata for disaster reports and medical records]

  3. Application: Generating Infoboxes • Exploring the important attributes of new domains automatically • [Figure: automatically generated infoboxes for Cambridge and Seattle]

  4. Regularities for Learning Relations • Local lexical and orthographic similarity in the expression of relation instances: "… injured six people …", "… injured 16 relief workers …", "… four were hurt …" (evoking word: injur*; relation phrase: {number}; a minimal pattern-matching sketch follows below) • Recurring specific syntactic patterns in relation occurrences (e.g., "Three killed", "six injured") • Similar document-level positioning of relations
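
A minimal sketch of the lexical regularity above, approximating the "injur* … {number}" pattern with a plain regex; the regex and word list are illustrative assumptions, not something from the slides:

```python
import re

# Hedged sketch: approximate the "injur* ... {number}" regularity with an
# evoking stem followed by a numeric or spelled-out count.
PATTERN = re.compile(
    r"\binjur\w*\s+(?P<count>\d+|one|two|three|four|five|six|seven|eight|nine|ten)\b",
    re.IGNORECASE,
)

for sentence in ["The blast injured six people.",
                 "Floods injured 16 relief workers."]:
    match = PATTERN.search(sentence)
    if match:
        print(match.group("count"))   # -> six, 16
```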

  5. Note – this is the example of the document-level regularity for the slide above • Beginning of document: "A strong earthquake with a magnitude of 5.6 rocked the easternmost province of Irian Jaya on Friday." • End of document: "An earthquake of magnitude 6 is considered 'severe,' capable of widespread damage near the epicenter."

  6. Highlights of the Approach • Novel source of supervision: declarative human knowledge about constraints • Rich combination of information sources spanning multiple layers of linguistic analysis • Mathematical formalism that guides unsupervised learning with human knowledge

  7. Input Representation • Each potential indicator word and argument phrase is encoded with features (rows are features, one column per candidate constituent; see the sketch below):
  Arguments:  has_capital 0 1 1 | is_number 0 0 0 | height 1 1 2 | ...
  Indicators: is_verb 0 1 0 | earthquake 1 0 0 | hit 0 1 0 | ...
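
A minimal Python sketch of this encoding; the function names and the exact feature set are assumptions made for illustration:

```python
# Hedged sketch of slide 7: indicator words get lexical/POS features,
# argument phrases get orthographic/syntactic features.
def indicator_features(word, pos):
    return {
        "is_verb": int(pos.startswith("VB")),
        f"word={word.lower()}": 1,        # lexical identity feature
        f"stem={word.lower()[:5]}": 1,    # crude stand-in for a real stemmer
    }

def argument_features(phrase):
    tokens = phrase.split()
    return {
        "has_capital": int(any(t[0].isupper() for t in tokens)),
        "is_number": int(any(t.isdigit() for t in tokens)),
        "height": len(tokens),            # proxy for parse-subtree height
    }

print(indicator_features("injured", "VBN"))
print(argument_features("six people"))
```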

  8. Output Representation • Relation instances as indicator word and argument phrase pairings, e.g., indicator "injured" paired with argument "six people" • Each instance carries its syntactic context (parse labels such as VBN, VP, S, NP) + discourse context (see the data-structure sketch below)
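
A hypothetical container that makes the output representation concrete; the field names are mine, not the authors':

```python
from dataclasses import dataclass

# Hedged sketch of slide 8: a relation instance is an (indicator word,
# argument phrase) pairing plus its syntactic and discourse context.
@dataclass
class RelationInstance:
    relation_type: int        # index of the induced relation
    indicator: str            # e.g. "injured"
    argument: str             # e.g. "six people"
    syntax_path: list[str]    # e.g. ["VBN", "VP", "S", "NP"]
    doc_id: int               # discourse context: which document
    sentence_index: int       # ... and which sentence within it

inst = RelationInstance(0, "injured", "six people", ["VBN", "VP", "S", "NP"], 0, 2)
print(inst)
```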

  9. Our Two-Pronged Approach • Model Structure: A generative model of hidden indicator and argument structure • Models local lexical and syntactic similarity • Biases toward consistent document-level structure • Soft declarative constraints: Enforced during inference via posterior regularization • Restricts global syntactic patterns • Enforces relation instance uniqueness

  10. Generating Indicators and Arguments • For a single relation type, indicators and arguments are drawn from relation-specific feature distributions • θ^I: parameters of the indicator feature distributions • θ^A: parameters of the argument feature distributions

  11. Backoff Distributions • Remaining constituents are generated from backoff feature distributions • θ̄^I: parameters of the backoff indicator feature distributions • θ̄^A: parameters of the backoff argument feature distributions

  12. Multiple Relations (maybe delete if no fit) • Each relation r has its own θ^I_r and θ^A_r • Constituent features are drawn from a pointwise product over all relations r: either the indicator distribution or the backoff for each r, and either the argument distribution or the backoff for each r (see the sketch below)
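
A small numpy sketch of the pointwise product; the symbols, shapes, and the shared backoff are assumptions, since the slide's notation did not survive transcription:

```python
import numpy as np

rng = np.random.default_rng(0)
V, K = 6, 3    # feature vocabulary size, number of relations

theta_ind = rng.dirichlet(np.ones(V), size=K)   # per-relation indicator dists
theta_bg = rng.dirichlet(np.ones(V))            # shared backoff distribution

def constituent_dist(is_indicator_for):
    """Pointwise product over relations: relation r contributes its indicator
    distribution if this constituent is r's indicator, else the backoff."""
    p = np.ones(V)
    for r in range(K):
        p = p * (theta_ind[r] if is_indicator_for[r] else theta_bg)
    return p / p.sum()

# A constituent serving as the indicator of relation 1 only:
print(constituent_dist([False, True, False]))
```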

  13. Selecting Relation Locations • Relation instance locations within a document are drawn from a shared location distribution • The indicator and argument within the chosen sentence are selected uniformly at random • [Figure: four example documents, each a list of sentences, with relation instances placed in particular sentences]
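
A minimal sampling sketch of this step; the categorical form of the location distribution and the trailing "null" slot (relation absent) are assumptions:

```python
import random

# Hedged sketch of slide 13: draw a sentence (or null) from a shared location
# distribution, then pick indicator and argument uniformly within it.
def sample_instance(doc_sentences, location_weights, rng=random):
    # location_weights: one weight per sentence plus a final "null" weight
    choices = list(range(len(doc_sentences))) + [None]
    sent_idx = rng.choices(choices, weights=location_weights)[0]
    if sent_idx is None:
        return None                      # relation absent from this document
    sentence = doc_sentences[sent_idx]
    indicator = rng.choice(sentence)     # uniform over words in the sentence
    argument = rng.choice(sentence)      # uniform over candidate phrases
    return sent_idx, indicator, argument

doc = [["A", "strong", "quake", "hit"], ["Two", "people", "were", "injured"]]
print(sample_instance(doc, [0.5, 0.3, 0.2]))
```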

  14. Summary of Generative Process • For each relation r ∈ {1, …, K}: • Draw indicator, argument, and backoff distributions θ^I_r, θ^A_r, θ̄^I, θ̄^A • Draw location distribution λ_r

  15. (continuation of previous slide) • For each document d: • For each relation r ∈ {1, …, K}: • Select a sentence (or null): s_{d,r} ~ λ_r • Draw argument and indicator positions uniformly at random within the sentence • For each potential indicator word w: • Draw indicator features from θ^I_r if word w is selected as the indicator, from θ̄^I otherwise • For each potential argument phrase a: • Draw argument features from θ^A_r if phrase a is selected as the argument, from θ̄^A otherwise (see the end-to-end sketch below)
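
Putting slides 10–15 together, a compact end-to-end sketch of the generative story; the Dirichlet priors and the single feature per constituent are simplifications for illustration, not claims about the authors' exact model:

```python
import numpy as np

rng = np.random.default_rng(1)
K, V, S = 2, 8, 4   # relations, feature vocabulary size, sentences per doc

theta_I = rng.dirichlet(np.ones(V), size=K)   # per-relation indicator dists
theta_A = rng.dirichlet(np.ones(V), size=K)   # per-relation argument dists
bg_I = rng.dirichlet(np.ones(V))              # backoff indicator dist
bg_A = rng.dirichlet(np.ones(V))              # backoff argument dist
lam = rng.dirichlet(np.ones(S + 1), size=K)   # location dists (+1 = null)

def generate_document(words_per_sentence=5):
    doc = [[None] * words_per_sentence for _ in range(S)]
    for r in range(K):
        s = rng.choice(S + 1, p=lam[r])              # sentence or null
        if s == S:                                   # null: relation absent
            continue
        ind = rng.integers(words_per_sentence)       # uniform indicator slot
        arg = rng.integers(words_per_sentence)       # uniform argument slot
        doc[s][ind] = rng.choice(V, p=theta_I[r])    # indicator features
        doc[s][arg] = rng.choice(V, p=theta_A[r])    # argument features
    for sent in doc:                                 # everything else backs off
        for i, w in enumerate(sent):
            if w is None:
                sent[i] = rng.choice(V, p=bg_I)      # (bg_A for phrase slots)
    return doc

print(generate_document())
```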

  16. Model Properties • Lexical similarity • Via features • Recurring syntactic patterns • Via features and constraints during learning • Regularities in document-level structure • Via the document location distribution • Issue: how do we break symmetry between relations? • Via constraints during learning

  17. Variational Inference with Declarative Constraints • Desired posterior: p(θ, z | x), where x is the observed data (words, trees), θ the model parameters, and z the hidden structure (relations) • Optimize a variational objective with the mean-field factorization q(θ, z) = q(θ) q(z)
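
The slide's formulas did not survive transcription; in standard posterior-regularization notation the constrained variational objective would read as follows (a reconstruction, not a verbatim copy of the slide):

```latex
% Reconstructed constrained variational objective (posterior-regularization
% style); f collects the constraint counting functions, b their bounds.
\max_{q(\theta)\,q(z)}\;
  \mathbb{E}_{q(\theta)q(z)}\bigl[\log p(x, z, \theta)\bigr]
  + \mathrm{H}\bigl[q(\theta)\,q(z)\bigr]
\quad\text{subject to}\quad
  \mathbb{E}_{q(z)}\bigl[f(z)\bigr] \le b .
```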

  18. Syntactic Constraints • Biases toward relations that are syntactically plausible: • Indicator is a verb and the argument is the object of the indicator • Indicator is a noun and the argument is a modifier • Indicator and argument are the subject/object of the same verb • f_syn(z) counts the number of relations in z that match a canonical syntactic pattern; the constraint requires a threshold of relations to match (80%)
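
A toy counting function for this constraint; the pattern names and the instance encoding are illustrative assumptions:

```python
# Toy sketch of the syntactic counting function: how many proposed relation
# instances match one of the canonical patterns listed on slide 18.
CANONICAL = {
    ("VERB", "argument_is_object_of_indicator"),
    ("NOUN", "argument_modifies_indicator"),
    ("ANY",  "indicator_and_argument_share_verb"),
}

def f_syn(instances):
    """instances: (indicator POS, indicator-argument relation) pairs."""
    return sum(
        (pos, rel) in CANONICAL or ("ANY", rel) in CANONICAL
        for pos, rel in instances
    )

z = [("VERB", "argument_is_object_of_indicator"), ("NOUN", "unrelated")]
print(f_syn(z) / len(z) >= 0.8)   # constraint: at least 80% must match
```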

  19. Separation Constraints (Argument) • f_w(z) counts the number of relations whose arguments include word w • Constraint: no more than one relation should share the same argument word, i.e., E_q[f_w(z)] ≤ 1 • Encourages relations to be diverse • Arguments cannot be shared across relations

  20. Separation Constraints (Indicator) • g_w(z) counts the number of relations whose indicators include word w • Constraint: allow some relations to share indicator words, i.e., E_q[g_w(z)] ≤ δ for a small slack δ • Encourages relations to be diverse • Indicators can be shared to an extent (see the counting sketch below)
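
A toy sketch covering both separation constraints; the dict encoding of instances is an assumption:

```python
from collections import Counter

# Hedged sketch of slides 19-20: per word, count how many distinct relations
# use it as an argument (bound 1) or as an indicator (a looser bound).
def relations_sharing(instances, field):
    """instances: dicts with 'relation', 'indicator', 'argument' keys.
    Returns word -> number of distinct relations using it in `field`."""
    seen = {(inst["relation"], inst[field]) for inst in instances}
    return Counter(word for _, word in seen)

z = [
    {"relation": 0, "indicator": "injured", "argument": "six"},
    {"relation": 1, "indicator": "injured", "argument": "Mindoro"},
]
print(relations_sharing(z, "argument"))   # each argument word used once: OK
print(relations_sharing(z, "indicator"))  # "injured" shared by 2 relations
```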

  21. Experimental Setup • Experiments on two news domains • Example Document: A strong earthquake rocked the Philippines island of Mindoro early Tuesday, killing at least two people and causing some damage, authorities said. The 3:15 am quake had a preliminary magnitude of 6.7 and was centered near Baco on northern Mindoro Island, about 75 miles south of Manila, according to the Philippine Institute of Volcanology and Seismology. The U.S. Geological Survey in Menlo Park, Calif., put the quake's preliminary magnitude at 7.1. Gov. Rodolfo Valencia of the island's Oriental Mindoro province said two people reportedly were killed and that several buildings and bridges were damaged by the quake. Several homes near the shore reportedly were washed away by large waves, Valencia told Manila radio station DZBB. Telephone service was cut, he said. The quake swayed tall buildings in Manila. Institute spokesman Aris Jimenez said the quake occurred on the Lubang fault, one of the area's most active. A magnitude 6 quake can cause severe damage if centered under a populated area, while a magnitude 7 quake indicates a major quake capable of widespread, heavy damage.

  22. Extracted Relations • Location, Magnitude, Time • "A strong earthquake rocked the Philippines island of Mindoro early Tuesday, killing at least two people and causing some damage, authorities said." • "The 3:15 am quake had a preliminary magnitude of 6.7 and was centered near Baco on northern Mindoro Island..." (this sentence appears twice on the slide, with different spans highlighted)

  23. Generic versus Domain-specific Knowledge • Generic Feature Representation • Indicator: word, POS, word stem • Argument: word, syntax label, headword of parent, dependency label to parent • Domain-specific knowledge (relation independent) • Finance: prefer arguments with numbers • Earthquake: prefer relations in first two sentences of each document

  24. Main Results (Sentence F-score) • [Bar charts: Finance and Earthquake domains] • USP: Unsupervised semantic parsing (Poon and Domingos 2009) • CLUTO: CLUTO sentence clustering • Mallows: Mallows content model sentence clustering (Chen et al. 2009)

  25. Main Results (Token F-score) • [Bar charts: Finance and Earthquake domains] • USP: Unsupervised semantic parsing (Poon and Domingos 2009)

  26. Constraint Ablation Analysis • What happens as we modify the declarative constraints? • No-sep: No separation constraints • No-syn: No syntactic constraints • Hard-syn: Always enforce syntactic constraints • [Bar charts: Finance and Earthquake domains]

  27. What if we had Annotated Data?
