
Recognizing Contextual Polarity in Phrase-Level Sentiment Analysis (HLT/EMNLP 2005)


Presentation Transcript


  1. Recognizing Contextual Polarity in Phrase-Level Sentiment Analysis (HLT/EMNLP 2005) Theresa Wilson, Janyce Wiebe, Paul Hoffmann (University of Pittsburgh) Acknowledgements: These slides are based on the presentation slides from http://www.cs.pitt.edu/~wiebe/

  2. Outline • Introduction • Manual Annotations • Corpus • Prior-Polarity Subjectivity Lexicon • Experiments • Conclusions

  3. Introduction (1/6) • Sentiment analysis: task of identifying positive and negative opinions, emotions, and evaluations • How detailed? → depends on the application • Flame detection, review classification → document-level analysis • Question answering, review mining → sentence- or phrase-level analysis

  4. Introduction (2/6) • QA example: • Q: What is the international reaction to the reelection of Robert Mugabe as President of Zimbabwe? • A: African observers generally approved of his victory while Western governments denounced it.

  5. Introduction (3/6) • Prior polarity: • Use a lexicon of positive and negative words • Examples: • beautiful → positive • horrid → negative • Out of context • Contextual polarity: • A word may appear in a phrase that expresses a different polarity in context • Example: • Cheers to Timothy Whitfield for the wonderfully horrid visuals.
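To make the prior-vs-contextual distinction concrete, here is a minimal sketch of a prior-polarity lookup on the slide's example; the word list is a tiny illustrative sample, not the actual lexicon:

```python
# Toy prior-polarity lookup (hypothetical entries, not the paper's lexicon).
PRIOR_POLARITY = {
    "beautiful": "positive", "horrid": "negative",
    "cheers": "positive", "wonderfully": "positive",
}

phrase = "Cheers to Timothy Whitfield for the wonderfully horrid visuals."
for word in phrase.lower().rstrip(".").split():
    if word in PRIOR_POLARITY:
        print(word, "->", PRIOR_POLARITY[word])
# The lookup tags "horrid" as negative, but in context the phrase is a
# compliment: its contextual polarity is positive.
```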

  6. Introduction (4/6) • Another interesting example: • Philip Clap, President of the National Environment Trust, sums up well the general thrust of the reaction of environmental movements: there is no reason at all to believe that the polluters are suddenly going to become reasonable.

  7. Introduction (5/6) • The same example, annotated: • Philip Clap, President of the National Environment Trust, sums up well the general thrust of the reaction of environmental movements: there is no reason at all to believe that the polluters are suddenly going to become reasonable. • The slide marks the prior polarity vs. the contextual polarity of individual words (e.g. reason and reasonable are positive out of context, but not here).

  8. Introduction (6/6) • Goal: automatically distinguish contextual polarity • Approach: use machine learning and a variety of features • [Diagram: two-step pipeline. The Corpus and the Lexicon feed Step 1 ("Neutral or Polar?") over all clue instances; the polar instances then go to Step 2 ("Contextual Polarity?")]
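A minimal sketch of the two-step control flow, assuming classifier objects with a scikit-learn-style predict method and instances that carry precomputed feature vectors (all names here are illustrative):

```python
def classify_contextual_polarity(instances, neutral_polar_clf, polarity_clf):
    """Two-step classification of clue instances.
    Step 1 decides neutral vs. polar for every instance;
    Step 2 assigns contextual polarity only to the polar ones."""
    labels = {}
    for inst in instances:
        step1 = neutral_polar_clf.predict([inst.step1_features])[0]
        if step1 == "neutral":
            labels[inst] = "neutral"
        else:
            # Step 2 classes: positive, negative, both, neutral
            labels[inst] = polarity_clf.predict([inst.step2_features])[0]
    return labels
```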

  9. Manual Annotation (1/3) • Need: sentiment expressions (positive and negative expressions of emotions, evaluations, stances) with contextual polarity • Had: subjective expression (words/phrases expressing emotions, evaluations, stances, speculations, etc.) annotations in MPQA Opinion Corpus • Decision: annotate subjective expressions in MPQA Corpus with their contextual polarity

  10. Manual Annotation (2/3) • Mark the polarity of subjective expressions as positive, negative, both, or neutral • African observers generally approved (positive) of his victory while Western governments denounced (negative) it. • Besides, politicians refer to good and evil (both) … • Jerome says the hospital feels (neutral) no different than a hospital in the states. • Judge the contextual polarity of the sentiment ultimately being conveyed • They have not succeeded, and will never succeed (positive), in breaking the will of this valiant people.

  11. Manual Annotation (3/3) • Agreement study: • 2 annotators, 10 documents with 447 subjective expressions • Kappa: 0.72 (82% agreement) • Removing uncertain cases (those at least one annotator marked uncertain, 18% of the data): • Kappa: 0.84 (90% agreement) • All data, including the uncertain cases, are used in the experiments
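For reference, Cohen's kappa corrects raw percent agreement for chance agreement; a score of this kind can be reproduced with scikit-learn (the labels below are illustrative, not the study's data):

```python
from sklearn.metrics import cohen_kappa_score

# Illustrative annotator labels, not the actual annotation study data.
annotator_a = ["positive", "negative", "neutral", "negative", "positive"]
annotator_b = ["positive", "negative", "negative", "negative", "positive"]

# Observed agreement is 0.8; kappa discounts chance agreement -> ~0.67
print(cohen_kappa_score(annotator_a, annotator_b))
```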

  12. Corpus • 425 documents from MPQA Opinion Corpus • 15,991 subjective expressions in 8,984 sentences • Divided into two sets • Development set • 66 docs / 2,808 subjective expressions • Experiment set • 359 docs / 13,183 subjective expressions • Divided into 10 folds for cross-validation

  13. Prior-Polarity Subjectivity Lexicon • Over 8,000 words from a variety of sources • Both manually and automatically identified • Positive/negative words from General Inquirer and Hatzivassiloglou and McKeown (1997) • All words in lexicon tagged with: • Prior polarity: positive, negative, both, neutral • Reliability: strongly subjective (strongsubj), weakly subjective (weaksubj)
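One plausible way to represent and load such entries; the three fields mirror the slide, while the file format is an assumption:

```python
from dataclasses import dataclass

@dataclass
class LexiconEntry:
    word: str
    prior_polarity: str   # positive | negative | both | neutral
    reliability: str      # strongsubj | weaksubj

def load_lexicon(path):
    """Hypothetical whitespace-separated format, e.g. 'horrid negative strongsubj'."""
    lexicon = {}
    with open(path) as f:
        for line in f:
            word, polarity, reliability = line.split()
            lexicon[word] = LexiconEntry(word, polarity, reliability)
    return lexicon
```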

  14. Experiments • Both steps: • BoosTexter AdaBoost.HM, 5000 rounds of boosting • 10-fold cross-validation • Each instance is given its own label • Step 1 uses 28 features; Step 2 uses 10 features
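BoosTexter is a standalone tool; as a rough stand-in, the same setup (boosted weak learners with 10-fold cross-validation) can be sketched with scikit-learn. The data below is random placeholder data and the round count is reduced for speed:

```python
import numpy as np
from sklearn.ensemble import AdaBoostClassifier
from sklearn.model_selection import cross_val_predict

# Placeholder data: in the real setup X1 holds the 28 Step-1 features per
# clue instance and y1 holds the gold neutral/polar labels.
rng = np.random.default_rng(0)
X1 = rng.random((200, 28))
y1 = rng.choice(["neutral", "polar"], size=200)

# Stand-in for BoosTexter: boosted decision stumps (5000 rounds in the slides).
step1 = AdaBoostClassifier(n_estimators=100)
step1_pred = cross_val_predict(step1, X1, y1, cv=10)
print((step1_pred == y1).mean())   # cross-validated accuracy
```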

  15. Definition of Gold Standard • Given an instance inst from the lexicon: • if inst not in a subjective expression: goldclass(inst) = neutral • else if inst in at least one positive and one negative subjective expression: goldclass(inst) = both • else if inst in a mixture of negative and neutral: goldclass(inst) = negative • else if inst in a mixture of positive and neutral: goldclass(inst) = positive • else: goldclass(inst) = contextual polarity of subjective expression
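A direct transcription of those rules as a function; it assumes we are given the contextual polarities of the subjective expressions that contain the instance:

```python
def gold_class(expression_polarities):
    """expression_polarities: contextual polarities of the subjective
    expressions containing this lexicon instance (possibly empty)."""
    pols = set(expression_polarities)
    if not pols:                                    # not in any expression
        return "neutral"
    if "positive" in pols and "negative" in pols:   # conflicting expressions
        return "both"
    if pols == {"negative", "neutral"}:             # mixture of neg and neutral
        return "negative"
    if pols == {"positive", "neutral"}:             # mixture of pos and neutral
        return "positive"
    return expression_polarities[0]                 # single consistent polarity
```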

  16. Features • Many inspired by Polanyi & Zaenen (2004): Contextual Valence Shifters • Examples: little threat, little truth • Others capture dependency relationships between words • Example: wonderfully horrid (a mod dependency relation)

  17. Step 1 Features: Word Features • (Step 1 feature groups: word, modification, structure, sentence, and document features) • Word token: terrifies • Word part-of-speech: VB • Context (3 word tokens): that terrifies me • Prior polarity: negative • Reliability: strongsubj
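A sketch of how these word features might be collected for one clue instance, assuming the sentence is already tokenized and POS-tagged and that the lexicon maps each word to its prior polarity and reliability (as in the lexicon-loading sketch above):

```python
def word_features(tokens, pos_tags, i, lexicon):
    """Step-1 word features for the clue instance at position i.
    tokens/pos_tags: the tokenized sentence and its POS tags;
    lexicon: word -> entry with prior_polarity and reliability attributes."""
    entry = lexicon[tokens[i]]
    return {
        "token":          tokens[i],                               # e.g. "terrifies"
        "pos":            pos_tags[i],                             # e.g. "VB"
        "context":        " ".join(tokens[max(0, i - 1):i + 2]),   # e.g. "that terrifies me"
        "prior_polarity": entry.prior_polarity,                    # e.g. "negative"
        "reliability":    entry.reliability,                       # e.g. "strongsubj"
    }
```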

  18. Step 1 Features: Modification Features • (Binary features, computed from the dependency parse tree) • Preceded by: an adjective; an adverb (other than not); an intensifier (e.g. deeply, entirely, …) • Is itself an intensifier • Modifies: a strongsubj clue; a weaksubj clue • Modified by: a strongsubj clue; a weaksubj clue

  19. Step 1 Features: Structure Features • (Binary features, computed by climbing up the dependency tree toward the root) • [Figure: dependency parse of "The human rights report poses a substantial challenge …"] • In subject: The human rights report poses • In copular: I am confident • In passive voice: must be regarded
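The modification and structure features on this and the previous slide are both read off a dependency parse. A rough sketch with spaCy (assuming the small English model is installed; spaCy's label scheme differs from the parser used in the paper):

```python
import spacy

nlp = spacy.load("en_core_web_sm")

def tree_features(sentence, clue, strongsubj, weaksubj):
    """Dependency-tree features for the first occurrence of `clue`.
    strongsubj/weaksubj: sets of clue words taken from the lexicon."""
    doc = nlp(sentence)
    tok = next(t for t in doc if t.lower_ == clue)
    kids = {c.lower_ for c in tok.children}
    return {
        "modifies_strongsubj":    tok.head.lower_ in strongsubj,
        "modifies_weaksubj":      tok.head.lower_ in weaksubj,
        "modified_by_strongsubj": bool(kids & strongsubj),
        "modified_by_weaksubj":   bool(kids & weaksubj),
        # climbing toward the root: is the clue inside a subject?
        "in_subject": any(a.dep_ in ("nsubj", "nsubjpass")
                          for a in [tok] + list(tok.ancestors)),
        # passive auxiliary attached to the clue or its head ("must be regarded")
        "in_passive": any(t.dep_ == "auxpass"
                          for t in list(tok.children) + list(tok.head.children)),
    }
```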

  20. Step 1 Features: Sentence Features • Count of strongsubj clues in the previous, current, and next sentence • Count of weaksubj clues in the previous, current, and next sentence • Counts of various parts of speech (adjectives, adverbs, whether the instance is a pronoun, …)
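A small sketch of the clue-count sentence features, assuming the strongsubj and weaksubj clue sets come from the lexicon's reliability tags:

```python
def sentence_features(prev_sent, cur_sent, next_sent, strongsubj, weaksubj):
    """Counts of subjectivity clues in the surrounding sentences.
    Each sentence is a list of lowercased tokens."""
    feats = {}
    for name, sent in [("prev", prev_sent), ("cur", cur_sent), ("next", next_sent)]:
        feats[f"strongsubj_{name}"] = sum(tok in strongsubj for tok in sent)
        feats[f"weaksubj_{name}"] = sum(tok in weaksubj for tok in sent)
    return feats
```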

  21. Step 1 Features: Document Feature • Document topic (15 topics): economics, health, Kyoto protocol, presidential election in Zimbabwe, … • For example, a document on health may contain the word "fever," but without using it to express a sentiment.

  22. Results 1a • [Step 1 results figure not included in the transcript]

  23. Results 1b • [Step 1 results figure not included in the transcript]

  24. Step 2: Polarity Classification • Classes: positive, negative, both, neutral • [Pipeline diagram with instance counts: 19,506 and 5,671]

  25. Step 2 Features • Word token • Word prior polarity • Negated • Negated subject • Modifies polarity • Modified by polarity • Conjunction polarity • General polarity shifter • Negative polarity shifter • Positive polarity shifter

  26. Step 2 Features: Word Token and Prior Polarity • Word token: terrifies • Word prior polarity: negative

  27. Step 2 Features: Negation (binary features) • Negated: not good; does not look very good • Negated subject: No politically prudent Israeli could support either of them.
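A rough, word-list-based sketch of these two negation features; the real features are computed over the parse, and the negation word lists below are illustrative only:

```python
NEGATION_WORDS = {"not", "no", "never", "n't", "without"}   # illustrative
SUBJECT_NEGATORS = {"no", "nobody", "none", "nothing"}      # illustrative

def negation_features(tokens, i, subject_tokens):
    """tokens: lowercased sentence tokens; i: clue position;
    subject_tokens: tokens of the clue's syntactic subject, if any."""
    window = tokens[max(0, i - 4):i]   # a few words before the clue
    return {
        # "not good", "does not look very good"
        "negated": any(tok in NEGATION_WORDS for tok in window),
        # "No politically prudent Israeli could support either of them."
        "negated_subject": bool(subject_tokens)
                           and subject_tokens[0] in SUBJECT_NEGATORS,
    }
```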

  28. Step 2 Features: Modification Polarity • Example: substantial (positive prior) challenge (negative prior) • Modifies polarity: 5 values (positive, negative, neutral, both, not mod) • substantial: negative, because it modifies the negative word challenge • Modified by polarity: 5 values (positive, negative, neutral, both, not mod) • challenge: positive, because it is modified by the positive word substantial

  29. Step 2 Features: Conjunction Polarity • Example: good (positive prior) and evil (negative prior) • Conjunction polarity: 5 values (positive, negative, neutral, both, not mod) • good: negative, because it is conjoined with the negative word evil

  30. Step 2 Features: Polarity Shifters (within the 4 words before the instance) • General polarity shifter: pose little threat; contains little truth • Negative polarity shifter: lack of understanding • Positive polarity shifter: abate the damage
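A sketch of the three shifter features, which simply look for shifter words among the four tokens before the instance; the shifter lists below are tiny illustrative samples, not the paper's lists:

```python
GENERAL_SHIFTERS  = {"little", "hardly", "scarcely"}   # pose little threat
NEGATIVE_SHIFTERS = {"lack", "absence"}                # lack of understanding
POSITIVE_SHIFTERS = {"abate", "ease", "relieve"}       # abate the damage

def shifter_features(tokens, i):
    """Look at the 4 lowercased tokens before the clue at position i."""
    window = set(tokens[max(0, i - 4):i])
    return {
        "general_shifter":  bool(window & GENERAL_SHIFTERS),
        "negative_shifter": bool(window & NEGATIVE_SHIFTERS),
        "positive_shifter": bool(window & POSITIVE_SHIFTERS),
    }
```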

  31. Results 2a • [Step 2 results figure not included in the transcript]

  32. Results 2b • [Step 2 results figure not included in the transcript]

  33. Ablation Experiments • Removing feature groups: • AB1: negated, negated subject • AB2: modifies polarity, modified by polarity • AB3: conjunction polarity • AB4: general, negative, and positive polarity shifters • Results: the only significant difference is in neutral F-measure when the AB2 features are removed → the combination of features is needed to achieve significant performance

  34. Conclusion • Automatically identified the contextual polarity of a large subset of sentiment expressions • Presented a two-step approach to phrase-level sentiment analysis • Step 1 determines whether an expression is neutral or polar • Step 2 determines the contextual polarity of the expressions that are polar • Achieved significant results for a large subset of sentiment expressions

  35. Q & A
