
Mining Opinions from Reviews


  1. Mining Opinions from Reviews • Aditi S. Muralidharan • Summer Intern • Dept. of Computer Science, UC Berkeley

  2. Dig • A walk-up-and-use, task-centered product browsing interface (any product, not just cameras)

  3. Dig Demo

  4. My Job • Reviews: too many to read, too much to analyze. Star ratings? • Extract useful information from customer reviews: customer opinions of features, quantitatively expressed

  5. The Tasks • Opinion mining turns reviews into customer opinions of features • Extract product features from reviews • Extract opinions about features • Show them to users (Part 2)

  6. Opinion Mining / Sentiment Analysis • 1. Evaluation unit, e.g. newspaper article, review, product feature • 2. Opinion units, e.g. sentences, phrases, adjectives • 3. Sentiment score

  7. Established Application: Scoring Documents • Example question: how many positive articles about President Obama last week? • Pipeline: news article → sentences → n-gram bag-of-words features → classifier (trained on a labeled training set) → per-sentence scores → document score by majority voting
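
To make that pipeline concrete, here is a minimal sketch assuming scikit-learn; the talk does not name an implementation, and the training sentences and model choice below are purely illustrative.

    # Sentence-level bag-of-words classifier, with the document scored by
    # majority voting over its sentences (assumes scikit-learn).
    from sklearn.feature_extraction.text import CountVectorizer
    from sklearn.linear_model import LogisticRegression

    # Hypothetical labeled training sentences (1 = positive, 0 = negative).
    train_sentences = [
        "a strong and widely praised speech",
        "approval ratings rose sharply this week",
        "a weak and unpopular policy decision",
        "criticism mounted over the administration",
    ]
    train_labels = [1, 1, 0, 0]

    vectorizer = CountVectorizer(ngram_range=(1, 2))  # n-gram bag-of-words features
    clf = LogisticRegression().fit(
        vectorizer.fit_transform(train_sentences), train_labels)

    def score_article(sentences):
        # Classify each sentence, then score the article by majority vote.
        votes = clf.predict(vectorizer.transform(sentences))
        return 1 if votes.sum() * 2 >= len(votes) else 0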

  8. Scoring Product Features • Pipeline: product feature → referring opinions → product feature score • What are product features? • “The controls are intuitive.” (explicit: easy) • “I easily figured out how to operate it.” (implicit: hard) • We focus our analysis on explicit features.

  9. Extracting Features From Reviews • Which words are product features? Two approaches: • INFORMATION EXTRACTION: ask how people talk about known product features and what else they talk about that way; learn patterns and extract more. Computationally expensive, but precise. • FREQUENCY COUNTING: people describe product features in reviews, so frequent terms are likely to be product features; extract frequent sequences of nouns. Computationally cheap, but imprecise. (A sketch of the frequency-counting baseline follows.)
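
Here is a minimal sketch of the frequency-counting baseline, assuming NLTK for tokenization and tagging (the talk does not specify a toolkit); the min_count threshold is illustrative.

    # Count frequent noun sequences in reviews as candidate product features.
    # Assumes NLTK with the punkt and averaged_perceptron_tagger models installed.
    from collections import Counter
    import nltk

    def candidate_features(reviews, min_count=3):
        counts = Counter()
        for review in reviews:
            tagged = nltk.pos_tag(nltk.word_tokenize(review.lower()))
            run = []
            for word, tag in tagged:
                if tag.startswith("NN"):   # extend the current noun sequence
                    run.append(word)
                else:                       # sequence ended: count it
                    if run:
                        counts[" ".join(run)] += 1
                    run = []
            if run:
                counts[" ".join(run)] += 1
        # Frequent noun sequences are the candidate product features.
        return [f for f, c in counts.items() if c >= min_count]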

  10. Pattern-Based Information Extraction • Seed features and reviews yield extraction patterns such as “camera has _____”, “the _____ of this camera”, and “it features a _____” • Applying the patterns to reviews produces candidates (flash, daughter, vacation, ..., zoom, lens, weight, ...) • Candidates are filtered by Web-PMI: hits(“camera has <feature>”) / (hits(<feature>) × hits(“camera has”)) • Surviving candidates become extracted features (flash, controls, battery, ...) and feed back in as new seed features • A parallelized implementation takes advantage of all available resources (a sketch of the PMI filter follows)
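
A sketch of the Web-PMI filter on this slide: hits() stands in for a web-search hit-count lookup, which the real system queries and which you would have to supply; the threshold value is illustrative.

    # Web-PMI filtering of candidate product features.
    def hits(query):
        # Stub: replace with a real web-search hit-count API.
        raise NotImplementedError("supply a web search hit-count API")

    def web_pmi(feature, pattern="camera has"):
        # e.g. hits("camera has flash") / (hits("flash") * hits("camera has"))
        return hits(f'"{pattern} {feature}"') / (
            hits(f'"{feature}"') * hits(f'"{pattern}"'))

    def filter_candidates(candidates, threshold=1e-9):
        # Keep candidates whose Web-PMI exceeds a tuned threshold; the
        # survivors become extracted features and the next round's seeds.
        return [c for c in candidates if web_pmi(c) > threshold]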

  11. Scoring Product Features • Pipeline recap: explicit product feature → referring opinions → product feature score

  12. Extracting Opinions • Which words are opinion words? Opinion words are adjectives and adverbs; two ways to tie them to features: • DEPENDENCY PARSING: an adjective or adverb is likely an opinion if an amod / nsubj / advmod relationship links it to a feature mention. Computationally expensive, but precise, and neg (negation) relations are easily detected. • PROXIMITY: an adjective or adverb is likely an opinion if it occurs near a feature mention. Computationally cheap, but imprecise, and negation is hard to detect.

  13. Extracting Opinions • Review-sentence dependency parses link extracted features (flash, controls, battery, ...) to adjectives: • “The controls are intuitive.”: nsubj links intuitive to controls • “There are large controls on the top.”: amod links large to controls • “The controls feel natural.”: nsubj links natural to controls • Remaining question: how to classify the adjectives as positive or negative?
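
A minimal sketch of the dependency-parse approach, assuming spaCy (the slide's relation names suggest Stanford dependencies, whose label set differs slightly; for example, spaCy attaches predicate adjectives as acomp rather than making the adjective the head).

    # Find (feature, adjective) opinion pairs via dependency relations.
    # Assumes spaCy with the en_core_web_sm model installed.
    import spacy

    nlp = spacy.load("en_core_web_sm")

    def opinions_about(sentence, feature_lemmas):
        doc = nlp(sentence)
        found = []
        for tok in doc:
            if tok.pos_ not in ("ADJ", "ADV"):
                continue
            # Direct modifier of a feature mention: "large controls".
            if tok.dep_ in ("amod", "advmod") and tok.head.lemma_ in feature_lemmas:
                found.append((tok.head.lemma_, tok.text))
            # Predicate adjective whose subject is a feature mention:
            # "The controls are intuitive."
            elif tok.dep_ == "acomp":
                subjects = [c for c in tok.head.children if c.dep_ == "nsubj"]
                if subjects and subjects[0].lemma_ in feature_lemmas:
                    found.append((subjects[0].lemma_, tok.text))
            # A neg (negation) child would be checked here to flip polarity.
        return found

    print(opinions_about("The controls are intuitive.", {"control"}))
    # -> [('control', 'intuitive')]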

  14. Scoring Product Features • Pipeline recap: explicit product feature → referring opinions → product feature score

  15. Classifying Opinions • Key idea: synonymous words have high Web-PMI with each other • WebPMI(adj, great) = HITS(“camera” NEAR adj NEAR great) / (HITS(“camera” NEAR adj) × HITS(“camera” NEAR great)) • An unknown adjective (e.g. intuitive, in the context of “camera”) is represented as a feature vector of WebPMI values against known-polarity training words (great +, excellent +, poor -, terrible -, ...) • A classifier maps this vector to a polarity (+/-) • F1 scores: 0.78 (+), 0.76 (-)
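
A sketch of the WebPMI feature vector, with the paradigm words taken from the slide; hits() again stands in for a web-search hit-count lookup you would have to supply.

    # Represent an unknown adjective by its Web-PMI against known-polarity words.
    def hits(query):
        # Stub: replace with a real web-search hit-count API.
        raise NotImplementedError("supply a web search hit-count API")

    PARADIGM = ["great", "excellent", "poor", "terrible"]  # +, +, -, -

    def web_pmi(adj, word, context="camera"):
        # e.g. WebPMI(intuitive, great) = HITS("camera" NEAR intuitive NEAR great)
        #   / (HITS("camera" NEAR intuitive) * HITS("camera" NEAR great))
        return hits(f'"{context}" NEAR {adj} NEAR {word}') / (
            hits(f'"{context}" NEAR {adj}') * hits(f'"{context}" NEAR {word}'))

    def feature_vector(adj):
        # Synonyms have high Web-PMI with each other, so a positive adjective
        # scores high against "great" and "excellent"; the resulting vector
        # is fed to a trained +/- classifier.
        return [web_pmi(adj, w) for w in PARADIGM]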

  16. Scoring Product Features • Pipeline recap: explicit product feature → referring opinions → product feature score • Goal: avoid extreme estimates

  17. Estimating Product Feature Scores • When there are few data points, plain averaging gives extreme estimates • Instead, a beta-binomial smoothing model estimates the “true” sentiment s of each product feature • Fixed priors a+ and a- are placed on s; the distribution of true adjective polarities p is binomial on s • An added layer models classification mistakes: the observed polarity w (from the classifier) is a noisy version of p
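
To see why smoothing helps, here is the core of a beta-binomial estimate (this omits the talk's extra layer for classifier mistakes, and the prior values are illustrative): with Beta(a+, a-) priors and n_pos positive adjectives out of n_total, the posterior mean pulls small samples toward the prior.

    # Beta-binomial smoothed sentiment score for one product feature.
    def smoothed_score(n_pos, n_total, a_plus=2.0, a_minus=2.0):
        # Posterior mean of the "true" sentiment s under Beta(a_plus, a_minus)
        # priors; shrinks toward the prior when n_total is small.
        return (n_pos + a_plus) / (n_total + a_plus + a_minus)

    # One positive adjective out of one: plain averaging says 1.0,
    # while the smoothed estimate is a far more cautious 0.6.
    print(smoothed_score(1, 1))     # 0.6
    print(smoothed_score(90, 100))  # ~0.88, close to the plain average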

  18. Scoring Product Features • Pipeline recap: explicit product feature → referring opinions → product feature score • Goal: avoid extreme estimates

  19. Part 2: UI

  20. Opinions in the UI • The main interface helps the user select a set of products • Users need to compare the selected products • Users need to compare customer-opinion summaries and details

  21. Comparison Interface • Parallel coordinates show different quantitative attributes
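
As a rough illustration of a parallel-coordinates comparison, here is a sketch assuming pandas and matplotlib (the talk does not say how Dig renders its view, and the product data is invented): each polyline is a product and each vertical axis is one quantitative attribute.

    # Parallel-coordinates plot of hypothetical product attributes.
    import pandas as pd
    import matplotlib.pyplot as plt
    from pandas.plotting import parallel_coordinates

    products = pd.DataFrame({
        "name": ["Camera A", "Camera B", "Camera C"],
        "price": [299, 449, 199],
        "zoom": [5, 10, 3],
        "weight_g": [310, 420, 250],
    })
    parallel_coordinates(products, "name")  # one polyline per product
    plt.show()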

  22. Customer Opinions • Red and green bars summarize the number and positivity of opinions. Adjectives appear in a list.

  23. Thanks! Questions?
