
Ling 570 Day 7: Classifiers


Presentation Transcript


  1. Ling 570 Day 7: Classifiers

  2. Outline • Open questions • One last bit on POS taggers: Evaluation • Classifiers!

  3. Evaluating Taggers

  4. Evaluation • How can we evaluate a POS tagger?

  5. Evaluation • How can we evaluate a POS tagger? • Overall error rate w.r.t. gold-standard test set.

  6. Evaluation metric • A common single metric is accuracy • Defined as: accuracy = (# of correctly tagged tokens) / (total # of tokens) • Why not precision and recall?

  7. Evaluation metric • Precision: of the tokens the tagger labels with a given tag, the fraction that truly carry it: P = TP / (TP + FP) • Recall: of the tokens that truly carry a given tag, the fraction the tagger labels correctly: R = TP / (TP + FN)
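
A minimal sketch of these metrics in Python, assuming gold and predicted tags arrive as parallel lists (the function names are illustrative, not from the slides):

    # Token-level accuracy: fraction of tokens whose predicted tag matches the gold tag.
    def accuracy(gold, pred):
        assert len(gold) == len(pred)
        correct = sum(1 for g, p in zip(gold, pred) if g == p)
        return correct / len(gold) if gold else 0.0

    # Per-tag precision and recall, from true/false positives and false negatives.
    def precision_recall(gold, pred, tag):
        tp = sum(1 for g, p in zip(gold, pred) if p == tag and g == tag)
        fp = sum(1 for g, p in zip(gold, pred) if p == tag and g != tag)
        fn = sum(1 for g, p in zip(gold, pred) if p != tag and g == tag)
        precision = tp / (tp + fp) if tp + fp else 0.0
        recall = tp / (tp + fn) if tp + fn else 0.0
        return precision, recall

Since a tagger assigns exactly one tag to every token, the number of predictions equals the number of gold labels, so micro-averaged precision and recall both collapse to accuracy; that is why a single number suffices for this task.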

  8. Tagging Scenarios • But tagging doesn’t have to result in exactly one tag per word • We may not want to resolve tag ambiguities at tagging time • E.g., save them for some other process • Or there may not be a single right answer for a given word/sequence (e.g., the gold standard contains multiple answers) • In either case, evaluation is not so cut and dried

  9. Evaluation • How can we evaluate the POS tagger? • Overall error rate w.r.t. gold-standard test set. • Error rates on particular tags • Error rates on particular words • Tag confusions...

  10. Error Analysis • Confusion matrix (contingency table) • Identify primary contributors to error rate • Noun (NN) vs. Proper Noun (NNP) vs. Adjective (JJ) • Preterite (VBD) vs. Participle (VBN) vs. Adjective (JJ)
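
One way to sketch the confusion matrix, assuming the same parallel gold/predicted lists as above:

    from collections import Counter

    # Count (gold, predicted) tag pairs; large off-diagonal cells flag confusions
    # such as NN vs. NNP or VBD vs. VBN.
    def confusion_matrix(gold, pred):
        return Counter(zip(gold, pred))

    # The most frequent confusions, i.e. the primary contributors to the error rate.
    def top_confusions(gold, pred, n=10):
        counts = confusion_matrix(gold, pred)
        errors = [(pair, c) for pair, c in counts.items() if pair[0] != pair[1]]
        return sorted(errors, key=lambda item: -item[1])[:n]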

  11. Evaluation • Result is compared to manually coded “Gold Standard” • Typically accuracy reaches 96-97% • Compare with result for a baseline tagger (no context).

  12. Evaluation • Result is compared to manually coded “Gold Standard” • Typically accuracy reaches 96-97% • Compare with result for a baseline tagger (no context). • E.g. Most common class • Assign the most frequent POS to each word

  13. Evaluation • Result is compared to manually coded “Gold Standard” • Typically accuracy reaches 96-97% • Compare with result for a baseline tagger (no context). • E.g. Most common class • Assign the most frequent POS to each word • Important: • 100% is impossible even for human annotators.
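
A sketch of the most-common-class baseline, assuming training data as a list of (word, tag) pairs; the fallback for unseen words is an assumption, since the slides do not specify one:

    from collections import Counter, defaultdict

    def train_baseline(tagged_tokens):
        per_word = defaultdict(Counter)
        for word, tag in tagged_tokens:
            per_word[word][tag] += 1
        # Unseen words fall back to the corpus-wide most frequent tag.
        overall = Counter(tag for _, tag in tagged_tokens)
        default = overall.most_common(1)[0][0]
        lexicon = {w: c.most_common(1)[0][0] for w, c in per_word.items()}
        # The resulting tagger ignores context entirely: one fixed tag per word.
        return lambda word: lexicon.get(word, default)

On English newswire this context-free baseline already reaches roughly 90% accuracy, which puts the 96-97% figure for real taggers in perspective.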

  14. Text Classification and Machine Learning

  15. Example: Sentiment detection • Input: Customer reviews • Output: Did the customer like the product? • Setup: • Get a bunch of labeled data • SOMEBODY HAS TO DO THIS BY HAND! • Want to predict the sentiment given new reviews • Features: • Information about the reviews • Words: “love”, “happy”, “warm”; “waste”, “joke”, “mistake”


  17. Example: Sentiment detection Don't waste your money. This company is a joke. They were and have been backordered for quite some time, but they just kept advertising on TV all through the holiday season. I agree with another reviewer that it's too easy to make a mistake when ordering on the company webite. I thought I was doing a mock order to see the final price before actually placing the order, but no. It placed the order immedia • Input: Customer reviews • Output: Did the customer like the product? • Setup: • Get a bunch of labeled data • SOMEBODY HAS TO DO THIS BY HAND! • Want to predict the sentiment given new reviews • Features: • Information about the reviews • Words: “love”, “happy”, “warm”; “waste”, “joke”, “mistake”

  18. Example: Sentiment detection Don't waste your money. This company is a joke. They were and have been backordered for quite some time, but they just kept advertising on TV all through the holiday season. I agree with another reviewer that it's too easy to make a mistake when ordering on the company webite. I thought I was doing a mock order to see the final price before actually placing the order, but no. It placed the order immedia • Input: Customer reviews • Output: Did the customer like the product? • Setup: • Get a bunch of labeled data • SOMEBODY HAS TO DO THIS BY HAND! • Want to predict the sentiment given new reviews • Features: • Information about the reviews • Words: “love”, “happy”, “warm”; “waste”, “joke”, “mistake” I love my snuggie! It is warm. I wear it with the opening to the back. The length is long but keeps your feet warm when you set down. If I need to walk around I put the opening to the front. It fits great either way and keeps me very warm. Try it before you judge. You will be very happy and warm. I have gave some to family and friends and the all loved them after they tryed them.


  20. Example: Sentiment Analysis • Task: • Given a review, predict if they liked it • Outcomes: • Positive / negative, or 1-5 scale • Split by aspect (liked the comfort, hated the price) • What information would help us predict this?
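
As a toy illustration of how word features could drive the prediction (the cue-word lists are invented for the example, seeded from the words on the slides):

    POSITIVE_CUES = {"love", "happy", "warm", "great"}
    NEGATIVE_CUES = {"waste", "joke", "mistake"}

    # Naive lexicon-based sentiment: compare counts of positive vs. negative cue words.
    def predict_sentiment(review):
        tokens = review.lower().split()
        score = sum(t in POSITIVE_CUES for t in tokens) - sum(t in NEGATIVE_CUES for t in tokens)
        return "positive" if score >= 0 else "negative"

A learned classifier replaces these hand-picked lists with weights estimated from the labeled data.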

  21. Formalizing the task • Task: • inputs, e.g. a review • outcomes, e.g. positive/negative or a 1-5 rating • Data set: input/output pairs • Split into training, dev, and test data • Experimentation cycle: • Learn a classifier on the training data • Tune free parameters on the dev data • Evaluate on the test data • Need to keep these separate!
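
A minimal sketch of the split, assuming the data set is a list of (input, outcome) pairs; the 80/10/10 proportions are a common convention, not something the slides prescribe:

    import random

    def split_data(pairs, seed=0):
        data = pairs[:]
        random.Random(seed).shuffle(data)  # shuffle a copy; the caller's list is untouched
        n = len(data)
        train = data[: int(0.8 * n)]
        dev = data[int(0.8 * n) : int(0.9 * n)]  # for tuning free parameters
        test = data[int(0.9 * n) :]              # touched only for the final evaluation
        return train, dev, test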

  22. Classification Examples • Spam filtering • Call routing • Text classification

  23. POS Tagging • Task: Given a sentence, predict tag of each word • Is this a classification problem?

  24. POS Tagging • Task: Given a sentence, predict tag of each word • Is this a classification problem? • Categories: N, V, Adj,… • What information is useful?

  25. POS Tagging • Task: Given a sentence, predict tag of each word • Is this a classification problem? • Categories: N, V, Adj,… • What information is useful? • How do POS tagging, text classification differ?

  26. POS Tagging • Task: Given a sentence, predict tag of each word • Is this a classification problem? • Categories: N, V, Adj,… • What information is useful? • How do POS tagging, text classification differ? • Sequence labeling problem
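
To cast tagging as per-token classification, each word becomes a bundle of features drawing on its neighbors; a sketch with illustrative feature names:

    # Features for the i-th word of a sentence: the word itself plus local context.
    def token_features(words, i):
        return {
            "word": words[i].lower(),
            "prev_word": words[i - 1].lower() if i > 0 else "<s>",
            "next_word": words[i + 1].lower() if i + 1 < len(words) else "</s>",
            "suffix3": words[i][-3:],
            "is_capitalized": words[i][0].isupper(),
        }

Unlike document classification, the best tag for one word depends on the tags chosen for its neighbors, which is exactly what makes this a sequence labeling problem.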

  27. Word Segmentation • Task: Given a string, break into words • Categories:

  28. Word Segmentation • Task: Given a string, break into words • Categories: • B(reak), NB (no break) • B(eginning), I(nside), E(nd) • e.g. c1 c2 || c3 c4 c5

  29. Word Segmentation • Task: Given a string, break into words • Categories: • B(reak), NB (no break) • B(eginning), I(nside), E(nd) • e.g. c1 c2 || c3 c4 c5 • c1/NB c2/B c3/NB c4/NB c5/B • c1/B c2/E c3/B c4/I c5/E • What type of task?

  30. Word Segmentation • Task: Given a string, break into words • Categories: • B(reak), NB (no break) • B(eginning), I(nside), E(nd) • e.g. c1 c2 || c3 c4 c5 • c1/NB c2/B c3/NB c4/NB c5/B • c1/B c2/E c3/B c4/I c5/E • What type of task? • Also sequence labeling
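
A sketch of deriving both labelings from already-segmented text, where each word is a string of characters; the handling of one-character words (labeled B in both schemes) is an assumption the slides leave open:

    def break_labels(words):
        # B = a word boundary follows this character, NB = no break.
        labels = []
        for w in words:
            labels += ["NB"] * (len(w) - 1) + ["B"]
        return labels

    def bie_labels(words):
        # B(eginning), I(nside), E(nd) of a word.
        labels = []
        for w in words:
            if len(w) == 1:
                labels.append("B")
            else:
                labels += ["B"] + ["I"] * (len(w) - 2) + ["E"]
        return labels

For a two-character word followed by a three-character word (the slide's c1 c2 || c3 c4 c5), these produce NB B NB NB B and B E B I E, matching the slide.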

  31. The Structure of a Classification Problem

  32. Classification Problem Steps • Input processing: • Split data into training/dev/test • Convert data into an Attribute-Value Matrix • Identify candidate features • Perform feature selection • Create AVM representation • Training • Testing • Evaluation
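
The Attribute-Value Matrix (AVM) is just instances × features; a sketch of building one from per-instance feature dicts, with a toy document-frequency cutoff standing in for feature selection (the cutoff is an assumption for illustration):

    from collections import Counter

    def build_avm(feature_dicts, min_count=2):
        # Feature selection: keep features occurring in at least min_count instances.
        doc_freq = Counter(f for d in feature_dicts for f in d)
        vocab = sorted(f for f, c in doc_freq.items() if c >= min_count)
        # One row per instance, one column per surviving feature; absent features are 0.
        matrix = [[d.get(f, 0) for f in vocab] for d in feature_dicts]
        return matrix, vocab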

  33. Classifiers in practice • [Diagram: training data → model learning → model]

  34. Classifiers in practice • [Diagram: training data → model learning → model; model + test data → model testing → predictions]

  35. Classifiers in practice • [Diagram: as above, with preprocessing applied to the incoming data and post-processing applied to the predictions]

  36. Classifiers in practice • [Diagram: as above, with an evaluation step consuming the post-processed predictions]
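
In practice the whole loop is a few lines with an off-the-shelf toolkit; a sketch using scikit-learn, one choice among many (the model and its settings are illustrative):

    from sklearn.feature_extraction.text import CountVectorizer
    from sklearn.linear_model import LogisticRegression
    from sklearn.metrics import accuracy_score

    def run_pipeline(train_texts, train_labels, test_texts, test_labels):
        vectorizer = CountVectorizer()               # preprocessing: text -> feature vectors
        X_train = vectorizer.fit_transform(train_texts)
        model = LogisticRegression(max_iter=1000)    # model learning
        model.fit(X_train, train_labels)
        predictions = model.predict(vectorizer.transform(test_texts))  # model testing
        return accuracy_score(test_labels, predictions)                # evaluation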

  37. Representing Input • Potentially infinite values to represent

  38. Representing Input • Potentially infinite values to represent • Represent input as feature vector • x=<v1,v2,v3,…,vn> • x=<f1=v1,f2=v2,…,fn=vn>

  39. Representing Input • Potentially infinite values to represent • Represent input as feature vector • x=<v1,v2,v3,…,vn> • x=<f1=v1,f2=v2,…,fn=vn> • What are good features?
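
A sketch of both vector styles for a short text, assuming a small fixed feature vocabulary (the words are illustrative):

    VOCAB = ["money", "free", "meeting", "deadline"]

    def to_vector(text):
        tokens = text.lower().split()
        # Positional style: x = <v1, v2, ..., vn>, one count per vocabulary word.
        values = [tokens.count(w) for w in VOCAB]
        # Named style: x = <f1=v1, ..., fn=vn>, convenient when most values are zero.
        named = {w: v for w, v in zip(VOCAB, values) if v > 0}
        return values, named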

  40. Example I • Spam Tagging • Classes: Spam/Not Spam • Input: • Email messages

  41. Doc1 Western Union Money Transfer office29@yahoo.com.ph One Bishops Square Akpakpa E1 6AO, Cotonou Benin Republic Website: http://www.westernunion.com/ info/selectCountry.asP Phone: +229 99388639 Attention Beneficiary, This to inform you that the federal ministry of finance Benin Republic has started releasing scam victim compensation fund mandated by United Nation Organization through our office. I am contacting you because our agent have sent you the first payment of $5,000 for your compensation funds total amount of $500 000 USD (Five hundred thousand united state dollar) We need your urgent response so that we shall release your payment information to you. You can call our office hot line for urgent attention (+22999388639)

  42. Doc2 • Hello! my dear. How are you today and your family? I hope all is good, kindly pay Attention and understand my aim of communicating you today through this Letter, My names is Saif al-Islam al-Gaddafi the Son of former Libyan President. i was born on 1972 in Tripoli Libya, By Gaddafi’s second wive. I want you to help me clear this fund in your name which i deposited in Europe please i would like this money to be transferred into your account before they find it. the amount is 20.300,000 million GBP British Pounds sterling through a

  43. Doc3 • from: web.25.5.office@att.net • Apply for loan at 3% interest Rate.. Contact us for details.

  44. Doc4 • from: acl@aclweb.org • REMINDER: If you have not received a PIN number to vote in the elections and have not already contacted us, please contact either Drago Radev (radev@umich.edu) or Priscilla Rasmussen (acl@aclweb.org) right away. Everyone who has not received a pin but who has contacted us already will get a new pin over the weekend. Anyone who still wants to join for 2011 needs to do this by Monday (November 7th) in order to be eligible to vote. And, if you do have your PIN number and have not voted yet, remember every vote counts!

  45. What are good features?

  46. Possible Features • Words!

  47. Possible Features • Words! • Feature for each word

  48. Possible Features • Words! • Feature for each word • Binary: presence/absence • Integer: occurrence count • Particular word types: money/sex/…, e.g. the pattern [Vv].*gr.*

  49. Possible Features • Words! • Feature for each word • Binary: presence/absence • Integer: occurrence count • Particular word types: money/sex/…, e.g. the pattern [Vv].*gr.* • Errors: • Spelling, grammar

  50. Possible Features • Words! • Feature for each word • Binary: presence/absence • Integer: occurrence count • Particular word types: money/sex/…, e.g. the pattern [Vv].*gr.* • Errors: • Spelling, grammar • Images
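
Pulling these feature ideas together for the spam task; a sketch in which the money regex is invented for illustration, while [Vv].*gr.* comes from the slide:

    import re

    def spam_features(doc):
        tokens = doc.lower().split()
        feats = {}
        for t in set(tokens):
            feats["has(%s)" % t] = True                # binary: presence/absence
            feats["count(%s)" % t] = tokens.count(t)   # integer: occurrence count
        # Particular word types, via the slide's pattern for disguised spellings.
        feats["viagra_like"] = any(re.match(r"[Vv].*gr.*", t) for t in doc.split())
        feats["mentions_money"] = bool(re.search(r"\$\d|money|fund", doc.lower()))
        return feats

Error features (spelling, grammar) and image features would need external tools, e.g. a spell checker's miss count, so they are omitted from the sketch.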
