
Presentation Transcript


  1. Method for Atypical Opinion Extraction from Answers in Open-ended Questions (IEEE International Conference on Computational Cybernetics, ICCC 2004) • Ayako Hiramatsu (Osaka Sangyo University), Shingo Tamura (Osaka University), Hiroaki Oiso (Codetoys K. K.), Norihisa Komoda (Osaka University)

  2. Abstract • Introduction • Open-ended questions vs. closed-ended questions • Atypical opinions vs. typical opinions • System • Aim • 3 Methods: ratio, distance, phrases • Experiment • Application experiment • Evaluation experiment • Conclusion

  3. Introduction (1/5) • Motivation: the mobile game market has been expanding rapidly → game providers need to attract more users and prolong the subscription period per user → subscribers answer a questionnaire when canceling their accounts • Two question types: closed-ended and open-ended • Two opinion types: typical and atypical

  4. Introduction (2/5) • Target game: a mobile quiz game • In Japanese • Running since 2002 • Offered on 3 carriers • Questions are answered by choosing the correct answer from 4 choices • When consumers unsubscribe, all of their information is lost, so the questionnaire (closed-ended & open-ended questions) is presented for them to answer at cancellation

  5. Introduction (3/5) • Closed-ended questions: • Users are asked to choose from a limited number of pre-selected answers. • Unable to acquire unexpected ideas • Example:

  6. Introduction (4/5) • Open-ended questions: • Consumers can freely write opinions • Often unpunctuated, ungrammatical, and abbreviated • Reveal dissatisfaction that cannot be captured by the closed-ended questions • Few useful answers: most answers repeat opinions already known from the closed-ended questions • Time-consuming to read all of the texts • Two types of opinions: typical & atypical

  7. Introduction (5/5) • Typical & atypical opinions in open-ended answers:

  8. System (1/10) • Aim: a system that efficiently extracts unexpectedly unique (atypical) ideas by culling useless opinions from the open-ended question data • Outline (diagram): keywords (nouns, adjectives, verbs) are extracted with the morphological analyzer ChaSen, keywords are combined into word combinations (e.g. "packet" + "fee" → "packet fee"), and the result is compared with the typical word database using the 3 comparison methods (next slides)
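
A minimal sketch of the outline above, not the authors' code: the real system extracts noun, adjective, and verb keywords with the Japanese morphological analyzer ChaSen and combines them into word combinations such as "packet" + "fee" → "packet fee". The toy tokenizer, the English example text, and the assumption that neighbouring keywords are paired are all illustrative stand-ins.

```python
# Sketch of the keyword-extraction outline on this slide. ChaSen (a
# Japanese morphological analyzer) is assumed to return part-of-speech
# tagged tokens; the toy tokenizer below stands in for it, and the
# combination step pairs neighbouring keywords ("packet" + "fee").

def tokenize(opinion: str):
    """Placeholder for ChaSen: tag each whitespace-separated token."""
    toy_dictionary = {
        "packet": "noun", "fee": "noun", "is": "verb",
        "too": "adv", "expensive": "adj",
    }
    return [(w, toy_dictionary.get(w, "other")) for w in opinion.split()]

def extract_keywords(opinion: str):
    """Keep only nouns, adjectives, and verbs as keywords."""
    return [w for w, pos in tokenize(opinion) if pos in {"noun", "adj", "verb"}]

def keyword_combinations(keywords):
    """Pair neighbouring keywords, e.g. 'packet' + 'fee' -> ('packet', 'fee')."""
    return list(zip(keywords, keywords[1:]))

kws = extract_keywords("packet fee is too expensive")
print(kws)                        # ['packet', 'fee', 'is', 'expensive']
print(keyword_combinations(kws))  # [('packet', 'fee'), ('fee', 'is'), ('is', 'expensive')]
```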

  9. System (2/10) • Typical word database:
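
The database figure on this slide is not reproduced in the transcript. As a hedged illustration, one plausible in-memory representation of the typical word database (which, per the application experiment, held about 8000 registered word combinations) is a set of keyword pairs; the structure and names below are assumptions, not the paper's.

```python
# Assumed representation of the typical word database: a set of keyword
# pairs ("typical elements"). frozenset makes the pair order-insensitive,
# so ("packet", "fee") and ("fee", "packet") match the same entry.

typical_word_db = {
    frozenset({"packet", "fee"}),
    frozenset({"fee", "expensive"}),
    frozenset({"quiz", "difficult"}),
}

def is_typical_element(word_a: str, word_b: str) -> bool:
    """True if this keyword combination is registered as typical."""
    return frozenset({word_a, word_b}) in typical_word_db

print(is_typical_element("fee", "packet"))   # True
print(is_typical_element("quiz", "crash"))   # False
```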

  10. System (3/10) • To extract atypical opinions, compare the keywords of each opinion with the typical word database • 3 methods: • Based on the ratio of typical word combinations in the sentence • Considering the word order and the distance between the positions of the words • Dividing the opinion into phrases at each typical word combination

  11. System (4/10) • Method 1: ratio • Remove opinions that contain no keywords or no noun keyword • Compare the keywords with the typical elements (the combinations in the typical word database)
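
A sketch of Method 1. The exact form of Formula 1 did not survive into this transcript, so the typicality test below, a simple ratio of typical combinations among the opinion's keyword combinations compared against a threshold α, is only a stand-in for it; the function names and the α = 0.5 default are assumptions.

```python
# Sketch of Method 1 (ratio). A plain ratio-vs-threshold test stands in
# for the paper's Formula 1, which is not recoverable from this transcript:
# an opinion whose keyword combinations are mostly registered as typical
# is treated as typical (and therefore not extracted).

def method1_is_typical(keywords, typical_word_db, alpha=0.5):
    """Stand-in for Formula 1: share of typical combinations >= alpha."""
    if len(keywords) < 2:
        return True  # too few keywords: culled as useless, not extracted
    pairs = [frozenset(p) for p in zip(keywords, keywords[1:])]
    typical_hits = sum(p in typical_word_db for p in pairs)
    return typical_hits / len(pairs) >= alpha

db = {frozenset({"packet", "fee"}), frozenset({"fee", "expensive"})}
print(method1_is_typical(["packet", "fee", "expensive"], db))      # True  (2/2 typical)
print(method1_is_typical(["quiz", "freezes", "mid", "game"], db))  # False (0/3 typical)
```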

  12. System (5/10) • Example (Formula 1, α = 2): 2 + 2×1 ≧ 4 and 2 + 2×1 ≦ 6 → judged typical

  13. System (6/10) • Problems: • Misrecognition of typical elements → Method 2 • Long sentences → Method 3 • (Figure example: 2 + 2×1 ≧ 4)

  14. System (7/10) • Method 2: distance • Keyword distance d: the difference between the positions of two keywords • Modify the typical elements: count a combination only when the keyword distance is short, i.e. the 2 keywords appear near each other (d = 2) • Apply Formula 1
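
A sketch of the distance idea on this slide: a registered combination is trusted only when its two keywords actually appear close together in the opinion. The helper below counts such near typical elements; the maximum distance of 2 follows the slide's d = 2, while everything else (names, example words) is illustrative.

```python
# Sketch of Method 2 (distance): a typical element is counted only when
# its two keywords are at most max_distance positions apart, so keyword
# pairs that merely co-occur far apart in the opinion are discarded.

def near_typical_hits(keywords, typical_word_db, max_distance=2):
    """Count typical combinations whose keywords lie close together."""
    hits = 0
    for i in range(len(keywords)):
        for j in range(i + 1, min(i + max_distance + 1, len(keywords))):
            if frozenset({keywords[i], keywords[j]}) in typical_word_db:
                hits += 1
    return hits

db = {frozenset({"packet", "fee"})}
print(near_typical_hits(["packet", "fee", "expensive"], db))                 # 1
print(near_typical_hits(["packet", "limit", "data", "monthly", "fee"], db))  # 0
```

Formula 1 would then be applied as before, but counting only these near hits as typical elements.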

  15. System (8/10) • Example: 2 + 2×1 ≧ 4 (before modifying the typical elements) vs. 0 + 2×0 ≦ 4 (after)

  16. System (9/10) • Method 3: phrases • Long sentences contain only a few atypical elements, which should NOT be omitted → divide the sentence into phrases at delimiters • Delimiters: • Punctuation marks and pictographs (X: not usable, answers are often unpunctuated) • Typical elements (O: used as delimiters) • Apply Formula 1 to each phrase
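
A sketch of the phrase idea: since the answers are often unpunctuated, typical elements themselves serve as delimiters and the opinion is cut into phrases, each of which is tested separately. Exactly where the cut falls is not stated in the transcript; the version below closes a phrase right after each adjacent typical combination, which is an assumption.

```python
# Sketch of Method 3 (phrases): cut the keyword sequence at typical
# elements (used as delimiters) so that atypical content in a long
# sentence is not drowned out when the whole opinion is scored at once.

def split_at_typical_elements(keywords, typical_word_db):
    """Cut the keyword list right after every adjacent typical combination."""
    phrases, current = [], []
    for i, word in enumerate(keywords):
        current.append(word)
        ends_typical_pair = (
            i > 0 and frozenset({keywords[i - 1], word}) in typical_word_db
        )
        if ends_typical_pair:  # a typical element (the delimiter) just ended
            phrases.append(current)
            current = []
    if current:
        phrases.append(current)
    return phrases

db = {frozenset({"packet", "fee"})}
print(split_at_typical_elements(
    ["packet", "fee", "quiz", "freezes", "mid", "game"], db))
# [['packet', 'fee'], ['quiz', 'freezes', 'mid', 'game']]
```

Each phrase would then go through Formula 1 (or the stand-in test from the Method 1 sketch); the opinion is extracted as atypical if any of its phrases fails it.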

  17. System (10/10) • Example:

  18. Application Experiment (1/2) • Compare the three proposed methods • Questionnaire data collected over 7 months from users who unsubscribed from a certain carrier • The content provider classified 3263 opinions = 2993 typical & 270 atypical opinions • About 8000 kinds of word combinations were registered in the typical word database

  19. Application Experiment (2/2) • Result: (table comparing the ratio, distance, and phrase methods against the answer set of 2993 typical and 270 atypical opinions)

  20. Evaluation Experiment (1/2) • Examine the best method: method 3 • Questionnaire data of users who unsubscribed from other carriers • Content providers classified 1764 opinions = 1589 typical & 175 atypical opinions • The typical word database is the same as in the application experiment.

  21. Evaluation Experiment (2/2) • Result: (ANS: 1589 typical, 175 atypical opinions) • Opinions with short sentences having only 3 or 4 keywords → low recall • Lowering the threshold to α = 1 → a huge number of opinions extracted as atypical → low precision • A recall/precision tradeoff; the result is less satisfactory than in the application experiment
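
For reference, the recall/precision tradeoff discussed on this slide uses the standard definitions; the sketch below computes both from raw counts. The demo counts are placeholders, not the paper's results (the actual result table is not preserved in this transcript).

```python
# Standard definitions of the two measures traded off on this slide,
# phrased for atypical-opinion extraction. The demo counts are
# placeholders, NOT numbers from the paper.

def precision_recall(tp: int, fp: int, fn: int):
    """precision = TP / (TP + FP), recall = TP / (TP + FN)."""
    return tp / (tp + fp), tp / (tp + fn)

# tp: atypical opinions correctly extracted, fp: typical opinions wrongly
# extracted, fn: atypical opinions missed (placeholder values).
print(precision_recall(tp=40, fp=60, fn=10))  # (0.4, 0.8)
```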

  22. Conclusion • Described a support system for extracting atypical opinions from answers to open-ended questions collected from mobile game consumers when they unsubscribe • Proposed three extraction methods: ratio, distance, and phrases • Differences between carriers also affect the extraction accuracy

  23. Q & A

  24. Delimiter insertion
