Automating & Evaluating Metadata Generation

Presentation Transcript


  1. Automating & Evaluating Metadata Generation Elizabeth D. Liddy Center for Natural Language Processing School of Information Studies Syracuse University

  2. Outline • Semantic Web • Metadata • 3 Metadata R & D Projects

  3. Semantic Web • Links digital information in such a way that the information is easily processable by computers globally • Enables publishing data in a re-purposable form • Built on a syntax that uses URIs and RDF to represent and exchange data on the web • Maps directly & unambiguously to a model • Generic parsers are available • However, the requisite processing is still largely manual
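
As a concrete illustration, here is a minimal sketch of publishing one such statement as RDF with the Python rdflib library (the library choice and the example URI are my own, not from the talk):

```python
from rdflib import Graph, Literal, URIRef
from rdflib.namespace import DC

# Hypothetical URI for a lesson-plan resource; any dereferenceable URI works.
lesson = URIRef("http://example.org/lessons/stream-channel-erosion")

g = Graph()
g.bind("dc", DC)
# Each RDF statement is a (subject, predicate, object) triple.
g.add((lesson, DC.title, Literal("Stream Channel Erosion Activity")))
g.add((lesson, DC.creator, Literal("PBS Online")))

# Serialize in Turtle, a common RDF syntax any generic parser can read.
print(g.serialize(format="turtle"))
```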

  4. Metadata • Structured data about resources • Supports a wide range of operations: • Management of information resources • Resource discovery • Enables communication and co-operation amongst: • Software developers • Publishers • Recording & television industry • Digital libraries • Providers of geographical & satellite-based information • Peer-to-peer community

  5. Metadata (cont’d) • Value-added information which enables information objects to be: • Identified • Represented • Managed • Accessed • Standards within industries enable interoperability between repositories & users • However, it is still largely produced manually

  6. Educational Metadata Schema Elements • GEM Metadata Elements • Audience • Cataloging • Duration • Essential Resources • Pedagogy • Grade • Standards • Quality • Dublin Core Metadata Elements • Contributor • Coverage • Creator • Date • Description • Format • Identifier • Language • Publisher • Relation • Rights • Source • Subject • Title • Type

  8. Semantic Web → Metadata? • But both… • Seek the same goals • Use standards & crosswalks between schemas • Look for comprehensive, well-understood, well-used sets of terms for describing the content of information resources • Enable mutual sharing, accessing, and reuse of information resources

  9. NSDL MetaData Projects • Breaking the MetaData Generation Bottleneck • CNLP • University of Washington • StandardConnection • University of Washington • CNLP • MetaTest • CNLP • Center for Human Computer Interaction – Cornell University

  10. Breaking the MetaData Generation Bottleneck • Goal: Demonstrate feasibility of automatically generating high-quality metadata for digital libraries through Natural Language Processing • Data: Full-text resources from clearinghouses which provide teaching resources to teachers, students, administrators and parents • Metadata Schema: Dublin Core + Gateway for Educational Materials (GEM) Schema

  11. Method: Information Extraction • Natural Language Processing • Technology which enables a system to accomplish human-like understanding of document contents • Extracts both explicit and implicit meaning • Sublanguage Analysis • Utilizes domain and genre-specific regularities vs. full-fledged linguistic analysis • Discourse Model Development • Extractions specialized for communication goals of document type and activities under discussion

  12. Information Extraction • Types of Features recognized & utilized: • Non-linguistic • Length of document • HTML and XML tags • Linguistic • Root forms of words • Part-of-speech tags • Phrases (Noun, Verb, Proper Noun, Numeric Concept) • Categories (Proper Name & Numeric Concept) • Concepts (sense disambiguated words / phrases) • Semantic Relations • Discourse Level Components

  13. Sample Lesson Plan
Stream Channel Erosion Activity
Student/Teacher Background: Rivers and streams form the channels in which they flow. A river channel is formed by the quantity of water and debris that is carried by the water in it. The water carves and maintains the conduit containing it. Thus, the channel is self-adjusting. If the volume of water, or amount of debris, is changed, the channel adjusts to the new set of conditions. …
Student Objectives: The student will discuss stream sedimentation that occurred in the Grand Canyon as a result of the controlled release from Glen Canyon Dam. …

  14. NLP Processing of Lesson Plan
Input: The student will discuss stream sedimentation that occurred in the Grand Canyon as a result of the controlled release from Glen Canyon Dam.
Morphological Analysis: The student will discuss stream sedimentation that occurred in the Grand Canyon as a result of the controlled release from Glen Canyon Dam.
Lexical Analysis: The|DT student|NN will|MD discuss|VB stream|NN sedimentation|NN that|WDT occurred|VBD in|IN the|DT Grand|NP Canyon|NP as|IN a|DT result|NN of|IN the|DT controlled|JJ release|NN from|IN Glen|NP Canyon|NP Dam|NP .|.
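
The lexical analysis step can be approximated with an off-the-shelf tagger. A sketch using NLTK (an assumption on my part; CNLP used its own tagger, and modern Penn Treebank taggers emit NNP rather than NP for proper nouns):

```python
import nltk

# One-time downloads of the tokenizer and tagger models.
nltk.download("punkt", quiet=True)
nltk.download("averaged_perceptron_tagger", quiet=True)

sentence = ("The student will discuss stream sedimentation that occurred in "
            "the Grand Canyon as a result of the controlled release from "
            "Glen Canyon Dam.")

tokens = nltk.word_tokenize(sentence)
tagged = nltk.pos_tag(tokens)  # [('The', 'DT'), ('student', 'NN'), ...]
print(" ".join(f"{word}|{tag}" for word, tag in tagged))
```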

  15. NLP Processing of Lesson Plan (cont’d)
Syntactic Analysis - Phrase Identification: The|DT student|NN will|MD discuss|VB <CN> stream|NN sedimentation|NN </CN> that|WDT occurred|VBD in|IN the|DT <PN> Grand|NP Canyon|NP </PN> as|IN a|DT result|NN of|IN the|DT <CN> controlled|JJ release|NN </CN> from|IN <PN> Glen|NP Canyon|NP Dam|NP </PN> .|.
Semantic Analysis Phase 1 - Proper Name Interpretation: The|DT student|NN will|MD discuss|VB <CN> stream|NN sedimentation|NN </CN> that|WDT occurred|VBD in|IN the|DT <PN cat=geography/location> Grand|NP Canyon|NP </PN> as|IN a|DT result|NN of|IN the|DT <CN> controlled|JJ release|NN </CN> from|IN <PN cat=geography/structure> Glen|NP Canyon|NP Dam|NP </PN> .|.
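
Phrase identification and proper name categorization can be approximated in the same way; a sketch using NLTK's regular-expression chunker and named-entity chunker (the chunk grammar and NLTK's entity labels are illustrative, not CNLP's actual rules):

```python
import nltk

for pkg in ("punkt", "averaged_perceptron_tagger", "maxent_ne_chunker", "words"):
    nltk.download(pkg, quiet=True)  # model data used below

tagged = nltk.pos_tag(nltk.word_tokenize(
    "The controlled release from Glen Canyon Dam occurred in the Grand Canyon."))

# Bracket common-noun phrases, roughly the <CN>...</CN> spans above
# (an illustrative grammar, not CNLP's actual phrase rules).
chunker = nltk.RegexpParser("CN: {<DT>?<JJ>*<NN>+}")
print(chunker.parse(tagged))

# Categorize proper names; NLTK emits labels such as GPE or FACILITY
# rather than the geography/location categories shown on the slide.
print(nltk.ne_chunk(tagged))
```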

  16. NLP Processing of Lesson Plan (cont’d)
Semantic Analysis Phase 2 - Event & Role Extraction
Teaching event: discuss | actor: student | topic: stream sedimentation
Event: stream sedimentation | location: Grand Canyon | cause: controlled release
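
Full event and role extraction relies on CNLP's discourse model; as a toy stand-in only, here is a pattern over tagged tokens that recovers the actor, event, and topic of a simple "X will VERB Y" clause:

```python
TAGGED = "The|DT student|NN will|MD discuss|VB stream|NN sedimentation|NN"

def extract_teaching_event(tagged: str) -> dict:
    """Toy actor/event/topic extraction from 'actor will VERB topic' patterns."""
    pairs = [token.rsplit("|", 1) for token in tagged.split()]
    words = [w for w, _ in pairs]
    tags = [t for _, t in pairs]
    event = {}
    for i, tag in enumerate(tags):
        # Modal followed by a base verb signals the clause's main event.
        if tag == "MD" and i + 1 < len(tags) and tags[i + 1] == "VB":
            event["actor"] = words[i - 1]             # noun before the modal
            event["event"] = words[i + 1]             # main verb
            event["topic"] = " ".join(words[i + 2:])  # rest of the clause
            break
    return event

print(extract_teaching_event(TAGGED))
# {'actor': 'student', 'event': 'discuss', 'topic': 'stream sedimentation'}
```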

  17. MetaExtract Architecture (diagram): the components include a Gathering Program, HTML Converter, PreProcessor, eQuery Extraction Module (with configuration and potential keyword data), TF/IDF keyword extraction, a Metadata Retrieval Module, and a Cataloger; an input HTML document is processed to produce the metadata elements (Title, Description, Date, Rights, Publisher, Format, Language, Resource Type, Essential Resources, Relation, Creator, Grade/Level, Duration, Pedagogy, Audience, Standard, Catalog Date) and emerges as an HTML document with metadata attached.

  18. Automatically Generated Metadata
Title: Grand Canyon: Flood! - Stream Channel Erosion Activity
Grade Levels: 6, 7, 8
GEM Subjects: Science--Geology; Mathematics--Geometry; Mathematics--Measurement
Keywords:
  Named Entities: Colorado River (river), Grand Canyon (geography / location), Glen Canyon Dam (geography / structures)
  Subject Keywords: channels, conduit, controlled_release, dam, flow_volume, hold, reservoir, rivers, sediment, streams
  Material Keywords: clayboard, cookie_sheet, cup, paper_towel, pencil, roasting_pan, sand, water

  19. Automatically Generated Metadata (cont’d)
Pedagogy: Collaborative learning; Hands on learning
Tool For: Teachers
Resource Type: Lesson Plan
Format: text/HTML
Placed Online: 1998-09-02
Name: PBS Online
Role: onlineProvider
Homepage: http://www.pbs.org
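
Put together, the generated record might be embedded back into the resource page as meta tags; a sketch (the element names and output layout are my own illustration, not MetaExtract's actual format):

```python
# A sketch of the generated record as a simple element -> value(s) mapping.
record = {
    "dc.title": "Grand Canyon: Flood! - Stream Channel Erosion Activity",
    "dc.format": "text/HTML",
    "dc.type": "Lesson Plan",
    "dc.date": "1998-09-02",
    "dc.creator": "PBS Online",
    "gem.grade": ["6", "7", "8"],
    "gem.subjects": ["Science--Geology", "Mathematics--Geometry",
                     "Mathematics--Measurement"],
    "gem.pedagogy": ["Collaborative learning", "Hands on learning"],
    "gem.audience": "Teachers",
}

# Emit <meta> tags that could be embedded back into the source HTML page.
for name, value in record.items():
    for v in (value if isinstance(value, list) else [value]):
        print(f'<meta name="{name}" content="{v}">')
```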

  20. Metadata Evaluation Experiment • Blind test of automatic vs. manually generated metadata • Subjects: • Teachers • Education Students • Professors of Education • Web-based experiment • Subjects provided with educational resources and metadata records • 2 conditions tested

  22. Metadata Evaluation Experiment • Blind Test of Automatic vs. Manual Metadata • Expectation Condition – Subjects reviewed: 1st - metadata record, 2nd - lesson plan, and then judged whether the metadata provided an accurate preview of the lesson plan on a 1 to 5 scale • Satisfaction Condition – Subjects reviewed: 1st - lesson plan, 2nd - metadata record, and then judged the accuracy and coverage of the metadata on a 1 to 5 scale, with 5 being high

  25. Qualitative Experimental Results
                                     Expectation   Satisfaction   Combined
  # Manual Metadata Records                  153            571        724
  # Automatic Metadata Records               139            532        671
  Manual Metadata Average Score             4.03           3.81       3.85
  Automatic Metadata Average Score          3.76           3.55       3.59
  Difference                                0.27           0.26       0.26

  26. MetaData Research Projects • Breaking the MetaData Generation Bottleneck • StandardConnection • MetaTest

  27. StandardConnection • Goal: Determine the feasibility & quality of automatically mapping teaching standards to learning resources • Example standard: “Solve linear equations and inequalities algebraically and non-linear equations using graphing, symbol-manipulating or spreadsheet technology.” • Data: Educational Resources: Lesson Plans, Activities, Assessment Units, etc. • Teaching Standards: Achieve/McREL Compendix

  28. Cross-mapping through the Compendix Meta-language (diagram): the lesson “Simultaneous Equations Using Elimination” (URI: M8.4.11ABCJ) is mapped through the Compendix to the standards of Washington, California, New York, Florida, Arkansas, Alaska, Michigan, and Texas.

  29. StandardConnection Components (diagram): educational resources (lesson plans, activities, assessment units, etc.) are linked via the Compendix (e.g., Mathematics 6.2.1 C: “Adds, subtracts, multiplies, & divides whole numbers and decimals”) to state standards.

  30. Lesson Plan: “Simultaneous Equations Using Elimination”
Submitted by: Leslie Howe
Email: teachhowe2@hotmail.com
School/University/Affiliation: Farragut High School, Knoxville, TN
Grade Level: 9, 10, 11, 12, Higher education, Vocational education, Adult/Continuing education
Subject(s): Mathematics / Algebra
Duration: 30 minutes
Description: The Elimination method is an effective method for solving a system of two unknowns. This lesson provides students with immediate feedback using a computer program or online applet.
Goals: The student will be able to solve a system of two equations when there are two unknowns.
Materials: Online computer applet / program http://www.usit.com/howe2/eqations/index.htm (a similar downloadable C++ application is available at the same site)
Procedure: A system of two unknowns can be solved by multiplying each equation by the constant that will make the coefficient of one of the variables become the LCM (least common multiple) of the initial coefficients. Students may use the scroll bars on the indicated applet to multiply the equations by constants until the GCF is located. When the "add" button is activated after the correct constants are chosen, one of the variables will be eliminated. The process can be repeated for the second variable. The student may enter the solution of the system by using scroll bars. When the "check" button is pressed the answer is evaluated and the student is given immediate feedback. (The same procedure can be done using the downloadable C++ application.) After 5-10 correct responses the student should make the transition to paper and solve the equations without using the applet. The student can still use the applet to check the answer. The applet will generate problems in a random fashion. All solutions are integers.
Assessment: The lesson itself provides alternative assessment. The correct responses are recorded.

  31. Lesson Plan: “Simultaneous Equations Using Elimination” (as on slide 30, now with the automatically assigned standard added)
Standard: McREL 8.4.11 Uses a variety of methods (e.g., with graphs, algebraic methods, and matrices) to solve systems of equations and inequalities

  32. Automatic Assigning of Standards as a Retrieval Process (diagram): an index of terms is built from the Standards.

  33. Document Collection = Compendix Standards (diagram): each standard is processed and indexed; the index of standards is assembled from the subject heading, secondary subject, actual standard text, and vocabulary.

  35. Automatic Assigning of Standards as a Retrieval Process (diagram, cont’d): the lesson plan serves as the query against the index of terms from the Standards.

  36. Query = NLP-Processed Lesson Plan
A new lesson plan is turned into a query in three steps:
Filtering: sections are eliminated or given greater weight (e.g., citations are removed), keeping the relevant parts of the lesson plan.
Natural Language Processing: part-of-speech tagging and bracketing of phrases & proper names, e.g. Simultaneous|JJ Equations|NNS Using|VBG Elimination|NN.
TF/IDF: relative frequency weights of words, phrases, proper names, etc.
Query = top 30 weighted terms, e.g.: equation, eliminate, solve.
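
A sketch of this query-building step, with scikit-learn's TfidfVectorizer standing in for CNLP's weighting and a toy three-document corpus:

```python
import numpy as np
from sklearn.feature_extraction.text import TfidfVectorizer

# Stand-in corpus used to estimate document frequencies.
corpus = [
    "solve linear equations and inequalities algebraically",
    "adds subtracts multiplies and divides whole numbers and decimals",
    "simultaneous equations using elimination solve a system of two unknowns",
]

vectorizer = TfidfVectorizer(stop_words="english")
tfidf = vectorizer.fit_transform(corpus)

# Treat the last document as the new lesson plan and keep its top-weighted
# terms (top 30 in the project; top 5 here for the toy corpus).
lesson_vec = tfidf[2].toarray().ravel()
terms = vectorizer.get_feature_names_out()
print(terms[np.argsort(lesson_vec)[::-1][:5]])
```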

  37. Automatic Assigning of Standards as a Retrieval Process (diagram, cont’d): the lesson-plan query is matched against the index of terms from the Standards, and the best-matching standard is assigned to the lesson plan.

  38. Teaching Standard Assignment as Retrieval Task Experiment • Exploratory test run • 3,326 standards (documents) • TF/IDF term weighting scheme • 2,239 lesson plans (queries) • top 30 weighted terms from each as a query vector • Manual evaluation • Focusing on understanding of issues & solutions
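
The experiment treats standard assignment as ranked retrieval; a sketch of that ranking with TF/IDF vectors and cosine similarity (data and weighting are stand-ins, not CNLP's actual setup):

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import linear_kernel

standards = [  # stand-ins for the 3,326 Compendix standards
    "uses a variety of methods graphs algebraic methods matrices to solve "
    "systems of equations and inequalities",
    "adds subtracts multiplies and divides whole numbers and decimals",
]
query = "simultaneous equations using elimination solve system two unknowns"

vectorizer = TfidfVectorizer(stop_words="english")
index = vectorizer.fit_transform(standards)   # standards = documents
query_vec = vectorizer.transform([query])     # lesson plan = query

# TF/IDF vectors are L2-normalized by default, so the linear kernel
# (dot product) equals cosine similarity.
scores = linear_kernel(query_vec, index).ravel()
for rank, i in enumerate(scores.argsort()[::-1], 1):
    print(rank, round(float(scores[i]), 3), standards[i][:50])
```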

  39. Information Retrieval Experiments • Baseline Results • 68 queries (lesson plans) evaluated • 24 (35%) queries - appropriate standard was ranked first • 28 (41%) queries - predominant standard was in top 5 • Room for improvement, but promising

  40. Future Research • Improve current retrieval performance • Matching algorithm, document expansion, etc. • Apply a classification approach to the StandardConnection project • Compare the information retrieval approach and the classification approach • Improve browsing access for teachers & administrators

  41. Browsing Access to Learning Resources (diagram): a browsable map of standards (e.g., strand maps) is connected, through automatic assignment of standards to lesson plans, to lesson plans with standards attached. Example: Standard 8.4.11 “Uses a variety of methods (e.g., with graphs, algebraic methods, and matrices) to solve systems of equations and inequalities” is linked to Standard 8.3.6 “Solves simple inequalities and non-linear equations with rational number solutions, using concrete and informal methods” and Standard 8.4.12 “Understands formal notation (e.g., sigma notation, factorial representation) and various applications (e.g., compound interest) of sequences and series.”

  42. MetaData Research Projects • Breaking the MetaData Generation Bottleneck • StandardConnection • MetaTest

  43. Life-Cycle Evaluation of Metadata
1. Initial generation: methods (manual, automatic); costs (time, human resources, technology)
2. Accessing DL resources: users’ interactions (browsing, searching); relative contribution of each metadata element
3. Search effectiveness: precision, recall

  44. (diagram) Goal: Measure the Quality & Usefulness of Metadata. Metadata generation (methods: manual, semi-automatic, automatic; costs: time, human resources, technology) feeds metadata evaluation (user understanding: browsing, searching; system: precision, recall).

  45. Evaluation Methodology • Automatically metatag a Digital Library collection that has already been manually metatagged • Solicit a range of appropriate Digital Library users • For each metadata element: • 1. Users qualitatively evaluate it in light of the digital resource • 2. Conduct a standard IR experiment • 3. Observe subjects while searching & browsing • Monitor with eye-tracking & think-aloud protocols

  46. Information Retrieval Experiment • Users ask queries of the system • System retrieves documents using either: • Manually assigned metadata • Automatically generated metadata • System ranks documents by its estimation of relevance • Users review retrieved documents & judge relevance • Compute precision & recall • Compare results according to: • Method of assignment • The metadata element which enabled retrieval
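
For reference, a minimal sketch of the two effectiveness measures, computed from a retrieved list and a set of relevance judgments:

```python
def precision_recall(retrieved: list[str], relevant: set[str]) -> tuple[float, float]:
    """Precision = relevant retrieved / retrieved; recall = relevant retrieved / relevant."""
    hits = sum(1 for doc in retrieved if doc in relevant)
    precision = hits / len(retrieved) if retrieved else 0.0
    recall = hits / len(relevant) if relevant else 0.0
    return precision, recall

# Toy judgment set: 3 of 4 retrieved documents are relevant, 1 relevant missed.
p, r = precision_recall(["d1", "d2", "d3", "d4"], {"d1", "d2", "d3", "d5"})
print(f"precision={p:.2f} recall={r:.2f}")  # precision=0.75 recall=0.75
```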

  47. User Studies: Methods & Questions • 1. Observations of Users Seeking DL Resources • How do users search & browse the digital library? • Do search attempts utilize the available metadata? • Which metadata elements are most important to users? • Which are used consistently for the best results?

  48. User Studies: Methods & Questions (cont’d) • 2. Eye-tracking with Think-aloud Protocols • Which metadata elements do users spend most time viewing? • What are users thinking about when seeking digital library resources? • Show correlation between what users are looking at and thinking. • Use eye-tracking to measure the number & duration of fixations, scan paths, dilation, etc. • 3. Individual Subject Data • How does expertise / role influence seeking resources from digital libraries?

  49. Sample Lesson Plans

  50. Eye Scan Path For Bug Club Document
