
IFLA FRBR & MIC Metadata Evaluation






Presentation Transcript


  1. IFLA FRBR & MIC Metadata Evaluation http://www.scils.rutgers.edu/~miceval Ying Zhang Yuelin Li October 14, 2003

  2. MIC Metadata Evaluation • Framework – the IFLA FRBR (Functional Requirements for Bibliographic Records) • MIC evaluation cases • Experiences and lessons

  3. The IFLA FRBR – MICEval Framework • Find – Can a user enter a search and retrieve records relevant to that search? • Identify – Once the user retrieves a record, can he/she successfully interpret the information in the record to know whether the source information will be relevant to his/her needs? • Select – Can the user compare the information in multiple records and determine the most relevant record? • Obtain – Can the user successfully obtain the original artifacts, based on the information provided in the record?
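To make the four tasks concrete, here is a minimal Python sketch of how a user session might exercise them against directory records. The record fields (title, subject, location, access_url), the sample data, and the matching logic are hypothetical illustrations, not the MIC directory schema or its search implementation.

```python
from dataclasses import dataclass

# Hypothetical directory record; field names are illustrative only.
@dataclass
class DirectoryRecord:
    title: str
    subject: str
    location: str
    access_url: str

records = [
    DirectoryRecord("Ocean Film Archive", "oceanography", "CA", "http://example.org/ofa"),
    DirectoryRecord("Marine Life Footage", "marine biology", "NJ", "http://example.org/mlf"),
]

# FIND: enter a search and retrieve records relevant to it.
def find(query: str) -> list:
    return [r for r in records if query.lower() in (r.title + " " + r.subject).lower()]

# IDENTIFY: interpret one record's fields to judge whether it meets the need.
def identify(record: DirectoryRecord, need: str) -> bool:
    return need.lower() in record.subject.lower()

# SELECT: compare several records and pick the most appropriate one.
def select(candidates: list, preferred_location: str) -> DirectoryRecord:
    return min(candidates, key=lambda r: r.location != preferred_location)

# OBTAIN: use the record's access information to reach the original artifact.
def obtain(record: DirectoryRecord) -> str:
    return record.access_url

hits = find("marine")                                       # FIND
hits = [r for r in hits if identify(r, "marine biology")]   # IDENTIFY
if hits:
    best = select(hits, "NJ")                               # SELECT
    print(obtain(best))                                     # OBTAIN
```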

  4. Directory Schema Evaluation – Questions/Methods • Usefulness assessment: How useful is each directory element in terms of helping target users to find, identify, select, and obtain source information? • Criterion: perceived usefulness • Methodology: online survey • Embed the FRBR framework • Provide situational information • Sampling frame (science educators, archivists)
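As an illustration of how the usefulness assessment could be scored, the sketch below averages survey ratings per directory element and FRBR task. The element names and the 1–5 rating scale are assumptions, not the actual survey instrument.

```python
from collections import defaultdict

# Hypothetical survey responses: (directory element, FRBR task, 1-5 rating).
responses = [
    ("Organization Name", "find", 5),
    ("Organization Name", "identify", 4),
    ("Subject Coverage", "find", 4),
    ("Subject Coverage", "select", 3),
]

ratings_by_cell = defaultdict(list)
for element, task, rating in responses:
    ratings_by_cell[(element, task)].append(rating)

# Mean perceived usefulness of each element for each FRBR task.
for (element, task), ratings in sorted(ratings_by_cell.items()):
    print(f"{element} / {task}: {sum(ratings) / len(ratings):.2f}")
```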

  5. Directory Schema Evaluation – Sample section head • 3. SELECT – Confirm that the record describes the organization most appropriate to the user's needs, based on conformance to important criteria for comparing one organization to others retrieved in a search. • USE: These fields display in the short listing when multiple records result from a search, enabling a user to quickly select the most useful records among those retrieved. • [Example: prototype screen for illustration]
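A minimal sketch of the short-listing behavior described above: when a search returns multiple records, only the fields designated for SELECT are displayed so users can compare records at a glance. The field names and record data are hypothetical, not the prototype's actual configuration.

```python
# Fields designated to display in the short listing (hypothetical names).
SELECT_FIELDS = ["Organization Name", "Location", "Subject Coverage"]

results = [
    {"Organization Name": "Ocean Film Archive", "Location": "CA",
     "Subject Coverage": "oceanography", "Description": "..."},
    {"Organization Name": "Marine Life Footage", "Location": "NJ",
     "Subject Coverage": "marine biology", "Description": "..."},
]

def short_listing(records):
    # Show only the comparison fields, one row per retrieved record,
    # so the user can quickly select among multiple results.
    for r in records:
        print(" | ".join(r[f] for f in SELECT_FIELDS))

if len(results) > 1:   # short listing applies when a search returns several records
    short_listing(results)
```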

  6. Directory Schema Evaluation – Results/Applications • Determine useful directory elements for the user community • Identify potential elements that are missing from the current schema • Improve the search and result display interfaces

  7. Metadata schema evaluation – Proposal • Usability test: How usable is the MIC metadata schema in terms of helping target users to find, identify, select, and obtain source information? • Measures – information adequacy, information accuracy, ease of understanding, helpfulness, physical item accessibility, precision, error rate, satisfaction
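Two of the quantitative measures, precision and error rate, can be illustrated with a small computation over a hypothetical test session. The session data and scoring below are a sketch, not the team's actual procedure.

```python
# Hypothetical session log: what a search retrieved, what evaluators judged
# relevant, and the relevance calls the user made from the metadata records.
retrieved = {"rec1", "rec2", "rec3", "rec4"}
relevant = {"rec1", "rec3", "rec9"}
user_judgments = {"rec1": True, "rec2": True, "rec3": True, "rec4": False}

# Precision: fraction of retrieved records that are relevant.
precision = len(retrieved & relevant) / len(retrieved)

# Error rate: fraction of user judgments that disagree with the evaluators'.
errors = sum(1 for rec, call in user_judgments.items() if call != (rec in relevant))
error_rate = errors / len(user_judgments)

print(f"precision={precision:.2f}, error_rate={error_rate:.2f}")
```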

  8. Metadata schema evaluation – Embedment of FRBR (IFLA FRBR "generic tasks" mapped to usability measures and treatments) • Find – measures: information accuracy, precision; treatment: query modification • Identify – measures: helpfulness, ease of understanding; treatment: users' relevance judgment vs. evaluators' false judgment detection • Select – measures: information adequacy, error rate • Obtain – measure: physical item accessibility; treatment: physical accessibility check
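The embedment above can be restated as a simple lookup structure. The row assignments follow the reconstruction of the slide's table and are illustrative; the Select row's treatment is not legible in the source, so it is left unset.

```python
# Mapping of IFLA FRBR generic tasks to usability measures and treatments,
# reconstructed from the slide; read as an illustration, not an
# authoritative restatement of the team's design.
FRBR_EMBEDMENT = {
    "find":     {"measures": ["information accuracy", "precision"],
                 "treatment": "query modification"},
    "identify": {"measures": ["helpfulness", "ease of understanding"],
                 "treatment": "users' relevance judgment vs. evaluators' "
                              "false judgment detection"},
    "select":   {"measures": ["information adequacy", "error rate"],
                 "treatment": None},  # not specified on the slide
    "obtain":   {"measures": ["physical item accessibility"],
                 "treatment": "physical accessibility check"},
}

for task, plan in FRBR_EMBEDMENT.items():
    print(f"{task}: {', '.join(plan['measures'])}")
```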

  9. Metadata schema evaluation – Methods/treatments • Stratified and purposive sampling • Training and practice • Demographic questionnaire • Simulated topical scenario • Query modification using metadata records as the source of relevance feedback • Post-test questionnaire • Lab observation (audio/video taping, observation notes) • Think-aloud protocol • Exit interview …

  10. MIC Evaluation (experiences/lessons) • Embed the IFLA FRBR • Adapt measures & treatments • Provide situational information [Diagram: MIC evaluation approach – the four FRBR generic tasks are embedded into evaluation questions; criteria, measures, and instruments are adapted through brainstorming, literature review, communication, and MIC analysis]

  11. Acknowledgements • Thanks to Ms. Grace Agnew for her innovative idea of applying the IFLA FRBR as the framework for the evaluation project • Thanks to Dr. Tefko Saracevic for his excellent leadership of our evaluation team • Thanks to Ms. Judy Jeng for her fine work as a team member

  12. MIC Evaluation Team • Tefko Saracevic, Ph.D., Evaluation Investigator • Ying Zhang, Doctoral Student, Evaluation Coordinator • Yuelin Li, Doctoral Student • Judy Jeng, Ph.D. Candidate • School of Communication, Information and Library Studies, Rutgers, the State University of New Jersey, 4 Huntington Street, New Brunswick, NJ 08901, U.S.A. • Tel.: (732) 932-7500, ext. 8222 • Fax: (732) 932-2644 • Email: miceval@scils.rutgers.edu • URL: http://www.scils.rutgers.edu/~miceval
