
Crossing the Rubricon: Assessing the Instructor


Presentation Transcript


  1. Crossing the Rubricon: Assessing the Instructor. Ned Fielden, Mira Foster, San Francisco State University, San Francisco, California, USA

  2. Case Study: Assessment of Librarian Instructors • Literature Review • Theoretical Issues • Rubric Design and Implementation • Preliminary Review

  3. Instructor Assessment: Several Methods • Supervisor Review • Peer Evaluation • Surveys • Performance Assessment (students' learning outcomes assessed)

  4. Institutional Need for Instructor Assessment • Retention of probationary candidates, tenure, and promotion • The CSU, as a public institution, is criteria-based, with strict rules about personnel review • Summative vs. formative assessment

  5. Process • Literature Review • Identify Suitable Mechanism for Review • Create Draft • Consult with Library Education Committee • Formally Adopted by Library Faculty

  6. Rubrics • Powerful, easy to use, standardized • Considerable literature on rubric use for assessing students/programs/outcomes • Little on assessing library instructors

  7. Value of Rubrics • Standardized • Easy to use (minimal training) • Ensures all review criteria are met • Opens possibilities for quantitative data analysis and for introducing new values • Can be employed for both summative and formative assessment
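As an illustrative aside (not part of the original slides), the quantitative-analysis point can be sketched in a few lines of Python. The criterion names and the 1-4 scale below are hypothetical stand-ins, not the rubric actually adopted at SFSU.

```python
# Illustrative sketch only: aggregating analytic rubric scores across
# observed teaching sessions for simple quantitative review.
# Criterion names and the 1-4 scale are hypothetical examples.
from statistics import mean

observations = [
    {"learning_objectives": 4, "engagement": 3, "pacing": 3},
    {"learning_objectives": 3, "engagement": 4, "pacing": 3},
    {"learning_objectives": 4, "engagement": 4, "pacing": 2},
]

def criterion_means(scored_sessions):
    """Average each rubric criterion across observed sessions."""
    criteria = scored_sessions[0].keys()
    return {c: round(mean(s[c] for s in scored_sessions), 2) for c in criteria}

print(criterion_means(observations))
# {'learning_objectives': 3.67, 'engagement': 3.67, 'pacing': 2.67}
```

Per-criterion averages of this kind are what make an analytic rubric amenable to simple quantitative review across multiple observed sessions.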

  8. Rubric Basics • A glorified “checklist,” annotated to establish criteria and distinct items

  9. Rubric Complexity • May be designed to reflect highly nuanced categories *Oakleaf, M. L. (2006). Assessing information literacy skills. Dissertation, University of North Carolina.

  10. Types of Rubrics • Analytic: specific criteria, isolated facets, capacity for highly granular scoring. Analytic rubrics “divide … a product or performance into essential traits or dimensions so that they can be judged separately …”* • Holistic: big picture, fuzzier focus; an “overall, single judgment of quality”* *Arter and McTighe, Scoring rubrics, 2001.
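Again as an illustrative aside rather than presentation content, the analytic/holistic distinction can be shown as simple data structures; all trait names and level descriptors here are hypothetical.

```python
# Illustrative sketch only: an analytic rubric scores each trait separately,
# while a holistic rubric records one overall judgment of quality.
# Trait names and level descriptors are hypothetical examples.

analytic_rubric = {
    "clarity_of_instruction": {1: "unclear", 2: "developing", 3: "clear", 4: "exemplary"},
    "student_engagement":     {1: "minimal", 2: "sporadic", 3: "steady", 4: "sustained"},
}

holistic_levels = {
    1: "Session did not meet its instructional goals",
    2: "Session partially met its goals",
    3: "Session met its goals",
    4: "Session fully met its goals with strong engagement",
}

# Analytic score: one judgment per trait (granular, separable facets).
analytic_score = {"clarity_of_instruction": 3, "student_engagement": 4}

# Holistic score: a single overall judgment.
holistic_score = 3
```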

  11. Rubric Design • What criteria to include • Opportunity to introduce specific program values • Involvement of all constituents (evaluators and evaluatees)

  12. Rubric Implementation • Formative: raw data given to the candidate, pre- and post-consultation, candidate uses the data however desired • Summative: framework for the formal letter in the RTP (retention, tenure, and promotion) file

  13. Summary • A powerful, easy-to-use tool that levels the playing field and is highly customizable • Open issues around mixing formative and summative functions

  14. Further Study • Explore different varieties of instructor assessment tools • Test different rubrics • Establish a balance point between depth of data and ease of use • Evaluate outcomes

  15. Crossing the Rubricon: Assessing the Instructor • Bibliography: http://online.sfsu.edu/~fielden/rbib.html • Sample Rubric: http://online.sfsu.edu/~fielden/rubrics.html • Bridge photo with permission from robep, http://www.flickr.com/photos/robep/
