
Assessment 101

Presentation Transcript


  1. Assessment 101 Joyce Chapman Project Manager, Triangle Research Libraries Network Heads of Cataloging Interest Group ALA annual, Anaheim CA 25 June 2012

  2. What is assessment?

  3. Assessment is a continuous and cyclical process by which we evaluate and improve services, products, workflows, and learning.

  4. A continuous process “Assessment is a process whose power is cumulative.” – University of Washington’s assessment principles • “One-shot” assessments can be useful in the right context • Ongoing assessment is what fosters ongoing improvement

  5. Planning phase Amazing how often people decide to skip this. Planning is one of the most difficult phases. • Determine your objectives • Define the questions that need to be answered • Design a method to answer the questions (set up a study, collect new data, extract existing data from a system)

  6. Implementation (data gathering) • NOT numbers for the sake of numbers • We frequently measure things that are easy to measure, without a good reason for doing so. This data may not help us answer meaningful questions. • For data collection to foster assessment, we must first determine what it is we really care about, then initiate data collection that will inform meaningful analysis and outcomes.

  7. Assessment is a continuous and cyclical process by which we evaluate and improve services, products, workflows, and learning.

  8. React / refine • The piece of the assessment cycle most often ignored is the last: making change based on the findings of data analysis. • It is often inaction on the part of management that leaves the assessment loop incomplete, ending with the reporting of findings and never resulting in action.

  9. Summary • Continuous • Cyclical • Evaluate AND improve • Requires that action be taken in response to findings • Our environment and users are always changing; we are always reacting.

  10. Evidence-based practice A movement to encourage and give practitioners the means to incorporate research into their practice where it may have been lacking. – Journal of Evidence Based Library and Information Practice • The underlying assumption is that it is impossible to make good evidence-based decisions when our evidence base is weak; therefore, an evidence base must be built.

  11. Evidence-based practice Stresses three aspects that contribute to a practice that is evidence-based: • the best available evidence is used • it is moderated by user needs and preferences • it is applied to improve the quality of professional judgments

  12. “An approach to information science that promotes the collection, interpretation, and integration of valid, important and applicable user-reported, librarian-observed, and research-derived evidence. The best available evidence, moderated by user needs and preferences, is applied to improve the quality of professional judgments.” – Anne McKibbon

  13. Context of assessment • Libraries often talk about assessment in the specific context of proving our institutional value to external audiences • Contribution to student retention, graduation, and employment rates; student learning outcomes • Equally valuable is assessment to improve internal workflows and services, or assessment of the cost and value of workflows to contribute to knowledge in the field.

  14. Why perform assessment? • Improve efficiency • Modify workflows to funnel staff time and efforts where they provide the most benefit • Prove the value of existing or proposed services/positions to higher administration • Contribute to the available data/literature in the cataloging field so that you can work together to implement evidence-based practice across the nation

  15. A culture of assessment

  16. What is a culture of assessment? A culture of assessment refers to whether the predominating attitudes and behaviors that characterize the functioning of an institution support assessment.

  17. What signals a culture of assessment? • Do staff take ownership of assessment efforts? • Does administration encourage assessment? • Is there a comprehensive assessment program? • Is assessment pervasive; do we see ongoing assessment efforts throughout the organization? • Are there efforts to teach staff about assessment? • Is assessment included in plans and budgets? “Establishing a Culture of Assessment” by Wendy Weiner, 2009

  18. What signals a culture of assessment? • Is assessment mentioned in the strategic plan? Does the organization have an assessment plan? • Does the organization financially support staff members whose positions are dedicated in whole or in part to assessment-related activities? • Does the organization fund professional development of staff related to assessment? • Is the organization responsive to proposals for new endeavors related to assessment? “Establishing a Culture of Assessment” by Wendy Weiner, 2009

  19. Structural difficulties • Bottom-up: it is difficult for staff to gain support for conducting assessment projects, or for implementing change based on findings, when upper administration is not assessment-focused. • Top-down: an assessment-focused upper administration can have staff who do not support assessment, so mandates for assessment are resented (the “defiant compliance” culture).

  20. Context of assessment in higher education

  21. Inputs, outputs, and outcomes • Input measures: quantify a library's raw materials (collection size, staff size, budget). Longest tradition of measurement in libraries. • Output measures: measure the actual use of library collections and services (circulation stats, gate counts, reference transactions) • Outcome measures: measure the impact that using library services, collections, and space has on users (libraries' impact on student learning)

  22. What drives the “assessment agenda”? • Changing times • Explosive growth in technologies • Increased customer expectations for service quality and responsiveness • Shrinking budgets • Justifications for spending $ on resources, programs, and services are now required • Increased competition for resources • A fight to remain relevant and prove value Martha Kyrillidou, “Planning for Results: Making the Data Work For You.” 2008.

  23. Why wasn’t there a focus on assessment in libraries for so long?

  24. Return On Investment “A performance measure used to evaluate the efficiency of an investment…. a way of considering profits in relation to capital invested.” “ROI provides a snapshot of profitability adjusted for the size of the investment assets tied up in the enterprise.” Sources: Wikipedia and Investopedia
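
To make the arithmetic concrete, here is a minimal sketch in Python of the basic ROI calculation, using hypothetical figures chosen purely for illustration:

    # Minimal sketch: generic ROI arithmetic with hypothetical numbers.
    def roi(gain_from_investment, cost_of_investment):
        """Return on investment, expressed as a fraction of the capital invested."""
        return (gain_from_investment - cost_of_investment) / cost_of_investment

    # A $10,000 investment that returns $12,500 has an ROI of 0.25, i.e. 25%.
    print(roi(12_500, 10_000))  # 0.25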

  25. PROFITS! Cost = $$ Value = $$ For-profits must create their own revenue. Otherwise, they cease to exist. Livelihood depends on cost/value assessment.

  26. Libraries Like most higher education, we are non-profits. Funding sources aren’t directly tied to real value (more “perceived value” and tradition). Cost = ?? Value =??

  27. Put simply… because higher education has historically been given a large percentage of its annual funding by external powers based on perceived value, we have not developed a culture of needing to closely prove value, to track inputs against outputs, or to tie investment to profit.

  28. Assessing the cost and value of bibliographic control

  29. 2011 LRTS article by Stalberg & Cronin

  30. In June 2009, the Heads of Technical Services in Large Research Libraries Interest Group of ALCTS sponsored a Task Force on the Cost/Value of Bibliographic Control. • Members: Ann-Marie Breaux, John Chapman, Karen Coyle, Myung-Ja Han, Jennifer O’Brien Roper, Steven Shadle, Roberta Winjum, Chris Cronin, and Erin Stalberg

  31. The task group found that the technical services community has long struggled with making sound, evidence-based decisions about bibliographic control. • If technical services is to attempt cost/value assessment of bibliographic control, one of the first problems is a lack of operational definitions of value; we must create our own operational definitions of value with which to work.

  32. Fundamental questions for defining value • Can value be measured in ways that are non-numeric? • Is discussing relative value over intrinsic value helpful? • Does value equal use? • Is it possible to define a list of bibliographic elements that are “high-value” and others that are “low-value”?

  33. While the charge was to develop measures for value, the Task Force determined that doing so would not be helpful until the community has a common vocabulary for what constitutes value and an understanding of how value is attained, and until more user research is conducted into which bibliographic elements result in true research impact.

  34. Operational definitions of value • Discovery success • Use • Display understanding • Ability of bibliographic data to operate on the open web and interoperate with vendors and suppliers in the bibliographic supply chain

  35. Operational definitions of value • Ability to support the Functional Requirements of Bibliographic Records (FRBR) user tasks • Throughput and timeliness • Ability to support the library’s administrative and management goals

  36. “Value multipliers” Extent to which bibliographic data: • are normalized • support collocation and disambiguation in discovery • use controlled terms across format and subject domains • match the level of granularity users expect • enable a formal and functional expression of relationships (links between resources) to find “like” items • are accurate • allow enhancements to proliferate to derivative records

  37. Measuring cost While elements contributing to the cost can be outlined, determining whether the costs are too high is impossible without first having a clear understanding of “value.” • Salaries & benefits multiplied by time spent on a task • Cost of cataloging tools, such as software • Time spent on database maintenance • Overhead (training, policy development, documentation) • Opportunity costs
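
To make the salary-times-time element concrete, the following minimal sketch estimates a per-record labor cost; the figures and variable names are hypothetical and are not drawn from the article:

    # Hypothetical figures for illustration only.
    hourly_rate = 28.00            # salary + benefits per hour
    minutes_per_record = 12        # average time spent cataloging one record
    records_per_year = 4_000

    labor_cost_per_record = hourly_rate * (minutes_per_record / 60)   # $5.60
    annual_labor_cost = labor_cost_per_record * records_per_year      # $22,400
    print(f"${labor_cost_per_record:.2f} per record, ${annual_labor_cost:,.0f} per year")
    # Tools, database maintenance, overhead, and opportunity costs would be added on top.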

  38. Collecting data

  39. Types of data Quantitative methods focus on numbers and frequencies; provide data that is easy to analyze statistically. “Numbers.” • Analysis of log data, systems reports, time data, web usage analytics, survey data (not free text) Qualitative methods capture descriptive data and focus on experience and meaning. “Words.” • Usability testing, focus groups, user interviews, ethnographic studies

  40. Coding qualitative data "There's no such thing as qualitative data. Everything is either 1 or 0.” – Fred Kerlinger • While qualitative data provides the important whys and hows of user behavior, it is difficult to digest large quantities of descriptive data. • It is often useful to code qualitative data quantitatively for analysis.
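
As a small illustration of coding qualitative data quantitatively, the sketch below assigns invented category codes to invented free-text comments and then tallies them so they can be analyzed as numbers; both the comments and the categories are assumptions made for the example:

    # Invented survey comments coded into invented categories, then counted.
    from collections import Counter

    coded_responses = [
        ("The catalog search is confusing", "discovery"),
        ("I could not tell which edition I was looking at", "display"),
        ("Records for e-books are missing links", "access"),
        ("Search results are hard to narrow down", "discovery"),
    ]

    counts = Counter(code for _, code in coded_responses)
    print(counts)  # Counter({'discovery': 2, 'display': 1, 'access': 1})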

  41. Fear: assessment takes a lot of time • Reality: it depends on the methodology and data sources used • Qualitative data gathering, coding, and analysis usually take a lot of time • Systems can be set up to gather quantitative data programmatically. Such data can be analyzed quickly, given the proper tools and skills • Quantitative data might also be gathered manually. Data collection will be a hassle, but analysis will be quick
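
For example, a short script can pull counts out of an exported system log so quantitative data is gathered without manual tallying. This is a sketch that assumes a CSV export with a "transaction_type" column; it does not describe any particular ILS:

    # Assumes an exported CSV log with a "transaction_type" column; real systems differ.
    import csv
    from collections import Counter

    def count_transactions(log_path):
        with open(log_path, newline="") as f:
            return Counter(row["transaction_type"] for row in csv.DictReader(f))

    # counts = count_transactions("circulation_log.csv")
    # print(counts.most_common(5))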

  42. Existing data or new data?

  43. Digital exhaust data

  44. Collect new data • Know what questions the data needs to be able to answer • Data requirements; structure of the data • Make sure you will be able to extract the data • Make sure the data format you’ve chosen will be interoperable with any other data you are using in an initiative
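
One lightweight way to follow these points is to write the data structure down before collection begins. The sketch below defines the columns up front and appends rows to a plain CSV file so the data remains easy to extract and to combine with other sources later; the field names are hypothetical examples for a cataloging time study:

    # Hypothetical field names for a cataloging time study; plain CSV keeps the
    # data easy to extract and to merge with other data sets later.
    import csv
    import os

    FIELDS = ["record_id", "format", "task", "minutes_spent", "cataloger", "date"]

    def append_observation(path, observation):
        is_new = not os.path.exists(path) or os.path.getsize(path) == 0
        with open(path, "a", newline="") as f:
            writer = csv.DictWriter(f, fieldnames=FIELDS)
            if is_new:
                writer.writeheader()
            writer.writerow(observation)

    # append_observation("time_study.csv", {"record_id": "b1234", "format": "ebook",
    #     "task": "copy cataloging", "minutes_spent": 9, "cataloger": "JC",
    #     "date": "2012-06-25"})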

  45. Bad data planning

  46. Methodologies / techniques
