
Leading by Example: Assessing Institutional Research Office Outcomes


Presentation Transcript


  1. Leading by Example: Assessing Institutional Research Office Outcomes
     Rachel Baker & Elizabeth Dayton, Graduate School of Education, Stanford University
     Andrew LaManque, Foothill-De Anza CCD
     Katherine McLain, Cosumnes River College

  2. Background
     Institutional research is becoming increasingly important for measuring institutional quality, as witnessed by new reporting requirements from federal and state governments and from accreditors. The senior institutions under WASC have a standard requiring an evaluation of the IR office (see "Periodic Reviews of the IR Office and WASC Criterion for Review 4.2," Heather Brown, Sutee Sujitparapitaya, Rebecca Sorell, and Michael Wrona, CAIR, 2013). While ACCJC does not have such an explicit requirement, its new standards assume a highly functional IR office.

  3. Satisfaction, Outputs, and Outcomes

  4. Purposes of Assessing IR Outcomes

  5. FHDA IR AUOs 2012
     • Articulate 3 questions that are important to ask when starting a research project at FHDA.
     • Interpret and draw correct conclusions from a cross tabulation of descriptive statistics, such as course success rates by ethnicity (a sketch follows below).
     • Using data provided by the IR office, identify and describe 3 key attributes of the students served by the employee's assigned work unit.
     • Utilize data provided by the IR office to enhance programs and services.
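To make the cross-tabulation AUO concrete, here is a minimal sketch in Python with pandas of the kind of table an employee would be asked to interpret. The records and column names are invented for illustration; they are not FHDA data.

```python
# A hypothetical cross tabulation of course success rates by ethnicity.
import pandas as pd

records = pd.DataFrame({
    "ethnicity": ["A", "A", "B", "B", "B", "C", "C", "A"],
    "succeeded": [True, False, True, True, False, True, True, True],
})

# Rows: ethnicity group; columns: outcome; values: within-group shares,
# so each row sums to 1.0 and the True column is that group's success rate.
success_rates = pd.crosstab(
    records["ethnicity"], records["succeeded"], normalize="index"
)
print(success_rates)
```

Interpreting such a table correctly means, for example, reading group B's True share as that group's course success rate, not as its share of all successes.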

  6. IR AUO Assessment Cycle
     • Dropped one AUO
     • Asked 'test' questions

  7. Structured Interviews, Spring 2014
     We pursued two overarching questions:
     1. Is the information received from IR useful? Specifically, is it presented in a way that is easily understood and applied?
     2. How is information received from IR used to make decisions?

  8. In discussing these questions, we often asked:
     • How often are you in touch with IR? In what capacity? (e.g., meetings, presentations, emails)
     • What kind of information do you tend to receive from IR? (i.e., tell me about the kinds of things you tend to learn from IR; do you have a recent example?)
     • How do you tend to use the information you receive from IR? What decisions have you made based on information you have received from IR? Can you think of a specific example?
     • How could the information you receive from IR be more useful? Are there ways in which the information IR provides you is mismatched with the kinds of questions you tend to have, or the kinds of decisions you need to make? Are there specific kinds of information you would especially appreciate from IR?
     • Has your understanding of data, and how best to use it, developed through your interactions with IR? Do you feel you've learned to use data differently?
     • What do you think IR does particularly well?
     • Where do you think IR could best expand or improve its campus interactions?

  9. Interview Findings
     • IR informs decision making – heard in 10/10 interviews
       • Scheduling and curriculum decisions – 6 interviews
       • Hiring decisions – 1 interview
       • By complementing faculty and administrators' "qualitative" knowledge – 8 interviews
     • There are opportunities to grow – 10/10 interviews
       • IR is perceived to benefit some departments more than others – 4 interviews
       • More longitudinal data would be valuable, both before and after enrollment – 4 interviews
       • More readily accessed data could be valuable – 6 interviews
     • IR provides additional value – 9/10 interviews
       • Supporting grants and funding – 6 interviews
       • Providing individualized support – 8 interviews
       • Facilitating productive communication – 2 interviews
     • IR identifies strengths and points of vulnerability – 7/9 interviews
       • Curriculum and programs – 7 interviews
       • Faculty – 2 interviews
       • Students – 3 interviews

  10. Assessment of Outcomes
     Campus leaders know how to interpret and apply research
     • 7/9 of those interviewed cited examples of IR skills that they had learned. {Goal = 90%}
     IR data are used to make improvements
     • 9/9 of those interviewed cited examples of using IR data to make improvements. {Goal = 100%}
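As a quick arithmetic check of these results against their goals (the counts and goals come from the slide; the snippet itself is only illustrative):

```python
# Observed assessment results from the slide, compared with the stated goals.
outcomes = {
    "interpret and apply research":   (7, 9, 0.90),
    "data used to make improvements": (9, 9, 1.00),
}

for name, (met, total, goal) in outcomes.items():
    rate = met / total
    verdict = "meets" if rate >= goal else "falls short of"
    print(f"{name}: {met}/{total} = {rate:.0%} {verdict} the {goal:.0%} goal")
```

So the first outcome (about 78%) falls short of its 90% goal, while the second (100%) meets its goal.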

  11. Establishing a Culture of Inquiry
     • Definition of culture – the behaviors and beliefs characteristic of a particular group
     • Definition of inquiry – a systematic investigation into a matter
     • Working definition of a culture of inquiry – an organization with a culture of inquiry would be characterized by the regular and systematic investigation of practices and outcomes across the organization.
     • Related outcome – the college community has the information needed to adequately assess and improve the effectiveness of its instructional and non-instructional programs and services.
     (RP Conference Presentation, 2014)

  12. Two Assessments
     • Matrix to analyze data dissemination (BRIC TAP Inquiry Guide, p. 13)
       http://www.rpgroup.org/BRIC/InquiryGuide/InfoCapacity
     • Administrator survey (derived from the Strategic Data Project at Harvard University)
       http://www.gse.harvard.edu/cepr-resources/files/news-events/sdp-rubric-self-asssessment.pdf

  13. Data Integration Matrix
     [Figure: Data Integration Strategy Matrix – a 2×2 grid with SCOPE (low to high) on the vertical axis and IMPACT (low to high) on the horizontal axis.]

  14. Application of the Matrix
     [Figure: the same SCOPE × IMPACT matrix, shown as applied to research office activities.]
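As a hypothetical illustration of how a research office activity log might be bucketed into the matrix's four quadrants (the activities and their scope/impact ratings below are invented, not the presenters' classifications):

```python
from collections import defaultdict

# Invented activity log; each entry is rated low/high on scope and impact.
activities = [
    ("program review data packets", "high", "high"),
    ("fact book tables",            "high", "low"),
    ("grant outcomes report",       "low",  "high"),
    ("ad hoc enrollment query",     "low",  "low"),
]

# Group activities by quadrant of the scope/impact matrix.
quadrants = defaultdict(list)
for name, scope, impact in activities:
    quadrants[(scope, impact)].append(name)

for (scope, impact), names in sorted(quadrants.items()):
    print(f"{scope} scope / {impact} impact: {', '.join(names)}")
```

Once activities are bucketed this way, sparse or crowded quadrants point to possible areas of improvement, which matches the "quickly identifies possible areas of improvement" point on the next slide.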

  15. Reflection on the Assessment Tool
     PROS
     • Easy to apply
     • Can be applied in a variety of contexts
       • Types of activities
       • Activity lists/logs
       • Research agendas
     • Provides a new way to track and evaluate activities
     • Quickly identifies possible areas of improvement
     CONS
     • Evaluates outputs, not their impact
     • May not span all research office activities
     • Some activities span classification areas

  16. Administrator Survey
     • Questions derived from the 7 principles in the Strategic Use of Data Rubric
     • Perception survey about:
       • Application of these principles at the college
       • Evaluation of personal application of these principles
       • Relative importance of research office data dissemination
     • Pilot-tested this semester with the leadership team (n = 16)
     • Survey instrument available at http://www.crc.losrios.edu/Faculty_and_Staff/Research_Office/Research_Office_Web-based_Surveys.htm
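A minimal sketch of how responses from such a pilot might be summarized for one Likert-style item; the item scale and values below are invented, not drawn from the actual CRC instrument:

```python
import statistics
from collections import Counter

# Hypothetical responses (n = 16) to one 1-5 Likert item from the pilot.
responses = [4, 5, 3, 4, 4, 2, 5, 4, 3, 4, 5, 4, 3, 4, 4, 5]

print(f"n = {len(responses)}")
print(f"mean = {statistics.mean(responses):.2f}, "
      f"median = {statistics.median(responses)}")
print("distribution:", dict(sorted(Counter(responses).items())))
```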

  17. Application of the Survey
     • Institution
       • Strength – using data to set goals and monitor college performance
       • Challenge – using data for program enhancement
     • Individual
       • Challenge – using data to adjust budgets
     • Important research office activities
       • Internal research disseminated in response to a request or related to a topic/issue being discussed
       • Website, data briefings, discussions about actions taken in response to research
     • Relatively unimportant research office activities
       • External research related to a topic/issue being discussed

  18. Reflection on the Survey Tool
     PROS
     • Can be customized to different groups
     • First implementation can set a baseline
     • Can identify relative strengths and potential areas of change
     • Open-ended questions provide a wealth of data
     • Survey can lead to individual learning and meaningful dialog
     CONS
     • Diversity in the responses may reflect diversity of roles
     • Not appropriate for institution-wide assessment
     • Indirect assessment
       • Identifies potential growth areas, not specific interventions
     • Many inputs contribute to the outcome

  19. Questions?
