
Informing Practice: Quality Assessment, July 31, 2006


Presentation Transcript


1. Informing Practice: Quality Assessment, July 31, 2006

2. By the end of this session, we will …
• Participate, learn, and have fun!
• Answer:
  • Why is it important to ask?
  • How do we “inform our practice” through four stages?
  • Will assessing make a difference?
  • Do I have the skills to begin successfully and feel good about what I am doing?

3. Why Bother?
• If you always do …
• Our need to know . . .
  • How are we doing?
  • How well are our students doing?
  • How do we know?
  • Have we made a difference?
  • Have we met our goals?
• Answer in a systematic way → credibility

4. Why Bother? Asking:
• Provides “informed answers”: we can speak knowledgeably when asked, “How do you know?”
• Demonstrates that we are serving our students:
  • Interested in knowing if we delivered what we promised → listening
  • Gathering evidence to use for improvement
• Contributes to our own learning

5. A Natural Process
Asking after an event occurs:
• What was good about it?
• What was not so good?
• What will we do differently next time? Why?
Reflection

6. Guiding Principles
• There is no single “right answer”
• It is a process of learning together
• It is a value-added process through synergy

7. The Assessment Cycle
Mission Statement → Strategic Goals → Informing Practice:
1. Setting Measurable Goals
2. Planning to Reach Goals
3. Data Collection
4. Data Analysis, Reporting, and Action

8. Stage 1: Setting Measurable Goals
Planning Questions:
• What do we want our students to be able to know and do?
• What are observable and measurable outcomes (behaviors to track) that will let us know what our students know and can do?
• What tasks will students engage in that demonstrate what we expect of them?
• What tool is used to measure the indicator? Tools may be a survey, interview, observation checklist, etc., based on the outcomes.
Example Goals:
• Students will participate in an effective experience that develops their interpersonal and leadership skills.
• Students will be able to rate the two aspects of the experience and demonstrate an example of applying their skills of collaboration and evaluating their leadership actions.
• Students will engage in experiences with two aspects where they will apply their skills of collaboration and evaluate their leadership actions.

9. Stage 2: Planning to Reach Goals
• Advice from “the expert”
• Review goals
• Data collection design
• Data analysis, reporting, and action
• Set future goals

10. Stage 3: Data Collection
Planning for Success:
• Purpose
• Process: Who? What? How? When?
• Lessons Learned

11. Stage 4: Data Analysis, Reporting, and Action
Results:
• Analyze data to learn what was said
• Report and communicate to “close the loop”
• Action plans for the future
• Data-driven decision making
• Completing the cycle

12. Stage 3 Review: Data Collection
Planning for Success:
• Purpose
• Process: Who? What? How? When?
• Lessons Learned

13. Purpose
• Clearly write: “The purpose of our survey is to . . .” (e.g., to determine, to discover)
• State:
  • What do we want to know?
  • What will we do with the information? How will we use the assessment results for improvement?
• The purpose statement informs our data collection design
• It becomes the basis for the Letter to Participants

14. Process – Who?
Who do we ask?
• As researchers, we cannot assume that we know what everyone is thinking
• Ask the people who can answer the questions from their own perspective

15. Process – What?
What do we ask?
• Review the purpose statement
• Pilot the questions:
  • Do the questions work?
  • What information will they give us?
  • Will the information inform our decision making?
• Be mindful of the participant’s time

16. Process – How?
How do we ask to get information?
• Open-ended question format
• Closed-ended question format:
  • Likert (feelings / attitude / opinion) scale of 1 to 5, 1 to 7, 1 to 4, or others
  • Yes / No answers
• Paper / electronic
• Focus groups: using scripts; a recorder and cross-check; a skilled interviewer
• Consider needs:
  • Consent letter / Institutional Review Board (IRB)?
  • Anonymous or confidential?

17. Process – When?
When do we ask?
• Immediately, or risk the “time heals” syndrome
• Later, to benefit from reflection
• Check Survey Central: has it been asked before?
• Avoid “survey fatigue”

18. Lessons Learned
• Critique survey examples
• Analyze the response population
• Letter to Participants
• Pilot
• Scales and ratings

19. Lessons Learned: Critique Survey Examples

20. Lessons Learned: Response Population Analyzed
• What is a good response rate?
• Sample Size Calculator (a minimal sketch follows below)
• Population Profile
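
The “Sample Size Calculator” on the slide is not specified further, so here is a minimal sketch of what such a calculator typically computes, assuming Cochran’s formula with a finite-population correction. The function name, the defaults (95% confidence, ±5% margin of error, conservative proportion 0.5), and the example population are illustrative assumptions, not part of the original deck.

```python
import math

def sample_size(population: int, z: float = 1.96,
                margin: float = 0.05, p: float = 0.5) -> int:
    """Hypothetical helper: Cochran's sample-size formula with a
    finite-population correction. z = 1.96 corresponds to ~95% confidence,
    margin is the tolerated error, and p = 0.5 is the most conservative
    assumed proportion (it maximizes the required sample size)."""
    n0 = (z ** 2) * p * (1 - p) / margin ** 2            # infinite-population estimate
    return math.ceil(n0 / (1 + (n0 - 1) / population))   # finite-population correction

# Example: surveying a program of 2,000 students at 95% confidence, +/-5% margin
print(sample_size(2000))  # -> 323 completed responses needed
```

Note that the result is the number of completed responses, not invitations: at an assumed 30% response rate, reaching 323 completions would mean inviting roughly 323 / 0.30 ≈ 1,077 people, which ties the calculator back to the slide’s question about what a good response rate is.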

21. Lessons Learned: Letter to Participants
• Content
• Message

22. Lessons Learned: Letter to Participants (continued)
Write an excellent letter of invitation to participate:
• Identify yourself and explain why the survey is happening
• Explain what will be done with the results
• Note changes made as a result of past surveys
• State: “Data will be reported in aggregate form only”
• Incentives?
• Signatory? A personal connection makes a difference
• Remember: be empathetic

23. Lessons Learned: Pilot
• Is the wording clear?
• Do the questions “work”?
• What information will they give us?
• Is the information meaningful?
• Be mindful of the participant’s time

24. Lessons Learned: Likert Scale
• 5-point scale: responses cluster at the mid-point
• 7-point (or higher) scale: many choices
• 4-point scale: forces an opinion
• Define each level
When rating, ask why . . .
• Rating was low → ask why, rather than just collecting negative responses
• Rating was high → ask why, so you can replicate the good
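
To make the scale advice concrete, here is a minimal sketch of tallying a single Likert item in Python; the helper name and the sample ratings are invented for illustration and are not from the deck. The mid-point cluster that the slide warns about on a 5-point scale shows up immediately in such a tally.

```python
from collections import Counter

def summarize_likert(ratings: list[int], scale_max: int = 5) -> None:
    """Hypothetical helper: tally one Likert item and print the share of
    responses at each level, plus the mean rating. 'ratings' holds integer
    responses coded 1..scale_max."""
    counts = Counter(ratings)
    total = len(ratings)
    for level in range(1, scale_max + 1):
        n = counts.get(level, 0)
        print(f"{level}: {n:3d} ({n / total:.0%})")
    print(f"mean: {sum(ratings) / total:.2f}")

# Example: ten responses to a 5-point item; note the cluster at the mid-point (3)
summarize_likert([3, 4, 5, 3, 2, 3, 4, 3, 5, 3])
```

A tally like this also supports the slide’s closing advice: follow up with “why?” on both the low and the high ratings, not only the negative ones.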

25. Lessons Learned: Constructing Questions
• Keep statements short and clear to avoid misinterpretation
• Consider pairing statements to check reliability and validity
• Avoid “and”: this keeps each statement to a single issue (e.g., “The session was engaging and well organized” asks two things at once)
• Pilot

26. Will assessing make a difference?
Data has contributed to our need to know . . .
• We are …
• Our students are …
• We know …
• We made a difference in these ways …
• The goals we met are …
We have the evidence!

27. Informing Practice: Quality Assessment, July 31, 2006
Written and produced by Halyna Kornuta
