NRS Regional Spring 2007 Training on Desk Monitoring: Improving Program Performance

American Institutes for Research and the U.S. Department of Education Office of Vocational and Adult Education

Presentation Transcript


  1. NRS Regional Spring 2007 Training on Desk Monitoring: Improving Program Performance. American Institutes for Research and the U.S. Department of Education Office of Vocational and Adult Education

  2. Introductions and Expectations
Take a moment to look forward to the end of this training, two days from now. Make a note of two things you hope to learn and take with you from this training on desk monitoring. Then, as you introduce yourself to the group, please give your name, title, state or territory you represent, and two expectations you have of this training. Refer to H-1

  3. Objectives of the Training
By the end of the training, participants will be able to
• Discuss the purposes and benefits of desk monitoring;
• Identify measures, performance standards, and evaluation tools;
• Design a desk monitoring tool and summative rubric;
• Develop a plan for implementing and using desk monitoring; and
• Evaluate the desk monitoring system and make refinements, as needed.

  4. Agenda for Day 1
• Welcome, introductions, expectations, objectives;
• Purposes of local program monitoring;
• Why do desk monitoring? Relationship between on-site and desk monitoring; how monitoring fits into the program improvement process;
• Warm-up activity: Current state of the states re: on-site and desk monitoring;
• Steps to Desk Monitoring
• Step 1. Design a Desk Monitoring Tool
• States select process and outcome measures
• Small group review and feedback

  5. Agenda for Day 2
• Step 1. Design a Desk Monitoring Tool (Cont.)
• Demonstration of monitoring tool and template
• State teams use desk monitoring tool to develop templates;
• Activity on evaluating a summative rubric;
• Planning time for states to develop their own rubrics
• Total group processing of issues related to developing rubrics

  6. Agenda for Day 3
• Jigsaw Activity on Steps 2, 3, and 4
• Step 2. Plan and Develop Implementation Procedures
• Step 3. Implement Your Desk Monitoring Tool
• Step 4. Use and Evaluate Your Desk Monitoring Tool
• State teams plan for implementation and use: 50 Questions
• State reports
• Next Steps and Adjourn

  7. States Monitor Data on…
Financial information, program staffing, student outcomes, instruction, contact hours, professional development, and student characteristics
to ensure that local programs are
• Meeting grant requirements;
• Following required procedures and policies;
• Providing quality services.

  8. States Use Information from Local Program Monitoring to…
• Meet state and NRS accountability requirements (student outcome data);
• Implement and evaluate progress toward state policy objectives;
• Promote program improvement (identify lower performing programs for T/A);
• Identify effective programs and practices (and transfer to other programs).

  9. What is Desk Monitoring?
An approach to reviewing and tracking local program performance by using data from your state system and other quantitative data, as appropriate.

  10. Warm-up Activity
Taking a snapshot of current use of desk monitoring. Refer to H-2

  11. Reciprocal Relationship between Outcomes and Program Practice
(Diagram) Good Program Practices (Effective Instruction, Good Data Collection and Reporting Procedures, Professional Development for Teachers, Valid and Reliable Assessment) linked reciprocally with Good Outcomes

  12. But it’s a Tenuous Relationship between Process and Outcomes
Because of the complexities of the following:
• Uneven resources from year to year;
• Varying staffing patterns;
• Students with widely diverse skills and goals.

  13. Why do Desk Monitoring?
• Guide program review activities;
• Facilitate tracking and evaluation of local program performance;
• Both enhance and inform on-site monitoring;
• Plan and provide technical assistance;
• Establish mechanism for regular communication between state and local programs re: needs for technical assistance;
• Promote continuous program improvement.

  14. Desk Monitoring

  15. Relationship between Desk and On-site Monitoring
Desk Reviews: Use findings to focus the more-intensive on-site reviews and to identify T/A needs.
On-site Reviews: Use findings to identify need for more-intensive desk monitoring.
Coordinate both approaches to promote program improvement efforts.

  16. Fitting Desk Monitoring into a Program Improvement System
(Diagram) Onsite Monitoring, Technical Assistance, Program Improvement, Desk Monitoring

  17. I-P-O
Input/Givens: Elements that are "givens," usually beyond our immediate control, e.g., student demographics, student skill levels, teacher characteristics.
Process/System: Elements that describe actions programs plan for/implement to produce the outcomes they want, e.g., curriculum, instructional strategies & materials.
Outcome/Results: Elements that describe the results of the agency's processes, given the input, e.g., student achievement, attendance, completion rates.

  18. Examples you can cite of…
• Programs that had highly positive student outcomes one year and not so positive outcomes the next year?
• Possible reasons for the disparity?
• Actions states can take to help local programs examine data and verify the root causes for the disparity?
• Actions states can take, if any, to provide assistance and get them back on track?
Refer to H-3

  19. Steps to Desk Monitoring
1. Design a Desk Monitoring Tool
2. Plan & Develop Monitoring Procedures
3. Implement Desk Monitoring
4. Use and Evaluate Desk Monitoring
5. Refine Process, as Needed

  20. Step 1. Designing a Desk Monitoring Tool
Essential elements of a good desk monitoring system:
• Measures tied to state priorities that accurately reflect performance;
• Evaluative standards or other benchmarks that define good performance;
• A summative rubric that characterizes overall program performance.

  21. Step 1. Designing a Desk Monitoring Tool
Three elements of a good desk monitoring system:
A. Measures
B. Evaluative Standards
C. Summative Rubric
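Taken together, the three elements lend themselves to a simple computation: compare each measure to its evaluative standard, then roll the results up into a summative rubric rating. A minimal sketch follows; the measure names, thresholds, and rating labels are invented for illustration and do not come from the NRS tool.

```python
# Hypothetical sketch: compare program measures to evaluative standards,
# then summarize overall performance with a simple rubric rating.

# Invented evaluative standards (benchmarks) for a few measures.
STANDARDS = {
    "educational_gain_rate": 0.40,   # share of students completing a level
    "pre_post_test_rate": 0.60,      # share of students pre- and posttested
    "avg_attendance_hours": 60,      # average contact hours per student
}

def meets_standard(measure, value):
    """Return True if the program's value meets or exceeds the standard."""
    return value >= STANDARDS[measure]

def summative_rating(program_values):
    """Roll individual results into one rubric label (invented cut points)."""
    met = sum(meets_standard(m, v) for m, v in program_values.items())
    share = met / len(program_values)
    if share >= 0.8:
        return "Exemplary"
    if share >= 0.5:
        return "Meets expectations"
    return "Needs technical assistance"

# Example: one local program's data for the year (invented values).
program = {
    "educational_gain_rate": 0.45,
    "pre_post_test_rate": 0.55,
    "avg_attendance_hours": 72,
}
print(summative_rating(program))  # -> "Meets expectations"
```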

  22. First Step in Designing the Tool…
Decide on the measures you will use to evaluate program performance.
• Some measures are those required by the NRS and other accountability needs;
• Include other measures to help you understand local programs' procedures and how the programs collect data;
• Select a manageable number of measures.

  23. Three Measures Needed for an Effective Data Monitoring Instrument
• Student outcome measures: the central element (educational gain and follow-up measures); what students are achieving.
• Data process measures: how programs collect outcome measures; how procedures affect data quality.
• Program process measures: what happens in the program to produce outcomes; the procedures and services that affect outcomes.

  24. Measures for Desk Monitoring Tool
1. Outcome Measures
• Educational gain
• Secondary credential
• Entry into postsecondary education
• Entered and retained employment
• Additional state measures
2. Data Process Measures
• Assessment procedures
• Goal setting and orientation
• Follow-up procedures
3. Program Process Measures
• Recruitment and enrollment
• Retention and attendance
• Professional development and staff
• Curriculum and instruction
• Support services

  25. To Select Measures, Your State Needs…
• A clear policy and direction for programs.
• A clear idea of the outcomes you want programs to achieve that will define quality.
For example, state policies on the following can guide the selection of measures:
• contact hours,
• number and type of students a program should enroll,
• outcomes expected (e.g., completing educational levels or promoting passage of GED Tests).
"You measure what you treasure."

  26. "Not everything that can be counted counts, and not everything that counts can be counted." –Albert Einstein

  27. A Word about Data Quality…
Make sure that local programs are able to collect and report information for desk monitoring in a valid and reliable way.
CAUTION: The measures will hinder efforts at program improvement if they do not accurately reflect performance.

  28. 1. Student Outcome Measures
• Educational level completion (one for each NRS or state level, or combined)
• Receipt of secondary credential (adult high school diploma or GED)
• Entry into postsecondary education
• Entered employment
• Retained employment
Note: You do not need to select all of these for your desk monitoring tool.

  29. 2. Data Process Measures
• Number and percentage of students pre-tested by a set time
• Number and percentage pre- and posttested
• Average contact hours between pre- and posttest
• Number of students who completed the goal-setting process
• Time the goal setting was completed
• Number of students with goals
• Number of students contacted by time
• Number of students with SSNs
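Most of these measures reduce to counts, percentages, and averages over student records. A minimal sketch of three of them follows, assuming a hypothetical record layout (pretest_date, posttest_date, contact_hours) rather than an actual NRS export format.

```python
from datetime import date

# Hypothetical student records exported from a state data system.
students = [
    {"id": 1, "pretest_date": date(2006, 9, 20), "posttest_date": date(2007, 1, 15), "contact_hours": 48},
    {"id": 2, "pretest_date": date(2006, 10, 5), "posttest_date": None, "contact_hours": 22},
    {"id": 3, "pretest_date": None, "posttest_date": None, "contact_hours": 10},
]

CUTOFF = date(2006, 10, 1)  # example "set time" for pre-testing

# Number and percentage of students pre-tested by the set time.
pretested_by_cutoff = [s for s in students if s["pretest_date"] and s["pretest_date"] <= CUTOFF]
pct_pretested = 100 * len(pretested_by_cutoff) / len(students)

# Number and percentage both pre- and posttested.
pre_and_post = [s for s in students if s["pretest_date"] and s["posttest_date"]]
pct_pre_post = 100 * len(pre_and_post) / len(students)

# Average contact hours for students with both tests
# (a simple proxy for hours between pre- and posttest).
avg_hours = sum(s["contact_hours"] for s in pre_and_post) / len(pre_and_post) if pre_and_post else 0

print(f"Pre-tested by {CUTOFF}: {len(pretested_by_cutoff)} ({pct_pretested:.0f}%)")
print(f"Pre- and posttested: {len(pre_and_post)} ({pct_pre_post:.0f}%)")
print(f"Average contact hours between tests: {avg_hours:.1f}")
```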

  30. 2. Data Process Measures (Cont.)
A. State Assessment Policy & Procedures:
- Which pre- and posttest standardized assessments to use
- When to administer pre- and posttests
- How to match scores on the assessment to NRS levels
State staff can monitor:
- Number and percentage of students pre-tested by a set time;
- Number and percentage pre- and posttested;
- Average contact hours between pre- and posttest.

  31. 2. Data Process Measures (Cont.)
B. Goal Setting and Orientation Procedures:
- Set appropriate follow-up goals with students;
- Have system in place for tracking students;
- For collecting follow-up measures:
  ► If survey method, process for contacting students and achieving sufficient response rate for a valid survey;
  ► If data matching, process for collecting student SSNs accurately.

  32. 2. Data Process Measures (Cont.)
B. Goal Setting and Orientation Procedures
State staff can monitor:
- Number of students who completed the goal-setting process;
- Time that goal setting was completed relative to student enrollment;
- Number of students with goals.

  33. 2. Data Process Measures (Cont.)
C. Follow-up Procedures:
- Phone survey
- Data matching
- Combination of methods
State staff can monitor:
- Number of students contacted by a set time;
- Number of students with SSNs.

  34. 3. Program Process Measures
• Total enrollment
• Enrollment by subgroups (e.g., educational level, demographic variables)
• Average attendance hours
• Average hours by subgroups
• Proportion of scheduled hours attended
• Number of credentialed teachers
• Number of teachers by ethnicity
• Average hours of professional development received
• Number and types of classes offered by educational functioning level or subject
• Number of instructional hours offered for types of classes
• Number of students receiving or referred to support services
• Number and type of agencies with which programs partner for support services
• Expenditure per student
• Expenditure per outcome
• Expenditure per instructional hour
• Expenditure per student instructional hour
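Several of the attendance measures above are simple ratios over enrollment records. A minimal sketch of two of them, average attendance hours by subgroup and proportion of scheduled hours attended, follows; the record structure is invented for illustration.

```python
# Hypothetical attendance records: one row per student.
records = [
    {"level": "ABE Beginning", "attended_hours": 55, "scheduled_hours": 90},
    {"level": "ABE Beginning", "attended_hours": 40, "scheduled_hours": 90},
    {"level": "ESL Intermediate", "attended_hours": 80, "scheduled_hours": 120},
]

# Average attendance hours by subgroup (here, educational functioning level).
by_level = {}
for r in records:
    by_level.setdefault(r["level"], []).append(r["attended_hours"])
for level, hours in by_level.items():
    print(f"{level}: average {sum(hours) / len(hours):.1f} attended hours")

# Proportion of scheduled hours actually attended, program-wide.
attended = sum(r["attended_hours"] for r in records)
scheduled = sum(r["scheduled_hours"] for r in records)
print(f"Proportion of scheduled hours attended: {attended / scheduled:.0%}")
```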

  35. 3. Program Process Measures (Cont.)
• Offer clues to the reasons for good and poor performance;
• Help guide program improvement efforts;
• Can include model indicators of program quality in the areas of
  - Recruitment and Enrollment
  - Retention and Attendance
  - Professional Development and Staff
  - Curriculum and Instruction
  - Support Services
  - Expenditures

  36. What do you Gain by Tracking… A. Recruitment and Enrollment?
• Do programs serve students at all levels or primarily those at the higher levels?
• Do ABE classes enroll native-English-speaking learners or primarily foreign-born students who have completed ESL instruction?
Note: Measures on student enrollment can provide an indicator of recruitment practices as well as evidence that programs need to improve recruitment practices.

  37. What do you Gain by Tracking… B. Retention and Attendance?
• Do students attend programs long enough to realize gains?
• Do programs provide the number of instructional hours required by the grant?
• Which students exit before completing a level?
• What is attendance relative to scheduled class hours?

  38. What do you Gain by Tracking… C. Professional Development and Staff?
• Number of credentialed teachers
• Number certified in adult education or TESOL
• Number of teachers by ethnicity
• Mean hours of professional development events that instructional staff attended or completed

  39. What do you Gain by Tracking… D. Curriculum and Instruction?
Too complex an issue for desk monitoring, but you can monitor
• Number and types of classes offered by educational functioning level or subject, and
• Number of instructional hours offered for types of classes
Note: These data are useful for monitoring compliance with grant requirements. They can also provide clues to performance patterns; e.g., few instructional hours and classes offered may explain low retention and educational gain.

  40. What do you Gain by Tracking… E. Support Services?
• Number of students receiving or referred to services, such as
  - Transportation
  - Child care
either provided by the program or by program partner agencies

  41. What do you Gain by Tracking… F. Expenditures?
• Which funds to include (state, federal, local funds, in-kind contributions)
• Unit of analysis:
  - Expenditure per student
  - Expenditure per outcome
  - Expenditure per instructional hour
  - Expenditure per student instructional hour
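Each unit of analysis above is total expenditure divided by a different denominator. A minimal worked example, with all dollar figures and counts invented:

```python
# Hypothetical annual figures for one local program.
total_expenditure = 250_000            # state + federal + local + in-kind, per state policy
students_served = 400
outcomes_achieved = 180                # e.g., level completions plus credentials
instructional_hours_offered = 2_500
student_instructional_hours = 30_000   # sum of hours attended across all students

print(f"Expenditure per student: ${total_expenditure / students_served:,.2f}")
print(f"Expenditure per outcome: ${total_expenditure / outcomes_achieved:,.2f}")
print(f"Expenditure per instructional hour: ${total_expenditure / instructional_hours_offered:,.2f}")
print(f"Expenditure per student instructional hour: ${total_expenditure / student_instructional_hours:,.2f}")
```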

  42. Determining Measures for your Desk Monitoring Tool
Refer to H-4a–d

  43. Determining Measures for your Desk Monitoring Tool
• State teams report on their measures in small groups;
• Two other states in the small group ask clarifying questions and make suggestions/recommendations;
• Process observers report what they observed and conduct a small group discussion;
• The reporter shares with the larger group the similarities and differences among the teams, the types of clarifying questions asked, and whether the questioning process caused any team to re-think or refine the measures it selected.
Refer to H-5

  44. Reflection on Day 1 Activities and Learning
• Questions?
• Individual Reflection
• Pluses and Deltas
Refer to H-6

  45. NRS Desk Monitoring Tool
• Excel-based template
• Includes 25 measures of student outcomes, data collection procedures, and program processes;
• States select the measures most relevant to their needs and design the report to include trends, comparison data, or performance targets to evaluate local performance.
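A state that wants to work with an Excel-based template programmatically could read it into an analysis script. A minimal sketch with pandas follows; the file name, sheet name, and column headings (program_name, educational_gain_rate, performance_target) are assumptions for illustration, not the actual NRS template layout.

```python
import pandas as pd

# Hypothetical: load one worksheet of an Excel-based monitoring template.
# The file name, sheet name, and column names below are invented for this example.
df = pd.read_excel("desk_monitoring_template.xlsx", sheet_name="Outcomes")

# Suppose each row is a local program with a measure value and a performance target.
df["met_target"] = df["educational_gain_rate"] >= df["performance_target"]

# Flag programs falling short of the target for follow-up or technical assistance.
needs_followup = df.loc[~df["met_target"], ["program_name", "educational_gain_rate", "performance_target"]]
print(needs_followup.to_string(index=False))
```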

  46. Performance Standards and Performance Evaluation
Congratulations! You've identified the measures for your monitoring tool. But you're not finished… Now you need to define an acceptable level of performance.

  47. Performance Standards and Performance Evaluation
The measures a state selects for its desk monitoring tool
• Define what the state believes is important about program quality and performance;
• Reflect state policy; they are the basis by which the state judges and funds adult literacy services.
Remember: Standard setting is a powerful method for changing behavior to implement state policy.

  48. Three Models of Performance Standard-setting
• Continuous Improvement: Trends over Time
• Relative Ranking: Comparisons with Mean Performance
• External Criteria: Uniform Standards

  49. Performance Standard-setting Models
Example (Continuous Improvement): The program will show a 10% increase from last year in the number of students advancing to low-intermediate ESL.

  50. Performance Standard-setting Models (Cont.)
Example (Relative Ranking): The percentage of students passing the GED in the program will be equal to or greater than the state average.
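Both example standards can be checked mechanically once the current-year, prior-year, and statewide figures are in hand. A minimal sketch, with all numbers invented:

```python
# Continuous improvement model: at least a 10% increase over last year
# in students advancing to low-intermediate ESL.
advanced_last_year = 120
advanced_this_year = 135
meets_trend_standard = advanced_this_year >= 1.10 * advanced_last_year
print("Continuous improvement standard met:", meets_trend_standard)  # 135 >= 132 -> True

# Relative ranking model: program GED pass rate at or above the state average.
program_pass_rate = 0.58
state_average_pass_rate = 0.61
meets_relative_standard = program_pass_rate >= state_average_pass_rate
print("Relative ranking standard met:", meets_relative_standard)  # 0.58 >= 0.61 -> False
```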
