
Biennial Report


Presentation Transcript


  1. Biennial Report Technical Assistance Meeting December 8, 2009

  2. 3 Major Activities of the system and their different roles
  • Program Assessment – Is the program in alignment with the standards?
  • Biennial Reports – Is the program effective in developing qualified educators, and does the program use data to drive its program improvement efforts?
  • Site Visit – Are the Common Standards and Program Standards implemented in an integrated, effective manner?

  3. 7-Year Accreditation Cycle

  4. Accreditation System
  • Program Assessment – Are programs aligned with standards?
  • Biennial Reports – Are programs effective in preparing competent educators?
  • Site Visit – Are Common Standards and Program Standards implemented in an integrated, effective manner?

  5. Uses for Biennial Reports
  • Critical part of the accreditation cycle
  • Key piece of evidence that an institution is responsive to Common Standards 2 and 9
  • Used by review teams during Program Assessment and Site Visits
  • Biennial Reports, Program Assessment, and Site Visits together give a more comprehensive picture of a program sponsor over time

  6. 2009 – Where are we?
  • Just completed first year of full implementation of biennial reports in the accreditation system
  • All programs from 3 cohorts submitted biennial reports
  • Approximately 47 institutions
  • Over 260 programs
  • Site visits that took place in Spring 2009 included biennial reports as part of their evidence

  7. Relationship to the Standards
  Common Standard 2 – Unit and Program Evaluation System
  The education unit implements an assessment system for ongoing program and unit evaluation and improvement. The system collects, analyzes, and utilizes data on candidate and program completer performance and unit operations. (continued on next slide)

  8. Common Standard 2 (continued)
  Assessments in all programs include ongoing and comprehensive data collection related to candidate qualifications, proficiencies, competence, and program effectiveness. Data are analyzed to identify patterns and trends that serve as the basis for programmatic and unit decision-making.

  9. Common Standard 9 • Focuses on Candidate Competencies at the Program Level • Hold that thought…

  10. Biennial Report
  Two sections to the report:
  • Section A – Submitted by each program. Current program context, recent changes, enrollment/completion data. Includes data from each approved program.
  • Section B – Submitted by the designated director of educator preparation programs. Overall trends and the institution's action plan.

  11. Biennial Report, Section A
  Purpose: Snapshot of each program's processes for utilizing data to increase program effectiveness
  Part I. Contextual Information/Changes
  Part II. Assessments of Candidates and Completers
  Part III. Analyses of Data
  Part IV. Proposed Program Changes

  12. Section A – Program-Specific Information
  Part I. Contextual Information
  General information to help reviewers understand the program, the context in which it operates, and what has changed significantly since the Commission approved the current program document. (1 page)

  13. Context – Number of candidates and completers

  14. Context (continued)
  • When candidates begin/end the program
  • Cohort model
  • Program features
  • Internships
  • Serves inner-city schools
  • Bilingual program
  • Changes since the last site visit/approval

  15. Section A, Part II – Candidate Assessment/Program Effectiveness
  The program describes the assessment procedures and instruments it uses to ensure that candidates have the requisite competencies and that the program is effectively meeting its candidates' academic and professional growth needs. (≤ 10 pages)

  16. Part II. Candidate Assessment
  An overarching chart can be helpful.

  17. Overarching chart (continued)

  18. Provide Aggregated Data
  • Provide actual aggregated data for 4-6 key assessments
  • Data should reflect the last two academic years
  • For those submitting in fall 2009, that would be 07-08 and 08-09
  • For those submitting in fall 2010, that would be 08-09 and 09-10
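
A minimal sketch of the kind of aggregation slide 18 calls for (actual aggregated data for 4-6 key assessments, covering the last two academic years). The records, assessment names, and 4-point rubric values below are invented for illustration and are not CTC data or a required reporting format.

```python
# Illustrative sketch only: aggregating invented key-assessment scores
# by academic year, so each assessment can be reported for the two most
# recent years (e.g., 07-08 and 08-09 for a fall 2009 submission).
from collections import defaultdict
from statistics import mean

# (academic_year, key_assessment, score on a hypothetical 4-point rubric)
records = [
    ("2007-08", "TPA Task 1", 3), ("2007-08", "TPA Task 1", 4),
    ("2008-09", "TPA Task 1", 4), ("2008-09", "TPA Task 1", 3),
    ("2007-08", "Exit Clinical Practice Evaluation", 2),
    ("2008-09", "Exit Clinical Practice Evaluation", 4),
]

grouped = defaultdict(list)
for year, assessment, score in records:
    grouped[(assessment, year)].append(score)

# Report candidate counts and mean scores per assessment, per year.
for (assessment, year), scores in sorted(grouped.items()):
    print(f"{assessment} ({year}): n={len(scores)}, mean={mean(scores):.2f}")
```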

  19. Examples of Candidate Data
  Evidence of candidate and completer competence through coursework and practicum:
  • TPA – for MS/SS programs
  • Key assignments in coursework, observations during fieldwork, practicum, or clinical practice
  • Demonstrations/presentations prior to being recommended for a credential
  • Portfolios
  • Others

  20. Examples of Program Effectiveness Data
  Evidence of program effectiveness for completers, employers, community:
  • Completer and graduate surveys
  • Employer surveys/feedback
  • Retention rate in employment
  • Placement rates

  21. Possible Key Assessments for Induction Programs

  22. Reporting the Information
  • Describe the type of data being collected (e.g., TPA, employer data)
  • Identify instrument(s) used to gather data
  • Describe the process of collecting data
  • Include descriptive statistics such as the range, mean, median, mode, or percent in each category
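
To make the last bullet concrete, here is a minimal sketch of the descriptive statistics named on the slide (range, mean, median, mode, and percent in each category). The 4-point rubric scores are invented for illustration only, not data from the presentation.

```python
# Illustrative sketch only: descriptive statistics for one key assessment,
# computed over invented 4-point rubric ratings.
from collections import Counter
from statistics import mean, median, mode

scores = [4, 3, 3, 2, 4, 4, 3, 1, 3, 3]  # hypothetical rubric ratings (1-4)

print("n     :", len(scores))
print("range :", f"{min(scores)}-{max(scores)}")
print("mean  :", round(mean(scores), 2))
print("median:", median(scores))
print("mode  :", mode(scores))

# Percent of candidates at each rubric level
counts = Counter(scores)
for level in sorted(counts):
    print(f"level {level}: {100 * counts[level] / len(scores):.0f}%")
```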

  23. Not all Data is Equal! There is data, good data, and better data for the purposes of program improvement.

  24. Examples of Data
  Data: "100% of candidates successfully complete Ed 235" or "The average grade for all candidates who took Ed 235 in Fall 2008 is 3.45."
  What does either of these examples tell you about program quality or effectiveness?

  25. Example of Better Data
  What are the competencies covered by Ed 235? What key assignments, projects, fieldwork components, etc. are required?
  Examples of possible data: data from a common rubric/scoring criteria, tied to explicit standards, competencies, TPEs, or a TPA task

  26. Another Example of Good Data
  Student Teaching Final Evaluation – Exit Clinical Practice
  Completed by Master Teacher and University Supervisor

  27. Best Data
  Data from TPE observations during Ed 235 (scored with a 4-point rubric) is compared to post-program information such as the following: employer survey data verifies that first-year teachers from the X program are effective in teaching…
  What do these data sources tell you about program effectiveness?

  28. Lessons Learned – Data
  Best Biennial Reports:
  • Include data at a level that can be tied to candidate competencies outlined in the standards
  • Include BOTH candidate assessments and program/post-program feedback information (employer surveys, completer surveys)
  • Present data in a way that allows the reader to compare candidate and program performance relative to the standards

  29. Section A, Part III – Analysis of Data
  What does the data indicate about how well the program is performing in terms of developing candidates' competencies?

  30. Analysis of Data
  The program uses the results of data analyses to identify:
  • How well candidates are performing
  • Areas where candidates are not performing as expected
  • How well completers are performing
  • Areas in which completers feel unprepared
  • How the program is perceived by employers
  • Strengths of the program and areas for growth

  31. Analysis of Data Programs may take the data as a whole and reach some conclusions, or may analyze each data source separately. If the latter, any areas of conflict in the data should be addressed. Do not overlook areas where the data indicate some improvements are necessary.

  32. Part IV. Use of Assessment Results for Program Improvement
  The program describes how it will use the results of the data analyses to build on identified strengths and address areas in need of growth/improvement.

  33. Use of Assessments for Program Improvement
  • What changes have been or will be made to the program?
  • What data will the program continue to watch over time?
  • Is there a need to improve the assessment tools themselves?
  • Make sure this section is linked to the data and analysis.

  34. Program Improvements
  A chart or table may be useful here, too.

  35. Biennial Report, Section B – Institutional Summary/Action Plan
  Purpose: Snapshot of the institution's processes for utilizing data to increase program effectiveness
  The institution will review reports from each program and identify trends, institutional strengths, and areas needing growth that occur across programs, and describe a plan of action to improve the performance of all programs.

  36. Section B – Institution
  • Summary is submitted by the unit leader: Dean, Director of Education, Superintendent, or Head of the Governing Board of the Program Sponsor.
  • Summary identifies:
    – Trends observed across programs
    – Areas of strength
    – Areas for improvement
    – Next steps or a plan of action
  (1 page)

  37. Page Parameters • Section A for each program should be approximately 10 pages. • Section B should be approximately 1 page.

  38. When are they due?
  • Biennial Reports are due immediately following years 1, 3, and 5 of the cohort cycle
  • Due to the Commission on August 15, October 15, or December 15
  • They are due FOLLOWING the second academic year during which the data is collected

  39. How are they submitted? Electronically – via E-mail to BiennialReports@ctc.ca.gov

  40. How are Biennial Reports Reviewed?
  Several levels of review:
  1) Staff review
  2) Program Assessment Reviewers
  3) Site Visit Teams

  41. Staff Review
  • CTC staff will review the reports and, if necessary, seek additional information.
  • Feedback will be provided to program sponsors in a timely manner (6-8 weeks).
  • A summary of the information from the Biennial Reports will be shared with the Committee on Accreditation.

  42. Staff Response Form

  43. Examples of Issues Identified
  • No data is submitted; only the assessment process is discussed
  • No candidate data is currently collected, analyzed, or utilized by the institution, but there is a plan to do so in the future
  • Links between the data, analysis, and program modifications are hard to see

  44. Examples of Issues (continued)
  • Data is reported at a level that is difficult to link to the candidate competencies explicit in the standards
  • Data includes only post-program effectiveness data or candidate data, not both
  • Areas of apparent weakness are not addressed at all in the analysis or program modifications

  45. After the Staff Review
  • In the 4th year of the cycle, Biennial Reports are provided to Program Assessment Reviewers (evidence for candidate assessment/competency standards)
  • In the 6th year of the cycle, Biennial Reports are provided to the Site Visit Team (evidence for Common Standards 2 and 9)

  46. Resources
  • The Biennial Report template and more can be found at http://www.ctc.ca.gov/educator-prep/program-accred-biennial-reports.html under Biennial Report information
  • Cheryl Hickey, chickey@ctc.ca.gov
  • Rebecca Parker, rparker@ctc.ca.gov

  47. Questions?
