CAPEA Presentation: Biennial Reports
Hilton Arden West, Sacramento
March 2, 2012

Note-taking Guide for Today's Presentation

Presentation Transcript


  1. CAPEA Presentation: Biennial Reports Hilton Arden West, Sacramento, March 2, 2012

  2. Note-taking Guide for Today’s Presentation

  3. I see the solution to each problem as being detectable in the pattern and web of the whole. The connections between the causes and effects are often much more subtle and complex than we…may suppose. --Douglas Adams, author

  4. Connectedness to the Whole
  From the day we arrive on the planet
  And blinking, step into the sun
  There's more to see than can ever be seen
  More to do than can ever be done
  There's far too much to take in here
  More to find than can ever be found
  But the sun rolling high
  Through the sapphire sky
  Keeps great and small on the endless round
  It's the Circle of Life
  And it moves us all
  Through despair and hope
  Through faith and love
  Till we find our place
  On the path unwinding
  In the Circle
  The Circle of Life

  5. Purpose • Provide concrete, discrete steps to enable programs to meet the requirements of the Accreditation System • Develop a program-specific Data Collection Plan using the template provided

  6. Data Collection

  7. What does Accreditation require? • Common Standard 2: Unit and Program Assessment and Evaluation • The education unit implements an assessment and evaluation system for ongoing program and unit evaluation and improvement. The system collects, analyzes, and utilizes data on candidate and program completer performance and unit operations. Assessment in all programs includes ongoing and comprehensive data collection related to candidate qualifications, proficiencies, and competence, as well as program effectiveness, and is used for improvement purposes.

  8. What does Accreditation require? • Common Standard 9: Assessment of Candidate Competence Candidates preparing to serve as professional school personnel know and demonstrate the professional knowledge and skills necessary to educate and support effectively all students in meeting the state-adopted academic standards. Assessments indicate that candidates meet the Commission-adopted competency requirements, as specified in the program standards.

  9. Read and Highlight • Accreditation Handbook, Appendix A, page 5 • “Directions for Completing the Biennial Report” section • Key terms to address • program effectiveness • candidate competence

  10. Standards-based Growth Model • Go deep! • Go beyond checklists of completion to impact on student achievement (Checklist → Impact of candidate work)

  11. 17 Characteristics of a Highly Effective Evaluation System (Sinclair Research) • Linked to Common or Program Standards • Formative and summative • Balanced between quantitative and qualitative approaches • Ongoing – continuous improvement • Manageable, "do-able" – simple – doesn't overwhelm • Contains a timeline

  12. 17 Characteristics of a Highly Effective Evaluation System (Sinclair Research) • Timely • Systematized • Comprehensive – examines effectiveness in all areas • Analyzes and uses data • Analysis tools appropriate to data • Analyzes trends and patterns over time • Triangulates data – includes all stakeholders (multiple perspectives)

  13. 17 Characteristics of a Highly Effective Evaluation System • Fully implemented • Clear questions appropriate to role group - kept simple • Questions reflect a continual drilling down of questions (see Thomas Guskey’s work) • Process appropriate to role group • Reports results back to stakeholders Sinclair Research

  14. Table Talk • Where is your program presently with regard to having a system of evaluation? • What sources of information did you use in your last Biennial Report?

  15. Section A, Part III: Analysis of Data Accreditation Handbook, Appendix A, page 5: “What does the analysis of the data demonstrate about a) candidate competence and b) program effectiveness?” Asked another way: What does the data indicate about how well the program is performing in terms of developing candidates’ competencies?

  16. Please locate your one-page handout of the Biennial Report cycle

  17. Data Identification • What data does your program currently use to prepare a discussion of candidate competence for Biennial Reports? • What data does your program currently use to prepare a discussion of program effectiveness for Biennial Reports? • Share ideas for possible assessment tools with your table group.

  18. Data Analysis

  19. CTC reminds us, “Not all Data is Equal!” There is data, good data, and better data for the purposes of program improvement. CCAC 2009 presentation on biennial reports discussed:

  20. Examples of Data • 100% of candidates successfully completed Ed 235, or • The average grade for all candidates who took Ed 235 in Fall 2008 is 3.45. Does either of these examples provide information about the candidate competencies that were addressed, or about program quality/effectiveness?

  21. Example of Better Data • What are the competencies covered by ED 235? • What key assignments, projects, fieldwork components, etc. are required? • Examples of possible data: data from a common rubric/scoring criteria, tied to explicit standards, competencies, TPE, or TPA task

  22. Another Example of Good Data Student Teaching Final Evaluation – Exit Clinical Practice, completed by Master Teacher and University Supervisor

  23. Best Data Data from TPE observations during ED 235 (scored with a 4-pt. rubric) is compared to post-program information such as the following: • Employer Survey data verifies that first-year teachers from the X program are effective in leadership positions…. • What do these data sources tell you about program effectiveness?

  24. Lessons Learned - Data Best Biennial Reports • Include data at a level that can be tied to candidate competencies outlined in the standards • Include BOTH candidate assessments and program/post-program feedback information (employer surveys, completer surveys) • Present data in a way that allows the reader to compare candidate and program performance relative to the standards.

  25. Please locate your one-page handout of the Biennial Report cycle

  26. Gap Analysis: Table Talk • What data is currently being collected from a variety of sources? • What data should be collected from sources that do not currently exist? • Generate a list of possible sources that are feasible given your program’s context • Make sure you cover both candidate competence and program effectiveness

  27. Addressing the Gap: Back at the Ranch • What questions could be crafted for a survey? For a focus group? Modifications to an existing log or report? • What time adjustments might need to be made to accomplish newly crafted items/groups/surveys?

  28. Graphic of Data Collection Cycle

  29. Triangulating Data • Data analysis tends to be like Matryoshka dolls: you think about a result and wish for more data to help you understand it. • Triangulation of data from different types of evidence and knowledgeable groups is best. • Triangulation of data is also time-consuming and/or expensive, so it needs to be used strategically.

  30. A Review of the Biennial Reports • Data is only as good as what is reported • Garbage in, garbage out • Consistency across reports • PASC data for three years studied:

  31. A Snapshot of Biennial Data

  32. Biennial Data, continued

  33. Biennial Data Strengths • Most of us are using observations from fieldwork as a data point (candidate competency) • Most of us are using exit interviews (program effectiveness) • Many of us are using portfolios (candidate competency and program effectiveness)

  34. Biennial Data: Areas for Improvement • The term rubric is used loosely; many “rubrics” are in actuality rating scales. • Most assessments do not account for inter-rater reliability or validity. • We haven't done a good job of tying our decision-making and analysis to the data. Decisions seem to be made without much thought to what the data says.

  35. Deconstructing Biennial Reports

  36. Deconstructing Biennial Reports

  37. Deconstructing Biennial Reports

  38. Deconstructing Biennial Reports

  39. Deconstructing Biennial Reports • Additional sources of data: • Portfolio defense based on growth towards CPSELS • Aggregated exit survey • Aggregated alumni survey (two years out) • Aggregated employer survey (two years out) • University supervisor evaluation • Evaluations of site supervisor from the candidate • Advisory Committee minutes (qualitative analysis) • Retention data

  40. Deconstructing Biennial Reports CSU Channel Islands Lessons learned

  41. Deconstructing Biennial Reports • At your table, review the Educational Leadership Biennial Report • What is strong about the report? • How could the report be made stronger?

  42. Best practices… Would you like a follow-up session at the October retreat?

  43. Synectics • Collecting program data is like a thermometer because _____________

  44. Questions? • Administrator of Accreditation • Cheryl Hickey • Chickey@ctc.ca.gov • CTC Consultant (Biennial Reports and Administrative Services Credentials) • Gay Roby • Groby@ctc.ca.gov or gayroby@mac.com

  45. Collegial Contacts • Deborah Erickson • deericks@callutheran.edu • Gary Kinsey • gary.kinsey@csuci.edu
