
TSPC MEETING, JULY 20-22, 2011








  1. TSPC MEETING, JULY 20-22, 2011 ACCREDITATION SITE VISITS

  2. HISTORY OF SITE VISITS • DIVISION 010 – SITE VISIT PROCESS • DIVISION 017 – UNIT STANDARDS • DIVISION 065 – CONTENT STANDARDS

  3. HISTORY OF SITE VISITS (cont.) • Teams were selected from higher education peers and K-12 educators. • Institutions presented evidence at the TSPC office for review. • Teams reviewed the evidence and visited the institution. • Teams evaluated evidence against the standards. • The purpose of the site visit was to determine compliance with standards.

  4. HISTORY OF SITE VISITS (cont.) • Programs were approved by the commission and reapproved as part of the unit site visit. • Teams made recommendations based on site visit findings. • Criticisms of the process: • The process was subjective • Evaluations were inconsistent • The culture of evidence was undefined • There was no program review process

  5. Proposed Changes • Shift the purpose of the process from compliance to continuous improvement • Change the definition of a culture of evidence • Define the required assessment systems • Define the required categories of data for demonstrating candidate competencies • Define processes for using data for program improvement

  6. Proposed Changes • Create a rigorous program review process as part of accreditation process. • Emphasis on assessments, rubrics and scoring guides • Emphasis on quality of data for purposes of continuous improvement • Use of data in continuous improvement process

  7. Unit Site Visits • Key standards for accreditation • Candidate competencies evidenced by data • Assessment systems • Field experiences • Cultural competency/Diversity and inclusion • Faculty Qualifications • Unit Resources and Governance

  8. Unit Site Visits (cont.) • Site teams use rubrics to determine whether standards are met • This allows a unit to meet standards while the team still identifies Areas for Improvement (AFIs)

  9. Program Review Process • A new process in accreditation. • Evidence is used to demonstrate the validity of candidate competency data during the unit site visit. • The program review process is virtual in nature, based on electronic exhibits. • Program reviews are conducted six months prior to unit site visits.

  10. Program Review Process (cont.) • The commission has adopted a template for the program review process associated with site visits, major program modifications and new endorsement programs • The intent is to provide clear directions on the requirements for program review, addition and modification. Electronic submission of materials is required for easier review by commissioners and site team members.

  11. Program Review Process (cont.) • PRINCIPLES TO FOLLOW FOR DATA COLLECTION • Candidates' ability to impact student learning • Knowledge of content • Knowledge of content pedagogy • Pedagogy and professional knowledge • Dispositions as defined by state standards or the unit’s conceptual framework • Technology

  12. Program Review Process (cont.) • The following rubric will be used when considering whether the program meets state standards. • Acceptable: The program is aligned to the state program standards. Assessments address the range of knowledge, skills, and dispositions stated in the standard or by the unit. Assessments are consistent with the complexity, cognitive demands, and skills required by the standards they are designed to measure. The assessments measure what they purport to measure. The assessments are clearly defined. The assessments and scoring guides are free of bias.

  13. Program Review Process (cont.) • Assessment instruments provide candidates and supervisors with guidance as to what is being sought. Assessments and scoring guides allow levels of candidate proficiency to be determined. The assessments address candidate content knowledge, content pedagogy, pedagogy and professional knowledge, student learning, and dispositions. Field experiences meet the requirements of the standards. There is evidence that data has been summarized and analyzed. The data has been presented to the consortium. Syllabi clearly align with and address the program standards.

  14. Program Review Process (cont.) • AFI Example: Key assessments do not provide candidates or supervisors with substantive guidance as to what is being sought. • Rationale: Scoring guides use single words (e.g., unacceptable, emerging, proficient, or exemplary) and are left open to broad interpretation.

  15. Program Review Process (cont.) • AFI Example: Instruments and scoring guides do not allow levels of candidate proficiency to be determined. • Rationale: Data demonstrates little or no distribution of candidates across the scoring guide scale; all candidates receive predominantly the same score.

  16. Program Review Process (cont.) • State Program Review Results Report: • The State Program Review Results Report is the document submitted by the program review site team to the Commission for review at the meeting prior to the submission of the unit’s Institutional Report.

  17. Program Review Process (cont.) • The program review site team will make recommendations to the Commission as to whether it should extend full state recognition of the program(s), recognition with conditions, or denial of the program’s recognition. (See Division 010 for the levels of program review recognition.)

  18. Program Review Process (cont.) • Small group activity: • Question: Does the acceptable level in the rubric clearly define expectations for program review and approval? • Question: Should teams review syllabi against program standards? • Question: Should teams evaluate assessments and data for quality?

  19. Program Review Process (cont.) • Small Group (cont.) • Question: Should programs provide evidence of consortium review of data? • Question: National standards require 3 years of data. What should Oregon’s standard be? • Question: At what point should conditions be imposed? At what point should recognition be denied?
