
Institutional Indicators & Benchmarking



Presentation Transcript


  1. Institutional Indicators & Benchmarking Presentation to AQIP Quality Check-up Team March 2008

  2. Where have we been? The Impetus • Ohio Partnership for Excellence • First completed in 2001-02 • Provides external review • Served as the point of entrance to AQIP • Impetus for 3 AQIP Action Projects, including Institutional Indicators of Effectiveness • Subsequent OPE, Baldrige, and AQIP processes provided the impetus for the refinement and benchmarking phases of this process

  3. Purpose of Indicators Process • To develop an institutional effectiveness model including mission priorities, indicators, benchmarks/targets, and data sources • To identify and organize key data and information to measure institutional progress against the mission

  4. Framework • Broad Components—Mission Areas/Emphasis, Critical Success Factors, Key Success Factors • Indicators—Indicators, Key Performance Indicators, Core Indicators • Measures—Measures, Performance Standards • Targets—Targets, Goals, Benchmarks
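A minimal sketch of how one record in this framework could be represented in code, assuming a simple "measures roll up into indicators, indicators belong to a mission area" structure; all class names, field names, and the example values are hypothetical, not the College's actual data model:

```python
from dataclasses import dataclass, field
from typing import Optional

@dataclass
class Measure:
    """One way of quantifying an indicator (survey item, rate, count, etc.)."""
    name: str
    data_source: str              # e.g., "IPEDS" or "graduate tracking survey"
    target: float                 # goal/benchmark value (higher assumed better)
    current_value: Optional[float] = None

@dataclass
class Indicator:
    """A key performance indicator tied to one mission area."""
    name: str
    mission_area: str             # "Education", "Economy", "Community", "Culture"
    measures: list[Measure] = field(default_factory=list)

    def on_target(self) -> bool:
        """True if every measure with data meets or exceeds its target."""
        known = [m for m in self.measures if m.current_value is not None]
        return all(m.current_value >= m.target for m in known)

# Hypothetical example for the Education mission area
placement = Measure("Percent of entrants placing into college-level math",
                    data_source="placement testing", target=60.0,
                    current_value=52.5)
indicator = Indicator("College readiness of incoming students",
                      "Education", [placement])
print(indicator.on_target())  # False: 52.5 < 60.0
```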

  5. Initial Changes/Refinements • Indicator Process • Initially, four frameworks, one for each group: • District Board of Trustees (DBT) • Administrative Leadership Team (ALT) • Faculty • Students • What we learned: • Four frameworks created alignment challenges • Collapsed into one document • DBT selects indicators, and Institutional Effectiveness and Planning (IEP) works with the appropriate committees and/or organizational units to develop and implement actions, then reports back to the President and DBT

  6. Indicator Timeline • July: Report on progress; identify key indicators to monitor; develop action plans for continuous improvement; update data and select/remove indicators • August/December: Review and discuss indicators of effectiveness; monitor progress of key indicators; report on progress • January/June: Monitor progress of key indicators

  7. Performance Grid

  8. Number to College • The graph shows the total number of graduates from each district high school, along with the proportions who attended LCCC, who attended other Ohio colleges, and who did not enroll in postsecondary education after graduation • 5 of the 6 high schools with the largest graduating classes send a higher proportion of their college-bound students to LCCC than to other Ohio colleges • Half of these are also among the schools with the lowest proportion of first-year college students; in fact, Admiral King, Southview, and Elyria rank 3rd, 4th, and 5th from the bottom, respectively, in the proportion of graduates who attend college after graduation • North Ridgeville, Firelands, Keystone, and Clearview also send more students to LCCC than to other Ohio institutions
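The proportions behind these statements are simple ratios; here is a sketch of how they might be tallied, using made-up school names and counts rather than the actual district data behind the chart:

```python
# Hypothetical counts per high school (the real figures live in the chart's
# underlying district data and are not reproduced here).
graduates = {
    # school: (to LCCC, to other Ohio colleges, no postsecondary enrollment)
    "Example High A": (120, 90, 210),
    "Example High B": (60, 85, 95),
}

for school, (lccc, other, none) in graduates.items():
    total = lccc + other + none
    college_bound = lccc + other
    print(f"{school}: {college_bound / total:.0%} enrolled in college; "
          f"{lccc / college_bound:.0%} of the college-bound chose LCCC")
```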

  9. Recent High School Graduates: Action Highlights/Examples • Develop a committee that meets regularly to coordinate and align recruitment efforts • Target 4-6 high schools to increase the number of recent high school graduates who enroll directly at LCCC • KnowHow2Go—a campaign focused on preparing and educating 7th through 10th graders (aligned with the Lorain P-16 Council)

  10. Remediation Rates: Action Highlights/Examples • DBT Community Connection Session with superintendents and school board members • Focus on incoming students • Implementing the Ohio Core Initiative (February 22, 2007) • Planning sessions with the Academic Foundations Division (Fall 2006) • Focus on currently enrolled students • Development of rubrics for grading all courses in all areas (MTHM, ESLG, ENGL, RDST) of the Academic Foundations Division • Determine the viability of distance education for Academic Foundations course offerings

  11. Graduate Tracking Survey: Action Highlights/Examples • Convened a group of Ohio AQIP institutions to begin discussions on developing a common graduate tracking survey to collect comparative data and information • Submit a plan to the Ohio Board of Regents (OBOR) and seek funding to support the endeavor—advocacy

  12. Indicators of Effectiveness: The Next Evolution

  13. What did we learn? • External feedback—AQIP, Baldrige, and OPE reports recommended alignment with Vision 2015 and better cohort comparison groups (benchmarking) • Internal feedback—suggested reducing and revising the indicators from 36 to about 12

  14. Measuring Institutional Effectiveness • Indicators of Effectiveness: There were 36 indicators for the following three areas: Promote Education, Stimulate Community Development, and Enhance Institutional Effectiveness • Charge: Revise the framework around the four cornerstones of the new mission (Education, Economy, Community, Culture); reduce the number of indicators to 12 • Goals: To develop one document with about 18 indicators that reflect both the indicators of effectiveness and the Vision • Vision 2015 Score Card: Consists of 6 priorities with 32 initiatives; create a "scorecard" to measure and monitor the short- and long-term progress of the strategic vision • Feedback: The College would select 12 indicators of success for Vision 2015; the suggestion would be to select 2 indicators (major outcomes) for each of the six strategic priorities
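One way to picture the proposed scorecard shape (6 strategic priorities, 2 major-outcome indicators apiece, per the feedback above) is as a simple nested mapping; the priority and indicator names below are placeholders, since the actual selections were still to be made by the College:

```python
# Hypothetical skeleton of the proposed Vision 2015 scorecard:
# 6 strategic priorities, 2 indicators apiece, 12 indicators in total.
scorecard: dict[str, list[str]] = {
    f"Strategic Priority {i}": [f"Indicator {i}.1", f"Indicator {i}.2"]
    for i in range(1, 7)
}

assert len(scorecard) == 6
assert sum(len(inds) for inds in scorecard.values()) == 12
```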

  15. Education

  16. AQIP Benchmarking Action Project

  17. Purpose • To enhance and sustain an institutional culture that uses a defined benchmarking process to systematically compare LCCC against other colleges, universities, and organizations • To address recommendations from various external feedback reports • To enhance the comparison groups for various projects such as the indicators of effectiveness

  18. Selecting a Cohort Group to Benchmark Against (Reference: McCormick & Cox, 2003) • Do you want to select a cohort group by institutional type? If yes, use Institutional Type (institutional characteristics) • If no: Are there curriculum-based characteristics to identify cohort groups? If yes, use Curriculum-Based Cohorts (curriculum characteristics) • If no: Is there a specified default group? If yes, use the Default Group (institutional characteristics) • At each step, review the key questions for each group and characteristic area, and review the data questions
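The decision flow reduces to three yes/no questions asked in order; a minimal sketch of that logic, with function and parameter names of our own invention (not terminology from McCormick & Cox):

```python
def select_cohort_group(by_institutional_type: bool,
                        curriculum_characteristics: bool,
                        specified_default_group: bool) -> str:
    """Answer the three yes/no questions in order and return the first
    cohort-group strategy whose question is answered 'yes'."""
    if by_institutional_type:
        return "Institutional Type (review institutional characteristics)"
    if curriculum_characteristics:
        return "Curriculum-Based Cohorts (review curriculum characteristics)"
    if specified_default_group:
        return "Default Group (review institutional characteristics)"
    return "No group selected: revisit the key questions and data questions"

# Example: no institutional-type preference, but curriculum traits exist
print(select_cohort_group(False, True, False))
```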

  19. Application and Usage • Key Institutional Processes • Institutional Indicators • Academic program and cluster review • Operational systems review • Institutional Effectiveness and Planning • A guiding protocol for related work

  20. Next Steps • February - March—Make any revisions or adjustments to the proposed institutional indicators • March—Present revised framework to the District Board of Trustees • July—Present framework with the data publication along with updates on current projects and recommendations for any new indicators that might need attention or monitoring
