Using The Collegiate Learning Assessment: Impressions and Advice

Presentation Transcript


  1. Using The Collegiate Learning Assessment: Impressions and Advice Presentation to the Association of Institutional Research and Planning Officers, Buffalo, New York, June 11, 2009

  2. Presenters Patricia Francis Associate Provost for Institutional Assessment and Effectiveness SUNY Oneonta Rosalyn Lindner Associate Vice President Buffalo State College

  3. Session Topics • CLA Structure, Scoring, and Reporting • Mapping the CLA to SUNY GE Outcomes • Advantages and Challenges in Using the CLA • Institutional Strategies for Overcoming Challenges

  4. Describing the CLA: Structure, Scoring, and Reporting

  5. What Does the CLA Measure? • Key Higher Order Skills • Critical thinking • Analytical reasoning • Problem solving • Written communication • Types of Tasks • Performance Task • Analytical Writing • Make-an-Argument • Critique-an-Argument

  6. Administration Details • CLA Administered on Computers in a 60-90 Minute Session • Two Approaches • Cross-sectional, comparing 100 first-semester students and 100 second-semester seniors in same academic year • Longitudinal, comparing same students as freshmen (n=300) and seniors

  7. Administration Details (cont.) • The CLA is Not a Multiple Choice Test • Students Respond to One of Three Prompts • Analyze Complex, Realistic Scenarios • Write Persuasive, Analytic Essay • Critique Written Arguments

  8. What Do Students See? • Split Screen • Left side: Directions, questions, and response box • Right side: Document library with pull-down menu

  9. Examples • Dyna Tech • “Truth” in Media • Child Obesity Study

  10. CLA Scores • Unadjusted Performance Score for Absolute Comparisons • Deviation Score for Controlled Comparisons

  11. Expected Score • Uses SAT/ACT Scores for Measures of Academic Ability of Students Prior to Matriculation (Entering Academic Ability, or EAA, Score) • Estimates Linear Relationship between CLA Scores and EAA • Reports in Standard Error Units
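
  The following Python sketch illustrates the general idea behind slides 10 and 11: estimate the linear relationship between CLA scores and EAA across a reference group, compute a campus's expected score from its students' incoming ability, and express the gap between actual and expected performance in standard error units. This is a simplified illustration under assumptions, not the CAE's actual scoring model; the function names, reference-sample inputs, and numbers are hypothetical.

  ```python
  # Minimal sketch of an expected-score / deviation-score calculation.
  # NOT the CAE's proprietary model; inputs and names are hypothetical.
  import numpy as np

  def fit_reference_line(ref_cla_means, ref_eaa_means):
      """Estimate the linear relationship between institution-level mean CLA
      scores and mean EAA (SAT/ACT-derived) scores across a reference group."""
      slope, intercept = np.polyfit(ref_eaa_means, ref_cla_means, deg=1)
      fitted = intercept + slope * np.asarray(ref_eaa_means, dtype=float)
      residuals = np.asarray(ref_cla_means, dtype=float) - fitted
      se = residuals.std(ddof=2)  # residual standard error of the simple regression
      return slope, intercept, se

  def deviation_score(campus_cla_mean, campus_eaa_mean, slope, intercept, se):
      """Gap between a campus's actual and expected mean CLA score,
      reported in standard error units (the 'deviation score' idea)."""
      expected = intercept + slope * campus_eaa_mean
      return (campus_cla_mean - expected) / se

  # Hypothetical usage (fabricated numbers, for illustration only):
  # slope, intercept, se = fit_reference_line(ref_cla_means, ref_eaa_means)
  # dev = deviation_score(1190.0, 1105.0, slope, intercept, se)
  ```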

  12. Mapping CLA Scores to SUNY GE Learning Outcomes: Critical Thinking and Written Communication

  13. GEAR Review, Spring 2009 • Focused Exclusively on SCBA Outcomes for Written Communication and Critical Thinking • Primary Questions • Is there reasonable face validity between CLA measures and SUNY outcomes? • Does the CLA report provide campuses with sub-scores for individual outcomes?

  14. Correspondence Between CLA and SUNY Outcomes • CLA Measures Map Well to (and Provide Sub-scores for): • Critical Thinking Outcome #1 (“Identify, analyze, and evaluate arguments….”) • Critical Thinking Outcome #2 (“Develop well-reasoned arguments”) • Written Communication Outcome #1 (“Produce coherent texts within common college-level written forms”) • Implications for Information Management?

  15. Advantages and Challenges in Using the CLA, and Campus Responses to Challenges

  16. Advantages • Student Artifacts Much Richer and More Complex • Inherently Value-Added Nature of CLA • Controls for Students’ Incoming Ability • Provides Information on Multiple Learning Outcomes • Implications for Faculty Workload (for Administration and Scoring) • Student Engagement/Interest

  17. Challenges • Recruitment of Students (and Implications for Representativeness of Sample) • Student Participation and Motivation • Relevance to Classroom • Scheduling of Computer Labs • Staff Time and Effort

  18. Campus Responses: Buffalo State College • Recruitment of Students and Sampling • Freshmen and seniors • Direct recruiting compared to faculty recruiting • Student Participation and Motivation • Scheduling • $$$ • Relevance to Classroom • Closing the loop • Scheduling of Computer Labs

  19. Campus Responses: SUNY Oneonta • Student Recruitment and Sampling • Preparation of sampling plan for CAE • Assuring comparability between sample and other students • Student Participation and Motivation • Reliance on academic departments • Emphasis to students on benefits to institution, programs, and themselves • Relevance to Classroom • Provision of individual student results to programs • “CLA in the Classroom” initiative

  20. Using The Collegiate Learning Assessment: Impressions and Advice Presentation to the Association of Institutional Research and Planning Officers, Buffalo, New York, June 11, 2009
