Measuring Learning—Using Results

Presentation Transcript


  1. Measuring Learning—Using Results Richard J. Shavelson, Stanford University Inside Higher Ed Audio Conference March 31, 2010 @ Noon EST richs@stanford.edu

  2. Overview • Assessment landscape: A hot issue that doesn’t seem to go away • Differing demands for measuring learning and accounting for outcomes: Conflict of cultures • Differing purposes for learning assessment: Tools being used • Using assessment results: “Culture of Evidence” and “Learning Organization”

  3. Landscape: Close Encounters! • A NAEP for higher education—NSF question in early 1990s • One (multiple-choice) assessment fits all—response of small liberal arts college to North Central Accreditation • Report card for all colleges—NY State Education Department

  4. Landscape: Spellings’ Commission on the Future of Higher Education • Recognized stature of U.S. higher education... BUT “Higher education must change from a system primarily based on reputation to one based on performance” • Urged “creation of robust culture of accountability and transparency” • Student achievement inextricably connected to institutional success • Achievement measured by value added • Relative effectiveness of campuses published in league tables
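
  By way of illustration, since “value added” recurs on later slides: residual-based models of the kind used with instruments such as the Collegiate Learning Assessment estimate a campus’s value added as the gap between observed senior scores and the scores predicted from students’ entering ability. The regression below is a generic sketch; the SAT covariate and the linear form are illustrative assumptions, not the Commission’s specification:

  \[ \widehat{Y}_i = \beta_0 + \beta_1\,\mathrm{SAT}_i, \qquad \mathrm{VA} = \frac{1}{n}\sum_{i=1}^{n}\bigl(Y_i - \widehat{Y}_i\bigr) \]

  Here \(Y_i\) is student \(i\)’s senior assessment score and \(\widehat{Y}_i\) the score predicted from entering ability; a positive VA means seniors outperform expectation, and a league table would rank campuses on this quantity.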

  5. Landscape: Higher-Education Professional Organizations’ Assessment • Identify seven areas for improvement (affordability, learning, secondary education assistance, increased accountability, internationalizing the student experience, increased opportunities for life-long learning and workforce training) • Agree that there is a need to: • Improve learning • Increase accountability for educational outcomes • Most extreme response: • Voluntary System of Accountability (VSA) to: • Demonstrate accountability and stewardship to the public • Measure educational outcomes to identify effective educational practices • Assemble information that is accessible, understandable, and comparable • College Portrait system of indicators • High stakes: Publish comparative results of student achievement and value added

  6. Landscape: Academics’ Reaction • One size does not fit all—Commission failed to recognize the diversity of higher education institutions • Outcomes vary by academic major • Sole focus on “cognitive” outcomes (declarative and procedural knowledge) too limiting—need individual and social responsibility outcomes • Intrudes on academic culture where faculty are responsible for curriculum, teaching, and assessment • Higher education system too complex for simple quantitative measurement • “Horse-race” comparisons of colleges and universities at best misleading and at worst have perverse effects on teaching and learning

  7. Landscape: Toward A Common Ground • Rejection of standardized assessment of learning and accountability in higher education: • Ignores good and bad assessment practices • Is undemocratic, saying only a few superior students in a narrowly defined discipline can succeed instead of identifying, modeling, and making a set of intellectual skills and practices accessible to all students • Denies higher education’s responsibility to provide a coherent curriculum • Is wrong-minded in arguing that campuses share no common standards

  8. Landscape: Common Ground For Higher Education—Generic and Majors • Think critically • Reason analytically • Connect apparently disparate pieces of information • Explore others’ knowledge claims • Justify own knowledge claims with evidence and examples • Take another’s perspective and act accordingly

  9. Landscape: Macro View of Motivation • “Massification” • Internationalization • Diversification of Missions • Diversity of Students • Cost • Dilution of Curriculum • Lack of transparency • Cultural Conflict between Policy Makers + Public v. Higher Education • Distrust of Diverse Student Body—“Standards”

  10. Landscape: Beyond The US

  11. Landscape: President Obama? • Community colleges… community colleges… etc. • Assistant Secretary Kanter supports AHELO in concept (no $) • Higher education (assessment and accountability) policy?????

  12. Cultural Conflict vs. Transparency: The Accountability Triangle. [Diagram: “Accountable to Whom?” with vertices Academy, Policy Makers, Clients]

  13. LEARNING ASSESSMENT PURPOSES • Formative function— • Improving teaching and learning • Faculty constructed • Faculty buy-in • Linked to curriculum • Assuring the public • Institution is being accountable • “Trust us” • Summative function— • Assuring the public • Transparency—the issue is how, given the conflict of cultures and lack of trust • External oversight for credibility, given the broken contract • Improving teaching and learning • Signaling function • Benchmark performance—how good is good enough? • Feedback for improvement of teaching and learning

  14. Learning Assessment Examples • Formative focus—Summative background: • Portfolios • Capstone courses and projects • Performance assessments • Summative focus—Formative background: • Pencil & paper measures of critical thinking etc. in general and in subject areas (e.g., CAAP, MAPP—now ETS® Proficiency Profile) • Collegiate Learning Assessment

  15. Using Assessment Information: Not So Easily Done • Case studies of six “exemplary” campus learning assessment systems • Major findings—vastly different: • Visions of learning assessment systems • Levels of maturity • Levels of capacity (assessment leaders essential if the effort is not led top-down) • Levels of administrator and faculty buy-in • Levels of student engagement and feedback • Levels of impact

  16. Assessment Of Learning Outcomes • Linked formative and summative assessment of learning • Strong internal learning assessment capacity • External learning assessment for signaling and benchmarking (but not external reporting)

  17. A Vision Of Learning Assessment & Accountability • Culture of evidence—evidence fed back up and down the system, from president to students • Learning organization—organizational structures in place to: • Feed back information • Assess the gap between goals and current performance • Run “experiments” to try alternative improvements • Repeat the cycle, gauging improvement • Provide appropriate incentives

  18. Thank You!
