
Measuring Learning and Improving Education Quality: International Experiences in Assessment


Presentation Transcript


  1. Measuring Learning and Improving Education Quality: International Experiences in Assessment
  John Ainley
  South Asia Regional Conference on Quality Education for All, New Delhi, India, October 24-26, 2007

  2. Quality education for all
  • Shift from provision to outcomes
  • Emergence of large-scale assessment programs
  • Developments in methods and reporting
  • Developments in applications
  • Assessments used to:
    • Monitor variations over time in relation to:
      • Established standards / criteria
      • Changes in policy and practice
    • Map variations within countries to establish action targets:
      • Regions and sub-regions
      • Sub-groups of students
    • Contextualise national patterns:
      • In relation to international patterns
      • In relation to comparable countries

  3. Large-scale assessment surveys
  • Conducted at various levels:
    • International
    • Regional
    • National
    • Sub-national (state or province)
  • Provide information at various levels:
    • System
    • School
    • Classroom
    • Parent and student
  • Indicate what is valued
  • Impact teaching and learning
  • Drive change in policy and practice

  4. International assessment studies
  OECD PISA
  • Population and samples: 15-year-olds in school; PPS school sample (see the sampling sketch after this slide); random selection of students within schools
  • Domains: reading literacy, mathematics literacy, science literacy
  • Cycle: three years, since 2000
  IEA
  • Populations and samples: Grade 4, Grade 8, Grade 12; PPS school sample; random selection of classrooms
  • Domains: reading (PIRLS, Grade 4); mathematics (TIMSS); science (TIMSS)
  • Cycle: TIMSS every four years since 1994/95, with antecedents back to 1964; PIRLS every five years since 2001
  • Other studies: ICCS 1999 and 2009
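Both programs draw schools with probability proportional to size (PPS) before selecting students or classrooms within them. Below is a minimal Python sketch of systematic PPS selection; the school names and enrolment figures are invented for illustration.

```python
import random

def pps_systematic_sample(schools, n_sample):
    """Systematic probability-proportional-to-size (PPS) selection,
    the style of school sampling used by PISA and the IEA studies."""
    total = sum(size for _, size in schools)
    interval = total / n_sample            # step along the cumulative size line
    start = random.uniform(0, interval)    # one random start point
    targets = (start + i * interval for i in range(n_sample))

    selected, cumulative = [], 0.0
    target = next(targets)
    for name, size in schools:
        cumulative += size
        # A school is hit when a target lands in its size segment;
        # very large schools can legitimately be hit more than once.
        while target is not None and target <= cumulative:
            selected.append(name)
            target = next(targets, None)
    return selected

# Invented frame: (school, number of eligible students)
frame = [("School A", 320), ("School B", 45), ("School C", 150),
         ("School D", 800), ("School E", 60), ("School F", 210)]
print(pps_systematic_sample(frame, 3))
```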

  5. International assessment studies
  OECD (PISA)
  • Framework: expert development; consultation; future needs
  • Domain coverage: rotated booklet design
  • Data sources: school, student (teachers an option in 2009)
  • Psychometrics: one-parameter IRT (both IRT models are sketched after this slide)
  • Reporting: scale with SD = 100; proficiency bands
  IEA (TIMSS & PIRLS)
  • Framework: curriculum analysis (opportunity to learn, OTL); common elements; what is taught
  • Domain coverage: rotated booklet design
  • Data sources: school, student, teacher
  • Psychometrics: three-parameter IRT
  • Reporting: scale with SD = 100; proficiency bands
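The psychometric contrast here is between a one-parameter (Rasch) model and a three-parameter logistic model. A small sketch of the item response function follows; the parameter values are illustrative only.

```python
import math

def irt_probability(theta, b, a=1.0, c=0.0):
    """Probability of a correct response under the 3PL model:
    P = c + (1 - c) / (1 + exp(-a * (theta - b))).
    With a = 1 and c = 0 it reduces to the one-parameter (Rasch)
    model; the three-parameter version adds discrimination (a)
    and a guessing floor (c)."""
    return c + (1.0 - c) / (1.0 + math.exp(-a * (theta - b)))

# Illustrative values: same ability and item difficulty, two models.
theta, b = 0.5, 0.0
print(irt_probability(theta, b))                 # 1PL / Rasch: ~0.62
print(irt_probability(theta, b, a=1.3, c=0.20))  # 3PL with guessing: ~0.73
```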

  6. Regional assessment studies
  • Latin America
    • Latin American Laboratory for Assessment of the Quality of Education (LLECE)
    • Second Regional Comparative and Explanatory Study (SERCE)
    • Language, mathematics, science
  • Africa
    • Southern and Eastern Africa Consortium for Monitoring Educational Quality (SACMEQ)
    • Supported through the IIEP

  7. National assessment studies
  • NAEP (USA)
    • Sequences over many years
  • Key Stage assessments (United Kingdom)
  • Latin America (Puryear, 2007)
    • Rare in 1980
    • Common by 2005
  • Vietnam: 2001 and 2007
  • Australia

  8. Sub-national assessments
  • Typically in federal systems
  • Australian state assessments
    • Equating at benchmark levels
    • Transition to a national assessment in 2008
  • Germany
  • Canada (Ontario)

  9. Issues in national and international assessment surveys
  • Domains and sub-domains assessed
  • Census or sample
  • Analysis
  • Reporting

  10. Assessment domains
  • Typically:
    • Language (literacy, reading)
    • Mathematics (numeracy)
    • Sometimes science
  • Coverage within domains
    • Multiple matrix designs
    • Rotated booklets to ensure coverage (a rotation is sketched after this slide)
  • Other domains
    • Sample studies
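A rotated booklet (multiple-matrix) design lets a survey cover a broad item pool without any one student taking all of it. A minimal sketch of a simple rotation follows; operational PISA/TIMSS designs use more elaborate balanced arrangements.

```python
def rotated_booklets(n_clusters, per_booklet):
    """A simple rotated (multiple-matrix) booklet design: booklet i
    carries clusters i, i+1, ... (mod n), so every cluster appears
    equally often and neighbouring booklets share clusters, which
    links all booklets onto a common scale."""
    return [[(i + j) % n_clusters + 1 for j in range(per_booklet)]
            for i in range(n_clusters)]

# 7 item clusters, 3 per booklet: each student sits one booklet,
# yet the 7 booklets together cover the whole pool.
for i, clusters in enumerate(rotated_booklets(7, 3), start=1):
    print(f"Booklet {i}: clusters {clusters}")
```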

  11. Grades or ages assessed
  • Define the population by:
    • Age
    • Grade
  • One grade or several?
  • One grade
    • End of the common period of schooling
  • Multiple grades
    • End of primary school
    • End of common secondary school
    • Mid-primary school

  12. Sample or census
  • Advantages of a census
    • Reporting to schools, teachers and parents
    • Enough data to identify disadvantaged groups
    • Enough data to identify regional variations
  • Advantages of sample studies
    • Cost effective (see the design-effect sketch after this slide)
    • Minimal disruption to school teaching programs
    • Cover a wider range of areas
  • Combinations of census and sample
    • Census for literacy and numeracy
    • Samples for other domains
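The cost-effectiveness of sample surveys is bounded by cluster sampling: students within a school resemble one another, so a school-based sample carries less information than a simple random sample of the same size. A sketch of the standard design-effect calculation, with invented figures:

```python
def effective_sample_size(n_students, cluster_size, rho):
    """Effective size of a clustered (school-based) sample.
    deff = 1 + (m - 1) * rho, where m is the number of students
    tested per school and rho is the intraclass correlation of
    achievement within schools."""
    deff = 1 + (cluster_size - 1) * rho
    return n_students / deff

# Invented figures: 150 schools x 30 students with rho = 0.2
# carries about as much information as a simple random sample of ~662.
print(round(effective_sample_size(150 * 30, cluster_size=30, rho=0.2)))
```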

  13. Analysis issues
  • Item response theory
    • Development of a common scale for:
      • Student performance
      • Item difficulty
    • Differences in detail between programs
  • Vertical equating
    • Long scales spanning grade levels
    • Common items overlapping adjacent levels
    • Common-person equating studies
  • Horizontal equating
    • Equating over time
    • Common items carried over each cycle (a common-item equating sketch follows this slide)
    • Common-person equating studies
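One standard way to place two calibrations on a common scale through their shared items is the mean-sigma linear transformation; it is shown here purely as an illustration of common-item equating, with hypothetical anchor-item difficulties.

```python
from statistics import mean, stdev

def mean_sigma_transform(anchor_old, anchor_new):
    """Mean-sigma linear equating from common (anchor) items:
    find A and B so that b_old ≈ A * b_new + B, matching the
    anchors' mean and SD across the two calibrations."""
    A = stdev(anchor_old) / stdev(anchor_new)
    B = mean(anchor_old) - A * mean(anchor_new)
    return A, B

# Hypothetical difficulties of the same anchor items in two cycles:
cycle1 = [-1.2, -0.4, 0.1, 0.8, 1.5]   # calibration in cycle 1
cycle2 = [-1.0, -0.3, 0.2, 0.9, 1.7]   # cycle 2, on a shifted scale
A, B = mean_sigma_transform(cycle1, cycle2)
print(f"b_old = {A:.3f} * b_new + {B:.3f}")  # apply to all cycle-2 items
```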

  14. Reporting assessment data
  • Reporting scales
    • Typically the mean for one grade is fixed (e.g. 400), with a standard deviation of 100
    • Examine distributions for different groups
  • Proficiency bands (standards-referenced)
    • Defined in terms of item difficulties
    • Bands of equal width in difficulty
    • Describe what is represented by the items in a band
    • Report percentages in each band (a scaling-and-banding sketch follows this slide)
  • Standard-setting exercises
    • Define standards in terms of:
      • A proficient standard
      • Minimum competency
    • Panels of expert judges
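Putting the two devices together: logit estimates are rescaled so that the reference grade has a fixed mean (e.g. 400) and an SD of 100, and scores are then sliced into equal-width proficiency bands. All the constants below are illustrative, not the values of any particular program.

```python
def to_reporting_scale(theta, ref_mean, ref_sd,
                       scale_mean=400.0, scale_sd=100.0):
    """Map a logit ability estimate onto a reporting scale whose
    reference-grade mean is fixed (e.g. 400) with an SD of 100."""
    return scale_mean + scale_sd * (theta - ref_mean) / ref_sd

def proficiency_band(score, base=300.0, width=60.0):
    """Assign a score to an equal-width proficiency band; scores
    below the base fall below band 1. In practice each band is then
    described by the items whose difficulties locate within it."""
    return int((score - base) // width) + 1 if score >= base else 0

ref_mean, ref_sd = 0.0, 1.0              # reference-grade calibration
for theta in [-1.1, -0.2, 0.4, 1.3]:     # hypothetical logit estimates
    score = to_reporting_scale(theta, ref_mean, ref_sd)
    print(f"theta {theta:+.1f} -> score {score:.0f}, band {proficiency_band(score)}")
```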

  15. Reporting scale scores: TIMSS Maths Grade 8

  16. Describing distributions: Writing (Australia)

  17. Scale descriptions
  • Provide an interpretation of scores
  • Monitor student development
  • Identify developmental continua
  • Plan student learning
  • Progress maps at state and school level

  18. From item data to described scales: Computer literacy

  19. PISA Maths Profile: Selected level descriptions

  20. Profile distribution: Reading literacy (Australia)

  21. Establishing expected standards
  • Consultation: what should a student be able to do?
  • Different standards
    • Minimum competency
    • Proficient
    • Advanced
  • Provide a basis for simple comparisons (a sketch follows this slide)
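The simple comparison typically reported is the percentage of students reaching each standard. A small sketch follows, with invented cut scores and invented student scores:

```python
def percent_at_or_above(scores, cut_points):
    """Percentage of students at or above each named standard --
    the simple comparison statistic reported against benchmarks."""
    n = len(scores)
    return {name: 100.0 * sum(s >= cut for s in scores) / n
            for name, cut in cut_points.items()}

# Invented cut scores (as if set by an expert panel) and scores:
cuts = {"minimum competency": 350, "proficient": 480, "advanced": 600}
scores = [312, 365, 410, 455, 470, 495, 520, 560, 610, 640]
print(percent_at_or_above(scores, cuts))
# -> {'minimum competency': 90.0, 'proficient': 50.0, 'advanced': 20.0}
```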

  22. % students at benchmark standard for reading by sub-group

  23. Achievement in relation to:
  • Most students in the state
  • The system average
  • A defined benchmark

  24. Uses of assessment
  • Public information
    • About the system overall
    • About sections of the education system
  • Accountability
  • Directing resources and interventions
    • Groups of students
    • Levels of schooling
    • Schools
    • Individual students
  • Defining learning progress
    • Establishing progress maps
    • Establishing standards
    • Providing examples of student work at different levels
  • Evaluating programs and research
    • Understanding "what works"

  25. Public information
  • Stimulating demand for education
  • Identifying areas of need
    • Indigenous students
    • Boys' reading
    • How wide is the gap?
  • Providing comparisons internationally
    • Staying the same
    • Relative change

  26. Directing interventions
  • Identifying disadvantaged students
    • Based on social characteristics
    • Based on diagnostic information (requires a census)
  • Allocating funds
    • Chile: bottom 10% of schools
    • Australian states: bottom 15% of schools
    • Focus on the early years
  • Providing a basis for intervention
    • In most education systems
    • Use of consultants to work with schools
    • Easier with census assessment
    • Education action zones

  27. Evaluation and research
  • Evaluating what works
    • Starting school
    • Approaches in early childhood
    • Impact of policy interventions
  • Using data longitudinally
    • What contributes to enhanced growth
    • Value-added measures (NSW Smart Schools)
    • Studying later progress (e.g. PISA Longitudinal)
  • Uses of assessment data
    • Linkage to other data about schools
    • Literacy and numeracy in the middle years
    • Literacy development of boys
    • Effective teaching for literacy

  28. Concerns at different levels

  29. Concerns at different levels (continued)

  30. Conclusions
  • Assessment programs have grown
    • International, regional, national and sub-national
    • Have begun to impact on policy and practice
    • Complementary roles at different levels
  • Emergent design principles
    • Described scales and standards referencing
    • Higher-order skills and thinking
    • Domain coverage
    • Varied methods and formats
  • Enhancing application
    • Report meaningfully
    • Provide interpretation
    • Balance pressure and support

  31. Questions? Comments! Discussion!
