
Recommendations for Technology and Innovation in Assessment

Recommendations for Technology and Innovation in Assessment. Edys S. Quellmalz Michael J. Timms Barbara C. Buckley WestEd Invited presentation at the Race to the Top Assessment Program Public and Expert Input Meeting. Boston, MA November 13, 2009.


Presentation Transcript


  1. Recommendations for Technology and Innovation in Assessment. Edys S. Quellmalz, Michael J. Timms, Barbara C. Buckley, WestEd. Invited presentation at the Race to the Top Assessment Program Public and Expert Input Meeting, Boston, MA, November 13, 2009.

  2. Question 1: How can innovative technologies be deployed to create better assessments?
  • Break the mold: transform, don't transition
  • Go beyond delivery, scoring, and reporting
  • Take advantage of the capabilities of technology to represent domain principles and support use of "tools of the trade"
  • Focus new development on what is not currently well tested in paper formats, i.e., integrated knowledge and active processes
  • Reform test form designs and timing
  • Form collaboratives to develop collections of innovative tasks
  • Create a common core of state and classroom standards, specifications, and task banks
  • Create common platforms for authoring and administration

  3. Question 1: How can innovative technologies be deployed to create better assessments?

  4. Increasing Use of Innovative Tasks and Simulations in Summative Assessments • PISA Computer-Based Science Assessment • 2009 NAEP Science Interactive Computer Tasks • Minnesota State Science Test • 2009 PISA Electronic Text • 2011 NAEP Writing • 2012 NAEP Technological Literacy

  5. Technology Affordances
  • Rich, authentic environments and problems form contexts and forge integration via models, discourse structures, and problem types
  • Multiple modalities (static, active, interactive)
  • Iterative, active processing/inquiry with multiple trials
  • Multiple representations and symbols/overlays
  • Multiple response formats (control of pacing, replay, iteration)
  • Multiple modalities may benefit English language learners (ELL) and students with disabilities (SWD)
  • Dynamic representations of spatial, causal, and temporal science phenomena
  • Support access to collections of information sources and expertise
  • Support formal and informal forms of collaboration and social networking

  6. SimScientists: Model Progressions

  7. Advantages of Simulation-Based Assessments for Formative Assessment
  • Can log problem-solving sequences during inquiry
  • Can provide immediate, individualized feedback
  • Can provide customized, graduated coaching
  • Can cue with highlighting and worked examples
  • Can provide adaptive tasks and items
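The first two points above, logging inquiry sequences and returning immediate feedback, can be sketched in a few lines. This is an illustrative sketch only, not SimScientists code: the class name, the `draw_link` action, and the feedback rule are all assumptions for the sake of the example.

```python
from dataclasses import dataclass, field
from time import time


@dataclass
class AssessmentLogger:
    """Records each student action with a timestamp so problem-solving
    sequences can be replayed and scored after the session."""
    events: list = field(default_factory=list)

    def log(self, student_id, action, detail):
        self.events.append({"student": student_id, "time": time(),
                            "action": action, "detail": detail})
        return self.feedback(action, detail)

    @staticmethod
    def feedback(action, detail):
        # Illustrative feedback rule: cue the student when a food-web
        # arrow is drawn backward (energy flows from prey to predator).
        if action == "draw_link" and detail.get("direction") == "predator_to_prey":
            return "Check your arrow: energy flows from the eaten to the eater."
        return None


logger = AssessmentLogger()
hint = logger.log("s01", "draw_link", {"direction": "predator_to_prey"})
```

Because every action lands in `events`, the same log can later drive scoring, coaching, or a replay of the student's inquiry sequence.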

  8. Example of SimScientists Formative Assessment Task
  Figure 1 – Screenshot of the Mountain Lake Food Web Embedded Assessment with Coaching

  9. English Language Arts: Digital/Cyber Literacy
  • Authentic, integrated tasks representing discourse aims and structures
  • Multimodal/multimedia "tools of the trade": static, active, and interactive images, graphics, symbols, multimedia
  • Search and find
  • Comprehend
  • Highlight
  • Summarize, take notes
  • Select, assemble, represent, transform

  10. Formative Assessment: Student Self-Assessment of Constructed Responses: Scientific Explanation
  In embedded assessments students revise and evaluate constructed responses.
  • Write task criteria
  • Revise example
  • Evaluate

  11. Mathematics: Digital/Cyber Learning
  • Authentic, integrated tasks representing significant problem types
  • Multimodal/multimedia "tools of the trade": static, active, and interactive images, graphics, symbols, multimedia
  • Search and find
  • Analyze data, visualizations, simulations
  • Run iterative solutions
  • Transform representations (tables, graphs)
  • Select and present best evidence
  • Present, explain, and display processes and solutions

  12. SimScientists: Simulations for Iterative Investigations and Interpretations of Multiple Data Representations and Analyses
  Figure 2 – Screenshot of the Australian Grassland Population Dynamics Benchmark Assessment

  13. Question 2: What would be the features of a system to develop, administer, score, and report with quality and cost-effectiveness?
  • Design templates and re-usable components for rapid, cost-effective development
  • Web-based for easy access from any site
  • Should run from a browser so that no software has to be loaded on school computers
  • Scoring should be computer-based and computer-assisted
  • Provide formative and summative reports to students and teachers

  14. Technology Support for Teacher Scoring of Constructed Responses on Unit Benchmark Assessments

  15. Costs and Benefits
  Benefits:
  • No printing and shipping of assessments
  • No scanning of bubble sheets
  • No human-scoring sessions
  • Results could be sent electronically (less mailing)
  • Reuse of templates and components via the authoring system
  • Allows authoring by developers and teachers
  • Easier to add accommodations such as large print or read-aloud (text-to-speech)
  Additional costs:
  • Increased site administration costs due to the need for more skilled personnel
  • Increased item development costs
  • Initial system development costs
  Ways to limit additional costs:
  • States could form consortia to spread the costs of system and assessment development
  • Strategic use of complex item types (e.g., matrix sampling of specific knowledge/skills)

  16. Question 3: How could the technology platform support development of high-quality interim assessments?
  • Design templates, specification shells, storyboards, and re-usable components for rapid, cost-effective development
  • Common task design specifications for core tasks at state and classroom levels
  • Common core collection of secure and public tasks
  • Models for embedding secure tasks in end-of-unit benchmarks

  17. Multilevel Balanced State Assessment System
  [Diagram: a three-level system. At the state and district levels, results roll up as proficiency by standards. At the classroom level, core standards, specifications, common tasks, and an item bank feed benchmark summative unit assessments and embedded formative assessments.]

  18. SimScientists Embedded Formative Assessments and Unit Benchmark Assessments
  [Diagram: Embedded formative assessments — online assessment with feedback and coaching, followed by a classroom reflection activity, yielding a progress report. Benchmark summative unit assessments — online assessment without feedback; the teacher scores constructed responses, and a Bayes net produces a proficiency report.]

  19. SimScientists Embedded Formative Assessment Class Report

  20. SimScientists Unit Benchmark Summative Proficiency Report

  21. Developing Multilevel Balanced State Assessment Systems
  • The assessment systems leverage the power of collaboration to share the costs and logistics of fully developing, maintaining, and articulating science assessments.
  • The systems develop explicit plans for connecting and integrating assessment designs and results gathered from multiple levels of the system.
  • The science assessment systems document alignment of current and planned assessment tasks and items with content and performance standards. These standards are further specified to define the knowledge, skills, and strategies and levels of performance that comprise the assessment targets of a student outcome model.
  • The systems employ common task and item specifications to shape pools of tasks and items that can be accessed for assessments at multiple levels of the system and that will elicit and link evidence of achievement of science standards.
  From Quellmalz, E. S., & Moody, M. (2004). Developing Multilevel State Science Assessment Systems. Report commissioned by the National Research Council Committee on Test Design for K-12 Science Achievement.

  22. Developing Multilevel Balanced State Assessment Systems
  • The systems develop strategies for sampling from the collections to build and connect test forms at different levels of the system.
  • The science assessment systems design and implement professional development on assessment, item and task development, administration, and use of results.
  • The systems place an emphasis on the use of science assessment for learning, i.e., for diagnosis at the classroom level.
  • The assessment systems draw upon the capabilities of technologies to support assessment design, administration, scoring, interpretation, and assessment literacy.
  From Quellmalz, E. S., & Moody, M. (2004). Models for Multilevel State Science Assessment Systems. Report commissioned by the National Research Council Committee on Test Design for K-12 Science Achievement.

  23. References

  24. Recommendations for Technology and Innovation in Assessment http://simscientists.org Edys S. Quellmalz equellm@wested.org Michael J. Timms mtimms@wested.org Barbara C. Buckley bbuckle@wested.org

  25. Recommendations on Psychometrics
  • More research is needed on effective methods to assess learning in complex tasks in games and simulations.
  • Expand our range of psychometric tools to include such things as: Bayes nets, artificial neural networks, model tracing, and rule-based methods.
  • Education researchers need to work with other disciplines, such as computer science, to find other methods.
  • Train future psychometricians in a wider range of methods.
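In its simplest two-node form (proficiency → item response), the Bayes net listed above reduces to repeated Bayes-rule updates of P(mastery) as item responses arrive. A minimal sketch, where the 0.85/0.25 slip-and-guess parameters are assumed values chosen for illustration, not calibrated numbers from any of the assessments described here:

```python
def update_proficiency(prior, p_correct_if_mastered, p_correct_if_not, correct):
    """One Bayes-rule update of P(mastery) after observing one item response."""
    if correct:
        like_m, like_n = p_correct_if_mastered, p_correct_if_not
    else:
        like_m, like_n = 1 - p_correct_if_mastered, 1 - p_correct_if_not
    numer = prior * like_m
    return numer / (numer + (1 - prior) * like_n)


# Start agnostic (P(mastery) = 0.5). Assume a mastered student answers
# correctly 85% of the time and a non-master 25% (guessing).
p = 0.5
for response in [True, True, False, True]:
    p = update_proficiency(p, 0.85, 0.25, response)
```

Full Bayes nets extend this idea to many linked proficiency and evidence nodes, which is what makes them attractive for scoring complex simulation tasks with multiple observable behaviors.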

  26. Illustrative Innovative Tasks

  27. SimScientists: Ecosystem Unit Benchmark Task
  Screenshot of the SimScientists Ecosystems Benchmark Assessment showing a food web diagram interactively produced by a student after observing the behaviors of organisms in the simulated Australian grasslands environment.

  28. NSF PROJECT: TOWARD A COORDINATED FRAMEWORK FOR THE DESIGN OF ICT ASSESSMENTS
  Quellmalz, E. S., & Kozma, R. (2003). Designing assessments of learning with technology. Assessment in Education, 10(3), 389-407.

  29. COORDINATED ICT ASSESSMENT FRAMEWORK

  30. Integrated Performance Assessments with Technology (IPAT)
  • Performance assessments designed to test problem-based reasoning using technology
  • Core components: planning, accessing information, reasoning with the information, drawing conclusions, communication

  31. IPAT Modular Design
  • Modules can be based on an ICT strategy, technology tool, subject matter of the problem, or complexity
  • Modules can be inserted or deleted without disrupting the flow of the investigation
  • Permits custom design and adaptive assessment
  • Permits separate score reports for domain knowledge, strategies, and technology use

  32. ICT PERFORMANCE ASSESSMENT: Middle School Level Prototype
  • Predator-Prey ICT performance assessment task
  • Intended audience: grade 8 / pop. 2 / 13-year-olds
  • Problem within a science context that is well taught and learned
  • Modules for ICT strategies
  • Technologies: common Internet and productivity tools; NetLogo modeling tool

  33. Predator and Prey Performance Assessment

  34. Too many hares!
  Read the following passage. Park rangers in Canada have observed that too many hares in the parks are causing problems. They eat small plants that other animals depend on for food. Some park rangers suggest that the government bring in more lynx (which eat hares) to help reduce the population. The park service hired Dr. Kloss at the Arctic Research Institute to investigate. Dr. Kloss wants teams made up of students from different schools to help. You'll be working with two other students, Filo from York and Kari from Ottawa. For this project you will use technology to solve the problem: should the government bring in more lynx?

  35. Collect information, take notes, and cite sources
  Data/information to collect:
  • When does the hare population start to go down?
  • What are the reasons that can cause it to go down?
  » Canadian lynx
  » The lynx and hare
  » Predator-prey cycles
  Notes: Lynx mostly eat hares. They don't eat much else. If there aren't many hares, they start to get hungry. http://dspace.dial.pipex.com/agarman/

  36. Data from the last 4 years: Access & Organize Information/Data; Analyze & Interpret – Infer Trends & Patterns
  Here is the data for the number of hares in the parks over the last four years. Last year (2002) there were about 95,000 hares. The year before that (2001) there were about 80,000. In 2000, there were 25,000. And in 1999, there were only about 1,000 hares. Organize the data and describe the population trend. Pick a tool to use:
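Any basic "tool of the trade" works for this organize-and-describe step: a spreadsheet, a hand-drawn table, or a few lines of code. As one sketch of the step, here is the table and year-over-year trend in Python, using exactly the four counts given in the task:

```python
# Hare counts given in the task (approximate values from the park service).
hares = {1999: 1_000, 2000: 25_000, 2001: 80_000, 2002: 95_000}

rows = sorted(hares.items())
print(f"{'Year':<6}{'Hares':>8}  {'Change':>8}")
previous = None
for year, count in rows:
    change = "" if previous is None else f"{count - previous:+,}"
    print(f"{year:<6}{count:>8,}  {change:>8}")
    previous = count

# Every year-over-year change is positive, so the population is growing.
changes = [later[1] - earlier[1] for earlier, later in zip(rows, rows[1:])]
trend = "increasing" if all(c > 0 for c in changes) else "mixed"
```

Organized this way, the data show the hare population climbing from about 1,000 to about 95,000 in four years, which is the pattern the task asks students to describe.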

  37. More Data: Collaborate; Represent & Transform Information/Data
  Hi Maribel,
  I saw the data that you got from the Canadian park service. I did some more research and found a list that goes back 25 years and shows how many hares and lynx there were. I attached the information in a spreadsheet, but I can't understand it. We need to figure out how to make sense of all this. Can you find a better way to analyze and display the data?
  Thanks, Kari

  38. Transform Data

  39. Use Modeling Tools to Investigate, Compare, Test, Analyze & Interpret Information/Data
  Your Web research indicated that although lynx depend on hares for food, introducing more lynx might decrease the hare population too quickly. But how quickly is too quickly? Dr. Kloss would like to use a modeling program to predict what will happen to the hare and lynx populations if you added more lynx to the parks. Once you've used the modeling tool, record your results below.
  Size of hare population if lynx are NOT added to the park (2003 and 2008): increase a lot / increase a little / decrease a little / decrease a lot
  Size of hare population if lynx ARE added to the park (2003 and 2008): increase a lot / increase a little / decrease a little / decrease a lot

  40. Use a model to predict outcomes
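The task itself uses the NetLogo modeling tool named on slide 32. As a sketch of the kind of computation such a predator-prey model performs, here is a discrete-time Lotka-Volterra update in Python. All parameter values (birth, predation, efficiency, death rates) are illustrative assumptions, not the task's actual calibration:

```python
def simulate(hares, lynx, years, birth=0.6, predation=2e-4,
             efficiency=0.005, death=0.5):
    """Discrete-time Lotka-Volterra sketch: hares reproduce, lynx eat
    hares, and lynx decline without enough hares. Parameters are
    illustrative only."""
    history = [(hares, lynx)]
    for _ in range(years):
        eaten = predation * hares * lynx
        hares = max(hares + birth * hares - eaten, 0.0)
        lynx = max(lynx + efficiency * eaten - death * lynx, 0.0)
        history.append((hares, lynx))
    return history


baseline = simulate(95_000, 2_000, years=5)   # lynx are NOT added
more_lynx = simulate(95_000, 6_000, years=5)  # lynx ARE added
```

With these assumed parameters, adding lynx drives the hare population down sharply in the first year and keeps it below the no-added-lynx trajectory, which is the comparison the recording sheet on slide 39 asks students to make.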

  41. Create a presentation
  • Dr. Kloss would like you to prepare a short presentation (3 pages/slides) to the park service with your recommendations about what to do about the hare and lynx populations.
  • Be sure your presentation is clearly organized and includes:
  • A statement of the problem
  • Your group's recommendation
  • Information, data, and explanations to support your recommendation
