Presentation Transcript

  1. Item Development. Presented to Majmaah University by Neil Wilkinson.

  2. Creating better test items

  3. The FACTS of good test items

  4. Creating good test items

  5. Right Expertise • Subject matter expertise ensures the test is valid • Content development expertise ensures the test is well crafted • Psychometric expertise ensures the test is reliable and fair

  6. Sound Processes

  7. Sound item writing processes

  8. Item performance analysis and feedback Why analyze items? • The statistical behavior of “bad” items is fundamentally different from that of “good” items • Provides quality control by flagging items that should be reviewed by content experts • Feedback on how items are performing can be used in future item writing efforts, providing continuous quality improvement in the item writing process
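The two core statistics behind this kind of analysis are the item p-value (the proportion of candidates answering correctly) and a discrimination index such as the point-biserial correlation between the item score and the rest of the test. A minimal sketch in Python, assuming responses are already scored as a 0/1 matrix; the function name and data layout are illustrative:

```python
# Classical item analysis: p-value (difficulty) and point-biserial
# discrimination for each item. Rows = candidates, columns = items,
# entries = 0/1 scored responses. Illustrative sketch, not a library API.

def item_statistics(responses):
    """Return a (p_value, point_biserial) pair for each item."""
    n = len(responses)
    n_items = len(responses[0])
    totals = [sum(row) for row in responses]  # total score per candidate
    stats = []
    for j in range(n_items):
        item = [row[j] for row in responses]
        p = sum(item) / n  # difficulty: proportion answering correctly
        # Correlate the item with the rest score (total minus this item)
        # so the item does not inflate its own discrimination.
        rest = [totals[i] - item[i] for i in range(n)]
        mean_rest = sum(rest) / n
        cov = sum((item[i] - p) * (rest[i] - mean_rest) for i in range(n)) / n
        sd_item = (p * (1 - p)) ** 0.5
        sd_rest = (sum((r - mean_rest) ** 2 for r in rest) / n) ** 0.5
        rpb = cov / (sd_item * sd_rest) if sd_item and sd_rest else 0.0
        stats.append((round(p, 2), round(rpb, 2)))
    return stats
```

A very low p-value or a near-zero (or negative) point-biserial is exactly the “statistical behavior of bad items” that should trigger review by content experts.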

  9. Distractor analysis • High-scoring candidates should select the correct option • Look at discrimination for each distractor • Low-scoring candidates should select from among the distractors at random • Look at p-values for each distractor
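The pattern described above can be checked directly from response data: the key should be chosen mostly by high scorers, while low scorers should spread roughly evenly across the distractors. A sketch, assuming each candidate's chosen option and total test score are available; the function name, the 27% group cutoff, and the example data are assumptions for illustration:

```python
# Distractor analysis sketch: for each option of one item, compare how
# often high-scoring and low-scoring candidates selected it.

def distractor_analysis(choices, total_scores, key, group_frac=0.27):
    """choices: option chosen per candidate (e.g. 'a'..'d');
    total_scores: total test score per candidate; key: correct option.
    Returns {option: (overall_rate, high_group_rate, low_group_rate)}."""
    ranked = sorted(range(len(choices)), key=lambda i: total_scores[i])
    k = max(1, int(len(choices) * group_frac))  # top/bottom group size
    low, high = set(ranked[:k]), set(ranked[-k:])
    result = {}
    for opt in sorted(set(choices) | {key}):
        overall = sum(c == opt for c in choices) / len(choices)
        hi = sum(choices[i] == opt for i in high) / len(high)
        lo = sum(choices[i] == opt for i in low) / len(low)
        result[opt] = (round(overall, 2), round(hi, 2), round(lo, 2))
    return result
```

For a healthy item, the key's high-group rate is well above its low-group rate; a distractor that attracts high scorers, or one that nobody selects, is a candidate for rewriting.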

  10. Diagnosing potential problems

  11. Feedback into the item development process • Sharing performance statistics with the authors of individual items may be problematic because of the time between authoring and use, but... • Item performance statistics can still inform the item development process. • Share and discuss item performance and the reasons for poorly performing items. • Show examples of good, average, and poorly performing items.

  12. Efficient Tools Item Bank A repository of items with their associated metadata & historical record

  13. Efficient Tools • Item Bank – Data • The item itself • Item author • Item type • Content area reference from test blueprint • Cognitive level • Revision history • Item status • Performance statistics
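The metadata listed above maps naturally onto a single record type per item. A minimal sketch of such an item-bank record in Python; the field names are illustrative, not a standard schema:

```python
# Sketch of one item-bank record holding the metadata listed on the slide.
# Field names and status values are assumptions, not a standard schema.
from dataclasses import dataclass, field

@dataclass
class ItemRecord:
    item_id: str
    stem: str
    options: dict   # e.g. {"a": "Chicago", "b": "Madrid", ...}
    key: str        # label of the correct option
    author: str
    item_type: str = "multiple-choice"
    blueprint_ref: str = ""    # content area from the test blueprint
    cognitive_level: str = ""
    status: str = "draft"      # e.g. draft / reviewed / active / retired
    revision_history: list = field(default_factory=list)
    performance_stats: list = field(default_factory=list)  # per administration

    def revise(self, note):
        """Record a revision note in the item's history."""
        self.revision_history.append(note)
```

Keeping the revision history and performance statistics on the record is what makes the feedback loop described earlier possible: an item's past behavior travels with it through every review cycle.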

  14. Efficient Tools • Item Bank – Features • Item authoring • Remote item authoring • Support of multiple item types • Item banking • Search capabilities • Import/export capabilities • Batch editing capabilities • Test construction • Test assembly • Export features • Ancillary features • Security and access • Workflow management • Project tracking

  15. Effective Item Development • Guidelines for an effective item development process • An effective item development process begins well before the first test item is written with development of test blueprints.

  16. Effective Item Development • Guidelines for an effective item development process • Carefully consider the associated information that is required and stored for each item.

  17. Effective Item Development • Guidelines for an effective item development process • Recognize that item development requires more than subject matter expertise.

  18. Effective Item Development • Guidelines for an effective item development process • Item authors require training in item writing.

  19. Effective Item Development • Guidelines for an effective item development process • Item development requires more than authoring; sound review processes are also essential.

  20. Effective Item Development • Guidelines for an effective item development process • Style and grammar guidelines and review processes are necessary to ensure items are testing what they are meant to test.

  21. Effective Item Development • Guidelines for an effective item development process • Use item performance analysis to guide further review of the items and to provide feedback to item authors.

  22. Writing Good Multiple-Choice Test Items

  23. What Makes a Content-Appropriate Item • Reflects specific content • Able to be classified on the test blueprint • Does not contain trivial content • Independent from other items • Not a trick question • Vocabulary suitable for candidate population • Not opinion-based Source: Haladyna, T.M. (1996). Writing Test Items to Evaluate Higher Order Thinking. Boston, USA: Allyn & Bacon.

  24. Anatomy of a Multiple-Choice Item • Stem: Which city has been awarded the 2016 Summer Olympics? • Answer options: a. Chicago b. Madrid c. Rio de Janeiro d. Tokyo • Key: c. Rio de Janeiro • Distractors: the remaining incorrect options (a, b, d)

  25. Not opinion-based Avoid the use of opinion-based material in items • Poor Stem: Which is the best football team in the world? a. Barcelona b. Manchester United c. Bayern Munich d. Atlético Madrid • Better Stem: Which football team won the European championship in 1999?

  26. Guidelines for Writing the Stem Place most of the phrasing in the stem. • Poor Stem: Type II diabetes is a. also called juvenile-onset. b. characterised by insulin dependency. c. primarily seen in adults over 40. d. often managed by drug therapy. • Better Stem: What is the most frequent age of onset for Type II diabetes?

  27. Guidelines for Writing the Stem Avoid excessive, unnecessary wording. • Poor Stem: Certified individuals are required to undertake continuing education. What is the minimum number of hours required in a two-year period? a. 10 b. 15 c. 20 d. 25 • Better Stem: How many hours of continuing education must a certified individual take in a two-year period?

  28. Guidelines for Writing the Stem Clearly describe the problem or what is being asked of the candidate. • Poor Stem: Continuing education is a. required every two years. b. important to life-long learning. c. an optional activity. d. undertaken at certified centers. • Better Stem: How many hours of continuing education is a certified individual required to undertake in a 5-year cycle?

  29. Guidelines for the Answer Options • Maintain similarity in the answer options. • Keep the answer options relatively equal in length. • Keep all answer options grammatically correct with the stem and parallel. • Avoid cueing the right answer. • Avoid words such as “always” and “never”. • Avoid repetitive wording in the answer options. • Vary location of key. • Place answer options in logical order.

  30. Number of Answer Options Four is most common. • Recent research indicates three may be sufficient if the two distractors are strong and plausible. • Four-option items rarely have three functional distractors. • “Using more options does little to improve item and test score statistics and typically results in implausible distractors.”* *Rodriguez, M.C. (2005). Three options are optimal for multiple-choice items: A meta-analysis of 80 years of research. Educational Measurement: Issues and Practice, 24(2), 3-13 (p. 11).

  31. Ten steps to an Effective Item Writing Process

  32. Ten Steps to an Effective Item Writing Process Use item bank inventories to direct item writing activities.

  33. Ten Steps to an Effective Item Writing Process Provide a training program for item writers.

  34. Ten Steps to an Effective Item Writing Process Develop a style guide.

  35. Ten Steps to an Effective Item Writing Process Require all items to be classified according to the test blueprint.

  36. Ten Steps to an Effective Item Writing Process Ensure all items are validated with current references and the validations are recorded.

  37. Ten Steps to an Effective Item Writing Process Editorially review all items for grammar and style.

  38. Ten Steps to an Effective Item Writing Process Include an item review process by Subject Matter Experts (SMEs).

  39. Ten Steps to an Effective Item Writing Process Share acceptance criteria with item writers and item reviewers.

  40. Ten Steps to an Effective Item Writing Process Use statistical analysis of items to guide further review.

  41. Ten Steps to an Effective Item Writing Process Provide feedback to item writers.

  42. Thank You!