Test construction, administration and analysis

Presentation Transcript

  1. Test construction, administration and analysis Tasks for MEd Practicum Students 14 Baishakh 2076 Peshal Khanal, PhD Associate Professor Central Department of Education Tribhuvan University

  2. Key tasks

  3. Preparation of specification table

  4. Organizing test items onto a test paper • Full marks: 30, Time: 1 hr • Group A: Multiple choice (15 x 1 = 15) • Group B: Short answer question, attempt any one (1 x 5 = 5) • Group C: Long answer question, attempt any one (1 x 10 = 10)

  5. Administering the test • Administer the test to a group of at least 30 students • The test can be administered in more than one class or school

  6. Preparing answer key and rubrics and scoring the answers

  7. Item analysis of multiple choice items • Difficulty level (p) – the proportion of examinees who answer the item correctly (range 0 to 1) • Discrimination index (D) – a value showing how well the item discriminates between high-achieving and low-achieving students (range -1 to +1) • Power of distractors – analysis of the extent to which the distractors are plausible

  8. General procedures for item analysis • Arrange the answer books from the highest total score to the lowest • Separate the upper 27% and lower 27% of answer books and use these two groups for the analysis (the middle 46% of copies are not considered; set them aside) • For example, if you have 30 answer books arranged from highest to lowest score, then 27% of 30 = 8.1 (≈ 8), so keep 8 copies for the upper group and 8 copies for the lower group, and set the remaining 14 copies aside • Now you are ready to calculate the difficulty level and discrimination index of each item.
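The ranking-and-splitting step above can be sketched in Python. This is a minimal illustration, not part of the original slides; `scores` stands in for the total score of each answer book.

```python
def split_groups(scores, fraction=0.27):
    """Return (upper, lower) score lists, each holding ~27% of the books."""
    ranked = sorted(scores, reverse=True)      # highest total score first
    n = round(len(ranked) * fraction)          # 27% of 30 books -> round(8.1) = 8
    upper = ranked[:n]                         # top 27% of answer books
    lower = ranked[-n:]                        # bottom 27% of answer books
    return upper, lower                        # the middle copies are set aside

# 30 answer books with (hypothetical) total scores 1..30
upper, lower = split_groups(list(range(1, 31)))
print(len(upper), len(lower))  # -> 8 8
```

Only the two extreme groups feed into the p and D calculations on the following slides.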

  9. Difficulty level (p) • Let's start with item 1. • How many students answer item 1 correctly in the upper group? Let's say 7. • How many answer it correctly in the lower group? Let's say 2. • Then the difficulty level of item 1 is p1 = (7 + 2) / (8 + 8) = 9/16 = 0.56 • The easiest items have a p-value close to 1; difficult items have a p-value close to 0.

  10. If you want to use a formula for p • Number of students who answer the item correctly in the upper group = Ur • Number of students who answer the item correctly in the lower group = Lr • Total number of students in the upper group = Un • Total number of students in the lower group = Ln • Difficulty level of the item: p = (Ur + Lr) / (Un + Ln)
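The formula on this slide, with the worked numbers from slide 9 (7 of 8 upper-group and 2 of 8 lower-group students correct), can be checked with a few lines of Python; the function name is illustrative.

```python
def difficulty(Ur, Lr, Un, Ln):
    """Difficulty level p: proportion of upper + lower examinees answering correctly."""
    return (Ur + Lr) / (Un + Ln)

# Worked example from slide 9: 7 of 8 upper, 2 of 8 lower answer correctly.
print(round(difficulty(7, 2, 8, 8), 2))  # -> 0.56
```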

  11. Criteria for interpretation

  12. Discrimination index (D) • Let's recall the responses on item 1: 7 and 2 students answer the item correctly in the upper and lower groups respectively. • Then the discrimination index of item 1 is D1 = (7 - 2) / 8 = 5/8 = 0.62 • A highly discriminating item has a value close to 1, and a weakly discriminating item a value close to 0. If the value is negative, the item discriminates against the better students (dangerous!)

  13. If you want to use a formula for D • Number of students who answer the item correctly in the upper group = Ur • Number of students who answer the item correctly in the lower group = Lr • Total number of students in the upper group = Un • Total number of students in the lower group = Ln • Discrimination index of the item: D = Ur/Un - Lr/Ln (with equal group sizes, this is (Ur - Lr) / Un)
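The D formula can be sketched the same way, again using the slide's symbols and the slide-12 worked example; the function name is illustrative.

```python
def discrimination(Ur, Lr, Un, Ln):
    """Discrimination index D: upper-group minus lower-group success rate."""
    return Ur / Un - Lr / Ln

# Worked example from slide 12: 7 of 8 upper, 2 of 8 lower answer correctly.
print(round(discrimination(7, 2, 8, 8), 2))  # -> 0.62

# A negative D means the lower group outperforms the upper group on the item.
print(discrimination(2, 7, 8, 8) < 0)  # -> True
```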

  14. Criteria for interpretation

  15. Power of distractors • Count the responses on the correct answer and on each distractor, and use a logical analysis to assess the extent to which the distractors are plausible. • Two basic criteria: • the correct answer should be chosen by the majority of students, and the number of students ticking the correct answer in the high-scoring group should be greater than in the low-scoring group • the distractors should attract students roughly equally
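A distractor tally for one item might look like the following sketch. The response lists and the choice of 'a' as the key are hypothetical, invented only to illustrate the two criteria above.

```python
from collections import Counter

# Hypothetical: the option each student ticked on one item, per group.
upper = ['a', 'a', 'a', 'a', 'a', 'b', 'c', 'd']   # high-scoring group (n=8)
lower = ['a', 'a', 'b', 'b', 'c', 'c', 'd', 'd']   # low-scoring group (n=8)
key = 'a'                                          # assumed correct answer

u, l = Counter(upper), Counter(lower)
for option in 'abcd':
    tag = 'key' if option == key else 'distractor'
    print(f"{option} ({tag}): upper={u[option]}, lower={l[option]}")

# Criterion 1: more high-scoring than low-scoring students choose the key.
print(u[key] > l[key])  # -> True
```

Distractors whose counts are near zero, or which attract more upper-group than lower-group students, are candidates for revision.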

  16. Examples – responses on items, with interpretations and decisions • No. 1 is a good item. • No. 2 – Distractor d is poor and should be revised. • No. 3 – This is a poor item: the key (a) and distractor (b) are ambiguous. The whole item should be revised.

  17. Analysis of test scores • Mean – the average score of all students • Standard deviation – the dispersion of the scores around the mean • Example – next slide

  18. Interpretations • Out of 30 marks, the average mark obtained by students is 20 (i.e. 66.67%), which is satisfactory. • The scores range from 4 to 30 with an SD of 6, so the scores are fairly well distributed. • Students can also analyze the scores of boys and girls separately and compare the results.
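The mean and standard deviation discussed above can be computed with Python's standard library. The score list below is invented for illustration (the slide's own data, which gave mean 20 and SD 6, is not shown); `pstdev` treats the class as the whole population.

```python
from statistics import mean, pstdev

# Hypothetical total scores out of 30 for a small class.
scores = [4, 12, 15, 18, 20, 21, 23, 25, 28, 30]

print(f"mean = {mean(scores):.2f}")    # average score of all students
print(f"SD   = {pstdev(scores):.2f}")  # population standard deviation
```

For a boys/girls comparison, run the same two calls on each subgroup's score list.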

  19. Reporting format • Introduction – what the test is about, grade, contents, etc. • Specification table – include the table • Test paper – include a sample test paper (with appropriate organization, instructions, and items) • Test administration – write a paragraph describing the procedure (where, when, how, etc.) • Scoring key and rubrics – include the answer key and rubrics • Item analysis • difficulty level (include a table showing the p-value of each item and your interpretations) • discrimination index (include a table showing the D-value of each item and your interpretations) • power of distractors (include a separate distractor-analysis table for each item and your interpretations) • Analysis of test results (include the calculation of the mean and standard deviation and your interpretations)

  20. ??? Thank You peshal.Khanal@tucded.edu.np