
# ITEM ANALYSIS


##### Presentation Transcript

1. ITEM ANALYSIS Prepared by: Rebecca S. Galela

2. A name given to a variety of statistical techniques designed to analyze individual items on a test • It involves examining class-wide performance on individual test items.

3. It sometimes suggests why an item has not functioned effectively and how it might be improved • A test composed of items revised and selected on the basis of item analysis is almost certain to be more reliable than one composed of an equal number of untested items.

4. Item analysis provides teachers with three kinds of information • Difficulty index • Discrimination index • Analysis of response options/distracter analysis

5. Difficulty index • The proportion of students in the class who got an item correct. The larger the proportion, the more students have learned the content measured by the item.

6. Discrimination index • A basic measure of the validity of an item. • A measure of an item’s ability to discriminate between those who scored high on the total test and those who scored low. • It can be interpreted as an indication of the extent to which overall knowledge of the content area or mastery of the skill is related to the response on an item

7. Analysis of response options/distracter analysis • In addition to examining the performance of a test item, teachers are often interested in examining the performance of individual distracters ( incorrect answer options) on multiple-choice items.

8. By calculating the proportion of students who chose each answer option, teachers can identify which distracters are working and appear to be attractive to students who do not know the correct answer, and which distracters are simply taking up space and not being chosen by many students

9. To eliminate blind guessing which results in a correct answer purely by chance (which hurts the validity of a test item), teachers want as many plausible distracters as is feasible.

10. The process of item analysis • 1. Arrange the test scores from highest to lowest

11. 2. Select the criterion groups • Identify a High group and a Low group. The High group is the highest-scoring 27% of the examinees and the Low group is the lowest-scoring 27%. • Each 27% slice of the examinees is called a criterion group. This percentage provides the best compromise between two desirable but inconsistent aims: to make the extreme groups as large as possible and as different as possible. We can then say with confidence that those in the High group are superior in the ability measured by the test to those in the Low group.
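The grouping step above can be sketched in Python; the function name and the `fraction` parameter are illustrative, not from the slides.

```python
def criterion_groups(scores, fraction=0.27):
    """Split a class's scores into High and Low criterion groups,
    each containing 27% of the examinees."""
    ordered = sorted(scores, reverse=True)   # step 1: highest to lowest
    n = round(len(ordered) * fraction)       # size of each criterion group
    high = ordered[:n]                       # top 27% of scores
    low = ordered[-n:]                       # bottom 27% of scores
    return high, low
```

With 43 examinees this yields groups of 12 each, matching the worked example later in the deck.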

12. 3. For each item, count the number of examinees in the High group who have correct responses. Do a separate, similar procedure for the low group

13. 4. Solve for the difficulty index of each item • The larger the value of the index, the easier the item. • The smaller the value, the more difficult the item. • Scale for interpreting the difficulty index of an item:
Below 0.25: item is very difficult
0.25 – 0.75: item is of average difficulty (rightly difficult)
Above 0.75: item is very easy

14. The difficulty index of a given type of test is simply the mean of the difficulty indices of the items • Mean = Σ Idif,i / k, where k is the number of test items and Idif,i is the difficulty index of test item i.
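The test-level mean is a one-line calculation; the helper name is mine, not from the slides.

```python
def mean_difficulty(item_difficulties):
    """Test-level difficulty: the sum of the per-item difficulty
    indices divided by k, the number of test items."""
    k = len(item_difficulties)
    return sum(item_difficulties) / k
```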

15. For test items scored 0 or 1 (e.g., matching type, multiple choice, true/false) the difficulty index is given by the formula • Idif = (Hc + Lc) / 2N, where Idif is the index of difficulty of a particular item, N is the number of examinees in each criterion group, Hc is the number of examinees in the High group with correct responses, and Lc is the number of examinees in the Low group with correct responses.
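The 0/1 formula and the interpretation scale translate directly; function and argument names are illustrative.

```python
def difficulty_index(hc, lc, n):
    """Idif = (Hc + Lc) / (2N) for items scored 0 or 1.
    hc, lc: correct counts in the High and Low groups;
    n: number of examinees in each criterion group."""
    return (hc + lc) / (2 * n)

def interpret_difficulty(idif):
    """Apply the deck's interpretation scale for Idif."""
    if idif < 0.25:
        return "very difficult"
    if idif <= 0.75:
        return "rightly difficult"
    return "very easy"
```

Using the counts from the worked example (Hc = 9, Lc = 4, N = 12) gives 0.54, a rightly difficult item.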

16. For items scored other than 0 and 1 (essay and problem solving) • Idif = (ΣXi + ΣXj − 2N·Xmin) / 2N(Xmax − Xmin) • Where Idif is the index of difficulty • N is the number of examinees in each criterion group • Xi is the test item score of student i in the High group • Xj is the test item score of student j in the Low group • Xmax is the highest possible score on that item • Xmin is the lowest possible score on that item
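A sketch of the graded-item formula, assuming both criterion groups contain N examinees each:

```python
def difficulty_index_graded(high_scores, low_scores, x_max, x_min):
    """Idif = (sum(Xi) + sum(Xj) - 2*N*Xmin) / (2*N*(Xmax - Xmin))
    for items scored on a range (e.g., essay, problem solving)."""
    n = len(high_scores)  # N: size of each criterion group
    numerator = sum(high_scores) + sum(low_scores) - 2 * n * x_min
    return numerator / (2 * n * (x_max - x_min))
```

The subtraction of 2N·Xmin and division by the score range rescale the index to run from 0 (everyone scored the minimum) to 1 (everyone scored the maximum), consistent with the 0/1 case.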

17. 5. Solve for the discrimination index (Idis) of each item • Formula: Idis = (Hc − Lc) / N • If more students from the High group chose the correct answer than from the Low group, the index will be positive. The greater the positive value, the stronger the relationship between overall test performance and performance on that item. A negative index implies that, for some reason, students who scored low on the test were more likely to get the answer correct; this strange situation suggests poor validity for that test item.

18. Scale for interpreting Idis • 0.40 and above Very good item • 0.30 - 0.39 Reasonably good item but possibly subject to improvement • 0.20 – 0.29 Marginal item, subject to improvement • Below 0.20 Poor item, to be discarded or deleted

19. The discrimination index of a given type of test is the mean of the discrimination indices of all items • Mean = Σ Iidis / k Where k is the number of test items Iidis is the discrimination index of test item i

20. For test items scored with 0 and 1 • Idis = (Hc – Lc) / N • For items scored other than 0 and 1 • Idis = (ΣXi - ΣXj) / N (Xmax – Xmin)
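Both discrimination formulas can be sketched the same way; as above, the names are illustrative and N is taken as the size of each criterion group.

```python
def discrimination_index(hc, lc, n):
    """Idis = (Hc - Lc) / N for items scored 0 or 1."""
    return (hc - lc) / n

def discrimination_index_graded(high_scores, low_scores, x_max, x_min):
    """Idis = (sum(Xi) - sum(Xj)) / (N * (Xmax - Xmin))
    for items scored on a range."""
    n = len(high_scores)  # N: size of each criterion group
    return (sum(high_scores) - sum(low_scores)) / (n * (x_max - x_min))
```

The worked example's counts (Hc = 9, Lc = 4, N = 12) give 0.42, which the scale above rates as a very good item.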

21. 6. For multiple-choice tests, conduct distracter analysis. Three pieces of information are needed for each option of an item: • Count of students from the High group who selected each option • Count of students from the Low group who selected each option • Index of Effectiveness (IE) for each option, calculated using the index of discrimination formula

22. Interpretation of IE • If IE < 0: a negative IE indicates that the option is plausible, since more students in the Low group selected it than in the High group. • If IE > 0: a positive IE shows that more students from the High group chose the option; the option is not working well as a distracter and should be revised. • If IE = 0: if no one selects the option, it has to be discarded; if the same count of students chose it from both groups but very few, discard the option; if the same count chose it and many did, the option can be retained but improved.
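The IE calculation and a simplified version of the interpretation above can be sketched as follows; the function names are mine, and the IE = 0 branch collapses the deck's few/many distinction into a single "review" verdict.

```python
def index_of_effectiveness(high_count, low_count, n):
    """IE per option, using the discrimination formula (H - L) / N."""
    return (high_count - low_count) / n

def classify_option(ie, high_count, low_count):
    """Interpret IE for a distracter (simplified from the deck)."""
    if ie < 0:
        return "plausible distracter"
    if ie > 0:
        return "not working as a distracter; revise"
    # IE == 0: same count from both groups
    if high_count == 0 and low_count == 0:
        return "never chosen; discard"
    return "equally attractive to both groups; review"
```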

23. Example: item analysis • 1. Count and arrange the scores from highest to lowest. Ex.: n = 43 scores. • 2. Calculate the criterion-group size (N), which is 27% of the total number of scores. Ex.: N = 27% of 43 = (0.27)(43) ≈ 12. • 3. Take the top 12 scores and the bottom 12 scores; call these the High group and the Low group, respectively. • 4. Tabulate the number of responses for each option from the High and Low groups for the item under analysis.

24. 5. Solve for the difficulty index of each item • The larger the value of the index, the easier the item; the smaller, the more difficult. • Scale for interpreting the difficulty index of an item:
Below 0.25: item is very difficult
0.25 – 0.75: item is of average difficulty (rightly difficult)
Above 0.75: item is very easy

25. The following can be used to interpret the index of discrimination.

26. Interpreting the results by giving value judgment

27. Index of difficulty = (Hc + Lc) / 2N = (9 + 4) / 2(12) = 0.54, so the item is rightly difficult. • Index of discrimination = (Hc − Lc) / N = (9 − 4) / 12 = 0.42, a high index of discrimination: the item has the power to discriminate. Hence, item number 5 is to be retained. • Distracter analysis: A and C are good distracters.
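The arithmetic in this worked example can be checked directly:

```python
Hc, Lc, N = 9, 4, 12                   # counts from the example item
idif = (Hc + Lc) / (2 * N)             # index of difficulty
idis = (Hc - Lc) / N                   # index of discrimination
print(round(idif, 2), round(idis, 2))  # prints: 0.54 0.42
```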