
Stages of Test Development








  1. Stages of Test Development By Lily Novita - 69090007

  2. Make a full and clear statement of the testing ‘problem’.
  • Write complete specifications for the test.
  • Write and moderate items.
  • Trial the items informally on native speakers and reject or modify problematic ones as necessary.
  • Trial the test on a group of non-native speakers similar to those for whom the test is intended.
  • Analyse the results of the trial and make any necessary changes.
  • Calibrate scales.
  • Validate.
  • Write handbooks for test takers, test users and staff.
  • Train any necessary staff (interviewers, raters, etc.).

  3. 1. Stating the Problem
  • The essential initial step in any testing is to make perfectly clear to oneself what one wants to know and for what purpose.
  • What kind of test is to be constructed?
  • What is its precise purpose?
  • What abilities are to be tested?
  • How detailed must the results be?
  • How accurate must the results be?
  • How important is backwash?
  • What constraints are set by the unavailability of expertise, facilities and time (for construction, administration and scoring)?

  4. 2. Writing specifications for the test
  • Content
  • Operations
  • Types of text
  • Addressees of texts
  • Length of text(s)
  • Topics
  • Readability
  • Structural range
  • Vocabulary range
  • Dialect, accent, style
  • Speed of processing
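The "Readability" point above is often operationalised with a standard readability index. A minimal sketch, assuming the widely used Flesch Reading Ease formula and a crude vowel-group syllable counter (the source does not prescribe any particular measure):

```python
import re

def count_syllables(word: str) -> int:
    # Crude heuristic: count groups of consecutive vowels;
    # every word is assumed to have at least one syllable.
    groups = re.findall(r"[aeiouy]+", word.lower())
    return max(1, len(groups))

def flesch_reading_ease(text: str) -> float:
    """Flesch Reading Ease: higher scores indicate easier text."""
    sentences = [s for s in re.split(r"[.!?]+", text) if s.strip()]
    words = re.findall(r"[A-Za-z']+", text)
    syllables = sum(count_syllables(w) for w in words)
    return (206.835
            - 1.015 * (len(words) / len(sentences))
            - 84.6 * (syllables / len(words)))
```

A specification could then state, for example, that reading passages must fall within a given score band for the intended candidates.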

  5. 2. Writing specifications for the test
  • Structure, timing, medium/channel and techniques
  • Test structure
  • Number of items
  • Medium/channel
  • Timing
  • Techniques

  6. 2. Writing specifications for the test
  • Criterial levels of performance
  • Accuracy
  • Appropriacy
  • Range
  • Flexibility
  • Size

  7. 2. Writing specifications for the test
  • Scoring procedures
  • Subjectivity
  • Achieving high reliability and validity in scoring
  • What rating scale is to be used?
  • How many people will rate each piece of work?
  • How will disagreements between raters be resolved?
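The last two questions above can be made concrete with a small sketch. The agreement measure and the resolution rule below (average scores that differ by at most one point, otherwise refer the script to a third rater) are illustrative assumptions, not procedures given in the source:

```python
def exact_agreement(ratings_a, ratings_b):
    """Proportion of scripts on which two raters give identical scores."""
    assert len(ratings_a) == len(ratings_b)
    matches = sum(a == b for a, b in zip(ratings_a, ratings_b))
    return matches / len(ratings_a)

def resolve(a, b, max_gap=1):
    # Assumed resolution rule: average close scores, flag large gaps.
    if abs(a - b) <= max_gap:
        return (a + b) / 2
    return None  # refer the script to a third rater
```

The specification would name the chosen rule in advance so that all raters and administrators apply it consistently.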

  8. 3. Writing and moderating items
  • Sampling
  • Writing items
  • Moderating items

  9. 4. Informal trialling of items on native speakers

  10. 5. Trialling the test on a group of non-native speakers similar to those for whom the test is intended
  • Trials are designed to help ensure that the items function appropriately and are not confusing for the students.
  • This is often accomplished by embedding field-test items in the operational test, so that the items are taken by a representative group of motivated students under standard conditions.

  11. 6. Analysing the results of the trial and making any necessary changes
  • Two kinds of analysis should be carried out:
  • Statistical analysis: reveals qualities of the test as a whole (such as reliability) and of individual items: how difficult they are, and how well they discriminate between stronger and weaker candidates.
  • Qualitative analysis: responses are examined to discover misinterpretations, unanticipated but possibly correct answers, and other indications of faulty items.
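The item statistics mentioned above (difficulty and discrimination) have standard definitions in classical item analysis. A minimal sketch using the facility value and the upper-lower discrimination index (the 27% group size is a common convention, not something mandated by the source):

```python
def item_facility(responses):
    """Facility value: proportion of candidates answering the item correctly.
    `responses` is a list of 1 (correct) / 0 (incorrect)."""
    return sum(responses) / len(responses)

def discrimination_index(responses, totals, fraction=0.27):
    """Upper-lower discrimination index: facility in the top-scoring group
    minus facility in the bottom-scoring group (groups by total test score)."""
    n = max(1, round(len(totals) * fraction))
    order = sorted(range(len(totals)), key=lambda i: totals[i])
    low, high = order[:n], order[-n:]
    p_high = sum(responses[i] for i in high) / n
    p_low = sum(responses[i] for i in low) / n
    return p_high - p_low
```

Items with very extreme facility values or low (or negative) discrimination are the usual candidates for revision or rejection after the trial.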

  12. 7. Calibrating scales
  • This means collecting samples of performance which cover the full range of the scales.
  • More generally, calibration is a procedure in which an instrument or scale is checked to confirm that it conforms to a standard. For a rating scale, calibration ensures that the levels described in the scale correspond to actual samples of performance, so that raters interpret and apply the scale consistently.
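One way to turn expert placements of performance samples into calibrated scale points is to take a robust central value of the placements. This is a sketch of an assumed procedure (the source does not spell one out):

```python
from statistics import median

def calibrate(sample_placements):
    """Each expert assigns a performance sample to a point on the scale;
    the sample's calibrated level is taken as the median of those placements."""
    return {sample: median(points) for sample, points in sample_placements.items()}
```

The calibrated samples can then serve as benchmark exemplars for each scale level during rater training.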

  13. 8. Validation
  • Essential validation: for high-stakes or published tests.
  • Small-scale validation: for low-stakes tests used within an institution.

  14. 9. Writing handbooks for test takers, test users and staff (contents)
  • The rationale for the test
  • An account of how the test was developed and validated
  • A description of the test
  • Sample items
  • Advice on preparing for taking the test
  • An explanation of how test scores are to be interpreted
  • Training materials
  • Details of test administration

  15. 10. Training Staff
  • All staff who will be involved in the testing process should be trained: interviewers, raters, scorers, computer operators and invigilators.
