Developing Assessment Instruments


Presentation Transcript


  1. Developing Assessment Instruments Instructional Design: Unit 3, Design Phase

  2. Criterion-Referenced Tests • Designed to measure explicit behavioral objectives • Allow instructors to decide how well learners have met the objectives that were set forth • Used to evaluate: • learner performance • effectiveness of the instruction

  3. Criterion-Referenced • Also called objective-referenced or domain-referenced • Refers directly to an explicit “criterion” or specified performance • A criterion-referenced test must: • match each test item to its performance objective • indicate the degree of mastery of the skill

  4. Types of Criterion-Referenced Tests Dick, Carey and Carey discuss four different types of criterion-referenced tests that fit into the design process: • Entry Behaviors Test • Pretest • Practice Tests • Posttests

  5. Types of Criterion Tests Entry behaviors test: • Consists of items that: • measure entry behavior (prerequisite) skills • are drawn from skills below the entry behavior line, not from the skills to be taught • Helps determine the appropriateness of the required entry skills • Used during the formative evaluation process; may be discarded in the final version of the instruction

  6. Types of Criterion Tests Pretest: • Used to determine whether learners have previously mastered some or all of the skills that are to be included in the instruction • The entry behaviors test (EBT) determines whether students are ready to begin your instruction; the pretest (PT) helps determine which skills in your main instructional analysis students may already have mastered

  7. Types of Criterion Tests Practice test: • Provides active learner participation during instruction • Enables learners to rehearse the new knowledge and skills they are being taught • Also allows instructors to provide corrective feedback to keep learners on track

  8. Types of Criterion Tests Posttest: • Administered following instruction; parallel to the pretest • Assesses all the objectives, focusing on terminal objectives • Helps identify ineffective instructional segments • Used during the design process and may eventually be modified to measure only terminal objectives

  9. Using the instructional analysis diagram on this slide, indicate by box number(s) the skills that should be used to develop test items for: • Entry behaviors test: ………. • Pretest: ………… • Posttest: ……….. [Instructional analysis diagram not transcribed: boxes 1–4 sit below the entry behaviors line; boxes 5–13 are the skills for instruction, leading to terminal skill 14.]

  10. Designing Tests for Learning Domains • Intellectual & Verbal Information • paper & pencil, short-answer, matching, and multiple-choice. • Attitudinal • state a preference or choose an option • Psychomotor • performance quantified on checklist • subordinate skills tested in paper-and-pencil format

  11. Determining Mastery Levels • Approach # 1 • mastery defined as the level of performance normally expected from the best learners • arbitrary (norm-referenced, group-comparison method) • Approach # 2 • mastery defined in statistical terms, beyond mere chance • the mastery level varies with the critical nature of the task • example: operating a nuclear plant vs. painting a house • ultimately, mastery is the level required in order to be successful on the job
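The “beyond mere chance” criterion in Approach # 2 can be made concrete with a short calculation. The following is a minimal illustrative sketch, not from Dick, Carey and Carey; the item count, option count, and cutoff are assumed values. It computes the probability that a learner reaches a given score purely by guessing on a multiple-choice test, so a mastery level can be set where that probability is negligible.

```python
from math import comb

def p_score_by_guessing(n_items: int, n_options: int, cutoff: int) -> float:
    """Probability of scoring at least `cutoff` correct by pure guessing
    on a test of `n_items` items, each with `n_options` choices."""
    p = 1 / n_options  # chance of guessing any one item correctly
    return sum(
        comb(n_items, k) * p**k * (1 - p) ** (n_items - k)
        for k in range(cutoff, n_items + 1)
    )

# Assumed example: 10 four-option items. A mastery cutoff of 7 correct
# leaves only about a 0.35% chance of passing by guessing alone.
print(f"P(>= 7 correct by guessing) = {p_score_by_guessing(10, 4, 7):.4f}")
```

For a critical task (the nuclear plant case), the cutoff would be pushed higher still, since the cost of a false “mastery” decision is far greater.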

  12. Writing Test Items What should test items do? • Match the behavior of the objective • Use the correct “verb” to specify the behavior • Match the conditions of the objective

  13. Writing Test Items How many test items do you need? • Determined by the learning domain • Intellectual skills require three or more items per objective • For objectives covering a wide range of instances, use a random sample of items (see the sketch below)
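For the “wide range” case, here is a minimal sketch of random item sampling. The objective (two-digit addition) and pool are hypothetical, chosen only to show why sampling is needed: the full instance pool is far too large to test exhaustively.

```python
import random

# Hypothetical item pool for one intellectual-skill objective:
# every two-digit addition problem (far too many to test them all).
item_pool = [(a, b) for a in range(10, 100) for b in range(10, 100)]

def draw_items(pool, n_items=3, seed=None):
    """Draw a random sample of items; three or more per
    intellectual-skill objective, per the guideline above."""
    return random.Random(seed).sample(pool, n_items)

for a, b in draw_items(item_pool, n_items=3, seed=42):
    print(f"{a} + {b} = ____")
```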

  14. Writing Items (continued) What types (true/false, multiple choice, etc.) to use? • Clues are provided by the behavior listed in the objective • Review the “Types of Test Items” table in this chapter (p. 148) for each test: • Entry behaviors test • Pretest • Practice test • Posttest

  15. Writing Items (continued) Item types tempered by: • amount of testing time • ease of scoring • amount of time to grade • probability of guessing • ease of cheating, etc. • availability of simulations

  16. Writing Items (continued) What types are inappropriate? • True/false items for definitions • they test discrimination, not knowledge of the definition • Acceptable alternatives • “best possible answer” items • for simulations, listing the steps

  17. Constructing Test Items Consider: • vocabulary • setting of the test item (familiar vs. unfamiliar) • clarity • all necessary information included • no trick questions • no double negatives, misleading information, etc.

  18. Other Factors • Sequencing Items • Consider clustering by objective • Test Directions • Clear and concise • General • Section specific • Evaluating Tests / Test Items

  19. Measuring Performance, Products, & Attitudes • Write directions to guide learner activities • Construct an instrument to evaluate these activities • a product, performance, or attitude • Sometimes includes both a process and a product

  20. Test Directions for Performance, Products, & Attitudes • Determine: • the amount of guidance • special conditions • time limits, special steps, etc. • the nature of the task (i.e., complexity) • the sophistication level of the audience

  21. Assessment Instruments for Performance, Products, & Attitudes • Identify the elements to be evaluated • cleanliness, finish, tolerance of the item, etc. • Paraphrase each element • Sequence the items on the instrument • Select the type of judgment the rater will make • Determine instrument scoring

  22. Formats for Assessments of Performance, Products, & Attitudes • Checklist • Rating Scale • Frequency Counts • Etc.
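To show the kind of judgment each format asks of the rater, here is a minimal sketch; the performance elements and the 1–5 scale are invented for illustration, not taken from the source.

```python
# One invented performance element scored in each format:
checklist = {"work area left clean": True}      # yes/no judgment
rating_scale = {"work area left clean": 4}      # quality on a 1-5 scale
frequency_count = {"safety step skipped": 2}    # tally of observed occurrences

for label, record in [("Checklist", checklist),
                      ("Rating scale", rating_scale),
                      ("Frequency count", frequency_count)]:
    print(f"{label}: {record}")
```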

  23. Evaluating Congruency • Skills, Objectives, & Assessments should refer to the same behaviors • To check for congruency • Construct Congruency Evaluation Chart • include: Subskills, Behavioral Objectives, & Test Items
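As a minimal sketch of the congruency check (the subskills, objectives, and item IDs below are invented for illustration), the chart can be held as rows and scanned for any objective that no test item covers:

```python
# Hypothetical congruency evaluation chart: each row ties a subskill to
# its behavioral objective and the test items written for it.
chart = [
    {"subskill": "5", "objective": "Classify search results by relevance",
     "test_items": ["item-1", "item-2", "item-3"]},
    {"subskill": "6", "objective": "State criteria for source credibility",
     "test_items": []},  # no items yet -- a congruency gap
]

for row in chart:
    if not row["test_items"]:
        print(f"Subskill {row['subskill']} ({row['objective']!r}) "
              f"has no matching test items -- congruency gap")
```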

  24. Design Evaluation Chart [Chart image not transcribed; per the previous slide, it pairs subskills, behavioral objectives, and test items.]

  25. Example • Objective 1: Given a research topic and a list of ten Google search results, select the three web sites most appropriate to the research topic. • What will they need to do? The learners should be able to select web sites from a list of search results. • What conditions will need to be provided? The learners will need to be given a predetermined research topic and a list of actual Google search results related to that topic. • Domain: Intellectual Skills: Rules. Students have to apply a set of criteria in order to make a decision. • This objective will require a fill-in-the-blank test item, as the students will have to write down the three most appropriate sites based on certain criteria. • Test Item 1: • Take a look at the following Google search results: (show screen capture of search results). Which 3 web sites are likely to have specific and relevant information dealing with the subject of Life on Mars?
