
The Assessment Process


Presentation Transcript


  1. The Assessment Process A continuous cycle: Establishing Objectives → Selecting/Designing Instruments → Collecting Information → Analyzing/Maintaining Information → Using Information → (back to Establishing Objectives)

  2. Which tool do I use? • Choose the tool(s) that best addresses your objective(s). • Choose the tools that are best for your situation, audience, and resources (tables in notebook). • Time, money, resources, skill and philosophy strongly influence choices. • No tool is perfect in all situations.

  3. Which Assessment Tool to Use? A continuum from testing to performance assessment (from Gronlund, 2003): Selected Response → Supply Response → Restricted Performance → Extended Performance. Moving along the continuum, realism of tasks, complexity of tasks, testing time needed, and judgment in scoring all run from LOW to HIGH.

  4. Tests & Questionnaires Assess: • Knowledge – what people know • Beliefs, attitudes and opinions – the perceptions people hold • Behavior – what people do • Attributes – age, education, occupation, gender, ethnicity

  5. Questionnaires Data collection instruments (printed or oral) used to collect information from many respondents about attitudes, skills, and behaviors in a non-threatening way.

  6. When to use Questionnaires If you: • Want self-reported data about attitudes, skills, and behaviors • Need to collect data from a large group

  7. Advantages of Questionnaires • Can reach a large number of people = lots of data • Closed-ended responses can be tallied and compared easily (see the tally sketch below) • Can be completed on the respondent’s own time; respondents have time to think about answers • Relatively easy to administer and grade • Can be confidential or anonymous • Open-ended questions can be included
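Tallying closed-ended responses is straightforward to automate. A minimal Python sketch, assuming responses to the Agree/Disagree item from the two-way example later in the deck have already been collected into a list (the data here are made up):

```python
from collections import Counter

# Hypothetical closed-ended responses to "Plants are an important
# part of my life" (Agree/Disagree).
responses = ["Agree", "Agree", "Disagree", "Agree", "Disagree"]

# Tally each response option and report counts and percentages.
tally = Counter(responses)
total = len(responses)
for option, count in tally.most_common():
    print(f"{option}: {count} ({count / total:.0%})")
```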

  8. Disadvantages of Questionnaires • Participants may not answer honestly • Participants may skip items • Questions may be answered improperly and the data analyst will never know • Wording of items can bias responses • Impersonal • May need statistical expertise to analyze • Limited options for unanticipated responses • Does not yield the whole story/picture

  9. Tests • Data-gathering tools used to judge knowledge or skill level • Should give a good idea of students’ current level of knowledge or skill • A snapshot in time

  10. When to use Tests • Need to assess individuals’ knowledge and/or skill level • Want to know performance at a specific point in time; does not predict past or future performance • Want to assess change over time (repeated measures)

  11. Advantages of Tests • Can identify level of knowledge or skill • Helps identify problems or deficiencies • Results are easily quantified • Individual performance easily compared • Helps determine if intervention has changed knowledge or skill level
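To see whether an intervention changed knowledge or skill level, the same test can be given before and after and a gain computed per learner. A minimal sketch, assuming paired pre/post scores for the same learners (all scores are hypothetical):

```python
# Hypothetical paired scores (same learners, same test, before/after).
pre_scores  = [55, 60, 48, 72, 65]
post_scores = [70, 68, 60, 80, 66]

# Gain score for each learner, then the average gain across the group.
gains = [post - pre for pre, post in zip(pre_scores, post_scores)]
mean_gain = sum(gains) / len(gains)
print(f"Individual gains: {gains}")
print(f"Mean gain: {mean_gain:.1f} points")
```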

  12. Disadvantages of Tests • Valid tests are difficult to construct • May not assess all that students know/can do • Language or vocabulary can be an issue • Students may be concerned about how the test will be used

  13. Test & Questionnaire “Issues” • Reliability & Validity * • Response Rates • Readability • Testwiseness * • Question Types * • Errors * • Good questions - Problem questions *

  14. Reliability Reliability – the consistency of your measurement, or the degree to which an instrument measures the same way each time it is used under the same conditions with the same subjects. • Does the instrument (method or tool) produce consistent results? • Are learners taking the same test? • Consistency does not guarantee truthfulness. • Much easier to demonstrate than validity.
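One standard way to quantify consistency (not named on the slide) is an internal-consistency coefficient such as Cronbach’s alpha: α = k/(k−1) × (1 − Σ item variances / total-score variance). A minimal Python sketch, assuming numerically scored items; the scores are hypothetical:

```python
from statistics import variance

def cronbach_alpha(item_scores):
    """Internal-consistency reliability for a list of item-score columns.

    item_scores: one list per item, each with one score per respondent.
    """
    k = len(item_scores)
    respondents = list(zip(*item_scores))       # rows = respondents
    totals = [sum(row) for row in respondents]  # total score per respondent
    item_var_sum = sum(variance(item) for item in item_scores)
    return (k / (k - 1)) * (1 - item_var_sum / variance(totals))

# Hypothetical 3-item test, 5 respondents (scores are made up).
items = [
    [4, 5, 3, 4, 2],
    [4, 4, 3, 5, 2],
    [3, 5, 2, 4, 3],
]
print(f"Cronbach's alpha: {cronbach_alpha(items):.2f}")
```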

  15. Validity Validity – the extent to which a test measures what it is supposed to measure. Validity is a property of the meaning of the test scores. • Is the instrument (method or tool) an appropriate one to measure what you want to know? • Does the instrument measure what it is supposed to measure? • Does the instrument give you the true story; does it detect the real ability or skill that is important? • Valid instruments are true and accurate.

  16. Reliability and Validity To be useful, instruments must be valid & reliable – • true, • accurate, AND • consistent

  17. Design and Format 1. Title 2. Introductory statements • State the purpose of the test or questionnaire • Clarify terms • Show regard • Provide contact information 3. Questions • Questions should have a natural flow • Include directions • Use transition statements to signal a new topic • Write several questions targeting constructs of interest

  18. 4. Format • Start: Easy-to-answer items. On tests, arrange in ascending order of difficulty. • Middle: Include the most important items • End: Ask the most personal items • Throughout: Group items into coherent categories by themes and types of items. Include more than one type of item, especially if the questionnaire/test is long. • Throughout: Include directions for how to complete items 5. Check readability. Keep the reading level as low as possible (see the readability sketch below). 6. Review, pilot, and review the instrument 7. No timed tests. Measure knowledge, not speed.
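One common readability check (the slides don’t prescribe a specific formula) is the Flesch-Kincaid grade level: 0.39 × (words/sentences) + 11.8 × (syllables/words) − 15.59. A rough Python sketch; the syllable count is a crude vowel-run heuristic that overcounts silent vowels, so treat the result as a ballpark figure:

```python
import re

def fk_grade(text):
    """Rough Flesch-Kincaid grade level with a crude syllable heuristic."""
    sentences = max(1, len(re.findall(r"[.!?]+", text)))
    words = re.findall(r"[A-Za-z']+", text)
    # Approximate syllables as runs of vowels (a crude heuristic).
    syllables = sum(max(1, len(re.findall(r"[aeiouy]+", w.lower())))
                    for w in words)
    return (0.39 * (len(words) / sentences)
            + 11.8 * (syllables / len(words)) - 15.59)

item = "Indicate whether you agree or disagree: plants are an important part of my life."
print(f"Approximate grade level: {fk_grade(item):.1f}")
```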

  19. Test Wiseness • Test sophistication, a test-taker’s ability to use the characteristics and formats of the test &/or the test-taking situation to increase his or her score. • The goal of cognitive/knowledge tests is to measure the test-takers’ best or highest level of performance. • Many extraneous factors can influence the measurement of maximum performance.

  20. Types of Questions • Fixed choice or closed-ended items – multiple choice, matching, true/false • Constructed response or open-ended items – essay tests, open-ended questions, fill-in-the-blank

  21. Fixed Choice vs. Open-ended Items • Learning outcomes measured – Fixed choice: good for measuring outcomes at lower levels of learning (e.g., knowledge, comprehension, and application); inadequate for organizing and expressing ideas. Open-ended: inefficient for measuring knowledge outcomes; best for the ability to organize, integrate, and express ideas. • Sampling of content – Fixed choice: the use of a large number of items results in broad coverage, which makes representative sampling of content feasible. Open-ended: the use of a small number of items limits coverage, which makes representative sampling of content infeasible.

  22. Fixed Choice vs. Open-ended Items • Preparation of items – Fixed choice: preparation of good items is difficult and time consuming. Open-ended: preparation of good items is difficult, but easier than fixed choice items. • Scoring – Fixed choice: objective, simple, and highly reliable. Open-ended: subjective, difficult, and less reliable. • Factors distorting scores – Fixed choice: reading ability and guessing. Open-ended: writing ability and bluffing. • Probable effect on learning – Fixed choice: encourages learners to remember, interpret, and use the ideas of others. Open-ended: encourages learners to organize, integrate, and express their own ideas.

  23. Constructed Response Questions • Allow respondents to answer in their own words • Useful when examples, explanations, and experiences are sought • Often used in a “pre-survey” to identify closed-question response options • Can provide answers you didn’t consider • Can be difficult to analyze • More demanding to complete, so use sparingly Example: What did you like most about your experience today?

  24. Fixed Choice Questions • Provide responses for people to mark • Responses are more uniform • Easily tabulated • Easily misinterpreted • May limit respondents’ choices, resulting in less meaningful responses • Many types...

  25. Two-Way Questions • Only two responses are provided, on the ends of a scale: Yes/No; True/False; Agree/Disagree • Easy to write • Easy to score • Easy to guess • Easy to misinterpret Example: Indicate whether you agree or disagree with the following statement: a. Plants are an important part of my life. [ ] Agree [ ] Disagree

  26. Multiple Choice Questions • Useful to test knowledge • Used to collect demographic data • Easy to write • Responses are independent; not inter-related • Categories should be mutually exclusive • Categories should be exhaustive • Easy to score Example: Which of the following is a renewable resource? a.) oil b.) iron ore c.) trees d.) coal
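Scoring fixed-choice items against an answer key is easy to automate. A minimal sketch, assuming answers are stored by item number; the key and responses are made up, with item 1 mirroring the renewable-resource example above:

```python
# Hypothetical answer key and one respondent's answers, keyed by item number.
answer_key = {1: "c", 2: "a", 3: "d"}   # e.g., item 1: "trees" is correct
answers    = {1: "c", 2: "b", 3: "d"}

# Count items where the respondent's answer matches the key.
correct = sum(1 for item, key in answer_key.items() if answers.get(item) == key)
print(f"Score: {correct}/{len(answer_key)} ({correct / len(answer_key):.0%})")
```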

  27. Categorical Questions • Responses are mutually exclusive categories of your choosing • Results depend on the quality of the choices • Categories should be exhaustive • Easy to score Example: Indicate the dollar amount you contributed to conservation or environmental organizations in 2004. • $0 • $1-50 • $51-100 • More than $100
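Mutually exclusive, exhaustive categories mean every possible answer lands in exactly one bin. A minimal sketch using the dollar-amount categories from the example above (the test amounts are hypothetical):

```python
def dollar_category(amount):
    """Map a dollar amount to one of the mutually exclusive,
    exhaustive categories from the example question."""
    if amount == 0:
        return "$0"
    elif amount <= 50:
        return "$1-50"
    elif amount <= 100:
        return "$51-100"
    else:
        return "More than $100"

for amount in [0, 25, 50, 51, 100, 250]:   # hypothetical contributions
    print(f"${amount} -> {dollar_category(amount)}")
```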

  28. Matching Questions • Match term with example: • animal with season, person with invention, etc. • Often used to analyze knowledge • Not too threatening for most respondents • Easy to score

  29. Example: The following columns include a list of different habitats (left) & different organisms (right). Place the letter for each organism in the blank space next to the kind of habitat in which it is most commonly found. _____ Tidal Salt Marsh a. blue tangs _____ Open Ocean b. polar bears _____ Sandy Beach c. tuna _____ Arctic Ocean d. ghost crabs _____ Coral Reef e. oysters Adapted from Marcinkowski (1993)

  30. Rating Scales • Respondents circle a number or mark a scale that provides a relative value • Choices must match content • 5 choices are ideal (or 4, depending on the nature of the question) • Endpoints should be equally intense (e.g., Strongly disagree – Strongly agree) • Orient scales similarly (1=low; 5=high); see the reverse-coding sketch below
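Orienting scales similarly usually means reverse-coding negatively worded items before combining them. A minimal sketch for a 5-point scale (the item scores are hypothetical):

```python
# Hypothetical 5-point Likert responses (1 = strongly disagree,
# 5 = strongly agree). Negatively worded items must be reverse-coded
# so all items point the same direction (1 = low; 5 = high).
SCALE_MAX = 5

def reverse_code(score, scale_max=SCALE_MAX):
    return scale_max + 1 - score

positively_worded = [4, 5, 3]        # keep as-is
negatively_worded = [2, 1, 4]        # flip: 2->4, 1->5, 4->2

aligned = positively_worded + [reverse_code(s) for s in negatively_worded]
print(f"Aligned scores: {aligned}")
print(f"Mean scale score: {sum(aligned) / len(aligned):.2f}")
```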

  31. Example: [Image: Likert item example from an educator survey]

  32. Ranking Questions • People rank items 1st, 2nd, 3rd … to provide their priority • Most helpful when list is short Example: Rank the following items (1 being the highest; 3 being the lowest) as to what you feel are the 3 most important environmental issues concerning the United States.  ___ Air pollution ___ Water pollution ___ Urban sprawl ___ Global warming ___ Other, _________________
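Ranked responses need an aggregation rule before groups can be compared; one common choice (not specified on the slide) is Borda-style points. A minimal sketch using the issue list from the example above (the rankings are made up):

```python
from collections import defaultdict

# Hypothetical rankings: each respondent lists their top 3 issues in order.
rankings = [
    ["Water pollution", "Air pollution", "Global warming"],
    ["Global warming", "Water pollution", "Urban sprawl"],
    ["Water pollution", "Global warming", "Air pollution"],
]

# Borda-style points: 3 for 1st place, 2 for 2nd, 1 for 3rd.
points = defaultdict(int)
for ranking in rankings:
    for place, issue in enumerate(ranking):
        points[issue] += 3 - place

for issue, score in sorted(points.items(), key=lambda kv: -kv[1]):
    print(f"{issue}: {score} points")
```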

  33. Errors – To Reduce: Review, Pilot, Revise... • Write several items/questions for each purpose • Design the items/questions • Have colleagues and other professionals critique them • Revise based on feedback • If possible, pilot with a small sample of students; rewrite or remove problematic items • Talk through responses & any questions • Revise; reduce the number of test items yet sample all important content • Review & pilot again, until variation & confusion disappear

  34. Question Dos • Adapt wording and vocabulary to the reading skills of your respondents • Ensure questions are clearly worded – pilot • Use complete sentences or phrases • Use reasonable time frames • Include all necessary information • Make response choices clear and logical • Make personal questions less objectionable • Use mutually exclusive categories • Make items as clear, short & specific as possible

  35. Question Don’ts • Use abbreviations and jargon • Offer double-barreled questions (e.g., _____ and _____, _____ or _____) • Use leading or biased questions • Include double negatives • Write questions that are too precise or too vague

  36. Practice • Problem questions exercise • In groups of 2, look at the Problem Questions • What could be improved about each question? • Then we will share as a group

  37. Practice Based on your evaluation plan: • Design an instrument to collect information of interest to your project • Share with group
