
TESTING


Presentation Transcript


  1. TESTING PRESENTED BY MWESIGYE NELSON MASTER IN EDUCATION MGT (MAKERERE UNIVERSITY) BACHELOR OF EDUCATION (MAKERERE UNIVERSITY)

  2. What is testing? • Definition: Testing is the practice of making objective judgments regarding the extent to which the student meets, exceeds, or fails to meet stated objectives.

  3. Qualities of a good test • Validity: A test is considered valid when it measures what it is supposed to measure; that is, the validity of a test is the extent to which the test measures what it is intended to measure. In order to be valid,

  4. Validity cont. • a) A test or an exam must test what has been taught. b) A test or an exam must not test things which have not been taught. c) A test or an exam must test only the concepts taught, not intelligence, memory, or general knowledge. A test is said to be valid if it measures what it claims to measure.

  5. Validity cont. • Validity is a unified concept. As such, it has several dimensions or aspects: • Face Validity: It relates to how a test looks to other people (students, experts, etc.).

  6. Validity cont. • Example: a grammar test should test grammar, not vocabulary. Thus, in a grammar test the vocabulary should be easy, and vice versa. Likewise, in an accounts test the grammar and vocabulary should be simple, because they are not what you are testing.

  7. Validity cont. • Content Validity: A test needs to contain a representative sample of the teaching/instructional content as defined and covered in the module. How do we know that a test is valid in content? By using a blueprint. Example: a grammar test should ideally cover every part of the grammar, but since we cannot include everything, we take a sample that represents the content.

  8. Validity cont. • Empirical Validity: It relates to the closeness between the score obtained from a test and other criteria outside that test. It is divided into two: a) Concurrent validity: how well the test estimates current performance on some valued measure other than the test itself, e.g. TOEFL and TOEIC: if someone's TOEFL score is high, we expect his/her TOEIC score to be high too.

  9. Validity cont. • b) Predictive validity: how well the test predicts future performance on some valued measure other than the test itself, e.g. GPA: if someone's GPA is high, we can predict that s/he will get a well-paid job. In both cases the evidence is the agreement between the test scores and the external measure, as sketched below.
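As a rough illustration of the point above, empirical validity is usually expressed as a correlation between the test scores and scores on an external criterion. The following minimal Python sketch (with invented scores and a hand-rolled Pearson correlation, not taken from the presentation) shows how such a validity coefficient could be computed:

    from math import sqrt

    def pearson_r(x, y):
        # Correlation between two paired score lists; used here as a validity coefficient.
        n = len(x)
        mean_x, mean_y = sum(x) / n, sum(y) / n
        cov = sum((a - mean_x) * (b - mean_y) for a, b in zip(x, y))
        var_x = sum((a - mean_x) ** 2 for a in x)
        var_y = sum((b - mean_y) ** 2 for b in y)
        return cov / sqrt(var_x * var_y)

    # Hypothetical scores of ten students on the test and on an external criterion
    # (e.g. another recognised test taken at about the same time).
    test_scores      = [55, 62, 70, 48, 81, 90, 66, 74, 58, 85]
    criterion_scores = [50, 65, 72, 45, 78, 92, 60, 70, 55, 88]

    print(round(pearson_r(test_scores, criterion_scores), 2))  # about 0.98: strong agreement

A coefficient near 1 supports concurrent validity; if the criterion is a later measure (future grades, job performance), the same calculation speaks to predictive validity.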

  10. Validity cont. • Construct Validity: It relates to how well the test is built from our understanding of the relevant existing theory. Example: if we want to construct an organisational theory test, we have to find, compare, and critique theories of organisations, not of something else. Construct validity requires two levels of analysis:

  11. Validity cont. • Logical Analysis: the test is derived step by step, from Dimension (the standard of competence, here at university level), to Variable (the basic competences, i.e. what has been covered in the module), to Sub-variable, to Indicator, and finally to the Test items.

  12. Validity cont. • Empirical Analysis: whether the test items measure what is defined by their indicators in the module. • Washback Validity: It relates to the influence of a test on the teaching-learning process. There are two types of washback: Positive Washback, at the Micro Level (classroom setting):

  13. Validity cont. • Tests induce teachers to cover their subjects more thoroughly. Tests make students work harder. Tests encourage positive teaching-learning processes. At the Macro Level (educational/societal system), decision makers (government) use the authority of high-stakes testing to achieve the goals of teaching and learning, such as the introduction of new textbooks and new curricula.

  14. Validity cont. • Negative Washback, at the Micro Level (classroom setting): Tests encourage teachers to build a “teaching to the test” curriculum. Tests bring anxiety to both teachers and students and distort their performance. Tests push students to learn only the discrete points of knowledge that are tested.

  15. Validity cont. Tests can lead students to form negative judgments about testing and undermine their learning motivation. At the Macro Level (educational/societal system), decision makers may use tests to promote their political agendas and to seize influence and control over educational systems.

  16. Reliability • A test is considered reliable if, when it is taken again by the same students under the same circumstances, the average score remains almost constant, provided the time between the test and the retest is of reasonable length. There are three aspects of reliability: the circumstances in which the test is taken, the way in which it is marked, and the uniformity of the assessment it makes.
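The test-retest idea in the slide above can be checked with simple arithmetic: give the same test twice to the same students and compare the averages and each student's change in score. The figures below are invented for illustration only; the Pearson correlation from the earlier validity sketch could also be applied to these pairs to obtain a test-retest reliability coefficient.

    # Hypothetical scores of five students on a test and on its retest.
    first_sitting  = [72, 58, 85, 64, 90]
    second_sitting = [70, 60, 83, 66, 91]

    mean_first = sum(first_sitting) / len(first_sitting)
    mean_second = sum(second_sitting) / len(second_sitting)
    shifts = [b - a for a, b in zip(first_sitting, second_sitting)]

    # For a reliable test the two averages are almost equal
    # and the individual shifts are small.
    print(mean_first, mean_second)  # 73.8 and 74.0
    print(shifts)                   # [-2, 2, -2, 2, 1]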

  17. Reliability cont. • In order to be reliable, a test or an exam a) must have instructions which are equally clear to all the pupils, b) must allow an appropriate amount of time, c) must include enough questions, d) should not allow students to choose which questions they answer, and e) must have only one correct answer per question.

  18. Reliability cont. • Objectivity: Objectivity means that if the test is marked by different people, the score will be the same. In other words, the marking process should not be affected by the marker's personality.

  19. Reliability cont. • Comprehensiveness: A good test should include items from the different areas of the material assigned for the test, e.g. dialogue, composition, comprehension, grammar, vocabulary, orthography, dictation, handwriting.

  20. Reliability cont. • Simplicity: Simplicity means that the test should be written in clear, correct, and simple language. It is important to keep the method of testing as simple as possible while still testing the skill you intend to test. (Avoid ambiguous questions and ambiguous instructions.)

  21. Reliability cont. • Scorability: Scorability means that each item in the test has its own mark, related to the distribution of marks given by the academic department. • The amount of content demanded should match the score allocated.

  22. Bloom's Taxonomy of Learning Domains • Bloom's Taxonomy was created in 1956 under the leadership of educational psychologist Dr. Benjamin Bloom in order to promote higher forms of thinking in education, such as analyzing and evaluating, rather than just remembering facts (rote learning).

  23. Bloom's taxonomy cont. • Three types of learning were highlighted. • Cognitive: mental skills (knowledge). • Affective: growth in feelings or emotional areas (attitude or self). • Psychomotor: manual or physical skills (skills).

  24. Cognitive Domain • The cognitive domain involves knowledge and the development of intellectual skills (Bloom, 1956). This includes the recall or recognition of specific facts, procedural patterns, and concepts that serve in the development of intellectual abilities and skills. There are six major categories, listed in order below from the simplest behavior to the most complex. The categories can be thought of as degrees of difficulty; that is, the first ones must normally be mastered before the next ones can take place.

  25. Bloom's pyramid

  26. Knowledge (Remembering) • Recalling memorized information: this may involve remembering a wide range of material, from specific facts to complete theories, but all that is required is the bringing to mind of the appropriate information. It represents the lowest level of learning outcomes in the cognitive domain.

  27. Knowledge cont. • Learning objectives at this level: know common terms, know specific facts, know methods and procedures, know basic concepts, know principles. • Question verbs: arranges, defines, describes, identifies, labels, lists, matches, names, outlines, recalls, recognizes, reproduces, selects, states.

  28. Comprehension (Understanding) • This calls for the ability to grasp the meaning of given material. It involves translating material from one form to another (words to numbers), interpreting material (explaining or summarizing), and estimating future trends (predicting consequences or effects). It goes one step beyond the simple remembering of material, and represents the lowest level of understanding.

  29. Comprehension cont. • Learning objectives at this level: understand facts and principles, interpret verbal material, interpret charts and graphs, translate verbal material to mathematical formulae, estimate the future consequences implied in data, justify methods and procedures. • Question verbs: comprehends, converts, defends, distinguishes, estimates, explains, extends, generalizes, gives an example, infers, interprets, paraphrases, predicts, rewrites, summarizes, translates.

  30. Application (Applying) • This calls for the ability to use learned material in new and concrete situations. It involves applying rules, methods, concepts, principles, laws, and theories. Learning outcomes in this area require a higher level of understanding than those under comprehension.

  31. Application cont. • Learning objectives at this level: apply concepts and principles to new situations, apply laws and theories to practical situations, solve mathematical problems, construct graphs and charts, and demonstrate the correct usage of a method or procedure. • Question verbs: applies, changes, computes, constructs, demonstrates, discovers, manipulates, modifies, operates, predicts, prepares, produces, relates, shows, solves, uses.

  32. Analysis (Analyzing) • This calls for the ability to break down material into its component parts: identifying the parts, analyzing the relationships between parts, and recognizing the organizational principles involved. Learning outcomes here represent a higher intellectual level than comprehension and application because they require an understanding of both the content and the structural form of the material.

  33. Analysis cont. • Learning objectives at this level: recognize unstated assumptions, recognize logical fallacies in reasoning, distinguish between facts and inferences, evaluate the relevancy of data, analyze the organizational structure of a work (art, music, writing). • Question verbs: analyzes, breaks down, compares, contrasts, diagrams, deconstructs, differentiates, discriminates, distinguishes, identifies, illustrates, infers, outlines, relates, selects, separates.

  34. Synthesis (Creating) • (By definition, synthesis cannot be assessed with multiple-choice questions. It appears here to complete Bloom's taxonomy.) • This calls for the ability to put parts together to form a new whole. This may involve the production of a unique communication (theme or speech), a plan of operations (research proposal), or a set of abstract relations (scheme for classifying information). Learning outcomes in this area stress creative behaviors, with major emphasis on the formulation of new patterns or structures.

  35. Synthesis cont. • Learning objectives at this level: write a well organized paper, give a well organized speech, write a creative short story (or poem or music), propose a plan for an experiment, integrate learning from different areas into a plan for solving a problem, formulate a new scheme for classifying objects (or events, or ideas). • Question verbs: categorizes, combines, compiles, composes, creates, devises, designs, explains, generates, modifies, organizes, plans, rearranges, reconstructs, relates, reorganizes, revises, rewrites, summarizes, tells, writes.

  36. Evaluation (Evaluating) • This calls for the ability to judge the value of material (statement, novel, poem, research report) for a given purpose. The judgments are to be based on definite criteria, which may be internal (organization) or external (relevance to the purpose). The student may determine the criteria or be given them. Learning outcomes in this area are highest in the cognitive hierarchy because they contain elements of all the other categories, plus conscious value judgments based on clearly defined criteria.

  37. Evaluation cont. • Learning objectives at this level: judge the logical consistency of written material, judge the adequacy with which conclusions are supported by data, judge the value of a work (art, music, writing) by the use of internal criteria, judge the value of a work (art, music, writing) by the use of external standards of excellence. • Question verbs: appraises, compares, concludes, contrasts, criticizes, critiques, defends, describes, discriminates, evaluates, explains, interprets, justifies, relates, summarizes, supports.

  38. Constructing test items • Test items should be constructed with the help of a test blueprint. A blueprint is a systematic and all-inclusive plan that ensures the exam covers the objectives of the module taught in a given period, as illustrated in the sketch below.
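A test blueprint (table of specifications) is, in essence, a two-way table crossing the content areas of the module with the cognitive levels of Bloom's taxonomy and stating how many items (or marks) each cell receives. The sketch below shows one possible way to tabulate such a blueprint; the topics, levels, and item counts are hypothetical and only illustrate the mechanics.

    # Hypothetical blueprint: content area -> number of items at each cognitive level.
    blueprint = {
        "Ledger entries": {"Knowledge": 3, "Comprehension": 2, "Application": 4},
        "Trial balance":  {"Knowledge": 2, "Comprehension": 3, "Application": 3},
        "Final accounts": {"Knowledge": 1, "Comprehension": 2, "Application": 2, "Analysis": 3},
    }

    # Totals per topic and for the whole paper, so the weighting of the test
    # can be compared with the emphasis given to each topic in teaching.
    for topic, levels in blueprint.items():
        print(topic, sum(levels.values()))
    print("Total items:", sum(sum(levels.values()) for levels in blueprint.values()))

Checking these totals against the teaching time spent on each topic is what makes the resulting test content-valid and comprehensive.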

  39. Test blueprint.

  40. Distribution of scores.

  41. Normal distribution curve
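Slides 40 and 41 refer to charts of the score distribution and the normal curve, which are not reproduced in this transcript. As a rough illustration of the idea, the sketch below summarises an invented set of class scores by their mean and standard deviation; for a roughly normal (bell-shaped) distribution, about 68% of scores fall within one standard deviation of the mean and about 95% within two.

    from statistics import mean, stdev

    # Invented class scores, used only to show how marks spread around the mean.
    scores = [45, 52, 58, 60, 61, 63, 65, 66, 68, 70, 71, 73, 75, 78, 84]

    m, s = mean(scores), stdev(scores)
    within_one_sd = sum(1 for x in scores if m - s <= x <= m + s) / len(scores)

    print(f"mean = {m:.1f}, standard deviation = {s:.1f}")
    print(f"share of scores within one standard deviation: {within_one_sd:.0%}")

If the printed share is far from the roughly 68% expected under a normal curve, the scores are skewed and the test may have been too easy, too hard, or unevenly discriminating.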

  42. CLOSURE • The test should cover what has been taught, giving students adequate choice. • It should use clear, simple language that is understandable to all students. • It should be valid and reliable. • It should be scorable. • We should base the test on a test blueprint so that it measures the desired skills. • The distribution of scores should reflect performance, ideally following a normal distribution curve.
