
How do we evaluate how effective we are in helping our students master course material?



Presentation Transcript


  1. How do we evaluate how effective we are in helping our students master course material? In promoting our broader educational agendas?
  • Prelim and final exam performance
  • % of F, D, and C- grades
  • Course evaluations
  The problem of silent evidence.

  2. How do we evaluate how effective we are in helping our students master course material? In promoting our broader educational agendas?
  • % of initial enrollees who complete the course
  • Course enrollment trends (e.g., elective enrollees)
  • Standardized pre- and post-testing

  3. Idea:
  • Generate and validate a test to evaluate student understanding of key concepts.
  • Give the test to the students at the beginning of the semester.
  • Give the same test to the students at the end of the semester.
  • Compare pre- and post-test scores for each student to evaluate learning gains (one common gain metric is sketched below).
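The slide stops at "compare pre- and post-test scores"; the usual way that comparison is quantified in physics education research is Hake's normalized gain, g = (post − pre) / (max − pre), the fraction of a student's available headroom actually gained. A minimal Python sketch, using hypothetical scores on an assumed 30-item test:

```python
def normalized_gain(pre: float, post: float, max_score: float):
    """Hake's normalized gain: fraction of available headroom gained.

    Undefined (returns None) for students who pre-test at the ceiling.
    """
    if pre >= max_score:
        return None
    return (post - pre) / (max_score - pre)

# Hypothetical matched pre/post scores on a 30-item test (e.g., the FCI).
students = {"A": (12, 24), "B": (20, 27), "C": (8, 14)}
for name, (pre, post) in students.items():
    g = normalized_gain(pre, post, max_score=30)
    print(f"Student {name}: g = {g:.2f}")
```

Averaging g over the class, rather than averaging raw score differences, is what makes gains comparable across courses whose students start at different levels.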

  4. Two kinds of tests:
  • Concept Tests: evaluate conceptual understanding of core course content.
  - Force Concept Inventory (FCI)
  - Force and Motion Conceptual Evaluation (FMCE)
  - Conceptual Survey of Electricity and Magnetism (CSEM)

  5. Two kinds of tests:
  • Attitude Tests: evaluate attitudes and beliefs about science and learning science.
  - Epistemological Beliefs Assessment for Physical Science (EBAPS)
  - Colorado Learning Attitudes about Science Survey (CLASS)
  - Maryland Physics Expectations Survey (MPEX)

  6. Sample attitude-survey items:
  2. When it comes to understanding physics or chemistry, remembering facts isn’t very important.
  13. If physics and chemistry teachers gave really clear lectures, with plenty of real-life examples and sample problems, then most good students could learn those subjects without doing lots of sample questions and practice problems on their own.

  7. 20. In physics and chemistry, how do the most important formulas relate to the most important concepts? Please read all choices before picking one.
  (a) The major formulas summarize the main concepts; they’re not really separate from the concepts. In addition, those formulas are helpful for solving problems.
  (b) The major formulas are kind of "separate" from the main concepts, since concepts are ideas, not equations. Formulas are better characterized as problem-solving tools, without much conceptual meaning.

  8. 25. Anna: I just read about Kay Kinoshita, the physicist. She sounds naturally brilliant.
  Emily: Maybe she is. But when it comes to being good at science, hard work is more important than “natural ability.” I bet Dr. Kinoshita does well because she has worked really hard.
  Anna: Well, maybe she did. But let’s face it, some people are just smarter at science than other people. Without natural ability, hard work won’t get you anywhere in science!
  (a) I agree almost entirely with Anna.
  (b) Although I agree more with Anna, I think Emily makes some good points. . . .

  9. Timeline:
  • 1990s: FCI in P207 (Littauer, Holcomb)
  • 2008: PhysTEC-mandated testing in intro courses
  • 2009: Online pre- and post-testing
  - Physics 1112, 1116: FMCE, EBAPS
  - Physics 2207: FCI, EBAPS
  - Physics 2213, 2217: CSEM

  10. Fall 2009 Pre-Test Response Rates
  P1112: 155/195 = 79%*
  P1116: 29/71 = 41%
  P2207: 123/319 = 39%
  P2213: 159/397 = 40%
  P2217: 23/47 = 49%
  * P1112 offered a tiny amount of course credit for taking the test.
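The response-rate column is simply respondents over enrollees; a quick sketch that reproduces it from the raw counts on the slide:

```python
# Fall 2009 pre-test counts from the slide: (took the test, enrolled).
counts = {
    "P1112": (155, 195),  # * small amount of course credit offered
    "P1116": (29, 71),
    "P2207": (123, 319),
    "P2213": (159, 397),
    "P2217": (23, 47),
}
for course, (took, enrolled) in counts.items():
    print(f"{course}: {took}/{enrolled} = {took / enrolled:.0%}")
```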

  11.–15. Pre-test score distributions (histograms; horizontal axis: Score):
  P2207: Average 14.8 (49%), St Dev 5.9 (20%)
  P1112: Average 20.5 (62%), St Dev 8.5 (26%)
  P1116: Average 29.2 (88%), St Dev 7.3 (22%)
  P2213: Average 12.8 (43%), St Dev 4.9 (16%)
  P2217: Average 20.0 (65%), St Dev 4.3 (14%)
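These slides report each distribution's mean and standard deviation both in raw points and as a percentage of the maximum score. A sketch of that summary, with hypothetical raw scores (the real per-student data are not reproduced here) and an assumed 30-point maximum:

```python
from statistics import mean, pstdev

def summarize(scores, max_score):
    """Mean and std dev in raw points and as % of the maximum score,
    in the format used on the slides. Uses the population std dev;
    substitute statistics.stdev if the sample version was reported."""
    m, s = mean(scores), pstdev(scores)
    return (f"Average: {m:.1f} ({m / max_score:.0%})  "
            f"St Dev: {s:.1f} ({s / max_score:.0%})")

# Hypothetical scores out of an assumed 30-point maximum.
print("P2207:", summarize([10, 14, 19, 9, 22, 16], max_score=30))
```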

  16. Using Pretest Results:
  • Average level and distribution of student preparation. (Are you attracting the most diverse class possible?)
  • Identify students who may need help.
  • Identify common misconceptions, and assign homework problems and exam questions to rectify them.
  • Disaggregate student completion/performance based on pretest score/prior preparation (a sketch of the last two uses follows).
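A sketch of the last two uses, under assumptions the slide leaves open: students at or below a course-specific (here hypothetical) cutoff are flagged for outreach, and everyone is binned into pretest quartiles so completion or exam performance can later be disaggregated by prior preparation:

```python
from statistics import quantiles

def flag_and_bin(pretest, cutoff):
    """Flag students at or below a cutoff score, and assign each student
    a pretest quartile (1 = lowest) for later disaggregation."""
    flagged = sorted(s for s, score in pretest.items() if score <= cutoff)
    q1, q2, q3 = quantiles(pretest.values(), n=4)
    bins = {s: 1 + sum(score > q for q in (q1, q2, q3))
            for s, score in pretest.items()}
    return flagged, bins

# Hypothetical pretest scores; the cutoff is a judgment call per course.
pretest = {"A": 9, "B": 25, "C": 14, "D": 18, "E": 7}
flagged, bins = flag_and_bin(pretest, cutoff=10)
print("May need help:", flagged)   # ['A', 'E']
print("Pretest quartile:", bins)   # e.g. {'A': 2, 'B': 4, ...}
```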
