
Designing and Evaluating Assessments for Introductory Statistics


Presentation Transcript


  1. Designing and Evaluating Assessments for Introductory Statistics Minicourse #1 Beth Chance (bchance@calpoly.edu) Bob delMas Allan Rossman NSF grant PI: Joan Garfield

  2. Outline • Today: Overview of assessment • Introductions • Assessment goals in introductory statistics • Principles of effective assessment • Challenges and possibilities in statistics • Overview of ARTIST database • Friday: Putting an assessment plan together • Alternative assessment methods • Nitty Gritty details, individual plans

  3. Overview • Assessment = on-going process of collecting and analyzing information relative to some objective or goal • Reflective, diagnostic, flexible, informal • Evaluation = interpretation of evidence, judgment, comparison between intended and actual outcomes, using the information to make improvements

  4. Dimensions of Assessment • Evaluation of program • Evaluate curricula, allocate resources • Monitoring instructional decisions • Judge teaching effectiveness • Evaluating students • Give grades, monitor progress • Promoting student progress • Diagnose student needs

  5. Types of Assessment • Formative Assessment • In-process monitoring of on-going efforts in attempt to make rapid adjustments • Summative Assessment • Record impact and overall achievement, compare outcomes to goals, decide next steps • Example: teaching • Example: learning

  6. Bloom’s Taxonomy • Knowledge • Comprehension • Application • Analysis (interrelationships) • Synthesis • Evaluation

  7. An Assessment Cycle • Set goals • Select methods • Gather evidence • Draw inference • Take action • Re-examine goals and methods Example: Introductory course Example: Lesson on sampling distributions

  8. Reflect on Goals • What do you value? • Instructor and student points of view • Content, abilities, values • At what point in the course should they develop the knowledge and skills? • Translate these into learning outcomes/objectives • What should students know and be able to do by the end of the course? • Must be measurable!

  9. Some of My Course Goals • Understand basic terms (literacy) • Understand the statistical process • Not just the individual pieces, be able to apply • Be able to reason and think statistically • role of context, effect of sample size, caution when using procedures, belief in randomness, association vs. causation • Communication and collaboration skills • Computer literacy • Interest level in statistics

  10. Possible Future Goals • Process (not just product) of collaboration • Learn how to learn • Appreciate learning for its own sake • Develop the necessary skills to understand both what they have learned and what they do not understand

  11. Assess what you value. Students value what they are assessed on.

  12. Example • Given the numbers 5, 9, 11, 14, 17, 29 (a) Find the mean (b) Find the median (c) Find the mode (d) Calculate a 95% confidence interval for μ
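
A minimal Python sketch of the four parts, for reference (the t-interval assumes an approximately normal population; scipy supplies the critical value):

```python
import statistics
from scipy import stats

data = [5, 9, 11, 14, 17, 29]
n = len(data)

mean = statistics.mean(data)        # (a) 85/6 ≈ 14.17
median = statistics.median(data)    # (b) (11 + 14)/2 = 12.5
# (c) every value occurs exactly once, so there is no mode
s = statistics.stdev(data)          # sample SD ≈ 8.35

# (d) 95% t-interval for μ: x̄ ± t* · s/√n, with df = n - 1
t_star = stats.t.ppf(0.975, df=n - 1)   # ≈ 2.571
margin = t_star * s / n ** 0.5
print(f"95% CI ≈ ({mean - margin:.2f}, {mean + margin:.2f})")  # ≈ (5.40, 22.93)
```

The point of the example, of course, is that every part is mechanical: software answers all four with no statistical reasoning required, which is exactly the critique the next slides develop.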

  13. “Traditional” Assessment • Good for assessing: • Isolated computational skills, (short-term) memory retrieval • Use and tracking of common misconceptions • How many right answers? • Provides us with: • Consistent and timely scoring • Predictor of future performance

  14. “Traditional” Assessment • Less effective at assessing: • Can they explain their knowledge? • Can they apply their knowledge? • What are the limitations in their knowledge? • Can they make good decisions? • Can they evaluate? • Can they deal with messy data? • Role of prior knowledge

  15. Focus on what and how students learn, and on what students can now do, not on what faculty teach

  16. Nine Principles (AAHE) • Start with educational values • Multi-dimensional, integrated, over time • Clearly stated purposes • Pay attention to outcomes and process • On-going • Student representation • Important questions • Support change • Accountability

  17. Select Methods • Need multiple, complementary methods • observable behavior • adequate time • Need to extend students • less predictable, less discrete • Need to provide indicators for change • Need prompt, informative feedback loop • On-going, linked series of activities over time • Continuous improvement, self-assessment • Students must believe in its value

  18. Focus on the most prevalent student misconceptions

  19. Repeat the Cycle • Focus on the process of learning • Feedback to both instructors and students • Discuss results with students, motivate responsibility for their own learning • Consider other factors • Collaborate • External evaluation • Continual refinement • Consider unexpected outcomes • Don’t try to do it all at once!

  20. Use the results of the assessment to improve student learning

  21. Challenges in Statistics Education • Doing statistics versus being an informed consumer of statistics • Statistics vs. mathematics • Role of context, messiness of solutions, computers handling the details of calculations, need to defend an argument, evaluation based on the quality of reasoning, methods, and evidence used • Instructors have become pretty comfortable with the lecture/reproduction format • Traditional assessment feels more objective

  22. Challenges in Statistics Education • Reduce focus on calculation • Reveal intuition, statistical reasoning • Require meaningful context • Purpose, statistical interest • Meaningful reason to calculate • Careful, detailed examination of data • Use of statistical language • Meaningful tasks, similar to what students will be asked to do “in real life”

  23. Some Techniques • Multiple choice • with identification of the false response • with explanation or reasoning choices • with judgment, critique (when is this appropriate?) • “What if”, working backwards, “construct a situation that…” • Objective-format questions • e.g., comparative judgment of strength of relationship • e.g., matching boxplots with normal probability plots • Missing pieces of output, background

  24. Some Techniques • Combine with alternative assessment methods • e.g., projects: see the entire process, the messiness of real data collection and analysis • e.g., case studies: focus on real data, real questions, students doing and communicating about statistics • Self-assessment, peer-evaluation

  25. What can we learn? • Sampling distribution questions: Which graph best represents a distribution of sample means for 500 samples of size 4? [Five candidate graphs, labeled A–E, accompanied the question.]
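
The simulation behind such an item is easy to reproduce. A sketch, assuming a uniform population on 1–10 (the slide does not specify the population):

```python
import random
import statistics

# 500 samples of size 4 from an assumed uniform population on 1..10
random.seed(1)
population = list(range(1, 11))
sample_means = [statistics.mean(random.choices(population, k=4))
                for _ in range(500)]

# The correct graph is centered at the population mean (5.5) with
# spread σ/√4, i.e., half the population SD (≈ 2.87/2 ≈ 1.44).
print(f"mean of sample means ≈ {statistics.mean(sample_means):.2f}")
print(f"SD of sample means   ≈ {statistics.stdev(sample_means):.2f}")
```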

  26. What can we learn? • Asking them to write about their understanding of sampling distributions • I now place more emphasis in my teaching on labeling horizontal and vertical axes, considering the observational unit, distinguishing between symmetric and even, and spending much more time on the concept of variability • Knowing better questions to ask to assess their understanding of the process

  27. ARTIST Database • First…

  28. HW Assignment • Assessment Framework • WHAT: concept, applications, skills, attitudes, beliefs • PURPOSE: why, how used • WHO: student, peers, teacher • METHOD • ACTION/FEEDBACK: and so?

  29. HW Assignment • Suggest a learning goal, a method, and an action • Be ready to discuss with peers, then class, on Friday • Sample Final Exam (p. 17) • Skills/knowledge being assessed • Conceptual/interpretative vs. mechanical/computational

  30. Day 2

  31. Overview • Quick leftovers on ARTIST database? • Critiquing sample final exam • Implementation issues (exam nitty gritty) • Additional assessment methods • Holistic scoring/Developing rubrics • Your goal/method/action • Developing assessment plan • Wrap-up/Evaluations

  32. Sample Final Exam • In-class component (135 minutes) • What skills/knowledge are being assessed? • Conceptual/interpretative vs. Computational/mechanical?

  33. Sample Exam Question 1 • Stemplot • Shape of distribution • Appropriateness of numerical summaries • C/I: 5, C/M: 3

  34. Sample Exam Question 2 • Bias • Precision • Sample size • C/I: 8, C/M: 0 • No calculations • No recitation of definitions

  35. Sample Exam Question 3 • Normal curve • Normal calculations • C/I: 4, C/M: 3
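
A hedged sketch of the kind of normal calculation involved (the numbers here are hypothetical, not taken from the exam):

```python
from scipy import stats

# Hypothetical: X ~ Normal(μ = 100, σ = 15); find P(X > 120)
mu, sigma = 100, 15
z = (120 - mu) / sigma       # standardize: z = (x - μ)/σ ≈ 1.33
p = 1 - stats.norm.cdf(z)    # upper-tail area ≈ 0.091
print(f"z = {z:.2f}, P(X > 120) ≈ {p:.3f}")
```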

  36. Sample Exam Question 4 • Sampling distribution, CLT • Sample size • Empirical rule • C/I: 4, C/M: 0 • Students would have had practice • Explanation more important than selection

  37. Sample Exam Question 5 • Confidence interval • Significance test, p-value • Practical vs. statistical significance • C/I: 7, C/M: 2 • No calculations needed • Need to understand interval vs. test
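
The practical-vs.-statistical-significance distinction lends itself to a quick numeric demonstration: with a large enough sample, a trivially small effect yields a tiny p-value. A sketch with made-up numbers:

```python
import math
from scipy import stats

# Hypothetical: x̄ = 100.5 vs. μ0 = 100 (a tiny effect), s = 10, n = 10,000
mu0, xbar, s, n = 100, 100.5, 10, 10_000
z = (xbar - mu0) / (s / math.sqrt(n))   # z = 5.0
p = 2 * (1 - stats.norm.cdf(abs(z)))    # two-sided p ≈ 6e-7
print(f"z = {z:.1f}, p = {p:.1e}")      # statistically significant, practically trivial
```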

  38. Sample Exam Question 6 • Experimentation • Randomization • Random number table • C/I: 4, C/M: 4 • Tests data collection issue without requiring data collection
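
The exam has students carry out the randomization by hand with a random number table; the same logic in code, as a sketch (the group sizes are hypothetical):

```python
import random

# Randomly assign 20 subjects to two treatment groups of 10
random.seed(2024)
subjects = list(range(1, 21))
random.shuffle(subjects)
treatment, control = subjects[:10], subjects[10:]
print("treatment:", sorted(treatment))
print("control:  ", sorted(control))
```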

  39. Sample Exam Question 7 • Experimental design • Variables • Confounding • C/I: 13, C/M: 0 • Another question on data collection issues

  40. Sample Exam Question 8 • Two-way table • Conditional proportions • Chi-square statistic, test • Causation • C/I: 5, C/M: 9 • Does not require calculations to conduct test
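
As an illustration of the content here, conditional proportions and a chi-square test on a hypothetical 2×2 table (the counts are invented; scipy handles the mechanics the question deliberately avoids):

```python
from scipy.stats import chi2_contingency

# Hypothetical two-way table: group (rows) by outcome (columns)
table = [[30, 20],   # group A: 30 successes, 20 failures
         [18, 32]]   # group B: 18 successes, 32 failures

# Conditional proportions of success within each row
for label, row in zip("AB", table):
    print(f"group {label}: {row[0] / sum(row):.2f} success")

chi2, p, df, expected = chi2_contingency(table, correction=False)
print(f"chi-square = {chi2:.2f}, df = {df}, p = {p:.4f}")
```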

  41. Sample Exam Question 9 • Boxplots • ANOVA table • Technical assumptions • C/I: 7, C/M: 3 • Even calculations require understanding table relationships
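
The "table relationships" point can be made concrete: filling in a partially blank ANOVA table only requires the df partition, MS = SS/df, and F = MS_between/MS_within. A sketch with hypothetical values:

```python
# Hypothetical partial ANOVA table: k = 3 groups, N = 30 observations
k, N = 3, 30
ss_between, ss_total = 84.0, 480.0

df_between = k - 1                    # 2
df_within = N - k                     # 27
ss_within = ss_total - ss_between     # SS partition: 396.0
ms_between = ss_between / df_between  # MS = SS/df: 42.0
ms_within = ss_within / df_within     # ≈ 14.67
F = ms_between / ms_within            # ≈ 2.86
print(f"F({df_between}, {df_within}) = {F:.2f}")
```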

  42. Sample Exam Question 10 • Scatterplot, association • Regression, slope, inference • Residual, influence • Prediction, extrapolation • C/I: 15, C/M: 0 • Remarkable for a regression question!

  43. Sample Exam Question 11 • Confidence interval, significance test • Duality • C/I: 9, C/M: 2 • Again no calculations required
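
The duality being tested: a level-(1 − α) confidence interval contains exactly those null values that a two-sided level-α test would not reject. A numeric check with hypothetical summary statistics:

```python
import math
from scipy import stats

# Hypothetical one-sample z setting: x̄ = 52, σ = 8, n = 25
xbar, sigma, n, alpha = 52, 8, 25, 0.05
se = sigma / math.sqrt(n)
z_star = stats.norm.ppf(1 - alpha / 2)          # 1.96
ci = (xbar - z_star * se, xbar + z_star * se)   # ≈ (48.86, 55.14)

for mu0 in (50, 56):                            # inside vs. outside the CI
    z = (xbar - mu0) / se
    p = 2 * (1 - stats.norm.cdf(abs(z)))
    in_ci = ci[0] <= mu0 <= ci[1]
    print(f"μ0 = {mu0}: in CI? {in_ci}, p = {p:.3f}, reject? {p < alpha}")
```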

  44. Sample Exam • C/I: 79, C/M: 28 (74% conceptual) • Coverage • experimental design, randomization • bias, precision, confounding • stemplot, boxplots, scatterplots, association • normal curve, sampling distributions • confidence intervals, significance tests • chi-square, ANOVA, regression

  45. Nitty Gritty • External aids… • Process of constructing exam… • Timing issues… • Student preparation/debriefing…

  46. Beyond Exams • Combine with additional assessment methods • e.g., projects: see the entire process, the messiness of real data collection and analysis • e.g., case studies: focus on real data, real questions, students doing and communicating about statistics • generation instead of only validation…

  47. Beyond Exams (p. 8)… • Written homework assignments/lab assignments • Minute papers • Expository writings • Portfolios/journals • Student projects • Paired quizzes/group exams • Concept Maps

  48. Student Projects • Best way to demonstrate to students the practice of statistics • Experience the fine points of research • Experience the “messiness” of data • Statistician’s role as team member • From beginning to end • Formulation and Explanation • Constant Reference

  49. Student Projects • Choice of Topic • Ownership • Choice of Group • In-class activities first • Periodic Progress Reports • Peer Review • Guidance/Interference • Early in process • Presentation (me, alum, fellow student) • Full Lab Reports • statweb.calpoly.edu/chance/stat217/projects.html

  50. Project Issues • Assigning Grades, individual accountability • Insignificant/Negative Results • Reward the Effort • Iterative • Encourage/expect revision • Long vs. Short projects • Coverage of statistical tools • Workload
