Enhancing formative feedback in large cohort modules: a case study from CARBS


Presentation Transcript


  1. Enhancing formative feedback in large cohort modules: a case study from CARBS, by Cemil Selcuk, Dept. of Economics, Cardiff Business School

  2. Outline • Rankings: Feedback is a major issue. • This project: a controlled experiment • Survey results and some analysis

  3. Bristol, Cardiff, Lancaster: teaching is OK, but feedback is an issue! Source: NSS – http://unistats.direct.gov.uk/

  4. Large Modules • Econ modules (Micro, Macro, Econometrics, Money and Banking, etc.) are taken by other business majors. • Providing feedback in large modules is a challenge; there is no magic solution! • Most modules have tutorials, with fewer than 20 students in each session. • Idea: involve TAs in the feedback process.

  5. BS2550: Microeconomic Theory • Module size: >180 students • 14 tutorial groups of 13–15 students each; 3 TAs • Students are advised to solve the questions before tutorials.

  6. Feedback so far • Oral feedback in lectures and tutorials. • Answering questions via email. • Answers for past examinations. • Whole-class general feedback sheets (after the January exams).

  7. A controlled experiment • Treatment – 25% of students • One TA committed to the following: “Submit your own solutions before the tutorial and I will return them within two weeks with corrections.” • Participation was voluntary; there was no actual grading. • Control – 75% of students • The remaining TAs did not offer this (and did not have to).

  8. From Students’ Perspective • Participation was voluntary and the work was not graded. • Students put in real effort; with no incentive to cheat, monitoring is not an issue. • Self-selection: students who care about feedback participate (this reduces hateful outliers in the surveys; we will come back to this).

  9. From the TA’s Perspective Not a lot of extra work, because: • A TA sees at most 50 students, and not every student participates. • The work is spread over two weeks. • Nobody fights back for extra points. • Corrections can be brief if time is short.

  10. Survey • Online survey at the end of the semester. • About 25% of all students were “treated”. • Question: does the treatment make any difference to the key feedback questions? • “I have received helpful feedback on my work.” • “Feedback (generic or individual) has helped me clarify things I did not understand.” (A toy comparison, with made-up numbers, is sketched below.)
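
The comparison boils down to contrasting survey scores between the treated and untreated groups. Below is a minimal sketch in Python of how such a comparison could be run; the Likert scores and group sizes are made-up assumptions for illustration only, not the actual survey data.

```python
# Minimal sketch: comparing treated vs. control responses on one 1-5 Likert item.
# The scores below are invented for illustration; they are NOT the survey data.
from math import sqrt
from statistics import mean, stdev

treated = [4, 5, 4, 3, 5, 4, 4, 5, 3, 4]        # hypothetical treatment group
control = [1, 4, 2, 5, 1, 3, 4, 1, 2, 5, 3, 1]  # hypothetical control group

def summarize(name, scores):
    """Print and return basic descriptives for one group."""
    m, s, n = mean(scores), stdev(scores), len(scores)
    print(f"{name}: n={n}, mean={m:.2f}, sd={s:.2f}")
    return m, s, n

mt, st, nt = summarize("treated", treated)
mc, sc, nc = summarize("control", control)

# Welch's t statistic for the difference in means (robust to unequal group
# sizes and variances, which is the situation here: 25% vs. 75% of students).
t = (mt - mc) / sqrt(st**2 / nt + sc**2 / nc)
print(f"difference in means = {mt - mc:.2f}, Welch t = {t:.2f}")
```

Welch's version of the t statistic is a natural choice here because the treated and control groups differ in size and, plausibly, in variance.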

  11. Results • It seems to be working!

  12. Hateful Outliers • A score of 1 means “I hate this course” and is usually accompanied by other 1s. • This occurs when the “service received” falls below a certain threshold. • The treatment improved the average score by reducing or eliminating these outliers.

  13. Distribution of Scores • In the control (untreated) group there is a significant number of “angry customers” – 34% in total. • They are the main reason the average is low. • The upper tails are alike, i.e. there is little difference in the number of “happy customers”.

  14. Distribution of Scores • With the treatment, the mass in the lower tail is eliminated and spread across higher scores, improving the average significantly. (A toy calculation below illustrates the arithmetic.)
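
The arithmetic behind these two slides is simple: a lower-tail mass of 1s drags the mean down much more than it affects the rest of the distribution. The toy calculation below reuses the 34% “angry customer” share quoted above, but every other proportion is an assumption made for illustration, not the actual survey data.

```python
# Toy illustration only: how a lower-tail mass of 1s depresses the mean, and
# what happens when that mass is removed and spread over higher scores.
# The 34% share of 1s comes from the slide; all other shares are assumptions.
from statistics import mean, median

control   = [1] * 34 + [3] * 26 + [4] * 25 + [5] * 15  # hypothetical control distribution
treatment = [3] * 40 + [4] * 45 + [5] * 15             # lower tail spread upward; top unchanged

for name, scores in [("control", control), ("treatment", treatment)]:
    share_of_ones = scores.count(1) / len(scores)
    print(f"{name:9s} mean={mean(scores):.2f} "
          f"median={median(scores):.1f} share of 1s={share_of_ones:.0%}")
```

Under these assumed shares the mean rises from 2.87 to 3.75 while the top of the distribution is unchanged, which is exactly the pattern the slides describe: fewer angry customers, about the same number of happy ones, and a noticeably higher average.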

  15. Other Questions • No difference in attendance at lectures or tutorials. • Every student in the treatment group attempted to solve the questions. • In the control group, 26% did not make an attempt or, worse, did not even look at the questions.

  16. Cardiff’s New Feedback Policy • The approach is in line with the policy’s basic principles. • It is a good example of “feed-forward”.

  17. Suggestions if rolled out • Applicable only if there are sufficiently many TAs. Compensation? • TAs may need training: the objective is providing feedback, not marking. • Submissions should carry no weight in grading, because monitoring would then become a serious issue and grades would kill the incentive for students to put in their own effort.

  18. The End
