Supporting Students through the Development of ‘Student Friendly’ Marking Criteria and Marking Workshops

  1. Supporting Students through the Development of ‘Student Friendly’ Marking Criteria and Marking Workshops Patricia Turnbull and David Morris (FHSC)

  2. Stimulus for Change • Anecdotal student comment indicating poor understanding of assessment. • Student frustration with variance in assignment feedback. • External examiners’ comments.

  3. 2006/2007 • Evaluation of the assessment experiences of 307 student nurses highlighted the following: • Only 13% used the Criterion Referenced Grading (CRG) tool in assignment preparation. • Only 13.7% found the terminology in the CRG tool easy to understand. • Only 8% considered their assignment feedback to reflect the CRG tool descriptors.

  4. ACTION • Marking workshops were developed as a pilot to assist students (n = 87) to engage with marking criteria. • During the marking workshops, students were invited to indicate which words/concepts in the CRG tool they found difficult to understand. • Participants were given the opportunity to read two scripts of different grades that had passed the module and to ‘mark’ these. • Extended discussion followed of new insights and of the value of using marking criteria to prepare assignments. • Qualitative evaluation was very positive.

  5. However… • The introduction of University Generic Marking Criteria (UGMC) affected this development. • It required marking criteria to be developed that encompassed the UGMC, student understanding and professional issues in healthcare. • Marking criteria were developed for levels 1–3 for essay, presentation and exam.

  6. Evaluation of Marking Criteria

  7. Evaluation of new CRG 2009/2010 • 380 students completed a self-administered questionnaire (response rate = 93.6%). Participants had completed at least one module of the pre-registration programme. • Feedback on a random sample of 100 scripts (23 markers) was reviewed to determine the validity of students’ views. • An online questionnaire was completed by 39 staff (62% response rate). • 14 External Examiner reports (4 external examiners) were reviewed for reference to the marking criteria.

  8. Student Views • 289 students (76%) found the CRG tool useful in preparing for assignments. (In the 2006/7 study only 13% used marking criteria to prepare for assignments.) • 300 students (79%) found the words used in the CRG tool easy to understand. (In the 2006/7 study only 13.7% found the terminology of marking criteria easy to understand.) Words such as ‘Anecdotal’, ‘Unsubstantiated’ and ‘Colloquial’ were problematic for some.

  9. Student Views • 249 students (66%) believed their feedback to be satisfactory and reflective of the new CRG tool, and felt it identified the strengths and weaknesses of their work. (In the 2006/7 study only 8% considered feedback to be satisfactory and reflective of marking criteria.) However…

  10. 71 students (19%) reported that feedback identified their strengths & weaknesses but did not advise how to improve their work. • ‘Some markers are always so critical about my work. I would respect their views more if they could offer advice on how I might improve.’ • ‘I always feel so disheartened by the feedback.’ • 14 students (4%) felt their mark and comments depended more on the marker than the marking criteria.

  11. Review of 100 Scripts 1. Were the four headings representing the categories of the CRG tool used in feedback? Yes, in all 100 scripts!

  12. Review of 100 Scripts 2. Did the feedback provided support students in identifying their strengths & weaknesses and advise on how to develop weaker aspects? • 15% of scripts had substantial feedback. Irrespective of grade, positive comments were provided and students were advised how to improve weaker aspects. Under the four categories, comments were consistently linked to examples from the student’s work.

  13. Review of 100 Scripts 2 (continued). Did the feedback provided support students in identifying their strengths & weaknesses and advise on how to develop weaker aspects? • 64% of scripts had feedback relating to strengths and weaknesses, but with no actual examples of these. Advice on how to improve the script was inconsistent. (Feedback mentioned poor structure, lack of clarity and limited analysis.)

  14. Review of 100 Scripts • 19% of scripts had limited feedback that would not support a student in developing their work: no personalisation to the student’s work, no advice on how to address weaknesses, and only very occasional positive comments (even in higher-grade scripts). • 4% had only 1–2 sentences of feedback in each category. • Learning outcomes were referred to a number of times (even though not mentioned in the assignment guidelines or marking criteria). • Handwritten feedback generally had less content than typed feedback (though it was legible).

  15. Language Used in Feedback • 82% of scripts had feedback that used terminology from the CRG tool and elaborated on it to varying degrees. • 18% had statements lifted directly from the CRG tool with no further explanation.

  16. Language Used in Feedback • Explicitly positive and encouraging language was used in 15% of feedback; even negative aspects were couched in positive terms. • 73% of feedback used a combination of positive and negative language, though with more emphasis on the negative aspects of scripts. • 12% of feedback was solely negative, even when students had achieved a pass grade. • In 4% the phrase ‘could do better’ was evident.

  17. Staff Evaluation of CRG Tool 39 Facilitators (62% response rate)

  18. When do you use the marking criteria in preparing students for their assignment?

  19. When do you use the CRG tool in your marking of scripts / presentations?

  20. Comments… • ‘For all my marking. I use the four headings and group comments into them although the comments are my own rather than simply paraphrase of the guide comments on the sheet.’ • ‘Always – though I tend to write more under the first two headings. I use both my own comments as well as the statements in the marking criteria.’ • ‘But I supplement with my own comments and tailor it to identifiers in individual scripts.’

  21. For which types of assessment do you find the marking criteria useful?

  22. Comments… • ‘Exam is difficult due to many parts.’ • ‘Especially useful for oral presentations in terms of assisting the two markers to record comments and discuss feedback.’ • ‘Some presentations assessments have their own marking criteria which may or may not fit well together with the generic criteria.’

  23. Do you think that the marking criteria and feedback structure has increased the transparency of the marking process for students?

  24. In your opinion, do the marking criteria make it easier to fine grade an assessment?

  25. Comments… • ‘They provide consistency. I feel this is important as at times students have commented that they feel some markers are harsher/more lenient than others.’ • ‘However the exact nature of fine grading is unnecessarily precise. It would be better to mark work in bands rather than exact marks as the difference between an assignment scoring 54 or 55% is minimal while A, B or C indicates the academic level adequately.’ • ‘The students appear to understand these criteria and they are easy to explain to students.’ • ‘Need this for all levels, including CPD please!’ • ‘The tool makes the whole process more transparent for moderators, external examiners and for students. As a first marker I feel I can justify how and why I have awarded the mark.’

  26. External Examiner Comments A total of 14 reports from 4 external examiners. • 2 external examiners specifically noted that marks reflected the CRG tool and that comments were helpful. • 1 external examiner felt that the uniform approach to feedback was useful, but that there were occasions when remarks could be rephrased ‘to enhance and encourage student development’.

  27. Recommendations • Continue with use of the CRG tool and recommend it for CPD modules. • All students to be inducted into the use of marking criteria via a marking workshop in module 1 and at the beginning of every module. • All markers to provide feedback that will support student development (workshops for staff; monitoring by moderation teams and module leaders). • Ensure that all staff provide feedback in relation to the CRG tool and do not rely solely on previous experience when marking.

  28. Recommendations • Marking workshops to be offered to all new staff before they undertake the marking of student assignments. • Learning outcomes for the module should be reflected in the assignment guidelines, not considered separate from them. • Marking criteria to be printed on A3 cream paper. • Use of the marking criteria for exams to be reviewed by the module team.

  29. Evaluation of Marking Workshops

  30. Evaluation of Marking Workshops • Marking workshops were delivered to 160 students who had failed a module at the first attempt (workshops were not considered appropriate for first attempts). • Following resubmission, the success of attendees was checked against the result recorded on SITS after the Awards Board.

  31. Results

  32. Recommendations • Marking workshops continue to be used to support students in written reassessment. • Exemplars are updated to reflect any changes in module content. • Students who fail at first attempt continue to be provided with a written information sheet inviting them to attend a workshop.

  33. Recommendations • Marking workshops are identified on timetables and should become compulsory for failing students. • Explore the potential of providing visual (video/DVD) exemplars of presentation assessments.

  34. Thanks… • Mark Warnes (INSPIRE) • Students and Staff of FHSC
