
Helping students understand ‘the rules of the game’ in university assessment.






Presentation Transcript


  1. Helping students understand ‘the rules of the game’ in university assessment. Sue Bloxham S.Bloxham@Cumbria.ac.uk

  2. Indispensable conditions for improvement (Sadler 1989) • the student comes to hold a concept of quality roughly similar to that held by the teacher (knows the standard or goal); • is able to monitor continuously the quality of what is being produced during the act of production itself (recognises the ‘gap’ between their work and the standard); • and has a repertoire of alternative moves or strategies from which to draw (can take action to fill the ‘gap’).

  3. Contention • My contention today is that assessment criteria, rubrics, grade descriptors and statements of standards have been developed, at least in part, with the admirable intention of making assessment more transparent to students. • However, using such artefacts to guide students in their assignments is poorly matched to the nature of the learning assessed at this level and to our knowledge of professional judgement in grading.

  4. Techno-rational tradition in standards • Emphasis on transparency and on creating explicit standards, e.g. professional standards, clarifying learning outcomes and assessment criteria • Perceives standards as something fixed, objective and measurable… a GOLD standard. Based on assumptions that ‘knowledge is monolithic, static and universal’ (Delandshere 2001:127) • Located in an objectivist epistemology

  5. A broad alternative critique of techno-rationalist perspectives This includes socio-cultural (Gipps, 1999), hermeneutic (Broad, 2003), social constructivist (Rust et al, 2005) and psychological perspectives (Brooks, 2012). These approaches share an interpretivist view of judgement and argue that the techno-rationalist approach tends to ignore: • the beliefs, values, habits and purposes of tutors; • the situated nature of grading decisions; • the dynamic and contested nature of knowledge; • the constructedness of knowledge. There is both a theoretical critique and one grounded in empirical study of assessment and standards in action.

  6. Problems with explicit standards? • Difficulty codifying standards (Sadler 1987): too general, abstract, hide complexity, mask diversity (Moss & Shultz 2001); • Written statements need individual interpretation; • Holistic judgement: analytical standards not used in practice; • Professional standards don’t account for context; • Not grounded in empirical research.

  7. Data collection • 25 tutors responded to an invitation; • 3 universities; • Tutors were recorded marking two assignments while verbalising their thinking (A&D in 2s/3s); • Followed by interview; • Supplemented by researcher’s field notes; • Subjects: teacher education, art & design, medicine, social science, humanities.

  8. Grading practice • Considerable focus on surface features, the importance of introductions and other ‘heuristics’; • Almost all tutors using holistic judgement to grade/pass; • Internalised ‘criteria’ used to confirm or check marks; • Explicit norm referencing is common. Bloxham, S., Boyd, P. & Orr, S. (2011); Orr, S. & Bloxham, S. (2012); Bloxham, S. & Boyd, P. (2011)

  9. Focus of tutors Surface features: ...the work is not always as grammatical as it might be...proof reading errors and mis-spellings...create a bad impression...it hasn’t been marked down to any significant extent because of that... Global features: …the conclusion is rather hollow… …they have identified the play with genre which is good…

  10. Surface characteristics • Drives me crazy when students start sentences with numbers [] and consistent with all the other papers, all their references are put in the wrong place. (T23 medicine)

  11. Holistic marking • Analysis supports theoretical arguments about the difficulty of analytical grading: Umm thinking about the essay for a while now and um glancing through it again, despite the comments that have been running through my mind about structure and the depth it does have you know, judging this from the point of view of a second year student rather than a usual history module it does have quite a lot of merit and I would not be disposed to give it a mark lower than a basic 2:1 but I would probably not go far above the 2:1 threshold. The essay has been fairly well researched I feel and although it deals in fairly general terms the sense I get is that it has used its research base fairly fully and certainly the research base stated in the bibliography is an enormous one. (T5)

  12. Explicit use of Criteria whilst marking • This was rare and, when used, involved a ‘threshold’ rather than standards approach to criteria Then she goes on to say why she chose the Vikings – because of its significance, its importance in understanding what it is to be British and where it fits into the standard Scheme of Work. All those are things in the criteria so again I’ll put a double tick in the margin just to remind me that I’ve ticked them off the criteria in my head as I do it. (T3)

  13. Checking grades Many tutors use explicit criteria/ objectives to check or confirm grades/pass as a final step: OK. Now I step back from the essay and try and get an overall perspective on it. I’ve been thinking all the way through that it was a 2:1 and now I’m wondering if there’s a possibility that it’s a First. So I’m going to the Faculty of Arts assessment matrix…. (T7)

  14. Norm referencing I would have a look…and satisfy myself that the range of the marks…did seem to reflect what I’d written about the different pieces of work. So I’m saying 62 for the first one, 58 for the second one but conceivably they could be stretched with the upper one, 63 or 64 and the other one – possibly down. I don’t think I would take it to 55, I would perhaps give it 56. (T5)

  15. Internalised standards ‘internalised’, ‘absorbed’, ‘instinctively’, ‘got a sense of’, ‘in my mind’, ‘subliminal’, ‘rooted in my mind’, ‘got a mind set’, ‘implicit’, ‘have things in our heads’, ‘feel’, ‘familiar’ and ‘an understanding’.

  16. ‘Personal standards frameworks’ • Personalised lens for marking, internalised and loosely linked to explicit criteria, Learning outcomes, etc. it’s a kind of you know almost subliminal level I’ve absorbed the outcomes and aims and I am using them. (T5)

  17. ……essentially the descriptions which exist in written documents which you’ve probably seen about what a First Class grade means, what a Second Class grade means and so on, they are rooted in my mind and have become part of my sort of experience really and I feel I can judge, I mean I could sit here and list all the criteria but there’s no point in that. I feel I can judge now myself without referring to any kind of written standards but we do operate in accordance with those standards. (T5)

  18. Individual differences in standards • Trigger qualities • Complexity of criteria: ‘It’s so multi-factorial you see’ (T1); • Informal guidance points up differences to students: the students have a success criteria grid and so according to that, and what I tell them, you know there are certain things they have to put in so there’s certain descriptive information that has to go in (T10).

  19. Shared Standards • Strong sense that standards are shared, if discipline specific: There are things that are kind of implicit and in fact sometimes difficult to articulate but which nonetheless are relatively sound, that are disciplinary. They are just shared by being in the same discipline and provide a framework for marking that might not be available to other people outside that context. (T1)

  20. What we say and what we do How do we communicate the reality of our professional judgement in a way which is understandable and credible to students?

  21. Discussion: Our views about grading practices • Do our findings have resonance, or do you want to question some aspects of them? • How is practice in your department? What might need to change? • How do we communicate the reality of our professional judgement in a way which is understandable and credible to students?

  22. Mismatch • commonplace approaches to guidance and feedback do not help students to understand the required standard and their achievement in relation to that standard. • Practice has shifted the focus to communicating tacit knowledge • But it still implies judgement is fixed and analytical • We need different ways to help students acquire meaningful knowledge of standards

  23. Not getting it! • I think that’s the problem, that I can’t identify quite what I don’t get, it’s just a sense that, I’m like, I know this isn’t right, I know this isn’t quite what they want and I know this isn’t quite the right standard and I’m like, I’m at degree level, what kind of do I need to, what is degree level … (Jane) • You don’t really get an idea (of what’s right and wrong in your work) apart from when it’s something that gets peer marked then you do get a slight idea of that but if it’s just handed straight in….you don’t don’t get an idea ’til you get it back or you get your mark back (Robin) • I can’t at the minute (predict how well I’ve done). I haven’t got used to the way that they mark and I handed in one thing and I thought I was going to do really bad and then got a quite good mark and then the thing I thought I was going to do really well in I did really bad in so I haven’t got to grips with that yet (Carol) (from Bloxham & Campbell 2010)

  24. Getting inside tutors’ heads: recognising variation in standards • “I think most of the tutors should just really go over and just say what, what is expected of us because even though sometimes it is in the module guide, people, you know, people interpret in different ways and unless they actually say it, and you know, ‘cos you don’t actually sometimes know what the essay is actually about, you just write about it putting things in, not actually really, I don’t think, understanding what you’re writing”. (Bloxham & West, 2007)

  25. Learning the way tutors learn • emphasises holistic judgement processes; • is embodied in real judgements; • is dialogical; • takes place over time, recognising that standards cannot be acquired in one attempt; • recognises the nature of complex judgement and the context for university assessment.

  26. Holistic • Offering no analytic criteria, at least in the beginning? • Focusing on ‘overall quality’ (e.g. of exemplars) rather than individual criteria; • Giving feedback against overall purpose and meaning of task; • Being wary about asking students to determine analytic criteria – may not match marking

  27. Embodied in real judgements • Inductive learning, seeing ‘concrete’ examples of tacit standards; • Using exemplars on work similar to that produced by students at that level; • Several examples of same standard to recognise multiple ways of achieving quality; • Tutor commentary (recorded?) on exemplars to replicate normal judgement processes – what they see in a good piece/ poor piece.

  28. Exemplars: issues • Need to be accompanied by dialogue and/or activity (chance to use, discuss, ask questions - to make standards visible) • Short examples which focus on ‘archetypal problems’ or ‘good practices’ may be more useful (Handley & Williams 2011) • Copying (benefits outweigh risk?) • Not sufficient in themselves

  29. Dialogical (from Bloxham & West, 2007) • “The feedback….I got the other day was quite good because …..he said what I did wrong, he actually went through it and pointed out in the essay the bits that, and I understood what he meant then……….When the lecturer is sat next to me and says right this is…it was better when he actually said to me – sat me down and said.” (Female, high score, SS) • “I’d rather sit down with them and talk about which they are not always able to do….but I’d rather speak to them about it usually but they don’t always have time….they can break it down a bit more”

  30. Verbal clarification (from Bloxham & Campbell, 2010) • …go to the lecturer, see if they can explain the question or whatever you have to do slightly more so that you can understand it a bit better. (Robin) • ..with the feedback it doesn’t seem to be something that I understand as well, I mean written feedback’s so very narrow, it’d be nicer sometimes to discuss it. (Jane) • Yes, because I’ve found quite often in the past they’ve just given you a title, that’s the piece of work and you don’t really know what they want from it, so I find that if, a bit of explanation usually helps (Luke)

  31. What does dialogue mean? • ‘repeated engagement in the appropriate activities in the company of those with expertise’ (Gonzales-Arnal & Burwood 2003) • More than an exchange of ideas: the opportunity to ask questions, and informal, verbal dialogue to make explanations and judgements clearer • Dialogue can give a richer and more accessible picture of expectations than mere statements of outcomes and criteria can manage, by: • describing things in different ways, • providing different examples of what you are looking for, • using more everyday language.

  32. Creating dialogue • Making it easy for students to ask questions about guidance and feedback • Limit formal guidance, but answer questions (in class or online) • Peer assessing drafts in context of tutor’s explanation of requirements • Rust et al method • Peer learning schemes

  33. Takes place over time • Gradual process of participation, imitation and dialogue; • Repeated cycles of formative assessment (Gibbs & Dunbar-Godet, 2007); • No ‘magical faith’ in learning support (Broad 2003); • Greater engagement with feedback, e.g. feedback on drafts.

  34. Complex judgement & assessment processes • Develop students’ understanding of professional judgement: complex, tacit, holistic, intersubjective; • Develop an ‘idea of standards’ rather than a ‘precise guidance’ conception amongst students (Bell et al 2013); • Peer assessment vital (‘insiders rather than consumers’, Sadler 1998); • ‘Standing in the shoes’ of the marker; • Be transparent about university assessment processes – ‘transparency breeds trust’ (Carless, 2006).

  35. Conclusion • Tutor marking is holistic and is based on tacit knowledge and individualised standards frameworks • General approaches to guidance and feedback do not align with how professional judgement is made. They run the risk of encouraging student dependence on tutors to explain their interpretation of language, and of fostering a fragmented view of each assignment; • Instead, we need to help students develop their own standards framework which replicates that of their tutors, using the informal and social learning methods that underpin tacit knowledge • Key principles in choosing methods are that processes should emphasise holistic judgement; be embodied in real judgements; be dialogical and continuous; and develop awareness of complex professional judgement

  36. Where now? Sharing: • What might be the key actions for you? • What might be key things to take back to your teaching team or department?

  37. The guidance and feedback loop: main steps Hounsell et al (2008)

  38. The guidance and feedback loop: main steps • Pacing the introduction of new assessment methods • Choosing collaborative and enquiry-based assignments • ‘Unpacking’ tasks holistically through questioning • Working with exemplars to ‘surface’ assessment criteria • One-to-one dialogue with individual weak students to discuss feedback • Peer reviewing draft assignments • Practice assignments (for example, mock examinations, practice presentations) • On-line self tests • Discussing feedback with tutors/peers • Comparing tutor feedback with self-assessment and identifying ‘gaps’ • Discussing strategies to fill the gaps.

  39. References Broad, B. (2003) What We Really Value: Beyond rubrics in teaching and assessing writing. Logan, UT: Utah State University Press. Brooks, V. (2012) Marking as judgement. Research Papers in Education, 27(1), 63-80. Bloxham, S. & West, A. (2007) Learning to write in higher education: students’ perceptions of an intervention in developing understanding of assessment criteria. Teaching in Higher Education, 12(1), 77-89. Bloxham, S. & Campbell, E. (2010) Generating dialogue in assessment feedback: exploring the use of interactive cover sheets. Assessment & Evaluation in Higher Education, 35(3), 291-300. Bloxham, S., Boyd, P. & Orr, S. (2011) Mark my words: the role of assessment criteria in UK higher education grading practices. Studies in Higher Education, 36(6), 655-670. Bloxham, S. & Boyd, P. (2012) Accountability in grading student work: securing academic standards in a 21st century quality assurance context. British Educational Research Journal, 38(4), 615-634. Carless, D. (2006) Differing perceptions in the feedback process. Studies in Higher Education, 31(2), 219-233. Gibbs, G. & Dunbar-Godet, H. (2007) The effects of programme assessment environments on student learning. Higher Education Academy. Available: http://www.heacademy.ac.uk/projects/detail/projectfinder/projects/pf2656lr Gipps, C. (1999) Socio-cultural aspects of assessment. Review of Research in Education, 24, 355-392.

  40. References (cont’d) Gonzalez-Arnal, S. & Burwood, S. (2003) Tacit knowledge and public accounts. Journal of Philosophy of Education, 37(3), 377-391. Handley, K. & Williams, L. (2011) From copying to learning: using exemplars to engage students with assessment criteria and feedback. Assessment & Evaluation in Higher Education, 36(1), 95-108. Hounsell, D., McCune, V., Hounsell, J. & Litjens, J. (2008) The quality of guidance and feedback to students. Higher Education Research and Development, 27(1), 55-67. Orr, S. & Bloxham, S. (2013) Making judgements about students making work: lecturers’ assessment practices in art and design. Arts and Humanities in Higher Education, 12, 234-253. Rust, C. et al (2003) Improving students’ learning by developing their understanding of assessment criteria and processes. Assessment & Evaluation in Higher Education, 28(2), 147-164. Rust, C. et al (2005) A social constructivist assessment process model. Assessment & Evaluation in Higher Education, 30(3), 231-240. Sadler, D.R. (1989) Formative assessment and the design of instructional systems. Instructional Science, 18(2), 119-144. Sadler, D.R. (1998) Formative assessment: revisiting the territory. Assessment in Education: Principles, Policy & Practice, 5(1), 77-85.

  41. Implications for marking • Recognise the risk of a surface focus and emphasise global features in guidance, grading and feedback; • Staff awareness of the difficulty of maintaining consistent standards in marking; • Staff awareness of the influence of other students’ work on our marking; • Recognise the significance of professional learning for tutors through good-quality discussion in second marking and moderation processes.
