PGCert

Presentation Transcript


  1. PGCert • Summative assessment & good practice in marking and grading. http://neillthew.typepad.com/pgcert/

  2. Learning Outcomes for tonight • By the end of tonight’s session, a successful participant will be able to: • Critically discuss good practice in grading and summative assessment & • Evaluate their own professional role and obligations in conducting assessment.

  3. Quick recap on formative assessment... • A model for improving performance • The centrality of feedback - effect sizes • What students think

  4. A basic model for improving our own performance • You need to know, clearly & confidently, what counts as an excellent / good / OK / not OK standard (we call this “internalizing standards”) • You need to be able to judge your own level of performance accurately and objectively (ideally combining your own insights with some outside input) • You need to know exactly how (or if?) your performance falls short of an excellent / good performance • You need to be able to identify what you need to do, or stop doing, or add, or change etc. to get your performance up to the excellent / good level • Finally, you need actually to be able to make those changes.

  5. Effect sizes • Feedback / Assessment for Learning (see Petty, 2006, Ch. 8) • Hattie’s effect sizes: • Feedback from teacher / peers / self = 0.81 • Testing (just measuring a student’s performance) = 0.31

  6. Effect sizes • Marzano’s effect sizes: • Giving students feedback on the processes and strategies they were using to complete a specific task = 0.74 • If the feedback is highly specific (“medal & mission”) = 1.13
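
A note on the metric, since the slides quote figures without defining them: an effect size expresses the difference between two groups in pooled-standard-deviation units. The standard formulation (Cohen’s d, supplied here for orientation; it is not spelled out on the slides) is

\[ d = \frac{\bar{X}_{\text{intervention}} - \bar{X}_{\text{control}}}{SD_{\text{pooled}}} \]

Read against the figures above: feedback at d = 0.81 shifts average performance by roughly four fifths of a standard deviation, more than two and a half times the effect of merely testing (0.31), while Marzano’s highly specific “medal & mission” feedback (1.13) exceeds a full standard deviation.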

  7. What students say • The MESA (Managing Effective Student Assessment) project in the UK ran from 2002 to 2004, funded by the HEA (Higher Education Academy), and involved ten universities representative of the HE sector. • Feedback from students about their experience of assessment was very clear:

  8. What students say • There is too much summative assessment • We do not perform creatively or well when assessment is threatening • Better timetabling is required to spread the load more effectively • We would like our coursework returned more quickly • We’d like better feedback. (It was not always clear exactly what students meant here. More helpful? Detailed? Timely? Or just “more”?)

  9. Student v. Faculty perceptions of assessment • “We are assessed against implicit criteria” • Faculty: Never 69% • Students: Sometimes 50% • “Feedback improves learning” • Faculty: Frequently 49% • Students: Occasionally 72% • “Feedback prompts discussion with tutor” • Faculty: Frequently 63% • Students: Never 50% • Maclellan, 2001

  10. Student v. Faculty perceptions of assessment • “Feedback is helpful in detail” • Faculty: Frequently 49% • Students: Sometimes 73% • “Assessment motivates learning” • Faculty: Frequently 69% • Students: Occasionally 65% • Maclellan, 2001

  11. So - moving on to summative assessment • Assessment as a driver for student behaviour • A case study of problems with valid & reliable assessment & grading • Developing and using clear, shared, understood (!) criteria (1) for performance & (2) for written work.

  12. Assessment quotations • Assessment is at the heart of the student experience. • Brown and Knight, 1994:1 • From our students’ point of view, assessment always defines the actual curriculum. • Ramsden, 1992:187

  13. Assessment quotations • Assessment defines what students regard as important, how they spend their time and how they come to see themselves as students and then as graduates …. If you want to change student learning then change the methods of assessment. • Brown, 1997:7

  14. Assessment quotations • Assessment systems dominate what students are oriented towards in their learning. Even when lecturers say that they want students to be creative and thoughtful, students often recognize that what is really necessary, or at least what is sufficient, is to memorize. • Gibbs, 1992:10

  15. List of references for quotations • Brown, G. (1997) Assessing Student Learning in Higher Education. London: Routledge. • Brown, S. & Knight, P. (1994) Assessing Learners in Higher Education. London: Kogan Page. • Gibbs, G. (1992) Improving the Quality of Student Learning. Bristol: Technical & Educational Services. • Ramsden, P. (1992) Learning to Teach in Higher Education. London: Routledge.

  16. An assessment puzzle • There was a famous study - Diederich (1974) - in which 300 essays were given to a number of experienced university lecturers for marking - each essay was independently marked several times • The grading scale was 1-9: 1 = outstanding; 7 = the lowest pass; 8 = fail; 9 = bad fail ... • 101 of the essays (i.e. a third of them) received EVERY GRADE • So what’s going on here?

  17. Assessment validity? • Analysis suggested that all markers were using some or all of four ‘families’ of marking criteria • These were: Ideas / Skills / Organization / Personal Flair • However, markers were applying these unevenly • The main conclusion was that well-formed criteria matter more than individual markers in producing reliable, fair results - see the illustrative sketch below.
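
To make that conclusion concrete, here is a small, purely illustrative simulation (hypothetical numbers and weighting scheme, not Diederich’s data or method): twenty markers score the same single essay against the same four criteria ‘families’, but each marker weights the families differently, and the one essay collects a wide spread of grades.

import random

# The essay's quality on each criterion family, on a 1-9 scale
# (higher = better here, for simplicity; the slide's scale runs the other way).
CRITERIA = {"ideas": 8, "skills": 4, "organization": 6, "flair": 2}

def marker_grade(essay, rng):
    # Each marker draws personal, uneven weights for the four families,
    # mimicking markers who privilege different criteria.
    weights = [rng.random() for _ in essay]
    weighted = sum(w * q for w, q in zip(weights, essay.values()))
    return round(weighted / sum(weights))

rng = random.Random(42)
grades = [marker_grade(CRITERIA, rng) for _ in range(20)]
print(sorted(grades))  # one essay, a wide spread of grades

Pinning the weights to a single agreed set - i.e. well-formed, shared criteria - collapses the spread, which is exactly the study’s main conclusion.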

  18. Developing performance criteria • In small groups, pick a technical exercise • Working practically first, figure out - and agree - ways of playing that exercise that count as EXCELLENT / GOOD / OK or PASS / FAIL • Then agree how you’d describe those levels, and write criteria that you could give to students, aiming to make these as clear and helpful as possible.

  19. Developing criteria to judge writing • This time, we’ll start with the criteria before we put them into practice - BIMM’s generic B.A. grading system • How would you describe this to students in terms that are shared and clearly understandable - is it OK as it is, or does it need some ‘editing’ & ‘interpretation’? • The SOLO model might help with another perspective here.

  20. SOLO - Structure of the Observed Learning Outcome • Pre-structural: task not attacked appropriately; the student has not understood the point. • Uni-structural: one or a few aspects of the task picked up or used, but understanding is nominal. • Multi-structural: several aspects of the task are learnt, but they are treated separately. • Relational: the components are integrated into a coherent whole, with each part contributing to the overall meaning. • Extended abstract: the components are integrated and re-conceptualized, thereby creating an individual perspective. • Biggs, 1997, 1999.

  21. Criteria • So now, let’s see what this might look like in practice ... • Divide the writing up between you - try to produce four paragraphs of writing in your group: one EXCELLENT, one GOOD, one OK, and one FAIL • The topic is: Critically evaluate the influence of one band or performer on your own musical style. • Explain why you think the writing is at the level you specified.

  22. References: • Biggs, J. (2003) Teaching for Quality Learning at University. Maidenhead: Open University Press. • Brown, S. & Glasner, A. (2003) Assessment Matters in Higher Education. Buckingham: SRHE & Open University Press. • Brown, S. & Knight, P. (1994) Assessing Learners in Higher Education. London: Kogan Page. • Butterfield, J. & Bailey, J.J. (1996) Socially engineered groups in business curricula: an investigation of the effects of team composition on group output, Journal of Education for Business 76, 2: 103-106. • Diederich, P.B. (1974) Measuring Growth in English. Urbana, IL: National Council of Teachers of English.

  23. References: • Falchikov, N. (2005) Improving Assessment Through Student Involvement. London: RoutledgeFalmer. • Gibbs, G. (1992) Improving the Quality of Student Learning. Bristol: Technical & Educational Services. • Latting, J.K. & Raffoul, P.R. (1991) Designing students’ work groups for increased learning: an empirical investigation, Journal of Social Work Education 27, 1: 48-59. • Lejk, M. (1999a) Successful group assessment case studies, in M. Taras (ed.) Innovations in Learning and Teaching: Teaching Fellowships at the University of Sunderland. Sunderland: University of Sunderland Press.

  24. References: • Lejk, M. (1999b) Group assessment on undergraduate computing courses in higher education in the UK. PhD thesis, University of Sunderland. • Maclellan, E. (2001) Assessment for learning: the differing perceptions of tutors and students, Assessment & Evaluation in Higher Education 26, 4: 307-318. • Ramsden, P. (1992) Learning to Teach in Higher Education. London: Routledge. • Tang, C. (1991) Effects of two different assessment procedures on tertiary students’ approaches to learning. Unpublished doctoral thesis, University of Hong Kong.

  25. Learning Outcomes for tonight • By the end of tonight’s session, a successful participant will be able to: • Critically discuss good practice in grading and summative assessment & • Evaluate their own professional role and obligations in conducting assessment.
