Assessment of practical work skills: why, what, & where? Maggie Nicol Professor of Clinical Skills
Practical skills in nursing • 1980s: demise of the practical room and 'apprentice'-style nurse training • Early 1990s: healthcare education moves into higher education (HE) • Practical experience becomes shorter and more varied: less time spent in hospital wards, more focus on socio-cultural aspects • Late 1990s: skills laboratories begin to emerge • 50% (2,300 hours) of the 3-year programme is spent in work placements
Assessment – what is it for? • To motivate students to learn • To punish those who do not! • To provide feedback – 'it's about getting to know students and the quality of their learning' (Rowntree 1987) • To improve the quality of learning • To act as a 'quality control' check on our teaching • In vocational courses, to ensure graduates are 'fit for purpose' as well as 'fit for award'
Assessment of practical skills – why? • Practical skills are central to professional practice • Assessment defines what students take to be important (Rowntree 1987) • If assessment is delegated to staff in placements, there is a lack of consistency between assessors, and practical skills are seen as less important than other subjects • To assess competence
Competence • The acquisition of knowledge, skills and abilities at a level of expertise sufficient to be able to perform in an appropriate work setting (Harvey 2004) • Competence - what the person is capable of doing • Performance - what the person does in his or her day-to-day practice • One needs to be competent in order to assess competence; professionals need to be assessed by professionals.
Conscious competence model • Unconscious competence • Conscious competence • Conscious incompetence • Unconscious incompetence
Benner (1984) • Expert • Proficient • Competent • Advanced beginner • Novice
Assessment of practical skills – how? In the workplace • Students work with one or more supervisors or mentors • Continuous assessment - the supervisor observes the student’s performance over a period of time and indicates when competence is reached • Can take into account the views of other staff members • Assessment usually developed in partnership with academic staff
Assessment of practical skills – how? Using simulation • Objective Structured Clinical Examination (OSCE) • Simulation is used to make it feel real • A 'one-off' assessment of performance on the day • One assessor for each skill • Agreed checklist for each skill • Assessment usually developed in partnership with staff in the workplace
5 required attributes of an assessment process (McKinley et al. 2001) • Reliability – consistency between assessors rating the same performance • Validity – the degree to which the assessment assesses what should be assessed • Face validity is often high, but are we assessing what we should or only what we can assess? • Acceptability – to all stakeholders (assessors, students and the public)
5 required attributes of an assessment process (McKinley et al. 2001) • Feasibility – can it be delivered to all who need to be assessed within the cost constraints (time & staff)? • Educational impact – the degree to which the assessment helps the student to improve his or her performance. This requires: • Feedback on strengths & weaknesses • Strategies for improvement
Reliability – Simulation • Agreed checklists mean less subjectivity • Criteria for assessment clearly defined • Moderator to ensure fairness and consistency • Can be video recorded • The student feels watched • Usually a one-off performance and may be a 'bad day'
Reliability – Workplace • Wide variety of assessors involved • Student's previous performance may influence the assessment • Have to 'do it our way' • Informal assessment usually occurs on several occasions before it is formalised • The student may not realise he or she is being assessed • Practitioners sometimes 'fail to fail'
Validity – Simulation • A simulated setting may not feel real, despite good simulation • Advantages students who can act • Students do not know their patient but may know the assessor • Able to assess skills that are not available 'to order' in the workplace, e.g. emergency resuscitation • Environment can be controlled to suit the level of the student • Assessors are trained for and observed during the assessment • Fair – all students do the same assessment
Validity – Workplace • Real workplace – authentic assessment • Students know their patients/clients & the assessor • Safety takes precedence – cannot allow the student to make mistakes • Reliant on the experience/patients available at the time • Assessment may vary considerably between students • The competence of the assessors is assumed • Experts in practice are not necessarily expert assessors
Acceptability – Simulation • Students find it very stressful • Results/feedback may come several weeks later • Assessment of practical skills is more visible • Makes practical skills as important as other subjects • Lay public can be involved in the process • Environment can be controlled
Acceptability – Workplace • Often viewed as easier than assessment using simulation • Not a 'one-off' performance • Includes an element of self-assessment – students can usually choose when to be assessed • Students receive immediate feedback on their performance
Feasibility – Simulation • Costly in terms of facilities, resources and lecturer time • Hard to accommodate large numbers of students • Needs large numbers of lecturers at the same time • Needs a lot of preparation and organisation
Feasibility – Workplace • Cost of assessment is spread throughout the placements • Can occur at a convenient time • Student may have to 'nag' the assessors to arrange the assessment
Educational impact – Simulation • Motivates students to learn/revise practical skills • Detailed feedback indicates strengths and areas in need of improvement • Checklists may be available in advance
Educational impact – Workplace • Motivates students to seek appropriate experiences and practise skills
Conclusion • The main purpose of assessment should be to encourage learning: assessment for learning rather than assessment of learning • Given the strengths and limitations of each approach, we probably need 'blended assessment', both in the workplace and using simulation • That way we assess both competence (what students are capable of) and performance (what they actually do)
Conclusion • Subjectivity, bias and inter-rater reliability are issues with all forms of assessment • We need to find the 'least worst' method • We need to ensure that we assess what we should assess rather than what we can assess • To paraphrase Florence Nightingale, 'above all, assessment processes should do the student no harm' • 'Students can, with difficulty, escape from the effects of poor teaching, they cannot (if they want to graduate) escape the effects of poor assessment' (Boud 1995)
References • ASKe Centre for Excellence in Teaching & Learning, Oxford Brookes University Business School (2007) Assessment Standards: A Manifesto for Change. • Benner P (1984) From Novice to Expert: Excellence and Power in Clinical Nursing Practice. Menlo Park: Addison-Wesley. • Boud D (1995) Assessment and learning: contradictory or complementary? In Knight P (Ed.) Assessment for Learning in Higher Education, pp. 35-48. • Harvey L (2004) Analytic Quality Glossary. Quality Research International. www.qualityresearchinternational.com/glossary/ • McKinley RM, Fraser RC & Baker R (2001) Model for directly assessing and improving clinical competence and performance in revalidation of clinicians. British Medical Journal 322: 712-715. • Rowntree D (1987) Assessing Students: How Shall We Know Them? London: Kogan Page.