
Principles and Good Practice in Video Assessment


Presentation Transcript


  1. Principles and Good Practice in Video Assessment Mary Gobbi & Eloise Monger mog1@soton.ac.uk School of Health Sciences

  2. Aim of Session • Discuss some of the advantages and disadvantages of using video assessment for simulation education and research. • Identify some criteria that could be used for assessment or evaluation purposes

  3. General Principles of Assessment • Why, what and how to assess? • How to interpret the assessment? • How to respond to the assessment? (Rowntree, 1987) To these I would add: when and where to assess? Concerns in practice over: • Reliability • Validity • Credibility • Authenticity • Inter-rater reliability (a worked example follows below) • Failure to fail
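
Inter-rater reliability, in particular, can be quantified once two assessors have rated the same set of video clips. A common statistic is Cohen's kappa, which discounts the raw agreement rate by the agreement expected from chance alone. Below is a minimal Python sketch; the pass/fail ratings are invented for illustration:

```python
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Cohen's kappa for two raters judging the same items."""
    n = len(rater_a)
    # Observed agreement: proportion of items rated identically.
    p_o = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    # Expected agreement: chance overlap of the raters' marginal distributions.
    freq_a, freq_b = Counter(rater_a), Counter(rater_b)
    p_e = sum(freq_a[c] * freq_b[c] for c in freq_a) / (n * n)
    return (p_o - p_e) / (1 - p_e)

# Hypothetical pass/fail judgements of the same ten video clips.
a = ["pass", "pass", "fail", "pass", "fail", "pass", "pass", "fail", "pass", "pass"]
b = ["pass", "fail", "fail", "pass", "fail", "pass", "pass", "pass", "pass", "pass"]
print(f"kappa = {cohens_kappa(a, b):.2f}")  # 0.47: moderate agreement
```

A kappa near 1 indicates the assessors apply the criteria consistently; a value near 0 suggests the observed agreement could be chance, which is a warning sign for any high-stakes video assessment.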

  4. Simulation Issues Some basic principles

  5. 11 dimensions of simulation • Aims and purpose of the activity (training, safety, educational, performance assessment and feedback) • Unit of participation (individual or team) • Experience level of the participants (novice or expert) • Health care domain (or discipline domain) • Professional or subject discipline of participants • Type of knowledge, skills, attitudes or behaviours addressed • The simulated patient’s age • Technology applicable or required (e.g. high- to low-fidelity mannequins or models, screen- or virtual-reality-based technologies) • Site of simulation (home, laboratory, workplace) • Extent of direct participation (vicarious, detached or fully engaged in space and time) • Method of feedback used (observer, mechanical, structured, web-based) Adapted from Gaba, D. M. (2004) The future vision of simulation in health care. Qual Saf Health Care 13 (Suppl 1): i2–i10. BMJ Publishing Group Ltd.
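
If recorded sessions are later to be filtered or compared, Gaba's dimensions can double as structured metadata attached to each recording. A minimal sketch, assuming a Python toolchain; the field names and example values are our illustration, not part of Gaba's framework:

```python
from dataclasses import dataclass

@dataclass
class SimulationSession:
    """One recorded simulation, tagged along Gaba's (2004) eleven dimensions."""
    purpose: str            # training, safety, education, performance assessment...
    unit: str               # individual or team
    experience_level: str   # novice ... expert
    domain: str             # health care (or discipline) domain
    discipline: str         # professional/subject discipline of participants
    focus: str              # knowledge, skills, attitudes or behaviours addressed
    patient_age: str        # simulated patient's age (or age band)
    technology: str         # e.g. high/low-fidelity mannequin, screen-based, VR
    site: str               # home, laboratory, workplace
    participation: str      # vicarious, detached, or fully engaged
    feedback_method: str    # observer, mechanical, structured, web-based

session = SimulationSession(
    purpose="performance assessment", unit="team", experience_level="novice",
    domain="acute care", discipline="nursing", focus="teamwork and communication",
    patient_age="adult", technology="high-fidelity mannequin",
    site="laboratory", participation="fully engaged", feedback_method="web-based",
)
```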

  6. Some simulation aspects that influence assessment

  7. Particular issues with video assessment • Analysis and assessment of student performance and/or competence is typically concerned with: • events (DiGiacomo et al, 1997) • processes (Ram et al, 1999) • the Objective Structured Video Examination (Humphris and Kaney, 2000; Vivekananda-Schmidt et al, 2007) • Time and standards • Attributions of activity in group situations • Being ‘off camera’ • Issues of ‘immersed’ versus non-immersed behaviours

  8. Methodological Challenges: multimedia, multidata • Ethics, governance, data protection, confidentiality: development of a protocol • Attribution and causation • Halo or horns effect: the first simulation for students • Individuality of prior and subsequent student experience • Analysing the data gathered: linking different data to inform judgements (a timestamp-alignment sketch follows below) • Accessing students, attending to their views and follow-up • Analysis: can audio/video analysis enable robust assessment tools? What criteria should be used? • Interaction: how can student/group/assessor feedback be managed through these media?
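
One concrete way to approach the 'linking different data' challenge is to align video annotations with events from a second stream (server logs, monitoring equipment) by timestamp, so that an assessor's judgement can be cross-checked against corroborating data. A rough sketch; the record formats, event descriptions and the five-second tolerance are all assumptions for illustration:

```python
# Each record pairs seconds-into-session with a description (invented examples).
annotations = [(125.0, "checks patient ID band"),
               (410.5, "hand hygiene before procedure")]
monitor_log = [(124.2, "student at bedside"),
               (409.8, "soap dispenser used"),
               (600.0, "alarm acknowledged")]

TOLERANCE = 5.0  # seconds within which two records count as linked

for t_ann, note in annotations:
    # Collect events from the second stream close enough in time to corroborate.
    nearby = [event for t, event in monitor_log if abs(t - t_ann) <= TOLERANCE]
    print(f"{t_ann:6.1f}s {note!r} <-> {nearby or 'no corroborating event'}")
```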

  9. Simulation: task-based versus client-based scenarios Task focus • short (5-10 minutes), clearly defined, established sequences • assessed using checklists and pre-determined criteria Scenario focus • longer duration (45-60 minutes), open to interpretation; may include practical skills but also decision making, team working, communication and problem solving • assessment involves qualitative as well as pre-determined criteria • benefits from an ontology, hence the semantic annotation (sketched below)
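
The point of the ontology is that scenario-length recordings stay searchable and comparable: each annotation is tied to a term from a shared vocabulary rather than to free text alone. A minimal sketch of such semantic annotation; the vocabulary terms and the helper function are invented for illustration:

```python
# A tiny controlled vocabulary standing in for a full ontology.
ONTOLOGY = {"practical_skill", "decision_making", "team_working",
            "communication", "problem_solving"}

def annotate(start_s, end_s, concept, note):
    """One semantic annotation over a video interval [start_s, end_s]."""
    if concept not in ONTOLOGY:
        raise ValueError(f"unknown concept: {concept}")
    return {"start": start_s, "end": end_s, "concept": concept, "note": note}

clips = [
    annotate(300, 345, "communication", "SBAR handover to medical staff"),
    annotate(520, 580, "decision_making", "escalates after second low BP reading"),
]
# Because each note carries an ontology term, annotations can be pooled
# across assessors and later compared for inter-rater reliability.
```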

  10. What can we observe and measure? What do you see and interpret in the next slides?

  11. Look at the next slide • In these multidimensional simulations that replicate clinical decision making, on whom, and on what, should feedback or assessment be based? • This is a challenge when the performance has been video-captured. • Who is responsible for what?

  12. Feedback and Assessment: Role of the Facilitator-Assessor (see ‘good practice tips’)

  13. Some student views • ‘If the attention of the marker is distracted [you might] fail, therefore if you are filmed it is clear, therefore filming is good, therefore particularly good for hand washing – fewer people in the room’ (B) • ‘In the OSCE, watching in close proximity is off-putting; via the camera it is not so bad’ (X) • ‘You learn more if it is formative’ and you could ‘get a mark – formative or summative – formative is good’ (X) • ‘Yes, I would like to do a real assessment with a stranger – being watched by a mentor – true competence then.’ (E)

  14. Feedback and Pedagogy • Associative: behaviour modification and reinforcement • Cognitivist/Experiential/Reflective: communication, explanation, recombination, contrast, inference and problem solving • Situative: imitation, modelling, joint construction of knowledge

  15. Individual or group feedback? “Group feedback was a fairly brief, concentrated and largely one-way process from assessors at the end of a demanding 3-hour session. This format was felt to be unsuccessful, in that there was insufficient time to address learning points in detail. Individual written feedback for each scenario was not available until the day after the IPPI and therefore took place outside the group experience. Moreover this feedback was not moderated or explained by assessors.” (Kneebone et al, 2006: p. 1111)

  16. Perceived Limitations • Replaying the entire recording is time-consuming • Skipping through instead demands near-perfect memory of the event • Looking stupid on camera: self-esteem • Variation in volume and quality of feedback • Facilitator fatigue • No record or written feedback: risk of misinterpretation and positive reinforcement of poor practice

  17. Findings • Expectations • Reality • Assessments and OSCEs/exams • Being watched • Mentors • Theory/practice gap/professionalism • Debriefing • ‘Feelings’ Do you think you could assess students this way?

  18. Some student comments • Assessment should be real: ‘OSCEs aren’t real’ (Y). • Assessments (and simulations) can be associated with an examination: ‘I thought it was going to be an exam – how scrutinised – OSCEs build up tension’ (C). • It is uncomfortable being watched continuously: ‘very uncomfortable, knowing that there is someone sitting and looking at me, not like it would be in real life’ (E). • ‘[It was] difficult today, doing it with friends. In work you have a professional persona – very different to how you are with your friends. Because at work, with your mentor watching, it is intermittent observation, they are not always there – I answered the phone the other day and fluffed it up because I was being watched.’ (E)

  19. Other factors to connect to assessment • Analysis of web server logs (a sketch follows below): time spent on different parts of the website; comparison to an ideal path that experienced staff would take through the website; sequence analysis of different paths through the website • Do students take different approaches to the material? • Event history visualisation, to see what we can see • Modelling using data available from information about the students • Comparing different kinds of session (taught, revision etc.) • Implications of the students who consent (or not): the sample may be group-specific and not random • At present annotation is still developing; however, it can give us insights with respect to inter-rater reliability • Focused annotation may be the next step • Are the qualitative features associated with web/log analysis or annotation analysis? • Can we benchmark for different groups? Professor J W McDonald
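
As a sketch of the log analysis described above: recover each student's path through the site, derive dwell times on each page, and compare the sequence against an 'ideal' route defined by experienced staff. The log format, page names and the ideal path are assumptions for illustration:

```python
from itertools import groupby

# (student_id, seconds since session start, page) - invented log records.
log = [("s1", 0, "/briefing"), ("s1", 90, "/obs-chart"), ("s1", 240, "/drug-chart"),
       ("s2", 0, "/briefing"), ("s2", 30, "/drug-chart"), ("s2", 60, "/obs-chart")]
IDEAL_PATH = ["/briefing", "/obs-chart", "/drug-chart"]  # assumed expert route

for student, records in groupby(sorted(log), key=lambda r: r[0]):
    records = list(records)
    sequence = [page for _, _, page in records]
    # Dwell time on a page = gap until the student's next request.
    dwell = {page: t_next - t
             for (_, t, page), (_, t_next, _) in zip(records, records[1:])}
    verdict = "follows ideal path" if sequence == IDEAL_PATH else "diverges"
    print(student, sequence, dwell, verdict)
```

Sequences that diverge can then be inspected alongside the video and annotation data, and divergence rates benchmarked across taught versus revision sessions.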

  20. Some final critical thoughts for evaluation and reflection • Did the facilitator/supervisor influence the participants? • Ethical issues? Parity and Equity? • Realism, credibility and fidelity. • Is the observed and reported account a true reflection of the experience? • Are the conclusions offered by the assessor/ researcher convincing and supported by evidence or data?
