This project aims to improve student engagement with lecture transcripts through trials of collaborative transcription, annotation, and shared URL lists. It covers changes to the transcription workflow, the impact of the trials, and shifts in project focus. The findings point to the need to promote deep learning and to the importance of context during transcription.
Student Engagement With Lecture Transcripts
Philip Phelps
Department of Computer Science & Creative Technologies, University of the West of England
Project kindly supported by:
This version may differ slightly from that presented at the 12th HEA-ICS Conference, Belfast. The final version may be obtained from Philip Phelps (philip3.phelps@uwe.ac.uk).
Team members who aren’t here
• Marcus Lynch – Project leader & tester, 2009-2011
• Dan Buzzo – Project leader & tester, 2011-future
Quick overview
• Collaborative transcription trials (2009-2010)
• Changes to transcription workflow (2011)
• Annotation of transcripts (2011)
• URL list trials (2011)
• Impact of trials (2009 and 2011): collaborative transcription and URL lists
• Changes to project focus (2012-future)
2009 UWE funded project
• Automated segmentation of lecture recordings (see the sketch after this list)
• Each segment transcribed automatically
• Used Dragon NaturallySpeaking for speech recognition
• Required manual review of machine-generated drafts
• Single-user web-based transcript editing
• Faster than transcribing manually from scratch
• Still fairly laborious: each segment must be edited
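To illustrate the segmentation step in the abstract, here is a minimal Python sketch that splits a recording on silences using the pydub library; the project's own tools are not reproduced, and the file name, pause length, and threshold are assumptions to be tuned per recording.

# Minimal sketch of silence-based lecture segmentation (illustrative only;
# not the project's actual segmentation tool).
from pydub import AudioSegment
from pydub.silence import split_on_silence

lecture = AudioSegment.from_file("lecture01.wav")  # hypothetical input file

# Cut wherever the audio stays 16 dB below its average level for >= 1.5 s.
segments = split_on_silence(
    lecture,
    min_silence_len=1500,              # assumed pause length, in ms
    silence_thresh=lecture.dBFS - 16,  # assumed threshold, relative to mean loudness
    keep_silence=250,                  # keep a little context around each cut
)

for i, segment in enumerate(segments):
    segment.export(f"segment_{i:03d}.wav", format="wav")  # one file per segment

Each exported segment can then be fed to the speech-recognition engine and reviewed independently.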
Problems with the 2009 approach
• Lecturer time is limited, yet the lecturer must spend more time correcting transcripts manually
• Speech-recognition training time is limited, leading to poor accuracy of machine transcripts
2010: HEA-ICS funded project
• Multi-user system – crowd-sourced editing
• Students collaboratively corrected mistakes in machine-generated drafts
• Lecturer moderates student submissions
• As an incentive, students could earn 0.5% of module total marks for each 42 seconds of transcript edited, up to a maximum of 5% of module total marks (see the arithmetic sketch after this list)
• Hypothesis: increased engagement, through transcription, would improve exam performance
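The incentive scheme reduces to simple arithmetic; a minimal Python sketch (the function name is hypothetical):

# Marks earned under the 2010 incentive: 0.5% of the module total per full
# 42-second block of transcript edited, capped at 5%.
def incentive_marks(seconds_edited: int) -> float:
    return min(5.0, 0.5 * (seconds_edited // 42))

assert incentive_marks(42) == 0.5
assert incentive_marks(1000) == 5.0  # cap is reached after 420 seconds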
2010 project results
• Partially solved the lecturer time problem
• Students collaboratively produced full transcripts
• … but lecturer time was greatly occupied by moderation of student corrections (spelling/grammatical mistakes, etc.)
• No improvement to exam performance
• Only 23% of students increased their performance
• 72% of students reduced exam performance (post-transcript interaction) compared with coursework performance (pre-transcripts)
• 5% of students did not interact with the transcripts and increased performance regardless
Cohort comparison: 2008/9 vs 2009/10
Note: the general trend in the previous year was also reduced exam performance compared to coursework
2010 project conclusions
• Why no exam performance improvement?
• Student engagement ≠ Deep learning (our belief in “learning through transcription” was misguided)
• We focused on learning styles and learning reinforcement, not necessarily deep learning
• Students likely approached transcription as a task – focusing on “getting this typed out and done with” rather than learning from the content
• Some students suffered from lack of context during transcription – although 40/67 students transcribed from contiguous segments
2011: Focus on different things
• Single-user transcript editing
• Avoid crowd-sourced transcript editing entirely
• Spend more time training the speech recognition engine
• Improve machine transcription workflow & accuracy
• Trade-off between accuracy and manual interaction
• Expand transcript search/indexing tools
• Support annotation of transcript segments
2011 changes and improvements
• Consolidated tools onto a single hardware host
• Previously: PPC Mac OS for audio segmentation and transcript editing; Win32 for Dragon NaturallySpeaking speech recognition
• Now: audio segmentation tools recompiled for Intel architecture; Intel Mac OS “Scribe” speech recognition (Nuance/Dragon engine) with a custom AppleScript for batch processing
• Server-based single-user transcription
• Support for remote transcription when off campus
• VNC screen sharing & AUNetSend audio sharing
• Web-based front-ends for audio segmentation
Transcription server utility
• Connects to the MacSpeech “Scribe” transcription server
• Speech-recognition voice profile is updated after each segment is transcribed/corrected
• Sets up SSH tunnels, VNC, and audio sharing (see the sketch after this list)
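A minimal Python sketch of the kind of tunnelling the utility performs, built on the standard OpenSSH client; the host name is a placeholder, and the AUNetSend port is an assumption (the plug-in’s port is configurable):

# Sketch of the remote-transcription tunnels (illustrative only; the real
# utility is not reproduced here).
import subprocess

SERVER = "transcription.example.ac.uk"  # placeholder campus host
TUNNELS = [
    ("5901", "localhost:5900"),    # VNC screen sharing
    ("52800", "localhost:52800"),  # AUNetSend audio stream (assumed port)
]

args = ["ssh", "-N"]  # forward ports only, no remote shell
for local_port, remote in TUNNELS:
    args += ["-L", f"{local_port}:{remote}"]
args.append(SERVER)

subprocess.run(args)  # blocks while the tunnels are up; Ctrl-C tears them down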
Some more changes
• Widening participation at our institution
• Preparing the transcription workflow for integration with the Blackboard VLE
• Scaling existing transcript search/index tools into a transcript repository
• Support for subtitling of video lectures (see the sketch after this list)
• Exposing more configuration options of the audio segmentation tools
• Annotation of transcripts: integrating the “Marginalia” annotation engine (www.geof.net/code/annotation)
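Because the workflow already produces time-aligned transcript segments, subtitling largely reduces to serialising them; a minimal Python sketch emitting SubRip (.srt), with the (start, end, text) segment structure assumed:

# Sketch: turn time-aligned transcript segments into SubRip (.srt) subtitles.
# The (start, end, text) tuple structure is an assumption about the workflow.
def to_timestamp(seconds: float) -> str:
    total_ms = int(round(seconds * 1000))
    h, rem = divmod(total_ms, 3_600_000)
    m, rem = divmod(rem, 60_000)
    s, ms = divmod(rem, 1000)
    return f"{h:02d}:{m:02d}:{s:02d},{ms:03d}"

def write_srt(segments, path="lecture01.srt"):
    with open(path, "w", encoding="utf-8") as f:
        for i, (start, end, text) in enumerate(segments, start=1):
            f.write(f"{i}\n{to_timestamp(start)} --> {to_timestamp(end)}\n{text}\n\n")

write_srt([(0.0, 4.2, "Welcome to the lecture."),
           (4.2, 9.7, "Today we cover audio segmentation.")])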
URL lists trial
• Not possible to trial the annotation system
• Instead, students were asked to submit URLs via a web questionnaire (surveymonkey.com)
• Each URL had to complement, correct, or augment lecture content
• Each submission had to be accompanied by related keywords
• To entice students to participate, students could earn 0.5% of module total marks per URL, up to a maximum of 5% of module marks
Results of URL lists trial
• 164 students across two modules participated
• (109) Multimedia Systems: Contexts and Applications
• (55) Computing, Audio, and Music
• Submissions were processed before publishing, to flag URLs already part of lecture content
• ~1000 URLs were fed back to students for revision purposes
• URLs were organised by lecture topic and were fully searchable by keyword (see the sketch after this list)
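One way to make roughly a thousand submitted URLs keyword-searchable is an inverted index; a minimal Python sketch, with the submission fields assumed from the trial description:

# Sketch of a keyword-to-URL inverted index for the submitted links.
# Submission fields (url, topic, keywords) are assumed, not the real schema.
from collections import defaultdict

submissions = [
    {"url": "https://example.org/mpeg", "topic": "Video coding",
     "keywords": ["mpeg", "compression"]},
    {"url": "https://example.org/fourier", "topic": "Audio",
     "keywords": ["fourier", "spectrum"]},
]

index = defaultdict(list)
for sub in submissions:
    for kw in sub["keywords"]:
        index[kw.lower()].append(sub)

def search(keyword: str):
    """Return all submissions tagged with the given keyword."""
    return index.get(keyword.lower(), [])

print([s["url"] for s in search("MPEG")])  # -> ['https://example.org/mpeg']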
Student usage of URL lists
• Students taking the final examination were asked: “Did you refer to the shared URLs as part of your revision?”
• 54 said yes, 55 said no, 5 did not respond
• Also: “How helpful for your revision do you consider the shared URLs to have been?”
• 8 very helpful
• 45 moderately helpful
• 45 no opinion or did not respond
• 10 moderately unhelpful
• 6 very unhelpful
…student usage continued
• Of the people who used the URLs as part of revision:
• 46 reported moderately/very helpful
• 8 reported no opinion, moderately unhelpful, or very unhelpful
• Interesting responses from students to the question “Do you have any further comments about the shared URLs?”
• It was helpful to find and submit [URLs] although I'm not sure I will ever have enough time to look at them all. I probably will look at some but purely for 'fun' rather than revision
I didn't dedicate enough time to this as a result I didn't gain too much from it apart from the 10 marks • Although not for revision for this exam, I have used [the URL lists] for reference for the group project • To many to look through all of them properly, but found some very useful to revising • The idea of getting students to help produce a resource for the module works well, as it can be better tailored to suit the needs of the students • Good for reading out of self interest but to wide a scope of information that is not directly relevent to the specific exam questions that we have to answer • Some of the URLs provide very good information, however some information is not relevant or outdated. …student responses
Community of co-learners
• Trialling a full transcript annotation system could:
• Promote personal study – sharing of tips/concept explanations with other students
• Encourage a two-way dialogue between students and lecturer
Comparing impact of trials
• Statistics from three cohorts of “Multimedia Systems: Contexts and Applications”
• 2008/09 – before trials
• 2009/10 – collaborative transcript correction
• 2010/11 – collaborative URL list compilation
• Good news: steady increase in the percentage of students who achieve 70-100 for coursework: 28% in 2008/9, 31% in 2009/10, and 52% in 2010/11
…comparing impact continued
• Bad news:
• The majority of students across all three years achieved exam marks in the 40-49 and 50-59 bands
• Steady decrease in the percentage of students who maintain or improve on their coursework mark in the exam: 26% in 2008/9, 25% in 2009/10, just 12% in 2010/11
• Exam performance dipped, then recovered:
• In 2008/9, the largest group (44.6% of the cohort) earned exam marks in the 50-59 band
• In 2009/10, the largest group (43.8%) earned exam marks in the lower 40-49 band
• In 2010/11, the largest group (41.0%) earned exam marks in the higher 50-59 band again
Why is our approach not helping?
• Perhaps students view these trials as tasks and do not engage in reflective learning?
• Perhaps students do not WANT to read around the subject, or to engage any more than necessary to pass modules?
• So far, the support systems under trial have only been made available toward the end of each academic year. Providing access from day one may help.
Plans for 2011/12
• Work already underway
• Augment the transcription workflow to support subtitling of video lectures
• Widen participation in lecture transcription at our institution
• Develop an open-source toolkit for other institutions to use
• Work still to do
• Trial the collaborative annotation system
• Annotation of video materials
Thank You for Listening
• Any questions or comments?
• Software links
• https://github.com/uwePhillPhelps/repositories
• http://www.cems.uwe.ac.uk/~p2phelps
• Contact philip3.phelps@uwe.ac.uk or dan.buzzo@uwe.ac.uk for more details, further enquiries, etc.