Evaluating e-learning


Presentation Transcript


1. Evaluating e-learning
Oxford Centre for Staff and Learning Development, Oxford Brookes University
rsharpe@brookes.ac.uk

2. Critical success factors
• Implementations which tackle real and relevant problems at the course level
• There may be advantages to using a poorly defined term in institutional change
• Institutional rationales which are contextualised and specific
• Students’ understanding of their own learning and the role of resources and technology in learning
• Course designs or redesigns undertaken as a team, developed iteratively over a number of years in response to student feedback
Sharpe et al. (2006) Review of the undergraduate experience of blended e-learning, for the HEA

3. Some examples
• Brookes Virtual
• Brookes Partnerships in Practice
• JISC Learner experience programme
• Brookes Pathfinder

  4. 1. Brookes Virtual

5. Perceptions and satisfaction
[Chart: University of Wales, Swansea, Blackboard user experience survey, 2003]

6. “We’ve never done any surveys, ever, that have given other than the students want more of it, wider and deeper” (Longside 2)

7. 2. Partnerships in Practice
“We just thought … we’ll just use our ordinary module evaluation. Well it wasn’t going to work was it? It’s not asking the right sort of questions.” (Deepshire 1)

8. 2. Partnerships in Practice
• Collected feedback through:
  • student perception questionnaire
  • student SPOT analysis in groups
• Students used the VLE to:
  • access resources
  • support groupwork processes

9. “reading through all the feedback data from students and tutors is like standing at the apocryphal spaghetti junction and watching cars going every which way. Some students call for more group work: others want none at all. … Advice fumes the air.” (Mason & Weller, 2000)

10. 3. In their own words
9 funded projects, all using qualitative methods to capture the learner voice, e.g. audio logs, video diaries, interviews.

11. “I'm addicted, it's the first thing I turn on in the morning before I even wake up and actually it's very, very bad. I think in the future people can't cope without their laptops.” (Undergraduate Business student, LEX Final Report)

12. Coding examples
[Screenshots: document with coding stripes showing coding at nodes; document context view showing coding at the ‘Motivation’ node]
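To make “coding at nodes” concrete, here is a minimal sketch of one way coded qualitative data can be represented and counted per node, in the spirit of NVivo-style analysis though not the project's actual setup; the passages and node names below are illustrative only.

```python
# Minimal sketch of "coding at nodes": transcript passages tagged with
# analytic nodes, then grouped so each node's evidence can be retrieved.
# Passages and node names are illustrative, not the project's real data.
from collections import defaultdict

coded_segments = [
    ("I'm addicted, it's the first thing I turn on in the morning", ["Motivation"]),
    ("people can't cope without their laptops", ["Motivation", "Dependence"]),
    ("Some students call for more group work: others want none at all", ["Group work"]),
]

by_node = defaultdict(list)
for passage, nodes in coded_segments:
    for node in nodes:
        by_node[node].append(passage)

# Report how densely each node is coded across the transcripts
for node, passages in sorted(by_node.items()):
    print(f"{node}: {len(passages)} coded passage(s)")
```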

13. “Besides all the complexity created by marked differences across subject areas and myriad individual differences among both staff and students which prevent simple patterns emerging, there are additional crucial differences between the idealized world described by research and the actual world experienced by the participants.”
Entwistle, N., McCune, V. and Hounsell, J. (2002) ‘Approaches to Studying and Perceptions of University Teaching-Learning Environments: Concepts, Measures and Preliminary Findings’. Online: http://www.ed.ac.uk/etl/docs/ETLreport1.pdf

14. 4. Brookes Pathfinder
• Combined-methods approach in which various types of data have been collected
• Series of seven case studies to evaluate the nature and the impact of innovation on learner experiences
• Questionnaire on learner perceptions of learning technology use

15. Questionnaire
• Provides descriptive statistics on learner characteristics
• Investigates patterns in learner uses of technology
• Analyses how study strategies (such as the use of peers or help-seeking) and views of learning (such as the perceived degree of independence) are related to various dimensions of technology use (see the sketch below)
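As an illustration only of the kind of analysis this slide describes, here is a minimal sketch assuming a tidy CSV of questionnaire responses; the file name and column names are hypothetical, not the Pathfinder instrument's.

```python
# Hypothetical sketch of the questionnaire analysis described above.
# File and column names are assumptions, not the real instrument.
import pandas as pd

responses = pd.read_csv("pathfinder_questionnaire.csv")  # assumed export

# Descriptive statistics on learner characteristics
print(responses[["age", "year_of_study", "hours_online_per_week"]].describe())

# Relate study strategies and views of learning to technology use
strategies = ["use_of_peers", "help_seeking", "perceived_independence"]
tech_use = ["vle_logins", "online_discussion", "resource_downloads"]
print(responses[strategies + tech_use].corr().loc[strategies, tech_use])
```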

16. What we’ve seen
1. Brookes Virtual – audit
2. PiP – course improvement
3. JISC learner voice – discovery
4. Pathfinder – understanding

17. Possible functions of evaluations
• Is it being used?
• How is it being used?
• Do the staff and students like it?
• How do the students experience it?
• Is it effective?
• How could we make it better?

18. Purposes of evaluation
• Formative: to ensure that a product reflects the intentions of its designers AND meets the needs of the users
• Summative: to test a product
• Integrative: to integrate a product into a new environment
• Pragmatic: to discover the unexpected benefits
• Illuminative: to get a project funded
Goodyear (2001) Effective networked learning in higher education: notes and guidelines, pp. 37–40

19. Thinking about your purposes
Use the lists of purposes and questions to complete the stakeholder template. Who are your key stakeholders? What are their priorities? What questions are of interest to them?

20. References
Ramanau, R., Sharpe, R. and Benfield, G. (2008) ‘Exploring Patterns of Student Learning Technology Use in their Relationship to Self-Regulation and Perceptions of Learning Community’. Paper to be presented at the Networked Learning Conference, May 2008, Halkidiki, Greece.
Sharpe, R., Benfield, G., Roberts, G. and Francis, R. (2006) The undergraduate experience of blended e-learning: a review of UK literature and practice undertaken for the Higher Education Academy. At www.heacademy.ac.uk/4884.htm
Sharpe, R. and Pawlyn, J. (2008) ‘The role of the tutor in blended e-learning: experiences from interprofessional education’ in R. Donnelly (ed.) Applied eLearning and eTeaching in Higher Education.
