
The Evaluation of Innovative Learning Technologies: the Case of Mobile Learning






Presentation Transcript


  1. The Evaluation of Innovative Learning Technologies: the Case of Mobile Learning John Traxler & Agnes Kukulska-Hulme University of Wolverhampton & Open University

  2. Evaluating Mobile Learning
  • Our motivation – to improve the evaluation of mobile learning by
    • exploring current practice
    • reflecting on emerging issues
    • making recommendations
  • Our belief
    • Evaluation rests on one’s definition of mobile learning

  3. Definitions of ‘Good’ Evaluation
  • Few and far between
    • Possibly some that could be imported from e-learning
    • Or perhaps from educational research
    • Choice, validity and transferability issues?
  • No consensus
    • Perhaps a research question
    • Delphi or ‘contrived’ techniques amongst ‘experts’ or stakeholders?
  • No ‘a priori’ definition
    • But possibilities in terms of internal self-consistency
    • Alignment of aims, methods, epistemology, evaluation

  4. Definitions of ‘Good’ Evaluation
  • Perhaps every evaluation is contingent
    • We can learn nothing!
  • Perhaps every evaluation is driven by its readership
    • Or its funders

  5. An Early Attempt at ‘Good’ Evaluation
  • Rigorous – conclusions must be trustworthy and transferable
  • Efficient – in terms of cost, effort, time
  • Ethical – in relation to the nuances of evolving and untried forms of provision
  • Proportionate – not more ponderous, onerous or time-consuming than the learning experience or the delivery and implementation of the pilots themselves
  • Appropriate – to the specific learning technologies, to the learners and to the ethos of the project concerned; ideally built in, not bolted on
  • Consistent – with the teaching and learning philosophy and the conceptions of teaching and learning of all participants
  • Authentic – in accessing what learners (and perhaps teachers and other stakeholders) really mean and really feel; sensitive to the learners’ personalities within those media
  • Aligned – to the chosen medium and technology of learning
  • Consistent across – different groups or cohorts of learners, in order to provide generality; across time, that is, the evaluation is reliably repeatable; and across whatever varied devices and technologies are used

  6. Evaluation as the Process of Looking for Change
  [Diagram: ‘Layers of Change’ – concentric circles, innermost to outermost: cognitive, formal, meta-cognitive, informal, affective, social & economic]
  • Realist view: the outer circles are confounding/extraneous variables for the inner circles
  • Constructionist view: the outer circles ‘construct’ the inner circles

  7. Reasons for Wanting to Conceptualise ‘Good’ Evaluation
  • credibility and trustworthiness of outcomes, findings, results
  • requirements from funders
  • improved critical and reflexive practice
  • parity with education and other areas of social research

  8. Some Observations
  • Few accounts
    • had an explicit position on pedagogy or epistemology
    • cited anything from the literature of evaluation
    • cited anything from the literature of the ethics of evaluation
    • mentioned any ethical issues in relation to their evaluation

  9. Some Observations
  • Most accounts used
    • focus groups, interviews, questionnaires – and assumed what they were told was true
    • some observation, some system logs
  • A few accounts used
    • several techniques, and triangulated
  • No one mentioned
    • external evaluators

  10. Some Observations
  • Most accounts used only one or two techniques
  • Few accounts mentioned confidence, piloting, reliability
  • Few accounts provided their instruments
  • Few accounts mentioned using mobile devices

  11. Some Examples “….. and, when finished, are asked to fill out a questionnaire about the experience. The questionnaire is composed of 21 statements with which the participants have to express their agreement using 10- and 5-point Likert scales.” (Zancanaro et al., 2004)

  12. Some Examples “The interviews in the pilot study were semi-structured. The first interview was recorded on a mini-disc, but during the interview the students supported what they were saying by referring to their handheld, so it seemed that it would be more fruitful to film the interviews. As a result, the information saved on the students’ handhelds was also recorded.” (Mifsud, 2004)

  13. Some Examples “data from an end-of-course questionnaire about student study habits, external access to technology and their attitudes to the XDA (it also covered preferred input methods and feelings about the usefulness of the software and functionality provided)” (Luckin et al., 2004)

  14. Some Examples
  • "very informal conversations with (an admittedly small number of) students"
  • "engage youngsters between 16-22 years to... take part in a number of student panels to discuss possible applications ... During the panel sessions participants are interviewed ... they are also asked to evaluate customised educational software applications"

  15. Some Examples “Classroom observations and informal interviews with pupils and teachers were undertaken. The first design of the pilot study opened up for observation not only ‘inside’ the classroom but also ‘outside’ – in the canteen, library and school-yard.” (Mifsud, 2004)

  16. Some Examples “The pilot tour was taken by 852 visitors who completed evaluation forms recording their experiences. In addition, qualitative focus group studies were conducted by the Susie Fisher Group. The software system used in the trial also logged all uses of the MMT and provided a statistical picture …..” (Proctor & Burton, 2004)

  17. Some Examples “After two weeks of observation, some students were interviewed. Two different forms of interview were used – group or focus interviews and the more traditional one-to-one interviews. The lessons observed were taken as the starting point of the interview.” (Mifsud, 2004)
