
The Academy Research Observatory: development and community engagement



Presentation Transcript


  1. The Academy Research Observatory: development and community engagement Martin Oliver, London Knowledge Lab & Higher Education Academy

  2. Overview • Context: what’s motivated this? • Evidence, practice, policy and e-learning • Projects at the IoE • The Academy’s Research Observatory • The landscaping study • Early pilots • Current developments • Future plans • Discussions • Your uses of evidence; the Observatory; etc

  3. Some background • Familiar account of the rise of New Managerialism • Monitoring, accountability and performance indicators • Private, professional judgements (e.g. around pedagogy) made into strategic, manageable issues • “What gets measured gets done” • Rational and pragmatic

  4. Evidence-based practice (a caricature) • Draws from a positivist tradition of enquiry • Evolution in medicine • The hierarchy of evidence – reliability as key • Summarise and aggregate studies appropriately (and often, mathematically)

  5. However… • Medical model doesn’t necessarily work in e-learning • Qualitative research acceptable or favoured • Education as a theory-building field (whilst evidence-based practice movement eschews theory for pragmatics) • Practical (and political) issue of de-centred field (no Cochrane database… what fields should we cover?) • Evidence of benefits from involving practitioners in research, rather than differentiating researchers from practitioners (the studied) • What alternatives might there be…?

  6. And more generally… • Clegg’s critique of systematic review • Appeal to evidence as part of wider erosion of professional judgement • ‘Consumer’ represented in the data, but only as constituted and spoken for by the researchers • Reviews undertaken to shore up policy, not inform it • ‘Black box’ of favoured methods fails to explain what works – which is important given complexity of teaching and learning • Consequent gap between reviews and practical application

  7. A theorisation • Wenger’s communities of practice model • Practice and Reification as necessary but fundamentally different (experience and abstraction) • Reifications received by communities and meaning negotiated in relation to practice • A social account of acceptable (policed) interpretation • This has implications for how we make sense of research

  8. An aside about legitimation • Lyotard’s account of knowledge in post-modernity • Separation of ‘knowledge’ from the knower • Knowledge as a commodity, valorised through use (consumption) • Value to researchers? To practitioners? To policy makers? • Scepticism about grand narratives • Language games as the means of legitimising claims • (…somehow, this should all connect…)

  9. So… • More general interest in ‘evidence informed’ practice and policy • Problems perceived to exist at various stages: • Finding research • Making sense of it • Acting on it • Not strongly visible in UK policies • cf. No Child Left Behind: “scientifically-based research”

  10. …but evident in policy discourse • “We need social scientists to help determine what worked and why, and what types of policy initiative are likely to be most effective” • Blunkett, 2000

  11. Harnessing Technology • A practice-based research environment • 64. Our best understanding of how to improve practice comes through learning from others – from experimentation, collaboration, and dissemination. We need an active, practice-oriented R&D forum to bring together teachers, lecturers, researchers, and industry to develop the leading edge applications. It would combine the practitioner knowledge of teachers and lecturers with the specialist knowledge of learning technologists and industry suppliers. It should help to build the evidence-base for the value and impact of e-learning.

  12. An interesting start • A social account of knowledge-building • Explicitly linked to practice • Although… • Political imperative: unifying practice, support and the market • Evidence “for” (not to change?) value and impact

  13. Research into the ‘teaching-research nexus’ • This research indicates that staff who see their research as tentative and as part of a wider debate in the discipline, and see their teaching as supporting student conceptual change, are more likely to bring their teaching and research together. • By contrast staff who see their research as atomistic investigations and their teaching as concentrated on teacher-focused transmission of information are less likely to experience strong connections between teaching and research.

  14. At departmental level teaching and research are now often organised separately, and in many cases limited thought is given to, and few explicit policies determine, how they might be linked. Indeed in some cases it may well be that the pressures for research selectivity, such as those of the RAE, are causing increased fissures between teaching and research.

  15. The ‘nexus’ isn’t one thing • Differentiation between: • Research-tutored – research resources central to curriculum • Research-based – inquiry-based learning • Research-led – oriented to subject content • Research-oriented – focus on process of knowledge construction

  16. Meanwhile, back at the ranch… • IoE participation in Academy-funded e-Learning Benchmarking work (Mellar et al) • A structured programme of self-assessment • But… “Whilst the changes to strategy are important, it is clear that these in themselves will not be able to bring about the changes needed. The rationale for participation thus became more one of what we could learn generally from other participants’ experiences: about the meaning of e-learning for them, and the way in which they organised and managed e-learning in their institutions.” (Benchmarking blog entry)

  17. PREEL • From Pedagogic Research to Embedded e-learning • Building on the Benchmarking project • Funded through the Academy Pathfinder funding • “Within our institution (and we suspect in many others) there are pockets of excellent practice in e-learning, and several strong research communities (as well as specific work undertaken by individuals) but there is a lack of coordination, and a recognition that this wealth of experience is not coming together to provide a unified e-learning experience for all our students. This project sets out to bridge this gap between e-learning research and e-learning practice within the IoE, and to demonstrate how this was done in such a way as to be of value to other HE institutions.”

  18. PREEL • Led from the LTU • Projects to redevelop curricula, leading to publications • A network built around this work • Workshops to share existing research • Report summarising research, tools, resources • A real person to talk to! (A key element) • A focus on involving people in research

  19. PREEL2 • Building on the work in Quality Assurance at a national level • e-Learning QA/QE Special Interest Group • A community-led initiative, involving QAA

  20. London Pedagogy Planner • Laurillard et al – building on existing research into curricula and technology • Development of a “prototype for a collaborative online planning and design tool that supports lecturers in developing, analysing and sharing learning designs” • A formal, computable representation of course-related decisions
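
The slides give only the planner's aim, not its data model, so the following is a minimal sketch of what a "formal, computable representation" of course-related decisions might look like. The names (Activity, LearningDesign, the activity kinds) are my own illustrative assumptions, not the LPP's actual schema.

```python
# Hypothetical sketch of a computable learning design, in the spirit of the
# LPP's "formal, computable representation of course-related decisions".
# All class and field names are illustrative, not taken from the LPP.
from dataclasses import dataclass, field
from collections import defaultdict

@dataclass
class Activity:
    name: str
    kind: str            # e.g. "acquisition", "discussion", "practice"
    learner_hours: float
    online: bool = False

@dataclass
class LearningDesign:
    title: str
    activities: list[Activity] = field(default_factory=list)

    def hours_by_kind(self) -> dict[str, float]:
        """Total learner hours per activity type: once the design is
        machine-readable, this kind of analysis becomes trivial."""
        totals: dict[str, float] = defaultdict(float)
        for a in self.activities:
            totals[a.kind] += a.learner_hours
        return dict(totals)

design = LearningDesign("Intro module", [
    Activity("Lecture", "acquisition", 2.0),
    Activity("Online forum debate", "discussion", 1.5, online=True),
    Activity("Lab exercise", "practice", 3.0),
])
print(design.hours_by_kind())
# {'acquisition': 2.0, 'discussion': 1.5, 'practice': 3.0}
```

Once decisions are captured in a structure like this, sharing a design is just serialising the object, and analysing it (for instance, the balance of activity types above) becomes a query rather than a reading exercise.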

  21. Developing strategies for translating e-learning research into practice • Building on PREEL, LPP, etc. • Mellar, Laurillard & Hadjithoma-Garstka • PREEL process: “practitioners could not see the links between the research and their particular needs” • LDSE (Learning Design Support Environment): “It will be a space for teachers to get support in designing their lessons with integrated technology, share their practices of technology enhanced learning and make it easier for others to embed TEL in their practices.” • Action research as a way forwards?

  22. Meanwhile, somewhere in York…

  23. The Observatory • A service promoting and exploring the use of practice- and research-based evidence to influence policy and practice in teaching and learning in Higher Education • Tools to enable access to evidence and syntheses of this evidence (e.g. prototype repository and syndicated search, wiki) • Spaces (real and virtual) to help communities explore evidence-based practice and its implications for students' learning
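
Slide 23's "syndicated search" implies federating a query across distributed sources rather than holding everything in one repository. As a rough illustration only, the sketch below polls placeholder RSS feeds and merges the matching items; the example.org URLs are invented, and the Observatory's real sources and interfaces are not described in the slides.

```python
# Minimal sketch of syndicated search: query several RSS feeds and merge
# items whose titles match a search term. Feed URLs are placeholders.
import urllib.request
import xml.etree.ElementTree as ET

FEEDS = [
    "https://example.org/repository/rss",   # hypothetical evidence repository
    "https://example.org/wiki/recent.rss",  # hypothetical community wiki feed
]

def syndicated_search(term: str) -> list[dict]:
    results = []
    for url in FEEDS:
        try:
            with urllib.request.urlopen(url, timeout=10) as resp:
                tree = ET.parse(resp)
        except OSError:
            continue  # skip unreachable feeds rather than failing the search
        for item in tree.iter("item"):  # standard RSS 2.0 <item> elements
            title = item.findtext("title", default="")
            if term.lower() in title.lower():
                results.append({"title": title,
                                "link": item.findtext("link", default=""),
                                "source": url})
    return results

for hit in syndicated_search("e-portfolio"):
    print(hit["source"], "->", hit["title"])
```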

  24. A work in progress • Initial proposal to HEFCE for an e-Learning Research Observatory • Landscaping study • https://mw.brookes.ac.uk/display/hearoc • Judged to have wider relevance • A research observatory for Higher Education • e-Learning, Widening Participation, Employer Engagement • Other ‘strands’ may be added – although the strands may not be explicitly represented in the final structure

  25. Overview of development • August 07 – July 08 • Scoping the Observatory • Landscaping report for e-learning (exemplar area) • Generation of pilot resources and services • Proof of concept piloted at Academy conference • July 08 – July 09 • Development phase • Pilots focusing on community engagement • Wider consultation • Promoted at Academy conference July 09

  26. Landscaping evidence use in e-learning • Series of exploratory studies within e-learning (Beetham, Sharpe & Benfield) • ‘Landscaping’ consultation • Interviews with key informants • Survey (116 responses) • Follow-up interviews • https://mw.brookes.ac.uk/display/hearoc/

  27. Yes, but… • Keen on single point of access; research reviewed, evaluated and synthesised • After that, great variability • “The impossibility of categorising respondents as users, producers, policy makers or intermediaries for research is in itself an important outcome.”

  28. HB: What kind of research or evidence should a research observatory focus on? • I: Evidence that e-learning really works. • HB: What would that evidence look like? • I: It would need to show real improvements to learning outcomes, across a large number of students. It would have to have credibility and rigour. • HB: Can you think of an example of research evidence of that kind? • I: Not off the top of my head, no. • HB: So does this research really exist?

  29. I: No, the observatory would have to fund it. OR Yes, it is out there, the observatory will have to work really hard to find it. • HB: Can you think of a situation when evidence like this has really changed people’s practice or understanding, in your experience? • I: Well, the cynics always ask for evidence that e-learning really works. • HB: Do you think that evidence, if you had it, would lead them to change their minds? • I: No, they would find arguments against it from their own discipline perspective.

  30. HB: So what about people who are actually open to change? • I: They never ask for evidence. They ask for examples, especially from their own subject area, and practical ideas. They are really responsive to other people in their discipline who have tried something and made it work.

  31. Identified strategic choices, approaches and risks • Strong persuader or neutral observer? (Setting agendas) • Funder? • Push or pull communications? • Quality assured, selective research or evidence and examples? • Audience: researchers, intermediaries, practitioners or policy makers? • Building authority or democratic knowledge? (Who can write?) • Central or local knowledge management? • Face-to-face or technology supported networks?

  32. Building from this: e-Learning pilot • How do communities produce, share and use evidence? • QA/QE in e-learning SIG • Expert review about e-portfolios • Discussion about professional role (M25 Learning Technologists group) • Open peer commentary on the national development programmes • Review processes, to identify approaches that may have wider value • Feed these back to inform the development of the whole observatory

  33. Echoed conclusions from IoE projects • Resources provided for people were not necessarily (ever?) taken up • Communities rise and fall • Expert review uninviting to others (useful, but not an invitation to contribute) • Open peer commentary engaged invitees then stopped

  34. Pilot site for preliminary consultation

  35. Building from this pt2: wider consultation • Inviting contributions from wider groups • Employer engagement and widening participation communities • TLRP conference workshops • ELESIG meeting (Thursday) • Case studies with communities (Heads of e-Learning Forum) • Input into specifying the Observatory; documented cases of evidence generation and use

  36. Building from this pt3: e-Learning projects • Small grants for research • Old model: projects awarded, visited once, conclude by generating reports • New model: projects awarded, brought together, given technical infrastructure (wiki, Ning) • Will be visited, encouraged to use Web 2.0, brought together mid-project and again at end

  37. Wiki for Academy-funded projects

  38. However… • Continued visibility of central resource • Yes, it’s a repository… but it’s not just a repository • If it’s just a website it’ll be pointless • Invitation to engage does not guarantee engagement

  39. And what next? • New round of projects (provisionally) • Smaller-scale reviews • Special Interest Groups • Possibly projects about technology to support evidence-informed practice • Emphasis on engagement (I hope) • Wiki-based reviews for open engagement • Social networks – finding people, not just research outputs

  40. Conclusions? • Scepticism about centralised determination of evidence and method • Authoritative or authoritarian? • Interested in community production and use of evidence • Community-owned tools and spaces as a worthwhile experiment

  41. Some possible starting points for discussion • A personal bias towards a social account of negotiating the meaning of encounters with research evidence • How credible is the approach represented here? • Is this approach to evidence the right one? • How should the choices outlined earlier be responded to? • Will this be able to help change practice?

  42. How does this relate to you and your practice? • What kinds of decisions do you draw on explicit evidence for? • How could these be supported? • How could you share your conclusions? • What kinds of synthesis or structuring would be most useful and credible?

  43. Emails • m.oliver@ioe.ac.uk • martin.oliver@heacademy.ac.uk • observatory@heacademy.ac.uk • More information at: http://www.heacademy.ac.uk/ourwork/research/observatory • Piloting site: http://academy-research-observatory.pbwiki.com/
