
Discerning Futures



Presentation Transcript


  1. Course Leaders’ Conference 2013
     Discerning Futures

  2. Plenary: Introduction and Context
     Professor Sally Glen, Deputy Vice Chancellor, Student Experience

  3. Improving the student learning experience through improving courses
     Professor Graham Gibbs

  4. ‘Dimensions of Quality’
     Literature review to inform debates about:
     - whether UK HE is comparatively good
     - whether university league tables are valid
     - whether the NSS and KIS provide information students can trust

  5. ‘Implications of “Dimensions of Quality” in a Market Environment’
     Review of institutional behaviour:
     - is how universities are responding to their PIs likely to “drive up quality”?
     - which enhancement strategies are working?

  6. ‘Presage’ variables
     - Resources per student predict much less than one might expect (but learning resources predict effort)
     - Selectivity predicts performance, but not learning gains, engagement, or use of pedagogies known to enhance engagement
     - Research predicts performance, but not engagement, and negatively predicts satisfaction and measures of learning gains
     - Who does the teaching predicts performance and gains
     - Reputation predicts only selectivity, funding and research
     - Peer ratings only reflect reputation (US and TQA)

  7. ‘Process’ variables
     - Cohort size, class size, ‘close contact’ with teachers (SSRs) (cohort effect avoidable...)
     - Not class contact hours but total study hours
     - Quality of teaching: training, student ratings, but not teachers’ research
     - Quality of research environment: not at undergraduate level
     - Consequences for learning: deep and surface approaches
     - Engagement: close contact, high and clear expectations, good quick feedback, active and collaborative learning, time on task

  8. ‘Product’ variables
     - Degree classifications
     - Retention
     - Employability
     ...too many confounding variables to be able to make much sense of any of this data, and degree classifications and employability data are highly unreliable

  9. What to pay attention to in terms of pedagogy?
     - Changing students: effort, internalisation of goals and standards, metacognitive awareness, self-efficacy
     - Changing teachers: who teaches, and how sophisticated they are
     - Moving from solitary to social learning
     - Changing curricula:
       - focussing course design, review and evaluation around learning hours
       - shifting from summative to formative assessment
       - making programmes coherent, with comprehensive changes implemented by course teams, not only by individuals (no matter how wonderful)

  10. Departments and social mediation of quality
      - Programmes vary widely in quality within institutions (except where there is an ‘institutional pedagogy’)
      - It can be very difficult for individual teachers to adopt effective pedagogies if no-one else does
      - Institutions with no QE focus on programmes have problems
      - Communities of practice (Havnes)
      - Talking about teaching at programme level (TESTA)
      - Employment practices (adjunct faculty, pseudo-departments, Fordism)
      - Modular structures, no assessment (or even shared understanding) of programme outcomes
      ...implies an increased developmental focus on departments or course teams (Lund, Oslo, Finland, Utrecht)

  11. The ‘how’ of change...
      1 Using teaching PIs to improve quality
      2 Unanticipated impacts on curricula
      3 Managerial vs devolved change
      4 Student engagement
      5 QA

  12. 1 Using teaching PIs to improve quality
      - Unprecedented attention to quantitative PIs
      - Average NSS scores up every year
      - Some institutions climbing rankings every year
      ...by paying attention and using clever change processes:
      - Exeter
      - Coventry
      - Winchester: TESTA assessment and feedback

  13. The first degree programme at Winchester to use TESTA is now top ranked nationally

  14. University of Winchester

  15. 1 Using teaching PIs to improve quality
      - Unprecedented attention to quantitative PIs
      - Average NSS scores up every year
      - Some institutions climbing rankings every year
      ...by paying attention and using clever change processes:
      - Exeter
      - Coventry
      - Winchester: 24 universities now using TESTA

  16. 2 Unanticipated impacts on curricula
      - Whole is less than the sum of the parts (OU, Plymouth, module-level NSS scores)
      - Course rationalisation, abandoning joint degrees
      - Abandoning modularity altogether
      - Bigger, longer, fewer modules, fewer in parallel
      - Planned programme assessment regimes, including programme-level learning outcomes

  17. 2 Unanticipated impacts on curricula
      - Whole is less than the sum of the parts (OU)
      - Course rationalisation, abandoning joint degrees
      - Abandoning modularity altogether
      - Bigger, longer, fewer modules, fewer in parallel
      - Planned programme assessment regimes
      ...but this may cause:
      - Less choice, less engagement
      - Larger classes

  18. 3 Managerial/centrist vs devolved change
      Institutional vs department-level targets for:
      - PIs
      - Volume of feedback
      - Criteria and standards (and hence learning outcomes)
      - Institutional learning outcomes/graduate attributes
      - Volume of assessment
      - Class size
      - Use of VLE

  19. 4 Student engagement
      - Students as change agents across departments (Exeter)
      - Students as educational researchers across programmes (Winchester)
      - Student teams as developers across Faculties (Sheffield)
      - Changed practices, changed student attitudes
      - Better engagement in studies (USA, NSSE)
      - Improved NSS scores (2008–12: 7%, average 2%)

  20. 5 Quality Assurance
      - Annual reviews of NSS scores trumping all other QA and QE processes
      - Valid dimensions of quality entirely missing from formal quality reviews (e.g. formative-only assessment, Jessop 2012; student effort)

  21. Conclusions
      - Teaching quality PIs in the public domain are changing the market and will become more valid, more useful and more influential, and they operate at programme level
      - It is possible to improve your PIs faster than the others
      - The best way to do this is to:
        - take local responsibility at programme level
        - change the institutional infrastructure to enable this to happen
        - involve students in the change process
      - Local leadership of teaching is the new key role in universities
