
Policy Futures: Measuring Students’ Learning Gain and Engagement


Presentation Transcript


  1. Policy Futures: Measuring Students’ Learning Gain and Engagement Deans of Arts, Social Science and Humanities Dr Camille B. Kandiko Howson @cbkandiko King’s College London 12 May 2016

  2. Overview • Student engagement • Student expectations • Enhancing the student experience • Surveys, metrics and learning gain

  3. Background: Student engagement (UK) The participation of students in quality enhancement and quality assurance processes, resulting in the improvement of their educational experience (QAA Quality Code, Chapter B5)

  4. Background: Student engagement (US) “the time and effort students devote to activities that are empirically linked to desired outcomes of college and what institutions do to induce students to participate in these activities” (Kuh, 2009: 683)

  5. Student expectations Framing of ideology Consumerist ethos: Student perceptions of value

  6. Student expectations Framing of practice Student expectations of the learning environment: Clear benchmarks Framing of purpose Student expectations for employability

  7. Student expectations Their course • Evaluation, feedback and feed-forward • Staff: Attributes, practices and attitudes • Equity of opportunity: Personalisation versus standardisation The institution • Students as stakeholders? Community, engagement and belonging • Transition into higher education

  8. Enhancing the student experience • Individual “You are not in contact with an actual person, you know, you’re just filling out a survey, that’s really not engaging whatsoever” (International politics, female, research-intensive) • Minimum benchmarks • Instrumental, organisational, interpersonal, academic

  9. Value for money • Finances: How are tuition fees spent • Value: More ‘high-quality’ contact time, in small seminars and tutorials run by qualified teaching staff, not simply more lectures • Information: How can students find out if they are going to be (and what proportion of the time) taught by well-qualified, trained teaching staff in small settings? Where do tuition fees go and why?

  10. Advice and guidance • More realistic information about a course, what students should expect and what is expected of them • Opportunities for internships, placements and work experience • Promotion and coordination of student services and Students’ Unions activities

  11. Surveys • NSS – under review; follows from the Australian CEQ, developed in the 1980s • UKES – in its fourth year; follows from the US NSSE, developed in the 2000s and run in Canada, Australia, NZ, South Africa, China and recently in Ireland

  12. Current review of the NSS • New questions on engagement • Some changes to existing questions • Some deletions • Report on consultation published February 2016/early May…. • Pilot of new survey Spring 2016 • New survey January 2017

  13. Review of the NSS • Critical thinking • Learning community

  14. Review of the NSS • Student voice • New teaching question

  15. UKES asks about: • how students spend their time • what students feel they have gained from their classes • students’ evaluation of the quality of their interactions with faculty and other students • the value of other important educational activities

  16. Key questions that informed the review of engagement questions in the UK • How do students understand the individual questions? • What do students mean by their responses to survey questions? • How are the questions answered both with and without prompted response categories? • How do students interpret the questions as cohesive scales? • Do students think these are important questions? • Do students have suggestions for changes or additional questions? • How do students respond to similar questions?

  17. Key findings • Diversity of students’ experiences and institutions’ feedback • Institutional-type and subject differences • Engagement surveys as a student voice mechanism • Engagement surveys compared to satisfaction surveys

  18. Student comments “I think this one’s better because it makes you like actually pin point, like you’re given a set of like descriptive answers…questions and you like…it helps you pin point what your issue is. Whereas, in a satisfaction survey, I feel like it’s just disengaged and it just gives you an opportunity to rant rather than see what’s positive and negative at the same time. I feel like if you say satisfaction to a student, will automatically go to negative. If you say, like, survey about how you go on in university, just saying this was good, this was not, I think it’s better because it’s like a more balanced view.” (Female, 22, European Studies) “Well, I thought the NSS was rubbish.” (Male, 22, History, UK student)

  19. Institutions’ feedback to and from students • Students were very positive about institutions exploring the various dimensions of student engagement with different elements of student life • Students voiced discontent that institutions did not seem to do much, if anything, with the various forms of feedback they provide

  20. Green Paper, White Paper and TEF • New positioning of students • Proposed new architecture for HE • …and Learning Gain!

  21. Learning Gain: Challenges • Discipline bias in standardised tests • Motivating students to invest in tests that don’t contribute to assessment • Comparability of some entry and exit measures • Context of UK higher education • Reliability of student self-reports But closest current proxies for learning are satisfaction…

  22. Learning Gain Pilot Projects HEFCE is funding 13 projects involving 70 institutions over three years, looking at: • Grades • Self-reported surveys • Standardised tests • Other qualitative methods • Mixed methods Also a large mixed-methods evaluation study and a cognitive skills questionnaire

  23. Project types I • Conceptual • Embedded • External • Applied

  24. Project types II • Existing data: high numbers, lots of sources • But what is it telling us? • New data: smaller numbers, selected subjects • How do we interpret findings?

  25. Outcome measures Project pipeline • Cognitive skills • Disciplinary • Generic • Employability • Career • Adaptability • Readiness • Sustainability • Satisfaction • Grades/GPA • Entry qualifications

  26. Process measures Affective • Self-efficacy • Well-being • Disposition • Confidence Behavioural • Placements/work-based learning • Engagement • Co-curricular • VLE engagement • Skills self-assessment • Learning patterns • Learner analytics

  27. Points for discussion • Contact time and time-on-task • Individual versus collective student engagement • Broadened notion of student experience (transition, development, employability)

  28. References Kuh, G. D. (2009). What student affairs professionals need to know about student engagement. Journal of College Student Development, 50(6), 683–706. Quality Assurance Agency (2012). UK Quality Code for Higher Education. Part B: Assuring and Enhancing Academic Quality. Chapter B5: Student Engagement. Gloucester: QAA.

  29. Questions? Dr Camille B. Kandiko Howson King’s College London camille.kandiko_howson@kcl.ac.uk @cbkandiko Thank you! http://www.hefce.ac.uk/lt/lg/
