Measuring impact of LfE and LiE activities
Dr Anett Loescher, Research, Development, Partnerships
Indicators of success
• 31% engagement of stakeholders
• 31% maturity
• 25% sustained external collaborations
• 12.5% skills articulation
• 12.5% self-awareness, confidence
• 12.5% numbers in placement
• 12.5% destination
• 12.5% improved, enhanced learning opportunities
• 12.5% employers' role in curriculum design
• 6.25% placement numbers
• 6.25% decrease in withdrawal
• 6.25% increase in return after study break
• 6.25% workforce development
• 6.25% curriculum design addresses learner needs
Issues, hindrances
• 31% student ability
• 25% lack of support and recognition for staff
• 25% singularity of activity and frictions with academic structures
• 18% maturity
• 18% lacking engagement
• 18% delivery, design too traditional
• 18% regulation, admin, compliance
• 12.5% lacking means to measure impact
• 12.5% costs, fees
• competition for placements
• competition from other activities
• lacking resources
• pay expectations of placement students
• culture clash
Lessons learnt
• 37.5% encourage, develop student engagement
• 31% address learner needs and demands regarding delivery
• 18% build, maintain relationships
• learn lessons
• be clear about requirements, expectations, commitments
• realism regarding the time, work and engagement necessary
• bridge the gap between theory and practice
• accept fluctuations in engagement
• realism regarding student ability
• management, admin fit for purpose
• staff support, recognition
• ensure learner understanding
Activity is desirable because...
• fosters collaboration
• makes provision attractive (recruitment drive, differentiates offer)
• identifies, remedies skills gaps
• supports higher-level skills
• prepares for the process of finding, securing employment
• improves retention and success
• engages employers tangibly
• drives innovation in developing, delivering learning opportunities
• supports institutional mission, strategy
Quality assurance, management
• cross-institutional input
• research, scoping to assess demand, existing provision, learner needs
• determined by external frameworks
• sought external expertise
• representation of stakeholders in management structures
• visits to partner providers
• clarified requirements and expectations
• due diligence
Staff involved, staff structures
• dedicated core plus cross-institutional staff
• dedicated core
• staff drawn cross-institutionally as needed
• external structures and institutional core
• external structure and staff drawn cross-institutionally as needed
Support to staff
• coaching, collegiate exchange, self-evaluation
• work shared according to expertise
• targeted training
• external structures manage and administer the activity
• established management framework
• time allowance
Other resources
• none (50%)
• technology for communication, data collection and analysis, delivery
• external funding for operations and management
• policy group
• flexibility
Measuring impact – how did you plan to do it?
• interest
• take-up
• academic success
• employment take-up
• completion
• participation
• employability
• participation
• success
• progression
• DLHE
• tracker into employment
• reflection
• employment
• career progression
• exam, assessment feedback
• when and how set by external framework
• not measured
Measuring impact – planned at what stage?
• start
• throughout
• after
• throughout
• throughout
• end
• end
• start
• end
Critical success factors
• 25% participant numbers
• 18% employment, destination after graduation
• 18% positive student feedback
• 18% learners' progress, achievement, competence
• maturity
• number of links with industry
• quality of employer engagement
• completion
• retention
• numbers on placement
• public awareness
(compare with 'indicators of success')
Approach, method to measure impact
• use institutional management information and combine as appropriate (31%)
• course and group based
• no formal way, not done
• external structure gathers specific data/information, evaluates it
How is impact assessed on...
...learners, beyond learning outcomes
• progression statistics
• questionnaire
• progression stats and questionnaire
• feedback
• learner reflection
• anecdotal
• not done
...those involved in the activity
• not done
• feedback through external structure
• success rate of students
• evaluation
How is feedback collected from...
learners
• 31% evaluation
• 18% questionnaire
• 18% not done
• feedback
• student representative
partners
• 31% not done
• 31% feedback through external structure
• representation
• anecdotal
• mix of anecdotal and formal
• success rate of students
• evaluation
Results from impact assessment used for
• 31% targeted improvement (structural, strategic)
• admin and management
• tie-in with strategies and plans
• 25% development, enhancement of learning opportunities
• modification of operations, delivery, content
• information about provision
• reporting
• promotion
• comparison with baseline to measure progress
• partners more involved in curriculum development
• feedback gathering formalised
Activities...
have been around for...
• less than five years (62%)
• more than five years (18%)
• more than ten years (7%)
are...
• mandatory (37%)
• optional (25%)
• not affiliated to any programmes (31%)
• can be both mandatory and optional
Uptake
Targets set...
• entire cohort (50%)
• not known or set (18%)
• 50-60
• 30-40
• below 20
Targets met...
• can't say
• exceeded (31%)
• missed (25%)
• competition from other activities
• re-structure of programmes
• target group expectations
• incompatible designs
How is measuring impact useful?
• tie outcomes of the activity back to intended aims, objectives – does it work, where are modifications necessary
• articulate benefits, recognise value
• 'skills articulation'
• effect on students
• differentiate effect/value on academia and on partners (employers)