
MIRROR: Evaluating the Effectiveness of Reflective Learning at Work




  1. MIRROR: Evaluating the Effectiveness of Reflective Learning at Work Marina Bratić, Gudrun Wesiak, Angela Fessl

  2. Agenda • Evaluation in MIRROR • Specifying Summative Evaluation Criteria • Evaluation Toolbox and Procedure • Challenges with regard to summative evaluation

  3. I. Evaluation in MIRROR – Introduction • Why summative evaluation? • Methodology for the evaluation • Indicators of reflection and its effects at the individual, inter-individual, and organizational levels • A set of tools and instruments • Individual extensions of the research methodology

  4. I. Evaluation in MIRROR – Evaluating learning effectiveness “in the wild” • Diversity of test beds • Challenging aspects of evaluating learning by reflection • Different types of evaluations in MIRROR • Formative evaluations • Evaluations with summative aspects • Summative evaluation

  5. I. Evaluation in MIRROR – Approaches • Theory-based approach: the evaluation of MIRROR guided by the conceptual model of reflective learning at work (CSRL model) • Goal-based approach: assess whether reflective learning goals and objectives of relevant stakeholders within and beyond the project consortium are met • i* model (Yu & Mylopoulos, 1994) • Adapted evaluation approach of Kirkpatrick (Kirkpatrick & Kirkpatrick, 2006)

  6. II. Specifying Summative Evaluation Criteria – i* Model of Reflective Learning at Work • A: Worker • B: Individual Reflector • C: Individual Team Reflector • D: Collaborative Team Reflector • E: Organisational Reflector

  7. II. Specifying Summative Evaluation Criteria – Kirkpatrick Model: 4 levels of summative evaluation • Level 1: Reaction – To what degree do participants react favourably to our MIRROR apps? • Level 2: Learning – To what degree do participants acquire knowledge, skills, attitudes, confidence, and commitment? • Level 3: Behaviour – To what degree do participants apply what they learn? • Level 4: Results – To what degree do targeted outcomes occur as a result of MIRROR?

  8. II. Specifying Summative Evaluation Criteria – Levels of Evaluation and Evaluation Criteria (Kirkpatrick levels → summative evaluation criteria → i* model) • Level 4 Results → Business Impact • Level 3 Behaviour → Work-related Criteria → Work • Level 2 Learning → Outcome Criteria → Learning Outcome; Process Criteria → Learning Process (Reflection) • Level 1 Reaction → General Criteria → App Usage

  9. II. Specifying Summative Evaluation Criteria – Levels of Evaluation and Evaluation Criteria (evaluation criteria type: processes & outcomes → realisation in MIRROR) • Work improvement: KPI measures • Learning Process: Short Reflection Scale, app-specific reflection questions • Learning Outcome: log file data of app usage, self-report of app usage

  10. III. Evaluation Toolbox and Procedure • The Evaluation Toolbox specifies research instruments and measures, e.g. questionnaires, log file data, interviews after app usage, observations, self-assessments, KPIs … • The evaluation procedure specifies how the Evaluation Toolbox is used by the developers and test beds to evaluate the success of the MIRROR apps • a transparent evaluation process • assistance for the developers and test beds during app testing • a high degree of freedom for developers and test beds • a strong summative evaluation at the end of the project

  11. IV. Challenges with regard to summative evaluation • Our evaluation approach... • requires time upfront to specify the conceptual model and the links between stakeholders, processes, activities, and outcomes • requires clearly articulated goals and objectives • must capture all relevant processes and all important aspects of reflective learning at work • must be able to deal with unexpected results • requires strong commitment from the test beds

  12. Thank you for your attention! Do you have any questions?
