
1st UK Workshop on Constructive Alignment


Presentation Transcript


  1. 1st UK Workshop on Constructive Alignment 23rd February 2006 Cobham - Jacques

  2. Constructive Alignment - Reflections on Implementation
     David Cobham, Head of Undergraduate Studies
     Kevin Jacques, Senior Lecturer in Computing
     Substantial credit is due to Paul Reeve
     Background Information
     • Implementation process from 2000 onwards
     • This is not intended to be a prescription

  3. Structure
     • Curriculum alignment
       - Process
       - Validation
     • Inception
     • Assessment strategies
       - Paradigm shift
       - Criterion Reference Grids
     • Procedural implications
       - Pedagogic
       - Academic
     • First iteration reactions
       - Students
       - Externals
       - Staff
     • Enhancement Strategies
       - Virtual Learning Environment
       - De-emphasis of ‘The Unit’
     • Conclusion
       - Additional Benefits
       - Potential Pitfalls

  4. 1999 Curriculum

  5. Alignment as a Process
     • Existing Unit LOs
     • New Unit LOs
     • Bloom’s Taxonomies
     • Computing Benchmarks
     • Curriculum Alignment

  6. Process Validation
     • Initial process undertaken by Theme - validation was across the Department
     • Sanity check: Unit LOs checked for consistency, typos and duplication
     • Mapping exercise: Unit LOs numbered and mapped against programme-level LOs
     • Verb analysis (see the sketch below)
     • Some refinements required: minor modifications to units
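     As a rough illustration of the verb analysis step, the sketch below flags unit LOs whose opening verb does not appear in an agreed list of Bloom-style verbs for the level. It is a hypothetical helper, not the Department's actual tool; the verb lists, the LO wording and the level numbering are all assumed.

         # Hypothetical sketch: flag unit LOs whose leading verb is not in an
         # agreed Bloom-style verb list for the level (verb lists are assumed).
         ACCEPTED_VERBS = {
             1: {"describe", "identify", "list", "define"},
             2: {"apply", "analyse", "compare", "implement"},
             3: {"evaluate", "critique", "design", "synthesise"},
         }

         def check_verbs(unit_los, level):
             """Return the LOs whose first word is not an accepted verb for the level."""
             accepted = ACCEPTED_VERBS[level]
             return [lo for lo in unit_los if lo.split()[0].lower() not in accepted]

         # Example: a level 2 LO written with a level 1 verb would be flagged for refinement.
         print(check_verbs(["Apply normalisation to a relational schema",
                            "List the stages of the systems life cycle"], level=2))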

  7. Inception
     • Critical factor - assessment strategies
     • 3 year phased implementation
     • Awareness Sessions
       - Staff and Students
       - September and January
       - Led by LO ‘champions’

  8. Assessment Strategies - A Paradigm Shift
     • Curriculum alignment based on competency
     • Computing benchmark statements address ‘Threshold’ and ‘Modal’ performance
     • Based on these we identify an expectation for our students for each assessment at 3rd and 2:2 level
     • We then add 2:1 and 1st class benchmark statements for each LO
     • We then GRADE, we do not MARK
     • Some LOs need decomposition, so we provide criterion statements to identify what is being assessed
     • Articulated through a Criterion Reference Grid (CRG)

  9. Criterion Reference Grid
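      Slide 9 shows the grid itself; as a minimal, hypothetical sketch of the structure described on slide 8, a CRG can be thought of as a set of criteria, each carrying benchmark statements at the 3rd, 2:2, 2:1 and 1st class bands, with each criterion graded against a band rather than given a numeric mark. All criterion names and statements below are invented for illustration and are not taken from the actual grid.

          # Hypothetical sketch of a Criterion Reference Grid (CRG) as a data
          # structure. Criterion names and benchmark statements are invented.
          crg = {
              "Database design": {
                  "3rd": "Workable design with significant normalisation errors",
                  "2:2": "Workable design, mostly normalised to 3NF",
                  "2:1": "Fully normalised design with justified choices",
                  "1st": "Optimised design with critical evaluation of alternatives",
              },
              "Report quality": {
                  "3rd": "Structure present but the argument is weak",
                  "2:2": "Clear structure with adequate supporting evidence",
                  "2:1": "Well-argued, well-referenced discussion",
                  "1st": "Insightful, critical discussion of professional standard",
              },
          }

          # Each criterion is GRADED against a band rather than given a numeric mark.
          awarded = {"Database design": "2:1", "Report quality": "2:2"}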

  10. Procedural Implications
      • Pedagogic
        - Competency approach removes ‘aggregation of failure’
        - Change in the mechanisms for retrieval: retrieval by LO, not by Unit; removal of monolithic resubmission papers
      • Academic
        - Student briefing and feedback: the CRG became the mechanism for the assessment brief and feedback
        - Assessment regulations: change to wording of late submission penalties
        - University still requires numeric input of ‘marks’ by unit
        - Assessment calculations: initial approach was based on boundary marks

  11. Initial Perceptions
      • Suggested problem of ‘multiple opportunities to fail’
      • Student engagement
        - Indications that ‘aiming low’ was problematic
        - Significance of the CRG was sometimes misunderstood
        - CRG as a feedback mechanism was not initially popular
      • The University: ‘Doubts’
      • External Examiners: entirely supportive

  12. Response
      • ‘Multiple opportunities to fail’: offer multiple opportunities to pass
      • Students ‘aiming low’: incremental assessment strategies
      • Increase in selective non-submissions: compulsory submission
      • Unit mark calculations: mid-point averaging (see the sketch below); limit on LOs per unit and criteria per LO; introduction of ‘refer’ and ‘distinction plus’ grades
      • Feedback: tailored feedback was given in addition to the CRG feedback
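      As a rough sketch of what mid-point averaging might mean where the University still requires a numeric unit mark: each LO band grade maps to the mid-point of a percentage band rather than to its boundary, and the unit mark is the mean of those mid-points. The band-to-percentage ranges below are assumptions for illustration only, not figures from the slides.

          # Hypothetical sketch of mid-point averaging: LO band grades are mapped
          # to the mid-point of an assumed percentage band, then averaged.
          BAND_RANGE = {"fail": (0, 39), "3rd": (40, 49), "2:2": (50, 59),
                        "2:1": (60, 69), "1st": (70, 100)}

          def midpoint(band):
              low, high = BAND_RANGE[band]
              return (low + high) / 2

          def unit_mark(lo_grades):
              # A boundary-mark approach would use `low` here instead,
              # systematically depressing the reported unit marks.
              return sum(midpoint(g) for g in lo_grades) / len(lo_grades)

          print(unit_mark(["2:1", "2:2", "1st"]))  # (64.5 + 54.5 + 85.0) / 3 = 68.0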

  13. Calculation Method
      academic_grade =
          if (no_component_non_submissions > 0) then 0
          else if (mean_criterion_grade >= 4) then
              if (no_passed_criteria > (no_failed_criteria + no_referred_criteria))
              then mean_criterion_grade
              else 3
          else mean_criterion_grade

      final_grade =
          if ((interim_grade - presentation_penalty) >= 4)
          then interim_grade - presentation_penalty
          else interim_grade
          where interim_grade = (academic_grade - late_submission_penalty)

      Criterion Referenced Assessment Model - P Reeve, University of Lincoln
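      Read literally, the model above can be paraphrased as the runnable sketch below. It follows the slide's variable names but is not the Department's actual implementation; the grade scale, with a pass threshold of 4, is taken as given.

          # Sketch of the Criterion Referenced Assessment Model pseudocode above.
          def academic_grade(no_component_non_submissions, mean_criterion_grade,
                             no_passed_criteria, no_failed_criteria, no_referred_criteria):
              if no_component_non_submissions > 0:
                  return 0  # any non-submitted component zeroes the grade
              if mean_criterion_grade >= 4:
                  # A passing mean only stands if more criteria were passed than
                  # were failed or referred; otherwise the grade is capped at 3.
                  if no_passed_criteria > (no_failed_criteria + no_referred_criteria):
                      return mean_criterion_grade
                  return 3
              return mean_criterion_grade

          def final_grade(academic_grade, late_submission_penalty, presentation_penalty):
              interim_grade = academic_grade - late_submission_penalty
              # The presentation penalty only applies if it does not pull the
              # result below the pass threshold of 4.
              if (interim_grade - presentation_penalty) >= 4:
                  return interim_grade - presentation_penalty
              return interim_grade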

  14. Staff Reactions
      • Advantages
        - Grading assessments much easier
        - Consistent message to students
        - Re-use of appropriately written grids (especially at level 3)
        - Consistency across campuses and cohorts
      • Disadvantages
        - Generating CRGs is a skilled task
        - 2-tiered engagement
        - New staff issues
        - Reaction to strict banding
        - Perceived ‘grade inflation’
        - Difficulties in applying the approach to numerate subjects

  15. Application Matrix (chart: ease of application across Levels 1, 2 and 3 for numeric and discursive subjects)

  16. Improvement Strategies - Virtual Learning Environment
      • We could not find a VLE whose base unit of assessment was not the Unit / Module
      • Bespoke intranet designed to support the DCI approach: the Specialist Learning Environment (SLE)
        - The repository for all assessment and delivery support
        - Automated grading system (including feedback)
        - Automated end-of-session reporting
      • Development was incremental
      • Places a heavy support burden on the Department
      • Some downtime problems

  17. Improvement Strategies - De-emphasis of ‘the Unit’
      • LOs form the base unit for assessment and learning
      • Change in terminology: Learning Packages and Assessment Packages
      • Students can revisit to improve on prior learning, or visit to broaden their learning
      • Aspiration to de-couple the Unit from the Tutor
      • Potential for broader usage off campus: Partner Colleges, distance learning

  18. Additional Benefits
      • Assessment moderation: both internal and external moderation is largely perfunctory (no requirement to demonstrate LO coverage in each assessment)
      • Learning outcome coverage: student coverage of Level and Programme outcomes is automatic
      • Pedagogic: shared understanding of why we do what we do
      • Curriculum development: 34 awards validated in 2005

  19. Identified Issues
      • Is pure too pure?
        - Conflict with University culture
        - Conflict with University regulations
        - Conflict from differential staff buy-in
      • Some tensions over mid-point marking
      • Do our new externals really believe in it?
      • Is competency-based assessment problematic in our current market?
        - Retention issues
        - Central office perception

  20. Conclusions
      • Constructive alignment is very tough in the early stages
      • A shared vision is imperative
      • If you take our pure approach:
        - Keep raising awareness after introduction
        - Carefully consider the support mechanisms available to you
        - Keep your externals on board
        - Be prepared to fight your corner
      • Is it worth it?

  21. Contact Details
      David Cobham, dcobham@lincoln.ac.uk, 01522 886120
      Kevin Jacques, kjacques@lincoln.ac.uk, 01522 837372
      Department of Computing and Informatics, University of Lincoln, Lincoln LN6 7TS
