
2nd EAC-MCED Dialogue






Presentation Transcript


  1. 2nd EAC-MCED Dialogue Megat Johari Megat Mohd Noor, Universiti Teknologi Malaysia International Campus, Kuala Lumpur, 22nd February 2010

  2. Topics • Introduction • Best Practices • Concerns • Causes • Development • Involvement • Feedback • Conclusion

  3. Introduction

  4. Objectives of Accreditation • Ensure programmes attain a standard comparable to global practice (pg 1 Sec 1.0 EAC Manual) • Ensure a CQI culture (pg 1 Sec 1.0 EAC Manual) • Ensure graduates can register with BEM (pg 1 Sec 2.0 EAC Manual) • Ensure CQI is practiced (pg 1 Sec 2.0 EAC Manual) • Benchmark engineering programmes (pg 1 Sec 2.0 EAC Manual)

  5. Accreditation Policy • Focus on outcomes and the developed internal system (pg 4 Sec 5.1 EAC Manual) • Determining the effectiveness of the quality assurance system (pg 4 Sec 5.1 EAC Manual) • Compliance with criteria (pg 5 Sec 5.5 EAC Manual) • Minor shortcoming(s) – less than 5 years' accreditation (pg 4 Sec 5.6 EAC Manual)

  6. EAC Focus • Breadth and depth of curriculum • Outcome-based approach • Continual quality improvement • Quality management system

  7. EAC Criteria • Program Objectives • Program Outcomes • Academic Curriculum • Students • Academic & Support Staff • Facilities • Quality Management System

  8. Universities Best Practices

  9. Best Practices – Curriculum • Extensive stakeholder involvement • External examiners with adequate TOR • Balanced curriculum and assessment covering the cognitive, psychomotor & affective domains • Comprehensive benchmarking (including against WA attributes) • Serious consideration of students' workload distribution • Various delivery methods

  10. Best Practices – System • Systematic approach to demonstrate attainment of program outcomes • Staff training (awareness) on the outcome-based approach • Moderation of examination questions to ensure the appropriate level • Course-level CQI implemented

  11. Best Practices – System • System integrity ensured by committed and dedicated staff • Constructive leadership • Comprehensive self-assessment report • Planned and monitored activities (PDCA) • Well-documented policies / procedures and traceable evidence • Certification to ISO 9001/17025, OHSAS 18001

  12. Best Practices – Staff • Highly qualified academic staff (PhD/PE) with research and industry experience • Staff professional development and involvement • Staff training (awareness) on the outcome-based approach • Research / industry experience that enhances undergraduate teaching • Academic staff in related disciplines • Ideal staff-to-student ratio of 1:10 or better (see the sketch below)
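The 1:10 ratio is a simple capacity check: divide the student headcount by the number of academic staff. A minimal sketch of that check, using hypothetical headcounts (the numbers and variable names are illustrative, not drawn from any EAC submission):

```python
# Hypothetical headcounts for a single engineering programme.
academic_staff = 42
students = 500

# Ratio is 1:N, where N = students per staff member; 1:10 or better means N <= 10.
n = students / academic_staff
print(f"staff-to-student ratio: 1:{n:.1f}  (meets 1:10 target: {n <= 10})")
# staff-to-student ratio: 1:11.9  (meets 1:10 target: False)
```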

  13. Best Practices – Students & Facilities • Awareness programs for students on outcomes • Remedial classes to bridge basic knowledge gaps • Current (not obsolete) laboratory equipment in appropriate numbers • High-end laboratory equipment • Emphasis on safety

  14. Accreditation Concerns

  15. PEO & PO • Specialisation at undergraduate level (e.g. BEng [Nanotechnology]) • Stakeholder involvement (e.g. IAP): minimal and/or inappropriate • Program educational objectives (PEO): restatements of program outcomes • Program outcomes (PO): only cognitive assessment

  16. Curriculum • Benchmarking: limited to curriculum (virtual) • No link between engineering courses and the specialisation • Mapping of course outcomes to PEO/PO: not well understood by academic staff (a minimal sketch follows below) • Delivery methods: traditional, not embracing project/problem-based (open-ended) learning
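Where the CO-to-PO mapping is poorly understood, a concrete aggregation often helps. The sketch below shows one common scheme: a weighted matrix from course outcomes (CO) to program outcomes (PO), with PO attainment computed as a weighted average of CO attainment. The weights, scores, and function here are illustrative assumptions, not a method prescribed by the EAC Manual:

```python
# Fraction of students attaining each course outcome (hypothetical values).
co_attainment = {"CO1": 0.82, "CO2": 0.74, "CO3": 0.65}

# CO -> PO mapping with emphasis weights (3 = strong, 2 = moderate, 1 = weak).
co_po_map = {
    "CO1": {"PO1": 3, "PO3": 1},
    "CO2": {"PO2": 3},
    "CO3": {"PO1": 1, "PO2": 2},
}

def po_attainment(attainment, mapping):
    """PO attainment as the weight-averaged attainment of its contributing COs."""
    totals, weights = {}, {}
    for co, pos in mapping.items():
        for po, w in pos.items():
            totals[po] = totals.get(po, 0.0) + w * attainment[co]
            weights[po] = weights.get(po, 0) + w
    return {po: round(totals[po] / weights[po], 2) for po in totals}

print(po_attainment(co_attainment, co_po_map))
# {'PO1': 0.78, 'PO3': 0.82, 'PO2': 0.7}
```

Staff who can trace a PO figure back through such a matrix to individual course results tend to find the mapping far less opaque.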

  17. Curriculum • Courses devoid of higher cognitive levels • Team teaching not visible (no involvement in planning or summative evaluation) • Industrial training (exposure): taking up a semester of teaching time and/or scheduled last

  18. Assessment & Evaluation • Assessment types and weightage: favour high grades or facilitate passing • Depth (level) of assessment: not visible / appropriate (lack of philosophy) • Examination questions: not challenging • Lack of summative evaluation • Mostly indirect assessment (simplistic direct assessment: grade = outcome; see the sketch below)
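The "grade = outcome" shortcut treats one aggregate course grade as evidence for every outcome. Direct assessment instead tags each assessment item with the outcomes it measures and computes attainment per outcome. A minimal sketch contrasting the two, with hypothetical items, marks, and outcome tags:

```python
# Hypothetical assessment items, each tagged with the course outcomes it measures.
items = [
    {"name": "Quiz 1",  "outcomes": ["CO1"],        "score": 18, "max": 20},
    {"name": "Project", "outcomes": ["CO2", "CO3"], "score": 30, "max": 50},
    {"name": "Final",   "outcomes": ["CO1", "CO2"], "score": 55, "max": 80},
]

# Simplistic 'grade = outcome': one number stands in for every CO.
overall = sum(i["score"] for i in items) / sum(i["max"] for i in items)
print(f"overall grade: {overall:.0%}")  # 69% -- says nothing about individual COs

# Direct assessment: attainment computed separately for each CO.
for co in ("CO1", "CO2", "CO3"):
    hits = [i for i in items if co in i["outcomes"]]
    attained = sum(i["score"] for i in hits) / sum(i["max"] for i in hits)
    print(f"{co}: {attained:.0%}")
# CO1: 73%, CO2: 65%, CO3: 60% -- the CO3 weakness is hidden inside the overall grade
```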

  19. Staff & Facilities • Varied understanding of the system (OBE) • Academic staff: professional qualifications / experience limited (mostly young academics) – an issue of planning and recruitment policy • Inadequate laboratory equipment / space / technicians • Laboratory safety • Ergonomics

  20. Quality Management System • Follow-up actions: slow or not visible • No monitoring • Grading system (low passing marks) • Ad hoc procedures (reactive) • Financial sustainability • Incomplete cycle (infancy)

  21. Causes & Development

  22. Causes • Top management: not the driving force (delegation & accountability) • Academic leadership • Inadequate staff training or exposure • Awareness of EAC requirements • Unclear policy, procedures and/or philosophy • Unclear distinction between engineering & engineering technology

  23. Development

  24. Latest Development • 3 PE (or equivalent) per program • Industrial training during vacation (not to take up the regular semester) • WA graduate attribute profile: Project Management & Finance • WA: typically 4-5 years of study, depending on students' level at entry • WA (knowledge aspect): engagement with research literature • Potential merger of European and WA attributes, leading to a requirement for more advanced courses

  25. EAC Professional Development • Submission to EAC (1-2 days): March 2010 • Outcome-based education (2-3 days): April 2010 • Panel evaluators (3-4 days): May 2010 • Evaluator refresher (1/2 - 1 day): May 2010 • On-the-job training (accreditation visits) • Customised workshops/courses • EAC 1st Summit & Forum, Aug 2010, Kuching

  26. Improvements • Rejection of an Application for Approval will be deferred, and the IHL will be called in to discuss resubmission • Responses to the Evaluators' report will require the IHL's corrective actions, in addition to corrections of factual inaccuracies, and will be tabled at the EAC meeting

  27. Involvement

  28. EAC Involvement • Accreditation • Recognition • Mentoring • Mutual recognition – CTI France • NABEEA • IEA (Washington Accord) • FEIIC (EQAPS)

  29. Universities • Evaluation Panel • Joint Committee on Standard • Local Benchmarking • Knowledge Sharing (systems) • Local & International Observers • EAC/Professional activities • Interpreting WA graduate attributes • Industry Sabbatical • International collaboration (research + academic)

  30. Feedback

  31. Feedback from Universities • UNIM • UTAR • UTM • IIUM • UNIMAS • UMS • USM • UiTM

  32. Rated Poor (2/5) • Explanation by Panel chair (UNIM) • Interview session with lecturers (UNIM, UTM) • Interview session with students (UNIM) • Time keeping (UTM, USM) • Asking relevant questions according to EAC Criteria (IIUM, USM) • Checking records (USM) • Commitment and cooperation during visit (IIUM)

  33. Recapitulation from 1st Dialogue • Not fault-finding (need to highlight strengths) • Sampling may not be representative • Giving adequate time to adjust to changes to the Manual • Time frame to obtain results • PE definition to be opened to other professional bodies • No clear justification for requiring PE (nice to have) • Appoint suitable and “related discipline” evaluators • Appoint non-PE academics • Usurping the power of the Senate • MCED should be given the mandate to nominate academics to the EAC • Spell out the Manual clearly (e.g. benchmarking) • Assessment of EAC evaluators • Flexibility of Appendix B • Local benchmarking • Response at exit meeting • Engineering technology vs engineering

  34. Conclusion

  35. Conclusion • Great potential in leading engineering education • Quality & competitive engineering education • Contributing to greater goals • Sharing of knowledge and practice • Systems approach to outcome-based education • Participative and engaging rather than adversarial • Professional development • Facilitating and developmental

  36. Thank you
