American Society For Engineering Education Annual Conference St. Louis, MO June 18-21, 2000
Using Quality Function Deployment to Meet ABET 2000 Requirements for Outcomes Assessment
Prof. Phillip R. Rosenkrantz, Cal Poly Pomona
Outcomes and Assessment Team • ABET 2000 Criteria • 1.5-year-long project • Faculty involvement • Industry involvement • Alumni involvement
Selection of Assessment Methodology • Strategic Planning • Malcolm Baldrige National Quality Award Criteria • Total Quality Management (TQM) • Quality Function Deployment (QFD) • Customized Approach
Quality Function Deployment Chosen as Primary Methodology • Enthusiastically supported by the full IME faculty. • Adaptations and enhancements using other methodologies • QFD team formed (Dept, IAC, Alumni) • Met regularly for five quarters • “Modified” version of QFD was used.
Phase I: The Voice of the Customer • The IME Department recognized constituencies, or “stakeholders,” that need to be considered in all curriculum, scheduling, and program-related decisions. • Identified eighteen stakeholders.
Three Categories of Stakeholders • Those we serve • Those who use our graduates • Those who regulate us • Used a 1-3-9 weighting scale
Most Important (9 points) • Students (& Alumni) • University Administration/CSU • Manufacturing sector companies • ABET (accrediting agency) • State Government
Next Most Important (3 points) • Other faculty/departments • Parents of students • Service companies • Board of Professional Engineers • ASQ (Certification) • SME (Certification)
Least Important (1 point) • Grad schools • General public • Granting agencies • Public sector employers • Information sector companies • WASC • APICS
Phase II: Program Objectives and Outcomes (Needs Assessment) • Department Mission Statement • Department Objectives • ABET “a-k” outcomes • SME “Competency Gaps” • “Other” sources • Result: Goals & 24 “SKAAs” (Skill, Knowledge, Attitude, and Ability areas)
Phase III: QFD Implementation • Five Matrices • Iterative Process • Results flowed from one matrix to the next • Fast input from many stakeholders • Provided valuable results • Quantifiable
Matrix 1: Stakeholder vs. SKAA • An 18×24 matrix was used to evaluate the importance of each SKAA for each stakeholder. • Identified which SKAAs are the most important overall. The result is a ranking that includes the importance weighting for each stakeholder.
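The Matrix 1 calculation can be sketched in a few lines: each stakeholder rates each SKAA, each rating is multiplied by that stakeholder's 1/3/9 importance weight, and the weighted column totals rank the SKAAs overall. All stakeholder names, SKAA names, and ratings below are illustrative placeholders, not the department's actual data.

```python
# Illustrative sketch of a weighted QFD importance matrix (Matrix 1).
# Stakeholder weights use the 1-3-9 scale from the presentation; the
# ratings themselves are invented for demonstration only.

stakeholder_weights = {
    "Students": 9,
    "Manufacturing companies": 9,
    "Parents": 3,
    "General public": 1,
}

# stakeholder -> {SKAA: importance rating, also on a 1/3/9 scale (assumed)}
ratings = {
    "Students":                {"Teaming": 9, "Ethics": 3, "Statistics": 9},
    "Manufacturing companies": {"Teaming": 9, "Ethics": 9, "Statistics": 9},
    "Parents":                 {"Teaming": 3, "Ethics": 9, "Statistics": 3},
    "General public":          {"Teaming": 1, "Ethics": 9, "Statistics": 1},
}

def skaa_scores(weights, ratings):
    """Weighted column totals: sum over stakeholders of weight * rating."""
    totals = {}
    for stakeholder, row in ratings.items():
        w = weights[stakeholder]
        for skaa, r in row.items():
            totals[skaa] = totals.get(skaa, 0) + w * r
    return totals

scores = skaa_scores(stakeholder_weights, ratings)
ranking = sorted(scores, key=scores.get, reverse=True)
print(scores, ranking)
```

The ranking that falls out of the column totals is what identifies the most important SKAAs overall, with each stakeholder's voice amplified or muted by its 1/3/9 weight.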
Matrix 2: SKAA vs. Core Course • Core courses evaluated on current SKAA coverage. • Column totals reveal how much each individual course covers the SKAAs. • Row totals show how much each SKAA is covered in the curriculum. • Rankings of SKAA row totals reveal potential weaknesses in the curriculum.
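The row- and column-total logic of Matrix 2 can be sketched as follows. The SKAA names, course numbers, and coverage ratings are placeholders for illustration; only the totaling scheme comes from the presentation.

```python
# Illustrative sketch of Matrix 2: rows are SKAAs, columns are core
# courses, and cells are coverage ratings (scale assumed). Row totals
# show curriculum-wide coverage of each SKAA; column totals show how
# much each course contributes.

skaas = ["Teaming", "Ethics", "Statistics"]
courses = ["IME 415", "IME 301", "IME 499"]   # placeholder course list
coverage = [  # coverage[i][j] = rating of SKAA i in course j
    [8, 3, 9],
    [0, 1, 3],
    [9, 9, 0],
]

row_totals = [sum(row) for row in coverage]        # per-SKAA coverage
col_totals = [sum(col) for col in zip(*coverage)]  # per-course coverage

# A low row total flags a SKAA that is potentially under-covered.
weakest = min(zip(skaas, row_totals), key=lambda t: t[1])
print(dict(zip(skaas, row_totals)), dict(zip(courses, col_totals)), weakest)
```

Ranking the row totals from smallest to largest is what surfaces the curriculum weaknesses the slide mentions.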
Case Study - IME 415 Quality Control by Statistical Methods • Column total was initially 41 points. • Professionalism/Ethics & Social Responsibility (+8) • Teaming – Team projects (+8) • Employability – Six-Sigma Quality (+2) • Use Skills/Tools – Web, Charts (+3) • Reliability Engineering – Intro (+3) • Quality Standards – ISO/QS 9000 (+0) • Added 24 points to the column = 65 points
Matrix 3: SKAA vs. Methodology • Developed list of current and potential teaching methodologies. • Methodologies evaluated against each SKAA for “potential” effectiveness and assessment capability. • Rankings indicate methodologies with the most potential benefit in achieving and evaluating desired outcomes.
Matrix 4: SKAA vs. Assessment Tool • List of existing and potential assessment tools. • Presented to the faculty and modified. • Tools rated for potential effectiveness in assessing the degree to which each SKAA has been effectively taught. • Used to decide which tools should be supported at the department level.
Matrix 5: Assessment Tools vs. Core Courses • Core courses rated for the potential effectiveness of each tool. • The matrix gives each faculty member a more complete list of assessment options for the courses taught.
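Matrices 4 and 5 share the same selection logic: rate each assessment tool's potential effectiveness per SKAA (Matrix 4) or per core course (Matrix 5), then rank tools by total score to decide which to support department-wide. A minimal sketch, with invented tool names, course numbers, and ratings:

```python
# Illustrative tool-selection sketch for Matrices 4/5. Ratings use an
# assumed 1/3/9 scale; all names and numbers are placeholders, not the
# department's actual evaluations.

tools = {
    # tool -> effectiveness rating per core course
    "Exit interviews":  {"IME 415": 3, "IME 301": 3, "IME 499": 9},
    "Portfolio review": {"IME 415": 9, "IME 301": 3, "IME 499": 9},
    "Industry survey":  {"IME 415": 1, "IME 301": 1, "IME 499": 3},
}

# Row totals rank the tools by overall potential effectiveness.
totals = {tool: sum(ratings.values()) for tool, ratings in tools.items()}
supported = sorted(totals, key=totals.get, reverse=True)
print(totals, supported)
```

The top of the ranked list is the natural candidate set for department-level support, while the per-course rows give each faculty member the fuller menu of options the slide describes.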
Phase IV: Action Planning • Timetable • New Industry Survey Instruments • Revised Instructional Assessment Instrument • Exit Interview Process