
How do Computing Programs Demonstrate the Attainment of Professional Skills to Evaluators?




Presentation Transcript


  1. How do Computing Programs Demonstrate the Attainment of Professional Skills to Evaluators? • Stephen B. Seidman • Texas State University

  2. Computing Accreditation in the US • ABET (formerly Accreditation Board for Engineering and Technology) • CSAB (formerly Computer Science Accreditation Board) • ABET • 4 commissions • EAC is responsible for engineering accreditation • CAC is responsible for computing accreditation • 29 member societies, including CSAB

  3. An ABET visit to a program is a team effort. • An ABET team consists of • a chair, appointed by the relevant commission • one visitor for each program, appointed by a society • However, an ABET team must have at least three members.

  4. Computing Programs Accredited by ABET • Computer Science (CAC) • Computer Engineering (EAC) • Software Engineering (EAC) • Information Systems (CAC) • Information Technology (CAC)

  5. ABET Criteria • Students • Program Educational Objectives • Program Outcomes • Continuous Improvement • Curriculum • Faculty • Facilities • Support • Program-Specific Criteria (e.g., Computer Science)

  6. Professional Competences in ABET’s Criteria for Computing Programs • Criterion 2 • The program has documented, measurable objectives that are based on the needs of the program’s constituencies. • Criterion 3 • The program has documented, measurable outcomes that are based on the needs of the program’s constituencies.

  7. The program enables students to achieve, by the time of graduation: • (a) an ability to apply knowledge of computing and mathematics • (b) an ability to analyze a problem, and identify and define the computing requirements appropriate to its solution • (c) an ability to design, implement, and evaluate a computer-based system, process, component, or program to meet desired needs • (d) an ability to function effectively in teams to accomplish a common goal • (e) an understanding of professional, ethical, legal, security and social issues and responsibilities • (f) an ability to communicate effectively with a range of audiences • (g) an ability to analyze the local and global impact of computing on individuals, organizations, and society • (h) recognition of the need for and an ability to engage in continuing professional development • (i) an ability to use current techniques, skills, and tools necessary for computing practice

  8. Objectives vs. Outcomes • Objective: an attribute of a program graduate 3-5 years after graduation • Outcome: an expectation for the set of program graduates at the time of graduation

  9. Note that programs may • use the ABET expectations as their program outcomes • develop their own outcomes • in this case, they must demonstrate that the fulfillment of their outcomes implies that the ABET attributes will be enabled for their graduates

  10. Criterion 4 • The program uses a documented process incorporating relevant data to regularly assess its program educational objectives and program outcomes, and to evaluate the extent to which they are being met. The results of the evaluations are documented and used to effect continuous improvement of the program through a documented plan.

  11. Examples of Professional Competence Objectives • Graduates will • be able to contribute effectively to society • be able to work effectively as team members

  12. Examples of Professional Competence Outcomes • (d)-(h) in the ABET attribute list • By the time of graduation, students in the program will have practiced, and have an ability to use, written and oral communication skills necessary to be effective in the IT industry

  13. Assessment of Objectives and Outcomes • Assessment process • Frequency and timing of assessments? • What data are collected? • How are the data collected? • From whom are they collected? • What is the measure of success? • How are assessment results used, and by whom?

  14. Frequency: • data items are collected periodically • Data sources: • employers and graduates: survey data, reports on internships and job interviews • input from industrial advisory board members

  15. input from students • senior surveys • exit interviews • town hall meetings • input from faculty • formal: results from embedded questions in specific courses • informal: feedback on student accomplishments

  16. Targets (measures of success) • Surveys • threshold for mean of numerical responses to specific questions • Course data • threshold for mean of results on embedded question or attribute • performance criteria: • at least x% of students will score at least y on a given question • no more than w% of students will score below z on a given question
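The performance criteria above can be checked mechanically. A minimal sketch, with hypothetical threshold values for x, y, w, and z (the slide leaves them as placeholders):

```python
def meets_performance_criteria(scores, min_score=70, min_pct=80,
                               fail_score=50, max_fail_pct=5):
    """Check two performance criteria against a list of student scores:
    - at least min_pct% of students score at least min_score
    - no more than max_fail_pct% of students score below fail_score
    """
    n = len(scores)
    pct_at_least = 100 * sum(s >= min_score for s in scores) / n
    pct_below = 100 * sum(s < fail_score for s in scores) / n
    return pct_at_least >= min_pct and pct_below <= max_fail_pct

# Example: ten scores on an embedded question
scores = [85, 90, 72, 68, 75, 88, 95, 70, 80, 77]
print(meets_performance_criteria(scores))  # True: 90% scored >= 70, none below 50
```

The same shape of check applies to the survey-mean and course-mean thresholds, substituting a mean comparison for the percentage counts.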

  17. Use of results • Generally, a faculty committee is charged with collecting the results, determining whether they meet the targets, and recommending program changes. • Other faculty committees then implement the changes, and another improvement cycle begins.

  18. Example 1 • Outcome: an ability to communicate effectively with a range of audiences • Measure (one of three): • In CSxyz, instructor evaluation of two presentations made by groups on their project work.

  19. Target • At least 80% of students rated at least adequate • At least 10% of students rated excellent • No more than 5% of students rated unsatisfactory • Results • Of the 86 students in the course, 20.9% were rated excellent • 68.6% were rated good • 10.5% were rated adequate or competent • No student team had an average rating below adequate. • Note that these were group presentations, with scores given to the entire group. • Note that the target was met.
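A quick check that the reported percentages satisfy the stated target, using the figures from this slide:

```python
# Rating distribution for the 86 students (group scores applied to each member)
ratings = {"excellent": 20.9, "good": 68.6, "adequate": 10.5, "unsatisfactory": 0.0}

adequate_or_better = ratings["excellent"] + ratings["good"] + ratings["adequate"]
target_met = (
    adequate_or_better >= 80            # at least 80% adequate or better
    and ratings["excellent"] >= 10      # at least 10% excellent
    and ratings["unsatisfactory"] <= 5  # no more than 5% unsatisfactory
)
print(target_met)  # True
```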

  20. Curriculum or course changes • None • Questions and concerns • How was the grading done? Did it use a clear and reasonable rubric? • Can one infer individual competences from a group activity?

  21. Example 2 • Outcome: the ability to function effectively on teams to accomplish a common goal • Measure: • Evaluation of specific course learning objectives in four courses. • Target: • Average weighted score of 70%.

  22. Results • Average student scores were over 80%. • Curriculum or course changes • None • Questions and Concerns • Doesn’t it make more sense to ask a given percentage of students to reach a desired level of competence, rather than setting a standard for a class average?

  23. Example 3 • Outcome: the ability to function effectively on teams to accomplish a common goal • Measure: • Reports from student exit interviews, surveys, and graduates • Informal (anecdotal) observations that team efforts had been unsatisfactory; specifically, only one or two students on a team actually did any work.

  24. Quality-improvement response: • 5 specific courses were designated as teamwork-intensive • the following rubric was designed to assess teamwork performance of each member of a team

  25. The rubric is distributed to all students in these classes and discussed at the beginning of the semester. • It is used at least twice to assess performance of team members: in the first half of the project and at the end. • Each student judges him/herself and all other team members.

  26. The first evaluation is used by the instructor to identify any problems and give guidance. • The second evaluation is the basis for teamwork-related grades to individual students. • Successive assessment cycles will determine if this change has led to improved results with respect to this outcome.
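The peer-assessment step above (each student rating every team member, including him/herself) can be aggregated into a per-student score. A minimal sketch, assuming a 1–5 rubric scale (the slides do not specify the scale or the aggregation formula):

```python
def peer_averages(ratings):
    """ratings: {rater: {ratee: score}} from one assessment round.
    Returns each student's mean rating across all raters."""
    totals, counts = {}, {}
    for rater_scores in ratings.values():
        for ratee, score in rater_scores.items():
            totals[ratee] = totals.get(ratee, 0) + score
            counts[ratee] = counts.get(ratee, 0) + 1
    return {s: totals[s] / counts[s] for s in totals}

# Hypothetical three-person team; "Cai" is rated low by everyone
team = {
    "Ana": {"Ana": 4, "Ben": 5, "Cai": 2},
    "Ben": {"Ana": 5, "Ben": 4, "Cai": 2},
    "Cai": {"Ana": 4, "Ben": 5, "Cai": 3},
}
print(peer_averages(team))
```

In the first round, a low average (Cai here) flags a student for instructor guidance; in the second, the averages feed the teamwork-related grade.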

  27. Example 4: Changing a professional competence outcome • Original outcome: an ability to communicate effectively with a range of audiences • Modified outcome: an ability to communicate effectively, both written and oral, with a range of audiences • Process used: faculty discussion • How will the ability to engage effectively in written and oral communication be assessed?

  28. Assessment reports from five courses • Each course has course learning objectives (CLOs) and specific examination, lab problems, or exercises related to each CLO. • Average student scores are computed for each course. • For each course-outcome pair, it is determined whether the course enables the outcome slightly, moderately, or substantively. • This is used to obtain a weighted course assessment score for each program outcome; the target is 70%.
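The weighted course assessment score described above can be sketched as follows. The numeric weights for the slightly/moderately/substantively levels are an assumption; the slide names the levels but not the numbers:

```python
# Hypothetical enablement weights (not specified on the slide)
WEIGHTS = {"slightly": 1, "moderately": 2, "substantively": 3}

def weighted_outcome_score(course_results):
    """course_results: list of (average_score_pct, enablement_level),
    one pair per course that maps to the outcome.
    Returns the weighted average score for that program outcome."""
    total_weight = sum(WEIGHTS[level] for _, level in course_results)
    return sum(score * WEIGHTS[level]
               for score, level in course_results) / total_weight

# Hypothetical data for three courses contributing to one outcome
courses = [(82, "substantively"), (75, "moderately"), (64, "slightly")]
score = weighted_outcome_score(courses)
print(score >= 70)  # compare against the 70% target
```

A substantively-enabling course thus pulls the outcome score three times as hard as a slightly-enabling one, which is the point of the weighting.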

  29. Industrial advisory board evaluation of capstone design projects • Surveys obtained from exit interviews • Survey created by a national organization (Educational Benchmarking, Inc.; www.webebi.com)

  30. Questions and concerns: • Reports from courses are insufficient without information as to just what is being reported. • How is student ability in the area of oral and written communication evaluated? • Are instructors using rubrics to evaluate student communication? • How are these rubrics created and maintained? • Although the EBI has two questions dealing with communication, do the results yield useful information?

  31. Discussion and Conclusions • The ABET approach to the accreditation of engineering and computing programs is based on outcomes. • ABET requires each program to develop program-specific objectives and outcomes, along with processes to maintain them and to assess whether they are being attained. • ABET also requires programs to develop processes that use the program-specific objectives and outcomes to improve program quality.

  32. The objectives and outcomes include and entail several professional competencies. • ABET’s approach to assessing professional competencies is therefore process-based. • The assessment processes will be different for each program, though best practices are starting to emerge.

  33. Discussion: strengths and weaknesses of this approach?
