
Graduate Attribute Assessment Workshop
Brian Frank, Director (Program Development)
Faculty of Engineering and Applied Science, Queen's University
May 8, 2011




  1. Engineering Graduate Attribute Development (EGAD) Project Graduate Attribute Assessment Workshop Brian Frank Director (Program Development) Faculty of Engineering and Applied Science Queen's University May 8, 2011

  2. Engineering Graduate Attribute Development (EGAD) Project Objectives • Answer: What is the “outcomes assessment” or “graduate attribute” expectation all about? • Understand CEAB’s requirements for graduate attribute assessment • Be able to create a process to assess graduate attributes

  3. Engineering Graduate Attribute Development (EGAD) Project Administrative issues • Slides and a summary handout will be posted to the EGAD website http://engineering.queensu.ca/egad • Other support and resources will be described at the end • A summary handout will be provided for reference (terminology, process) • Active workshop: feel free to ask questions or comment throughout

  4. Engineering Graduate Attribute Development (EGAD) Project Graduate attributes => Quality assurance process • CEAB is requiring each program to create and apply a quality assurance process to improve the program • Like any QA process, it examines the outputs of a process – in this case, the abilities of graduating engineering students • Compared to traditional CEAB approach: much less prescriptive • Pro: more flexibility for programs • Con: less guidance, more uncertainty. “What do they want to see?”

  5. Graduate attribute assessment • Outcomes assessment is used to answer questions like: • What can students do? • How does their performance compare to our stated expectations? • It identifies gaps between our perceptions of what we teach and what knowledge, skills, and attitudes students develop program-wide.

  6. Engineering Graduate Attribute Development (EGAD) Project Inputs and Outcomes

  7. Broader push for outcomes assessment • Accreditation bodies in most industrialized countries use outcomes-based assessment to demonstrate their students' capabilities. • Washington Accord: allows substantial equivalency of graduates from Australia, Canada, Hong Kong, the Republic of Ireland, New Zealand, South Africa, the United Kingdom, the United States, Japan, Singapore, Korea, and Chinese Taipei • Ontario: University Undergraduate Degree Level Expectations (UUDLEs) and Graduate Degree Level Expectations (GDLEs) will be assessed in all programs; fortunately these overlap with the graduate attributes

  8. University Undergraduate Degree Level Expectations • Depth and Breadth of Knowledge • Knowledge of Methodologies • Application of Knowledge • Communication Skills • Awareness of Limits of Knowledge • Autonomy and Professional Capacity

  9. Perspective: Sec 3.1 of CEAB Procedures • “The institution must demonstrate that the graduates of a program possess the attributes under the following headings... There must be processes in place that demonstrate that program outcomes are being assessed in the context of these attributes, and that the results are applied to the further development of the program.”

  10. Engineering Graduate Attribute Development (EGAD) Project 12 graduate attributes • Knowledge base for engineering • Problem analysis • Investigation • Use of engineering tools • Design • Individual and team work • Communication skills • Professionalism • Impact on society and environment • Ethics and equity • Economics and project management • Lifelong learning

  11. Engineering Graduate Attribute Development (EGAD) Project CEAB GA assessment instructions (2010) • Describe the processes that are being or are planned to be used. This must include: • a set of indicators that describe specific abilities expected of students to demonstrate each attribute • where attributes are developed and assessed within the program… • how the indicators were or will be assessed. This could be based on assessment tools that include, but are not limited to, reports, oral presentations, … • evaluation of the data collected including analysis of student performance relative to program expectations • discussion of how the results will be used to further develop the program • a description of the ongoing process used by the program to assess and develop the program as described in (a)-(e) above

  12. Setting up a process (without overwhelming faculty, irritating staff, and going deeper into debt)

  13. Engineering Graduate Attribute Development (EGAD) Project Aside: Idealistic course development process (a cycle): Identify course objectives and content → Create specific outcomes for each class → Map to experiences (lectures, projects, labs, etc.) → Identify appropriate tools to assess (reports, simulation, tests, ...) → Deliver, grade, seek feedback → Analyze and evaluate data → Course improvement (informed by student input) → back to identifying objectives

  14. Engineering Graduate Attribute Development (EGAD) Project Program-wide assessment process flow (a cycle): Identify major objectives (including graduate attributes) → Create indicators → Map to courses/experiences → Identify appropriate tools to assess (reports, simulation, tests, ...) → Course changes/measure → Analyze and evaluate data → Program improvement (informed by stakeholder input) → back to identifying objectives

  15. Engineering Graduate Attribute Development (EGAD) Project Program-wide assessment process flow (a cycle): Identify major objectives (including graduate attributes) → Create indicators → Map to courses/experiences → Identify appropriate tools to assess (reports, simulation, tests, ...) → Course changes/measure → Analyze and evaluate data → Program improvement (informed by stakeholder input) → back to identifying objectives

  16. Creating program objectives • CEAB graduate attributes • Strategic plans • Advisory boards • Major employers of graduates • Input from stakeholders (focus groups, surveys) • SWOT (strengths, weaknesses, opportunities, threats) analysis What do you want your program to be known for?

  17. Engineering Graduate Attribute Development (EGAD) Project Program-wide assessment process flow (a cycle): Identify major objectives (including graduate attributes) → Create indicators → Map to courses/experiences → Identify appropriate tools to assess (reports, simulation, tests, ...) → Course changes/measure → Analyze and evaluate data → Program improvement (informed by stakeholder input) → back to identifying objectives

  18. Engineering Graduate Attribute Development (EGAD) Project Why indicators? Lifelong learning: “An ability to identify and address their own educational needs in a changing world in ways sufficient to maintain their competence and to allow them to contribute to the advancement of knowledge.” Can this be directly measured? Would multiple assessors be consistent? How meaningful would the assessment be? Probably not, so more specific, measurable indicators are needed. This allows the program to decide what is important.

  19. Engineering Graduate Attribute Development (EGAD) Project Indicators: examples Graduate attribute (Lifelong learning): “An ability to identify and address their own educational needs in a changing world in ways sufficient to maintain their competence and to allow them to contribute to the advancement of knowledge.” Indicators, the student: • Critically evaluates information for authority, currency, and objectivity • Identifies gaps in knowledge and develops a plan to address them • Describes the types of literature of their field and how it is produced • Uses information ethically and legally to accomplish a specific purpose

  20. Establishing indicators • What specific things should students demonstrate? • What do they need to be able to do? • Are they measurable and meaningful? • Can involve cognitive skills (recalling, analyzing, creating), attitudes, and practical skills Example: “Critically evaluates information for authority, currency, and objectivity” combines a level of expectation (“describes”, “compares”, “applies”, “creates”, etc.) with a content area

  21. Engineering Graduate Attribute Development (EGAD) Project Indicators • A well-written indicator includes: • what students will do • the level of complexity at which they will do it • the conditions under which the learning will be demonstrated

  22. Engineering Graduate Attribute Development (EGAD) Project Indicators • Knowledge base for engineering • Critically select* and apply* computational formulas to solve novel problems • (Performance on a benchmark like the Force Concept Inventory) • Engineering tools • Demonstrate use of a schematic capture and simulation tool to analyze analog and digital circuits • Demonstrate use of a digital oscilloscope to analyze common signals in the time and frequency domains

  23. Engineering Graduate Attribute Development (EGAD) Project Problematic indicators Example: “Learns static physics principles including Newtonian laws for linear motion” What does the author mean? Students can state the laws? Plug numbers into equations? Apply laws to solve conceptual problems? ...

  24. Engineering Graduate Attribute Development (EGAD) Project Taxonomy Creating (design, construct, generate ideas) Evaluating (critique, judge, justify decision) Analyzing (compare, organize, differentiate) Applying (use in new situation) Understanding (explain, summarize, infer) Remembering (list, describe, name) Anderson, L. W., & Krathwohl, D. R., et al. (Eds.) (2001). A Taxonomy for Learning, Teaching, and Assessing: A Revision of Bloom's Taxonomy of Educational Objectives. Boston, MA: Allyn & Bacon (Pearson Education Group).

  25. Engineering Graduate Attribute Development (EGAD) Project Verbs for cognitive skills (lower to higher order) • Remembering level: Define, List, State, Recall, Identify, Recognize, Calculate, Label, Locate • Mid-level: Interpret, Compare, Contrast, Solve, Estimate, Explain, Classify, Modify, Integrate • Higher order skills: Analyze, Hypothesize, Evaluate, Justify, Develop, Create, Extrapolate, Design, Critique
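As a rough illustration of how these verb lists can be put to work when reviewing indicator wording, the sketch below checks an indicator's leading verb against the groupings on this slide. The function name, the crude singularization, and the level labels are assumptions made for this sketch, not part of the workshop material.

```python
# Sketch: classify an indicator's leading verb by the Bloom-style groupings
# from the slide. The verb-to-level mapping follows the slide's lists;
# treat the grouping labels as illustrative, not authoritative.

BLOOM_LEVELS = {
    "remembering": {"define", "list", "state", "recall", "identify",
                    "recognize", "calculate", "label", "locate"},
    "mid-level": {"interpret", "compare", "contrast", "solve", "estimate",
                  "explain", "classify", "modify", "integrate"},
    "higher order": {"analyze", "hypothesize", "evaluate", "justify",
                     "develop", "create", "extrapolate", "design", "critique"},
}

def bloom_level(indicator):
    """Return the grouping of the indicator's first word, or 'unknown'."""
    verb = indicator.split()[0].lower().rstrip("s")  # crude singularization
    for level, verbs in BLOOM_LEVELS.items():
        if verb in verbs:
            return level
    return "unknown"

print(bloom_level("Critiques a design against client requirements"))  # higher order
print(bloom_level("Learns static physics principles")) # unknown: vague indicator
```

A vague verb like "learns" (from the problematic indicator on slide 23) falls through to "unknown", which is exactly the signal that the indicator needs a more measurable verb.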

  26. Engineering Graduate Attribute Development (EGAD) Project Outcomes at Bloom's Levels (Romkey, McCahan)

  27. Defining Indicators for your Program (10 min) • In groups of 2-4: • Select a graduate attribute • Independently create some indicators for that attribute that reflect your program objectives • Discuss indicators at your table. Are they measurable? Are they meaningful? Would the assessment of them be consistent from one rater to another?

  28. Engineering Graduate Attribute Development (EGAD) Project Follow-up to identifying Indicators • Any points for discussion?

  29. Resources on Indicators • EC2000, ABET 2009 • UK-SPEC, Engineering Subject Centre Guide • Engineers Australia • CDIO • Foundation Coalition • UDLEs • IET criteria for ECE Note: indicators may also be known as assessment criteria, performance criteria, outcomes, competencies, or objectives. Many linked at: http://bit.ly/9OSODq (case sensitive, no zeros)

  30. Engineering Graduate Attribute Development (EGAD) Project ECE indicators • E.g. IEE (now IET) describes requirements for UK BEng and MEng programs • http://www.theiet.org/careers/accreditation/academic/downloads/handbook.cfm?type=pdf

  31. Engineering Graduate Attribute Development (EGAD) Project Program-wide assessment process flow (a cycle): Identify major objectives (including graduate attributes) → Identify indicators → Map to courses/experiences → Identify appropriate tools to assess (reports, simulation, tests, ...) → Course changes/measure → Analyze and evaluate data → Program improvement (informed by stakeholder input) → back to identifying objectives

  32. Where can we assess students? • Courses • Co-ops/internships • Co-curricular activities (competitive teams, service learning, etc.) • Exit or alumni surveys/interviews • ...

  33. Curriculum Mapping • Important to know where students (a) develop attributes and (b) are assessed • In a typical program the courses involved in assessing students are a small subset of courses. This might include a few courses from areas including: • Engineering science • Laboratory • Complementary studies • Project/experiential based

  34. Curriculum mapping surveying • U Guelph is developing Currickit: curriculum mapping software • An online survey, completed by each instructor, describes whether an attribute is developed, assessed, or both • The software collects the data and reports on attributes in the program • U Calgary surveyed instructors to find out where attributes are Introduced, Taught, or Utilized (ITU) in courses

  35. Engineering Graduate Attribute Development (EGAD) Project Two approaches to mapping • Attributes to courses • “We know what we want the program to look like – how well do the attributes line up with our curriculum?” • Courses to attributes • “Let’s do a survey of our instructors, and determine experiences appropriate to developing and assessing attributes.” Can do this one way or both ways
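Both mapping directions described above can be served by one underlying data structure. As a minimal sketch (the course codes, ITU entries, and helper names below are invented for illustration, not taken from any program):

```python
# Sketch of a curriculum map using the ITU scheme (Introduced, Taught,
# Utilized). All courses and entries are hypothetical.

curriculum_map = {
    "ENGR 101": {"Communication skills": "I", "Design": "I"},
    "ENGR 200": {"Design": "T", "Lifelong learning": "I"},
    "ENGR 400": {"Design": "U", "Communication skills": "U"},
}

def courses_for(attribute, level):
    """Attribute-to-courses view: where does this attribute sit at a given ITU level?"""
    return sorted(course for course, attrs in curriculum_map.items()
                  if attrs.get(attribute) == level)

def attributes_in(course):
    """Course-to-attributes view, as an instructor survey would report it."""
    return curriculum_map.get(course, {})

print(courses_for("Design", "U"))   # ['ENGR 400']
print(attributes_in("ENGR 200"))    # {'Design': 'T', 'Lifelong learning': 'I'}
```

Querying the same dictionary both ways is what lets a program answer "where is lifelong learning developed?" (attributes to courses) and "what does first year cover?" (courses to attributes) from one survey.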

  36. Engineering Graduate Attribute Development (EGAD) Project ITU Analysis (UCalgary): Introduced

  37. Engineering Graduate Attribute Development (EGAD) Project ITU Analysis: Taught

  38. Engineering Graduate Attribute Development (EGAD) Project ITU Analysis: Utilized

  39. Attributes to courses: Lifelong learning Engineering Graduate Attribute Development (EGAD) Project

  40. Courses to attributes: First year Engineering Graduate Attribute Development (EGAD) Project

  41. Engineering Graduate Attribute Development (EGAD) Project Example: ABET recommends mapping tables

  42. Engineering Graduate Attribute Development (EGAD) Project Program-wide assessment process flow (a cycle): Identify major objectives (including graduate attributes) → Identify indicators → Map to courses/experiences → Identify appropriate tools to assess (reports, simulation, tests, ...) → Course changes/measure → Analyze and evaluate data → Program improvement (informed by stakeholder input) → back to identifying objectives

  43. Assessment tools • Direct measures – directly observable or measurable assessments of student learning • E.g. Student exams, reports, oral examinations, portfolios, etc. • Indirect measures – opinion or self-reports of student learning or educational experiences • E.g. grades, student surveys, faculty surveys, focus group data, graduation rates, reputation, etc. How to measure learning against specific expectations?

  44. Engineering Graduate Attribute Development (EGAD) Project Assessment tools • Local written exam (e.g. question on a final) • External examiner (e.g. reviewer on design projects) • Standardized written exam (e.g. Force Concept Inventory) • Oral exam (e.g. design project presentation) • Performance appraisal (e.g. lab skill assessment) • Oral interviews • Simulation (e.g. emergency simulation) • Surveys and questionnaires • Behavioural observation (e.g. team functioning) • Focus group • Portfolios (student-maintained material addressing outcomes) • Archival records (registrar's data, previous records, ...)

  45. Instructors: “We do assess outcomes – by grades” But can a student transcript answer questions like: • How well does the program prepare students to solve open-ended problems? • Are students prepared to continue learning independently after graduation? • Do students consider the social and environmental implications of their work? • What can students do with knowledge (plug-and-chug vs. evaluate)? Course grades usually aggregate assessment of multiple objectives, and are indirect evidence for some expectations

  46. External assessment tools • Concept inventories (Force Concept Inventory, Statics concept inventory, Chemistry Concept Inventory, …) • Surveys of learning, engagement, etc. • National Survey of Student Engagement (National data sharing, allowing internal benchmarking), E-NSSE • Course Experience Questionnaire • Approaches to Studying Inventory • Academic motivation scale • Engineering attitudes survey

  47. Targets and thresholds • Need to be able to explain what level of performance is expected of students • Useful to consider the minimum performance expectation (threshold) and what a student should be able to do (target) • Rubrics can be very useful
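The threshold/target distinction above lends itself to a simple summary per indicator. As a sketch (the scores, cut-offs, and function name are invented for illustration; a real program would set its own scales):

```python
# Sketch: compare student scores on one indicator against a minimum
# expectation (threshold) and a desired expectation (target).
# All numbers below are hypothetical.

def performance_summary(scores, threshold, target):
    """Fraction of students at or above each expectation level."""
    n = len(scores)
    return {
        "meets_threshold": sum(s >= threshold for s in scores) / n,
        "meets_target": sum(s >= target for s in scores) / n,
    }

scores = [55, 62, 71, 78, 84, 90]
summary = performance_summary(scores, threshold=60, target=75)
print(summary)  # {'meets_threshold': 0.8333..., 'meets_target': 0.5}
```

Reporting both fractions makes the "evaluation of the data collected ... relative to program expectations" step in the CEAB instructions concrete: a gap between the threshold and target fractions is a candidate for program improvement.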

  48. Engineering Graduate Attribute Development (EGAD) Project Rubrics • Improve inter-rater reliability • Describe expectations for instructor and students

  49. Rubric example • Creating defined levels (“scales”) of expectations reduces variability between graders and makes expectations clear to students • (The example rubric marks one level as the threshold and a higher level as the target)

  50. Task: Assessment tools (5 min) • Take some assessment criteria developed by your group previously: • Determine three ways they could be assessed (a list of assessment tools is on the summary sheet), at least one using a direct assessment tool • If any are difficult to measure, consider whether the criteria should be modified
