
Capstone Engineering Design Learning and Assessment


Presentation Transcript


  1. Capstone Engineering Design Learning and Assessment Denny Davis, PhD, PE Washington State University Engineering Education Seminar Purdue University September 18, 2008

  2. Acknowledgements
• Project Leadership Team: Denny Davis, Howard Davis, Michael Trevisan, Shane Brown (Washington State University); Steven Beyerlein, Jay McCormack (University of Idaho); Phillip Thompson (Seattle University); Olakunle Harrison (Tuskegee University)
• Project Consultants: Susannah Howe (Smith College); Patricia Brackin (Rose-Hulman Institute of Technology); Paul Leiffer (LeTourneau University); Durward Sobek (Montana State University); Jerine Pegg (University of Idaho)
• Funding: NSF DUE 0717561, Capstone Engineering Design Assessment: Development, Testing, and Adoption Research

  3. Project Goal and Objectives
• Goal: Develop an integrated system for effective, sustainable assessment of capstone engineering design outcomes
• Objectives:
  • Develop an assessment system suitable for broad adoption in capstone engineering design courses
  • Document effectiveness of the assessment system to measure student achievement
  • Investigate factors that affect assessment adoption by the capstone engineering design community

  4. Guiding Research Questions
• To what extent can assessments measure desired performances in learner and solution development?
• How can assessments be integrated effectively into capstone design courses?
• How can adoption of assessments be encouraged in capstone design courses?

  5. What Outcomes?

  6. Areas of Performance [diagram]
• Learner Development: Professional Development, Teamwork
• Solution Development: Design Processes, Solution Assets
• Design process steps (with iteration/reflection): 1. Recognition of Challenge; 2. Information Gathering; 3. Problem Definition; 4. Idea Generation; 5. Idea Evaluation; 6. Idea Refinement; 7. Implementation

  7. Learner Development
• Professional Development: Individuals performing and improving individual skills and attributes essential to engineering design
• Teamwork: Teams developing and implementing collective processes that support team productivity in design

  8. Solution Development
• Design Processes: Practices implemented that effectively and efficiently facilitate the production of valuable project assets
• Solution Assets: Results from a design project that meet needs and deliver satisfaction and value to key project stakeholders

  9. Assessment Framework

  10. Capstone Course Assessment Framework (Assessment Triangle*) [diagram]
• Observation: Sampling (student sample, knowledge sample, time of sample); Measures (outcomes, levels, metrics); Tasks (individual tasks, team tasks)
• Interpretation: Scoring (training, reliability); Performance Criteria (learners, solutions); Reporting (learning, grading, improvement)
• Model: Profile of Learner (background, skill set, motivation); Profile of Professional (roles, behaviors); Expectations (students, faculty, clients, administrators, accreditors, employers); Course Context (project mix, professor preparation, infrastructure/resources, role in program)
*NRC, Knowing What Students Know.

  11. Performance Criteria: Learner
• Professional Development: Individuals document professional development aligned with their personal and project needs, professional behaviors, and ways of a reflective practitioner.
• Teamwork: Teams demonstrate high productivity, synergistic individual and joint contributions, a supportive team climate, and well-developed team processes.

  12. Performance Criteria: Solution
• Design Processes: Designers resourcefully iterate among problem scoping, concept generation, and solution realization activities to co-develop problem understanding and a responsive design solution.
• Solution Assets: Designers deliver and effectively defend solutions that satisfy stakeholder needs for functionality, financial benefit, implementation feasibility, and impacts on society.

  13. Purposes of Assessment
• Measure Achievement:
  • Guide changes in instruction
  • Gather data for grading
  • Document student achievement in course
  • Study learning processes
• Facilitate Learning:
  • Guide learners’ effort to greater learning
  • Teach self- and peer-assessment skills
  • Establish reflective practitioner mindset

  14. Assessment Design

  15. Assessing Reflective Practice
• Instructional Activities: assign reflections on performances; assign similar reflections multiple times
• Evidence: quality of reflections improves over time; reflective assignments reveal gains from natural reflective practice
• [diagram] Reflection on Goals → Reflection on Progress → Reflection on Achievement

  16. Capstone Course Context [diagram: Learner Development and Solution Development assessed across the Problem Scoping, Concept Generation, and Solution Realization phases of the project timeline]

  17. Structure of Assignments [two-column table]
• Formative: Self-Rating (importance, level); Set Target (describe it, action plan); Progress (a strength, an opportunity)
• Summative: Self-Rating (importance, level); Strengths (describe them, explain causes); Extension (envision it, define impacts)

  18. Structure of Feedback [diagram of a scoring form]
• Each performance metric is scored per factor (e.g. Factor 1, Factor 2) against Levels 1-5, shown alongside the student's self-rating
• Student response elements scored: descriptions, analysis, extension
• Narrative feedback: comments and suggestions
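The feedback structure above (per-factor scores on a 1-5 scale, a student self-rating, and narrative comments) can be sketched as a simple record. This is a minimal illustration; the class and field names are assumptions, not the project's actual data model:

```python
from dataclasses import dataclass, field

@dataclass
class FactorScore:
    name: str   # e.g. "Descriptions", "Analysis", "Extension"
    level: int  # instructor-assigned level, 1 (low) to 5 (high)

@dataclass
class MetricFeedback:
    metric: str                                  # performance metric assessed
    self_rating: int                             # student's own 1-5 rating
    factors: list = field(default_factory=list)  # FactorScore entries
    narrative: str = ""                          # comments and suggestions

    def gap(self) -> float:
        """Difference between self-rating and mean instructor level."""
        if not self.factors:
            return 0.0
        mean = sum(f.level for f in self.factors) / len(self.factors)
        return self.self_rating - mean

fb = MetricFeedback(
    metric="Teamwork",
    self_rating=4,
    factors=[FactorScore("Descriptions", 3), FactorScore("Analysis", 4)],
    narrative="Good analysis; describe specific team contributions.",
)
print(fb.gap())  # 0.5
```

A positive gap flags a student rating themselves above the instructor's scoring, which is one signal the formative feedback loop could surface.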

  19. Assessing Learner Development [timeline diagram]
• Professional Development (individual focus): Growth Planning [I] (feedback) → Growth Progress [I] (feedback) → Growth Achieved [I] (assessment); Professional Practices [I] (feedback)
• Team Development (team focus): Team Contract [T] (feedback) → Team Member Citizenship [I] (feedback, repeated) → Team Processes [I] (feedback) → Teamwork Achieved [I] (assessment)
• Activities span the Problem Scoping, Concept Generation, and Solution Realization phases of the project timeline
• Notation: [I] = individual assignment; [T] = team assignment

  20. Professional Development Assessments: Growth Planning, Growth Progress, Growth Achieved, Professional Practices

  21. Teamwork Assessments: Team Contract, Team Member Citizenship, Team Processes, Teamwork Achieved

  22. Assessing Solution Development [timeline diagram]
• Activities: Problem Scoping Processes → Concept Generation Processes → Solution Realization Processes (design, implementation), each with [I] feedback
• Evaluation: Defined Problem [T] / Design Reflection [I] → Selected Concept [T] / Design Reflection [I] → Proposed Solution [T] / Design Reflection [I] assessments
• Activities span the Problem Scoping, Concept Generation, and Solution Realization phases of the project timeline
• Notation: [I] = individual assignment; [T] = team assignment

  23. Design Processes/Solution Assets [diagram]
• Design phases: problem scoping, concept generation, solution realization
• Assets: defined problem, selected concept, proposed solution
• Each design phase's processes produce the corresponding asset, with a Design Reflection linking process and asset

  24. Assessment Evaluation

  25. Assessment Validity and Reliability
• Scoring Reliability: inter-rater reliability
• Content Validity: instructor and practitioner content analysis
• Value to Users: value gained by students and instructors
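For the scoring-reliability item above, agreement between two raters scoring the same work on a categorical rubric is commonly summarized with Cohen's kappa. The slide does not name the statistic the project used, so this is a generic sketch:

```python
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Cohen's kappa for two raters scoring the same items on a
    categorical scale (e.g. rubric levels 1-5)."""
    assert len(rater_a) == len(rater_b) and rater_a
    n = len(rater_a)
    # Observed agreement: fraction of items where the raters match.
    p_o = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    # Expected agreement if the raters scored independently.
    ca, cb = Counter(rater_a), Counter(rater_b)
    p_e = sum(ca[k] * cb[k] for k in set(ca) | set(cb)) / (n * n)
    if p_e == 1.0:          # both raters used a single category throughout
        return 1.0
    return (p_o - p_e) / (1 - p_e)

# Hypothetical example: two raters scoring six student reports on a
# five-level rubric.
a = [3, 4, 4, 2, 5, 3]
b = [3, 4, 3, 2, 5, 4]
print(round(cohens_kappa(a, b), 3))  # 0.538
```

Kappa near 1 indicates strong agreement beyond chance; values near 0 suggest the rubric or rater training needs work, which is what the project's scoring-training step addresses.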

  26. Web-Based Implementation

  27. Why Web-Based Implementation?
• Practicality: reduce paperwork of the assessment process; enable automated processing of data; facilitate testing of assessments
• Effectiveness: more rapid feedback to students (learning); better view of team member performances; supports reflection on progress and learning

  28. Web-Based System Concept [cycle diagram]
1. Instructor makes assignment (what, when)
2. Students complete assignment (ratings, explanations)
3. Instructor prepares feedback (ratings, comments)
4. Student retrieves feedback from peers (anonymous) and from instructor
5. Instructor receives summary data (ratings, comparisons)
Responses are saved and compiled in a secure database.
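The assignment/feedback round trip in the web-based system concept can be sketched as a minimal in-memory model. The class and method names here are hypothetical, and the real system persists responses in a secure web database rather than in dictionaries:

```python
class AssessmentSystem:
    """Toy model of the instructor/student round trip (names assumed)."""

    def __init__(self):
        self.assignments = {}  # name -> prompt
        self.responses = {}    # (name, student) -> response text
        self.feedback = {}     # (name, student) -> list of comments

    def make_assignment(self, name, prompt):          # step 1: instructor
        self.assignments[name] = prompt

    def submit(self, name, student, response):        # step 2: student response,
        self.responses[(name, student)] = response    # saved and compiled

    def give_feedback(self, name, student, comment):  # step 3: instructor/peers
        self.feedback.setdefault((name, student), []).append(comment)

    def retrieve_feedback(self, name, student):       # step 4: student
        return self.feedback.get((name, student), [])

    def summary(self, name):                          # step 5: instructor summary
        return {s: len(self.feedback.get((n, s), []))
                for (n, s) in self.responses if n == name}

system = AssessmentSystem()
system.make_assignment("growth-planning", "Set a professional growth target.")
system.submit("growth-planning", "alice", "Improve technical writing.")
system.give_feedback("growth-planning", "alice", "Add a measurable milestone.")
print(system.retrieve_feedback("growth-planning", "alice"))
# ['Add a measurable milestone.']
```

The point of the sketch is the data flow, not the storage: each step reads only what earlier steps wrote, which is what lets the web system automate compilation and reporting.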

  29. Make Assignment

  30. Student Coaching of Members [screenshot with prompts: "What makes it strong?" "How does it benefit the team?"]

  31. Scoring of Student Work [screenshot; field for comments or suggestions]

  32. Summary
• Measure desired performances: individual and team development; design process and solution development; reflective practices
• Integration into capstone design: assignments for instruction (formative); assignments for assessment (summative)
• Adoption of assessments: web-based implementation; reference-based performance scoring; testing underway

  33. Questions? Contact Denny Davis, Washington State University: davis@wsu.edu or davis162@purdue.edu

  34. Scoring: Executive Summary

  35. Scoring: Solution Specifications

  36. Scoring: Proposed Solution
