
Assessing multi-disciplinary teams 


Presentation Transcript


  1. Assessing multi-disciplinary teams. Scott Schaffer, Purdue University; Bill Oakes, Purdue University; Carla Zoltowski, Purdue University; Margaret Huyck, Illinois Institute of Technology; Mary Raber, Michigan Technological University; John Ochs, Lehigh University; Lisa Getzler-Linn, Lehigh University. Best Assessment Practices Symposium, Indianapolis, IN, April 3-4, 2009

  2. Four Partner Universities

  3. Four institutions with common goals related to project-based team experiences: 1) The “idea to commercialization” process 2) Design thinking 3) Teamwork and leadership 4) Professional and ethical behavior

  4. Overview (http://ipro.iit.edu/): • Undergraduates are required to participate in two 3-credit multidisciplinary projects • Projects are self-selected by the student • Teams average 12 students and one or more faculty supervisors • Project topics vary based on sponsors

  5. IPD Projects (http://www.lehigh.edu/ipd) • Undergraduate, 2 semesters, 5 or 6 credits • Multidisciplinary teams of 5 - 7, industry mentor, faculty advisor, TA or senior peer mentor • Placed on team based on affinity to project, major, GPA, interview questionnaire • Projects focus on an established or start-up company: solving a real-world technical problem in a business context • Experiential learning measured with authentic assessment tools • Monthly assessment of progress and team/individual effort by self, peers, faculty and industry sponsor • Measure project progress, teamwork, development & effective use of project management, communication, creative problem solving and discipline-specific skills
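
The monthly self/peer/faculty/sponsor assessment described on this slide lends itself to a simple weighted roll-up. Below is a minimal sketch of that idea in Python; the 1-5 scale, the rater-group weights, and the example ratings are illustrative assumptions, not the IPD program's actual rubric.

```python
# Minimal sketch: combine one student's monthly effort ratings from the four
# rater groups named on the slide. The weights and the 1-5 scale are assumptions.

RATER_WEIGHTS = {"self": 0.10, "peer": 0.30, "faculty": 0.30, "sponsor": 0.30}

def monthly_effort_score(ratings):
    """Weighted average of each rater group's mean rating (1-5 scale)."""
    weighted_sum, total_weight = 0.0, 0.0
    for group, weight in RATER_WEIGHTS.items():
        group_ratings = ratings.get(group, [])
        if group_ratings:                          # ignore groups with no ratings
            weighted_sum += weight * sum(group_ratings) / len(group_ratings)
            total_weight += weight
    return round(weighted_sum / total_weight, 2) if total_weight else 0.0

# Example month for one student: one self rating, four peers, one faculty
# advisor, one industry sponsor.
print(monthly_effort_score({
    "self": [4.0],
    "peer": [3.5, 4.0, 4.5, 3.0],
    "faculty": [4.0],
    "sponsor": [3.5],
}))
```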

  6. Learning Objectives (http://www.enterprise.mtu.edu/): • Strong skills in communication and persuasion • The ability to lead and work effectively as a member of a multi-disciplinary team • A sound understanding of non-technical forces that affect engineering decisions • An awareness of global markets and competition • Demonstrated management skills and a strong business sense

  7. Overview & Learning outcomes (http://epics.ecn.purdue.edu/): Teams of undergraduates are designing, building, and deploying real systems to solve engineering-based problems for local community service and education organizations. Outcomes: • Discipline knowledge • Design process • Lifelong learning • Customer awareness • Teamwork • Communication • Ethics • Social context

  8. Theoretical Foundations for Project-based Learning: 1. Learning is an active process of constructing rather than acquiring knowledge, and 2. Instruction is a process of supporting construction rather than communicating knowledge (Duffy and Cunningham, 1996, p. 171). • Experiential Learning (Dewey, 1916; 1929) • Constructivist Learning Environments (Jonassen, 1994; Duffy & Cunningham, 1996) • Authentic Learning (Brown, Collins, and Duguid, 1989)

  9. Design Assumptions in CLE (from Duffy & Cunningham, 1996) 1. All knowledge is constructed; all learning is a process of construction 2. Many world views can be constructed: hence there will be multiple perspectives 3. Knowledge is context dependent, so learning should occur in contexts to which it is relevant 4. Learning is mediated by tools and signs 5. Learning is an inherently social-dialogical activity 6. Learners are distributed, multidimensional participants in a socio-cultural process 7. Knowing how we know is the ultimate human accomplishment

  10. What knowledge do we expect learners to construct? “A major criterion for assessing knowledge construction outcomes must be originality.” (Jonassen, 1994) • Create new goals and methods for learning • Solve relevant problems • Develop and defend a position

  11. What to assess in a CLE? • Knowledge construction; Process & Product • Learning outcomes should reflect intellectual processes of higher order thinking • Merrill’s “find” • Gagne’s “cognitive strategies” • Bloom’s “synthesis”

  12. Learning Assessment & Evaluation Concepts Performance assessment is testing complex, higher order k/s (knowledge and skills) in the real-world context in which they are actually used, generally with open-ended tasks that require substantial examinee time to complete (p. 5). • Norm-referenced testing • Criterion-referenced testing (Shrock & Coscarelli, 1989) • “Authentic” assessment = Performance assessment (Swanson, Norman, & Linn, 1995) • Formative Assessment and Summative Evaluation • Individual and Team
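
The norm-referenced versus criterion-referenced distinction can be made concrete with a small sketch: the same raw scores are interpreted either relative to the group (a percentile rank) or against a fixed cut score. The scores and the 70-point cut score below are made-up examples, not data from any of the four programs.

```python
# Illustrative contrast between norm-referenced and criterion-referenced
# interpretation of the same scores.

scores = {"Ana": 82, "Ben": 74, "Cam": 68, "Dee": 91, "Eli": 59}

def percentile_rank(name, scores):
    """Norm-referenced: percent of all scores that fall below this examinee's."""
    below = sum(1 for s in scores.values() if s < scores[name])
    return 100.0 * below / len(scores)

def meets_criterion(name, scores, cut_score=70):
    """Criterion-referenced: pass/fail against a fixed standard (the cut score)."""
    return scores[name] >= cut_score

for name in sorted(scores, key=scores.get, reverse=True):
    print(f"{name}: {percentile_rank(name, scores):4.0f}th percentile, "
          f"meets criterion: {meets_criterion(name, scores)}")
```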

  13. Types of Performance Assessment • Observation • Open-ended problems • Microworlds • Essays • Simulations • Projects • Portfolios

  14. Cognitive activity and Structure of Knowledge (Baxter & Glaser, 1998)

  15. Assessing Process and Product • Formative assessment of understanding (integrated into learning) • Product should reflect metacognitive awareness resulting from learning experiences • Criteria for assessing products should be authentic – that is, reflect meaningful, real-world criteria

  16. Assessment Approaches across Institutions: Individual performance • Reflective journal or self-assessment • Individual notebook • Peer evaluation
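
One common way to fold peer evaluations into individual grades is to divide each member's mean received rating by the team-wide mean, yielding a multiplier on the team grade. The sketch below shows that general technique; the names, ratings, and 1-5 scale are illustrative, and this is not any partner institution's actual instrument.

```python
# Sketch: peer ratings -> individual contribution multiplier.

peer_ratings = {
    # rater -> {ratee: rating given on a 1-5 scale}
    "Ana": {"Ben": 4, "Cam": 5, "Dee": 3},
    "Ben": {"Ana": 4, "Cam": 5, "Dee": 2},
    "Cam": {"Ana": 5, "Ben": 4, "Dee": 3},
    "Dee": {"Ana": 4, "Ben": 3, "Cam": 5},
}

def contribution_factors(peer_ratings):
    """Mean rating each member received, scaled by the team-wide mean."""
    received = {}
    for rater, given in peer_ratings.items():
        for ratee, rating in given.items():
            received.setdefault(ratee, []).append(rating)
    means = {name: sum(r) / len(r) for name, r in received.items()}
    team_mean = sum(means.values()) / len(means)
    return {name: round(m / team_mean, 2) for name, m in means.items()}

factors = contribution_factors(peer_ratings)
print(factors)  # e.g. individual grade = team grade * factor, possibly capped
```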

  17. Assessment Approaches across Institutions: Team performance • Video-based assessment • Product quality evaluation • Client evaluation • Advisor evaluation

  18. What do learners know? • Initial Knowledge Acquisition: well-structured domains; skills based • Advanced Knowledge Acquisition: ill-structured domains; knowledge based • Expertise: elaborate structures; interconnected knowledge (Spiro et al., 1988; Duffy & Jonassen, 1992)

  19. Cross-disciplinary Team Learning • Cross-disciplinary Learning: Islands of Knowledge, Awareness, Appreciation, Understanding (Fruchter & Emery, 1999) • Individual Learning and Motivation: Self- and Collective Efficacy (Bandura, 2001) • Context: Activity Systems (Vygotsky, 1978; Engeström, 1987)

  20. Cross-disciplinary Team Learning Framework [framework diagram]. Phases: Formation, Identification, Individual-Process, Knowledge-Acquisition, Knowledge-Creation, Team-Outcome, and Adaptation, linked to Self-Efficacy and Collective-Efficacy. Elements include team goal setting, leadership, role identification, trust, interdependence, social support; peer, client, and expert feedback; communication & collaboration tools, information tools, cognitive & knowledge-creation tools; awareness, appreciation, understanding; self-assessment, information seeking, personal goal setting, strategic planning, self-monitoring; goal alignment, shared mental model; creativity and innovation.

  21. Assessment Imperatives • MODEL: Define and categorize problem structures engaged in by team/team members; analyze cognitive, affective, and conative requirements • OBSERVE: Identify strategic, procedural, systemic or conceptual knowledge assessment tools (what, how, and why do they do what they do) • INTERPRET: Correlate assessment with individual/team performance; align assessment approaches within and across social-cultural-institutional contexts
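
The INTERPRET step calls for correlating assessment results with individual/team performance. A minimal sketch of that calculation (Pearson's r computed in plain Python on made-up scores) is shown below.

```python
from math import sqrt

def pearson_r(x, y):
    """Pearson correlation between two equal-length lists of scores."""
    n = len(x)
    mean_x, mean_y = sum(x) / n, sum(y) / n
    cov = sum((a - mean_x) * (b - mean_y) for a, b in zip(x, y))
    var_x = sum((a - mean_x) ** 2 for a in x)
    var_y = sum((b - mean_y) ** 2 for b in y)
    return cov / sqrt(var_x * var_y)

# Made-up example: peer-assessment scores vs. advisor-rated project performance
# for five students. A high r suggests the assessment tracks performance.
peer_scores = [3.2, 4.1, 2.8, 4.5, 3.9]
performance = [72, 85, 70, 90, 78]
print(round(pearson_r(peer_scores, performance), 2))
```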

  22. NSF Project Y2 Goals 1. Align objectives, activities and assessments (OAA) within each program 2. Identify high-value OAA combinations within each context 3. Fully describe these combinations from a social-cultural perspective in case studies 4. Create and/or modify existing assessment instruments based on goals 1-3

  23. References and Resources • Bandura, A. (2000). Exercise of human agency through collective efficacy. Current Directions in Psychological Science, 9(3), 75-78. • Brown, J., Collins, A., & Duguid, P. (1989). Situated cognition and the culture of learning. Educational Researcher, 18(1), 32-42. • Dewey, J. (1916). Democracy and education: An introduction to the philosophy of education (1966 edn.). New York: Free Press. • Engeström, Y. (1987). Learning by expanding: An activity-theoretical approach to developmental research. Helsinki: Orienta-Konsultit. • Fruchter, R., & Emery, K. (1999). Teamwork: Assessing cross-disciplinary learning. In C. Hoadley & J. Roschelle (Eds.), Proceedings of the Computer Support for Collaborative Learning (CSCL) 1999 Conference. Stanford University, Palo Alto, California. • Huyck, M., & Ferguson, D. (2007, June). Assessing factors contributing to undergraduate multidisciplinary project team effectiveness. Proceedings of the American Society for Engineering Education (ASEE) Annual Conference, Honolulu, Hawaii. • Lei, K., & Schaffer, S. P. (2008, March). Theoretical development and empirical validation for a cross-disciplinary team learning (CDTL) model. Proceedings of the American Educational Research Association (AERA) Conference, New York, NY, USA. • Ochs, J. B., Watkins, T. A., & Boothe, B. W. (2001). Creating a truly multidisciplinary entrepreneurial educational environment. Proceedings of the 2001 American Society for Engineering Education Annual Conference & Exposition. • Plichta, M. R., & Raber, M. (2003). The Enterprise Program at Michigan Tech University: Results and assessment to date. Proceedings of the 2003 ASEE Conference. • Schaffer, S. P., Lei, K., Reyes, L., Oakes, W. C., & Zoltowski, C. (2007, June). Assessing design team learning in a collaborative service learning environment. Proceedings of the American Society of Engineering Education (ASEE) Conference, Honolulu, HI. • Schaffer, S. P., Lei, K., & Reyes Paulino, L. (2008). A framework for cross-disciplinary learning and performance. Performance Improvement Quarterly, 21(3), 1-16. Resources: • Assessments for Capstone Engineering Design. Developed by the Transferable Integrated Design for Engineering Education Consortium – http://tidee.org • Authentic Task Design – http://www.authentictasks.uow.edu.au/index.html. Authentic activity as a model for web-based learning; project funded by the Australian Research Council.
