
OUTCOMES ASSESSMENT VIA RUBRICS: A PILOT STUDY IN AN MIS COURSE AS A PRECURSOR TO A MULTIPLE MEASURE APPROACH

By W. R. Eddins, York College of Pennsylvania, and George Strouse, York College of Pennsylvania. Presented at NABET 2012.





Presentation Transcript


  1. OUTCOMES ASSESSMENT VIA RUBRICS: A PILOT STUDY IN AN MIS COURSE AS A PRECURSOR TO A MULTIPLE MEASURE APPROACH • By W. R. Eddins, York College of Pennsylvania, and George Strouse, York College of Pennsylvania • At NABET 2012

  2. Outcomes Assessment with Rubrics
  • Our previous research in Outcomes Assessment (OA) focused on inputs and outputs
  • This paper adds a process dimension to OA: a service-system-oriented approach
  • Activities related to Service System Development
    • Analysis and design
    • Development activities
    • Integration
    • Verification and validation
  • Experimental design
  • Findings
  • Conclusion

  3. Previous Research
  • A Revised Collaboration Learning Scale: A Pilot Study in an MIS Course (2011)
    • Subjects: students (10)
    • Independent variables: RIPLS, tests, quizzes, and case projects
    • Dependent variables: Likert scale and grades
    • Significant findings: collaboration improved and case study grades improved
  • Using the MSLQ to Measure Collaboration and Critical Thinking in an MIS Course (2010)
    • Subjects: students (14)
    • Independent variables: MSLQ, tests, quizzes, and case projects
    • Dependent variables: Likert scale and grades
    • Significant findings: peer learning improved, and quiz and case study grades improved

  4. Service System Oriented Approach
  • Capability Maturity Model Integration for Services (CMMI-SVC)
    • Forrester et al. (2011)
    • SEI (http://www.sei.cmu.edu/)
    • Maturity levels: initial, managed, defined, quantitatively managed, and optimized
  • Service System Development definition (p. 561): "The purpose of Service System Development (SSD) is to analyze, design, develop, integrate, verify, and validate service systems, including service system components, to satisfy existing or anticipated service agreements."

  5. Analysis & Design and Use Cases
  • Apply Rubric
    • Each faculty member will use the system to apply the rubric to student responses, both pre and post. The faculty member will cycle through the set of student responses to tests, case projects, or other assessment instruments, and select the appropriate milestone for each rubric criterion. The system will not display the student identifier or the type of the response (pre or post) as the faculty member cycles through the complete set of responses.
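The blind scoring loop described above can be sketched as follows. This is an illustrative Python sketch, not the authors' actual C#/Access implementation; the criterion names, milestone scale, and `rate` callback are all assumptions standing in for the real rubric and the faculty member's judgment.

```python
import random
from dataclasses import dataclass

@dataclass
class Response:
    student_id: str   # hidden from the rater during scoring
    phase: str        # "pre" or "post" -- also hidden
    text: str         # the only thing the rater sees

# Hypothetical criteria and milestone levels (e.g., benchmark..capstone).
CRITERIA = ["Define problem", "Identify strategies",
            "Propose solutions", "Evaluate outcomes"]
MILESTONES = [1, 2, 3, 4]

def apply_rubric(responses, rate):
    """Cycle through responses in random order, collecting one milestone
    score per rubric criterion. `rate(text, criterion)` stands in for
    the faculty member selecting a milestone; only the response text is
    presented, so the rater cannot tell student or pre/post status."""
    scores = {}
    for r in random.sample(responses, len(responses)):
        scores[(r.student_id, r.phase)] = {
            c: rate(r.text, c) for c in CRITERIA}
    return scores
```

Randomizing the presentation order and keying results by the hidden (student, phase) pair preserves the blinding during rating while still allowing pre/post comparison afterwards.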

  6. Analysis & Design and Data Model

  7. Developed using Visual Studio, C#, and Microsoft Access

  8. Integration
  • Sponsor: Dean of Academic Affairs
  • Assessment Program Learning Committee
  • Association of American Colleges and Universities (AACU) VALUE Rubrics
  • Professionalism and problem solving in the syllabus of IFS305 - Management Information Systems included in …
  • Vision statement of the department
  • Implemented in student memos

  9. Verification and Validation
  • AACU's problem-solving rubric
  • Face and content validity?

  10. Verification and Validation (continued)
  • Adds a "process" approach to validation with a focus on…
    • Establishing course/program quality goals
    • Monitoring goal/level attainment
    • Applying statistical process control to…
      • Identify current performance levels
      • Establish acceptable levels of performance
      • Compare current performance to goals
      • Identify process areas to improve
      • Develop a historical perspective
      • Analyze and report performance measures and goal attainment
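The statistical process control idea above can be made concrete with a minimal sketch. The slides do not specify the control-chart method, so this assumes a conventional 3-sigma rule: treat each course offering's mean rubric score as a process measurement, derive control limits from historical offerings, and compare the current offering to both the limits and a stated quality goal. All names and thresholds are illustrative.

```python
import statistics

def control_limits(history):
    """Return (lower, upper) 3-sigma control limits computed from a
    list of historical mean rubric scores (one per course offering)."""
    mu = statistics.mean(history)
    sigma = statistics.stdev(history)
    return mu - 3 * sigma, mu + 3 * sigma

def assess(current, history, goal):
    """Compare the current offering's mean score to the historical
    control limits (process stability) and to the quality goal."""
    lo, hi = control_limits(history)
    return {
        "in_control": lo <= current <= hi,  # stable relative to history
        "meets_goal": current >= goal,      # attains the quality goal
    }
```

An offering that is "in control" but below goal signals a stable process that needs improvement; one outside the limits signals a special cause worth investigating, which maps onto the "identify process areas to improve" and "develop a historical perspective" items above.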

  11. Experimental Design
  • Summer pilot study
  • Subjects: faculty who teach the course
  • Independent variable: AACU's rubric
  • Dependent variable: Likert-type scale
  • Assessment of student memo
  • Pre/post treatment

  12. Findings
  • Comparison of pre/post showed
    • Significant difference
    • Supports content validity
  • Inter-rater comparison
    • No significant difference
    • Supports item reliability
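The slides report which comparisons were significant but do not name the statistical test used. One self-contained way to make this kind of pre/post comparison on small samples of rubric scores is an exact permutation test on the difference of means; the sketch below is illustrative, not the authors' analysis.

```python
import itertools
import statistics

def permutation_p(pre, post):
    """Two-sided exact permutation test on the difference of means.
    Enumerates every way to split the pooled scores into two groups of
    the original sizes and returns the fraction of splits whose mean
    difference is at least as extreme as the observed one."""
    observed = abs(statistics.mean(post) - statistics.mean(pre))
    pooled = pre + post
    n = len(pre)
    count = total = 0
    for idx in itertools.combinations(range(len(pooled)), n):
        group_a = [pooled[i] for i in idx]
        group_b = [pooled[i] for i in range(len(pooled)) if i not in idx]
        diff = abs(statistics.mean(group_b) - statistics.mean(group_a))
        total += 1
        if diff >= observed - 1e-12:  # tolerance for float comparison
            count += 1
    return count / total
```

The same routine could be pointed at the two raters' score sets for the inter-rater comparison, where a large p-value (no significant difference) is consistent with the reliability result reported above. Exhaustive enumeration is only practical for small samples like this pilot's.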

  13. Conclusion
  • Multi-measure approach
  • Addressing the effort issue
    • Random selection (not the entire class)
  • Selection of independent variables depends upon
    • Faculty interest
    • Past variables
  • Hopefully, the tool can…
    • Ease effort considerations
    • Structure the process
  • The future for the tool

  14. Questions?
