
The Development of a Comprehensive Assessment Plan: One Campus’ Experience




  1. The Development of a Comprehensive Assessment Plan: One Campus’ Experience Bruce White ISECON 2007

  2. Feedback / accountability • “For society to work […] we must be accountable for what we do and what we say.” – Betty Dowdell • “No person can succeed unless he or she is held accountable” – Grant Wiggins • “Feedback is the breakfast of champions” – Ken Blanchard • “You need a culture of assessment, not a climate” – Gloria Rogers

  3. Overview • Are we teaching what we say we are? • Are students learning? • How can we be more effective in our instruction? SO … • What do we want students to learn? • Why do we want them to learn it? • How can we help them to learn it? • How do we know what they have learned?

  4. Furthermore . . . • Stakeholders want to see if we are accomplishing our goals of education. • Possible stakeholders: • Students • Parents • Employers • Board of Regents / State Agencies • Accrediting groups (AACSB / ABET / etc.) • Faculty • Alumni

  5. Our campus program • The Information Systems Management program at Quinnipiac University in Hamden, Connecticut began its journey toward a comprehensive assessment program in 2003. • Prior ‘assessment’ was informal: • Ask ISM faculty – how do you think we are doing? • Ask the Advisory Board – what advice do you have for us? • Ask the IS education community – what should we teach (e.g., the IS2002 model curriculum)? • Ask employers – what do our students need to know (or know better)? • Etc.

  6. Our desired outcomes: • Analysis and design of information systems that meet enterprise needs. • Use of and experience with multiple design methodologies. • Experience in the use of multiple programming languages. • Development of hardware, software, and networking skills. • Understanding of data management. • Understanding of the role of IS in organizations.

  7. Possible assessment methods: Direct Assessment Methods: • Simulations • Behavioral Observations • Performance Appraisals • Locally Developed Exams • External Examiner • Portfolios / E-portfolios • Oral exams • Standardized Exams (Source: Gloria Rogers- ABET Community Matters 8-06)

  8. Indirect Assessment Methods • Exit and other interviews • Archival data • Focus groups • Written or electronic surveys / questionnaires • Senior exit surveys • Alumni surveys • Employer surveys • Other factors: • IS model curriculum • Advisory board

  9. Foundation of Our Assessment Program • We became interested in the CCER IS Assessment test early. • It is a direct assessment test based on the IS2002 model curriculum. • It has been thoroughly tested and analyzed, and has been shown to be valid and reliable. • Test scores are reported for 37 different areas – all relevant to our learning outcomes. • The test questions are written at the higher levels of Bloom’s taxonomy, using scenarios.

  10. More on our assessment process • We also use a senior exit survey (indirect measure) • An advisory board gives input • Informal controls: • Campus decisions (such as number of credits allowed, changes in general education) • Model Curriculum Changes • Employer input • Conferences / technologies

  11. Specific Learning Skills

  12. Continued

  13. Overall Analysis

  14. Senior Exit Survey

  15. Next Step – setting metrics • So … we have a solid direct measurement. • And … we have a good indirect measure. • Now … what? • We are working on setting metrics (especially for our direct measurement – the CCER IS Assessment test). • From ABET: • “Every student does not have to achieve the desired outcomes, but targets must be defined.”
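A target of the kind ABET describes can be phrased as “at least X% of students score at or above Y.” As a minimal sketch of what checking such a metric might look like, the Python below uses invented scores and thresholds purely for illustration (the CCER test actually reports scores across 37 areas, which are not reproduced here):

```python
def outcome_met(scores, target_score, target_fraction):
    """Return True if at least `target_fraction` of students
    scored at or above `target_score`."""
    if not scores:
        return False  # no data: the target cannot be judged as met
    passing = sum(1 for s in scores if s >= target_score)
    return passing / len(scores) >= target_fraction

# Hypothetical example: target is 70% of students scoring 60 or better.
cohort = [72, 55, 80, 64, 58, 91, 67, 73]   # invented scores
print(outcome_met(cohort, target_score=60, target_fraction=0.70))
```

The point of defining the target before the data arrive is that the follow-up question on the next slide – did we reach our outcomes or not – has an unambiguous answer.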

  16. Setting expectations • The faculty have considered the goals of the program and feel it emphasizes systems analysis, the role of IS in organizations, and data management.

  17. Now what • Next year, as students take the CCER IS Assessment test and as we get feedback from our senior survey, we will analyze the data to see if our outcomes have been reached. • If they have:  • If they haven’t: • We analyze why not – was it poor instruction? Poor students? Poor textbook? Overly optimistic expectations? • We change in an effort to ‘constantly and continually improve’ our program!
