
Interpreting and Using Assessment Results






Presentation Transcript


  1. Interpreting and Using Assessment Results The Lehman College Assessment Council http://www.lehman.edu/research/assessment/council-documents.php October 20, 2010

  2. Timeline • Spring 2010 • First Assessment Plan • Programs begin gathering evidence • Supporting workshops • Results and analysis reported • Learning objectives on syllabi • Fall 2010 • First completed assessment cycle of student learning goals • Identify goal/objective and begin gathering evidence on second goal (9/15) • Report on how spring assessment results were used (11/15) • Supporting workshops through the fall semester • Submission of fall assessment results • Syllabi collection • Spring 2011 • Middle States report due April 1 • Second completed assessment cycle of student learning goals • Analyze evidence • Report on how fall assessment results were used • Ongoing assessment

  3. Timeline: Fall 2010 • September 15 • Assessment Plan identifying second major/program learning objective to be assessed. • November 15 • Completed Assessment Report, indicating how results from spring 2010 assessments are being used • Late December/January • Results from fall assessments due • Ongoing • Evidence gathering • Meetings with ambassadors • Syllabi revisions • Development opportunities • Planning for next Spring/Fall assessments

  4. Assessment as a Four-Step Continuous Cycle Source: Suskie (2004), p. 4.

  5. Step 1: Establishing Learning Goals “Assessment begins not with creating or implementing tests, assignments, or other assessment tools but by first deciding on your goals: what you want your students to learn” (Suskie 2004, p. 73).

  6. Overview of Exemplary Goals Across Departments • Identify the contributions of key figures and events to the historical development of sociology as a scientific discipline • Calculate and interpret descriptive and inferential statistics • Make ethical decisions by applying standards of the National Association of Social Workers code of ethics • Design and conduct a study using an appropriate research method • Demonstrate an understanding of the basic research process and advocacy for the protection of human subjects in the conduct of research

  7. Department Specific Samples of Exemplary Goals and Objectives

  8. Department Specific Samples of Exemplary Goals and Objectives

  9. Step 2: Provide Learning Opportunities • Provide multiple learning opportunities for a single goal across courses within a program • Articulate learning goals for every assignment • Identify specific important learning goals for each assignment, then create meaningful tasks or problems that correspond to those goals • Some goals are not easily quantifiable: habits of mind, behaviors, etc. • Allow time for reflection, honest self-appraisals of actions, and minute papers

  10. Learning Opportunities • Provide a variety of assignments and assignment types • Examples in Suskie, p. 158 • Will students learn significantly more from a larger assignment than a shorter one, enough to justify the time that they and you will spend on it? • Break large assignments into pieces that are due at various times (“scaffolding”) Suskie (2009), pp. 155-157

  11. Step 3: Assessing Student Learning: Gathering and Analyzing Data (Direct Evidence) • Direct evidence of student learning is tangible, visible, self-explanatory evidence of exactly what students have and haven’t learned. • Examples of Direct Evidence: • Embedded course assignments (written/oral) graded with a rubric • Department-wide exams • Standardized tests • Capstone projects • Field experiences • Score gains (pre-test/post-test) Suskie (2004), p. 95

  12. Assessing Student Learning: Gathering and Analyzing Data (Indirect Evidence) • Indirect evidence provides signs that students are probably learning, but evidence of exactly what they are learning may be less clear and less convincing. • Examples of Indirect Evidence: • Pre- and post-course surveys • Open-ended questionnaires and surveys • Focus groups • Tracking admissions to graduate and professional schools Suskie (2004), p. 95

  13. Assessing Student Learning: Gathering and Analyzing Data (Direct Evidence)

  14. Assessing Student Learning: Gathering and Analyzing Data (Indirect Evidence)

  15. Step 4: Closing the Loop: Using Results for Improvement In your report, you will be asked to . . . • Explain the implications of the assessment results for the program. • How can the results be used to improve planning, teaching and learning? • Are changes in the program suggested? If so, what kinds of changes? Are changes in the assessment plan indicated? If so, what kinds of changes? The program changes may refer to curriculum revision, faculty development, changes in pedagogy, student services, resource management and/or any other activity that relates to student success. • What, if any, additional information would help inform decision making regarding student achievement of the objective(s)? • What kinds of resources will you need to make changes?

  16. Evaluating the Quality of Your Assessment Process Using the results from the assessment of your first goal(s), discuss plans you have regarding your: • Learning Goals • Curriculum • Teaching Methods • Assessment Strategies and Tools (See handout for questions to guide your thinking for each of these categories.)

  17. Assessment Council Membership • Kofi Benefo (Sociology) kofi.benefo@lehman.cuny.edu • Salita Bryant (English) salita.bryant@lehman.cuny.edu • *Nancy Dubetz (ECCE) nancy.dubetz@lehman.cuny.edu • Robert Farrell (Lib) robert.farrell@lehman.cuny.edu • Judith Fields (Economics) judith.fields@lehman.cuny.edu • Marisol Jimenez (ISSP) marisol.jimenez@lehman.cuny.edu • Lynn Rosenberg (SLHS) lynn.rosenberg@lehman.cuny.edu • Renuka Sankaran (Biology) renuka.sankaran@lehman.cuny.edu • Robyn Spencer (History) robyn.spencer@lehman.cuny.edu • Minda Tessler (Psych) minda.tessler@lehman.cuny.edu • Janette Tilley (Mus) janette.tilley@lehman.cuny.edu *Committee Chair • Administrative Advisor / Assessment Coordinator: Ray Galinski, raymond.galinski@lehman.cuny.edu

  18. References/Resources • Suskie, L. (2004). Assessing student learning: A common sense guide. San Francisco: Anker Publishing Co. • Suskie, L. (2009). Assessing student learning: A common sense guide (2nd ed.). San Francisco: John Wiley & Sons.
