

  1. Using the National Survey of Student Engagement to Assess Educational Effectiveness at AICAD Schools. AICAD Consortium Meeting, Pratt Institute, NY, June 12, 2007

  2. Agenda • Introduction & NSSE overview • What can you learn about your students and their experience from NSSE? • NSSE Reports • Benchmarking • Consortium comparison • AICAD interests: marketing and institutional improvement • NSSE details • Timeline and Administration • Questions – What questions do you have right now?

  3. Introduction Activity: • Assessment at your institution: • What do you want to know about your students? • Why do you want to know this? • What is the purpose of your assessment initiative(s)? • To what extent have you used NSSE data?

  4. What is NSSE? • Student Engagement • The time and energy students devote to educationally purposeful activities, and the extent to which institutions emphasize effective practice • Engagement is a reliable predictor of student learning and personal development • Institutions can shape curricula and learning resources to promote engagement

  5. What Really Matters in College: Student Engagement Because individual effort and involvement are the critical determinants of impact, institutions should focus on the ways they can shape their academic, interpersonal, and extracurricular offerings to encourage student engagement. Pascarella & Terenzini, How College Affects Students, 2005, p. 602

  6. Foundations of Student Engagement • Time on task (Tyler, 1930s) • Quality of effort (Pace, 1960-70s) • Student involvement (Astin, 1984) • Social and academic integration (Tinto, 1987, 1993) • Good practices in undergraduate education (Chickering & Gamson, 1987) • Outcomes (Pascarella, 1985) • Student engagement (Kuh, 1991, 2005)

  7. Good Practices in Undergraduate Education (Chickering & Gamson, 1987; Pascarella & Terenzini, 2005) • Student-faculty contact • Active learning • Prompt feedback • Time on task • High expectations • Respect for diverse learning styles • Cooperation among students

  8. NSSE Survey & Results • Survey offers an annual snapshot of student participation in programs and activities that institutions provide for their learning and personal development. • Results provide an estimate of how undergraduates spend their time and what they gain from attending college. • NSSE items represent empirically confirmed ‘good practices’; they reflect behaviors associated with desired outcomes of college.

  9. NSSE 2006 Participating Colleges & Universities by Carnegie Classification

  10. Core Survey: NSSE • Based on research on effective educational practices • Designed and tested for high validity and reliability • Relatively stable over time • High credibility of self-reported data • Over 275,000 students at 600 institutions annually

  11. NSSE Survey Item Organization • Q.1 – Academic activities • Q.2 – Learning mental activities • Q.3 – Reading & writing • Q.4 – Homework • Q.5 – Academic challenge • Q.6 – Co-curricular activities • Q.7 – Enriching educational experiences • Q.8 – Campus relationships • Q.9 – Time usage • Q.10 – Institutional emphasis • Q.11 – Gains • Q.12-14 – Satisfaction

  12. NSSE Results • Are diagnostic; they help institutions look holistically at the undergraduate experience • Help pinpoint aspects not in line with the mission, or with what the institution expects • Identify strengths and weaknesses in the educational program • Help institutions know where to focus to improve student learning and success

  13. Questions to answer with NSSE results How many hours per week do first-year students spend studying? Do women study more than men? What % of seniors work with faculty members on activities other than coursework (committees, student life activities)? Does this differ by major? What % of first-year students (FY) and seniors (SR) spend 0 hours on co-curricular involvement? Is this more than at peer institutions? Do FY students work with classmates on assignments outside of class more frequently than their counterparts at peer institutions?
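A minimal sketch of how questions like these might be answered from the student-level data file NSSE returns, assuming it has been loaded into pandas. The column names and records below are hypothetical, not the actual NSSE variable names.

```python
import pandas as pd

# Illustrative records only; a real analysis would read the
# institution's NSSE student-identified data file.
df = pd.DataFrame({
    "class_level": ["FY", "FY", "FY", "FY", "SR", "SR"],
    "sex":         ["F",  "M",  "F",  "M",  "F",  "M"],
    "study_hours": [4, 3, 5, 2, 4, 6],  # coded hours-per-week category
})

# Do first-year women report more study time than first-year men?
fy = df[df["class_level"] == "FY"]
print(fy.groupby("sex")["study_hours"].mean())
```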

  14. Questions to answer with NSSE results Do NSSE results match our mission and what we say about a [INSTITUTION] experience? Are we meeting our own expectations for having a supportive campus environment? Since implementing a new multicultural education initiative and expanding diversity programming, has our score on the diversity scale changed? Are FY students who withdraw from the institution different in terms of engagement from students who are retained? How are we performing compared to our select peers (normative benchmarking) or to our institutionally identified standards (criterion benchmarking)?

  15. NSSE Deliverables • Institutional Report (August) • Comparison Reports • Respondent characteristics (Demographic Information) • Means and Frequencies (item averages and response percentages) • Benchmarks of Effective Educational Practice • Additional Reports (If Applicable) • FSSE Report • BCSSE Combined Report • Data file (student-identified) • NSSE Institute Information • Using NSSE Data • Accreditation Toolkit • Data Facilitator’s Guide

  16. Sample NSSE results: Frequency comparisons • Frequency Comparisons: About how many hours do you spend in a typical 7-day week participating in co-curricular activities (organizations, campus publications, student government, fraternity or sorority, intercollegiate or intramural sports, etc.)? (1 = 0 hrs/wk, 2 = 1-5 hrs/wk, 3 = 6-10 hrs/wk, 4 = 11-15 hrs/wk, 5 = 16-20 hrs/wk, 6 = 21-25 hrs/wk, 7 = 26-30 hrs/wk, 8 = more than 30 hrs/wk) • 0 hours on co-curricular activities: 61% FY vs. 56% seniors, compared to 43% and 46% at Select Peer Institutions. Is this what NSSEville expects?
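A minimal sketch of such a frequency comparison, assuming responses have been exported with the 8-point coding above. The response lists are illustrative, not real NSSE data.

```python
from collections import Counter

def pct_zero_hours(responses):
    """Percent of respondents choosing option 1 (0 hours per week)."""
    return 100.0 * Counter(responses)[1] / len(responses)

# Hypothetical coded responses (1 = 0 hrs/wk ... 8 = more than 30 hrs/wk)
fy_responses = [1, 1, 2, 1, 3, 1, 2, 1, 1, 4]
sr_responses = [1, 2, 1, 3, 1, 2, 1, 5, 1, 2]

print(f"FY reporting 0 hours: {pct_zero_hours(fy_responses):.0f}%")
print(f"SR reporting 0 hours: {pct_zero_hours(sr_responses):.0f}%")
```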

  17. Sample NSSE results: Mean comparisons NSSEville State's score on 1h (working with classmates outside of class) is significantly LOWER than at SELECT PEER institutions for both FY students and seniors
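NSSE's Means Comparison report computes these statistics for you; purely to illustrate what "significantly lower" means here, a two-sample test on invented item-1h responses (assumed to be on the usual 1-4 frequency scale, 1 = never ... 4 = very often) might look like this:

```python
from scipy import stats

home  = [2, 1, 2, 3, 1, 2, 2, 1, 3, 2]   # your institution, item 1h
peers = [3, 2, 3, 4, 2, 3, 3, 2, 4, 3]   # select peer group, item 1h

# Welch's t-test (does not assume equal variances)
t, p = stats.ttest_ind(home, peers, equal_var=False)
print(f"home mean = {sum(home)/len(home):.2f}, "
      f"peer mean = {sum(peers)/len(peers):.2f}, t = {t:.2f}, p = {p:.3f}")
```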

  18. Benchmark Report • Level of Academic Challenge • Active & Collaborative Learning • Student-Faculty Interaction • Supportive Campus Environment • Enriching Educational Experiences

  19. Sample NSSE Results: Benchmark Report NSSEville State strength – significantly higher scores for FY and SR on Supportive Campus Environment

  20. AICAD Consortium Questions • Value in developing consortium-specific questions • Comparison options • Potential data sharing • Establish core questions & others that rotate in

  21. NSSE Use “The NSSE data is one among several pieces of information that is used to organize discussions about enrollment management, curricula, retention, and faculty development.” —Christopher Cyphers, Provost, School of Visual Arts

  22. Using NSSE Data • Context setting – paint a picture of the institution • Evidence of outcomes & processes • Refocus conversation about collegiate quality – provides a lexicon for talking about collegiate quality in an understandable, meaningful way • Benchmarking – longitudinal, criterion, normative • Problem identification – results point to things institutions can do something about, almost immediately • Mobilize action – to change/improve • Helps inform decision-making

  23. Making Sense of Data: Benchmarking Three approaches: • Normative – compares your students' responses to those of students at other colleges and universities • Criterion – compares your school's performance against a predetermined value or level appropriate for your students, given your institutional mission, size, curricular offerings, funding, etc. • Longitudinal – year-to-year comparison of your students to assess improvement
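A toy illustration of the three approaches applied to a single benchmark score; all numbers below are invented, and a real score would come from the institution's NSSE Benchmark Report.

```python
# Compare one benchmark score three ways: against peers (normative),
# against our own target (criterion), and against last year (longitudinal).
def compare(score, peer_mean, target, prior_year):
    print(f"Normative:    {score - peer_mean:+.1f} points vs. peer mean")
    print(f"Criterion:    {score - target:+.1f} points vs. our target")
    print(f"Longitudinal: {score - prior_year:+.1f} points vs. prior year")

compare(score=54.2, peer_mean=56.0, target=55.0, prior_year=52.8)
```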

  24. Benchmarking within the AICAD consortium – Writing Example 1: Assessment Issue: Ensure high-quality writing experiences in the first year. Are we using the writing center/tutors effectively? Relevant NSSE items: 1c, d; 3c, d, e; 11c. These provide student learning process & outcome indicators. NSSE results: First-year students write short papers at rates comparable to AICAD schools, but write fewer medium and long papers, are less likely to prepare 2+ drafts, and report lower gains in writing than AICAD peer schools. Interpretation: Benchmarking with AICAD schools indicates the institution is underperforming; what other data might you gather to assess writing? What first-year writing initiatives might help? What goals might you set for improvement?

  25. Example 2: Benchmarking – longitudinal (performance indicators & co-curricular improvements) Assessment Issue: Maintaining effectiveness and making targeted improvements with upper-division students. Relevant NSSE items: NSSE items identified as key performance indicators; gains items for seniors (11 a-p); and targeted improvements in co-curricular experiences (1h, s, t; 6a; 7b; 10f & diversity scale: 1e, u, v; 10c). NSSE results: Baseline NSSE = 2006; monitor indicators in 2008; assess the impact of co-curricular enhancements & diversity initiatives started in 2006 by comparing NSSE 2006 SR scores to 2008 SR scores. Interpretation: Longitudinal benchmarking (2006-2008); could also benchmark with AICAD schools. Did you meet performance goals? Did the enhancements have an impact?

  26. Multi-year comparison

  27. Using NSSE to “market” AICAD schools • Demonstrate AICAD consortium and institutional strengths (items, NSSE benchmarks) in the undergraduate program • Use results to show mission effectiveness • e.g., gains items (11 a-p) & comparison peers show liberal education gains; use consortium results to focus on the arts school mission • Provide results to prospective students and families • Share results with current students, the development office, and alumni

  28. Institutional Example: NSSE and Enrollment Management • The enrollment management area at Meredith has used NSSE results to help guide its enrollment marketing strategies. Staff look closely at trends and adjust programs and campus visitation days to ensure that students are more cognizant of student involvement and engagement opportunities. • An academic dean reports using NSSE information when speaking to parents at an admissions event: "Parents seemed impressed that there was data to support the points that I was making about what we say about the student/faculty relationships and educational opportunities at Meredith."

  29. Institutional Example: Hanover College • A detailed summary of NSSE is sent to the faculty as well as the Admission and Student Life staffs to ensure the results, both good and bad, are understood by key folks on campus. Last year, Admission requested an additional presentation and discussion of findings to help them better understand the strengths of the Hanover experience and how that impacts student fit.

  30. Helping Students and Families Focus on What Matters to Success • Pocket Guide helps prospective students ask the “right” questions • Good questions to ask of all schools, not just those that participate in NSSE • School counselors can request up to 1,000 free pocket guides per year. Colleges and non-profit education organizations can request up to 300 free copies per year.* *Request via the NSSE Web site

  31. “A Pocket Guide to Choosing a College: Are You Asking the Right Questions…”

  32. Connecting NSSE Data to Accreditation Standards – Example • Accreditation standard: “Demonstrate effectiveness of student academic and social support services” • Evidence for institutional self-study: • Information about availability and student use of tutoring, writing support, peer study groups, counseling services • NSSE indicates FY & SR believe the institution emphasizes spending time studying and support for student success; 79% of seniors tutored or taught peers; positive correlation between peer collaboration outside of class, satisfaction, and first-year retention • Positive student satisfaction data about support services • Area for improvement – seniors report low gains in writing and in completing drafts of papers; the institution responds by examining the writing requirement in the senior capstone and targeting seniors for increased use of the writing center

  33. NSSE and the AICAD consortium • Consider data-sharing agreements • Potential for additional comparison studies; prepare papers/presentations; examine shared concerns (retention, outcomes) • Use the consortium to explore common concerns • Coordinate the survey schedule • Ideas to improve participation rates (incentives, publicity noting that the survey is occurring at other AICAD schools?) • Identify a focus for additional questions (up to 20!) • Develop a stable core of questions? Change focus? A mix? Rotate new questions in?

  34. Administration Details What challenges have you faced in your NSSE administrations? What concerns do you have about your next administration? Questions about the details?

  35. NSSE Timelines (15 mos.) • May • NSSE/FSSE registration opens • September • NSSE/FSSE registration deadline • NSSE materials due two weeks after registration confirmation • October • NSSE population files, oversample, and consortium decisions due • December • FSSE materials and population files due • Mid-January to early February • NSSE administrations open • BCSSE registration begins • Mid-March to early April • FSSE administration opens • June • NSSE & FSSE administrations close • BCSSE administration begins at many campuses • August • Institutional Reports sent, including raw data and printed reports for NSSE, FSSE, and the prior summer’s BCSSE • BCSSE administration continues • September • BCSSE data and reports sent to participating institutions

  36. NSSE Administration • Administration Mode • Paper: we need accurate mailing addresses, letterhead, signatures • Web+: 4x the paper sample; we need e-mail and mailing addresses • Web: 5x the paper sample; we need e-mail addresses
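A back-of-envelope illustration of those multipliers. The base paper sample size depends on school size and is set by NSSE, so the figure below is a placeholder.

```python
# Sample size by administration mode, from the multipliers above.
MODE_MULTIPLIER = {"Paper": 1, "Web+": 4, "Web": 5}

base_paper_sample = 450  # hypothetical base for a given enrollment band
for mode, mult in MODE_MULTIPLIER.items():
    print(f"{mode:>5}: sample of {base_paper_sample * mult} students")
```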

  37. NSSE Administration • Sample Size • Numbers are based on mode and school size • Oversampling can increase sample size or ensure adequate representation of populations of interest

  38. NSSE Administration • Things that we need from you • Contact persons • Campus Project Manager (required) • Campus Administrative Contact (required) • Auxiliary Contact (optional) • Population File • All First-Year and Senior Students • Accurate mailing and/or e-mail addresses • Institutional letterhead and signature file (Paper mode only)

  39. NSSE Administration • Things for you to consider • Broad buy-in from others at your institution (informal word of mouth) • Web-mode institutions: good partnership with the IT department • Consortium

  40. NSSE Administration • Things for you to consider (cont.) • Administration plan • Follow the IRB rules of Indiana University Bloomington • Allowed up to 5 institutional contacts • Promotion plan • Incentive programs • Tips to boost response rates: http://nsse.iub.edu/html/tips.cfm

  41. NSSE: Only one step in the assessment process • Step #1: Survey Data • Survey students • Review results • Develop preliminary list of strengths and opportunities for improvement • Step #2: Feedback • Share results with faculty, administrators & students • Identify themes & priorities • Design action plan • Step #3: Action Plan • Finalize plan • Share plan with appropriate groups • Link to strategic plan • Implement action • Step #4: Follow-up • Use results as benchmarks to monitor progress • Faculty & student focus groups

  42. NSSE in your assessment plan • How often should I administer NSSE? • Every year: gives you a snapshot of each class • Every three years: gives you a picture of a cohort at the beginning and end of their college experience • Every five years: works well with most accreditation cycles (accreditation and interim reports) • Other factors to consider • Establishing a baseline • Costs (using all core surveys) • Additional surveys/sources of data • Time to absorb results and make changes

  43. Updates for 2007 and 2008 • No changes to survey content • Select up to three customized comparison groups on your reports • Electronic report delivery • Executive Summary Report • Pocket Guide Report

  44. Institutional Example: Worcester Polytechnic Institute • NSSE results showed FY students were less engaged than seniors • New FY interdisciplinary, inquiry-based seminars; better integration of disciplines; engaging introductory courses • Associate Dean appointed to the Office for the First Year • Assessment plan in development with NSSE indicators as a key component

  45. Institutional Example: NSSE & Assessing General Education Goals • Used NSSE items 11a-p to assess institutional impact on college-level competencies (a.k.a. indirect measures of student learning outcomes) • Seniors’ 2005 NSSE results confirmed findings from 2004 • Most seniors (75%+) reported that the KSU experience had a “substantial impact” (responses of “very much” or “quite a bit”) on 9 of 16 college-level competencies • KSU rank-ordered the competencies, showing the connection to mission, and compared with other master’s institutions to identify where KSU was significantly higher, comparable, or significantly lower on each competency
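A sketch of this kind of analysis: for each gains item, compute the share of seniors reporting substantial impact, then rank the competencies. The competency labels and responses below are invented; "VM", "QAB", "Some", and "VL" stand in for the four NSSE gains response options.

```python
# Invented responses; a real analysis would use NSSE items 11a-p.
responses = {
    "Writing clearly":      ["VM", "QAB", "Some", "VM", "QAB"],
    "Thinking critically":  ["QAB", "VM", "VM", "QAB", "Some"],
    "Speaking effectively": ["Some", "Some", "QAB", "VL", "QAB"],
}

# Percent answering "very much" or "quite a bit" per competency
substantial = {
    item: 100 * sum(r in ("VM", "QAB") for r in resp) / len(resp)
    for item, resp in responses.items()
}

# Rank-order competencies by reported impact
for item, pct in sorted(substantial.items(), key=lambda kv: -kv[1]):
    print(f"{item:22s} {pct:5.1f}% substantial impact")
```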

  46. Institutional Example: Program Development and Strategic Planning • NSSE results framed a “Sophomore Experience” • 2005 = Pace’s 5th year of participation • Concern regarding sophomore-to-junior persistence; FY results offer context for understanding students’ experience as they enter the sophomore year • Established a “Sophomore Experience Working Group” to investigate whether the FY experience carried over into the sophomore year. Focused on low-scoring NSSE items, conducted focus groups, created a sophomore survey. Led to a pilot of the “Pace Plan” (mentoring), which includes a Career Exploration Course and Sophomore Kick-Off Day • NSSE also used in strategic indicators, accreditation, NCATE, AACSB, faculty development/colloquia; items used by offices (Technology, Multicultural Affairs); studies performed by Enrollment Management

  47. NSSE suite • The “NSSElings” • The Faculty Survey of Student Engagement (2003) • The Beginning College Survey of Student Engagement (2004) • Additional Surveys • The Law Student Survey of Student Engagement • The College Student Experiences Questionnaire • The College Student Expectations Questionnaire • The High School Survey of Student Engagement* • The Community College Survey of Student Engagement* *Not administered by the Center for Postsecondary Research

  48. FSSE • Faculty perceptions of how often their students engage in different activities • Importance faculty place on various areas of learning and development • Nature and frequency of interactions faculty have with students • How faculty members organize class time
