"Developing Surveys to Measure Student Satisfaction and Learning Outcomes"

Presentation Transcript


  1. "Developing Surveys to Measure Student Satisfaction and Learning Outcomes“ Nathan Lindsay & Larry BunceFebruary 19, 2014 www.campuslabs.com/blog @CampusLabsCo #labgab Like us on Facebook!

  2. There are many types of surveys to consider… Satisfaction Surveys, Learning Outcome Surveys, Needs Assessments, Exit Surveys, Alumni Surveys, User Surveys, Non-User Surveys, Student/Faculty/Staff/General Public Surveys, Other?

  3. Steps in survey design

  4. Begin with the end in mind… What do you want/need to show? Why do you need to show it? Who is the source of your data? How will you use the data? Who will need to see results?

  5. The purpose of this assessment is… • To better understand what the needs of our veteran students are, and how the new Veteran’s Center can meet them • To evaluate if students achieved the stated learning outcomes of our workshop, and what additional training needs they have • To demonstrate to stakeholders the impact that living in the residence halls has on student development • To assess student awareness of services in order to develop our marketing and communications plan

  6. Examine past assessments "Those who cannot remember the past are condemned to repeat it." (George Santayana, 1905) • Did you use the data? • If not, what kept you from examining it? • If you used the data… • What was useful? • Was any of the data difficult to analyze? • Were there questions you wished you had asked? • Did any question wording make you unsure of what the data meant? • What feedback did you receive from those who participated?

  7. Select an appropriate method

  8. Quantitative vs. Qualitative
     Quantitative: • Focus on numbers/numeric values • Who, what, where, when • Match with outcomes about knowledge and comprehension (define, classify, recall, recognize) • Allows for measurement of variables • Uses statistical data analysis • May be generalized to a greater population with larger samples • Easily replicated
     Qualitative: • Focus on text/narrative from respondents • Why, how • Match with outcomes about application, analysis, synthesis, and evaluation • Seeks to explain and understand • Ability to capture "elusive" evidence of student learning and development

  9. Sampling
     Population – The whole group. Example: survey goes to the entire campus. (If the entire campus: use sparingly and coordinate with Institutional Research.)
     Sample – A subsection of that group. Example: survey goes to 30% of campus.
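
Drawing the sample named above can be scripted. A minimal sketch in Python, assuming a hypothetical roster of student IDs, an illustrative campus of 20,000, and a 30% simple random sample:

```python
import random

# Hypothetical roster of student IDs; in practice this would come from
# Institutional Research or the registrar.
population = [f"student_{i:05d}" for i in range(1, 20001)]  # 20,000 students

sample_fraction = 0.30                       # "survey goes to 30% of campus"
sample_size = round(len(population) * sample_fraction)

random.seed(42)                              # fixed seed so the draw is reproducible
sample = random.sample(population, sample_size)

print(f"Population: {len(population)}, sample drawn: {len(sample)}")
```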

  10. Sampling strategies

  11. Sample suggestions (based on a 5% margin of error; suggestions from Assessing Student Learning by Linda Suskie) Sample size is the desired number of respondents, NOT the number of individuals invited to participate.
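
The 5% margin of error behind these suggestions can also be computed directly. A minimal sketch, assuming a 95% confidence level, the conservative p = 0.5, and a finite population correction (the campus sizes in the loop are illustrative):

```python
import math

def required_sample_size(population_size, margin_of_error=0.05,
                         confidence_z=1.96, proportion=0.5):
    """Desired number of respondents for a given margin of error at 95%
    confidence, using the conservative p = 0.5 and a finite population
    correction."""
    n0 = (confidence_z ** 2) * proportion * (1 - proportion) / margin_of_error ** 2
    return math.ceil(n0 / (1 + (n0 - 1) / population_size))

for campus_size in (500, 1_000, 5_000, 20_000):
    print(campus_size, required_sample_size(campus_size))
```

For a campus of 20,000 this yields roughly 377 desired respondents, in line with commonly published sample-size tables; remember this is the number of completed surveys, not invitations.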

  12. Direct vs. Indirect Methods
     Indirect Methods – Any process employed to gather data which asks subjects to reflect upon their knowledge, behaviors, or thought processes. e.g., "I know where to go on campus if I have questions about which courses to register for in the fall." • Strongly agree • Moderately agree • Neither agree nor disagree • Moderately disagree • Strongly disagree
     Direct Methods – Any process employed to gather data which requires subjects to display their knowledge, behavior, or thought processes. e.g., "Where on campus would you go, or who would you consult with, if you had questions about which courses to register for in the fall?"

  13. Formative vs. Summative Assessment
     Formative – • Conducted during the program • Purpose is to provide feedback • Use to shape, modify, or improve the program
     Summative – • Conducted after the program • Makes judgment on quality or worth, or compares to a standard • Can be incorporated into future plans

  14. Is a survey right for you? Pros: • Include large numbers • Relatively fast and easy to collect data • Lots of resources available • Requires minimal resources • Fast to analyze • Good for surface level or basic data Cons: • Survey fatigue and response rates • Not responsive in nature • Limited in type of questions asked • Lacks depth in data • Requires skill in both designing questions and analyzing data properly

  15. Focus Groups • Group discussions where the facilitator supplies the topics and monitors the discussion. • The purpose is to gather information about a specific (or focused) topic in a group environment, allowing for discussion and interaction by participants. • Similar to interviews, but use when the group interaction will contribute to a richer conversation

  16. Is a focus group right for you? Pros: • Helps to understand perceptions, beliefs, thought processes • Small number of participants • Focus groups encourage group interaction and building upon ideas • Responsive in nature • Relatively low costs involved Cons: • Getting participants (think of times/places) • Data collection and analysis takes time • Data is only as good as the facilitator • Beware of bias in analysis and reporting • Meant to tell a story; may not help if numbers are needed • Data is not meant to be generalizable

  17. Quick, 1-Minute Assessments On a notecard, write a real-world example of how you can apply what you learned. Pass an envelope containing notecards with quiz questions. Students pick one and have 60 seconds to answer and pass along. At the end of a workshop, ask students to write down 1 thing they learned, and 1 lingering question.

  18. Is a quick assessment right for you? Pros: • Provides a quick summary of takeaways from the student perspective • Quickly identifies areas of weakness and strength for formative assessment • Can track changes over time (short-term) • Non-verbal (provides classroom feedback from all students) • Captures student voice • Short time commitment • Provides immediate feedback Cons: • Not responsive in nature • Short (so you may lose specifics) • Sometimes hard to interpret • Need very specific prompts in order to get "good" data • Must plan logistics ahead of time and leave time during the program/course • May need to be collected over time

  19. Mixed Methods

  20. Identify ethical/logistical considerations • Do you have the necessary resources and brain power? • Do you need to go through IRB? • Do you need to identify respondents for follow up, merging of data, tracking of cohorts, or pre/post analysis? • Do you need to include demographic questions to drill down or separate data? • Who needs to be involved at planning stage to avoid problems when results are in? Does anyone need to approve the project? • Are there any political issues to be aware of?

  21. What to consider • Scales that match • Mutually exclusive • Exhaustive • Neutral/Not applicable/Non-response options (e.g., Choose not to respond, Don't know, Not applicable, Unable to judge, No opinion, Neutral, Neither ___ nor ___)

  22. Pairing Question Text with Answer Choices Question text should be compatible with the answer choices. An agreement scale (Strongly agree / Somewhat agree / Somewhat disagree / Strongly disagree) is a mismatch for questions such as "How satisfied were you with the following?" or "Did you enjoy the Black History Month speaker?"

  23. Mutually Exclusive Answer Choices Response options should never overlap e.g., How many hours per week do you work? 0-10 / 10-20 / 20-30 / 30-40 (the ranges overlap at 10, 20, and 30). Response options should exist independently of one another e.g., Which of the following statements describes your peer mentor? He/she is helpful and supportive / He/she is difficult to get a hold of

  24. Exhaustive Answer Choices Respondents should always be able to choose an answer e.g., How often do you use the University website? Daily / 2-3 times a week / Weekly / Monthly (a respondent who uses it less than monthly, or never, has no option to choose)
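
Mutual exclusivity and exhaustiveness can be checked mechanically for numeric answer ranges before the survey goes out. A minimal sketch, assuming corrected, whole-hour ranges for the "hours per week" question above (the 0-9 / 10-19 / … options and the 168-hour weekly cap are assumptions for illustration):

```python
# Hypothetical answer ranges for "How many hours per week do you work?"
# Each option is (low, high) in whole hours, inclusive.
options = [(0, 9), (10, 19), (20, 29), (30, 39), (40, 168)]

def check_options(ranges, domain=range(0, 169)):
    """Count how many options each possible answer falls into."""
    matches = {value: sum(low <= value <= high for low, high in ranges)
               for value in domain}
    overlapping = [v for v, n in matches.items() if n > 1]  # violates mutual exclusivity
    uncovered = [v for v, n in matches.items() if n == 0]   # violates exhaustiveness
    return overlapping, uncovered

overlapping, uncovered = check_options(options)
print("Overlapping values:", overlapping)  # [] means the options are mutually exclusive
print("Uncovered values:", uncovered)      # [] means the options are exhaustive
```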

  25. Non-response options Always consider a non-response option: • Choose not to respond • Don't know • Not applicable • Unable to judge • No opinion • Neutral • Neither ___ nor ___ Customize the non-response option when possible e.g., How would you rate the leadership session? Excellent / Good / Fair / Poor / Did not attend

  26. Pitfalls to avoid Socially desirable responding – based on social norms • Can never be eliminated • Consider sensitive topics like race, drug and alcohol use, sexual activity, and other areas with clear social expectations Leading questions – suggesting there is a correct answer e.g., “Why would it be good to eliminate smoking on campus?” Double-barreled questions – asking more than one question e.g., “What were the strengths and weaknesses of orientation?” Double negatives – including negative phrasing which makes responding difficult e.g., “I do not feel welcome in my residence hall.”

  27. Response Formats • Open ended responses • Free response - text • Numeric • Yes/No with please explain • Types of multiple choice responses • Yes/No • Single response • Multiple response (e.g., Check all that apply, Select 3) • Ranking • Scales

  28. Yes/No When to Use: • There is no response between “Yes” and “No” e.g., “Have you ever lived on campus?” • You consciously want to force a choice even if other options might exist e.g., “Would you visit the Health Center again?” When Not to Use: • There could be a range of responses e.g., “Was the staff meeting helpful?”

  29. Single response When to Use: • All respondents would only have one response e.g., “What is your class year?” • You consciously want to force only one response e.g., “What is the most important factor for improving the Rec Center?” When Not to Use: • More than one response could apply to respondents e.g., “Why didn’t you talk to your RA about your concern?”

  30. Multiple response Options: “Check all that apply” or “Select (N)” When to Use: • More than one answer choice might be applicable e.g., “How did you hear about the Cultural Dinner?” (Check all that apply) • You want to limit/force a certain number of responses e.g., “What were your primary reasons for attending?” (Select up to 3) When Not to Use: • It’s important for respondents to only be associated with one response e.g., “What is your race/ethnicity?”

  31. Ranking When to Use: • You want to see the importance of items relative to one another e.g., “Please rank how important the following amenities are to you in your residence: (1=most important)” • You are prepared to do the analysis and interpretation! When Not to Use: • You want to see the individual importance of each item e.g., “How important are the following amenities to you?”
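
Being "prepared to do the analysis" for ranking items usually means summarizing with mean ranks or rank distributions rather than simple percentages, since ranks are only meaningful relative to one another. A minimal sketch, assuming hypothetical rankings of four residence-hall amenities (1 = most important):

```python
from statistics import mean

# Hypothetical ranking data: each respondent ranks all four amenities.
responses = [
    {"laundry": 1, "study rooms": 2, "kitchen": 3, "gym": 4},
    {"study rooms": 1, "laundry": 2, "gym": 3, "kitchen": 4},
    {"laundry": 1, "gym": 2, "study rooms": 3, "kitchen": 4},
]

amenities = responses[0].keys()
mean_ranks = {item: mean(r[item] for r in responses) for item in amenities}

# Lower mean rank = more important relative to the other items.
for item, rank in sorted(mean_ranks.items(), key=lambda kv: kv[1]):
    print(f"{item}: mean rank {rank:.2f}")
```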

  32. Scales When to Use: • You want to capture a range of responses e.g., “How satisfied were you with your meeting?” • When you would like statistics e.g., 4 = strongly agree 3 = agree 2 = disagree 1 = strongly disagree When Not to Use: • The question is truly a Yes/No question e.g., “My mother has a college degree.”
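
When a scale is chosen because "you would like statistics," the usual first step is to map the labels to the numeric values shown above and summarize. A minimal sketch, assuming the 4-point agreement scale from this slide and a handful of hypothetical responses:

```python
from collections import Counter
from statistics import mean

scale = {"Strongly agree": 4, "Agree": 3, "Disagree": 2, "Strongly disagree": 1}

# Hypothetical responses to a single agreement item.
responses = ["Strongly agree", "Agree", "Agree", "Disagree",
             "Strongly agree", "Agree", "Strongly disagree"]

counts = Counter(responses)
scores = [scale[r] for r in responses]

print("Distribution:", dict(counts))
print(f"Mean score: {mean(scores):.2f} (4 = strongly agree, 1 = strongly disagree)")
print(f"Percent agreeing: {100 * sum(s >= 3 for s in scores) / len(scores):.0f}%")
```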

  33. Scales Bipolar – positive or negative (with or without a midpoint): Very safe / Somewhat safe / Somewhat unsafe / Very unsafe. Unipolar – no negative: A great deal / Considerably / Moderately / Slightly / Not at all. Consider… • Number of points • Inclusion of a neutral option • Whether labels are needed • Order (e.g., 1, 2, 3, 4, 5 or 5, 4, 3, 2, 1)

  34. Recommended Scales

  35. Note the reason for each data point • Bubble next to question • Compare against purpose to identify gaps • Look for overlap • Eliminate “nice to know” • Help with ordering • Retain for analysis step

  36. DATA COLLECTION METHODS

  37. Paper surveys Things to consider • Captive audience • Administrator available for questions • No technology issues or benefits • Data entry necessary

  38. Web surveys Things to Consider: • No data entry • Technology issues and benefits • Immediate results • Can be anonymous or identified • Not a captive audience

  39. Data Collection Methods

  40. SURVEY FATIGUE

  41. General information • Survey response rates have been falling • Difficult to contact people • Refusals to participate increasing • Strategies for correcting low response rates: • Weight the data for non-response • Implement strategies to increase response rates

  42. Non-response may not be random • Correlation exists between demographic characteristics and survey response • Higher response has been found among certain sub-populations: • Women • Caucasians • High academic ability • Living on campus • Math or science majors • Research is inconsistent
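
Because response propensity correlates with demographics like those listed above, the "weight the data for non-response" strategy from the previous slide is often implemented as post-stratification weighting. A minimal sketch, assuming hypothetical on-/off-campus shares (population shares would come from Institutional Research, respondent shares from the survey itself):

```python
# Post-stratification weight = population share / respondent share.
# The shares below are hypothetical, for illustration only.
population_share = {"on campus": 0.40, "off campus": 0.60}
respondent_share = {"on campus": 0.55, "off campus": 0.45}  # on-campus students over-responded

weights = {group: round(population_share[group] / respondent_share[group], 2)
           for group in population_share}

print(weights)  # {'on campus': 0.73, 'off campus': 1.33}

# Each respondent's answers are then multiplied by their group's weight, so
# weighted estimates reflect the campus mix rather than the respondent mix.
```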

  43. IMPROVING RESPONSE RATES

  44. Specific techniques • Survey length • Preannouncement • Invitation text • Reminders • Timing of administration • Incentives • Confidentiality statements • Salience • Request for help • Sponsorship • Deadlines
