
Presentation Transcript


  1. Using Student Engagement Survey Data to Inform Schoolwide Programs and Strategies: Why Student Engagement Matters and How to Measure It. September 10, 2014, Lexington, Kentucky

  2. Welcome and Overview Lydotta Taylor, Research Alliance Lead, REL Appalachia

  3. What is a REL? • A REL is a Regional Educational Laboratory. • There are 10 RELs across the country. • The REL program is administered by the U.S. Department of Education, Institute of Education Sciences (IES). • A REL serves the education needs of a designated region. • The REL works in partnership with the region’s school districts, state departments of education, and others to use data and research to improve academic outcomes for students.

  4. What is a REL?

  5. REL Appalachia’s mission • Meet the applied research and technical assistance needs of Kentucky, Tennessee, Virginia, and West Virginia. • Conduct empirical research and analysis. • Bring evidence-based information to policymakers and practitioners: • Inform policy and practice – for states, districts, schools, and other stakeholders. • Focus on high-priority, discrete issues and build a body of knowledge over time. http://www.RELAppalachia.org Follow us! @REL_Appalachia

  6. Overview of today’s workshop • Clarifying the Student Engagement Construct. • Best Practices for Survey Data Collection and General Data Management. • Selecting an Instrument for Measuring Student Engagement. • Action Planning for Next Steps. • Wrap-up and Closing Remarks. • Stakeholder Feedback Survey.

  7. Workshop goals • Understand the student engagement construct. • Select an appropriate instrument for measuring student engagement. • Gain an understanding of best practices for survey data collection and general data management.

  8. What is Student Engagement? Jerry Johnson, Ed.D., Associate Professor, University of North Florida

  9. Discussion questions • What does the term “student engagement” mean to you? • Is student engagement a concern in your school or district? • In what ways and to what extent is your school or district working to promote student engagement? • What more would you like to know about student engagement, both generally and within your student population?

  10. What is student engagement? • Researchers and theorists have defined student engagement in various ways, but there is consistency around key ideas. • A broad conceptual definition that reflects those varied perspectives: Student engagement is a measure of the extent to which a student willingly participates in schooling activities. (Appleton, Christenson, & Furlong, 2008) • There is consensus among researchers and theorists that student engagement is a multidimensional construct with four elements (Fredricks et al., 2011): • Academic engagement. • Affective engagement. • Behavioral engagement. • Cognitive engagement.

  11. What does research say about student engagement? • Student engagement is closely associated with desirable schooling outcomes (higher attendance, higher academic achievement, fewer disciplinary incidents, lower dropout rates, higher graduation rates). (Appleton et al., 2008; Finn, 1989, 1993; Fredricks, Blumenfeld, & Paris, 2004; Jimerson, Campos, & Greif, 2003; Jimerson, Renshaw, Stewart, Hart, & O’Malley, 2009; Shernoff & Schmidt, 2008) • Student engagement is closely associated with general measures of well-being (lower rates of health problems, lower rates of high-risk behaviors). (Carter, McGee, Taylor, & Williams, 2007; McNeely, Nonnemaker, & Blum, 2002; Patton et al., 2006) • Student engagement levels can be effectively influenced through school-based interventions. (Appleton, Christenson, Kim, & Reschly, 2006; Christenson et al., 2008; Fredricks et al., 2004)

  12. How are student engagement measures used? • Research purposes: • Research on motivation and cognition. • Research on dropping out. • Evaluation of interventions. • Diagnosis and monitoring: • Student level. • Teacher, school, or district level. • Needs assessment. (Fredricks et al., 2011)

  13. Basics of Noncognitive Measures

  14. Overview • Noncognitive measures. • Basics of measurement: • “Poor” versus “good” measures. • Reliability. • Validity.

  15. Noncognitive measures: Purpose • Cognitive measures test what people know or can do. • Noncognitive measures capture an individual’s attitude, behavior, characteristic, value, or interest. Cognitive example: Radiocarbon dating CANNOT be used on the remains of: • Plants • Minerals • Warm-blooded animals • Cold-blooded animals Noncognitive example: I enjoy studying science: • Strongly disagree • Disagree • Neutral • Agree • Strongly agree

  16. Noncognitive measures: An example • Objectives: • (Cognitive objective) Increase knowledge in a science course: • Give a test. • Score answers as right or wrong. • (Noncognitive objective) Increase positive attitudes toward science: • Administer a survey measuring student attitudes toward math and science. • Cognitive and noncognitive objectives call for different types of measures.

  17. Noncognitive measures: How they are constructed • Likert responses. • Typically 5 to 7 response options. • Cannot be scored as correct or incorrect. • Use total score or subscale scores. • Overall attitudes toward science. • Attitudes toward each type of science (biology, chemistry, physics). • Some items will be reverse scored, e.g., “I am tired of learning about science.” (See the scoring sketch below.)
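
To make the scoring mechanics concrete, here is a minimal sketch in Python (pandas) of reverse scoring and totaling Likert items. The item names, response data, and 5-point scale are illustrative assumptions, not items from any instrument discussed in this workshop.

```python
import pandas as pd

# Hypothetical responses to four 5-point Likert items
# (1 = Strongly disagree ... 5 = Strongly agree).
responses = pd.DataFrame({
    "enjoy_science":     [5, 4, 2],
    "like_experiments":  [4, 4, 1],
    "science_is_useful": [5, 3, 2],
    "tired_of_science":  [1, 2, 5],  # negatively worded item
})

# Reverse score the negatively worded item: on a 1..5 scale,
# (5 + 1) - x maps 1 -> 5, 2 -> 4, ..., 5 -> 1.
SCALE_MAX = 5
responses["tired_of_science"] = (SCALE_MAX + 1) - responses["tired_of_science"]

# Total score across items; higher = more positive attitude toward science.
responses["total"] = responses.sum(axis=1)
print(responses)
```

Subscale scores (e.g., attitudes toward biology versus chemistry) would be computed the same way, summing only the items assigned to each subscale.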

  18. Noncognitive measures: Potential problems • Response set. • ABCDABCDABCD… • Social desirability. • “I would enjoy studying science.” • Difficulty defining construct. • Are we really measuring attitudes? • Maybe I respond positively to the above items because I enjoy studying anything, and not necessarily science in particular.

  19. Basics of measurement I have objectives and know where to find instruments…Now what? • Examine instruments: • Do they measure your objectives? • Are any suitable for your purposes? • Do they cost money to administer or score? • Are they difficult to administer? • How long do they take to complete? • How many items do they contain? • Is there any reliability or validity information? • Any issues with the items? • “I like science when it’s easy.” • “I don’t like science now, but I probably will in the future.” • “Science uses the scientific method.” • “I like science, just not biology.”

  20. Basics of measurement “Poor” versus “good” measures

  21. Basics of measurement: Example of a poor measure • Scientific Attitudes Inventory II. • 40 items. • 6 subscales. • 1 (“Strongly disagree”) through 6 (“Strongly agree”)

  22. Basics of measurement: Example of a poor measure Please answer the following items (R indicates reverse scored): • A scientist must have a good imagination to create new ideas. • (R) A major purpose of science is to produce new drugs and save lives. • Ideas are the important result of science. • (R) Science tries to explain how things happen. • (R) Electronics are examples of the really valuable products of science. • A major purpose of science is to help people live better.

  23. Basics of measurement: Example of a good measure • Definitions: Engagement versus Disaffection with Learning • Engagement: enthusiastic and emotionally positive in interactions. • Disaffection: apathetic withdrawal and frustrated alienation. • Sample items: • Behavioral Engagement: “When I’m in class, I listen very carefully.” • Behavioral Disaffection: “When I’m in class, I just act like I’m working.” • Emotional Engagement: “I enjoy learning new things in class.” • Emotional Disaffection: “When we work on something in class, I feel discouraged.” (Skinner et al., 2009)

  24. Basics of measurement: Reliability of a measure • Consistency of measurement. • Test-retest reliability. • Internal consistency reliability. • Alternate or parallel forms reliability. • Cronbach’s alpha as a measure of internal consistency. • Ranges from 0.00 to 1.00. • The higher the value, the greater the reliability. • Reliability depends on: • The sample of respondents. • Length of the measure (number of items).
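
The sketch below illustrates how Cronbach’s alpha is computed from item-level data. The formula, alpha = k/(k−1) × (1 − sum of item variances / variance of the total score), is the standard one; the function name and sample data are made up for illustration.

```python
import pandas as pd

def cronbach_alpha(items: pd.DataFrame) -> float:
    """Internal consistency: alpha = k/(k-1) * (1 - sum(item var) / var(total))."""
    k = items.shape[1]
    item_variances = items.var(axis=0, ddof=1).sum()  # variance of each item, summed
    total_variance = items.sum(axis=1).var(ddof=1)    # variance of respondents' totals
    return (k / (k - 1)) * (1 - item_variances / total_variance)

# Hypothetical data: five respondents, four 5-point Likert items.
items = pd.DataFrame([
    [4, 5, 4, 4],
    [3, 3, 2, 3],
    [5, 5, 4, 5],
    [2, 2, 3, 2],
    [4, 4, 4, 3],
])
print(f"alpha = {cronbach_alpha(items):.2f}")
```

Consistent with the slide, alpha computed this way generally rises as items are added and will vary from one sample of respondents to another.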

  25. Basics of measurement: Validity of a measure • Accuracy of measurement: Is it measuring what it claims to measure? • Face validity as a first step. • Construct validity = how well the item or items capture the underlying concept. • When single items are sufficient. • When summated scales are necessary. • Additional subtypes of validity: • Predictive validity. • Convergent validity.

  26. Basics of measurement: Reliability and validity • Item responses are reliable but NOT valid: we’re consistently measuring something, but what is it? • Item responses are reliable AND valid. • Item responses are neither reliable nor valid.

  27. Basics of measurement: Validity versus reliability • Validity as accuracy. • Reliability as precision. • Ideally, a measure should be valid and reliable.

  28. Administering the Student Engagement Survey

  29. Preparing to administer the survey • Identify a district point of contact (POC). • Identify a school survey administrator (this could be the same person as the POC).

  30. Data collection • REL Appalachia staff will upload the selected instrument to an online survey system and send the URL to the district POC. • The survey will include a field for student name or district student ID (district preference). • District POC forwards the survey URL to the school survey administrator. • Schools administer the survey to students. • District POC adds student background characteristics to the completed survey data file using the student name or district student ID. • REL Appalachia will provide a list of requested student background characteristics (e.g., race/ethnicity, grade level, limited English proficiency, IEP status). • District POC removes the student name or district student ID and sends the survey data and corresponding background characteristics to REL Appalachia. (A sketch of this merge-and-de-identify step follows below.)
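
As referenced above, here is a minimal pandas sketch of the POC’s merge-and-de-identify step. The file names and column names are assumptions for illustration; the actual export format is determined by the online survey system.

```python
import pandas as pd

# Hypothetical files: the survey export keyed by district student ID,
# and a district roster holding the requested background characteristics.
survey = pd.read_csv("survey_responses.csv")  # includes a "student_id" column
roster = pd.read_csv("district_roster.csv")   # student_id, race_ethnicity, grade,
                                              # lep_status, iep_status

# Step 1: attach background characteristics to each completed survey.
merged = survey.merge(roster, on="student_id", how="left")

# Step 2: drop the identifier so the file sent to REL Appalachia contains
# background characteristics but no student names or IDs.
merged.drop(columns=["student_id"]).to_csv("survey_deidentified.csv", index=False)
```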

  31. Why are student background data important? • Without student background data, can only report student engagement at the school level. • Need student background data to disaggregate results by key student subgroups. • Grade level • Race/ethnicity • English proficiency • IEP • Can tailor student engagement programs or interventions to particular types of students.
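
A short illustration of why those background columns matter: once merged in, results can be disaggregated by subgroup with a simple groupby. The column names (engagement_score, grade, race_ethnicity, lep_status, iep_status) are assumptions carried over from the sketch above.

```python
import pandas as pd

# Hypothetical de-identified file produced in the previous sketch.
data = pd.read_csv("survey_deidentified.csv")

# A schoolwide average alone hides subgroup differences...
print("Schoolwide mean engagement:", data["engagement_score"].mean())

# ...while disaggregating shows which groups may need tailored programs.
for subgroup in ["grade", "race_ethnicity", "lep_status", "iep_status"]:
    print(data.groupby(subgroup)["engagement_score"].agg(["mean", "count"]))
```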

  32. Survey follow up • Data analyses will be conducted by REL Appalachia staff. • REL Appalachia staff will prepare data memos summarizing the results. • Debriefing session on survey results and implications. • Review results disaggregated by student subgroups. • Review schoolwide strategies and programs for promoting student engagement (as reported in the extant literature). • Develop school- and/or district-level action plans.

  33. Survey ethics: consent and assent forms • Parental consent forms. • Send home with students. • Must be obtained before data collection. • Do not allow students to fill out the survey without parental consent. • Assign an alternate computer-based activity during survey administration. • Student assent forms. • Read the script aloud to students. • The script will also appear on the landing page of the online survey.

  34. Missing data: why response rates matter • Non-response bias. • Item- and instrument-level non-response. • Occurs when the answers of respondents differ from the way non-respondents might have answered. • Monitoring response rates (see the sketch below). • Labels remaining on sheets sent to schools. • “Analyze Data” tab in SurveyMonkey. • Goal is 80% or higher. • Incentives for participation. • Distinguishing between diligent follow-up and harassment of respondents.
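
To make the monitoring step concrete, here is a sketch of computing overall and per-school response rates against the 80 percent goal named above. The roster and response files, and their "school" column, are hypothetical.

```python
import pandas as pd

roster = pd.read_csv("district_roster.csv")      # one row per enrolled student
responses = pd.read_csv("survey_responses.csv")  # one row per completed survey

# Overall response rate against the 80% goal.
rate = len(responses) / len(roster)
print(f"Overall response rate: {rate:.0%} (goal: 80% or higher)")

# Per-school rates show where follow-up is needed (and where it is not),
# which helps keep follow-up diligent rather than harassing.
per_school = (responses["school"].value_counts()
              .div(roster["school"].value_counts())
              .fillna(0)  # schools with no responses yet
              .sort_values())
print(per_school)
```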

  35. Break

  36. Selecting an Instrument for Measuring Student Engagement

  37. IES Issues & Answers Report (available online) http://ies.ed.gov/ncee/edlabs/regions/southeast/pdf/REL_2011098.pdf

  38. Overview of the report • Purpose: “to describe 21 instruments for measuring student engagement in upper elementary through high school as identified through a literature review.” • Content and structure: • Definitions, instrument types, psychometric properties. • Instrument abstracts. • Tables for comparing instrument attributes (e.g., developer/availability, engagement dimensions assessed, intended purposes/uses). • Potential uses for stakeholders: • Introducing stakeholders to the instruments that are available. • Assisting stakeholders in determining the appropriateness of various available measures.

  39. How is student engagement measured? • Three primary data collection strategies: • Student self-reports. • Teacher reports on students. • Observational measures. • Three dimensions of student engagement: • Behavioral — the student’s involvement in academic, social, and extracurricular activities. • Affective/emotional — the extent of the student’s positive [and negative] reactions to teachers, classmates, academics, and school. • Cognitive — the student’s level of investment in his/her learning. Note: Academic engagement is measured using traditional outcome data, such as student achievement results. (Fredricks et al., 2011)

  40. Questions to guide the selection process • What type(s) of student engagement do you want to better understand? • Behavioral? • Affective/emotional? • Cognitive? • All, or a combination of the above? • In what specific ways do you hope to use the information gained from this process? • For monitoring of individual students? • For needs assessments at the classroom, grade, school, or district level? • Both?

  41. Questions to guide the selection process (continued) • What type of data collection strategy appeals to you? • Student self-report survey? • Teacher report survey? • Observation using a protocol? • How much time are you willing to allocate to data collection? • What human resources are available in your school/district to facilitate the data collection process?

  42. Two example surveys – Handout 1 • Student Engagement Instrument (SEI) Appleton, J.J., Christenson, S.L., Kim, D., & Reschly, A.L. (2006). Measuring cognitive and psychological engagement: Validation of the Student Engagement Instrument. Journal of School Psychology, 44, 427-445. • High School Survey of Student Engagement Center for Evaluation and Education Policy, Indiana University

  43. Activity #1: Select a student engagement survey • Break into small groups of 2-3. • Generate responses to the five “Questions to Guide the Selection Process” in Handout 2. • Each small group will then share and explain its responses, allowing for questions. • Finally, as a whole group, we will determine which instrument best addresses the needs and objectives of the group as a whole.

  44. Lunch

  45. Action Planning for Next Steps

  46. Activity #2: Plan for survey administration • Work in a school or district team (or as an individual representative of your school or district). Please see Handout 3 and complete the table to develop a tentative plan of action for activities associated with data collection.

  47. What happens next • Workshop attendees • Steps for survey administration summarized in Handout #4. • Please contact Michael Flory at FloryM@cna.org to let us know you will participate in this project. • REL Appalachia • Provide URL for survey instrument to the district POC. • Analyze survey data following administration by schools. • Prepare data memo summarizing results. • Discuss results and implications.

  48. Wrap-Up and Closing Remarks: Stakeholder Feedback Survey Lydotta Taylor, Research Alliance Lead, REL Appalachia

  49. Connect With Us! www.relappalachia.org @REL_Appalachia Patty Kannapel Kannapelp@cna.org Michael Flory Florym@cna.org

  50. Sources cited Appleton, J. J., Christenson, S. L., & Furlong, M. J. (2008). Student engagement with school: Critical conceptual and methodological issues of the construct. Psychology in the Schools, 45, 369–386. Appleton, J. J., Christenson, S. L., Kim, D., & Reschly, A. L. (2006). Measuring cognitive and psychological engagement: Validation of the Student Engagement Instrument. Journal of School Psychology, 44, 427–445. Carter, M., McGee, R., Taylor, B., & Williams, S. (2007). Health outcomes in adolescence: Associations with family, friends, and school engagement. Journal of Adolescence, 30, 51–62. Christenson, S. L., Reschly, A. L., Appleton, J. J., Berman, S., Spangers, D., & Varro, P. (2008). Best practices in fostering student engagement. In A. Thomas & J. Grimes (Eds.), Best practices in school psychology V (pp. 1099–1120). Washington, DC: National Association of School Psychologists. Finn, J. D. (1989). Withdrawing from school. Review of Educational Research, 59, 117–142. Finn, J. D. (1993). School engagement and students at risk. Washington, DC: National Center for Education Statistics. Fredricks, J. A., Blumenfeld, P. C., & Paris, A. H. (2004). School engagement: Potential of the concept, state of the evidence. Review of Educational Research, 74(1), 59–109. Fredricks, J., McColskey, W., Meli, J., Mordica, J., Montrosse, B., & Mooney, K. (2011). Measuring student engagement in upper elementary through high school: A description of 21 instruments (Issues & Answers Report, REL 2011–No. 098). Washington, DC: U.S. Department of Education, Institute of Education Sciences, National Center for Education Evaluation and Regional Assistance, Regional Educational Laboratory Southeast. Retrieved September 13, 2013, from http://ies.ed.gov/ncee/edlabs. Jimerson, S., Campos, E., & Greif, J. (2003). Toward an understanding of definitions and measures of school engagement and related terms. The California School Psychologist, 8, 7–27. Jimerson, S., Renshaw, T., Stewart, K., Hart, S., & O’Malley, M. (2009). Promoting school completion through understanding school failure: A multi-factorial model of dropping out as a developmental process. Romanian Journal of School Psychology, 2, 12–29.
