
ASNNA 2019 Census of Intervention, Evaluation and Reporting Activities

ASNNA 2019 Census of Intervention, Evaluation and Reporting Activities. ASNNA & University of Colorado Denver. Jini Puma, PhD. 2019 ASNNA Conference, Arlington, Virginia, February 4-7.




Presentation Transcript


  1. ASNNA 2019 Census of Intervention, Evaluation and Reporting Activities ASNNA & University of Colorado Denver Jini Puma, PhD 2019 ASNNA Conference Arlington, Virginia February 4-7th

  2. Thank You! • Sue Foerster (Emeritus), Kimberly Keller (MO), Pamela Bruno (ME), Sue Sing Lim (KS), Jennie Quinlan (CO), Deanna LaFlamme (CO) – the 2019 Census Project Workgroup • Max Young, Star Morrison & the rest of the 2017 Census Project Workgroup members • ASNNA respondents

  3. For FFY 2019, are SIAs planning more comprehensive programming approaches and/or evaluations, especially for longer-term indicators and those in the outer spheres of the Framework? • Does SIAs' intent to impact and evaluate the Framework indicators vary by region? • How do these results compare to the 2017 Census results, which captured the time when the Framework was introduced? Goal of the Census

  4. 2019 Census

  5. The targeted respondents for the survey were the program directors of the SIAs (Total = 143). • Names and email addresses of the program directors were identified • Census survey respondents were asked to use their FFY 2019 state plan and its evaluation activities to inform their responses • For the most part, the survey content was the same as in 2017. 2019 Census Methodology

  6. The ASNNA listserv was used to publicize the Census. • The survey was administered via a REDCap (Research Electronic Data Capture) survey link in October 2018. • The survey data collection protocol followed the Tailored Design Method (Dillman, 2007) and included: • an advance-notice e-mail, • followed by a survey link e-mailed one week after the notice, • up to 3 follow-up e-mails (one per week) with the survey link sent to non-responders. • After 4 e-mailed survey attempts, any non-responders were contacted personally by ASNNA colleagues or UCD staff and encouraged to participate in the survey. 2019 Census Methodology
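The contact protocol above can be sketched as a simple timeline. This is an illustrative sketch only: the survey opened in October 2018 and reminders were weekly per the slide, but the exact start date and the timing of the final personal contact are assumptions, not details from the presentation.

```python
from datetime import date, timedelta

def contact_schedule(start: date) -> list[tuple[str, date]]:
    """Build the Dillman-style contact timeline described on the slide:
    an advance notice, the survey link one week later, then up to three
    weekly reminders to non-responders, then personal follow-up."""
    events = [
        ("advance-notice e-mail", start),
        ("survey link e-mail", start + timedelta(weeks=1)),
    ]
    for i in range(1, 4):  # up to 3 weekly follow-up e-mails
        events.append((f"reminder {i}", start + timedelta(weeks=1 + i)))
    # Timing of the personal outreach is a hypothetical placement (week 5).
    events.append(("personal contact with remaining non-responders",
                   start + timedelta(weeks=5)))
    return events

for name, when in contact_schedule(date(2018, 10, 1)):  # assumed start date
    print(f"{when}: {name}")
```

Each contact is one week after the previous, matching the "one per week" cadence described in the protocol.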

  7. This data collection approach resulted in 129 of the 143 known SIAs completing the survey (90% response rate). • Remarkably similar to the response rate in 2017 (n = 124; 91% response rate). • Every state except Idaho was represented, along with the District of Columbia and both Territories. 2019 Census Methodology
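As a quick arithmetic check, the 90% figure follows directly from the counts quoted on the slide:

```python
# Response-rate check for the 2019 Census figures quoted above.
completed, targeted = 129, 143
rate = completed / targeted
print(f"2019 response rate: {rate:.0%}")  # -> 90%
```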

  8. Out of the 51 indicators, SIAs intend, on average, to impact 19 indicators and evaluate 12 indicators. • There is a gap between the intent to impact and the intent to evaluate for all indicators. • The gap widens for long-term indicators at the Individual level. • In every sphere of influence of the Framework, the indicators with the highest percentage of SIAs intending to impact and evaluate them are the core indicators (with the exception of ST1 at the Individual level). 2019 Census Results Highlights

  9. 2019 Census: SIAs Planning to Impact and/or Evaluate Indicators at the Individual-Level (n = 12 Indicators)

  10. 2019 Census: SIAs Planning to Impact and/or Evaluate Indicators at the Environmental Settings-Level (n = 12 Indicators)

  11. 2019 Census: SIAs Planning to Impact and/or Evaluate Indicators at the Sectors of Influence-Level (n = 16 Indicators)

  12. 2019 Census: SIAs Planning to Impact and/or Evaluate Indicators at the Population-Level (n = 11 Indicators)

  13. 2019 Census: Barriers to Evaluating Indicators
  • Population: • Not enough staff time/personnel (31%) • Budget constraints, in general (23%) • Lack of funds to pay for comparison studies (21%) • Outcomes cannot be linked to intervention (17%) • Secondary data sources not available (12%)
  • Not enough staff time/personnel (58%) • Lack of outside funds to pay respondents (37%)
  • Not enough staff time/personnel (63%) • Budget constraints, in general (40%) • Lack of training/expertise (33%)
  • Not enough staff time/personnel (41%) • Budget constraints, in general (29%) • Lack of training/expertise (29%) • Outcomes cannot be linked to intervention (21%) • Another entity is conducting the evaluation (13%)

  14. Mean % of SIAs Intending to Impact and Evaluate Indicators Across Framework Levels. 2019 Census Results Highlights: The intent to impact and evaluate the indicators is, on average, higher in the more proximal spheres than in the distal spheres.

  15. Mean % of SIAs Intending to Impact and Evaluate Indicators Across the Individual, Environmental and Sectors of Influence Framework Levels. 2019 Census Results Highlights: The intent to impact and evaluate short-term indicators is greater than for medium-term indicators, which in turn is greater than for long-term indicators.

  16. 2019 Census: Were there regional differences in the number of indicators that SIAs intended to impact? By region: • n = 23 SIAs; total indicators: 51; average: 16 • n = 12 SIAs; total indicators: 45; average: 23 • n = 14 SIAs; total indicators: 49; average: 22 • n = 7 SIAs; total indicators: 41; average: 19 • n = 27 SIAs; total indicators: 51; average: 19 • n = 22 SIAs; total indicators: 51; average: 18 • n = 24 SIAs; total indicators: 51; average: 21

  17. Comparison of Results from 2017 - 2019

  18. Mean % of SIAs Intending to Impact and Evaluate Indicators Across Framework Levels Across Years. Comparison of Results from 2017-2019: There were no statistically significant differences in Framework levels across the years. Results are remarkably similar between the two years, which supports the reliability and validity of the Census survey tool.

  19. Mean % of SIAs Intending to Impact and Evaluate Indicators Across the Individual, Environmental and Sectors of Influence Framework Levels Across Years. Comparison of Results from 2017-2019: There were no statistically significant differences in Framework levels across the years.

  20. Exceptions: The Indicators with Significant Increases Over Time Trends in Results from 2017 – 2019 *p<.05; **p<.01

  21. Intent to impact and to evaluate Framework outcomes decreased as SIAs moved to longer-term outcomes and those in the outer Spheres of Influence. • USDA's core indicators have, by far, the highest percentage of SIAs intending to impact and evaluate them, with little change reported over 2 years; SIAs appear to focus on SNAP-Ed Guidance and annual reporting requirements. • From 2017 to 2019, the expected shift toward more comprehensive approaches and evaluation of Framework indicators was not seen. • Among all 51 indicators, efforts related to ST8-Partnerships and MT12-Social Marketing were the only significant changes noted over time. Evaluation Policy Take-Aways

  22. Need for training and expertise is noted across all levels of the Framework (in addition to staff and budget constraints) ➣ SIAs may benefit from more interventions and evaluation tools for long-term and outer-sphere outcomes in searchable, online resources like the SNAP-Ed Toolkit and SNAP-Ed Connection ➣ Ongoing, multi-disciplinary technical assistance would address gaps in knowledge and effort • Further examination of barriers to and incentives for use of the Framework is needed, including within the annual USDA Guidance, state plans and reporting. Evaluation Policy Take-Aways

  23. The Evaluation Committee encourages ASNNA members to present and publish their work to widen understanding and adoption of the Framework: • "The SNAP-Ed Evaluation Framework: Demonstrating the Impact of a National Framework for Obesity Prevention in Low-Income Populations," Translational Behavioral Medicine (submitted January 2019) • 2017 and 2019 Census • Regional, state and SIA results. Specific Opportunities

  24. To increase Framework capacity and use, the need for technical assistance and training could be addressed through: • Further analysis of the 2019 Census and 2018 Social Marketing Profile results for gaps to address at regional, state, and SIA levels • Identifying champions and mentors for peer learning at outer levels of the Framework • Joining efforts with other stakeholders such as NOPREN (National Obesity Prevention, Research, and Evaluation Network) to advance practice • Entering into a new partnership between NCCOR (National Collaborative on Childhood Obesity Research) and ASNNA to update the Framework's Interpretive Guide • Seeking outside grant support for Framework-related projects. Specific Opportunities

  25. Jini Puma, PhD Assistant Professor Colorado School of Public Health University of Colorado Denver jini.puma@ucdenver.edu 720-514-2729 Thank You!
