
AmeriCorps Grantmaking


Presentation Transcript


  1. AmeriCorps Grantmaking • Presented by • Ia Moua, Grants & Program Development Specialist • Eddie Aguero, AmeriCorps Specialist

  2. Session Objectives • Review 2014 grantmaking results • Share insights from grant review process • Review potential changes to 2015 RFA • Discuss tentative 2015 grantmaking timeline

  3. 2014 Grantmaking • Closed competition focused on recompeting current and new Governor’s Initiative programs • Total request: $2 million, supporting 3 recompeting programs and 1 new program • 3 applications submitted to national competition • 2 applications funded competitively • 1 application funded in formula

  4. 2014 Grant Results

  5. Grant Review Insights • Points lost due to compliance issues • Incomplete/unsubmitted evaluation reports • Exceeded the maximum cost per member • Prohibited activities not addressed • CV RFA instructions not followed • Previous contracted match not met • Labor Organization Certification unsatisfactory

  6. Grant Review Insights • Points lost due to inadequate descriptions • Previous performance challenges not acknowledged • Less than 100% enrollment/retention not addressed • Lacked evidence-based interventions • Program design shifts not explained • Dusted off previous application

  7. 2015 CV Funding Outlook • Demand for funds (recompeting programs): • $6.3 million for 6 competitive programs • $7.6 million for 18 formula programs • $13.9 million total for 24 recompeting programs • CA Formula Fund Projection: • $10.9 million, assuming level allocation • less $5.3 million in continuation commitments • $5.6 million in projected formula funds available
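  Restated as explicit arithmetic (a plain restatement of the figures above, not additional data):

  \[
  \begin{aligned}
  \text{Total recompeting demand} &= \$6.3\,\text{M} + \$7.6\,\text{M} = \$13.9\,\text{M} \\
  \text{Projected formula funds available} &= \$10.9\,\text{M} - \$5.3\,\text{M} = \$5.6\,\text{M}
  \end{aligned}
  \]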

  8. RFA Changes/New Requirements • The following were significant changes to the RFA as a result of changes in the NOFO: • Logic Model Chart • Proposed intervention scored and placed on one of 4 tiered evidence levels • Increased emphasis on evaluation • Letters of commitment from major partners • Organizational chart • Page limit reduced from 25 double-spaced pages to 14 double-spaced pages

  9. Logic Model • INPUTS: What we invest (e.g., human, financial, organizational, and community resources available for carrying out program activities) • ACTIVITIES: What we do (processes, tools, events, and actions used to achieve intended changes or outcomes) • OUTPUTS: Direct products from program activities • EVIDENCE OF CHANGE/OUTCOMES: • Short-Term Outcomes (one year): changes in knowledge, skills, and attitudes • Medium-Term Outcomes (three years): changes in behavior or action resulting from new knowledge • Long-Term Outcomes (ten years): meaningful changes in condition

  10. Evidence Basis Evidence for the intervention (member service activities) should be supported by: • Results of impact evaluations • Research studies • Past performance measurement outcome data

  11. Evidence Basis The proposed intervention was scored and placed on one of 4 tiered evidence levels: • Pre-preliminary evidence • Preliminary evidence • Moderate evidence • Strong evidence

  12. Evidence Basis: Pre-preliminary Evidence Applicant presents evidence that it has collected quantitative or qualitative data from program staff, program participants, or beneficiaries that have been used for program improvement, performance measurement reporting, and/or tracking… Example: Gathering feedback from program participants following their service year.

  13. Evidence Basis: Preliminary Evidence Applicant presents an initial evidence base that can support conclusions about the program’s contribution to observed outcomes. The evidence base consists of at least: • 1 non-experimental study conducted on the proposed program (or another similar program that uses a comparable intervention); OR • An outcome evaluation; OR • An implementation (process) evaluation Examples: • outcome studies that track program participants through a service pipeline and measure participants’ responses at the end of the program; and • pre- and post-test research that determines whether participants have improved on an intended outcome.

  14. Evidence Basis: Moderate Evidence • Applicant presents a reasonably developed evidence base that can support causal conclusions for the specific program proposed by the applicant with moderate confidence. The evidence base consists of: • 1 or more quasi-experimental studies conducted on the proposed program (or another similar program that uses a comparable intervention) with positive findings on one or more intended outcomes; OR • 2 or more non-experimental studies conducted on the proposed program with positive findings on one or more intended outcomes; OR • 1 or more experimental studies of another relevant program that uses a similar intervention. • Examples: well-designed and well-implemented quasi-experimental studies that compare outcomes between the group receiving the intervention and a matched comparison group (i.e., a similar population that does not receive the intervention).

  15. Evidence Basis: Strong Evidence • Applicant presents an evidence base that can support causal conclusions for the specific program proposed by the applicant with the highest level of confidence. This consists of 1 or more well-designed and well-implemented experimental studies conducted on the proposed program with positive findings on one or more intended outcomes.

  16. Evaluation • Increased information required on the evaluation plan (e.g., data collection procedures, including the types and sources of data, the population or sample, and a data analysis plan). • Applications with a stronger evidence base established through quasi-experimental and experimental studies received more points. • Evaluation reports submitted are assessed in terms of the quality of the evaluation designs and the studies’ findings. These assessments may be used to inform CV’s/CNCS’s consideration of the selection criteria and for the purpose of clarifying or verifying information in the proposals.

  17. 2015 Tentative Timeline

  18. QUESTIONS???
