Presentation Transcript


  1. PROGRAM PLANNING, IMPLEMENTATION & EVALUATION: The Service Delivery Model
  Link Brenda Thompson Jamerson, Chair, Services to Youth Facet
  May 8-12, 2013

  2. GENERAL DEFINITIONS: PROGRAM VS. EVENT
  • PROGRAM: A collection of related projects or events working in alignment toward a common cause
  • EVENT: Something that happens: an occurrence; a one-time effort; a noteworthy happening; a social occasion or activity

  3. PROGRAMMING GOALS
  Support and enhance the chapter’s efforts to deliver and sustain transformative programs that:
  • Address problems, find solutions, and positively impact critical needs of the community
  • Are comprehensive, accountable and replicable
  • Yield measurable outcomes and impacts
  • Promote collaboration and partnerships
  • Are aligned and integrated with Signature Programs and National Initiatives; and
  • Institutionalize our Service Delivery Model

  4. THE SERVICE DELIVERY MODEL

  5. WHAT IS A SERVICE DELIVERY MODEL? A graphic that shows the relationship between inputs, outputs, and outcomes relative to a problem we are trying to solve.

  6. WHY USE A SERVICE DELIVERY MODEL?
  • Connects activities with impact
  • Provides continuity
  • Supports continuous improvement

  7. SERVICE DELIVERY MODEL
  A picture of your program: what you are putting into the program, what you are doing, and what you are trying to achieve. It:
  • Clarifies the strategy underlying your program
  • Builds common understanding, especially about the relationship between actions and results
  • Communicates what your program is (and is not) about
  • Forms a basis for planning and evaluation
  • Expresses a “theory of action”: what is invested, what is done, and what results

  8. PLANNING PROCESS
  The planning process revolves around the basic definition of programming.
  Definition of Programming:
  • A comprehensive approach to solving a problem or addressing a need
  • A series of related activities focused on achieving a predetermined set of goals and objectives
  • Not a “one-time” event or single activity

  9. SERVICE MODEL PLANNING ELEMENTS
  • Situation: Service models are built in response to an existing situation. We identify the problem or priority the program is responding to and the expected benefit to specific audiences.
  • Inputs: Resources available to make your program work; these could include the people, the money or the community resources that are necessary to operate the program. Inputs lead to Outputs.
  • Outputs: The activities, products, methods and services you use represent your outputs. Examples include research, training, technical assistance and other services. Outputs lead to Outcomes.

  10. SERVICE MODEL PLANNING ELEMENTS
  • Outcomes: Results and benefits for groups, individuals or communities represent outcomes: the changes in knowledge, behavior or conditions that follow from planned activities.
  • External Factors: Outside forces that affect the implementation and success of the program.
  • Assumptions: Beliefs we have about why our program will work.

  11. EVALUATION PLANNING An evaluation plan to assess the program can be superimposed using the Service Delivery Model format. Evaluation involves asking key questions.

  12. EVALUATION PLANNING: ASKING KEY QUESTIONS
  • Were inputs made as planned?
  • Were activities conducted as planned?
  • Was the desired level of participation achieved?
  • Did clients express or show that they were satisfied with the program?

  13. EVALUATION PLANNING: ASKING KEY QUESTIONS
  • Did the participants show an increased level of knowledge or awareness?
  • Were behaviors of the clients modified, or were policies changed?
  • To what extent did the program affect social, economic, political, or environmental conditions?
  • Are indicators appropriate and measurable?
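As a rough illustration of how one of these questions might be answered from collected data, the sketch below computes client satisfaction from survey ratings. The 1-5 scale, the "satisfied at 4 or above" threshold, and the sample ratings are all assumptions for the sketch, not part of the program's prescribed method.

```python
def satisfaction_rate(ratings, satisfied_at=4):
    """Share of survey ratings at or above the 'satisfied' threshold.
    Assumes a 1-5 rating scale; both scale and threshold are illustrative."""
    if not ratings:
        return 0.0
    return sum(1 for r in ratings if r >= satisfied_at) / len(ratings)

# Hypothetical post-program survey responses.
survey = [5, 4, 4, 3, 5, 2, 4]
rate = satisfaction_rate(survey)
print(f"{rate:.0%} of respondents were satisfied")
```

A chapter could apply the same pattern to the other key questions, e.g. comparing planned versus actual participation counts.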

  14. SERVICE MODEL PLANNING ELEMENTS: TERMS TO KNOW
  • Situation (what is the problem or need?)
  • Inputs (what we invest)
  • Outputs (what we do and who we reach)
  • Outcomes (changes or results)
  • External Factors (influences)
  • Assumptions (principles)
  • Evaluation (assessment of expected outcomes)
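The planning elements above can be sketched as a simple data structure. This is an illustrative Python model only; the field names and the example program are assumptions for the sketch, not an official template.

```python
from dataclasses import dataclass, field

@dataclass
class ServiceDeliveryModel:
    """Illustrative sketch of the planning elements: situation, inputs,
    outputs, outcomes, external factors, and assumptions."""
    situation: str                                           # the problem or need
    inputs: list = field(default_factory=list)               # resources invested
    outputs: list = field(default_factory=list)              # activities and who is reached
    short_term_outcomes: list = field(default_factory=list)  # nearer-term changes
    long_term_outcomes: list = field(default_factory=list)   # longer-term impact
    external_factors: list = field(default_factory=list)     # outside influences
    assumptions: list = field(default_factory=list)          # why we believe it will work

    def summary(self) -> str:
        """One-line 'theory of action': what is invested, done, and aimed at."""
        n_outcomes = len(self.short_term_outcomes) + len(self.long_term_outcomes)
        return (f"Invest {len(self.inputs)} input(s), deliver "
                f"{len(self.outputs)} output(s), aiming at {n_outcomes} outcome(s).")

# Hypothetical example program, for illustration only.
model = ServiceDeliveryModel(
    situation="Youth in the service area lack after-school tutoring",
    inputs=["volunteer tutors", "meeting space", "program budget"],
    outputs=["weekly tutoring sessions", "parent workshops"],
    short_term_outcomes=["improved homework completion"],
    long_term_outcomes=["higher graduation rates"],
)
print(model.summary())
```

Writing the model down this way makes the chain from inputs to outputs to outcomes explicit, which is the same discipline the graphic form of the model enforces.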

  15. DEVELOPING YOUR SERVICE MODEL: KEY QUESTIONS
  • What is the community-level impact (change) that our chapter would like to create as a result of our program?
  • What are the long-term outcomes or behaviors we would like our clients to achieve?
  • What are the short-term outcomes we would like our clients to achieve?
  • What programs, strategies or services do we need to achieve the short- and long-term outcomes?
  • What resources or inputs do we need to support strategy or service implementation?
  • What is going on in our community or in our clients’ lives that we have no control over but that will affect the success of our program?

  16. HOW TO GET STARTED
  • Conduct a needs assessment of the chapter’s service area to determine the community’s needs. This may be a simple gathering of data on community statistics from newspapers, etc.
  • Rank, prioritize and select from among them the top priorities that the chapter wishes to focus on.
  • Review all programming requirements for chapters from National and the Area.
  • Identify partners with kindred interests.
  • Establish goals and objectives that are outcome-based and measurable, so that you will know what you have achieved and can quantify it if necessary.
  • Identify strategies or activities to achieve the desired goals and objectives.
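The "rank, prioritize and select" step could be as simple as scoring each candidate need against a few criteria and sorting. The sketch below is illustrative only; the candidate needs, the criteria, and the unweighted scoring are all assumptions, and a real chapter might weight criteria differently.

```python
# Hypothetical needs-assessment scores on a 1-5 scale for each criterion.
needs = {
    "after-school tutoring":    {"severity": 5, "chapter_fit": 4, "partners_available": 3},
    "senior health screenings": {"severity": 3, "chapter_fit": 3, "partners_available": 5},
    "job-readiness workshops":  {"severity": 4, "chapter_fit": 5, "partners_available": 4},
}

def priority(scores: dict) -> int:
    """Unweighted total score across criteria (illustrative)."""
    return sum(scores.values())

# Highest total score first.
ranked = sorted(needs, key=lambda n: priority(needs[n]), reverse=True)
print(ranked[0])  # top-priority need under these illustrative scores
```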

  17. HOW TO GET STARTED
  • Assign persons to be responsible for each strategy or activity.
  • Establish a timeline: a calendar of activities along with reporting dates.
  • Establish a monitoring system.
  • Determine what data is to be collected and in what format; develop forms for data collection.
  • Develop an evaluation plan.
  • Establish a budget.

  18. ARE YOU BEING ‘SMART’?
  A good objective is clear and S.M.A.R.T.
  • Specific: Does the objective clearly specify what will be accomplished and by how much?
  • Measurable: Is the objective measurable?
  • Appropriate: Does the objective make sense in terms of what is desired to be accomplished?
  • Realistic: Is the objective achievable given the available resources and experience?
  • Time-based: Does the objective specify by when it will be achieved?
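The S.M.A.R.T. test can be expressed as a few yes/no checks on a drafted objective. This is a rough illustration; the field names ('what', 'target', 'deadline', etc.) and the draft objective are assumptions for the sketch, not an official form.

```python
def smart_check(objective: dict) -> dict:
    """Return which S.M.A.R.T. criteria a drafted objective satisfies.
    Expects keys 'what', 'target', 'fits_goal', 'resources_confirmed',
    'deadline' -- illustrative field names only."""
    return {
        "specific":    bool(objective.get("what")),                # says what will be accomplished
        "measurable":  objective.get("target") is not None,        # has a quantified target
        "appropriate": bool(objective.get("fits_goal")),           # aligns with program goals
        "realistic":   bool(objective.get("resources_confirmed")), # achievable with resources
        "time_based":  bool(objective.get("deadline")),            # has a completion date
    }

# Hypothetical draft objective.
draft = {
    "what": "Enroll youth in weekly tutoring",
    "target": 25,                # number of participants
    "deadline": "2013-12-31",
    "fits_goal": True,
    "resources_confirmed": True,
}
print(smart_check(draft))
```

An objective that fails any check (for example, no 'target') is sent back for revision before it goes into the plan.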

  19. EXECUTE THE PLAN
  • Do what your plan says you will do.
  • Document as you conduct your activities: take photos, record videos or DVDs, develop a scrapbook, etc., and assign people to handle documentation for each segment of the program.
  • Collect data such as: number of participants (chapter and others), description of participants and attendees, age range, gender, businesses, partners (addresses), length of activity, persons responsible, description of activity, budget and actual amount spent, and in-kind support.
  • Monitor and take corrective action as necessary.
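The data fields listed above could be captured as one flat record per activity. Below is a minimal CSV sketch; the column names are assumptions loosely based on the slide's list, not a prescribed form, and the sample row is hypothetical.

```python
import csv
import io

# Column names are illustrative, drawn from the slide's data-collection list.
FIELDS = ["activity", "date", "participants_chapter", "participants_other",
          "age_range", "partners", "length_hours", "person_responsible",
          "budgeted", "actual_spent", "in_kind_support"]

buffer = io.StringIO()  # stands in for a real file on disk
writer = csv.DictWriter(buffer, fieldnames=FIELDS)
writer.writeheader()
writer.writerow({
    "activity": "Financial literacy workshop",   # hypothetical activity
    "date": "2013-05-10",
    "participants_chapter": 12,
    "participants_other": 30,
    "age_range": "13-18",
    "partners": "Local credit union",
    "length_hours": 2,
    "person_responsible": "Program chair",
    "budgeted": 500,
    "actual_spent": 425,
    "in_kind_support": "Meeting room donated",
})
print(buffer.getvalue())
```

Recording every activity in one consistent format makes the later "summarize and analyze" step, and the biennium report, far easier to assemble.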

  20. EVALUATE
  • Summarize and analyze all of the data gathered.
  • Ask the key questions.

  21. WRITING THE REPORT
  • Select a committee to develop the report.
  • Assign pertinent parts to the appropriate members/committees or chairs.
  • Review the Program Review Criteria and follow the directions carefully.
  • Use the report form as a guide for gathering and organizing the data for the report.
  • Develop a draft of the report and circulate it to the committees/chairs for review to ensure you have captured everything.

  22. PROGRAM REVIEW CRITERIA
  PROGRAM PLANNING AND OPERATIONS
  • Met reporting requirements: due date, abstract/summary, and impact statement completed
  • Indicate Facet(s) involved; umbrella; consideration for an award (SDM and program budget form documentation required)
  • Program description/problem addressed is clearly stated
  • Measurable goals and objectives are listed
  • Activities (outputs) are appropriate for the goals/objectives of the program
  • Number of people served was appropriate to the goals of the program
  • Method used to determine the group served reflected a need in the community
  • Consistent and active participation by the group(s) targeted
  • Chapter support and participation was evident

  23. PROGRAM REVIEW CRITERIA
  COLLABORATION
  • Collaboration with community resources
  • Collaboration with local, state or federal agencies/community groups
  BUDGET & FINANCE
  • Realistic budget defined and executed
  • Supplemental funding (including in-kind) sought and received
  IMPACT AND EVALUATION
  • Program operated more than one year
  • Sustainability/institutionalization is planned and pursued
  • Evaluation strategy in place and utilized
  • Outcomes/results are identified
  PUBLICITY AND PUBLIC RELATIONS
  • Effective communication to, and publicity in, the community
  • Community participation, support and recognition are evident

  24. SERVICE DELIVERY MODEL AWARD CONSIDERATION
  • Situation (the problem)
  • Priorities
  • Mission/Vision (what drives the outcome?)
  • Inputs/Resources (what you need to accomplish your activities)
  • Outputs – Participation (who are you serving?)
  • Outcomes – Short Term (expected changes in 1-2 years)
  • Outcomes – Medium Term (expected changes in 3-4 years)
  • Outcomes – Long Term Impact (long-term societal changes)
  • External Factors
  • Assumptions
  • Evaluation – Indicators: specific data tracked to measure progress, e.g., surveys, record reviews, observations, etc.
  • Organization Budget

  25. YOUR PROGRAM TOOLBOX
  • Service Model information and template
  • Biennium Report template
  • Sample evaluation form
  • Sample press release
  • Sample registration form (with photo permission)
  • Sample calendar/milestone chart
  • Donor database
  • Speakers database
  • Vendor and facility database
  • Partner database
  • Chapter skills database
  • Etc.

  26. AWARD-WINNING PROGRAM
  • Planning
  • Execution
  • Evaluation
  • Recording

  27. THANK YOU!
