
DOE Technical Assistance Program

DOE Technical Assistance Program. March 29, 2011, 2-3pm EST. Julie Michals, Northeast Energy Efficiency Partnerships, Inc. Chris Neme, Energy Futures Group. Developing an Evaluation, Measurement and Verification Plan for Residential Retrofit Programs.


Presentation Transcript


  1. DOE Technical Assistance Program March 29, 2011, 2-3pm EST Julie Michals, Northeast Energy Efficiency Partnerships, Inc. Chris Neme, Energy Futures Group Developing an Evaluation, Measurement and Verification Plan For Residential Retrofit Programs

  2. What is TAP? DOE’s Technical Assistance Program (TAP) supports the Energy Efficiency and Conservation Block Grant Program (EECBG) and the State Energy Program (SEP) by providing state, local, and tribal officials the tools and resources needed to implement successful and sustainable clean energy programs.

  3. How Can TAP Help You? On topics including: • Energy efficiency and renewable energy technologies • Program design and implementation • Financing • Performance contracting • State and local capacity building TAP offers: • One-on-one assistance • Extensive online resource library, including: • Webinars • Events calendar • TAP Blog • Best practices and project resources • Facilitation of peer exchange

  4. The TAP Blog Access the TAP Blog!http://www.eereblogs.energy.gov/tap/ Provides a platform for state, local, and tribal government officials and DOE’s network of technical and programmatic experts to connect and share best practices on a variety of topics.

  5. Accessing TAP Resources We encourage you to: 1) Explore our online resources via the Solution Center 2) Submit a request via the Technical Assistance Center 3) Ask questions via our call center at 1-877-337-3827 or email us at solutioncenter@ee.doe.gov

  6. Who We Are Program Design & Implementation/ Technical Assistance Team

  7. Julie Michals, Director, EM&V Forum at Northeast Energy Efficiency Partnerships, Inc. Chris Neme, Principal, Energy Futures Group Today’s Speakers

  8. Defining Evaluation, Measurement & Verification (EM&V) DOE Guidance on EM&V and Reporting Benefits of EM&V Developing a plan for Residential Retrofit EE programs Q&A OVERVIEW

  9. Evaluation: • Impact evaluation: quantification of energy savings & other benefits • Uses M&V tools • Can be at multiple levels – program, market segment, measure, etc. • Process evaluation: assessment of program design & procedures • Typically interviews, focus groups, etc. • Typically focused on customers, trade allies, program staff Measurement & Verification: Collection of data (pre- and post-efficiency-measure installation)… • Product info (e.g., manufacturer efficiency ratings) • Metering (i.e., on-site measurement of energy use, power draw, hours of use) • Billing data • Simulation modeling (calibrated to building energy usage) …to support estimates of energy savings What is EM&V?
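The metering item above can be made concrete with a small sketch: converting a measured power draw and hours of use into annual kWh, pre- vs. post-retrofit. The wattages, hours, and lighting scenario are illustrative assumptions, not program data.

```python
# Sketch of an M&V savings estimate from metered data: annual kWh computed
# from measured power draw and hours of use, before vs. after a retrofit.
# All numbers below are illustrative assumptions, not program data.
def annual_kwh(watts: float, hours_per_day: float) -> float:
    """Convert a metered power draw and usage schedule to annual kWh."""
    return watts * hours_per_day * 365 / 1000.0

pre = annual_kwh(watts=60, hours_per_day=4)   # incandescent, pre-retrofit
post = annual_kwh(watts=9, hours_per_day=4)   # LED, post-retrofit
print(round(pre - post, 2))  # 74.46 kWh/yr saved
```

Real M&V would also verify persistence of the measured hours of use, which a one-time meter reading cannot capture.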

  10. DOE requires reporting of energy savings, but not specific EM&V, as part of Grantees' award agreements Estimates can be calculated using the Recovery Act Benefits Calculator, but it is not intended to replace more rigorous EM&V techniques http://www1.eere.energy.gov/wip/solutioncenter/calculator/default.aspx Grantees with resources to conduct more sophisticated EM&V are encouraged to conduct studies in accordance with Program Notice 10-017, and to share results with DOE through Project Officers. See http://www1.eere.energy.gov/wip/pdfs/eecbg_evaluation_guidelines_10_017.pdf DOE EM&V Guidance

  11. Detailed Data Collection - DOE Guidance for SEP Grantees: • Contact information of people served/impacted (name, company, address, phone, email) • Detailed descriptions of services received: addresses of actions taken, recommendations from audits, measures installed, installation dates, etc. • e.g., CA Evaluation Protocols (April 2006, p. 205) http://www.calmac.org/events/EvaluatorsProtocols_Final_AdoptedviaRuling_06-19-2006.pdf

  12. DOE Reporting Requirements Reporting required on: • Job Impacts • Energy Savings • Energy Cost Reductions • Renewable Energy Capacity and Generation • Emission Reductions • Process Metrics: # buildings retrofitted, square footage, efficiency measures purchased, etc. • EECBG Program Guidance 10-007B – Reporting: http://www1.eere.energy.gov/wip/pdfs/eecbg_reporting_program_guidance_10_007b.pdf • SEP Program Guidance 10-006A – Reporting: http://www1.eere.energy.gov/wip/pdfs/sep_10-006a_arra_reporting_guidance.pdf

  13. Characteristics of a Leader

  14. Strategic Energy Planning Best Practices • Identify and convene stakeholders • Establish a leadership team • Develop a common energy vision • Develop a community energy baseline • Based on the vision and baseline, develop energy goals • Evaluate supply & demand policy and program resource options • Find and secure funding sources • Compile the plan • Measure & evaluate – continuously improve plan

  15. Why Conduct EM&V? • Retrospective: • How much: energy and money saved, pollution reduced, jobs created, etc.? • Was the investment cost-effective? • Prospective: • How can the program be improved? • What appealed to participants? Why didn’t others participate? • Did some retrofit contractors do better than others? Why? • Actual savings vs. forecasted savings • “closing rates” (from audit to completed jobs) • Comprehensiveness of treatment • Were opportunities for additional cost-effective savings missed? Why? • Note: Can also benefit program as it is being delivered

  16. Why Develop EM&V Plans? • EM&V Plan should ideally be developed before program launch: • Critical to ensure the right data are collected • Clarity around data tracking & reporting needs • Clarity around responsibilities for data • Ensures communications w/utilities about leveraging start early • Some EM&V features can be program design features • Allows for on-going program refinement – not just “after-the-fact” • Highlights expectations regarding accountability • Program staff • Retrofit contractors • Ensures adequate budget set aside

  17. EM&V Plan – Approaches for Residential Retrofit Impact Assessment Options • Deemed savings • Building energy modeling • Billing analysis Process Evaluation • Tracking database “mining” • Expert drive-alongs with retrofit contractors • Interviews – participants, non-participants, staff, contractors All of these can be done either independently or piggybacking on utility/other EM&V efforts (or combination of the two)…

  18. EM&V Plan – Deemed Savings Independent Approach • Identify likely common efficiency measures • Develop engineering assumptions/algorithms tailored to local situation • Ensure all data needed for calculations are collected for each job • Requires specialized efficiency expertise • Can be hired • Not necessarily expensive ($5k-$25k) for limited range of measures • Needs periodic maintenance/updating Piggybacking • Reach out to utility/others to identify existing assumptions/algorithms • Determine whether refinements needed • Get utility help to refine or add measures Note: combination of two approaches possible as well
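As a concrete sketch of the independent deemed-savings approach described above: engineering assumptions are applied per measure and summed per job. The measure names and per-unit kWh values below are hypothetical placeholders, not from any actual Technical Reference Manual; a real program would tailor them to the local climate and housing stock.

```python
# Minimal deemed-savings sketch. The per-unit kWh values are hypothetical
# placeholders -- real programs pull them from a Technical Reference Manual
# (TRM) with assumptions tailored to the local situation.
DEEMED_KWH_PER_UNIT = {
    "attic_insulation_sqft": 0.5,   # kWh/yr per square foot (placeholder)
    "air_sealing_job": 400.0,       # kWh/yr per completed job (placeholder)
    "led_bulb": 30.0,               # kWh/yr per bulb (placeholder)
}

def deemed_savings(installed: dict) -> float:
    """Sum annual kWh savings over a job's installed measure quantities."""
    return sum(DEEMED_KWH_PER_UNIT[m] * qty for m, qty in installed.items())

job = {"attic_insulation_sqft": 1000, "led_bulb": 10}
print(deemed_savings(job))  # 0.5*1000 + 30*10 = 800.0
```

This illustrates why the data-collection point matters: every quantity field the formula needs must be captured for each job at the time of installation.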

  19. EM&V Plan – Deemed Savings (2) Numerous existing “local” tools: • e.g.: CA, Northwest, OH, MI, NJ, Mid-Atlantic, VT, MA – partial list • But not all address building envelope measures • Quality varies • Ease of use varies • Thus, leveraging still requires some technical expertise or support See Slide #31 for Deemed Savings references/resources

  20. EM&V Plan – Building Modeling Independent Approach • Identify modeling tool(s) that will be used • “Home Energy Score” – U.S. DOE pilot • Many others… • Distribute and train retrofit contractors on their use • Collect pre- and post-treatment modeling results for each home • Could also be done “after the fact”, if all necessary data are collected • Potential program design/delivery advantages • Also serves as “sales tool” • Gives participants a “leave behind” – energy rating/performance score Piggybacking • If utility/others using this approach, get community-specific data • Summarize & synthesize community-specific data Note: May need to combine w/ Deemed Savings for some measures

  21. DOE’s Home Energy Score Tool

  22. Earth Advantage EPS (using SIMPLE algorithms)

  23. EM&V Plan – Billing Analysis Independent Approach • Most accurate • Need to collect participants’ energy bills • Important to get “releases” signed during service delivery – difficult later! • Still not always without difficulties • Should have at least 12 months pre- and post-treatment info • Requires specialized technical expertise & statistical tools • Needs to be hired • Not necessarily expensive ($10k-$30k) if you have good data, find right firm Piggybacking • If utility/others using this approach, get community-specific data • Requires up-front collaboration on EM&V design w/utilities • Will cost utility extra $, possibly necessitating community contribution Note: May need to combine w/Deemed Savings for some measures
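A heavily simplified sketch of the pre/post billing comparison above. A real billing analysis would weather-normalize usage (e.g., regress consumption on heating/cooling degree-days) and use a comparison group to control for non-program trends; this sketch omits both and only enforces the 12-month rule.

```python
# Simplified pre/post billing comparison. A production billing analysis
# would weather-normalize (regress on degree-days) and use a comparison
# group -- this sketch only compares average monthly usage.
def annual_savings(pre_monthly_kwh: list, post_monthly_kwh: list) -> float:
    """Estimate annual kWh savings from >=12 months of pre and post bills."""
    if len(pre_monthly_kwh) < 12 or len(post_monthly_kwh) < 12:
        raise ValueError("need at least 12 months of pre- and post-treatment bills")
    pre_avg = sum(pre_monthly_kwh) / len(pre_monthly_kwh)
    post_avg = sum(post_monthly_kwh) / len(post_monthly_kwh)
    return (pre_avg - post_avg) * 12

# Illustrative bills: 100 kWh/month before, 90 kWh/month after the retrofit.
print(annual_savings([100] * 12, [90] * 12))  # 120.0 kWh/yr
```

The 12-month minimum exists so that both periods cover a full heating and cooling cycle; shorter windows confound retrofit savings with seasonal swings.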

  24. EM&V Plan – Process Evaluation Independent Approach • Some aspects can be conducted in-house: • Participant surveys • Non-participant surveys • Contractor interviews or focus groups • Some approaches require hiring specialized technical expertise • Drive-alongs with retrofit contractors • Tracking system reviews • Not necessarily expensive ($5k-$15k) for targeted scope & the right firm Piggybacking • Possible to add extra, community-specific questions to surveys • Requires up-front collaboration on EM&V design with utilities/others • Will cost utility extra $, possibly necessitating community contribution

  25. Barriers to EM&V • Cost • Varies depending on extent of EM&V • Need to consider volume of participation relative to cost • Requires technical expertise – likely not in-house • Even leveraging existing utility efforts likely to require expertise • Limited range of contractors willing to do “small” projects • Data availability • Critical to good evaluation • Essential to minimize costs of hiring contractors • Requires on-going monitoring of what you are getting from the field • Requires up-front planning – integration w/program delivery • Collecting right data as you go • Choosing an accounting/tracking software • Ease of use for evaluation purposes • Program management tool too!

  26. EM&V Steps & Ideal Timeline

  27. ADDITIONAL BACKGROUND INFO AND RESOURCES

  28. International Performance Measurement & Verification Protocol (IPMVP Vol 1, 2010 www.evo-world.org) Detailed steps for comprehensive M&V planning: http://www1.eere.energy.gov/femp/pdfs/intro_mv.pdf EM&V Planning Guidance

  29. Guidelines for States Conducting or Contracting Evaluations of ARRA Funded SEP Activities (using 3rd party contractors): http://www1.eere.energy.gov/wip/pdfs/evaluation_webinar_slides_june16_2010.pdf High level guidelines/standards on: Evaluation Metrics – energy/demand savings, carbon emission reductions, job creation Independent Evaluations – by independent 3rd party Attribution of Effects – net effects due to SEP funds, with guidance on allocation of effects for jointly funded projects Evaluation Budgeting – recommends 5% or less of project budget Timing of Evaluation – evaluation planning to start at the same time as projects are initiated; determine baseline approach, data collection and analysis efforts Continued… DOE Guidance on EM&V – for Grantees Conducting 3rd Party Evaluations

  30. High level guidelines/standards cont: State of the Art Analysis – evaluation approach should use current state-of-the-art evaluation approaches and analysis methods Evaluation Rigor and Reliability: Study should be as reliable as possible within study approach and budget limits Study Design and Study Plan: Study methods/approach, tasks to be conducted, detailed data collection approach, detailed analysis approach for energy and demand savings Sampling and Statistical Significance: minimize bias and maximize representativeness of the population. Sample to be no less rigorous than 90% confidence level with +/- 10% precision M&V Approaches: analytic approach, baseline and post-installation operation assessments should use IPMVP field data collection frameworks (discussed later) DOE Guidance on EM&V – for Grantees Conducting 3rd Party Evaluations cont.

  31. Grantees can refer to existing state energy efficiency program administrator data assumptions and algorithms if project data are not all available/collected. These “Technical Reference Manuals” (TRMs) include a mix of stipulated data, calculations based on models, prior EM&V studies, and/or manufacturer specs. Several of these sources should be reviewed before relying on any one of them Existing resources include: NW Regional Technical Forum: http://www.nwcouncil.org/energy/rtf/ Other state savings assumptions documents for: CT, MA, ME, NJ, NY, VT, PA and multi-state (MD, DC, DE) available at: http://neep.org/emv-forum/emv-library/research-evaluation-studies California DEER Database: http://www.energy.ca.gov/deer/ Other state TRMs CEE evaluation clearinghouse: http://www.cee1.org/eval/clearinghouse.php3 Available Energy Savings Data

  32. 1. Approaches/methods range from simple and direct to complex and indirect, sometimes combined; more complex methods generally require more detailed data and higher cost 2. Guidelines for EM&V measurement/analysis include: US DOE/EPA Model Energy Efficiency Program Impact Evaluation Guide http://www.epa.gov/cleanenergy/documents/suca/evaluation_guide.pdf Regional EM&V Forum Guidelines: http://neep.org/emv-forum/forum-products-and-guidelines NW Regional Technical Forum Protocols http://www.nwcouncil.org/energy/rtf/ U.S. FEMP M&V Guidelines: Measurement and Verification for Federal Energy Projects Version 3.0, 2008 http://mnv.lbl.gov/ ASHRAE Guideline 14: Measurement of Energy and Demand Savings (2002) – updated version forthcoming 2011. www.ashrae.org CA Evaluation Protocols: http://www.calmac.org Most of the above refer to IPMVP: The International Performance Measurement & Verification Protocol (IPMVP Vol 1, 2010 www.evo-world.org) 3. The spread of high-resolution usage data (AMI) and other new technologies provides the opportunity for better methods in the future – stay tuned! Which EM&V Approach to Use?

  33. Q&A Questions?

  34. Questions? CONTACTS VEIC: Dan Quinlan, dquinlan@veic.org, 802-488-7677 (Team Lead) MEEA: Steve Kismohr, skismohr@mwalliance.org, 312-784-7257 NEEP: Ed Londergan, elondergan@neep.org, 781-860-9177 NEEA: Elaine Blatt, eblatt@nwalliance.org, 503-688-5458 SWEEP: Curtis Framel, cframel@swenergy.org, 303-447-0078 SEEA: Scott Slusher, scott@seealliance.org, 480-239-4236 ACEEE: Eric Mackres, emackres@aceee.org, 202-507-4038 NRDC: Lara Ettenson, lettenson@nrdc.org, 415-875-6100 EFG: Richard Faesy, rfaesy@energyfuturesgroup.com, 802-482-5001

  35. Upcoming Webinars Please join us again: • Basic Benchmarking: Benchmarking Your Building’s Energy Use Using ENERGY STAR’s Portfolio Manager • Host: Courtney Smith, ICF • Date: March 30, 2011 • Time: 12:00-1:30 EDT For the most up-to-date information and registration links, please visit the Solution Center webcast page at www.wip.energy.gov/solutioncenter/webcasts
