
United Nations Oslo City Group on Energy Statistics



Presentation Transcript


  1. ESCM Chapter 8: Data Quality and Metadata
United Nations Oslo City Group on Energy Statistics
OG7, Helsinki, Finland, October 2012

  2. Introduction
IRES Chapter 9 deals with data quality assurance and metadata. Under IRES, countries are encouraged to:
• Develop national quality assurance programs
• Document these programs
• Develop measures of data quality
• Make these measures available to users

  3. Prerequisites of Data Quality
Institutional and organizational conditions, including:
• Legal basis for compilation of data
• Adequate data-sharing and coordination between partners
• Assurance of confidentiality and security of data
• Adequacy of resources (human, financial, technical)
• Efficient management of resources
• Quality awareness

  4. Promoting Data Quality
• Make quality a stated goal of the organization
• Establish standards for data quality
• Track quality indicators
• Conduct regular quality assurance reviews
• Develop centres of expertise to promote quality
• Deliver quality assurance training

  5. What is a Quality Assurance Framework?
All planned activities needed to ensure that the data produced are adequate for their intended use. Includes standards, practices, and measures. Allows for:
• Comparisons with other countries
• Self-assessment
• Technical assistance
• Reviews by international and other users
See Figure 8.1 for examples of quality frameworks.

  6. Quality Assurance Framework
Six dimensions of data quality, based on ensuring "fitness for use":
• Relevance
• Accuracy
• Timeliness
• Accessibility
• Interpretability
• Coherence

  7. Quality Measures and Indicators
• Should cover all elements of the Quality Assurance Framework
• Methodology should be well established and credible
• Must be easy to interpret and use
• Should be practical: reasonable, not an undue burden
For key indicators, see Chapter 8, Table 8.2.

  8. Sample Quality Indicators
From IRES Table 9.2, linked to the QA Framework:
• Relevance: user feedback on satisfaction and on the utility of products and data
• Accuracy: response rate, weighted response rate, number and size of revisions
• Timeliness: time lag between the reference period and the release of data
• Accessibility: number of hits, number of requests
• Interpretability: amount of background information available
• Coherence: validation of data against other sources
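The accuracy and timeliness indicators listed above reduce to simple ratios and lags. A minimal sketch in Python of how they might be computed; the unit records, weights, and dates are hypothetical illustrations, not figures from the chapter:

```python
from datetime import date

# Hypothetical survey records: (unit id, responded?, weight such as
# the unit's share of total energy consumption).
records = [
    ("u1", True, 40.0),
    ("u2", True, 25.0),
    ("u3", False, 30.0),
    ("u4", True, 5.0),
]

# Unweighted response rate: responding units / eligible units.
response_rate = sum(1 for _, ok, _ in records if ok) / len(records)

# Weighted response rate: respondents' weights / total weights, so
# large (high-weight) units count for more.
total_weight = sum(w for _, _, w in records)
weighted_rate = sum(w for _, ok, w in records if ok) / total_weight

# Timeliness: lag between the end of the reference period and the
# release date (dates are placeholders).
lag_days = (date(2012, 10, 1) - date(2012, 6, 30)).days

print(f"response rate:          {response_rate:.0%}")   # 75%
print(f"weighted response rate: {weighted_rate:.0%}")   # 70%
print(f"timeliness lag:         {lag_days} days")
```

The two response rates diverge when non-response is concentrated in high-weight units, which is exactly why both are tracked.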

  9. Quality Assurance in the Survey Process
Quality assurance must be built into all stages of the survey process. Survey stages:
1. Specify needs
2. Design
3. Build
4. Collect
5. Process
6. Analyze
7. Disseminate
8. Archive
9. Evaluate

  10. Stage 1: Specify Needs
Activities:
• Determine needs: define objectives, uses, users
• Identify concepts, variables
• Identify data sources and availability
• Prepare business case
Quality assurance:
• Consult with users and key stakeholders
• Clearly state objectives and concepts
• Establish quality targets
• Check sources for quality, comparability, timeliness
• Gather input and support from respondents

  11. Stage 2: Design
Activities:
• Determine outputs
• Define concepts, variables
• Design data collection methodology
• Determine frame & sampling strategy
• Design production processes
Quality assurance:
• Consult users on outputs
• Select, test & maintain frame
• Design & test questionnaire and instructions
• Use established standards
• Develop processes for error detection
• Develop & test imputation

  12. Stage 3: Build
Activities:
• Build collection instrument
• Build processing system
• Design workflows
• Finalize production systems
Quality assurance:
• Focus-test the questionnaire with respondents
• Test systems for functionality
• Test workflows; train staff
• Document
• Develop quality measures

  13. Stage 4: Collect
Activities:
• Select sample
• Set up collection
• Run collection
• Finalize collection
Quality assurance:
• Maintain frame
• Train collection staff
• Use technology with built-in edits
• Implement verification procedures
• Monitor response rates, error rates, follow-up rates, reasons for non-response

  14. Stage 5: Process
Activities:
• Integrate data from all sources
• Classify and code data
• Review, validate and edit
• Impute for missing or problematic data
• Create and apply weights
• Derive variables
Quality assurance:
• Monitor edits
• Implement follow-ups
• Focus on the most important respondents
• Analyze and correct outliers

  15. Stage 6: Analyze
Activities:
• Transform data to outputs
• Validate data
• Scrutinize and explain data
• Apply disclosure controls
• Finalize outputs
Quality assurance:
• Track all indicators
• Calculate quality indicators
• Compare data with previous cycles
• Perform coherence analysis
• Validate against expectations and subject-matter intelligence
• Document all findings

  16. Stage 7: Disseminate
Activities:
• Load data into output systems
• Release products
• Link to metadata
• Provide quality indicators
• Provide user support
Quality assurance:
• Format & review outputs
• Verify that tools do not introduce errors
• Verify disclosure control
• Ensure all metadata and quality indicators are available
• Provide contact names for user support

  17. Stage 8: Archive
Activities:
• Create rules and procedures for archiving and disposal
• Maintain catalogues, formats, systems
Quality assurance:
• Periodically test processes and systems
• Ensure metadata are attached

  18. Stage 9: Evaluate
Activities:
• Conduct post-mortem reviews to assess performance and identify issues
• Take corrective actions or make new investments, as required
Quality assurance:
• Consult with clients about needs and concerns
• Monitor key quality indicators
• Conduct periodic data quality reviews
• Perform ongoing coherence analysis
• Compare with best practices elsewhere

  19. Metadata
• Important for assessing "fitness for use" and ensuring interpretability
• Required at every step of the survey process
• Critical for enabling comparisons with other data
• Should include the results of data quality reviews
See Figure 8.4 for a generic set of metadata requirements.
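Attaching metadata to a released series is easiest when the record has a fixed shape. A minimal sketch in Python of such a record; the field names and values are illustrative assumptions, not the Figure 8.4 requirements:

```python
from dataclasses import dataclass, field

# Hypothetical metadata record for one published energy series.
@dataclass
class SeriesMetadata:
    series_id: str
    title: str
    reference_period: str          # period the data describe
    release_date: str              # when the data were published
    methodology_url: str           # background info for interpretability
    quality_notes: list[str] = field(default_factory=list)

meta = SeriesMetadata(
    series_id="coal-supply-monthly",
    title="Monthly coal supply",
    reference_period="2012-06",
    release_date="2012-10-01",
    methodology_url="https://example.org/methods/coal",
    quality_notes=["response rate 75%", "2 revisions since first release"],
)

# The record travels with the data so users can judge fitness for use.
print(meta.series_id, meta.release_date)
```

Keeping quality notes (response rates, revision history) inside the record is one way to make the data quality review results available to users, as the slide recommends.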

  20. Future of Metadata
• Should become a driver of survey design
• Can be used proactively to prescribe definitions, concepts, variables and standards
• Can support the harmonization of international surveys and data
• Efforts are underway to create an integrated approach to producing and recording metadata

  21. Thank you!
Andy Kohut, Director
Manufacturing and Energy Division, Statistics Canada
Section B-8, 11th Floor, Jean Talon Building
Ottawa, Ontario, Canada K1A 0T6
Telephone: 613-951-5858
E-mail: andy.kohut@statcan.gc.ca
www.statcan.gc.ca
