
DATA QUALITY








  1. DATA QUALITY How closely do the data used reflect the truth about results?

  2. What is quality? • Quality is a dynamic concept that continuously changes in response to changing customer requirements • Conformance to specifications • Fitness for use

  3. Purpose of Data Quality Assessment MANDATORY: Data reported to Washington for Government Performance and Results Act (GPRA) reporting purposes, or reported externally on Agency performance, must have had a data quality assessment at some time within the three years before submission (ADS 203.3.8.3). USAID Missions are mandated to conduct data quality assessments more frequently if needed; in any case, managers should be aware of the strengths and weaknesses of all indicators. USAID Missions are not required to conduct data quality assessments for data that are not reported to USAID/Washington, and managers are not required to do data quality assessments on all performance indicators that they use.

  4. Issues MANAGEMENT Can you make decisions based on the data? Better-quality data lead to better-informed management and planning. REPORTING Are the data believable? Audiences want to know how "credible" your data are so they can trust your analysis and conclusions.

  5. Quality issues • Problems can result from: • Human error • Machine error • Process error

  6. Five standards for quality of data: VALIDITY • RELIABILITY • TIMELINESS • PRECISION • INTEGRITY
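To make the five standards concrete, here is a minimal Python sketch (not official USAID tooling; the dictionary and function names are illustrative) that pairs each standard with its key question from the slides below and flags the standards an indicator fails:

# A minimal sketch (not official USAID tooling) of the five standards and
# their key questions as a per-indicator review checklist. Standard names and
# key questions come from the slides; everything else is illustrative.
DQA_STANDARDS = {
    "validity":    "Do data clearly and directly measure what we intend?",
    "reliability": "Would repeating the measurement process yield the same data?",
    "timeliness":  "Are data available in time to inform management decisions?",
    "precision":   "Are the data precise enough to inform management decisions?",
    "integrity":   "Are there mechanisms to prevent manipulation of the data?",
}

def failed_standards(answers: dict) -> list:
    """Return the standards whose key question was answered 'no' (False)."""
    return [std for std in DQA_STANDARDS if not answers.get(std, False)]

# Example: the poverty indicator from slide 7 fails the validity standard
# because IDPs are excluded from the source statistics.
answers = {"validity": False, "reliability": True, "timeliness": True,
           "precision": True, "integrity": True}
print(failed_standards(answers))  # ['validity']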

  7. Validity Key question: Do data clearly and directly measure what we intend? (Recall the seven indicator characteristics.) Issue: Directness • Result: Poverty of vulnerable communities in conflict region reduced • Indicator: Number of people living in poverty • Source: Government statistics office • Problem: The government doesn't include internally displaced people (IDPs) in its poverty statistics. Issue: Bias • Result: Modern sanitation practices improved • Indicator: Number of residents in targeted villages who report using "clean household" practices • Source: Door-to-door survey conducted three times a year • Problem: Most people in the targeted region work long hours in the fields during the harvest season, so a door-to-door survey misses them.
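A toy numeric illustration of the IDP example (the population figures are made up, not from the slide): excluding a subgroup from the source data biases the reported rate downward.

# A toy numeric sketch (made-up figures) of the IDP coverage gap described
# above: when a subgroup is excluded from the source data, the reported rate
# understates poverty for the full vulnerable population.
covered_pop, covered_poor = 900_000, 180_000  # people in official statistics
idp_pop, idp_poor = 100_000, 60_000           # IDPs, excluded from the source

reported_rate = covered_poor / covered_pop
true_rate = (covered_poor + idp_poor) / (covered_pop + idp_pop)
print(f"reported {reported_rate:.1%} vs. actual {true_rate:.1%}")
# reported 20.0% vs. actual 24.0%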

  8. Reliability Key question: If you repeated the same measurement or collection process, would you get the same data? Issue: Consistency or repeatability • Result: Employment opportunities for targeted sectors expanded • Indicator: Number of people employed by USAID-assisted enterprises • Source: Structured interviews with USAID-assisted enterprises, as reported by implementing partners AAA, BBB, and CCC • Problem: The DO Team found that the implementing partners were using different definitions: for AAA, "employee" means anyone who receives wages from the enterprise; for BBB, anyone who receives full-time wages from the enterprise; for CCC, anyone who works at least 25 hours a week.
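The employment example can be made concrete with a short sketch (the record fields and the harmonized definition are illustrative assumptions): the same worker list counted under the three partners' definitions yields three different totals, and the fix is to agree on one operational definition.

# A hypothetical sketch of the reliability problem above: the same workforce
# counted under the three partners' definitions gives three different totals.
workers = [
    {"wages": True,  "full_time": True,  "hours": 40},
    {"wages": True,  "full_time": False, "hours": 20},
    {"wages": False, "full_time": False, "hours": 30},  # e.g. unpaid family labor
]

count_aaa = sum(w["wages"] for w in workers)                     # any wages
count_bbb = sum(w["wages"] and w["full_time"] for w in workers)  # full-time wages
count_ccc = sum(w["hours"] >= 25 for w in workers)               # >= 25 hrs/week
print(count_aaa, count_bbb, count_ccc)  # 2 1 2 -- same people, three answers

# The fix: agree on ONE operational definition and apply it everywhere.
def is_employee(w: dict) -> bool:
    return w["wages"] and w["hours"] >= 25

print(sum(is_employee(w) for w in workers))  # 1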

  9. Timeliness Key question: Are data available in time to inform management decisions? Issue: How frequent • Result: Use of modern contraceptives by targeted population increased • Indicator: Number of married women of reproductive age reporting use of modern contraceptives (CPR) • Source: DHS survey • Problem: The DHS survey is conducted approximately every 5 years. Issue: How current • Result: Primary school attrition in targeted region reduced • Indicator: Rate of student attrition for years 1 and 2 at targeted schools • Source: Enrollment analysis report from the Ministry of Education • Problem: In July 2002 the MOE published the full enrollment analysis for school year August 2000 – June 2001.
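A quick arithmetic sketch of the "how current" case, using the slide's dates (the exact publication day is assumed, since the slide only says July 2002): by the time the analysis appears, the data are already about a year old.

# Arithmetic on the "how current" case above: the MOE analysis published in
# July 2002 covers the school year that ended in June 2001, so managers are
# making decisions on data over a year old. Exact days are assumptions.
from datetime import date

published = date(2002, 7, 1)    # MOE publishes the enrollment analysis
period_end = date(2001, 6, 30)  # end of the school year the data cover

lag_days = (published - period_end).days
print(f"data lag at publication: {lag_days} days (~{lag_days / 30:.0f} months)")
# data lag at publication: 366 days (~12 months)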

  10. Precision Key question: Are the data precise enough to inform management decisions? Issue: Enough detail • Result: CSO representation of citizen interests at national level increased • Indicator: Average score of USAID-assisted CSOs on the CSO Advocacy Index • Source: Ratings made by Partner XXX after interviews with each CSO • Problem: The DO team reported these data to the Mission Director: 1999 = 2.42, 2000 = 3, 2001 = 3.000 (three different levels of reported precision). Issue: Margin of error • Result: Primary school attrition in targeted region reduced • Indicator: Rate of student attrition for years 1 and 2 at targeted schools • Source: Survey conducted by partner; the survey is informal and has a margin of error of +/- 10% • Problem: The USAID intervention is expected to cause only 5 more students (for every 100) to stay in school longer.
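The margin-of-error case reduces to simple arithmetic with the slide's own numbers: an expected 5-point change cannot be distinguished from +/- 10 points of survey noise.

# Quick arithmetic on the margin-of-error case above, using the slide's own
# numbers: the expected effect (5 students per 100, i.e. 5 percentage points)
# is smaller than the survey's +/- 10-point margin of error, so the survey
# cannot distinguish the intervention's effect from sampling noise.
expected_effect_pp = 5.0   # expected attrition reduction, percentage points
margin_of_error_pp = 10.0  # survey margin of error, percentage points

if expected_effect_pp <= margin_of_error_pp:
    print("Too imprecise: a real 5-point change is lost in +/-10 points of noise.")
else:
    print("Survey precision is adequate for this decision.")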

  11. Integrity Key question: Are there mechanisms in place to reduce the possibility that data are manipulated for political or personal gain? Issue: Intentional manipulation • Result: Financial sustainability of targeted CSOs improved • Indicator: Dollars of funding raised from local sources per year • Source: Structured interviews with targeted CSOs • Problem: When a DO Team member conducted spot checks with the CSOs, she found that organizations CCC and GGG had counted funds from other donors as part of the "locally raised" funds.

  12. Techniques to Assess Data Quality WHY? The goal is to ensure the DO team is aware of: • Data strengths and weaknesses • The extent to which data can be trusted when making management decisions and reporting. All data reported to Washington must have had a data quality assessment at some time in the three years before submission (ADS 203.3.5.2).

  13. Examples of problems • Invalid key fields • Data collection forms not standardized • Location accuracy (e.g., records assigned to different wards) • Logical inconsistencies (e.g., jobs completed before they started) • Mandatory fields missing data (e.g., sex or age) • Data collector bias (e.g., selfish personal interest). The sketch below illustrates automated checks for several of these problems.
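Here is a minimal sketch of such checks (the record layout and field names are hypothetical): it flags missing mandatory fields, a blank key field, and a job that ends before it starts.

# A minimal sketch (hypothetical record layout) of automated checks for three
# of the problems listed above: invalid key fields, logical inconsistencies,
# and mandatory fields missing data.
from datetime import date

MANDATORY_FIELDS = ("record_id", "sex", "age")

def check_record(rec: dict) -> list:
    """Return a list of data quality problems found in one record."""
    problems = []
    for field_name in MANDATORY_FIELDS:
        if rec.get(field_name) in (None, ""):
            problems.append(f"mandatory field missing: {field_name}")
    key = rec.get("record_id")
    if key is not None and not str(key).strip():
        problems.append("invalid key field: blank record_id")
    start, end = rec.get("job_start"), rec.get("job_end")
    if start and end and end < start:
        problems.append("logical inconsistency: job completed before it started")
    return problems

record = {"record_id": "A-102", "sex": "", "age": 34,
          "job_start": date(2001, 6, 1), "job_end": date(2001, 5, 1)}
print(check_record(record))
# ['mandatory field missing: sex',
#  'logical inconsistency: job completed before it started']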

  14. Ways of improving quality • Tackle quality at the source, not downstream in the lifecycle • Training data collectors is important to getting it right • Pursue continual improvement using a quality method

  15. HOW? Steps to Conduct an Assessment
  • Review the performance data
  • Examine data collection, maintenance, and processing procedures and controls
  • Verify the performance data against the data quality standards: reliability, precision, timeliness, validity, integrity
  • If data quality limitations are identified, take actions to address them: triangulate and supplement with data from multiple sources; report the limitations; revise the indicator
  • Document the assessment and the limitations in reports sent to the Mission, which in turn communicates the outcome of the assessment to the IP
  • Retain supporting documentation in files (e.g., photocopies of documents collected during the exercise)
  • If the data will be included in the annual report, disclose the DQA findings in the "data quality limitations" section of the Annual Report
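As a final illustration, a hypothetical sketch of the documentation step (the record format is illustrative, not an official USAID template): capture the outcome, limitations, and actions taken in a structured record, and check it against the three-year window.

# A hypothetical sketch of the documentation step: capture the assessment
# outcome and its limitations in a structured record that can be retained on
# file and disclosed in the annual report. Field names are illustrative.
from dataclasses import dataclass, field
from datetime import date

@dataclass
class DQARecord:
    indicator: str
    assessed_on: date
    limitations: list = field(default_factory=list)
    actions_taken: list = field(default_factory=list)

    def is_current(self, as_of: date) -> bool:
        """Externally reported data need a DQA within the prior three years."""
        return (as_of - self.assessed_on).days <= 3 * 365

dqa = DQARecord(
    indicator="Rate of student attrition, years 1 and 2, targeted schools",
    assessed_on=date(2002, 7, 15),
    limitations=["MOE enrollment analysis lags by roughly one school year"],
    actions_taken=["Triangulate with partner survey data",
                   "Disclose in the annual report's data quality limitations section"],
)
print(dqa.is_current(as_of=date(2004, 9, 1)))  # True -- within the 3-year window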
