
Using Macro and Micro Data Sets in Cross National Analyses in Developing Countries


Presentation Transcript


  1. Using Macro and Micro Data Sets in Cross National Analyses in Developing Countries
  Workshop 1: Unit 2
  Roy Carr-Hill, Universities of East London, London and York

  2. Objectives: Participants will be able to:
  • Understand the rationale for linking macro and micro data
  • Be aware of the typical factors that are likely to vary across countries or regions
  • Be aware of the limitations of the macro data sets that are currently available
  • Understand the differences between the International Micro Data Sets that are publicly available in terms of purpose, technical data quality and user-friendliness
  • [Describe some key quality control processes in the collection and processing of survey data]
  • Be able to critically assess the compatibility of data collected for similarly named variables at the micro, meso and macro levels

  3. Organisation of Presentation
  • Rationale for linking Macro and Micro Data
  • Availability of Data for Developing Countries
  • Limitations of Available Macro Data
  • Limitations of Available Micro Data
  • Monitoring Poverty/Socio-Economic Status through Household Surveys
  • Reliability and Bias of Self-Reports of Ill-Health
  • [Assessing the Quality of Survey Data]
  • Examples of Pitfalls in Interpretation

  4. I. Rationale
  Two main reasons:
  • Generalisability of findings obtained from a single – usually individual-level – sample
  • Improved understanding of an association observed – usually at country level
  Hence the structure:
  • Data Availability
  • Data Quality and Limitations
  • Measurement and Compatibility Issues
  • Errors of Interpretation

  5. II. Availability of Data Sets
  • Demographic and Health Surveys (DHS), sponsored by USAID (www.measuredhs.com)
  • Living Standards Measurement Surveys (LSMS) – World Bank (http://www.worldbank.org/lsms/)
  • Multiple Indicator Cluster Surveys (MICS) – UNICEF (http://www.unicef.org/statistics/index_24302.html)
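
To make the linking concrete, here is a minimal sketch in Python of attaching a country-level (macro) indicator to individual-level (micro) survey records. The variable names, country codes and indicator values are illustrative assumptions, not fields from the actual DHS, LSMS or MICS files, each of which has its own naming conventions.

    import pandas as pd

    # Micro data: one row per respondent, as might be extracted from a
    # DHS/LSMS/MICS-style file (hypothetical variable names and values).
    micro = pd.DataFrame({
        "country_iso3": ["UGA", "UGA", "SWZ"],
        "cluster_id":   [101, 102, 7],
        "literate":     [1, 0, 1],
    })

    # Macro data: one row per country, e.g. an indicator taken from an
    # international databank (illustrative values, not real figures).
    macro = pd.DataFrame({
        "country_iso3":   ["UGA", "SWZ"],
        "gni_per_capita": [900, 3600],
    })

    # A left join keeps every respondent and adds the country-level
    # context; validate="m:1" guards against duplicated country rows.
    linked = micro.merge(macro, on="country_iso3", how="left", validate="m:1")
    print(linked)

The same pattern extends to region- or district-level (meso) indicators, provided the geographic identifiers in the micro file can genuinely be matched to those used in the macro source.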

  6. Activity A: Variation across countries
  • In each group, draw up a short list of the major factors that you think affect literacy/smoking behaviour; then discuss which of these are most likely to vary between countries or between regions
  • Would it be sensible to carry out a combined cross-country analysis? Are the answers different in the two cases?

  7. III. Macro Data Sets: How reliable? Can they be compared?
  Debates about Poverty
  • Ever more sophistication in concepts (absolute/relative) and in the analysis of poverty (e.g. poverty mapping)
  • Whilst recognised as a problem, there is still very little attention to the quality of the basic data
  Millennium Development Goals
  • How will we know where we are in 2015?
  • How far away are we now?

  8. III.1 Population Denominator Data in Developing Countries
  • Only a few countries have functioning registration systems
  • Current population estimates are based on Coale-Brass-Demeny population models
  • As Chris Murray showed, in several countries the estimates are based on parameters from neighbouring countries
  NONE OF THE DATABASES COVERS THIS

  9. III.2 Quality of Data in Developing Countries
  • International recognition that the quality of statistics has deteriorated (e.g. Can’t Count Progress, ODI Review) BUT
  • The majority of information systems are donor funded with minimal national involvement
  • Routine administrative data systems are ‘thin’
  • Still little attention to assuring the quality of the basic data

  10. III.3 Deterioration of Statistical Systems
  • Lack of attention since independence to the ‘boring’ issue of the infrastructure of statistical systems; but
  • The current trend towards decentralisation usually means that district estimates are central to resource allocation

  11. III.4 Donor Funded Systems
  “The global statistical system is fragmented and characterised by poor inter-agency co-operation. Whilst more information is now available compared with previous years, this is usually through the medium of donor funded household surveys, which may by-pass domestic information systems and serve the needs of donors rather than developing countries themselves.” (Can’t Count Progress)

  12. III.5 Routine Administrative Data in Databanks
  • Population Censuses – very high coverage but usually insufficient information
  • Collection of data on use of Education and Health services rarely includes socio-demographic data
  • Collection of data on receipt of income support/welfare is also subject to bias on both the data-collector and supplier sides

  13. III.6 Quality of Basic Data
  • Entrenched systems
  • Little or no inspection or quality assurance
  • Weak capacity – numbers and qualifications
  • No local use of data – so no incentive to verify (Musgrove: data have to be used within 5 km to ensure reliability)

  14. III.7 Measurement of Income or Socio-Economic Status
  • Lack of agreement over whether to use absolute or relative poverty
  • Conventional levels such as US$1 or US$2 a day per person are used with little supporting evidence (see the worked sketch below)
  • Lack of relation to other measures of well-being, e.g. mortality, education
  • Usually based on household expenditure surveys, i.e. what is consumed in the market, omitting barter, black markets and exchange
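
As a worked illustration of the ‘US$1 or US$2 a day per person’ convention mentioned above, the sketch below computes a poverty head count from household expenditure data. All figures, including the line itself, are invented for the example.

    import numpy as np

    # Hypothetical household expenditure extract (illustrative values only)
    monthly_expenditure_usd = np.array([45.0, 120.0, 80.0, 20.0, 300.0])
    household_size          = np.array([6,    4,     5,    7,    3])

    # Per-capita expenditure per day (30.4 ~ average days per month)
    per_capita_per_day = monthly_expenditure_usd / household_size / 30.4

    poverty_line = 1.90           # a conventional "US$ a day"-type line
    poor = per_capita_per_day < poverty_line

    # Head-count ratio by household, and the person-weighted version
    # (share of people living in households below the line)
    print("household head count:      ", poor.mean())
    print("person-weighted head count:", np.average(poor, weights=household_size))

In practice the line would be expressed in local currency at purchasing-power parity, which is one of the compatibility issues this slide is pointing at.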

  15. III.8 Asset Indices
  The difficulty of asking about expenditure has led to the development of ‘asset indices’ (sketched below), but:
  • no information on quality and quantity of goods and services, including the reliability of the asset
  • distinguishing between household ownership, household-based assets and individual access
  • Routine Administrative Data
  • problems in generalising across different communities
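
A common way of constructing such an index is to take the first principal component of binary ownership indicators (the Filmer-Pritchett approach). The sketch below is a minimal version with invented asset names and ownership patterns; it shows the mechanics only, not the judgement about which assets to include.

    import numpy as np
    import pandas as pd
    from sklearn.decomposition import PCA
    from sklearn.preprocessing import StandardScaler

    # Hypothetical ownership indicators for six households
    assets = pd.DataFrame({
        "radio":       [1, 1, 0, 1, 0, 1],
        "tv":          [1, 0, 0, 1, 0, 0],
        "bicycle":     [1, 1, 0, 0, 1, 0],
        "piped_water": [1, 0, 0, 1, 0, 0],
    })

    # Standardise the indicators and use the first principal component
    # as the asset score for each household
    z = StandardScaler().fit_transform(assets)
    score = PCA(n_components=1).fit_transform(z).ravel()

    # Orient the score so that owning more assets means a higher value
    # (the sign of a principal component is arbitrary)
    if np.corrcoef(score, assets.sum(axis=1))[0, 1] < 0:
        score = -score

    assets["asset_score"] = score
    # Real surveys usually cut the score into quintiles; terciles are used
    # here only because the toy data set has six households
    assets["wealth_group"] = pd.qcut(score, q=3, labels=["low", "middle", "high"])
    print(assets.sort_values("asset_score"))

Note that the score says nothing about the quality or reliability of the assets, nor about who within the household actually has access to them, which is exactly the limitation listed above.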

  16. IV. Monitoring Social Conditions: Household Surveys
  MOST MACRO SOCIAL DEVELOPMENT DATA BASED ON HOUSEHOLD SURVEYS

  17. IV.1 Three Difficulties
  • Household surveys do not include the poorest of the poor (see next slide);
  • Consumption expenditure is a poor substitute for measuring standard of living;
  • The proxies used to measure poverty are almost impossible to compare over time because of changes in the reach of the formal economy, so that even within-country trends are very difficult to assess

  18. IV.2 Household Surveys: Omissions
  • Those not in households because they are homeless
  • Those who are in institutions
  • Mobile, nomadic or pastoralist populations
  • Many of those in fragile or disjointed or multiple-occupancy households

  19. IV.3 Limits of Self-Reporting
  • Focus on household rather than community – or intra-household – poverty
  • Known associations between income and relative reporting (sketched below), e.g.:
    • reported illness rates are higher in households with piped water supply, with inside toilets, with central heating, with TV;
    • reported illness increases, and reported deaths decrease, with mother’s educational level
  • Consumption and income poverty may not be the most salient (e.g. refugee camps, communication in Palestine)
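
One simple check on the reporting gradient described above is to tabulate self-reported illness against a household amenity. The records below are synthetic and only show the computation; a gradient found this way may reflect differential reporting rather than a real difference in morbidity.

    import pandas as pd

    # Synthetic respondent-level records (hypothetical variable names)
    df = pd.DataFrame({
        "piped_water":      [1, 1, 1, 1, 0, 0, 0, 0],
        "reported_illness": [1, 1, 0, 1, 0, 1, 0, 0],
    })

    # Reported-illness rate by amenity group
    print(df.groupby("piped_water")["reported_illness"].mean())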

  20. IV.4 Possible Solutions?
  • Extending the sample – need to be confident about the sampling frame
  • Attributing – poverty mapping assumes the initial relationship is valid and relies on outside experts (see the sketch below)
  • Modelling – comparison of trend estimates from different surveys, supplemented with local evidence about excluded groups
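
The ‘Attributing’ bullet refers to poverty mapping. The sketch below is a deliberately reduced version of the idea: fit a consumption model in the survey, apply it to census records, and aggregate predicted poverty by area. All data are simulated and the model is far simpler than a real small-area estimation (e.g. ELL-type) exercise; in particular it inherits the assumption that the survey relationship holds everywhere.

    import numpy as np
    import pandas as pd
    from sklearn.linear_model import LinearRegression

    rng = np.random.default_rng(0)

    # Survey: has both consumption and covariates (simulated)
    survey = pd.DataFrame({
        "hh_size":   rng.integers(1, 9, 200),
        "roof_iron": rng.integers(0, 2, 200),
    })
    survey["log_pc_exp"] = (4.5 - 0.10 * survey["hh_size"]
                            + 0.40 * survey["roof_iron"]
                            + rng.normal(0, 0.3, 200))

    model = LinearRegression().fit(survey[["hh_size", "roof_iron"]],
                                   survey["log_pc_exp"])

    # Census: covariates only, plus an area identifier (simulated)
    census = pd.DataFrame({
        "district":  rng.integers(0, 5, 1000),
        "hh_size":   rng.integers(1, 9, 1000),
        "roof_iron": rng.integers(0, 2, 1000),
    })
    census["pred_log_exp"] = model.predict(census[["hh_size", "roof_iron"]])

    # Illustrative monthly per-capita poverty line on the log scale
    line = np.log(1.90 * 30.4)
    print(census.assign(pred_poor=census["pred_log_exp"] < line)
                .groupby("district")["pred_poor"].mean())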

  21. Activity: Suitability of data from surveys
  Issues to consider are:
  • Definitions and time periods (obviously)
  • Context and purpose of data collection, especially in a micro study
  • Coverage of the macro data compared to the sample you are looking at
  • If your own study uses secondary data sources, you should explore the extent to which the technical reports on those data sources cover these issues of coverage and excluded populations

  22. V. Different definitions of quality
  • ‘Fitness for use’
  • ISO norm 8402: ‘Totality of features and characteristics of a product or service that bear on its ability to satisfy stated or implied needs’
  • Process (throughput) versus product (output) quality versus customer satisfaction (outcome?)

  23. V.2 ESS Dimensions of Quality
  • Relevance: are the data what the user wanted?
  • Accuracy: is the figure reliable?
  • Punctuality and timeliness: on time, according to a pre-determined schedule?
  • Accessibility and Clarity: understandable?
  • Comparability: across countries?
  • Coherence: with other data?
  • (for details see HO 1A, 1B)

  24. V.3 DQAF of the IMF
  • DQAF: developed for National Accounts & Balance of Payments data (IMF webpage)
  • Pre-requisites of quality, e.g. a legal and institutional environment supportive of statistics, and resources adequate for the needs of the statistical programmes
  • Components: Assurances of Integrity; Methodological Soundness; Accuracy and Reliability; Serviceability; Accessibility
  • Comparisons/overlaps with ESS (see Annexes for the comparison with ESS)

  25. V.4 Concerns about Data Quality: Official and Administrative Data
  • Focus on recording activity rather than processes or outcomes
  • Motivation to collect reliable data
  • Hierarchy of data collectors
  • Pressures to manipulate data

  26. V.5 Quality of Surveys
  • Were standard contracts issued? (HO 4)
  • Do they have in-house quality control? (HO 5)
  • Were power calculations of sample size carried out? (see HO 6 on Swaziland, and the sketch below)
  • Were the questions appropriate to the problem?
  • Did they follow appropriate procedures? (see HO 7)
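
For the power-calculation bullet, the sketch below shows the standard sample-size formula for estimating a proportion with a given precision, inflated by a design effect for cluster sampling and by expected non-response. The input values are illustrative assumptions, not the figures from the Swaziland handout (HO 6).

    import math

    p    = 0.30    # anticipated prevalence of the indicator (assumption)
    moe  = 0.05    # desired margin of error at 95% confidence
    z    = 1.96
    deff = 2.0     # assumed design effect for a clustered sample
    nonresponse = 0.10

    n_srs = z**2 * p * (1 - p) / moe**2                # simple random sampling
    n = math.ceil(n_srs * deff / (1 - nonresponse))    # clustering + non-response

    print(round(n_srs), n)    # about 323 under SRS, 718 after adjustment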

  27. V.6 Data Collection: Instrumentation
  • Instrumentation
    • Has a checksum facility or similar been included? (sketched below)
    • Has the instrument been piloted in a variety of contexts?
  • Piloting procedures of data collection (not just for the question content of surveys)
    • Team organisation
    • Timing and cost
    • Acceptability to interviewees
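
The ‘checksum facility’ item can be as simple as a check digit appended to each questionnaire identifier so that keying errors are caught at data entry. The sketch below uses a weighted mod-11 check digit (ISBN-10 style); the ID format is hypothetical.

    def check_digit(body: str) -> str:
        """Weighted mod-11 check digit for a numeric ID string."""
        weights = range(len(body) + 1, 1, -1)      # e.g. 7, 6, ..., 2 for six digits
        total = sum(w * int(d) for w, d in zip(weights, body))
        check = (11 - total % 11) % 11
        return "X" if check == 10 else str(check)

    def is_valid(full_id: str) -> bool:
        """True if the final character is the correct check digit."""
        return check_digit(full_id[:-1]) == full_id[-1]

    qid = "104237"                        # hypothetical questionnaire number
    full = qid + check_digit(qid)         # the ID printed on the form
    print(full, is_valid(full))           # 1042378 True
    print(is_valid("104273" + full[-1]))  # two digits transposed -> False

For ID bodies of up to nine digits this scheme catches every single-digit keying error and every adjacent transposition.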

  28. V.7 Data Collection: Field Work
  • Training
    • Assess commitment of interviewers to collecting reliable data
  • During Fieldwork
    • Procedures for logging work, ensuring correct (unique) identification of materials (sketched below)
  • Audit Trail
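
As part of logging fieldwork, a short script can flag duplicate questionnaire identifiers in a batch before it goes to data entry, which is one concrete piece of the audit trail. The column names and IDs below are hypothetical.

    import pandas as pd

    log = pd.DataFrame({
        "questionnaire_id": ["A101", "A102", "A102", "A104"],
        "interviewer":      ["T1",   "T1",   "T2",   "T2"],
    })

    # Every row that shares an ID with another row needs to be resolved
    duplicates = log[log["questionnaire_id"].duplicated(keep=False)]
    print(duplicates)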

  29. V.8 Processing and Analysis
  Processing
  • Training of those entering data
  • Double entry of data, at least for a sample (sketched below)
  • Independence from administrative intervention
  Analysis
  • Appropriateness of the techniques used
  • Plausibility of the results
  • Is there any confirmatory data? (HO 8)
  AND ALWAYS: Is there an Audit Trail?
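
For the double-entry bullet, the sketch below compares two independently keyed versions of the same records and lists only the cells that disagree, which becomes the worklist for checking back against the paper forms. The file layout and variable names are hypothetical.

    import pandas as pd

    entry1 = pd.DataFrame({"hh_id": [1, 2, 3], "age": [34, 52, 41], "literate": [1, 0, 1]})
    entry2 = pd.DataFrame({"hh_id": [1, 2, 3], "age": [34, 25, 41], "literate": [1, 0, 0]})

    e1 = entry1.set_index("hh_id").sort_index()
    e2 = entry2.set_index("hh_id").sort_index()

    # DataFrame.compare keeps only the cells where the two entries differ
    print(e1.compare(e2))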

  30. VI Examples of Pitfalls in Interpretation
  • Comparison of educational performance of pupils in Anglophone and Francophone educational systems in Vanuatu
  • Comparing inequalities in income within a country with inequalities of health within a country and GNP per capita
  • Comparing Literacy achievement across regions of Uganda
  • Bias in reporting health across 110 countries in DHS Surveys
