The Quality Metadata Subsystem in the Czech Statistical Office. European Conference on Quality in Official Statistics, 4-6 May 2010, Helsinki. Czech Statistical Office, Jitka Prokop
Content • Background • Quality Metadata System • Quality Monitoring • Quality Assessment • Lessons Learned • Conclusions
1. Background: SMS-QUALITY features and functions • Architecture of metadata on quality • Links to other SMS subsystems • Quality monitoring • Evaluation of quality, self-assessment and auditing tools • Support for quality reporting (ESS and EFQM reports) or auditing
1. Background: Links between SMS subsystems • SMS-CLASS, SMS-VAR, SMS-TASKS, SMS-USERS, SMS-RESP, SMS-QUALITY, SMS-DISSEM, SMS-SERIES • Statistical task - a set of statistical activities needed to fulfil a user's request for statistical information. It can be composed of one or more statistical surveys or similar statistics.
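To make the data model behind this slide concrete, here is a minimal Python sketch of a statistical task composed of surveys and linked to other SMS subsystems. The class and field names are hypothetical illustrations, not part of the actual SMS design.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class StatisticalSurvey:
    """One survey (or similar statistics) contributing to a task."""
    survey_id: str
    name: str

@dataclass
class StatisticalTask:
    """A set of statistical activities fulfilling a user's request
    for statistical information (cf. SMS-TASKS)."""
    task_id: str
    title: str
    surveys: List[StatisticalSurvey] = field(default_factory=list)
    # Illustrative references into other SMS subsystems
    classifications: List[str] = field(default_factory=list)  # SMS-CLASS codes
    variables: List[str] = field(default_factory=list)        # SMS-VAR identifiers
    respondents: List[str] = field(default_factory=list)      # SMS-RESP identifiers
```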
1. Background: SMS subsystems & SMS-QUALITY • The SMS subsystems • Enable the definition of metadata • Provide values into SMS-QUALITY views • Enable storage of values inserted via SMS-QUALITY
2. Quality Metadata System: SMS-QUALITY architecture, based on • The ESS concept of quality • Quality criteria / components / dimensions • Quality and performance indicators • The European Self-Assessment Checklist for Survey Managers (DESAP) • The Statistical Business Process Model • Identification of Q-attributes for sub-processes
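To illustrate how a Q-attribute can be tied both to an ESS quality dimension and to a sub-process of the statistical business process model, here is a minimal Python sketch. The enumeration values, codes and the example attribute are assumptions for illustration, not the actual SMS-QUALITY catalogue.

```python
from dataclasses import dataclass
from enum import Enum

class QualityDimension(Enum):
    # Simplified list of ESS quality criteria / dimensions
    RELEVANCE = "relevance"
    ACCURACY = "accuracy"
    TIMELINESS_PUNCTUALITY = "timeliness and punctuality"
    ACCESSIBILITY_CLARITY = "accessibility and clarity"
    COMPARABILITY = "comparability"
    COHERENCE = "coherence"

@dataclass
class QAttribute:
    """A quality/performance indicator attached to a sub-process."""
    code: str                    # hypothetical internal code
    name: str
    dimension: QualityDimension
    sub_process: str             # sub-process of the business process model

unit_response_rate = QAttribute(
    code="A4",
    name="Unit response rate",
    dimension=QualityDimension.ACCURACY,
    sub_process="Data collection",
)
```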
2. Quality Metadata System: Quality Form Map • The QFM is • A defined structure of Q-attributes (metadata items) • The base of SMS-QUALITY • Differentiated for different types of statistics (to some extent) • The SW application for the QFM • Defines Q-attributes and/or links into other SMS subsystems • Enables manual input of values • Provides views into the DWH • Values of Q-attributes are stored in the DWH
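The slide implies a simple storage pattern: the QFM defines which Q-attributes exist and where their values come from, while the DWH holds the values per statistical task and reference period. The Python sketch below only illustrates that pattern; the field names and the in-memory "DWH" dictionary are assumptions, not the real implementation.

```python
from dataclasses import dataclass
from typing import Dict, Tuple, Union

@dataclass
class QFMEntry:
    """One metadata item defined in the Quality Form Map."""
    attribute_code: str      # which Q-attribute this entry defines
    statistics_type: str     # the QFM can differ (to some extent) by type of statistics
    manual_input: bool       # True if values are entered via the SW application
    source_subsystem: str    # e.g. "SMS-VAR" when the value comes from another subsystem

# Values of Q-attributes keyed by (task, period, attribute) - stands in for the DWH
QValueStore = Dict[Tuple[str, str, str], Union[float, str]]

dwh: QValueStore = {}
dwh[("LFS", "2010Q1", "A4")] = 0.87   # illustrative unit response rate value
```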
2. Quality Metadata System: Levels of Q-attributes • A. Most stable Q-attributes • Relate to the whole statistical task, survey or phase of the process • e.g. key users, information on training of staff, methodology • B. Q-attributes related to processing in a concrete period • e.g. unit response rate, extent of sample and frame, punctuality • C. Q-attributes related to a key statistical variable • In a generally defined breakdown • e.g. coefficient of variation, item response rate • The levels of Q-attributes relate to the content and stability of their values.
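A minimal way to express the three levels in code, building on the Q-attribute sketches above: the more volatile the level, the finer the key under which its values are stored. The level letters follow the slide; everything else is an illustrative assumption.

```python
from enum import Enum

class QAttributeLevel(Enum):
    A = "stable: whole task / survey / phase"           # e.g. key users, methodology
    B = "per processing period"                          # e.g. unit response rate, punctuality
    C = "per key statistical variable and breakdown"     # e.g. coefficient of variation

def storage_key(level: QAttributeLevel, task: str, period: str = "",
                variable: str = "", breakdown: str = "") -> tuple:
    """Illustrative: choose the granularity of the storage key by level."""
    if level is QAttributeLevel.A:
        return (task,)
    if level is QAttributeLevel.B:
        return (task, period)
    return (task, period, variable, breakdown)
```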
3. Quality Monitoring: Aspects • Quality monitoring covers • Collection of data / metadata • Calculation of Q-attributes • Checks against expected values / scales (future: comparisons) • Quality monitoring is based on • Input variables, incl. the "history of changes" • Variables specially designed for quality monitoring • Related to the questionnaire or interview • Related to concrete variables • Results of Q-attributes • During a sub-process / phase of the statistical process • At the end of the sub-process / phase • After finishing the whole statistical process
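As an example of the "calculate, then check against an expected scale" step, here is a short Python sketch computing a unit response rate and flagging it against a threshold. The threshold value and function names are illustrative assumptions, not CZSO rules.

```python
def unit_response_rate(responding_units: int, eligible_units: int) -> float:
    """Share of eligible units that actually responded."""
    if eligible_units <= 0:
        raise ValueError("eligible_units must be positive")
    return responding_units / eligible_units

def check_against_scale(value: float, warning_below: float = 0.80) -> str:
    """Compare a monitored Q-attribute with an expected scale (illustrative threshold)."""
    return "OK" if value >= warning_below else "WARNING: below expected scale"

rate = unit_response_rate(responding_units=4350, eligible_units=5000)
print(f"Unit response rate: {rate:.2%} -> {check_against_scale(rate)}")
```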
4. Quality Assessment: The Quality Assessment Guidance (QAG) • Purpose, applicability • Support for process management • Support for high-level decisions • Support for quality reporting • Self-assessment • Auditing • Levels: assessment of • Quantitative and qualitative results of Q-attributes • Statistical survey / statistical task
4. Quality Assessment: Ways of Assessment • Categorical assessment • Averages of individual results • Textual assessment and summary • Expert commentaries (on issues suggested in the QAG) • Strengths • Weaknesses • Proposals of concrete actions and priorities
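A minimal sketch of a categorical assessment obtained by averaging individual results. The 1-5 scale and the category labels are assumptions for illustration only; the deck does not specify the actual scale.

```python
from statistics import mean
from typing import Dict, List

# Hypothetical categorical scale (1 = poor ... 5 = excellent)
LABELS = {1: "poor", 2: "weak", 3: "acceptable", 4: "good", 5: "excellent"}

def categorical_assessment(scores: List[int]) -> str:
    """Average individual results and map the average back to a category."""
    avg = mean(scores)
    return f"{avg:.1f} ({LABELS[round(avg)]})"

# Individual results of Q-attributes for one survey (illustrative values)
results: Dict[str, int] = {
    "unit response rate": 4,
    "punctuality": 5,
    "coefficient of variation": 3,
}
print(categorical_assessment(list(results.values())))   # e.g. "4.0 (good)"
```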
4. Quality Assessment: Levels of Q Assessment • The structure of QA follows the ESS quality criteria and covers the following levels: • Key statistical variable in particular breakdowns • Statistical variable as an aggregate (average) of the breakdowns, or the statistical survey • Set of similar indicators, quality of a sub-process, quality sub-criteria or criteria • The audited statistics as a whole
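To illustrate the bottom-up roll-up across these levels, a short Python sketch: scores per breakdown are averaged into a variable-level score, then into a criterion-level score, then into an overall score. All numbers and names are illustrative assumptions.

```python
from statistics import mean
from typing import Dict

# Level 1: assessment of a key variable in particular breakdowns (illustrative scores)
breakdown_scores: Dict[str, float] = {"by region": 4.0, "by sex": 4.5, "by age group": 3.5}

# Level 2: the variable as an aggregate (average) of its breakdowns
variable_score = mean(breakdown_scores.values())

# Level 3: a quality criterion aggregated over similar indicators/variables
criterion_score = mean([variable_score, 4.2, 3.8])

# Level 4: the audited statistics as a whole
overall_score = mean([criterion_score, 4.0])
print(f"variable={variable_score:.2f}, criterion={criterion_score:.2f}, overall={overall_score:.2f}")
```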
5. Lessons Learned: 1. Links to the phases of the SBP • Suggestions -> Collection -> Calculations -> Assessment -> Feedback and actions
5. Lessons Learned: 2. Cross-sectional aspects and 3. Time coordination • Cross-sectional aspects • Ensure applicability for different types of statistics • Avoid duplication, arrange links with other SMS subsystems • Involve experts: • the SMS-QUALITY project team • quality methodologists • subject-matter statisticians • ICT experts, members of other project teams • Time coordination • The design and implementation of SMS-QUALITY and the other SMS subsystems must be mutually coordinated – Committee for the Redesign of the SIS and SMS
5. Lessons Learned: 4. The SMS-QUALITY project team • Appointed by top management • Suggests the schedule of SMS-QUALITY activities • Regularly reports to top management • Designed the QF Map (the architecture of SMS-QUALITY) and consequently • Proposes the SW application (content, functions), incl. updates • Coordinates implementation and cooperation among the activities of the involved experts
5. Lessons Learned: 5. Role of subject-matter statisticians and methodologists (a) Testing phase • Quality methodologists • Define Q-attributes and explanatory notes in the SW application • Test the SW application • Administer SMS-QUALITY, the QF Map and the QAG • Provide scripts for the calculation of quality indicators • Suggest scales for quality assessment • Subject-matter statisticians • Provide data for tests • Adjust scales for quality assessment • Manage routine quality monitoring and assessment
5. Lessons Learned: 5. Role of subject-matter statisticians and methodologists (b) Full implementation • Quality methodologists manage • Quality methodology updates (in line with ESS developments) • Support for subject-matter statisticians • Administration and updates of SMS-QUALITY • Subject-matter statisticians manage • Collection of Q-attributes (automatic or manual) • Quality assessment and self-assessment • Approval of assessment results
5. Lessons Learned: 6. Compliance with the ESS quality framework • SMS-QUALITY should be designed as a flexible tool that allows easy methodology updates, taking into account developments at the ESS level. • The application software should have the necessary flexibility in its collection, monitoring and assessment procedures.
6. Conclusions • Development of SMS-QUALITY has been scheduled with high priority for the next two years. • Further progress in the development of SMS-QUALITY depends • On available human and financial resources, since the same experts are involved both in the actual production of quality reports and in the development of SMS-QUALITY. • On the progress of the other SMS subsystems. • Developments in quality methodology at both national and international levels must be taken into account.
Thank you for your attention. jitka.prokop@czso.cz