

Is This Data Fit to Serve? Defining Acceptance Criteria for Completeness of Digital Mapping Products. Candice Kasprzak, GISP, Project Manager & David Hart, CP, President, Continental Mapping Consultants, Inc., Madison, WI. ASPRS/MAPPS Conference, November 18, 2009, San Antonio, TX.



Presentation Transcript


  1. Is This Data Fit to Serve? Defining Acceptance Criteria for Completeness of Digital Mapping Products. Candice Kasprzak, GISP, Project Manager & David Hart, CP, President. Continental Mapping Consultants, Inc., Madison, WI. ASPRS/MAPPS Conference, November 18, 2009, San Antonio, TX

  2. Who we are and why we developed this paper • Continental Mapping is a production photogrammetry firm • We provide foundational geospatial data from satellite, airborne and terrestrial sensors (imagery and lidar) • We process imagery and derive topographic and planimetric mapping data • Photogrammetrists and image/geospatial analysts make up the bulk of our workforce • We develop mapping products and GIS datasets for federal/state/local government clients and for utility, manufacturing and other private clients

  3. Background • We have developed an understanding and PROCESS for Mapping Specifications Development • See a need to educate users on: • The process of developing mapping specifications that are Fit for a Specific Use • Defining Appropriate Acceptance Evaluation Criteria

  4. Mapping Specifications Are Hard Because… Developing a Digital Mapping Specification is Developing an Abstraction This concept is misunderstood and often lost. Why?

  5. A Map is an Abstraction • The strategy of simplification (abstraction) applies to any map • Focus of this presentation is limited to vector mapping products derived from imagery (planimetric data) • Planimetric mapping specs are more difficult than topographic or imagery specs because they require more interpretation (far more abstraction decisions)

  6. It is hard to evaluate an abstraction objectively Acceptance criteria for digital mapping products are difficult to define because a map is a: • Generalization & Simplification • Abstraction • the process or result of generalization by reducing the information content of a concept or an observable phenomenon, typically to retain only information which is relevant for a particular purpose

  7. Mapping Specs in a Digital Age • The Past: hard copy mapping products were 2D cartographic representations • The Present: the digital age of mapping introduced: • 3D • Topology • Multi-scale display • Data association (GIS) • Spatial relationships (such as contiguity and connectivity) • The Future: interactive geo-enabled content: multi-scale, multi-temporal, real-time (currency, clarity, content improvements)

  8. Example Mapping Spec Guidance • Interpretive • Subjective • Evaluation for acceptance needs to focus on its Fitness for Use

  9. Existing Standards and Evaluation Criteria are Inadequate or Incomplete • Many spatial data standards relate to Positional Accuracy • Some existing methodologies relate to evaluation of quality and completeness, but… • Many of these are automated topology and attribute checks • These processes have no mechanism for evaluating abstraction objectively, which leads to conflict between data creators and reviewers, and to higher costs

  10. So what is the solution? • The Problem • Subjective review of abstract mapping data leads to conflict, higher cost, and a lost focus on “what’s important on this map” • The Solution • Develop a method that provides guidance on spatial sampling methods that are systematic, repeatable, and eliminate total inspection • Develop a method that provides a framework for evaluation of data quality and completeness • Provide a measure of a digital map’s Fitness for Use

  11. IGSM – Integrated Geospatial Sampling Model (Gillies) • Combination of two federally accepted Quality Management standards • ISO 19114 • MIL-STD 1916 • Charles Gillies’ paper – Geospatial Statistical Quality Management: Integration of MIL-STD 1916 and ISO 19114:2003 • Uses statistical process control (SPC) methods • Systematic & repeatable • Eliminates total inspection

  12. ISO 19114 • International Standard for geographic information quality evaluation procedures and data characteristics • Provides guidance on random spatial sampling framework • Outlines Data Quality Elements for geospatial data

  13. ISO 19114 – Data Quality Elements • 5 data quality elements and 16 sub-elements to measure various aspects of geospatial data. • The methodology focuses on four: • Completeness – Omission • Completeness – Commission • Thematic Accuracy – Classification Correctness • Thematic Accuracy – Non-Quantitative Attribute Correctness

  14. ISO 19114 – Data Quality Elements • Completeness – Feature-Based Element • Commission – Excess data in a dataset • Over-collection: either collected features that weren’t houses, or over-collected a feature, e.g. collected all buildings within a school when only the boundary was needed • Omission – Data absent from a dataset • Under-collection: missed a feature or group of features, e.g. missed a house in the woods, or collected a BUA (built-up area) instead of each individual house within it
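The commission/omission counts above can be sketched in code. This is a minimal illustration, assuming collected features can be matched to the reference dataset by a shared identifier (real evaluations would match features spatially); the feature IDs are hypothetical:

```python
# Sketch of ISO 19114 completeness sub-elements (omission / commission),
# assuming feature IDs are matchable between datasets.

def completeness_counts(reference, collected):
    """Compare a collected dataset against the reference (universe of discourse)."""
    reference, collected = set(reference), set(collected)
    omission = reference - collected      # data absent from the dataset
    commission = collected - reference    # excess data in the dataset
    return {
        "omission_count": len(omission),
        "commission_count": len(commission),
        "omission_rate": len(omission) / len(reference),
    }

# Hypothetical lot: two houses missed, one non-building over-collected
ref = {"bldg_01", "bldg_02", "bldg_03", "bldg_04", "bldg_05"}
got = {"bldg_01", "bldg_03", "bldg_05", "shed_99"}
print(completeness_counts(ref, got))  # omission 2, commission 1, rate 0.4
```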

  15. Completeness Example

  16. Completeness Example

  17. Completeness Example

  18. ISO 19114 – Data Quality Elements • Thematic Accuracy – Attribute-Based Element • Classification Correctness – comparison of the classes assigned to features or their attributes to a universe of discourse (ground truth or reference data set) • e.g. a secondary stream (class B) is classified as a tertiary stream (class C) • Non-Quantitative Attribute Correctness – correctness of non-quantitative attributes • e.g. the Name attribute “Main St” on the road layer matches “Main St” on the universal/reference map
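The two thematic-accuracy sub-elements amount to comparing assigned classes and name attributes against the reference. A sketch under the same matched-ID assumption, with hypothetical stream features:

```python
# Sketch of the two thematic-accuracy sub-elements: classification
# correctness and non-quantitative attribute correctness.

def thematic_accuracy(collected, reference):
    """Return (class correctness, name correctness) as fractions of features."""
    n = len(collected)
    class_ok = sum(f["class"] == reference[fid]["class"]
                   for fid, f in collected.items())
    name_ok = sum(f["name"] == reference[fid]["name"]
                  for fid, f in collected.items())
    return class_ok / n, name_ok / n

ref = {"s1": {"class": "B", "name": "Main St"},
       "s2": {"class": "C", "name": "Elm St"}}
got = {"s1": {"class": "C", "name": "Main St"},  # class-B stream mislabeled C
       "s2": {"class": "C", "name": "Elm St"}}
print(thematic_accuracy(got, ref))  # (0.5, 1.0)
```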

  19. ISO 19114 – Sampling Plans

  20. ISO 19114 – Evaluation Measures • How do you measure whether a data element is acceptable? • Nominal – use of a classification scheme: wet/dry, yes/no, pass/fail • Ordinal – ranking scale: grade (A, B, C, D) or numerical rating (1–5) • Interval – assign an exact number: elevation, temperature • Ratio – percentage of features not meeting a criterion
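A ratio measure is the easiest to mechanize: tally nominal pass/fail results and compare the failure percentage to an acceptance threshold. The 5% threshold below is illustrative, not from the standard:

```python
# Sketch of a ratio evaluation measure built on nominal (pass/fail) checks.

def ratio_measure(results, max_fail_pct=5.0):
    """results: list of pass/fail booleans from inspecting sampled features."""
    fail_pct = 100.0 * results.count(False) / len(results)
    return fail_pct, fail_pct <= max_fail_pct   # (measure, lot accepted?)

inspected = [True] * 48 + [False] * 2           # 2 of 50 sampled features failed
print(ratio_measure(inspected))                 # (4.0, True)
```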

  21. ISO 19114 – Sampling Strategy

  22. MIL-STD 1916 • AKA: ISO 21247:2005 • Department of Defense accepted methodology for sampling data • Guide for creating sample population sizes

  23. MIL-STD 1916 – Verification Levels & Code Letters • Combination Defines the Population Lot Size and whether or not a sample or full inspection evaluation method is needed • Switching Procedures for Tightened and Reduced lot sizes based on past performance

  24. MIL-STD 1916 – Verification Levels (VL) & Code Letters (CL) Can use VL as a guide for fitness for use of certain features. More important features will have a higher VL than less important ones. For Tightened inspection, move to the left one VL; for Reduced, move to the right one VL
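The switching idea above can be expressed as a one-step shift along the verification levels. Treat this as an illustrative assumption based on the slide's description, not a reproduction of the MIL-STD 1916 tables: the code assumes VL-VII is the most stringent and that "left" in the table means stricter.

```python
# Sketch of VL switching: tightened inspection shifts one level stricter,
# reduced inspection one level looser. Ordering is an assumption.

VLS = ["I", "II", "III", "IV", "V", "VI", "VII"]  # assumed increasing stringency

def switch_vl(current, mode):
    """Return the verification level after a switching decision."""
    i = VLS.index(current)
    if mode == "tightened":
        return VLS[min(i + 1, len(VLS) - 1)]   # stricter, capped at VII
    if mode == "reduced":
        return VLS[max(i - 1, 0)]              # looser, floored at I
    return current                             # normal inspection: unchanged

print(switch_vl("IV", "tightened"))  # 'V'
```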

  25. Integrated Geospatial Sampling Model - Methodology • Step 1 – Identify features and data quality elements and sub-elements • Step 2 – Determine the appropriate evaluation measure for each feature type and DQ element (nominal, ratio, ordinal) • Step 3 – Define Sampling Strategy • Population Definition • Sampling Procedure • Verification Level

  26. Integrated Geospatial Sampling Model - Methodology • Step 4 – Determine Evaluation Method – sample or full inspection – based on VL/CL • Step 5 – Inspect • Step 6 – Report • These steps populate a table for reporting measures
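The sampling/inspect/report steps above can be sketched end to end. A minimal illustration under simple assumptions: the sample size and the stand-in `inspect()` rule are invented for the example, not drawn from the MIL-STD 1916 tables, and a systematic (every k-th feature) draw stands in for the sampling procedure:

```python
# Sketch of IGSM Steps 3-6: draw a systematic sample from the feature
# population, inspect each sampled feature, and report the measures.

def systematic_sample(population, n):
    """Every k-th feature: systematic, repeatable, avoids total inspection."""
    step = max(1, len(population) // n)
    return population[::step][:n]

def inspect(feature):
    # Stand-in for a human or automated pass/fail check of one feature.
    return feature % 7 != 0

population = list(range(1, 201))          # hypothetical lot of 200 features
sample = systematic_sample(population, 20)
results = [inspect(f) for f in sample]
report = {"lot_size": len(population),
          "sample_size": len(sample),
          "failures": results.count(False)}
print(report)
```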

  27. Integrated Geospatial Sampling Model - Methodology

  28. Where to go next • Develop standards for collection (based on fitness-for-use categories) – not reinventing the wheel each time • Develop beta software for aiding in sampling population sizes and reporting findings • Fully test and report on findings – is this a viable methodology for QA/QC of geospatial data?

  29. Conclusions • The Problem • Subjective review of abstract mapping data leads to conflict, higher cost, and a lost focus on “what’s important on this map” • The Solution • A method that provides guidance on spatial sampling methods that are systematic, repeatable, and eliminate total inspection • A method that provides a framework for evaluation of data quality and completeness, and • Provides a measure of a digital map’s Fitness for Use

  30. Questions? • Our contact info: • David Hart, CP, President, dhart@continentalmapping.com • Candice Kasprzak, GISP, Project Manager, ckasprzak@continentalmapping.com
