EOSDIS Survey Overview Carol.L.Boquist@nasa.gov HDF and HDF-EOS Workshop Nov. 4, 2009
Why we survey NASA’s Earth Observing System Data and Information System provides data products and associated services for interdisciplinary studies to a diverse user community. Beginning in 2004, the Earth Science Data and Information System (ESDIS) Project has conducted annual surveys to measure user satisfaction using the American Customer Satisfaction Index (ACSI). The results are one of the ESDIS Project’s metrics sent to the Office of Management and Budget (OMB).
The Easy Part We start by setting up an Interagency Agreement between NASA and the Department of the Interior. The survey is conducted by CFI Group under contract with the Federal Consulting Group (FCG) of the Department of the Interior’s National Business Center. The FCG is the Executive Agent in government for the ACSI. It works with OMB to ensure adherence to applicable Federal regulations such as the Privacy Act and the Paperwork Reduction Act.
EOSDIS Data Centers
• ASF DAAC: Sea Ice, Polar Processes
• SEDAC: Human Interactions in Global Change
• CDDIS: Solid Earth
• LP DAAC: Land Processes & Features
• GES DISC: Atmos Dynamics, Strato Composition, Hydrology, Biosphere, Radiance
• OBPG: Ocean Biology & Biogeochemistry
• NSIDC DAAC: Cryosphere, Polar Processes
• MODAPS: Atmosphere
• ORNL DAAC: Biogeochemical Dynamics, EOS Land Validation
• ASDC: Radiation Budget, Clouds, Aerosols, Tropo Chemistry
• PO.DAAC: Ocean Circulation, Air-Sea Interactions
• GHRC: Hydrological Cycle & Severe Weather
EOSDIS Data Centers
• Alaska Satellite Facility DAAC (ASF DAAC), http://www.asf.alaska.edu: Synthetic Aperture Radar (SAR) Products, Sea Ice, Polar Processes, Geophysics
• Land Processes DAAC (LP DAAC), http://LPDAAC.usgs.gov: Surface Reflectance, Land Cover, Vegetation Indices
• Physical Oceanography DAAC (PO.DAAC), http://podaac.jpl.nasa.gov: Sea Surface Temperature, Ocean Winds, Circulation and Currents, Topography and Gravity
• Global Hydrology Resource Center (GHRC), http://ghrc.nsstc.nasa.gov/: Hydrologic Cycle, Severe Weather Interactions, Lightning, Atmospheric Convection
• Oak Ridge National Laboratory DAAC (ORNL DAAC), http://daac.ornl.gov: Biogeochemical Dynamics, Ecological Data, Environmental Processes
• National Snow and Ice Data Center DAAC (NSIDC DAAC), http://nsidc.org: Snow and Ice, Cryosphere, Climate Interactions, Sea Ice
EOSDIS Data Centers
• Atmospheric Science Data Center (ASDC), LaRC, http://eosweb.larc.nasa.gov: Radiation Budget, Clouds, Aerosols, Tropospheric Chemistry
• Crustal Dynamics Data Information System (CDDIS), http://cddis.gsfc.nasa.gov: Space Geodesy
• Ocean Biology Processing Group (OBPG), http://oceancolor.gsfc.nasa.gov: Ocean Biology, Sea Surface Temperature, Biogeochemistry
• MODAPS Level-1 Atmospheres Archive and Distribution System (MODAPS LAADS), http://ladsweb.nascom.nasa.gov: MODIS Level-1 and Atmosphere Data Products
• Socioeconomic Data and Applications Center (SEDAC), http://sedac.ciesin.columbia.edu: Human Interactions, Land Use, Environmental Sustainability, Geospatial Data, Multilateral Environmental Agreements
• Goddard Earth Sciences Data and Information Services Center (GES DISC), http://disc.gsfc.nasa.gov: Atmospheric Composition, Atmospheric Dynamics, Global Precipitation, Hydrology, Solar Irradiance, Global Modeling, Multi-Sensor Research Products
Who do we survey?
• E-mail addresses are obtained from:
  • Orders from registered users
  • Inquiries
  • Anonymous FTP access
  • Data center lists
• The number of e-mail addresses per data center varies greatly because the data centers vary greatly:
  • Discipline-specific
  • Number or frequency of available products
  • Size of user communities
  • Number of anonymous FTP users
  • Restricted access or cost
• Multiple invitations to the same user:
  • A user with multiple e-mail addresses will receive multiple invitations
  • A user accessing multiple data centers will receive multiple invitations
Survey Questions
• The survey contains comment fields and several types of questions:
  • Demographic questions
  • Questions to aid recall
  • Rating questions for the ACSI and EOSDIS models
  • Non-modeled rating questions
• The questions were originally based on a previous survey of the members of the User Working Group for each of the data centers.
• Formulating the questions is a combined effort of project personnel, the EOSDIS User Services Working Group (USWG), and the CFI Group.
• The challenge is to keep the survey short enough that users are willing to respond, yet long enough to reflect the complexity of the data and the differences among data centers and, more importantly, to capture the needs of the diverse user community.
How did EOSDIS Compare in 2008? The ACSI is the #1 national economic indicator of customer satisfaction. The ACSI is produced by the Stephen M. Ross School of Business at the University of Michigan, in partnership with the American Society for Quality (ASQ) and the international consulting firm CFI Group. The ACSI is used to measure over 40 industries and 200 organizations covering 45% of the U.S. economy. Over 70 U.S. Federal Government agencies have used the ACSI to measure more than 120 programs/services.
Survey Results 2004-2008 Although the survey contains over 50 questions, the ACSI is based on responses to only three questions. N = number of responses used (sample size). See Slide 9 for confidence levels.
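To make the three-question basis of the index concrete, here is a minimal sketch of how 1-10 survey ratings can be rescaled into a 0-100 index. The equal weights are purely illustrative assumptions; the actual ACSI weights the three questions using coefficients estimated from its structural model, which is not reproduced here.

```python
# Hedged sketch: turning the three ACSI rating questions (each answered
# on a 1-10 scale) into a 0-100 index. The equal weights below are an
# illustrative assumption, NOT the actual CFI Group/ACSI weighting.

def acsi_score(overall, expectations, ideal, weights=(1/3, 1/3, 1/3)):
    """Return a 0-100 satisfaction index from three 1-10 ratings."""
    ratings = (overall, expectations, ideal)
    # Rescale each 1-10 response onto 0-100: (x - 1) / 9 * 100
    rescaled = [(r - 1) / 9 * 100 for r in ratings]
    return sum(w * s for w, s in zip(weights, rescaled))

# Example: a respondent rating 8, 7, and 7 on the three questions
print(round(acsi_score(8, 7, 7)))  # → 70
```

The rescaling step explains why reported scores such as the Federal Government average land on a 0-100 scale even though respondents answer on a 1-10 scale.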
The EOSDIS Model 2008 Results and Priorities
• A series of questions is asked for each of the six elements of the EOSDIS model:
  • Product Search
  • Product Selection and Order
  • Delivery
  • Product Quality
  • Product Documentation
  • Customer Support
• CFI’s methodology quantifiably measures and links satisfaction levels to performance and prioritizes actions for improvement.
EOSDIS ACSI Score Comparisons 2004-2008
• Federal Government*
• Notes:
  • These numbers are from the end-of-year results available at theacsi.org.
  • The charts in our annual presentations may show a slightly different score because the EOSDIS ACSI score is computed before the end of the calendar year.
  • Our presentations show either the score from a previous year or a quarterly average of the ACSI scores available at the time our EOSDIS ACSI score was computed.
Top-line Results 2004-2008 (cont.) * Product Quality includes format questions.
2004 HDF/HDF-EOS In what format were data or products provided?
2005 HDF/HDF-EOS Format received … Format preferred …
2006 HDF/HDF-EOS In 2005, 9% said products were provided in GeoTIFF, and 25% said it was their preferred format.
2007 HDF/HDF-EOS In 2006, 67% said products were provided in HDF-EOS and HDF, and 42% said these were their preferred formats.* *Multiple responses allowed
2008 HDF/HDF-EOS *Multiple responses allowed
2009 Format Questions
• In what format(s) were your data products provided to you? (select any that apply)
  • HDF-EOS/HDF
  • NetCDF
  • Binary
  • ASCII
  • GeoTIFF
  • JPEG, GIF, PNG, TIFF
  • OGC Web services (WMS, WCS, WFS, etc.)
  • GIS (e00, shp, etc.)
  • KML, KMZ
  • CEOS
  • Don’t know
  • Other (please specify and/or comment)
• What format(s) would/do you prefer? (select any that apply)
  • HDF-EOS/HDF
  • NetCDF
  • Binary
  • ASCII
  • GeoTIFF
  • JPEG, GIF, PNG, TIFF
  • OGC Web services (WMS, WCS, WFS, etc.)
  • GIS (e00, shp, etc.)
  • CEOS
  • KML, KMZ
  • Other (please specify another format or comment on a specific version, etc.)
• Still using the 10-point scale on which “1” means “Poor” and “10” means “Excellent,” how would you rate…
  • Ease of using the data product in the delivered format
  • Overall quality of the data product
  • Overall usability of the data product
2009 Documentation Questions
• What documentation did you use or were you looking for?
  • Instrument specifications
  • Science algorithm
  • Product format
  • Tools
  • Science applications
  • Data product description
  • Production code
  • Other
• Was the documentation:
  • Delivered with the data
  • Available online
  • Not found (skip to Customer Services)
• Still using the 10-point scale on which “1” means “Poor” and “10” means “Excellent,” how would you rate…
  • Overall quality of the document (i.e., technical level, organization, clarity)
  • Extent to which the data documentation helped you use the data
Customer Service
• Have you requested assistance from <Data center name>’s user services office during the past year?
  • Yes: continue
  • No: go to Closing
• Was it:
  • By phone
  • By e-mail
  • Both by phone and e-mail
• Think about the user services staff you interacted with when you requested assistance from <Data center name> user services. On the same scale from 1 to 10, where 1 means “Poor” and 10 means “Excellent,” how would you rate the user services staff on…
  • Professionalism
  • Technical knowledge
  • Accuracy of information provided
  • Helpfulness in selecting/finding data or products
  • Helpfulness in correcting a problem
  • Timeliness of response
Some thoughts on users … http://www.humanfactors.com