
Presentation Transcript


  1. NASA Earth Observing System Data and Information System: Customer Satisfaction Results (November 8, 2010; November 9, 2010; November 15, 2010)

  2. Today’s Discussion • Background • Overview of Key Results • Detailed Analysis • Summary

  3. Background

  4. Project Background: Objectives
  • Measure customer satisfaction with the NASA Earth Observing System Data and Information System at a national level and for each Data Center:
    • Alaska Satellite Facility Distributed Active Archive Center
    • Crustal Dynamics Data Information System
    • Global Hydrology Resource Center
    • Goddard Earth Sciences Data and Information Services Center
    • Land Processes Distributed Active Archive Center
    • MODAPS Level-1 Atmospheres Archive and Distribution System
    • NASA Langley Atmospheric Science Data Center
    • National Snow and Ice Data Center Distributed Active Archive Center
    • Oak Ridge National Laboratory Distributed Active Archive Center
    • Ocean Biology Processing Group
    • Physical Oceanography Distributed Active Archive Center, Jet Propulsion Laboratory (JPL)
    • Socioeconomic Data and Applications Center
  • Assess the trends in satisfaction with NASA EOSDIS, specifically in the following key areas:
    • Product Search
    • Product Selection and Order
    • Delivery
    • Product Quality
    • Product Documentation
    • Customer Support
  • Identify the key areas that NASA can leverage across the Data Centers to continuously improve its service to its users

  5. Project Background: Measurement timetable

  6. Project Background: Data collection
  Respondents:
  • 4,390 responses were received
  • 4,390 responses were used for modeling
  • Respondents who answered for more than one data center: two: 134; three: 12; four: 2; five: 1
  E-mail addresses from lists associated with some of the data centers were included to reach the large number of users who may have accessed data via anonymous FTP.

  7. Project Background: Respondent information
  Demographics (when comparable) remain fairly consistent with 2009.
  For which specific areas do you need or use Earth science data and services?*
  * Multi-select question; answer choices added in 2010; language of the question was changed slightly in 2009; modeling was asked as a separate question prior to 2008.

  8. Project Background: Respondent information
  Demographics (when comparable) remain fairly consistent with 2009.

                                                            2007    2008    2009    2010
  Currently located: USA vs. all others
    USA                                                      35%     32%     29%     27%
    All others                                               65%     68%     71%     73%
    Number of respondents                                  2,290   2,601   3,842   4,390
  Downloaded data or data products
    Downloaded                                                --      --     96%     94%
    Have not downloaded                                       --      --      4%      6%
    Number of respondents                                     --      --   3,842   4,390
  Method of searching for data products or services
    Data center’s or data-specific specialized search,
    online holdings, or datapool                             23%     18%     40%     49%
    Direct interaction with user services personnel           2%      3%      4%      3%
    Global Change Master Directory                            2%      1%      1%      1%
    Internet search tool                                     18%     12%     19%     16%
    Warehouse Inventory Search Tool (WIST)*                  52%     41%     29%     17%
    Did not search                                            --      --      5%      4%
    Other                                                     3%      4%      3%      3%
    Number of respondents                                  2,291   2,601   3,675   4,390
  Data delivery method
    FTP immediate retrieval from online holdings             20%     23%     20%     21%
    FTP retrieved after order                                55%     49%     52%     45%
    FTP via subscription                                      3%      5%      4%      5%
    HTTP-based download from Web                             14%     14%     17%     16%
    HTTP-based batch download from Web                        2%      3%      3%      3%
    Web-based visualization tool                              --      3%      2%      3%
    Web services                                              --      --      --      2%
    Machine-to-machine transfer                               --      --      --      0%
    Physical media                                            --      --      --      1%
    Don’t know                                                --      --      --      3%
    Other                                                     --      --      2%      1%
    Number of respondents                                  2,291   2,601   3,601   4,040
  Looked for or got documentation
    Looked                                                    --      --     73%     72%
    Did not look                                              --      --     27%     28%
    Number of respondents                                     --      --   3,842   4,390

  * Questionnaire was modified in 2009 and 2010. Prior to 2010, WIST also included EDG. WIST became available in 2005; EDG was decommissioned in February 2008, when all data could be accessed through WIST.

  9. Overview of Key Results

  10. NASA EOSDIS: Customer satisfaction results
  Survey years 2004-2010; sample sizes ranged from N=1,016 to N=4,390 (2007: N=2,291; 2008: N=2,601; 2009: N=3,842; 2010: N=4,390).
  ACSI scores across the seven survey years: 75, 74, 77, 75, 77, 77, 78, with margins of error between (+/-) 0.4 and (+/-) 0.9.
  Component questions and scores across the seven survey years:
  • Overall satisfaction: How satisfied are you with the data products and services provided by [DAAC]? Scores: 79, 81, 81, 81, 80, 82, 78
  • Expectations: To what extent have data products and services provided by [DAAC] fallen short of or exceeded expectations? Scores: 73, 73, 71, 74, 74, 73, 73
  • Ideal: How close does [DAAC] come to the ideal organization? Scores: 71, 75, 75, 75, 73, 76, 72

  11. NASA EOSDIS Benchmarks: Strong performance continues
  ACSI (overall) is updated on a quarterly basis, with specific industries/sectors measured annually. Federal Government (overall) is updated annually, with data collection done in Q3. Quarterly scores are based on a calendar timeframe: Q1: January through March; Q2: April through June; Q3: July through September; Q4: October through December.

  12. NASA EOSDIS Model: Product Search/Selection/Documentation and Customer Support most critical
  Scores: the performance of each component on a 0 to 100 scale. Component scores are the weighted average of the corresponding survey questions.
  Impacts: the change in the target variable that results from a five-point change in a component score. For example, a 5-point gain in Product Search would yield a 1.0-point improvement in Satisfaction.
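  To make the relationship concrete, here is a minimal sketch of how a component score and its impact combine; the question scores, weights, and names below are illustrative assumptions, not CFI Group's actual model:

      # Sketch: component scores and impacts (hypothetical numbers).

      def component_score(question_scores, weights):
          # Weighted average of a component's survey questions, all on a 0-100 scale.
          return sum(q * w for q, w in zip(question_scores, weights)) / sum(weights)

      # Hypothetical Product Search questions, already transformed to 0-100.
      product_search = component_score([78, 74, 76], [0.40, 0.35, 0.25])  # -> 76.1

      # Impact: the satisfaction change produced by a 5-point change in the component.
      PRODUCT_SEARCH_IMPACT = 1.0  # from the model description above

      def projected_satisfaction_change(component_gain, impact):
          # Gains scale linearly: a full 5-point gain yields the full impact.
          return (component_gain / 5.0) * impact

      print(projected_satisfaction_change(5, PRODUCT_SEARCH_IMPACT))  # 1.0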

  13. NASA EOSDIS 2007-2010: Significant changes from 2009
  [Chart: yearly component scores with margins of error from (+/-) 0.4 to (+/-) 0.8; markers flag significant differences vs. 2009.]

  14. Areas of Opportunity for NASA EOSDIS: Remain consistent year over year
  Top improvement priorities:
  • Product Search (76)
  • Product Selection and Order (77)
  • Product Documentation (76)

  15. Detailed Analysis

  16. Score Comparison: Same CSI inside and outside the USA
  73% of respondents were outside the USA in 2010, versus 71% in 2009. Respondents outside the USA have the same satisfaction score with EOSDIS (77). Compared to last year, there was no score change for respondents within the USA and a one-point score increase for respondents outside the USA.

  17. CSI by Data Centers: Only one data center shows a statistically significant change
  [Chart: CSI by data center with margins of error from (+/-) 0.6 to (+/-) 3.3; markers flag significant differences vs. 2009.]

  18. Product Search: Key driver of satisfaction (Impact = 1.0)
  • 49% used a data center’s or data-specific specialized search, online holdings, or datapool (40% in 2009)
  • 17% used WIST to search for data and products
  • 16% selected Internet search tool (19% in 2009)
  [Chart markers flag significant differences vs. 2009.]

  19. Product Search Score Comparison: By method for most recent search
  How did you search for the data products or services you were seeking? (share of respondents, with margin of error on the score)
  • Data center’s or data-specific specialized search: 49%, (+/-) 0.6
  • Direct interaction with user services personnel: 3%, (+/-) 2.7
  • Global Change Master Directory: 1%, (+/-) 5.8
  • Internet search tool: 16%, (+/-) 1.2
  • WIST: 17%, (+/-) 1.0
  • Other: 3%, (+/-) 3.2
  [Chart markers flag significant differences vs. 2009.]
  * WIST became available in 2005. EDG was decommissioned in February 2008, when all data could be accessed through WIST.

  20. Product Search: Scores by Data Center; variation in the trends
  [Chart: Product Search scores for the 12 data centers with margins of error from (+/-) 0.7 to (+/-) 4.1; markers flag significant differences vs. 2009.]

  21. Product Selection and Order: Also a top opportunity for continuous improvement (Impact = 1.3)
  Did you use a sub-setting tool?
  • 36% said no
  • 41% said yes, by geographic area
  • 4% said yes, by geophysical parameter
  • 16% said yes, by both geographic area and geophysical parameter
  • 3% said by band
  • 1% said by channel
  94% said that they are finding what they want in terms of type, format, time series, etc.
  [Chart markers flag significant differences vs. 2009.]

  22. Product Selection and Order: Scores by Data Center; variation in the trends
  [Chart: scores for the 12 data centers with margins of error from (+/-) 0.7 to (+/-) 3.2; markers flag significant differences vs. 2009.]

  23. Product Documentation: Data product description most sought after (Impact = 1.0)
  What documentation did you use or were you looking for?
  • Data product description: 78%
  • Product format: 66%
  • Science algorithm: 49%
  • Instrument specifications: 44%
  • Tools: 40%
  • Science applications: 30%
  • Production code: 12%
  Was the documentation...
  • Delivered with the data: 18% (18% in 2009)
  • Available online: 75% (73% in 2009)
  • Not found: 7% (9% in 2009)
  CSI for those whose documentation was not found is 70, versus 77 for those who got it delivered with the data and 78 for those who found it online.
  [Chart markers flag significant differences vs. 2009.]

  24. Product Documentation: Scores by data center
  [Chart: scores for the 12 data centers with margins of error from (+/-) 0.8 to (+/-) 4.1; markers flag significant differences vs. 2009.]

  25. Customer Support: Maintain strong performance (Impact = 1.5)
  • 88% (89% in 2009) were able to get help on first request. These respondents continue to have a significantly higher CSI (82) than those who did not (66).
  • Did you request assistance from the Data Center’s user services staff during the past year? No: 75%. Of those who said yes, 87% used e-mail, 2% used the phone, and 11% used both phone and e-mail.
  [Chart markers flag significant differences vs. 2009.]

  26. Product Quality: Preferences in line with actual for the most part
  In 2009, 58% said products were provided in HDF-EOS and HDF, and 44% said those were their preferred formats. (Multiple responses allowed.)

  27. Product Quality: Maintains score this year (Impact = 0.5)
  [Chart markers flag significant differences vs. 2009.]

  28. Delivery: Drops one point this year (Impact = 0.6)
  • Over half said their data came from MODIS (same in 2009)
  • 28% said ASTER (27% in 2009)
  (Multi-select question.)
  [Chart markers flag significant differences vs. 2009.]

  29. Delivery: Methods for receiving data
  73% said FTP was their preferred method in 2009.
  How long did it take to receive your data products?
  • Immediate retrieval: 23% (CSI = 81)
  • Less than 1 hour: 20% (CSI = 78)
  • Less than a day: 27% (CSI = 76)
  • 1-3 days: 23% (CSI = 76)
  • 4-7 days: 4% (CSI = 75)
  • More than 7 days: 2% (CSI = 69)

  30. Customers over multiple years: Who have answered the survey multiple years
  No significant differences were seen between 2009 and 2010 for those who have answered the survey over the last four years. For those answering the survey over multiple years, most scores have seen some positive movement. (Difference refers to 2010 vs. 2009.)

  31. Customers over the past two years: Who answered the survey in 2009 and 2010
  For those answering the survey in both 2009 and 2010, there are a number of statistically significant score differences. (Difference refers to 2010 vs. 2009.)

  32. Customers over the past three years: Who answered the survey in 2008, 2009, and 2010
  For those answering the survey in 2008, 2009, and 2010, there were no statistically significant score differences between 2009 and 2010. (Difference refers to 2010 vs. 2009.)

  33. Summary

  34. Summary
  • NASA EOSDIS has made significant improvements versus last year in a couple of areas: Product Search and Product Selection and Order
  • Delivery and Product Documentation both saw a small but significant decrease this year
  • Product Search and Product Selection and Order continue to be the top opportunities for improvement; Documentation also continues to be a top opportunity
  • Customer Support continues to be high impact for those who use it. It is imperative to maintain the strong level of service and to ensure those who provide it realize how it affects satisfaction

  35. Appendix

  36. The Math Behind the Numbers
  A discussion for a later date... or following this presentation, for those who are interested.
  [Path diagram: indicators x1-x6 load on the latent drivers ξ1 and ξ2 (loadings λx1-λx6); indicators y1-y3 load on the latent outcome η1 (loadings λy1-λy3); β1 and β2 are the structural paths from ξ1 and ξ2 to η1.]
  Measurement and structural equations:
  x_i = \lambda_{x_i} \xi_t + \delta_{x_i}, \quad i = 1, 2, 3; \; t = 1, 2
  y_j = \lambda_{y_j} \eta_1 + \epsilon_{y_j}, \quad j = 1, 2, 3
  \eta_1 = \beta_1 \xi_1 + \beta_2 \xi_2 + \zeta_1
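  For readers who want to see those equations in action, the snippet below simulates the model with NumPy; all loadings, path coefficients, and noise scales are invented for illustration only:

      # Sketch: simulate the measurement and structural equations with made-up parameters.
      import numpy as np

      rng = np.random.default_rng(0)
      n = 1000

      # Latent drivers xi_1, xi_2 and the structural equation for eta_1.
      xi = rng.normal(size=(n, 2))
      beta = np.array([0.6, 0.3])                 # hypothetical beta_1, beta_2
      eta = xi @ beta + 0.2 * rng.normal(size=n)  # eta_1 = beta_1*xi_1 + beta_2*xi_2 + zeta_1

      # Measurement equations: three indicators per latent variable.
      lam = np.array([0.9, 0.8, 0.7])             # hypothetical loadings
      x_block1 = np.outer(xi[:, 0], lam) + 0.3 * rng.normal(size=(n, 3))  # x_i for xi_1
      x_block2 = np.outer(xi[:, 1], lam) + 0.3 * rng.normal(size=(n, 3))  # x_i for xi_2
      y = np.outer(eta, lam) + 0.3 * rng.normal(size=(n, 3))              # y_j for eta_1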

  37. A Note About Score Calculation
  • Attributes (questions on the survey) are typically answered on a 1-10 scale:
    • Social science research shows 7-10 response categories are optimal
    • Customers are familiar with a 10-point scale
  • Before being reported, scores are transformed from the 1-10 scale to a 0-100 scale
  • The transformation is strictly algebraic; e.g., a response x becomes (x - 1) / 9 × 100, so 1 maps to 0 and 10 maps to 100 (sketched in code below)
  • The 0-100 scale simplifies reporting:
    • Often no need to report many, if any, decimal places
    • The 0-100 scale is useful as a management tool
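  A one-line sketch of that rescaling, assuming the standard linear transform between the scale endpoints (the function name is ours):

      # Map a 1-10 survey response onto the reported 0-100 scale.
      def to_reported_scale(response):
          # 1 maps to 0 and 10 maps to 100; each category step is worth 100/9 points.
          return (response - 1) / 9 * 100

      print(to_reported_scale(1))   # 0.0
      print(to_reported_scale(8))   # 77.77...
      print(to_reported_scale(10))  # 100.0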

  38. Deriving Impacts
  • Remember high school algebra? The general formula for a line is y = mx + b
  • The basic idea is that x is a “cause” and y is an “effect,” and m, the slope of the line, summarizes the relationship between x and y
  • CFI Group uses a sophisticated variation of an advanced statistical tool, Partial Least Squares (PLS) regression, to determine impacts when many different causes (i.e., quality components) simultaneously affect an outcome (e.g., Customer Satisfaction)
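  As a toy illustration of the idea, the sketch below uses scikit-learn's off-the-shelf PLSRegression rather than CFI Group's proprietary variation, with random stand-in data and made-up coefficients:

      # Sketch: derive impacts from a PLS regression on toy data.
      import numpy as np
      from sklearn.cross_decomposition import PLSRegression

      rng = np.random.default_rng(1)
      n = 500
      # Three hypothetical quality components on a 0-100 scale.
      components = rng.normal(70, 10, size=(n, 3))
      # Satisfaction driven by the components plus noise (made-up coefficients).
      true_coefs = np.array([0.20, 0.26, 0.30])
      satisfaction = components @ true_coefs + rng.normal(0, 2, size=n)

      pls = PLSRegression(n_components=2)
      pls.fit(components, satisfaction)

      # Convert slopes (change in y per 1-point change in x) to the
      # 5-point "impact" convention used earlier in this deck.
      impacts = 5 * pls.coef_.ravel()
      print(impacts)  # approximately 5 * true_coefs = [1.0, 1.3, 1.5]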
