
1 CFI GROUP WORLDWIDE: Ann Arbor, Atlanta, Beijing, London, Madrid, Milan, Paris, Shanghai, Stockholm
NASA Earth Observing System Data and Information System
Customer Satisfaction Results
November 8, 2010; November 9, 2010; November 15, 2010

2 Today's Discussion (© CFI Group 2010)
Background; Overview of Key Results; Detailed Analysis; Summary

3 Background

4 Project Background: Objectives
Measure customer satisfaction with the NASA Earth Observing System Data and Information System at a national level and for each Data Center:
– Alaska Satellite Facility Distributed Active Archive Center
– Crustal Dynamics Data Information System
– Global Hydrology Resource Center
– Goddard Earth Sciences Data and Information Services Center
– Land Processes Distributed Active Archive Center
– MODAPS Level-1 Atmospheres Archive and Distribution System
– NASA Langley Atmospheric Science Data Center
– National Snow and Ice Data Center Distributed Active Archive Center
– Oak Ridge National Laboratory Distributed Active Archive Center
– Ocean Biology Processing Group
– Physical Oceanography Distributed Active Archive Center, Jet Propulsion Laboratory (JPL)
– Socioeconomic Data and Applications Center
Assess trends in satisfaction with NASA EOSDIS in the following key areas:
– Product Search
– Product Selection and Order
– Delivery
– Product Quality
– Product Documentation
– Customer Support
Identify the key areas that NASA can leverage across the Data Centers to continuously improve its service to users.

5 Project Background: Measurement timetable
Finalized questionnaire: July 28, 2009
Data collection via web: August 25, 2010 – October 5, 2010 (invitations were sent over the first two weeks and reminders over the last two weeks; the survey was in the field longer this year to allow resending invitations)
Topline results: October 15, 2010
Results briefing: November 8, 9, and 15, 2010

6 Project Background: Data collection
Respondents: 4,390 responses were received, and all 4,390 were used for modeling.
Respondents who answered for more than one data center: two, 134; three, 12; four, 2; five, 1.
E-mail addresses from lists associated with some of the data centers were included to reach the large number of users who may have accessed data via anonymous FTP.

7 Project Background: Respondent information
For which specific areas do you need or use Earth science data and services?
Demographics (when comparable) remain fairly consistent with 2009.
* Multi-select question; answer choices were added in 2010; the question wording was changed slightly in 2009; modeling was asked as a separate question prior to 2008.

8 Project Background: Respondent information
Demographics (when comparable) remain fairly consistent with 2009.
* Questionnaire was modified in 2009 and 2010. Prior to 2010, WIST also included EDG. WIST became available in 2005; EDG was decommissioned in February 2008, when all data could be accessed through WIST.

9 Overview: Key Results

10 NASA EOSDIS: Customer satisfaction results
– Overall satisfaction: How satisfied are you with the data products and services provided by [DAAC]?
– Expectations: To what extent have data products and services provided by [DAAC] fallen short of or exceeded expectations?
– Ideal: How close does [DAAC] come to the ideal organization?
[Chart: ACSI and component scores by year, 2004–2010. 2010: 77, 81, 74, 75, margin of error (+/-) 0.4, N=4,390; 2009: 77, 81, 73, 75, (+/-) 0.4, N=3,842. Earlier years (2004–2008) had margins from (+/-) 0.5 to (+/-) 0.9 and sample sizes from N=1,016 to N=2,857.]

11 NASA EOSDIS Benchmarks: Strong performance continues
ACSI (Overall) is updated on a quarterly basis, with specific industries/sectors measured annually. Federal Government (Overall) is updated annually, with data collection done in Q3. Quarterly scores are based on a calendar timeframe: Q1, January–March; Q2, April–June; Q3, July–September; Q4, October–December.

12 NASA EOSDIS Model: Product Search, Selection, and Documentation and Customer Support are most critical
Scores: the performance of each component on a 0-to-100 scale. Component scores are the weighted average of the corresponding survey questions.
Impacts: the change in the target variable that results from a five-point change in a component score. For example, a 5-point gain in Product Search would yield a 1.0-point improvement in Satisfaction.
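The impact definition can be turned into simple arithmetic: an impact is quoted per five-point change in a component, so a smaller component change scales proportionally. This helper is illustrative, not part of the survey methodology:

```python
def predicted_change(impact_per_5pts: float, score_change: float) -> float:
    """Expected change in the target (e.g., Satisfaction) for a given
    change in a component score, where the impact is quoted per
    5-point component change."""
    return impact_per_5pts * (score_change / 5.0)

# Slide example: Product Search has an impact of 1.0,
# so a 5-point gain yields a 1.0-point gain in Satisfaction.
print(predicted_change(1.0, 5.0))  # 1.0
print(predicted_change(1.0, 2.5))  # 0.5
```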

13 NASA EOSDIS, 2007–2010: Significant changes from 2009
[Chart: component scores, 2007–2010, with significant differences vs. 2009 flagged; margins of error (+/-) 0.4, (+/-) 0.8, and (+/-) 0.5.]

14 Areas of Opportunity for NASA EOSDIS: Remain consistent year over year
Top improvement priorities: Product Search (76), Product Selection and Order (77), Product Documentation (76).

15 Detailed Analysis

16 Score Comparison: Same CSI inside and outside the USA
Respondents outside the USA have the same Satisfaction score with EOSDIS (77). Compared to last year, there was no score change for respondents within the USA and a one-point increase for respondents outside the USA. 73% of respondents are outside the USA in 2010, vs. 71% in 2009.

17 CSI by Data Center: Only one data center shows a statistically significant change
[Chart: CSI by data center, with significant differences vs. 2009 flagged; margins of error range from (+/-) 0.6 to (+/-) 3.3.]

18 Product Search: Key driver of satisfaction (Impact = 1.0)
49% used a data center's or data-specific specialized search, online holdings, or datapool (40% in 2009); 17% used WIST to search for data and products; 16% selected an Internet search tool (19% in 2009).
[Chart: Product Search scores; significant differences vs. 2009 flagged.]

19 Product Search Score Comparison: By method for most recent search
How did you search for the data products or services you were seeking?
[Chart: scores by search method, with usage shares of 49%, 17%, 16%, 3%, 3%, and 1%; margins of error from (+/-) 0.6 to (+/-) 5.8; significant differences vs. 2009 flagged.]
* WIST became available in 2005. EDG was decommissioned in February 2008, when all data could be accessed through WIST.

20 Product Search: Scores by data center; variation in the trends
[Chart: scores by data center, with significant differences vs. 2009 flagged; margins of error from (+/-) 0.7 to (+/-) 4.1.]

21 Product Selection and Order: Also a top opportunity for continuous improvement (Impact = 1.3)
94% said that they are finding what they want in terms of type, format, time series, etc.
Did you use a sub-setting tool? 36% said no; 41% said yes, by geographic area; 4% said yes, by geophysical parameter; 16% said yes, by both geographic area and geophysical parameter; 3% said by band; 1% said by channel.
[Chart: significant differences vs. 2009 flagged.]

22 Product Selection and Order: Scores by data center; variation in the trends
[Chart: scores by data center, with significant differences vs. 2009 flagged; margins of error from (+/-) 0.7 to (+/-) 3.2.]

23 Product Documentation: Data product description most sought after (Impact = 1.0)
What documentation did you use or were you looking for? Data product description, 78%; product format, 66%; science algorithm, 49%; instrument specifications, 44%; tools, 40%; science applications, 30%; production code, 12%.
Was the documentation delivered with the data (18%, vs. 18% in 2009), available online (75%, vs. 73% in 2009), or not found (7%, vs. 9% in 2009)?
CSI for those whose documentation was not found is 70, vs. 77 for those who got it delivered with the data and 78 for those who found it online.
[Chart: significant differences vs. 2009 flagged.]

24 Product Documentation: Scores by data center
[Chart: scores by data center, with significant differences vs. 2009 flagged; margins of error from (+/-) 0.8 to (+/-) 4.1.]

25 Customer Support: Maintain strong performance (Impact = 1.5)
Did you request assistance from the Data Center's user services staff during the past year? No: 75%. Of those who said yes, 87% used e-mail, 2% used the phone, and 11% used both phone and e-mail.
88% (89% in 2009) were able to get help on first request. These respondents continue to have a significantly higher CSI (82) than those who did not (66).

26 Product Quality: Preferences in line with actual formats for the most part
In 2009, 58% said products were provided in HDF-EOS and HDF, and 44% said those were their preferred formats. (Multiple responses allowed.)

27 Product Quality: Maintains score this year (Impact = 0.5)
[Chart: Product Quality scores; significant differences vs. 2009 flagged.]

28 Delivery: Drops one point this year (Impact = 0.6)
Over half said their data came from MODIS (same in 2009); 28% said ASTER (27% in 2009). (Multi-select.)
[Chart: Delivery scores; significant differences vs. 2009 flagged.]

29 Delivery: Methods for receiving data
How long did it take to receive your data products? 23% immediate retrieval (CSI=81); 20% less than 1 hour (CSI=78); 27% less than a day (CSI=76); 23% 1–3 days (CSI=76); 4% 4–7 days (CSI=75); 2% more than 7 days (CSI=69).
73% said FTP was their preferred delivery method in 2009.

30 Customers over multiple years: Those who have answered the survey in multiple years
For those answering the survey over multiple years, most scores have seen some positive movement (difference refers to 2010 vs. 2009). No significant differences were seen between 2009 and 2010 for those who have answered the survey over the last four years.

31 Customers over the past two years: Those who answered the survey in 2009 and 2010
For those answering the survey in both 2009 and 2010, there are a number of statistically significant score differences (difference refers to 2010 vs. 2009).

32 Customers over the past three years: Those who answered the survey in 2008, 2009, and 2010
For those answering the survey in 2008, 2009, and 2010, there were no statistically significant score differences between 2009 and 2010 (difference refers to 2010 vs. 2009).

33 Summary

34 Summary
– NASA EOSDIS has made significant improvements versus last year in a couple of areas (Product Search and Product Selection and Order).
– Delivery and Product Documentation both saw a small but significant decrease this year.
– Product Search and Product Selection and Order continue to be the top opportunities for improvement; Documentation also continues to be a top opportunity.
– Customer Support continues to have a high impact for those who use it. It is imperative to maintain the strong level of service and to ensure that those who provide it realize how it affects satisfaction.

35 Appendix

36 The Math Behind the Numbers
A discussion for a later date… or following this presentation, for those who are interested. The slide shows a structural equation model: two exogenous latent variables ξ1 and ξ2, each measured by three indicators (x1–x3 and x4–x6), drive one endogenous latent variable η1, measured by three indicators (y1–y3):
x_it = λ_x,it · ξ_t + δ_it,  for i = 1, 2, 3 and t = 1, 2
y_j = λ_y,j · η1 + ε_j,  for j = 1, 2, 3
η1 = γ11 ξ1 + γ12 ξ2 + ζ1
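The structural model on this slide can be simulated to see how latent components generate observed survey indicators. The loadings, path coefficients, and noise levels below are made-up illustrative values, not estimates from this survey:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 5000

# Two exogenous latent variables (xi_1, xi_2), e.g. quality components
xi = rng.normal(size=(n, 2))

# Structural equation: eta_1 = gamma_11*xi_1 + gamma_12*xi_2 + zeta_1
gamma = np.array([0.6, 0.3])            # illustrative path coefficients
eta = xi @ gamma + 0.2 * rng.normal(size=n)

# Measurement model: each latent variable is reflected by three indicators
lam_x = np.array([0.9, 0.8, 0.7])       # illustrative loadings
x = np.concatenate(
    [xi[:, [t]] * lam_x + 0.3 * rng.normal(size=(n, 3)) for t in (0, 1)],
    axis=1,
)                                        # observed x1..x6
lam_y = np.array([0.9, 0.85, 0.8])
y = eta[:, None] * lam_y + 0.3 * rng.normal(size=(n, 3))  # observed y1..y3

# Regressing eta on xi (the latents are known here only because we
# simulated them) recovers the path coefficients, close to (0.6, 0.3)
gamma_hat, *_ = np.linalg.lstsq(xi, eta, rcond=None)
print(np.round(gamma_hat, 2))
```

In practice the latent scores are not observed; PLS-style estimation infers them from the indicators x1..x6 and y1..y3 before estimating the paths.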

37 A Note About Score Calculation
Attributes (questions on the survey) are typically answered on a 1–10 scale:
– Social science research shows 7–10 response categories are optimal.
– Customers are familiar with a 10-point scale.
Before being reported, scores are transformed from the 1–10 scale to a 0–100 scale:
– The transformation is strictly algebraic; e.g., a response x on the 1–10 scale maps to (x − 1) × 100/9 on the 0–100 scale.
– The 0–100 scale simplifies reporting: often there is no need to report many, if any, decimal places.
– The 0–100 scale is useful as a management tool.
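The 1–10 to 0–100 transformation described above is a simple linear rescaling; a minimal sketch:

```python
def to_100(raw: float) -> float:
    """Map a 1-10 survey response onto the 0-100 reporting scale.
    The transformation is linear: 1 -> 0 and 10 -> 100."""
    if not 1 <= raw <= 10:
        raise ValueError("responses are on a 1-10 scale")
    return (raw - 1) * 100 / 9

print(to_100(1))   # 0.0
print(to_100(10))  # 100.0
print(to_100(8))   # ~77.8, roughly the EOSDIS-level scores reported here
```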

38 Deriving Impacts
Remember high school algebra? The general formula for a line is y = mx + b. The basic idea is that x is a "cause" and y is an "effect," and m, the slope of the line, summarizes the relationship between x and y. CFI Group uses a sophisticated variation of Partial Least Squares (PLS) regression to determine impacts when many different causes (i.e., quality components) simultaneously affect an outcome (e.g., Customer Satisfaction).
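CFI Group's PLS variant is proprietary, so as a stand-in the sketch below derives "impacts" from an ordinary multiple regression on simulated component scores; the component names, coefficients, and data are all illustrative, not from the survey:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 2000

# Made-up component scores (0-100 scale) standing in for, e.g.,
# Product Search and Customer Support; NOT survey data.
search = rng.normal(75, 10, n)
support = rng.normal(80, 10, n)

# Made-up "true" relationship: satisfaction responds to both components
satisfaction = 20 + 0.2 * search + 0.5 * support + rng.normal(0, 5, n)

# Fit satisfaction ~ components by least squares (intercept + two slopes)
X = np.column_stack([np.ones(n), search, support])
beta, *_ = np.linalg.lstsq(X, satisfaction, rcond=None)

# Report "impacts" the way the slides do: change in satisfaction
# per 5-point change in a component
impacts = 5 * beta[1:]
print(np.round(impacts, 1))  # close to 5 * [0.2, 0.5] = [1.0, 2.5]
```

The same per-5-point convention makes impacts directly comparable across components regardless of each component's raw slope.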

