CFI GROUP WORLDWIDE ANN ARBOR ATLANTA BEIJING LONDON MADRID MILAN PARIS SHANGHAI STOCKHOLM NASA Earth Observing System Data and Information System Customer Satisfaction Results

Presentation transcript:

CFI GROUP WORLDWIDE ANN ARBOR ATLANTA BEIJING LONDON MADRID MILAN PARIS SHANGHAI STOCKHOLM NASA Earth Observing System Data and Information System Customer Satisfaction Results November 8, 2010 November 9, 2010 November 15, 2010

2 © CFI Group 2010 Today's Discussion
Background
Overview / Key Results
Detailed Analysis
Summary

3 © CFI Group 2010 Background

4 © CFI Group 2010 Project Background: Objectives
Measure customer satisfaction with the NASA Earth Observing System Data and Information System at a national level and for each Data Center:
–Alaska Satellite Facility Distributed Active Archive Center
–Crustal Dynamics Data Information System
–Global Hydrology Resource Center
–Goddard Earth Sciences Data and Information Services Center
–Land Processes Distributed Active Archive Center
–MODAPS Level-1 Atmospheres Archive and Distribution System
–NASA Langley Atmospheric Science Data Center
–National Snow and Ice Data Center Distributed Active Archive Center
–Oak Ridge National Laboratory Distributed Active Archive Center
–Ocean Biology Processing Group
–Physical Oceanography Distributed Active Archive Center, Jet Propulsion Laboratory (JPL)
–Socioeconomic Data and Applications Center
Assess the trends in satisfaction with NASA EOSDIS, specifically in the following key areas:
–Product Search
–Product Selection and Order
–Delivery
–Product Quality
–Product Documentation
–Customer Support
Identify the key areas that NASA can leverage across the Data Centers to continuously improve its service to its users

5 © CFI Group 2010 Project Background: Measurement timetable
Finalized questionnaire: July 28, 2009
Data collection via web: August 25, 2010 – October 5, 2010. Invitations were sent over the first two weeks and reminders over the last two weeks; the survey was in the field longer this year to allow invitations to be re-sent.
Topline results: October 15, 2010
Results briefing: November 8, 2010; November 9, 2010; November 15, 2010

6 © CFI Group 2010 Project Background: Data collection
Respondents: 4,390 responses were received, and all 4,390 were used for modeling.
Respondents who answered for more than one data center: two centers: 134; three: 12; four: 2; five: 1.
Email addresses from lists associated with some of the data centers were included to reach the large number of users who may have accessed data via anonymous FTP.

7 © CFI Group 2010 Project Background: Respondent information
For which specific areas do you need or use Earth science data and services?
Demographics (where comparable) remain fairly consistent with 2009.
* Multi-select question; answer choices were added in 2010; the wording of the question was changed slightly in 2009; modeling was asked as a separate question prior to 2008.

8 © CFI Group 2010 Project Background: Respondent information
Demographics (where comparable) remain fairly consistent with 2009.
* Questionnaire was modified in 2009 and 2010; prior to 2010, WIST also included EDG. WIST became available, and EDG was decommissioned (February), when all data could be accessed through WIST.

9 © CFI Group 2010 Overview Key Results

10 © CFI Group 2010 NASA EOSDIS Customer satisfaction results
The ACSI is based on three survey questions:
–Ideal: How close does [DAAC] come to the ideal organization?
–Overall satisfaction: How satisfied are you with the data products and services provided by [DAAC]?
–Expectations: To what extent have the data products and services provided by [DAAC] fallen short of or exceeded expectations?
[Chart of ACSI scores by year, with sample sizes and confidence intervals, not reproduced; 2010: N = 4,390.]

11 © CFI Group 2010 NASA EOSDIS Benchmarks: Strong performance continues
ACSI (Overall) is updated on a quarterly basis, with specific industries/sectors measured annually. Federal Government (Overall) is updated on an annual basis, with data collection done in Q3. Quarterly scores are based on a calendar timeframe: Q1 – January through March; Q2 – April through June; Q3 – July through September; Q4 – October through December.

12 © CFI Group 2010 NASA EOSDIS Model: Product Search/Selection/Documentation and Customer Support most critical
Scores: the performance of each component on a 0-to-100 scale. Component scores are made up of the weighted average of the corresponding survey questions.
Impacts: the change in the target variable that results from a five-point change in a component score. For example, a 5-point gain in Product Search would yield a 1.0-point improvement in Satisfaction.
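To make the score-and-impact mechanics above concrete, here is a minimal Python sketch. The question weights and helper names are illustrative assumptions; only the example relationship (a 5-point gain in Product Search yielding a 1.0-point gain in Satisfaction) comes from the slide.

```python
# Illustrative sketch of component scores and impacts.
# Weights and function names are hypothetical; only the example
# "5-point gain in Product Search -> +1.0 Satisfaction" is from the slide.

def component_score(question_scores, weights):
    """Weighted average of the survey questions that make up a component (0-100 scale)."""
    return sum(s * w for s, w in zip(question_scores, weights)) / sum(weights)

def projected_satisfaction_change(score_changes, impacts):
    """Impacts are expressed per 5-point change in a component score."""
    return sum((delta / 5.0) * impacts[name] for name, delta in score_changes.items())

# Example: a Product Search component built from three (hypothetical) question scores.
product_search = component_score([78, 74, 76], weights=[1.0, 1.0, 1.0])
print(f"Product Search score: {product_search:.1f}")  # 76.0

# A 5-point gain in Product Search with an impact of 1.0 projects +1.0 in Satisfaction.
impacts = {"Product Search": 1.0}
print(projected_satisfaction_change({"Product Search": 5.0}, impacts))  # 1.0
```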

13 © CFI Group 2010 NASA EOSDIS 2007 – 2010: Significant changes from 2009
[Chart of scores for 2007–2010 with confidence intervals not reproduced; markers denote a significant difference vs. 2009.]

14 © CFI Group 2010 Areas of Opportunity for NASA EOSDIS: Remain consistent year over year
Top improvement priorities: Product Search (76), Product Selection and Order (77), Product Documentation (76)

15 © CFI Group 2010 Detailed Analysis

16 © CFI Group 2010 Score Comparison: Same CSI inside and outside the USA
Respondents outside the USA have the same Satisfaction score with EOSDIS (77). Compared to last year, there was no score change for respondents within the USA and a one-point increase for respondents outside the USA. 73% of respondents were outside the USA in 2010 vs. 71% in 2009.

17 © CFI Group 2010 CSI by Data Centers: Only one data center shows a statistically significant change
[Chart of CSI by data center with confidence intervals not reproduced; markers denote a significant difference vs. 2009.]

18 © CFI Group 2010 Product Search: Key driver of satisfaction (Impact = 1.0)
49% used the data center's or data-specific specialized search, online holdings, or data pool (40% in 2009)
17% used WIST to search for data and products
16% selected an Internet search tool (19% in 2009)
[Chart markers denote a significant difference vs. 2009.]

19 © CFI Group 2010 Product Search Score Comparison: By method for most recent search
How did you search for the data products or services you were seeking?
[Chart of scores and respondent percentages by search method not reproduced; markers denote a significant difference vs. 2009.]
* WIST became available, and EDG was decommissioned (February), when all data could be accessed through WIST.

20 © CFI Group 2010 Product Search: Scores by Data Center; variation in the trends
[Chart of Product Search scores by data center with confidence intervals not reproduced; markers denote a significant difference vs. 2009.]

21 © CFI Group 2010 Product Selection and Order: Also a top opportunity for continuous improvement (Impact = 1.3)
94% said that they are finding what they want in terms of type, format, time series, etc.
Did you use a sub-setting tool?
–36% said no
–41% said yes, by geographic area
–4% said yes, by geophysical parameter
–16% said yes, by both geographic area and geophysical parameter
–3% said yes, by band
–1% said yes, by channel
[Chart markers denote a significant difference vs. 2009.]

22 © CFI Group 2010 Product Selection and Order: Scores by Data Center; variation in the trends
[Chart of Product Selection and Order scores by data center with confidence intervals not reproduced; markers denote a significant difference vs. 2009.]

23 © CFI Group 2010 Product Documentation: Data product description most sought after (Impact = 1.0)
What documentation did you use or were you looking for?
–Data product description: 78%
–Product format: 66%
–Science algorithm: 49%
–Instrument specifications: 44%
–Tools: 40%
–Science applications: 30%
–Production code: 12%
Was the documentation…
–Delivered with the data (18% vs. 18% in '09)
–Available online (75% vs. 73% in '09)
–Not found (7% vs. 9% in '09)
CSI for those whose documentation was not found is 70, vs. 77 for those who got it delivered with the data and 78 for those who found it online.
[Chart markers denote a significant difference vs. 2009.]

24 © CFI Group 2010 Product Documentation: Scores by data center
[Chart of Product Documentation scores by data center with confidence intervals not reproduced; markers denote a significant difference vs. 2009.]

25 © CFI Group 2010 Customer Support: Maintain strong performance (Impact = 1.5)
Did you request assistance from the Data Center's user services staff during the past year? No = 75%.
Of those who said yes, 87% used email, 2% used the phone, and 11% used both phone and email.
88% (89% in 2009) were able to get help on first request. These respondents continue to have a significantly higher CSI (82) than those who did not (66).
[Chart markers denote a significant difference vs. 2009.]

26 © CFI Group 2010 Product Quality: Preferences in line with actual for the most part
In 2009, 58% said products were provided in HDF-EOS and HDF, and 44% said these were their preferred formats.
* Multiple responses allowed

27 © CFI Group 2010 Product Quality: Maintains its score this year (Impact = 0.5)
[Chart markers denote a significant difference vs. 2009.]

28 © CFI Group 2010 Delivery: Drops one point this year (Impact = 0.6)
Over half said their data came from MODIS (same in 2009); 28% said ASTER (27% in 2009).
* Multi-select question
[Chart markers denote a significant difference vs. 2009.]

29 © CFI Group 2010 Delivery: Methods for receiving data
How long did it take to receive your data products?
–Immediate retrieval: 23% (CSI = 81)
–Less than 1 hour: 20% (CSI = 78)
–Less than a day: 27% (CSI = 76)
–1-3 days: 23% (CSI = 76)
–4-7 days: 4% (CSI = 75)
–More than 7 days: 2% (CSI = 69)
73% said FTP was their preferred delivery method in 2009.

30 © CFI Group 2010 Customers over multiple years: Who have answered the survey in multiple years
For those answering the survey over multiple years, most scores have seen some positive movement (difference refers to 2010 vs. 2009). No significant differences were seen between 2009 and 2010 for those who have answered the survey over the last four years.

31 © CFI Group 2010 Customers over the past two years: Who answered the survey in 2009 and 2010
For those answering the survey in 2009 and 2010, there are a number of statistically significant score differences. (Difference refers to 2010 vs. 2009.)

32 © CFI Group 2010 Customers over the past three years: Who answered the survey in 2008, 2009 and 2010
For those answering the survey in 2008, 2009 and 2010, there were no statistically significant score differences between 2009 and 2010. (Difference refers to 2010 vs. 2009.)

33 © CFI Group 2010 Summary

34 © CFI Group 2010 Summary
–NASA EOSDIS has made significant improvements versus last year in a couple of areas (Product Search and Product Selection and Order)
–Delivery and Product Documentation both saw a small but significant decrease this year
–Product Search and Product Selection and Order continue to be the top opportunities for improvement
–Documentation also continues to be a top opportunity
–Customer Support continues to be high impact for those who use it. It is imperative to maintain the strong level of service and to ensure that those providing it realize how it affects satisfaction.

35 © CFI Group 2010 Appendix

36 © CFI Group 2010 The Math Behind the Numbers
A discussion for a later date…or following this presentation for those who are interested.
[Path diagram of the latent-variable (PLS) model not reproduced: manifest indicators x1–x6 and y1–y3, their loadings on the latent variables, and the structural coefficients linking the latent variables.]
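The equations on this slide did not survive transcription. The LaTeX block below is a plausible reconstruction of a generic PLS path model consistent with the indicators shown (x1–x6 with two exogenous latent variables, y1–y3 with one endogenous latent variable); it is an assumption, not CFI Group's published specification.

```latex
% Generic PLS path model (assumed reconstruction, not verbatim from the slide).
% Measurement model for the exogenous latent variables \xi_1, \xi_2 and the
% endogenous latent variable \eta, followed by the structural equation.
\begin{align}
  x_{it} &= \lambda^{x}_{it}\,\xi_t + \varepsilon_{it}, && i = 1, 2, 3, \quad t = 1, 2, \\
  y_{j}  &= \lambda^{y}_{j}\,\eta + \delta_{j},         && j = 1, 2, 3, \\
  \eta   &= \gamma_1\,\xi_1 + \gamma_2\,\xi_2 + \zeta.
\end{align}
```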

37 © CFI Group 2010 A Note About Score Calculation
Attributes (questions on the survey) are typically answered on a 1-10 scale
–Social science research shows 7-10 response categories are optimal
–Customers are familiar with a 10-point scale
Before being reported, scores are transformed from the 1-10 scale to a 0-100 scale
–The transformation is strictly algebraic
–The 0-100 scale simplifies reporting: often there is no need to report many, if any, decimal places
–The 0-100 scale is useful as a management tool
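As a concrete illustration of the 1-10 to 0-100 conversion, here is a short Python sketch. The slide only says the transformation is strictly algebraic; the specific linear mapping below (1 maps to 0, 10 maps to 100) is the conventional ACSI-style choice and is stated here as an assumption.

```python
def to_0_100(raw_score):
    """Map a 1-10 survey response onto a 0-100 scale.

    The transcript only says the transformation is "strictly algebraic"; the
    linear mapping below (1 -> 0, 10 -> 100) is the one conventionally used
    for ACSI-style scores and is shown here as an assumption.
    """
    return (raw_score - 1) * 100.0 / 9.0

# Example: a respondent answering 8 on the 1-10 scale reports as ~77.8 on the 0-100 scale.
print(round(to_0_100(8), 1))       # 77.8
print(to_0_100(1), to_0_100(10))   # 0.0 100.0
```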

38 © CFI Group 2010 Deriving Impacts
Remember high school algebra? The general formula for a line is y = mx + b. The basic idea is that x is a "cause" and y is an "effect", and m is the slope of the line, summarizing the relationship between x and y.
CFI Group uses a sophisticated variation of Partial Least Squares (PLS) regression to determine impacts when many different causes (i.e., quality components) simultaneously affect an outcome (e.g., Customer Satisfaction).
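For readers who want to see the idea in code, the sketch below derives impact-style coefficients with an off-the-shelf PLS regression on simulated data. Everything here (the simulated respondents, the component names, the use of scikit-learn's PLSRegression, and the scaling to a per-5-point impact) is an illustrative assumption; CFI Group's actual variation of PLS is proprietary and not described in this transcript.

```python
# A generic (non-proprietary) sketch of deriving "impacts" with PLS regression.
# The data, component names, and modeling choices are illustrative assumptions.
import numpy as np
from sklearn.cross_decomposition import PLSRegression

rng = np.random.default_rng(0)

# Simulated component scores (0-100 scale) for 500 hypothetical respondents.
n = 500
X = rng.normal(loc=[76, 77, 76, 80], scale=8, size=(n, 4))
components = ["Product Search", "Selection/Order", "Documentation", "Customer Support"]

# Simulated satisfaction driven by the components plus noise (true slopes are made up).
true_slopes = np.array([0.20, 0.26, 0.20, 0.30])
y = X @ true_slopes + rng.normal(scale=4, size=n)

# Fit PLS and express each coefficient as the change in Satisfaction
# per 5-point change in the component score, as on the "Model" slide.
pls = PLSRegression(n_components=2).fit(X, y)
impacts = pls.coef_.ravel() * 5.0
for name, impact in zip(components, impacts):
    print(f"{name}: impact per 5-point change = {impact:.2f}")
```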