
1 Customer Satisfaction and Performance Metrics Elliot R. Siegel, PhD & Fred B. Wood, DBA US National Library of Medicine January 22, 2007 ICSTI Conference, London, UK

2 Copyright Published as Multimedia Appendix 2 in: Wood FB, Siegel ER, Feldman S, Love CB, Rodrigues D, Malamud M, Lagana M, Crafts J. Web Evaluation at the US National Institutes of Health: Use of the American Customer Satisfaction Index Online Customer Survey. J Med Internet Res 2008;10(1):e4. © the authors. Published under the terms of the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited, including full bibliographic details and the URL (see above).

3 Outline Multidimensional Approach to Web Evaluation Online User Surveys and the American Customer Satisfaction Index (ACSI) Overview of the ACSI Illustrative Top-Level Results Evaluation of the Evaluation Conclusions Acknowledgments

4 Why A Multidimensional Approach? Web-based information dissemination now dominates in the science, technical, and biomedical sectors. We need to understand our web users and markets. No one evaluation method meets all needs. Methods may vary with the web development, operations, and improvement life cycle. We need to triangulate and integrate evaluative data from several sources.

5 Multidimensional Concept (from Wood, Siegel, et al., “A Practical Approach to E-Government Web Evaluation,” Information Technology Professional, May/June 2003)

6 Web Life Cycle Concept (from Wood, Siegel, et al., “A Practical Approach to E-Government Web Evaluation,” Information Technology Professional, May/June 2003)

7 Online User Surveys and the ACSI NLM has a long history with user surveys. Transitioned to online surveys in the late 1990s -- but these were snapshots, once a year at most -- no standard methods or benchmarks. ACSI offers greater value added: -- continuous -- rigorous standardized survey methodology -- randomized, rolling sample -- standardized questions + optional custom questions -- extensive benchmarking of results

8 NLM/NIH and the ACSI US Office of Management and Budget (OMB) approved: -- ACSI as recommended customer satisfaction survey method -- expedited contracting (via Federal Consulting Group/US Dept of the Treasury) -- expedited survey clearance -- limited use of cookies (to block repeat surveys) Pilot testing by NLM & NCI, followed by NIH enterprise-wide implementation with 60 web sites -- NLM a member of NIH ACSI Leadership Team
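The random-intercept sampling and cookie-based blocking of repeat surveys described above can be sketched roughly as follows. This is a hypothetical illustration, not ForeSee's actual implementation; the sampling rate shown is an invented value.

```python
import random

def should_intercept(sampling_rate=0.05, already_surveyed=False):
    """Decide whether to invite this visitor to take the survey.

    already_surveyed would be read from a cookie, so that repeat
    surveys of the same visitor are blocked (per the OMB-approved
    limited use of cookies). sampling_rate is an illustrative,
    invented value, not ForeSee's actual rate.
    """
    if already_surveyed:
        return False
    return random.random() < sampling_rate

# A visitor who already took the survey is never re-invited:
print(should_intercept(already_surveyed=True))  # -> False
# With sampling_rate=1.0, every new visitor is invited:
print(should_intercept(sampling_rate=1.0))      # -> True
```

Because the decision is made randomly per visit, the result is a rolling random sample of site visitors rather than a one-time snapshot survey.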

9 How Can ACSI Survey Results Help? (Source: ForeSee Results, Inc.) A performance metric: measure the satisfaction of your site visitors (customers) on a continuous basis -- Are we meeting visitor needs and exceeding their expectations? -- Measure the impact of change (web site redesigns, external events, etc.) Identify the potential impact of web site change on satisfaction and behavior in order to prioritize improvements -- Identify, prioritize, and justify site improvements Benchmark the web site against leading companies/organizations in your industry or public sector, or across sectors

10 ACSI Reporting and Analysis Process (Source: ForeSee Results, Inc.) Implementation: Implementation/Kickoff Meeting → Finalize Survey and Approve Deployment Recommendations → Develop and Test Survey Code → Go Live with Survey. Reporting: Data Collection and Monitoring of Responses → Review Online Portal Results → Satisfaction Insight Reports every 6 weeks* → Quarterly Satisfaction Insight Review Meeting. * Timing may vary for sites with low site traffic due to slow data collection

11 Pathway to Actionability (Source: ForeSee Results, Inc.) Get a general overview (how is the web site doing overall?) → Segment by standardized & custom questions (assess by key factors, user segments, issues) → Add custom questions (drill down further) → Analyze new findings (continue analysis) → Determine areas of opportunity → Actionable results

12 Illustrative Data Reporting of Survey Results (Source: ForeSee Results, Inc.) [Diagram: element scores feed a composite satisfaction score, which in turn drives future behavior scores; arrows indicate each element's impact on satisfaction and satisfaction's impact on future behaviors]

13 Illustrative Reporting on Standardized Questions (Source: ForeSee Results, Inc.) [Chart: responses on a 10-point Likert scale; each question reported separately]

14 Illustrative Priority Map for Follow-Up on Element Scores (Source: ForeSee Results, Inc.) [Chart: 4x4 matrix to prioritize follow-up on element scores; elements with a low score and a high impact on satisfaction are the top priorities]
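The priority-map logic can be illustrated with a small sketch: elements scoring low but with high impact on overall satisfaction are flagged for follow-up. The element names, scores, impact values, and cut-offs below are all invented for illustration; they are not ForeSee data.

```python
def priority(elements, score_cut=70, impact_cut=1.5):
    """Return element names in the low-score / high-impact quadrant.

    elements: dict mapping element name -> (score 0-100, impact on
    satisfaction). score_cut and impact_cut are illustrative
    thresholds, not ForeSee's actual quadrant boundaries.
    """
    return sorted(name for name, (score, impact) in elements.items()
                  if score < score_cut and impact >= impact_cut)

# Hypothetical element scores for one site:
elements = {
    "Search":     (62, 2.1),  # low score, high impact -> priority
    "Navigation": (75, 1.8),  # high score, high impact -> maintain
    "Look/Feel":  (60, 0.9),  # low score, low impact  -> monitor
    "Content":    (80, 1.2),  # high score, low impact
}
print(priority(elements))  # -> ['Search']
```

The design choice here mirrors the slide: fixing a low-scoring element only pays off if that element actually moves overall satisfaction, so both dimensions are needed to rank improvements.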

15 Illustrative Reporting on Custom Questions (Source: ForeSee Results, Inc.) [Chart: % distributions and number of responses for custom questions; each question reported separately]

16 Illustrative Custom Questions Frequency of Visit Role (Consumer, Health Provider, Researcher, etc.) Primary Purpose for Visiting the Site Primary Means of Finding the Site What type of information are you looking for? Demographics (Age, Gender, Racial/Ethnic, etc.) Did you find the information you were looking for? What did you do with the information found? Search-related custom questions Open-ended questions

17 Illustrative Top-Level ACSI Results The overall customer satisfaction index is based on the combined responses to three ACSI standardized questions: -- What is your overall satisfaction with this site? -- How well does this site meet your expectations? -- How does this site compare to your idea of an ideal web site? Responses are scored 0 to 100, based on a 10-point Likert scale (poor to excellent)
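A rough sketch of how such an index can be computed: rescale each 1-10 Likert response onto 0-100, then average across the three questions and across respondents. The exact ACSI/ForeSee weighting is proprietary, so this simple rescale-and-average is an assumption for illustration only.

```python
def rescale(response_1_to_10):
    """Map a 1-10 Likert response onto the 0-100 reporting scale."""
    return (response_1_to_10 - 1) / 9 * 100

def satisfaction_index(responses):
    """Illustrative composite index (simple unweighted average;
    the actual ACSI weighting is proprietary).

    responses: list of (satisfaction, expectations, ideal) tuples,
    each answered on a 1-10 scale.
    """
    per_respondent = [sum(rescale(q) for q in r) / 3 for r in responses]
    return sum(per_respondent) / len(per_respondent)

# Three hypothetical respondents:
sample = [(9, 8, 7), (10, 9, 9), (7, 6, 6)]
print(round(satisfaction_index(sample), 1))  # -> 76.5
```

Note that the endpoints map cleanly: a response of 1 becomes 0 and a response of 10 becomes 100, matching the 0-100 reporting range described above.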

18 Illustrative Top-Level ACSI Results Survey Results on Overall Customer Satisfaction (for participating web sites; Quarter 4 data for US Government web sites, Quarter 2 data for private sector web sites) Federal Government web sites -- All E-Government web sites, 73.9 (average score) -- All National Institutes of Health web sites, 81.3 News/Information web sites -- All E-Government, All NIH, All private sector, 73.0

19 Top-Level ACSI Results (Cont’d) Leading individual web sites in News/Information Sector NIH web sites -- MedlinePlus in English (NLM/NIH), MedlinePlus en Espanol (NLM/NIH), AIDSinfo (NLM/NIH), NIDDK (NIH), NCI en Espanol (NIH), 83.0 Private sector web sites -- USATODAY.com, CNN.com, ABCNEWS.com, MSNBC.com, NYTimes.com, 72.0

20 Top-Level ACSI Results (Cont’d) Portal web sites -- All E-government, All NIH, All private sector, 76.0 Leading individual web sites in the Portal Sector NIH web sites -- NCI, NHLBI, Office of Science Education/OD, NIAMS, 80.0 Private sector web sites -- Yahoo.com, MSN.com (Microsoft Corp.), AOL.com (Time Warner Inc.), 74.0

21 Evaluating the Evaluation The trans-NIH ACSI project included a major evaluation component, an “evaluation of the evaluation” -- ~$225K for evaluation, of the total project budget of $1.5M -- Westat Inc. was the evaluation contractor, and worked closely with the NIH Leadership Team and participating web sites -- Initially included 60 web sites from 18 NIH institutes and centers and 13 offices of the NIH Director; sites remained active well into the project, and those web sites that collected enough survey data generated ACSI scores

22 Evaluation Methods Baseline pre-project web site profiles Before and after surveys of participating web site teams (51 web sites completed the “after” survey) Interviews with representative cross section of web site staff Observations of ForeSee Results debriefing meetings with web teams on survey results and analysis Observations and discussions at quarterly trans-NIH ACSI meetings Observations and discussions at bi-weekly NIH Leadership Team meetings Review/analysis of secondary data

23 Evaluation Results—Web Site Specific A major goal was to evaluate the use and value of the ACSI to web site teams Based on user (meaning NIH web team) surveys: -- A majority of respondents strongly or somewhat agreed that the ACSI scores and custom question results were useful -- A majority cited one or more key uses of the ACSI data and plan to use ACSI data in the next redesign -- About three-quarters cited one or more types of site improvements planned using the ACSI data -- About two-thirds strongly or somewhat agreed that they were satisfied overall with the ACSI

24 Usefulness of Custom Questions and ACSI Scores

25 Site Teams’ Use of ACSI Data

26 Types of Site Improvements Planned Using ACSI Data

27 Plans to Use ACSI Data for Next Web Site Redesign

28 Overall Satisfaction With Use of ACSI to Evaluate Site

29 Evaluation Results—Trans-NIH Another major goal was to evaluate the importance of the ACSI to NIH as a whole. The project: -- greatly increased the focus on measurement of customer satisfaction with NIH web sites -- encouraged a user-centered approach to NIH web site design and improvement -- strengthened the network of NIH web site professionals -- provided opportunities to share experiences, lessons learned, and informal mentoring

30 Trans-NIH Evaluation Results (Cont’d) The project also enhanced the NIH leadership position regarding web evaluation -- The Trans-NIH project was the first “Enterprise-Wide” ACSI application, and the largest enterprise web evaluation project to date in the US Government. -- NIH web sites performed well overall against other US Govt and private sector benchmarks, and as a result NIH received significant positive media coverage. -- NIH received an E-Government award from the Federal Consulting Group/US Dept of the Treasury, conferred by a senior OMB official.

31 Trans-NIH Evaluation Results (Cont’d) The project identified key factors: Associated with successful use of ACSI— -- Timing of the surveys with the web site redesign cycle -- Supportive management -- Sufficient financial resources Associated with issues/difficulties— -- Low traffic web sites (insufficient volume for valid online surveys) -- Intranet web sites (few or no outside users) -- Skeptical staff and/or management attitude toward surveys or web evaluation generally
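The low-traffic difficulty above can be made concrete with a back-of-the-envelope check of whether a site's traffic can sustain a continuous intercept survey. The intercept rate, the 300-completes threshold, and the traffic figures below are illustrative assumptions, not ForeSee's or NIH's actual numbers; only the 4-8% response-rate range comes from this presentation's conclusions.

```python
def expected_completes(visits_per_month, intercept_rate, response_rate):
    """Expected number of completed surveys per month."""
    return visits_per_month * intercept_rate * response_rate

def enough_traffic(visits_per_month, intercept_rate=0.5,
                   response_rate=0.05, min_completes=300):
    """Can this site collect enough completes for valid reporting?

    Defaults are illustrative: a 5% response rate is within the 4-8%
    range cited in the conclusions; the 50% intercept rate and the
    300-completes minimum are invented for this sketch.
    """
    return expected_completes(visits_per_month, intercept_rate,
                              response_rate) >= min_completes

print(enough_traffic(50_000))  # 50,000 * 0.5 * 0.05 = 1,250 -> True
print(enough_traffic(5_000))   # 5,000 * 0.5 * 0.05 = 125   -> False
```

This mirrors the project finding: with single-digit response rates, a low-traffic or intranet site simply cannot accumulate enough completed surveys for valid continuous scoring.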

32 Conclusions Online user surveys can provide helpful information about and better understanding of web site users, and contribute to a user-centered approach to web site design. The ACSI provides additional value because of its rigorous and proven methodology, standardized questions, benchmarking, optional custom questions, and good price-value ratio. The ACSI, or a similar survey, is not for all web sites: it requires sufficient site traffic and customer base, plus adequate management and financial support.

33 Conclusions (Cont’d) The ACSI, like all online surveys in the web environment, has relatively low response rates (typically in the range of 4 to 8 percent). The ACSI uses random intercepts and several cross-checks to help minimize non-response bias, but this remains an issue that warrants greater research attention. Overall, based on the NIH experience, the ACSI would seem applicable: -- to medium to high traffic web sites in any country -- in other fields of Science and Technology as well as Medicine -- to sites with a significant “public” user base (meaning doctors, scientists, other professionals, librarians, students, faculty, researchers, and interested lay persons outside the agency or organization).

34 Conclusions (Cont’d) The encouragement of such customer survey methods would seem consistent with the ICSTI mission to encourage broad public access to the highest quality STI throughout the world. The World Wide Web is now the global standard for STI dissemination, and use of methods such as the ACSI can help assure that the web sites and the information available from them are the best that they can be. Thanks to the NLM and NIH staff and others who contributed to the success of the ACSI project.

35 Acknowledgments Other NIH Leadership Team Members: -- Sue Feldman, Cindy Love, Mark Malamud, Dennis Rodrigues, Marie Lagana NIH Contractor Support: -- Larry Freed, Rick Jacobson, Joel Van Haaften, ForeSee Results, Inc. -- Jennifer Crafts, Westat Inc. -- Ron Oberbillig, Federal Consulting Group This presentation is based in part on material developed by Larry Freed, Joel Van Haaften, Jennifer Crafts, Sue Feldman, and Cindy Love.

36 For Further Information: Contact: Dr. Elliot R. Siegel Associate Director for Health Information Programs Development US National Library of Medicine US National Institutes of Health US Department of Health and Human Services Bldg. 38, Room 2S Rockville Pike Bethesda, MD 20894, USA Ph: