NIH's Enterprise Approach to Measuring Customer Satisfaction Presented at ACSI User Group Meeting March 20, 2007 Sue Feldman, National Cancer Institute Cindy Love, National Library of Medicine

Copyright: Published as Multimedia Appendix 4 in: Wood FB, Siegel ER, Feldman S, Love CB, Rodrigues D, Malamud M, Lagana M, Crafts J. Web Evaluation at the US National Institutes of Health: Use of the American Customer Satisfaction Index Online Customer Survey. J Med Internet Res 2008;10(1):e4. © the authors. Published under the terms of the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited, including full bibliographic details and the URL.

ACSI Trans-NIH Evaluation: ACSI Trans-NIH Leadership Team
National Library of Medicine: Elliot Siegel, Fred Wood, Cindy Love
National Cancer Institute: Sue Feldman
National Heart, Lung, and Blood Institute: Mark Malamud
NIH Office of Communications and Public Liaison: Dennis Rodrigues
NIH Center for Information Technology: Marie Lagana
Evaluation Contractor: Westat, Jennifer Crafts

ACSI Trans-NIH Evaluation
ForeSee Results: Larry Freed, Joel VanHaaften, Errol Hau, Rick Jacobson
Federal Consulting Group: Ron Oberbillig

National Institutes of Health: Transforming Health and Medicine Through Discovery

NIH Mission: Uncover new knowledge that leads to better health for everyone by:
–Supporting peer-reviewed scientific research at universities, medical schools, hospitals, and research institutions throughout the United States and overseas
–Conducting research in its own laboratories
–Training research investigators
–Developing and disseminating credible health information based on scientific discovery

Every Voice Counts: NIH stakeholders include the general public, scientists, voluntary organizations, scientific review committees, the U.S. President, Boards of Scientific Counselors, public members of advisory councils, professional societies, industry, patients and their advocacy groups, NIH staff, Congress, scientist council members, ad hoc advisors, physicians and other health professionals, and foreign governments.

Overview
–Why the National Institutes of Health (NIH) Decided to Take an Enterprise Approach to Measuring Customer Satisfaction
–Project Background
–Evaluation Results: How NIH Web Sites Have Used the ACSI, and What NIH Learned

Why the NIH Decided to Take an Enterprise Approach to Measuring Customer Satisfaction
–To strengthen each participating organization's Web evaluation capability
–To share Web evaluation learning and experience with the ACSI across NIH

–To evaluate the use of the ACSI as a Web evaluation tool for NIH Web sites:
–Usefulness in evaluating individual NIH Web sites
–When and how the ACSI would be most useful
–How sites might benefit
–Whether the ACSI could be integrated into the redesign cycles of the various Web sites at NIH

Project Background
–Fall 2003: NLM and NCI had both implemented the ACSI on a number of Web sites
–May 2004: NLM and NCI shared their experiences at a trans-NIH Web Authors Group (WAG) meeting
–WAG members were polled for interest in participating in a trans-NIH ACSI project
–A team of co-principal investigators was assembled, and NIH Evaluation Set-Aside funds were applied for

Project Background
–October 2004: the NIH Evaluation Branch funded the effort, initially for 18 months; in 2005, supplemental funding extended the project for 6 months
–The project was managed by a trans-NIH ACSI Leadership Team made up of the co-principal investigators
–The evaluation was conducted by Westat

Project Background
–Participation by 18 NIH institutes and centers and 13 offices of the Office of the NIH Director
–60 initial licenses; 55 licenses remained active through the project
–Web sites that collected enough completed surveys generated ACSI scores (see the scoring sketch below)
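The transcript does not include the scoring details, but the published ACSI methodology derives a 0–100 index from three core items (overall satisfaction, comparison to expectations, comparison to an ideal), each asked on a 1–10 scale; the production model weights the items via latent-variable estimation. A minimal Python sketch of an unweighted approximation, with illustrative data:

```python
# Simplified, unweighted approximation of an ACSI-style index.
# The production ACSI uses a latent-variable model with estimated
# question weights; this sketch treats the three core items,
# each on a 1-10 scale, as equally weighted.

def acsi_index(responses):
    """responses: list of (q1, q2, q3) tuples, each item scored 1-10.
    Returns an index on the familiar 0-100 ACSI scale."""
    item_means = [sum(r[i] for r in responses) / len(responses)
                  for i in range(3)]
    overall = sum(item_means) / 3       # unweighted average of item means
    return (overall - 1) / 9 * 100      # rescale 1-10 onto 0-100

# Three hypothetical survey completions (illustrative data only)
print(round(acsi_index([(8, 7, 7), (9, 8, 8), (6, 6, 5)]), 1))  # -> 67.9
```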

Project Background
The Web sites included:
–Organization home pages and/or portals
–Sites supporting access to and use of research data
–Sites for dissemination of medical information
–Sites for transacting extramural business, such as grant applications
–Sites promoting access to clinical trials
–Intranet sites
–Niche sites
Audiences included patients, family and friends of patients, health professionals, scientists/researchers, educators, administrators, librarians/information professionals, journalists/reporters, students, government employees, and the general public.

Evaluation Questions
–Through the offer of an ACSI license, were teams encouraged to use an online customer satisfaction survey?
–What was the value of using the ACSI?
–Did broad ACSI use provide additional enterprise-wide benefits?
–Did the evaluation provide any additional understanding about how NIH sites are used?

Evaluation Methodology
Data collected from October 2004 to May 2006 included:
–Review of related data from NIH Web site teams
–Surveys of NIH Web site teams
–Interviews with NIH Web site teams
–Observations of meetings

Evaluation Results
–How NIH Web Sites Have Used the ACSI
–What NIH Learned

How NIH Web Sites Have Used the ACSI
Teams used the ACSI as:
–A ready-to-use customer satisfaction metric with pre-approved Office of Management and Budget (OMB) clearance
–A tool for incorporating custom questions to identify specific site issues and problems
–A source of information about audience demographics
–A source for planning follow-up work involving additional evaluation methods
–An archive of data for future use and analysis

How NIH Web Sites Have Used the ACSI
–To benchmark against other government and industry sites
–To gain insights about, and opportunities for improving, their Web presence through site-specific feedback
–To respond more quickly and effectively to the ever-evolving Web
–To determine the impact of proposed Web site changes
–To evaluate whether programs performed significantly better or worse over the evaluation period (see the sketch below)
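As a hedged illustration of the last point, a before/after comparison of per-respondent scores can be tested with a standard two-sample t-test; the data and threshold below are illustrative, not from the NIH evaluation:

```python
# Illustrative significance test: did satisfaction change between two
# evaluation periods? Scores are hypothetical per-respondent values on
# the 0-100 scale; a real analysis would use each period's full export.
from scipy import stats

before = [72, 68, 75, 70, 66, 74, 71, 69]   # e.g., pre-redesign period
after = [78, 74, 80, 76, 73, 79, 75, 77]    # e.g., post-redesign period

t_stat, p_value = stats.ttest_ind(before, after, equal_var=False)
print(f"t = {t_stat:.2f}, p = {p_value:.4f}")
if p_value < 0.05:
    print("Change is statistically significant at the 5% level.")
```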

What NIH Learned
–ACSI Use for Individual Web Site Teams
–ACSI Trans-NIH Activities

ACSI Use for Individual Web Site Teams
–Web Site Team Rating of Key Start-Up Activities
–Overall Satisfaction With Use of ACSI to Evaluate Site
–Usefulness of Custom Questions and ACSI Scores
–Site Teams' Use of ACSI Data
–Barriers to Making Changes to Site

Web Site Team Rating of Key Start-Up Activities

Overall Satisfaction With Use of ACSI to Evaluate Site

Usefulness of Custom Questions and ACSI Scores

Site Teams' Use of ACSI Data

Types of Site Improvements Planned Using ACSI Data

Teams’ Plans to Use ACSI Data for Next Redesign

Barriers to Making Changes to Site

What NIH Learned: Considerations for Use of ACSI
Successful implementation of the ACSI methodology requires:
–Buy-in from staff and management
–Resources (staff time, license time)
–Commitment to evaluation and customer satisfaction, which takes time: to get familiar with the reports and methodology, and to identify priorities for what to work on and revise
–Understanding of how to take full advantage of custom questions and segmentation
–Coordination of the license with the site maintenance/revision cycle: when will you benefit most from customer satisfaction data?

What NIH Learned: NIH Site Characteristics and the ACSI
Associated with successful use:
–Timing the license period with the redesign cycle
–Committed resources
–Supportive management
–Adequate traffic volume
–Public site
–Non-niche site
Associated with issues/difficulties:
–Intranet
–Low traffic volume
–Manual page coding required
–Skeptical attitude within the organization
–Lack of support from staff or management
–Poor fit between the Web site team and the Satisfaction Research Analyst (SRA)
–Niche or specialty Web sites

What NIH Learned: ACSI Use for Individual Web Site Teams
The majority of Web site teams were able to implement the ACSI and receive results for their sites. Issues surfaced in cases where:
–Adding code to Web site pages was a labor-intensive process
–Internal staff or management were skeptical about the ACSI methodology
–ACSI data accumulated slowly (e.g., for intranet sites or sites with low traffic volume)

What NIH Learned: ACSI Use for Individual Web Site Teams
Across all sites, teams derived the most value from their custom question and segmentation data rather than from their ACSI model data:
–The data provided valuable insight about audience profiles and visit characteristics
–Teams took advantage of having a continuous feedback source for identifying site problems and audience information needs
–Teams used their custom question data to plan a variety of site improvements in areas identified as important for improving customer satisfaction (see the segmentation sketch below)
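A minimal sketch of the kind of segmentation analysis described above, assuming per-response exports with a custom "visitor role" question; the column names and data are illustrative, not the actual ForeSee export format:

```python
# Illustrative segmentation of satisfaction by a custom "visitor role"
# question. All column names and values are hypothetical.
import pandas as pd

df = pd.DataFrame({
    "visitor_role": ["patient", "clinician", "researcher",
                     "patient", "clinician", "patient"],
    "satisfaction": [82, 71, 68, 79, 74, 85],   # 0-100 scores
    "found_info": [True, True, False, True, False, True],
})

summary = df.groupby("visitor_role").agg(
    n=("satisfaction", "size"),           # respondents per segment
    mean_score=("satisfaction", "mean"),  # average satisfaction
    found_rate=("found_info", "mean"),    # share who found what they needed
)
print(summary)
```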

What NIH Learned: ACSI Use for Individual Web Site Teams
Timing of the license was a key factor in the perceived value of the ACSI:
–Teams that were actively updating or redesigning their sites used the custom questions and segmentation analyses to address needs; these teams tended to have resources ready to act on results and implement site changes
–Teams that did not currently have the staff time to devote to reviewing results indicated that they were saving their qualitative data for use in planning their next redesign

What NIH Learned: ACSI Use for Individual Web Site Teams
Longevity was a key factor in making optimal use of the ACSI for Web site evaluation:
–Teams that used the ACSI the longest tended to be satisfied with and find value in its use, especially for planning site changes and comparing versions of the site before and after revisions
–Teams whose license terms started relatively late, or whose sites collected ACSI surveys slowly, tended to be dissatisfied with the ACSI because they did not have sufficient time or opportunity to receive and act on results

What NIH Learned: ACSI Use for Individual Web Site Teams
Web site teams expressed some dissatisfaction with the process of using the ACSI in cases where:
–There was turnover among the Satisfaction Research Analysts (SRAs) assigned by ForeSee
–Teams perceived that ACSI satisfaction scores did not truly reflect site quality (e.g., sites whose visitors look for content that does not fit the site's mission)
–Staff time constraints were a barrier to attending to or acting on the perceived large volume of ACSI data
Some of these teams would prefer to use an online survey on a more intermittent basis.

Web Sites That Were Less Successful in Using the ACSI
Factors included:
–Difficulty collecting 300 completed ACSI surveys in a timely way (see the arithmetic sketch below)
–Poor timing of the license period relative to the Web site development/redesign schedule
–Poor fit between the Web site team and the SRA
–"Niche" or specialty Web sites
–Lack of support from staff or management
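To see why timely collection of 300 surveys was a hurdle for low-traffic sites, a back-of-the-envelope calculation helps; the invitation and completion rates below are assumptions chosen for illustration, not ForeSee's actual configuration:

```python
# Back-of-the-envelope traffic estimate for reaching 300 completed
# surveys; the rates are illustrative assumptions only.
target_surveys = 300
invite_rate = 0.10     # assumed fraction of visitors shown the invitation
complete_rate = 0.05   # assumed fraction of invitees who finish the survey

visits_needed = target_surveys / (invite_rate * complete_rate)
print(f"Roughly {visits_needed:,.0f} visits to reach {target_surveys} surveys")
# -> Roughly 60,000 visits; intranet and niche sites with low traffic
#    therefore accumulate surveys slowly, matching the findings above.
```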

ACSI Trans-NIH Activities
–Increased interest in Web evaluation and customer satisfaction measurement
–Promoted user-centered design
–Encouraged collaboration across NIH
–Secured permission for NIH Web sites to use persistent cookies in conjunction with the ACSI (see the sketch below)
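A hedged sketch of the suppression logic that a persistent cookie enables: once a visitor has been invited, they are not re-invited on later visits. ForeSee's actual mechanism was client-side script; this server-side Python version, with hypothetical names, rates, and cookie lifetime, just shows the idea:

```python
# Hypothetical invitation sampling with a persistent suppression
# cookie; names, rates, and lifetime are illustrative only.
import random
from http.cookies import SimpleCookie

SAMPLING_RATE = 0.10  # invite roughly 10% of eligible visitors

def handle_visit(cookie_header: str) -> tuple[bool, str]:
    """Return (show_invitation, Set-Cookie value or empty string)."""
    cookies = SimpleCookie(cookie_header)
    if "acsi_invited" in cookies:
        return False, ""                     # already invited: suppress
    if random.random() < SAMPLING_RATE:
        c = SimpleCookie()
        c["acsi_invited"] = "1"
        c["acsi_invited"]["max-age"] = 60 * 60 * 24 * 90  # persist ~90 days
        return True, c["acsi_invited"].OutputString()
    return False, ""

show, set_cookie = handle_visit("")          # first visit, no cookie yet
print(show, set_cookie)
```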

ACSI Trans-NIH Activities
Provided sharing of lessons learned and experiences across NIH:
–Shared case studies
–Shared the value of custom questions
–Encouraged use of different types of custom questions
–Demonstrated use of custom questions to investigate timely topics
–Discussed opportunities for improving the NIH Web presence

ACSI Trans-NIH Activities
Enabled benchmarking of performance against other agencies, departments, and organizations:
–Where NIH fits with ForeSee trends and insights
–Areas of relative strengths and weaknesses
By aggregating similar custom questions across sites (see the pooling sketch below):
–Provided better understanding of user needs for health information and user motivations for seeking and using information
–Provided deeper insights into the roles and demographics of users, and better understanding of why they came to a Web site and what they did with the information they found
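A minimal sketch of pooling similar custom questions across sites, assuming each site phrased its audience question slightly differently, so responses must be mapped to a common scheme before aggregation; the site names, categories, and scores are hypothetical:

```python
# Illustrative pooling of similar custom questions across sites:
# each site's role categories are mapped onto a shared scheme
# before aggregating. All names and values are hypothetical.
import pandas as pd

site_a = pd.DataFrame({"role": ["Patient", "Doctor"], "score": [80, 72]})
site_b = pd.DataFrame({"role": ["patient/family", "physician"],
                       "score": [77, 70]})

COMMON = {
    "Patient": "patient", "patient/family": "patient",
    "Doctor": "health professional", "physician": "health professional",
}

pooled = pd.concat([site_a.assign(site="A"), site_b.assign(site="B")],
                   ignore_index=True)
pooled["segment"] = pooled["role"].map(COMMON)
print(pooled.groupby("segment")["score"].agg(["size", "mean"]))
```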

ACSI Trans-NIH Activities
NIH-wide meetings:
–Highlighted contributions and challenges of the ACSI
–Provided a forum to share lessons learned and identify future directions and opportunities
–Contributed to increasing awareness and understanding of Web evaluation at NIH
Network of NIH Web site professionals:
–Provided informal mentoring by experienced and knowledgeable Web site team members and teams

Conclusions
The ACSI is a useful methodology for Web evaluation:
–Online user surveys can provide helpful information about, and better understanding of, Web site users, and contribute to a user-centered approach to Web site design
–The ACSI provides added value because of its rigorous and proven methodology, standardized questions, benchmarking, optional custom questions, and good price-value ratio
Overall, NIH sites derived benefit from its use.

Conclusions
This project enhanced NIH's leadership position in Web evaluation:
–The trans-NIH project was the first enterprise-wide ACSI application and the largest enterprise Web evaluation project to date in the US Government
–NIH Web sites performed well overall against other US Government and private-sector benchmarks, and as a result NIH received significant positive media coverage

Conclusions
–Most NIH sites were only beginning to integrate the ACSI into their respective redesign cycles
–The ACSI is not for all Web sites; it requires sufficient site traffic and customer base, plus adequate management and financial support
–Use of the ACSI can help ensure that Web sites, and the information available from them, are the best that they can be

Acknowledgments
Thanks to the NIH staff and others who contributed to the success of the ACSI project, and a special thanks to Cindy Love for collaborating with me on this presentation.

Sue Feldman Cindy Love

Questions?