New Ways of Listening to Library Users: New Tools for Measuring Service Quality
A. Parasuraman, University of Miami
Washington, DC, November 4, 2005
© A. Parasuraman, University of Miami; not to be reproduced or disseminated without the author's permission
Defining, Assessing, and Measuring Service Quality: A Conceptual Overview
Multi-Phase, Multi-Sector, Multi-Year Program of Research to Address the Following Issues
How do customers perceive and evaluate service quality?
What are managers' perceptions about service quality?
Do discrepancies exist between the perceptions of customers and those of managers?
Can customers' and managers' perceptions be combined into a general model of service quality?
How can service organizations improve customer service and achieve excellence?
Determinants of Perceived Service Quality
[Diagram: Expected service is shaped by word of mouth, personal needs, past experience, and external communication to customers. Perceived service quality arises from the gap between expected service and perceived service.]
A "GAPS" MODEL OF SERVICE QUALITY
[Diagram: On the customer side, the Service Quality Gap (Gap 5) separates customers' service expectations from customers' service perceptions. On the organization side, four internal gaps feed into it:
Gap 1, the Market Information Gap, between customers' expectations and the organization's understanding of those expectations;
Gap 2, the Service Standards Gap, between that understanding and the organization's service standards;
Gap 3, the Service Performance Gap, between the standards and the organization's service performance;
Gap 4, the Internal Communication Gap, between service performance and the organization's communications to customers.]
POTENTIAL CAUSES OF INTERNAL SERVICE GAPS [GAPS 1 - 4]
GAP 1 (Lack of "Upward Communication"): Customer Expectations vs. Management Perceptions of Customer Expectations
Key Factors:
Insufficient marketing research
Inadequate use of marketing research
Lack of interaction between management and customers
Insufficient communication between contact employees and managers
GAP 2: Management Perceptions of Customer Expectations vs. Service Quality Specifications
Key Factors:
Inadequate management commitment to service quality
Absence of formal process for setting service quality goals
Inadequate standardization of tasks
Perception of infeasibility (that customer expectations cannot be met)
GAP 3: Service Quality Specifications vs. Service Delivery
Key Factors:
Lack of teamwork
Poor employee-job fit
Poor technology-job fit
Lack of perceived control (contact personnel)
Inappropriate evaluation/compensation system
Role conflict among contact employees
Role ambiguity among contact employees
GAP 4 (Lack of "Horizontal Communication"): Service Delivery vs. External Communications to Customers
Key Factors:
Inadequate communication between salespeople and operations
Inadequate communication between advertising and operations
Differences in policies and procedures across branches or departments
Puffery in advertising and personal selling
SUGGESTIONS FOR CLOSING INTERNAL SERVICE GAPS [GAPS 1 - 4]
Suggestions for Closing the Market Information Gap
Conduct systematic marketing research
Make senior managers interact with customers
Make senior managers occasionally perform customer-contact roles
Encourage upward communication from customer-contact employees
Suggestions for Closing the Service Standards Gap
Make a blueprint of the service and standardize as many components of it as possible
Institute a formal, ongoing process for setting service specifications
Eliminate "perception of infeasibility" on the part of senior managers
Make a true commitment to improving service quality
Suggestions for Closing the Service Performance Gap
Invest in ongoing employee training
Support employees with appropriate technology and information systems
Give customer-contact employees sufficient flexibility
Reduce role conflict and role ambiguity among customer-contact employees
Recognize and reward employees who deliver superior service
Suggestions for Closing the Internal Communication Gap
Facilitate effective horizontal communication across functional areas (e.g., marketing and operations)
Have consistent customer-related policies and procedures across branches or departments
Resist the temptation to promise more than the organization can deliver
Process Model for Continuous Measurement and Improvement of Service Quality
[Flowchart: Start by asking whether your customers perceive your offerings as meeting or exceeding their expectations. If yes, continue to monitor customers' expectations and perceptions. If no, work through the diagnostic questions in turn: Do you have an accurate understanding of customers' expectations? Are there specific standards in place to meet customers' expectations? Do your offerings meet or exceed the standards? Is the information communicated to customers about your offerings accurate? A "no" answer at any step signals where to take corrective action.]
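Read as pseudocode, the flow above is a short diagnostic loop. The sketch below is illustrative only: the question keys and messages are invented labels for the questions on the slide, not part of the original model.

```python
# Illustrative sketch of the continuous-measurement flow. In practice the
# answers would come from customer surveys and internal audits; here they are
# supplied as booleans keyed by hypothetical names.

DIAGNOSTIC_QUESTIONS = [
    "accurate_understanding_of_expectations",    # Gap 1
    "standards_in_place_to_meet_expectations",   # Gap 2
    "offerings_meet_or_exceed_standards",        # Gap 3
    "communications_to_customers_are_accurate",  # Gap 4
]

def service_quality_review(answers: dict) -> str:
    """Return the next step suggested by the process model."""
    # Entry question: do customers perceive offerings as meeting or
    # exceeding their expectations?
    if answers.get("customers_perceive_expectations_met", False):
        return "Continue to monitor customers' expectations and perceptions"
    # Otherwise walk the internal checks in order; the first "no" shows
    # where corrective action is needed.
    for question in DIAGNOSTIC_QUESTIONS:
        if not answers.get(question, False):
            return "Take corrective action: " + question
    return "All internal checks passed; re-examine the measurements"

# Example with hypothetical audit results:
print(service_quality_review({
    "customers_perceive_expectations_met": False,
    "accurate_understanding_of_expectations": True,
    "standards_in_place_to_meet_expectations": False,
}))  # -> Take corrective action: standards_in_place_to_meet_expectations
```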
SERVQUAL: Development, Refinement, and Empirical Findings
Determinants of Perceived Service Quality
Dimensions of Service Quality:
1. Access
2. Communication
3. Competence
4. Courtesy
5. Credibility
6. Reliability
7. Responsiveness
8. Security
9. Tangibles
10. Understanding/Knowing the Customer
[Diagram: as on the earlier slide, expected service is shaped by word of mouth, personal needs, past experience, and external communication to customers, and perceived service quality arises from the gap between expected and perceived service.]
Correspondence between SERVQUAL Dimensions and Original Ten Dimensions for Evaluating Service Quality
(original ten dimensions on the left, SERVQUAL dimensions on the right)
Tangibles -> Tangibles
Reliability -> Reliability
Responsiveness -> Responsiveness
Competence, Courtesy, Credibility, Security -> Assurance
Access, Communication, Understanding/Knowing the Customer -> Empathy
Definitions of the SERVQUAL Dimensions
Tangibles: Appearance of physical facilities, equipment, personnel, and communication materials.
Reliability: Ability to perform the promised service dependably and accurately.
Responsiveness: Willingness to help customers and provide prompt service.
Assurance: Knowledge and courtesy of employees and their ability to inspire trust and confidence.
Empathy: Caring, individualized attention the firm provides its customers.
Relative Importance of Service Dimensions When Respondents Allocate 100 Points [Study 1]
Reliability: 32%
Responsiveness: 22%
Assurance: 19%
Empathy: 16%
Tangibles: 11%
Relative Importance of Service Quality Dimensions [Study 2]
[Chart: mean number of points allocated out of 100 to Reliability, Responsiveness, Assurance, Empathy, and Tangibles by customers of a computer manufacturer, a retail chain, an auto insurer, a life insurer, and all companies combined; point values visible in the chart include 37, 9, 13, 18, and 23.]
Mean SERVQUAL Scores by Service Dimension [Study 1]
[Chart: mean SERVQUAL scores for Tangibles, Reliability, Responsiveness, Assurance, and Empathy, plotted on an axis running from 1.00 through 0.00 down to -2.00.]
Nature of Service Expectations
Desired Service: the level customers believe can and should be delivered.
Adequate Service: the minimum level customers are willing to accept.
The Zone of Tolerance lies between the two.
The Two Levels of Expectations Imply Two Corresponding Measures of Gap 5:
Measure of Service Adequacy (MSA) = Perceived Service - Adequate Service
Measure of Service Superiority (MSS) = Perceived Service - Desired Service
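Applied to ratings collected on a common scale, these two measures, together with a zone-of-tolerance check, follow directly from the definitions above. A minimal sketch in Python with made-up ratings (the 1-9 scale is an assumption borrowed from the 0-9 axes of the later charts):

```python
def msa(perceived: float, adequate: float) -> float:
    """Measure of Service Adequacy: perceived minus adequate service."""
    return perceived - adequate

def mss(perceived: float, desired: float) -> float:
    """Measure of Service Superiority: perceived minus desired service."""
    return perceived - desired

def position_vs_zone_of_tolerance(perceived: float, adequate: float, desired: float) -> str:
    """Locate a perception score relative to the zone of tolerance."""
    if perceived < adequate:
        return "below the zone of tolerance (MSA is negative)"
    if perceived > desired:
        return "above the zone of tolerance (MSS is positive)"
    return "within the zone of tolerance"

# Hypothetical ratings on a 1-9 scale:
perceived, adequate, desired = 6.8, 5.7, 7.5
print(round(msa(perceived, adequate), 1))   # 1.1  -> exceeds the adequate level
print(round(mss(perceived, desired), 1))    # -0.7 -> falls short of the desired level
print(position_vs_zone_of_tolerance(perceived, adequate, desired))  # within the zone of tolerance
```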
TWO APPROACHES FOR MEASURING MSA AND MSS
Two-Column Format Questionnaire
–Direct measures of MSA and MSS
Three-Column Format Questionnaire
–Difference-score measures of MSA and MSS
Measurement Error: Percent of Respondents Answering Incorrectly
[Chart: error rates shown by type of company.]
Mean Service Quality Scores (Combined Across All Companies)
[Chart: scores shown by SERVQUAL dimension.]
Revised SERVQUAL Items
Reliability
1. Providing services as promised
2. Dependability in handling customers' service problems
3. Performing services right the first time
4. Providing services at the promised time
5. Keeping customers informed about when services will be performed
Responsiveness
6. Prompt service to customers
7. Willingness to help customers
8. Readiness to respond to customers' requests
Assurance
9. Employees who instill confidence in customers
10. Making customers feel safe in their transactions
11. Employees who are consistently courteous
12. Employees who have the knowledge to answer customer questions
Empathy
13. Giving customers individual attention
14. Employees who deal with customers in a caring fashion
15. Having the customer's best interest at heart
16. Employees who understand the needs of their customers
Tangibles
17. Modern equipment
18. Visually appealing facilities
19. Employees who have a neat, professional appearance
20. Visually appealing materials associated with the service
21. Convenient business hours
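As a rough illustration of how responses to items such as these are commonly scored (a generic gap-scoring sketch, not necessarily the exact procedure used in the studies reported here): compute a per-item gap score as perception minus expectation, average the gaps within each dimension, and optionally weight the dimension scores by importance points such as those from Study 1 earlier. The item-to-dimension grouping follows the list above; the ratings themselves are invented.

```python
# Map the 21 revised SERVQUAL items to their dimensions (item numbers follow
# the list above).
DIMENSIONS = {
    "Reliability":    range(1, 6),
    "Responsiveness": range(6, 9),
    "Assurance":      range(9, 13),
    "Empathy":        range(13, 17),
    "Tangibles":      range(17, 22),
}

def dimension_gap_scores(perceptions, expectations):
    """Average the per-item gap scores (perception - expectation) by dimension."""
    scores = {}
    for dim, items in DIMENSIONS.items():
        gaps = [perceptions[i] - expectations[i] for i in items]
        scores[dim] = sum(gaps) / len(gaps)
    return scores

def weighted_overall_score(dim_scores, importance_points):
    """Weight the dimension scores by 100-point importance allocations."""
    total = sum(importance_points.values())
    return sum(dim_scores[d] * importance_points[d] / total for d in dim_scores)

# Hypothetical single-respondent ratings:
perceptions  = {i: 5.0 for i in range(1, 22)}
expectations = {i: 6.0 for i in range(1, 22)}
weights = {"Reliability": 32, "Responsiveness": 22, "Assurance": 19,
           "Empathy": 16, "Tangibles": 11}   # point allocation from Study 1

dims = dimension_gap_scores(perceptions, expectations)
print(dims)                                   # each dimension gap: -1.0
print(weighted_overall_score(dims, weights))  # overall weighted gap: -1.0
```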
Service Quality Perceptions Relative to Zones of Tolerance, by Dimension: Computer Manufacturer
[Chart: for Reliability, Responsiveness, Assurance, Empathy, and Tangibles, the zone of tolerance and the service-quality perception are plotted on a 0-9 scale.]
Service Quality Perceptions Relative to Zones of Tolerance, by Dimension: On-Line Services
[Chart: for Reliability, Responsiveness, Assurance, Empathy, and Tangibles, the zone of tolerance and the service-quality perception are plotted on a 0-9 scale; plotted values range from 5.7 to 8.4.]
Service Quality Perceptions Relative to Zones of Tolerance, by Dimension: Tech-Support Services
[Chart: for Reliability, Responsiveness, Assurance, and Empathy, the zone of tolerance and the service-quality perception are plotted on a 0-9 scale; plotted values range from 6.1 to 8.5.]
LIBQUAL+: An Adaptation of SERVQUAL
© Association of Research Libraries, Washington, DC (2003)
MULTIPLE METHODS OF LISTENING TO CUSTOMERS
Transactional surveys*
Mystery shopping
New, declining, and lost-customer surveys
Focus group interviews
Customer advisory panels
Service reviews
Customer complaint, comment, and inquiry capture
Total market surveys*
Employee field reporting
Employee surveys
Service operating data capture
* A SERVQUAL-type instrument is most suitable for these methods
The Role of Technology in Service Delivery: Electronic Service Quality (e-SQ) and Technology Readiness (TR)
Technology's Growing Role in Marketing to and Serving Customers: Pyramid Model
[Diagram: a triangle linking the Company, its Employees, and its Customers, with Technology at its center. Internal marketing runs between the company and its employees, external marketing between the company and its customers, and interactive marketing between employees and customers; technology is connected to all three.]
Ongoing Research on e-Service Quality: Conceptual Framework and Preliminary Findings
Research Phases and Questions
PHASE 1:
What is good service on the Web?
What are the underlying dimensions of superior electronic service quality (e-SQ)?
How can e-SQ be conceptualized?
PHASE 2:
How do these dimensions compare to those of traditional service quality?
How can e-SQ be measured and thereby assessed?
Definition of e-Service Quality (e-SQ)
e-SQ is the extent to which a Web site facilitates efficient and effective shopping, purchasing, and delivery of products and services.
Dimensions of e-Service Quality from Focus Groups
Access
Ease of Navigation
Efficiency
Customization/Personalization
Security/Privacy
Responsiveness
Assurance/Trust
Price Knowledge
Site Aesthetics
Reliability
Flexibility
Reliability
Definition: Correct technical functioning of the site and the accuracy of service promises, billing, and product information.
Sample attributes:
Site does not crash
Accurate billing
Accuracy of order
Accuracy of account information
Having items in stock
Truthful information
Merchandise arrives on time
Efficiency
Definition: The site is simple to use, structured properly, and requires a minimum of information to be input by the customer.
Sample attributes:
Site is well organized
Site is simple to use
Site provides information in reasonable chunks
Site allows me to click for more information if I need it
Means-End Model
[Diagram: levels run from specific/concrete to abstract: Concrete Cues -> Perceptual Attributes -> Dimensions -> Higher-Level Abstractions.]
Means-End Model of e-Service Quality
[Diagram: concrete cues such as a search engine, one-click ordering, tab structuring, and a site map correspond to perceptual attributes such as "easy to maneuver through site," "easy to find what I need," and "speed of checkout," which together form the dimension Ease of Navigation.]
Perceived e-Service Quality
[Diagram: the dimensions (Efficiency, Ease of Navigation, Access, Personalization, Security/Privacy, Price Knowledge, Assurance/Trust, Responsiveness, Site Aesthetics, Reliability, and Flexibility) roll up into Perceived e-Service Quality at the higher-abstraction level.]
Means-End Model of e-Service Quality
[Diagram: the higher-level abstractions include perceived e-service quality, perceived price, perceived convenience, perceived control, and perceived value; these lead to behaviors such as purchase, loyalty, and word of mouth (W.O.M.).]
Conceptual Model for Understanding and Improving e-Service Quality
[Diagram: On the company side, an Information Gap separates customers' Web site requirements from management's beliefs about those requirements; a Design Gap separates management's beliefs from the design and operation of the Web site; and a Communication Gap separates the design and operation of the Web site from the marketing of the Web site. On the customer side, a Fulfillment Gap separates customers' Web site requirements from their actual Web site experiences; those experiences shape perceived e-SQ and perceived value, which drive purchase and repurchase.]
Dimensions of e-SQ
Core Dimensions [E-S-QUAL]:
Efficiency
Fulfillment
System Availability
Privacy
Recovery Dimensions [E-RecS-QUAL]:
Responsiveness
Compensation
Contact
Source: Parasuraman, Zeithaml, and Malhotra, "E-S-QUAL: A Multiple-Item Scale for Assessing Electronic Service Quality," Journal of Service Research, February 2005.
Definitions of e-SQ Dimensions
E-S-QUAL Dimensions
Efficiency: The ease and speed of accessing and using the site.
Fulfillment: The extent to which the site's promises about order delivery and item availability are fulfilled.
System Availability: The correct technical functioning of the site.
Privacy: The degree to which the site is safe and protects customer information.
E-RecS-QUAL Dimensions
Responsiveness: Effective handling of problems and returns through the site.
Compensation: The degree to which the site compensates customers for problems.
Contact: The availability of assistance through telephone and online representatives.
Source: Parasuraman, Zeithaml, and Malhotra, "E-S-QUAL: A Multiple-Item Scale for Assessing Electronic Service Quality," Journal of Service Research, February 2005.
An Important Implication of the Pyramid Model
An organization's ability to use technology effectively in marketing to and serving customers critically depends on the technology readiness of its customers and employees.
What is Technology Readiness [TR]?
TR refers to "people's propensity to embrace and use new technologies for accomplishing goals in home life and at work."
Multinational Research Studies on Technology Readiness
Began in 1997 in the USA and still ongoing
Being conducted in collaboration with Charles Colby, President, Rockbridge Associates
Have thus far involved several qualitative and quantitative studies
Completed studies include five "National Technology Readiness Surveys" in the USA [NTRS 1999, 2000, 2001, 2002, and 2004]
National studies also have been done or are underway in Austria, Chile, Germany, Singapore, and Sweden
Key Insights from Qualitative Research Studies
TR doesn't just refer to possessing technical skills; TR is much more a function of people's beliefs and feelings about technology
People's beliefs can be positive about some aspects of technology but negative about other aspects
The relative strengths of the positive and negative beliefs determine a person's receptivity to technology
Technology-Beliefs Continuum
[Diagram: a continuum running from Resistant to Technology, through Neutral, to Receptive to Technology.]
Link between Technology Beliefs and Technology Readiness
[Diagram: position on the technology-beliefs continuum (Resistant to Technology, Neutral, Receptive to Technology) corresponds to low, medium, and high technology readiness, respectively.]
Quantitative Survey Methodology
Each NTRS in the U.S. included a random sample of adults: 1,000 respondents in 1999 and 2000, and 500 respondents in 2001, 2002, and 2004
Data collected via computer-assisted telephone interviewing
Survey included questions about technology beliefs, demographics, psychographics, and technology-related behaviors and preferences
Key Insights from Quantitative Research Studies
TR consists of four facets or dimensions that are fairly independent of one another
People's ratings on a set of belief statements about technology can be combined to create a reliable and valid measure of TR, i.e., a "Technology Readiness Index" [TRI]
The TRI is a good predictor of people's technology-related behaviors and preferences
A meaningful typology of customers can be created based on their TR scores on the four dimensions
Drivers of Technology Readiness
Contributors: Optimism, Innovativeness
Inhibitors: Discomfort, Insecurity
Definitions of the TR Drivers
Optimism: Positive view of technology; belief that it offers increased control, flexibility, and efficiency
Innovativeness: Tendency to be a technology pioneer and thought leader
Discomfort: Perceived lack of control over technology and a feeling of being overwhelmed by it
Insecurity: Distrust of technology and skepticism about its working properly
The TRI: A 36-Item, 4-Dimensional Scale to Measure TR
Optimism: 10 items
Innovativeness: 7 items
Discomfort: 10 items
Insecurity: 9 items
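A rough sketch of how a multi-item scale like this is typically scored: average the ratings within each dimension, reverse-score the inhibitor dimensions (Discomfort and Insecurity) so that higher always means greater readiness, and combine the four dimension scores into an overall index. The 1-5 agreement scale and the simple averaging are illustrative assumptions, not the published TRI scoring procedure.

```python
# Sketch of scoring a technology-readiness questionnaire. Assumes a 1-5
# agreement scale; item counts follow the slide above (10/7/10/9 items).

SCALE_MAX, SCALE_MIN = 5, 1

def mean(values):
    return sum(values) / len(values)

def reverse(values):
    # Reverse-score inhibitor items so that higher always means more ready.
    return [SCALE_MAX + SCALE_MIN - v for v in values]

def technology_readiness_index(optimism, innovativeness, discomfort, insecurity):
    """Combine the four dimension scores into an overall index."""
    dimension_scores = {
        "optimism":       mean(optimism),
        "innovativeness": mean(innovativeness),
        "discomfort":     mean(reverse(discomfort)),
        "insecurity":     mean(reverse(insecurity)),
    }
    return mean(list(dimension_scores.values())), dimension_scores

# Hypothetical respondent (item counts: 10, 7, 10, 9):
tri, by_dim = technology_readiness_index(
    optimism=[4] * 10, innovativeness=[3] * 7,
    discomfort=[4] * 10, insecurity=[5] * 9)
print(by_dim)  # discomfort and insecurity reverse-scored to 2.0 and 1.0
print(tri)     # (4 + 3 + 2 + 1) / 4 = 2.5
```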
Customer Beliefs About Technology
Example of Optimism: "Technology gives people more control over their daily lives"
Percent of respondents agreeing: 61% in 1999, 68% in 2000, 65% in 2001, 65% in 2002, 67% in 2004
Example of Innovativeness: "You keep up with the latest technological developments in your areas of interest"
Percent of respondents agreeing: 68% in 1999, 69% in 2000, 65% in 2001, 59% in 2002, 60% in 2004
Customer Beliefs About Technology
Example of Discomfort: "It is embarrassing when you have trouble with a high-tech gadget while people are watching"
Percent of respondents agreeing: 52% in 1999, 54% in 2000, 55% in 2001, 51% in 2002, 46% in 2004
Example of Insecurity: "Any business transaction you do electronically should be confirmed later with something in writing"
Percent of respondents agreeing: 87% in 1999, 88% in 2000, 82% in 2001, 82% in 2002, 78% in 2004
TR Scores by Dimension and Overall TRI
[Chart: mean TR scores for Optimism (OPT.), Innovativeness (INN.), Discomfort (DIS.), Insecurity (INS.), and the overall TRI, for the 1999, 2000, 2001, 2002, and 2004 surveys.]
[Chart: percentage of respondents across the range from low TR to high TR.]
TRI Scores by Demographics (NTRS 2004)
[Chart.]
Predicted Change in TR of Age Cohorts over Time
[Diagram: the TR of a series of age cohorts is tracked over successive survey periods (years 1-5 through years 26-30), within the age range covered by the TR surveys.]
Five TR-Based Customer Segments
Segment     Optimism   Innovativeness   Discomfort   Insecurity
Explorers   High       High             Low          Low
Pioneers    High       High             High         High
Skeptics    Low        Low              Low          Low
Paranoids   High       Low              High         High
Laggards    Low        Low              High         High
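As an illustration of how respondents might be sorted into these segments from their dimension scores, the sketch below uses a simple high/low split at an assumed scale midpoint; the threshold and decision rules are invented for illustration (the published typology is derived statistically from survey data), but they mirror the high/low pattern in the table above.

```python
# Illustrative assignment of a respondent to one of the five TR segments
# based on high/low dimension scores. "High" is defined here as above the
# assumed midpoint of a 1-5 scale.

MIDPOINT = 3.0

def classify_segment(optimism, innovativeness, discomfort, insecurity):
    motivated = optimism > MIDPOINT and innovativeness > MIDPOINT
    inhibited = discomfort > MIDPOINT or insecurity > MIDPOINT
    if motivated and not inhibited:
        return "Explorer"
    if motivated and inhibited:
        return "Pioneer"
    if optimism > MIDPOINT and innovativeness <= MIDPOINT and inhibited:
        return "Paranoid"
    if not inhibited:
        return "Skeptic"    # low motivation, low resistance
    return "Laggard"        # low motivation, high resistance

# Hypothetical dimension scores (optimism, innovativeness, discomfort, insecurity):
print(classify_segment(4.2, 3.8, 2.1, 2.4))  # Explorer
print(classify_segment(4.0, 2.5, 3.8, 4.1))  # Paranoid
print(classify_segment(2.2, 2.0, 4.0, 4.4))  # Laggard
```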
Typology of Technology Customers: Percent of Population in Each Segment
[Chart: segment shares for the 1999, 2000, 2001, 2002, and 2004 surveys.]
TR Segments and Technology Adoption
[Diagram: technology readiness (high to low) plotted against time of adoption of new technologies (early to late); the segments fall in the order Explorers, Pioneers, Skeptics, Paranoids, Laggards.]
High-Tech versus High-Touch Customer Service
[Diagram: the appeal of high-tech service channels (high to low) plotted against the appeal of high-touch service channels (low to high); the segments fall in the order Explorers, Pioneers, Skeptics, Paranoids, Laggards.]
In Conclusion, to Deliver Superior Service in Library Environments:
Understand customers' service expectations and how well those expectations are being met
Work systematically to remove organizational barriers that lead to poor customer service, offline and online
Recognize and capitalize on the increasing role of technology in serving customers, but...
Be cognizant of customers' and employees' readiness to embrace technology-based services
Recognize that e-service quality as perceived by customers involves much more than having a state-of-the-art website
Put in place a solid behind-the-scenes infrastructure (information systems, logistics, and human resources) to deliver what a website's façade promises
Continuously monitor customers' and employees' reactions to and experiences with your electronic interfaces
Sources of Information about Customer Service and Technology Readiness
www.technoreadymarketing.com
Thank You!