
Messung und statistische Analyse von Kundenzufriedenheit (Measurement and Statistical Analysis of Customer Satisfaction)
KF Qualitätsmanagement Vertiefungskurs V

Outline
- Customer satisfaction measurement
- The Structural Equation Model (SEM)
- Estimation of SEMs
- Evaluation of SEMs
- Practice of SEM analysis

The ACSI Model
[path diagram not reproduced]

ACSI Model: Latent Variables
- Customer Expectations: combine customers' experiences with information obtained via media, advertising, salespersons, and word-of-mouth from other customers
- Perceived Quality: overall quality, reliability, the extent to which a product/service meets the customer's needs
- Customer Satisfaction: overall satisfaction, fulfillment of expectations, comparison with the ideal
- Perceived Value: overall price given quality and overall quality given price
- Customer Complaints: percentage of respondents who reported a problem
- Customer Loyalty: likelihood to purchase at various price points

ACSI Scores: Personal Computers (Manufacturing/Durables)
Baseline and quarterly ACSI scores with percentage changes [score values not reproduced]. Recoverable percentage changes: Personal Computers −5.1%; Apple Computer, Inc. (value missing); Dell Inc. (NM) 9.7%; Gateway, Inc. (NM) −2.6%; All Others (NM) 1.4%; Hewlett-Packard Company – HP −9.0%; Hewlett-Packard Company – Compaq −11.5%.

The European Customer Satisfaction Index (ECSI)
[path diagram not reproduced]

The ACSIe Model for Food Retail
Latent variables: Expectations, Perceived Quality, Value, Emotional Factor, Customer Satisfaction, Loyalty; latent variables and path coefficients after Hackl et al. (2000) [path diagram not reproduced]

Austrian Food Retail Market
- Pilot for an Austrian national CS index (Zuba, 1997)
- Data collection: December 1996 by Dr Fessel & GfK (professional market research agency)
- 839 interviews, 327 complete observations
- Austria-wide active food retail chains (1996: ~50% of the 10.5 billion EUR market)
  - Billa: well-assorted medium-sized outlets
  - Hofer: limited range at good prices
  - Merkur: large supermarkets with a comprehensive range
  - Meinl: top in quality and service

The Data: Indicators and Latent Variables
- Expectations (E): total expected quality (EGESQ), expected compliance with demands (EANFO), expected shortcomings (EMANG)
- Perceived Quality (Q): total perceived quality (OGESQ), perceived compliance with needs (OANFO), perceived shortcomings (OMANG)
- Value (P): value for price (VAPRI), price for value (PRIVA)
- Customer Satisfaction (CS): total satisfaction (CSTOT), fulfilled expectations (ERWAR), comparison with ideal (IDEAL)
- Voice (V): number of oral complaints (NOBES), number of written complaints (NOBRI)
- Loyalty (L): repurchase probability (WIEDE), tolerance against price change (PRVER)

The Emotional Factor
Principal component analysis of the satisfaction drivers
- staff (availability, politeness)
- outlet (make-up, presentation of merchandise, cleanliness)
- range (freshness and quality, richness)
- price-value ratio (value for price, price for value)
- customer orientation (access to outlet, shopping hours, queuing time for checkout, paying modes, price information, sales, availability of sales)
identifies (Zuba, 1997)
- staff, outlet, range: "Emotional factor"
- price-value ratio: "Value"
- customer orientation: "Cognitive factor"

Structural Equation Models
Combine three concepts:
- Latent variables: Pearson (1904), psychometrics; factor analysis model
- Path analysis: Wright (1934), biometrics; a technique to analyze systems of relations
- Simultaneous regression models: econometrics

Customer Satisfaction
- is the result of the customer's comparison of his/her expectations with his/her experiences
- has consequences for loyalty and for future profits of the supplier

Expectation vs. Experience
Expectations reflect
- customers' needs
- the offer on the market
- the image of the supplier, etc.
Experiences include
- perceived performance/quality
- subjective assessment, etc.

CS-Model: Path Diagram
Latent variables: Expectations, Perceived Quality, Customer Satisfaction, Loyalty [path diagram not reproduced]

A General CS-Model
Latent variables: Expectations, Perceived Quality, Customer Satisfaction, Voice, Loyalty, Profits [path diagram not reproduced]

CS-Model: Structure
from \ to   PQ  CS  LY
EX          X   X   0
PQ          0   X   0
CS          0   0   X
LY          0   0   0
EX: expectations, PQ: perceived quality, CS: customer satisfaction, LY: loyalty
Recursive structure: triangular form of the relations

CS-Model: Equations
PQ = α₁ + γ₁₁ EX + ζ₁
CS = α₂ + β₂₁ PQ + γ₂₁ EX + ζ₂
LY = α₃ + β₃₂ CS + ζ₃
Simultaneous equations model in latent variables
- Exogenous: EX
- Endogenous: PQ, CS, LY
- Error terms (noises): ζ₁, ζ₂, ζ₃

Simple Linear Regression
Model: Y = α + βX + ε
Observations: (xᵢ, yᵢ), i = 1, …, n
Fitted model: Ŷ = a + cX
OLS estimates a, c minimize the sum of squared residuals: c = s_xy / s_x², a = ȳ − c x̄
s_xy: sample covariance of X and Y; s_x²: sample variance of X
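The OLS formulas above can be sketched in a few lines of pure Python; the data values are made up for illustration.

```python
# Minimal OLS for the simple regression model Y = alpha + beta*X + eps:
# the estimates c = s_xy / s_x^2 and a = mean(y) - c*mean(x) minimize
# the sum of squared residuals of the fitted model Y-hat = a + c*X.

def ols_simple(x, y):
    n = len(x)
    mx = sum(x) / n
    my = sum(y) / n
    s_xy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y)) / (n - 1)
    s_xx = sum((xi - mx) ** 2 for xi in x) / (n - 1)
    c = s_xy / s_xx          # slope estimate
    a = my - c * mx          # intercept estimate
    return a, c

# made-up data, roughly y = 2x
x = [1.0, 2.0, 3.0, 4.0, 5.0]
y = [2.1, 3.9, 6.2, 7.8, 10.1]
a, c = ols_simple(x, y)
print(round(a, 3), round(c, 3))
```

The estimated slope is close to 2, as the construction of the data suggests.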

Criteria of Model Fit
- R²: coefficient of determination, the squared correlation between Y and Ŷ: R² = r²(Y, Ŷ)
- t-test: test of H₀: β = 0 against H₁: β ≠ 0; t = c / s.e.(c), where s.e.(c) is the standard error of c
- F-test: test of H₀: R² = 0 against H₁: R² ≠ 0; for large n the test statistic follows the F-distribution with 1 and n − 2 df

Multiple Linear Regression
Model: Y = α + β₁X₁ + … + β_kX_k + ε = α + x′β + ε
Observations: (x_i1, …, x_ik, yᵢ), i = 1, …, n
In matrix notation: y = α𝟙 + Xβ + ε, with y, ε: n-vectors, β: k-vector, X: n × k matrix
Fitted model: ŷ = a + Xc, with OLS estimates a, c
R² = r²(y, ŷ); F-test and t-tests as in the simple case

Simultaneous Equations Models
A 2-equation model:
PQ = α₁ + γ₁₁ EX + ζ₁
CS = α₂ + β₂₁ PQ + γ₂₁ EX + ζ₂
In matrix notation (intercepts omitted): Y = BY + ΓX + ζ, with path coefficients collected in B and Γ

Simultaneous Equations Models
Model: Y = BY + ΓX + ζ
Y, ζ: m-vectors; B: m × m matrix; Γ: m × K matrix; X: K-vector
Problems:
- Simultaneous equation bias: OLS estimates of the coefficients are not consistent
- Identifiability: can the coefficients be consistently estimated?
Some assumptions:
- ζ: E(ζ) = 0, Cov(ζ) = Ψ
- Exogeneity: Cov(X, ζ) = 0

Path Analytic Model
PQ = γ₁₁ EX + ζ₁
CS = β₂₁ PQ + γ₂₁ EX + ζ₂
Var(EX) = σ²_EX
[path diagram not reproduced]

Path Analysis
- Wright (1921, 1934); a multivariate technique
- Model: variables may be structurally related, or structurally unrelated but correlated
- Decomposition of covariances allows writing covariances as functions of the structural parameters
- Definition of direct and indirect effects

Example
σ_CS,EX = γ₂₁ σ²_EX + β₂₁ σ_PQ,EX = γ₂₁ σ²_EX + γ₁₁ β₂₁ σ²_EX
With standardized variable EX: σ_CS,EX = γ₂₁ + γ₁₁ β₂₁
[path diagram not reproduced]

Direct and Indirect Effects
σ_CS,EX = γ₂₁ + γ₁₁ β₂₁
- Direct effect: the coefficient that links an independent with a dependent variable; e.g., γ₂₁ is the direct effect of EX on CS
- Indirect effect: the effect of one variable on another via one or more intervening variable(s); e.g., γ₁₁ β₂₁
- Total indirect effect: sum of the indirect effects between two variables
- Total effect: sum of the direct and total indirect effects between two variables
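The effect decomposition above is easy to verify numerically; the coefficient values below are made up for illustration.

```python
# Effect decomposition in the three-variable path model
# EX -> PQ -> CS with the additional direct path EX -> CS.
# Coefficient values are illustrative assumptions.

gamma_11 = 0.8   # EX -> PQ
beta_21  = 0.5   # PQ -> CS
gamma_21 = 0.3   # EX -> CS (direct)

direct   = gamma_21
indirect = gamma_11 * beta_21    # via the intervening variable PQ
total    = direct + indirect     # = sigma_CS,EX for standardized variables

print(direct, indirect, total)
```

With standardized variables, the total effect equals the implied covariance of CS and EX.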

Decomposition of Covariance
σ_YX = Σ_I p_YI σ_IX
- I: variable on a path from X to Y
- p_YI: path coefficient from variable I to Y

First Law of Path Analysis
Decomposition of the covariance σ_XY between Y and X: σ_YX = Σ_I p_YI σ_IX
Assumptions:
- Exogenous (X) and endogenous (Y) variables have mean zero
- Errors or noises (ζ)
  - have mean zero and equal variances across observations
  - are uncorrelated across observations
  - are uncorrelated with the exogenous variables
  - are uncorrelated across equations

Identification
PQ = γ₁₁ EX + ζ₁, i.e., Y₁ = γ₁₁ X + ζ₁
CS = β₂₁ PQ + γ₂₁ EX + ζ₂, i.e., Y₂ = β₂₁ Y₁ + γ₂₁ X + ζ₂
In matrix notation: Y = BY + ΓX + ζ
Number of parameters: p = 6 (γ₁₁, β₂₁, γ₂₁, σ²_X, σ²_ζ1, σ²_ζ2)
The model is identified if all parameters can be expressed as functions of the variances/covariances of the observed variables

Identification, cont'd
Y₁ = γ₁₁ X + ζ₁
Y₂ = β₂₁ Y₁ + γ₂₁ X + ζ₂
Moment equations:
σ_1X = γ₁₁ σ²_X
σ_2X = β₂₁ σ_1X + γ₂₁ σ²_X
σ_21 = β₂₁ σ²_Y1 + γ₂₁ σ_1X
σ²_X = σ²_X
σ²_Y1 = γ₁₁ σ_1X + σ²_ζ1
σ²_Y2 = β₂₁ σ_21 + γ₂₁ σ_2X + σ²_ζ2
p = 6: the first three equations allow a unique solution for the path coefficients, the last three for the variances of X, ζ₁, and ζ₂
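The identification argument can be checked numerically: generate the model-implied moments from known coefficients, then recover the coefficients from the covariance equations alone. All parameter values below are made-up assumptions.

```python
# Numerical check of identification for the two-equation path model:
# Y1 = g11*X + z1,  Y2 = b21*Y1 + g21*X + z2.

g11, b21, g21 = 0.8, 0.5, 0.3        # "true" path coefficients (assumed)
var_x, var_z1, var_z2 = 1.0, 0.36, 0.2

# model-implied moments (the equations on the slide)
s1x = g11 * var_x                    # Cov(Y1, X)
s2x = b21 * s1x + g21 * var_x        # Cov(Y2, X)
s11 = g11 * s1x + var_z1             # Var(Y1)
s21 = b21 * s11 + g21 * s1x          # Cov(Y2, Y1)

# recover the path coefficients from observable moments only
g11_hat = s1x / var_x
b21_hat = (s21 * var_x - s2x * s1x) / (s11 * var_x - s1x ** 2)
g21_hat = (s2x - b21_hat * s1x) / var_x

print(g11_hat, b21_hat, g21_hat)
```

Eliminating γ₂₁ from the equations for σ_2X and σ_21 yields the expression for β₂₁ used above; the recovered values match the generating ones.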

Conditions for Identification
- Just-identified: all parameters can be uniquely derived as functions of the variances/covariances
- Over-identified: at least one parameter can be derived in more than one way from the variances/covariances
- Under-identified: the number of variances/covariances is insufficient; at least one parameter cannot be determined
Necessary, but not sufficient, condition for identification: the number of variances/covariances is at least as large as the number of parameters
A general and operational rule for checking identification has not been found

Latent Variables and Indicators
- Latent variables (LVs), also called constructs or factors, are unobservable
- But we may find indicators, or manifest variables (MVs), that can be used as measures of a latent variable
- Indicators are imperfect measures of the latent variable

Indicators for "Expectation"
From the Swedish CSB questionnaire, banks, private customers; E₁, E₂, E₃ form the "block" of indicators for Expectation, with loadings λ₁, λ₂, λ₃:
- E1: When you became a customer of AB-Bank, you probably knew something about them. How would you grade your expectations on a scale of 1 (very low) to 10 (very high)?
- E2: Now think about the different services they offer, such as bank loans, rates, … Rate your expectations on a scale of 1 to 10.
- E3: Finally, rate your overall expectations on a scale of 1 to 10.

Notation
X₁ = λ₁ξ + δ₁
X₂ = λ₂ξ + δ₂
X₃ = λ₃ξ + δ₃
- ξ: latent variable, factor
- Xᵢ: indicators, manifest variables
- λᵢ: factor loadings
- δᵢ: measurement errors, noise
Some properties:
- LV ξ: unit variance
- noise δᵢ: mean zero, variance θᵢ², uncorrelated with the other noises
- "reflective" indicators
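A quick simulation illustrates the reflective measurement model above: with a standardized factor ξ and uncorrelated noises scaled so that Var(Xᵢ) = 1, the implied correlation of two indicators is λᵢλⱼ. The loadings and sample size are made-up assumptions.

```python
# Simulate X_i = lambda_i * xi + delta_i and check that
# corr(X_i, X_j) is close to lambda_i * lambda_j.
import random

random.seed(1)
lam = [0.9, 0.8, 0.7]            # illustrative loadings
N = 100_000

xi = [random.gauss(0, 1) for _ in range(N)]
x = [[lam[i] * xi[n] + random.gauss(0, (1 - lam[i] ** 2) ** 0.5)
      for n in range(N)] for i in range(3)]

def corr(u, v):
    n = len(u)
    mu, mv = sum(u) / n, sum(v) / n
    suv = sum((a - mu) * (b - mv) for a, b in zip(u, v))
    su = sum((a - mu) ** 2 for a in u) ** 0.5
    sv = sum((b - mv) ** 2 for b in v) ** 0.5
    return suv / (su * sv)

print(round(corr(x[0], x[1]), 2))   # close to 0.9 * 0.8 = 0.72
```

This is the factor-analytic covariance structure exploited by the covariance-fitting estimators discussed later.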

Notation, cont'd
In matrix notation: X = λξ + δ, with vectors X = (X₁, X₂, X₃)′, λ = (λ₁, λ₂, λ₃)′, δ = (δ₁, δ₂, δ₃)′
- ξ: latent variable, factor
- Xᵢ: indicators, manifest variables
- λᵢ: factor loadings
- δᵢ: measurement errors, noise

CS-Model: Path Diagram
Latent variables EX, PQ, and CS, each measured by three indicators (E₁–E₃, Q₁–Q₃, C₁–C₃) with loadings λᵢ and measurement errors δᵢ [path diagram not reproduced]

SEM Model: Path Diagram
Inner model η = Bη + Γξ + ζ with measurement models X = Λ_x ξ + δ and Y = Λ_y η + ε; indicators X₁–X₃ and Y₁–Y₆ [path diagram not reproduced]

SEM Model: Notation
- Inner relations, inner model: η = Bη + Γξ + ζ
- Outer relations, measurement models: X = Λ_x ξ + δ, Y = Λ_y η + ε
- X, δ: 3-component vectors; Y, ε: 6-component vectors

Statistical Assumptions
Error terms of the inner model (ζ)
- have zero means
- have constant variances across observations
- are uncorrelated across observations
- are uncorrelated with the exogenous variables
Error terms of the measurement models (δ, ε)
- have zero means
- have constant variances across observations
- are uncorrelated across observations
- are uncorrelated with the latent variables and with each other
Latent variables are standardized

Covariance Matrix of Manifest Variables
Unrestricted covariance matrix (of order K = k_x + k_y): Σ = Var{(X′, Y′)′}
Model-implied covariance matrix: Σ(θ), the covariance matrix written as a function of the model parameters θ

Estimation of the Parameters
Covariance-fitting methods
- search for parameter values such that the model-implied covariance matrix fits the observed unrestricted covariance matrix of the MVs
- LISREL (LInear Structural RELations): Jöreskog (1973), Keesling (1972), Wiley (1973); software LISREL by Jöreskog & Sörbom
PLS techniques
- partition the parameters into estimable subsets
- iterative optimizations provide successive approximations of the LV scores and parameters
- Wold (1973, 1980)

Discrepancy Function
The discrepancy or fitting function F(S; Σ(θ)) is a measure of the "distance" between the model-implied covariance matrix Σ(θ) and the estimated unrestricted covariance matrix S
Properties of the discrepancy function: F(S; Σ(θ)) ≥ 0, and F(S; Σ(θ)) = 0 if S = Σ(θ)

Covariance Fitting (LISREL)
Estimates of the parameters are derived by minimizing F(S; Σ(θ))
Minimization of (K: number of indicators)
F(S; Σ) = log|Σ| − log|S| + trace(SΣ⁻¹) − K
gives ML estimates if the manifest variables are independently, multivariate normally distributed
- Iterative algorithm (Newton-Raphson type)
- Identification required
- Choice of starting values is crucial
Other choices of F result in estimation methods such as OLS, GLS, and ADF (asymptotically distribution-free)

PLS Techniques
- Estimate factor scores for the latent variables
- Estimate the structural parameters (path coefficients, loading coefficients), based on the estimated factor scores, using the principle of least squares
- Maximize the predictive accuracy
- "Predictor specification", viz. that E(η|ξ) equals the systematic part of the model, implies E(ζ|ξ) = 0: the error term has (conditional) mean zero
- No distributional assumptions beyond those on first- and second-order moments

The PLS Algorithm
Step 1: Estimation of factor scores
1. Outer approximation
2. Calculation of inner weights
3. Inner approximation
4. Calculation of outer weights
Step 2: Estimation of path and loading coefficients by minimizing Var(ζ) and Var(δ)
Step 3: Estimation of location parameters (intercepts): B₀ from η = B₀ + Bη + Γξ + ζ, Λ₀ from X = Λ₀ + Λ_x ξ + δ

Estimation of Factor Scores
Factor ηᵢ: realizations Y_in, n = 1, …, N
- Y_in(o): outer approximation of Y_in
- Y_in(i): inner approximation of Y_in
Indicator Y_ij: observations y_ijn; j = 1, …, Jᵢ; n = 1, …, N
1. Outer approximation: Y_in(o) = Σⱼ w_ij y_ijn, scaled such that Var(Yᵢ(o)) = 1
2. Inner weights: v_ih = sign(r_ih) if ηᵢ and η_h are adjacent, otherwise v_ih = 0; r_ih = corr(ηᵢ, η_h) ("centroid weighting")
3. Inner approximation: Y_in(i) = Σ_h v_ih Y_hn(o), scaled such that Var(Yᵢ(i)) = 1
4. Outer weights: w_ij = corr(Y_ij, Yᵢ(i))
Start with arbitrary values for the w_ij; repeat steps 1 through 4 until the outer weights converge
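The four-step loop above can be sketched compactly in pure Python, run here on simulated data for the EX → PQ → CS example. All loadings, path values, block sizes, and the sample size are made-up assumptions; the slides' centroid weighting is used for the inner weights.

```python
# Sketch of PLS factor-score estimation (steps 1-4):
# outer approximation, centroid inner weights, inner approximation,
# new outer weights; iterate until the outer weights converge.
import random

random.seed(7)
N = 2000

def std(v):
    n = len(v); m = sum(v) / n
    s = (sum((a - m) ** 2 for a in v) / n) ** 0.5
    return [(a - m) / s for a in v]

def corr(u, v):
    return sum(a * b for a, b in zip(std(u), std(v))) / len(u)

# simulate latent variables and three reflective indicators per block
ex = [random.gauss(0, 1) for _ in range(N)]
pq = [0.8 * e + random.gauss(0, 0.6) for e in ex]
cs = [0.5 * q + 0.3 * e + random.gauss(0, 0.6) for q, e in zip(pq, ex)]
blocks = {lv: [[0.8 * s + random.gauss(0, 0.6) for s in scores]
               for _ in range(3)]
          for lv, scores in [("EX", ex), ("PQ", pq), ("CS", cs)]}

adjacent = {"EX": ["PQ", "CS"], "PQ": ["EX", "CS"], "CS": ["EX", "PQ"]}
w = {lv: [1.0, 1.0, 1.0] for lv in blocks}          # arbitrary start

for _ in range(100):
    # 1. outer approximation: standardized weighted sums of indicators
    outer = {lv: std([sum(wi * ind[n] for wi, ind in zip(w[lv], blocks[lv]))
                      for n in range(N)]) for lv in blocks}
    # 2. centroid inner weights: sign of the correlation of adjacent LVs
    v = {lv: {h: (1.0 if corr(outer[lv], outer[h]) >= 0 else -1.0)
              for h in adjacent[lv]} for lv in blocks}
    # 3. inner approximation: standardized weighted sums of adjacent proxies
    inner = {lv: std([sum(v[lv][h] * outer[h][n] for h in adjacent[lv])
                      for n in range(N)]) for lv in blocks}
    # 4. new outer weights: correlation of each indicator with the inner proxy
    w_new = {lv: [corr(ind, inner[lv]) for ind in blocks[lv]] for lv in blocks}
    if max(abs(a - b) for lv in w for a, b in zip(w[lv], w_new[lv])) < 1e-6:
        break
    w = w_new

# the factor scores recover the simulated latent variables reasonably well
print(round(corr(outer["EX"], ex), 2))
```

In practice the loop converges after a handful of iterations; the estimated scores correlate highly with the generating latent variables.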

Example
The CS model with indicator blocks E₁–E₃ (EX), Q₁–Q₃ (PQ), and C₁–C₃ (CS) [path diagram not reproduced]

Example, cont'd
Starting values: w_EX,1, …, w_EX,3, w_PQ,1, …, w_PQ,3, w_CS,1, …, w_CS,3
Outer approximation: EX_n(o) = Σⱼ w_EX,j E_jn; similarly PQ_n(o), CS_n(o); standardized
Inner approximation (standardized):
EX_n(i) = v_EX,PQ PQ_n(o) + v_EX,CS CS_n(o)
PQ_n(i) = v_PQ,EX EX_n(o) + v_PQ,CS CS_n(o)
CS_n(i) = v_CS,EX EX_n(o) + v_CS,PQ PQ_n(o)
Outer weights: w_EX,j = corr(E_j, EX(i)), j = 1, …, 3; similarly w_PQ,j, w_CS,j

Choice of Inner Weights
Centroid weighting scheme: Y_in(i) = Σ_h v_ih Y_hn(o), with v_ih = sign(r_ih) if ηᵢ and η_h are adjacent, v_ih = 0 otherwise, and r_ih = corr(ηᵢ, η_h); these weights are obtained if the v_ih are chosen to be +1 or −1 and Var(Yᵢ(i)) is maximized
Weighting schemes (η_h predecessor / η_h successor):
- centroid: sign(r_ih) / sign(r_ih)
- factor, PC: r_ih / r_ih
- path: b_ih / r_ih
b_ih: coefficient in the regression of ηᵢ on η_h

Measurement Model: Examples
Latent variables from the Swedish CSB model
1. Expectation
- E1: new customer feelings
- E2: special products/services expectations
- E3: overall expectation
2. Perceived Quality
- Q1: range of products/services
- Q2: quality of service
- Q3: clarity of information on products/services
- Q4: opening hours and appearance of location
- Q5: etc.

Measurement Models
- Reflective model (example 1): each indicator reflects the latent variable; Y_ij = λ_ij ηᵢ + ε_ij; Y_ij is called a reflective or effect indicator (of ηᵢ)
- Formative model (example 2): ηᵢ = π_y′Yᵢ + δᵢ; π_y is a vector of kᵢ weights; the Y_ij are called formative or cause indicators
- Hybrid or MIMIC model (for "multiple indicators and multiple causes")
The choice between formative and reflective depends on the substantive theory; formative models are often used for exogenous variables, reflective and MIMIC models for endogenous variables

Estimation of Outer Weights
- "Mode A" estimation of Yᵢ(o): reflective measurement model; weight w_ij is the coefficient from the simple regression of Yᵢ(i) on Y_ij: w_ij = corr(Y_ij, Yᵢ(i))
- "Mode B" estimation of Yᵢ(o): formative measurement model; weight w_ij is the coefficient of Y_ij from the multiple regression of Yᵢ(i) on Y_ij, j = 1, …, Jᵢ (beware of multicollinearity!); also used for the MIMIC model

Properties of the Estimators
- A general proof of convergence of the PLS algorithm does not exist; practitioners experience no problems
- Factor scores are inconsistent but "consistent at large": consistency is achieved with increasing sample size and block size
- Loading coefficients are inconsistent and seem to be overestimated
- Path coefficients are inconsistent and seem to be underestimated

ACSI Model: Results
EQS and PLS estimates of the path coefficients among Expectations, Perceived Quality, Value, Customer Satisfaction, Voice, and Loyalty [path diagram with estimates not reproduced]

Evaluation of SEM Models
Depends on the estimation method
- Covariance-fitting methods: distributional assumptions, optimal parameter estimates, factor indeterminacy
- PLS path modeling: non-parametric, optimal prediction accuracy, LV scores
Step 1: Inspection of the estimation results (R², parameter estimates, standard errors, LV scores, residuals, etc.)
Step 2: Assessment of fit
- Covariance-fitting methods: global measures
- PLS path modeling: partial fitting measures

Inspection of Results
Covariance-fitting methods (global optimization):
- model parameters and their standard errors; do they confirm the theory?
- correlation residuals: s_ij − σ_ij(θ̂)
- graphical methods
PLS techniques (iterative optimization of the outer models and the inner model):
- model parameters
- resampling procedures such as blindfolding or jackknifing give standard errors of the model parameters
- LV scores
- graphical methods

Fit Indices
Covariance-fitting methods: covariance fit measures such as
- chi-square statistics
- Goodness of Fit Index (GFI), AGFI
- Normed Fit Index (NFI), NNFI, CFI, etc.
The basis is the discrepancy function
PLS path modeling: prediction-based measures
- communality
- redundancy
- Stone-Geisser's Q²

Chi-square Statistic
- Test of H₀: Σ = Σ(θ) against a non-specified alternative
- Test statistic: X² = (N − 1) F(S; Σ(θ̂))
- If the model is just identified (c = p): X² = 0 [c = K(K+1)/2; p: number of parameters in θ]
- Under the usual regularity conditions (normal distribution, ML estimation), X² is asymptotically χ²(c − p)-distributed
- A non-significant X² indicates that the over-identified model does not differ from a just-identified version
- Problem: X² increases with increasing N
- Some prefer X²/(c − p) to X² (reduced sensitivity to sample size); rule of thumb: X²/(c − p) < 3 is acceptable

Goodness of Fit Indices
Goodness of Fit Index (Jöreskog & Sörbom):
- portion of the observed covariances explained by the model-implied covariances
- "how much better does the model fit compared to no model at all"
- ranges from 0 (poor fit) to 1 (perfect fit); rule of thumb: GFI > 0.9
AGFI adjusts the GFI with a penalty for model complexity

Other Fit Indices
- Normed Fit Index, NFI (Bentler & Bonett): similar to GFI, but compares with a baseline model, typically the independence model (indicators are uncorrelated); ranges from 0 (poor fit) to 1 (perfect fit); rule of thumb: NFI > 0.9
- Comparative Fit Index, CFI (Bentler): less dependent on sample size than NFI
- Non-Normed Fit Index, NNFI (Bentler & Bonett): also known as the Tucker-Lewis Index; adjusted for model complexity
- Root mean squared error of approximation, RMSEA (Steiger)

Assessment of PLS Results
Not a single but many optimization steps; not one global measure but many measures of various aspects of the results
- Indices for assessing the predictive relevance: portions of explained variance (R²); communality, redundancy, etc.; Stone-Geisser's Q²
- Reliability indices
- NFI, assuming normality of the indicators, allows comparisons with covariance-fitting results

Some Indices
Assessment of diagonal fit (proportions of explained variance):
- SMC (squared multiple correlation coefficient) R²: (average) proportion of the variance of the LVs that is explained by other LVs; concerns the inner model
- Communality H²: (average) proportion of the variance of the indicators that is explained by the LV directly connected to them; concerns the outer model
- Redundancy F²: (average) proportion of the variance of the indicators that is explained by the predictor LVs of their own LV
- r²: proportion of explained variance of the indicators

Some Indices, cont'd
Assessment of non-diagonal fit:
- Explained indicator covariances: r_s = 1 − c/s with c = rms(C), s = rms(S); C: estimate of Cov(ε)
- Explained latent variable correlations: r_r = 1 − q/r with q = rms(Q), r = rms(Cov(Y)); Q: estimate of Cov(ζ)
- r_eY = rms(Cov(e, Y)), e: outer residuals
- r_eu = rms(Cov(e, u)), u: inner residuals
rms(A) = (Σᵢ Σⱼ a_ij²)^½: root mean squared covariances (diagonal elements of the symmetric matrix A excluded from the summation)

Stone-Geisser's Q²
- Similar to R²: Q² = 1 − E/O
- E: sum of squared prediction errors; O: sum of squared deviations from the mean
- Prediction errors are obtained from resampling (blindfolding, jackknifing)
- Computed, e.g., for the communality of Y_ij, an indicator of ηᵢ
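The Q² = 1 − E/O idea can be illustrated with jackknife (leave-one-out) prediction errors from a plain simple-regression predictor; in PLS the same principle is applied to the indicators of a block via blindfolding. The data are made up for illustration.

```python
# Leave-one-out Q^2: refit the predictor without observation k,
# predict y[k], accumulate squared prediction errors E, and compare
# with O, the squared deviations of y from its mean.

def loo_q2(x, y):
    n = len(y)
    E = 0.0
    for k in range(n):                       # leave observation k out
        xs = [x[i] for i in range(n) if i != k]
        ys = [y[i] for i in range(n) if i != k]
        mx, my = sum(xs) / (n - 1), sum(ys) / (n - 1)
        sxy = sum((a - mx) * (b - my) for a, b in zip(xs, ys))
        sxx = sum((a - mx) ** 2 for a in xs)
        c = sxy / sxx
        a0 = my - c * mx
        E += (y[k] - (a0 + c * x[k])) ** 2   # squared prediction error
    my_all = sum(y) / n
    O = sum((b - my_all) ** 2 for b in y)    # squared deviations from mean
    return 1.0 - E / O

# made-up data, roughly y = 2x
x = [1, 2, 3, 4, 5, 6, 7, 8]
y = [1.9, 4.2, 5.8, 8.1, 9.9, 12.2, 13.8, 16.1]
q2 = loo_q2(x, y)
print(round(q2, 3))
```

A Q² close to 1 indicates high predictive relevance; a Q² at or below 0 indicates that the model predicts no better than the mean.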

Lohmöller's Advice
Check the fit of the outer model:
- low unexplained portion of the indicator variances and covariances
- high communalities in reflective blocks, low residual covariances
- residual covariances between blocks close to zero
- covariances between outer residuals and latent variables close to zero
Check the fit of the inner model:
- low unexplained portion of the latent variable variances and covariances
Check the fit of the total model:
- high redundancy coefficient
- low covariances of inner and outer residuals

ACSI Model: Results
EQS and PLS estimates of the path coefficients among Expectations, Perceived Quality, Value, Customer Satisfaction, Voice, and Loyalty [path diagram with estimates not reproduced]

Diagnostics: EQS
Models ACSI / ACSIe; df: 81 / 173; NNFI and RMSEA [index values not reproduced]

Diagnostics: PLS (centroid weighting)
Indices R², Q², r_r, H², F², r², r_eY, r_eu for the models ACSI, ACSIe, Hui, and Schenk [index values not reproduced]

Practice of SEM Analysis
Theoretical basis
Data:
- scaling: metric or nominal (nominal scaling is not standard in LISREL)
- sample size: a good choice is 10p (p: number of parameters); fewer than 5p cases may result in unstable estimates; a large number of cases will result in large values of X²
- reflective indicators are assumed to be unidimensional; it is recommended to use principal-axis extraction, Cronbach's alpha, and similar tools to confirm the suitability of the data
Model:
- identification must be checked for covariance-fitting methods
- indicators for an LV can be formative or reflective; formative indicators are not supported in LISREL

Practice of SEM Analysis, cont'd
Model:
- LISREL allows for more general covariance structures, e.g., correlation of measurement errors
Estimation:
- repeat the estimation with varying starting values
Diagnostic checks:
- use graphical tools such as residual plots
- check each measurement model
- check each structural equation
- Lohmöller's advice
Model trimming:
- stepwise model building (Hui, 1982; Schenk, 2001)

LISREL vs. PLS
Models:
- PLS assumes a recursive inner structure
- PLS allows for higher complexity w.r.t. B, Γ, and the measurement models; LISREL w.r.t. the error covariance structures
Estimation method:
- no distributional assumptions needed in PLS
- formative measurement models in PLS
- factor scores in PLS
- PLS: biased estimates, consistency at large; LISREL: ML theory
- diagnostics are much richer in PLS
Empirical facts:
- LISREL in general needs larger samples
- LISREL needs more computation

The Extended Model
EQS and PLS estimates for the model with Expectations, Perceived Quality, Value, Emotional Factor, Customer Satisfaction, and Loyalty [path diagram with estimates not reproduced]

Diagnostics: EQS
Models ACSI / ACSIe; df: 81 / 173; NNFI and RMSEA [index values not reproduced]

Diagnostics: PLS (centroid weighting)
Indices R², Q², r_r, H², F², r², r_eY, r_eu for the models ACSI, ACSIe, Hui, and Schenk [index values not reproduced]

Model Building: Hui's Approach
Resulting structure among Expectations, Perceived Quality, Value, Emotional Factor, Customer Satisfaction, and Loyalty [path diagram not reproduced]

Model Building: Schenk's Approach
Resulting structure among Expectations, Perceived Quality, Value, Emotional Factor, and Customer Satisfaction [path diagram not reproduced]

The end

Data-driven Specification
- No solid a priori knowledge about the relations among the variables
- Stepwise regression: search for the "best" model
  - forward selection
  - backward elimination
  - problem: omitted-variable bias
- General-to-specific modeling
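The forward-selection idea above can be sketched in a stagewise form: at each step, add the candidate regressor with the highest absolute correlation with the current residual and regress the residual on it. The data, the stopping threshold, and the stagewise simplification (simple instead of joint regressions) are assumptions made for illustration.

```python
# Stagewise forward selection on made-up data: y depends on x1 and x2,
# while x3 is irrelevant and should not be selected.
import random

random.seed(3)
N = 500
x1 = [random.gauss(0, 1) for _ in range(N)]
x2 = [random.gauss(0, 1) for _ in range(N)]
x3 = [random.gauss(0, 1) for _ in range(N)]          # irrelevant
y  = [2 * a + b + random.gauss(0, 0.5) for a, b in zip(x1, x2)]

def corr(u, v):
    n = len(u); mu, mv = sum(u) / n, sum(v) / n
    suv = sum((a - mu) * (b - mv) for a, b in zip(u, v))
    su = sum((a - mu) ** 2 for a in u) ** 0.5
    sv = sum((b - mv) ** 2 for b in v) ** 0.5
    return suv / (su * sv)

candidates = {"x1": x1, "x2": x2, "x3": x3}
resid = y[:]
selected = []
for _ in range(len(candidates)):
    # pick the unselected candidate most correlated with the residual
    name, best = max(((k, v) for k, v in candidates.items()
                      if k not in selected),
                     key=lambda kv: abs(corr(kv[1], resid)))
    if abs(corr(best, resid)) < 0.1:      # stop: no candidate correlates
        break
    selected.append(name)
    # regress the residual on the chosen variable and update it
    mb, mr = sum(best) / N, sum(resid) / N
    c = (sum((a - mb) * (b - mr) for a, b in zip(best, resid))
         / sum((a - mb) ** 2 for a in best))
    a0 = mr - c * mb
    resid = [b - (a0 + c * a) for a, b in zip(best, resid)]

print(selected)
```

The relevant regressors enter in order of explanatory strength; note that such data-driven search inherits the omitted-variable-bias problem mentioned above.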

Stepwise SE Model Building
- Hui (1982): models with interdependent inner relations
- Schenk (2001): guarantees a causal structure, i.e., a triangular matrix B of path coefficients in the inner model η = Bη + ζ

Stepwise SE Model Building: Hui's Algorithm
Stage 1
1. Calculate case values Y_ij for the LVs ηᵢ as principal components of the corresponding blocks; calculate R = Corr(Y)
2. For each endogenous LV, choose the LV with the highest correlation to form a simple regression
3. Repeat until a stable model is reached:
   a. PLS-estimate the model, calculate case values, and recalculate R
   b. Drop from each equation LVs with t-value |t| < 1.65
   c. Add to each equation the LV with the highest partial correlation with the dependent LV

Hui's Algorithm, cont'd
Stage 2
1. Use the rank condition to check the identifiability of each equation
2. Use 2SLS to estimate the path coefficients in each equation

Hui's vs. Schenk's Algorithm
- Hui's algorithm is not restricted to a causal structure; it allows cycles and an arbitrary structure of the matrix B
- Schenk's algorithm uses an iterative procedure similar to Hui's and makes use of a priori information about the structure of the causal chain connecting the latent variables; the latent variables have to be sorted

Stepwise SE Model Building: Schenk's Algorithm
1. Calculate case values Y_ij for the LVs ηᵢ as principal components of the corresponding blocks; calculate R = Corr(Y)
2. Choose the pair of LVs with the highest correlation
3. Repeat until a stable model is reached:
   a. PLS-estimate the model, calculate case values, and recalculate R
   b. Drop LVs with non-significant t-values
   c. Add the LV with the highest correlation with the already included LVs

Data: Special CS Dimensions
- Staff (2): availability¹ (PERS), politeness¹ (FREU)
- Outlet (3): make-up¹ (GEST), presentation of merchandise¹ (PRAE), cleanliness¹ (SAUB)
- Range (2): freshness and quality (QUAL), richness (VIEL)
- Customer orientation (7): access to outlet (ERRE), shopping hours (OEFF), queuing time for checkout¹ (WART), paying modes¹ (ZAHL), price information¹ (PRAU), sales (SOND), availability of sales (VERF)
¹ Dimension of the "Emotional Factor"

References
- Fornell, C. (1992), "A National Customer Satisfaction Barometer: The Swedish Experience", Journal of Marketing, 56.
- Fornell, C., and Cha, J. (1994), "Partial Least Squares", in R.P. Bagozzi (ed.), Advanced Methods of Marketing Research, Blackwell.
- Lohmöller, J.B. (1989), Latent Variable Path Modeling with Partial Least Squares, Physica-Verlag.
- Wold, H. (1982), "Soft Modeling: The Basic Design and Some Extensions", in K.G. Jöreskog and H. Wold (eds.), Systems under Indirect Observation, Vol. 2, North-Holland.
- Wold, H. (1985), "Partial Least Squares", in S. Kotz and N.L. Johnson (eds.), Encyclopedia of Statistical Sciences, Vol. 6, Wiley.
