Assessing Collaborative Modeling Quality Based on Modeling Artifacts


1 Assessing Collaborative Modeling Quality Based on Modeling Artifacts
D. (Denis) Ssebuggwawo (1), S.J.B.A. (Stijn) Hoppenbrouwers (1) & H.A. (Erik) Proper (1,2)
(1) ICIS, Radboud University Nijmegen, The Netherlands
(2) Public Research Centre Henri Tudor, Luxembourg
3rd Working Conference on The Practice of Enterprise Modeling (PoEM'10)
Delft University, The Netherlands, 9-10 November 2010

2 MENU
Overview
Collaborative Modeling Evaluation
Hypothesized Model & Alternative Model
Empirical Results
Conclusion & Future Direction

3 Overview: Overriding Goals
Determine efficacy (efficiency & effectiveness): evaluate the different constructs (ML, MP, EP, ST) to determine the overall efficiency and effectiveness.
Efficiency: reduce the effort.
Effectiveness: improve the quality of the result.
Determine the success of the collaborative effort (success factors): evaluate the modeling effort to determine the (critical) success factors that influence efficiency & effectiveness.

4 Overview: Modeling Artifacts
Anchoring the evaluation of collaborative modeling on four modeling artifacts:
Modeling Language (ML)
Modeling Procedure (MP)
End-Products (EP)
Support Tool or Medium (ST/M)

5 Overview: The Modeling Artifacts
ML: concepts (constructs) in which the modelers express and communicate the solution.
MP: processes (methods) for defining the problem and reaching the solution.
EP: intermediary and end-products (models).
ST: enabling environment and support tools for the interaction, collaboration, communication, etc.

6 CM Evaluation: Supporting Frameworks
SEQUAL (Lindland et al., 1994; Krogstie et al., 2006)
Based on: semiotic theory.
Used for: understanding the quality of conceptual models.
TAM (Davis, 1986; Davis et al., 1989) and TRA/TPB (Fishbein, 1975; Ajzen, 1991)
About: attitudes, beliefs, intentions/perceptions, behaviour.
Used for: explaining & predicting user acceptance of IS/IT.
MEM (Moody, 2001; Moody, 2003)
Based on: methodological pragmatism (theoretical knowledge validation) & TAM.
Used for: evaluating IS design methods.

7 CM Evaluation: Theory of Reasoned Action (TRA)
Fig. 1. TRA model (perceptions, intentions, behaviour; external/internal environment variables, psychological variables, behavioural variables). Behavioural beliefs and outcome evaluations (b_i, e_i) shape the attitude toward the act or behaviour (AB); normative beliefs and motivation to comply (nb_j, mc_j) shape the subjective norm (SN). AB and SN determine the behavioural intention (BI), which leads to the actual behaviour (B), subject to controllable and uncontrollable factors.
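For reference, the standard TRA formulation underlying Fig. 1 can be written compactly as follows (the slide only shows the diagram; the weights w_1 and w_2 are estimated empirically):

A_B = \sum_i b_i e_i          (attitude toward the act/behaviour: behavioural beliefs weighted by outcome evaluations)
SN  = \sum_j nb_j mc_j        (subjective norm: normative beliefs weighted by motivation to comply)
BI  = w_1 A_B + w_2 SN        (behavioural intention)
B   \approx BI                (intention is the immediate antecedent of actual behaviour)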

8 CM Evaluation: Hypothesized Model Interactions
Fig. 2. Hypothesized model interactions: the latent constructs ML, MP, EP and ST interact with one another, each measured by its own indicators (ML_1, ML_2, ..., ML_n; MP_1, MP_2, ..., MP_n; EP_1, EP_2, ..., EP_n; ST_1, ST_2, ..., ST_n).

9 CM Evaluation: The Constructs
Perceived Quality of the Modeling Language (PQML)
Perceived Usefulness of the Modeling Procedure (PUMP)
Perceived Quality of the End-Products (PQEP)
Ease of Use of the Medium or Support Tool (EOUM/ST)

10 CM Evaluation: Original Quality Dimensions
Modeling Language (ML): 10 quality dimensions: construct deficit, construct overload, construct redundancy, construct excess; expressive power, directness, systematicity; syntactic, semantic & pragmatic clarity; modeling primitive adequacy. Sources: Wand and Weber (1993), Lindland et al. (1994), Krogstie et al. (2006), Krogstie et al. (2001), List and Korherr (2006), Nysetvold and Krogstie (2005), Soderstrom et al. (2002), Stirna and Persson (2007).
Modeling Procedure (MP): quality dimensions: efficiency; effectiveness; ease of application, in-/out-description adequacy, process & relation description adequacy, method compatibility, interaction & collaboration adequacy, communication & negotiation adequacy; rule & goal commitment, shared understanding. Sources: de Brabander and Thiers (1984), Duivenvoorde et al. (2009), Krogstie et al. (2006), Gemino and Wand (2003), Hengst et al. (2006), Reinig (2003), Siau and Wang (2007), Siau and Rossi (1998), Recker (2006), Stirna and Persson (2007), Renger et al. (2008), Becker et al. (2000), Ssebuggwawo et al. (2009).
End-Product (EP): 15 quality dimensions: correctness, completeness, propriety, clarity, consistency, orthogonality, generality, syntax adherence adequacy, semantics adequacy, pragmatics adequacy; user comprehensibility; modifiability, re-usability, flexibility; user satisfaction. Sources: Lindland et al. (1994), Krogstie et al. (2006), Sedera et al. (2003), Pfeiffer and Niehaves (2005), Paul et al. (2004), Reinig (2003), Rosemann et al. (2001), Stirna and Persson (2007), Schuette and Rotthowe (1998).
Medium/Support Tool (M/ST): 9 quality dimensions: tool functionality, performance & reliability; efficiency, effectiveness; satisfaction; synchronicity, negotiation/argumentation adequacy, commenting/proposition adequacy, planning/agenda setting adequacy. Sources: Dean et al. (1994), Fjermestad and Hiltz (1999), Stirna and Persson (2007), Krogstie et al. (2006), Ssebuggwawo et al. (2009).

11 CM Evaluation: Synthesized Quality Dimensions
Construct and its quality dimensions (new groupings):
PQML: Understandability (ML1), Clarity (ML2), Syntax correctness (ML3), Conceptual minimalism (ML4)
PUMP: Efficiency (MP5), Effectiveness (MP6), Satisfaction (MP7), Commitment & Shared Understanding (MP8)
PQEP: Product Quality (EP9), Understandability (EP10), Modifiability & Maintainability (EP11), Satisfaction (EP12)
EOUM/ST: Functionality (ST13), Usability (ST14), Satisfaction & Enjoyment (ST15), Collaboration & Communication Facilitation (ST16)
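For the analysis steps later in the presentation it can help to keep these groupings in a small lookup structure. A minimal Python sketch; the variable name is illustrative, and the item codes simply mirror the table above:

# Synthesized quality dimensions per construct (codes ML1..ST16 as in the table above).
QUALITY_DIMENSIONS = {
    "PQML": ["ML1", "ML2", "ML3", "ML4"],        # understandability, clarity, syntax correctness, conceptual minimalism
    "PUMP": ["MP5", "MP6", "MP7", "MP8"],        # efficiency, effectiveness, satisfaction, commitment & shared understanding
    "PQEP": ["EP9", "EP10", "EP11", "EP12"],     # product quality, understandability, modifiability & maintainability, satisfaction
    "EOUM_ST": ["ST13", "ST14", "ST15", "ST16"], # functionality, usability, satisfaction & enjoyment, collaboration & communication facilitation
}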

12 Hypothesized (a priori) Model
Fig. 3. Hypothesized Model

13 Alternative (Competing) Model
Fig. 4. Competing Model

14 Empirical Results: Modeling Experiment & Evaluation
Modeling experiment: a collaborative modeling session using the COMA tool.
Evaluation: a measurement instrument (questionnaire) with a 7-point Likert scale.
Constructs assessed: PQML, PUMP, PQEP and EOUM.
A sketch of one possible data layout follows.
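A minimal sketch of how such questionnaire responses could be laid out for analysis, assuming pandas/numpy; the column names follow the synthesized dimension codes ML1..ST16, and the random scores merely stand in for real 7-point Likert answers:

import numpy as np
import pandas as pd

# Hypothetical data: one row per participant, one column per questionnaire item,
# each scored on a 7-point Likert scale (1 = strongly disagree ... 7 = strongly agree).
indicators = ["ML1", "ML2", "ML3", "ML4", "MP5", "MP6", "MP7", "MP8",
              "EP9", "EP10", "EP11", "EP12", "ST13", "ST14", "ST15", "ST16"]
rng = np.random.default_rng(0)
responses = pd.DataFrame(rng.integers(1, 8, size=(30, len(indicators))), columns=indicators)
print(responses.describe().loc[["mean", "std"]])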

15 Validation & Reliability Tests: Exploratory Factor Analysis (EFA)
Goal: retain factors that account for a significant amount of the variance in the data; EFA is a precursor to CFA.
Principal Component Analysis (PCA):
Data reduction: determining the number of factors.
Factor rotation: determining the (non-)correlation of the factors.
Common Factor Analysis / Principal Factor Analysis: understanding the relationship between the measured indicator variables (MLs, MPs, EPs, STs) in terms of the latent factor variables (PQML, PUMP, PQEP, EOUM). A sketch of these steps follows.
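A minimal sketch of these EFA steps, assuming the factor_analyzer package and the hypothetical indicator columns from the previous sketch; the slides do not name the software actually used, and random data is generated here only to keep the snippet self-contained:

import numpy as np
import pandas as pd
from factor_analyzer import FactorAnalyzer

# Hypothetical Likert responses; in the real analysis this would be the questionnaire data.
cols = ["ML1", "ML2", "ML3", "ML4", "MP5", "MP6", "MP7", "MP8",
        "EP9", "EP10", "EP11", "EP12", "ST13", "ST14", "ST15", "ST16"]
rng = np.random.default_rng(1)
data = pd.DataFrame(rng.integers(1, 8, size=(40, len(cols))), columns=cols)

# Data reduction: inspect eigenvalues and retain factors that explain substantial variance
# (the Kaiser criterion, eigenvalue > 1, is one common cut-off).
fa = FactorAnalyzer(rotation=None)
fa.fit(data)
eigenvalues, _ = fa.get_eigenvalues()
n_factors = int((eigenvalues > 1.0).sum())

# Factor rotation: varimax assumes uncorrelated factors; an oblique rotation
# (e.g. promax) would instead allow the factors to correlate.
fa_rot = FactorAnalyzer(n_factors=n_factors, rotation="varimax")
fa_rot.fit(data)
print("retained factors:", n_factors)
print(pd.DataFrame(fa_rot.loadings_, index=cols).round(2))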

16 Validation & Reliability Tests: EFA Results
Table: EFA results.

17 Validation & Reliability Tests: Confirmatory Factor Analysis (CFA)
Confirmatory Factor Analysis (CFA), carried out as Structural Equation Modeling (SEM):
Testing the a priori hypotheses/theories.
Assessing the goodness-of-fit of the model, based on the variance retained after factor reduction in the EFA.
Testing and confirming the validity & reliability of the measurement instrument.
A sketch of such a CFA specification follows.
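A minimal sketch of how the four-factor measurement model could be specified for CFA, assuming the semopy package (lavaan-style model syntax) and the same hypothetical indicator columns; the tool actually used for the path diagrams in Figs. 5-6 is not named in the slides:

import numpy as np
import pandas as pd
import semopy

# Hypothetical Likert responses standing in for the questionnaire data.
cols = ["ML1", "ML2", "ML3", "ML4", "MP5", "MP6", "MP7", "MP8",
        "EP9", "EP10", "EP11", "EP12", "ST13", "ST14", "ST15", "ST16"]
rng = np.random.default_rng(2)
data = pd.DataFrame(rng.integers(1, 8, size=(60, len(cols))).astype(float), columns=cols)

# Measurement model: each latent construct is measured by its four indicators.
model_desc = """
PQML =~ ML1 + ML2 + ML3 + ML4
PUMP =~ MP5 + MP6 + MP7 + MP8
PQEP =~ EP9 + EP10 + EP11 + EP12
EOUM =~ ST13 + ST14 + ST15 + ST16
"""

model = semopy.Model(model_desc)
model.fit(data)
print(model.inspect())           # factor loadings and (co)variances
print(semopy.calc_stats(model))  # goodness-of-fit indices (chi-square, CFI, RMSEA, ...)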

18 CFA Results: Model 1 (Hypothesized Model)
Fig. 5. Path diagram Model 1

19 CFA Results: Model 2 (Competing Model)
Fig. 6. Path diagram Model 2

20 Validation & Reliability Tests: CFA Results
Table: CFA results.

21 Conclusion & Future Direction
Conclusion: rather than relying on model quality alone, the other modeling artifacts can also be used to evaluate the quality and success of a collaborative modeling effort.
Future direction:
Establishing the interdependencies between the artifacts and their impact on the overall quality.
Measuring the acceptability and adoption of the quality framework in practice.

22 Thank you. Questions?

