Cognitive Task Analysis for Teams

1 Cognitive Task Analysis for Teams
Nancy J. Cooke, New Mexico State University. CTA Resource On-line Seminar, October 11, 2002

2 Acknowledgements
NMSU Faculty: Peter Foltz
NMSU Post Doc: Brian Bell
NMSU Graduate Students: Janie DeJoode, Jamie Gorman, Preston Kiekel, Rebecca Keith, Melanie Martin, Harry Pedersen
US Positioning, LLC: Steven Shope
UCF: Eduardo Salas, Clint Bowers
Sponsors: Air Force Office of Scientific Research, Office of Naval Research, NASA Ames Research Center, Army Research Laboratory
October 11, 2002 Nancy Cooke

3 Overview
What is team cognition?
“Shared” mental models
Holistic CTA for teams
Conclusions
Q&A
October 11, 2002 Nancy Cooke

4 What is Team Cognition? October 11, 2002 Nancy Cooke

5 Team Cognition in Practice
October 11, 2002 Nancy Cooke

6 Experimental Context: The CERTT (Cognitive Engineering Research on Team Tasks) Lab
A synthetic task environment for the study of team cognition
Five participant consoles and an experimenter console
October 11, 2002 Nancy Cooke

7 Defining Team “…a distinguishable set of two or more people who interact dynamically, interdependently, and adaptively toward a common and valued goal/objective/mission, who have each been assigned specific roles or functions to perform, and who have a limited life span of membership” Salas, Dickinson, Converse, and Tannenbaum (1992) October 11, 2002 Nancy Cooke

8 Defining Team Cognition
It is more than the sum of the cognition of individual team members. It emerges from the interplay of the individual cognition of each team member and team process behaviors October 11, 2002 Nancy Cooke

9 Team Cognition Framework
(Diagram: individual knowledge and team process behaviors feed into team knowledge, which feeds team performance.)
October 11, 2002 Nancy Cooke

10 Team Cognition Framework
(Diagram: at the collective level, individual knowledge and team process behaviors combine into team knowledge; team knowledge, viewed at the holistic level, feeds team performance.)
October 11, 2002 Nancy Cooke

11 Team Knowledge
Long-term knowledge: taskwork and teamwork
Fleeting knowledge (i.e., momentary understanding, situation model)
October 11, 2002 Nancy Cooke

12 Measurement Limitations
Measures tend to assume homogeneous teams
Measures tend to target the collective level
Aggregation methods are limited
Measures are needed that target the more dynamic and fleeting knowledge
Measures are needed that target different types of long-term team knowledge
A broader range of knowledge elicitation methods is needed
Streamlined and embedded measures are needed
Newly developed measures require validation
October 11, 2002 Nancy Cooke

13 Other Related Work
Groupthink (Janis, 1972)
Distributed Cognition (Hutchins, 1991)
Common Ground in Discourse (Clark & Schaefer, 1987; Wilkes-Gibbs & Clark, 1992)
Group Decision Support (Fulk, Schmitz, & Ryu, 1995)
Social Decision Schemes (Davis, 1973; Hinsz, 1999)
Transactive Memory (Wegner, 1986)
Shared Mental Models (Cannon-Bowers, Salas, & Converse, 1993)
October 11, 2002 Nancy Cooke

14 Why Do We Care?
Outcome measures of team performance do not reveal why performance is effective or ineffective
Team cognition is assumed to contribute to team performance
Understanding the team cognition behind team performance should facilitate interventions (design, training, selection) to improve that performance
October 11, 2002 Nancy Cooke

15 Team Cognition and Functions of Cognitive Task Analysis
Elicitation: interviews, observations, and think-aloud are used to make knowledge explicit
Assessment: judgments are made regarding specific elicited knowledge (e.g., accuracy, intrateam similarity)
Diagnosis: patterns in elicited knowledge (i.e., symptoms associated with dysfunctional or exceptional performance) are tied to a diagnosis
October 11, 2002 Nancy Cooke

16 Questions or Comments? October 11, 2002 Nancy Cooke

17 Shared Mental Models October 11, 2002 Nancy Cooke

18 “Shared Mental Models”
Shared Knowledge October 11, 2002 Nancy Cooke

19 “Shared” vs. Sharing
“Shared beliefs”: to hold in common = to have the same knowledge
“Share the pie”: to distribute = to have compatible knowledge
October 11, 2002 Nancy Cooke

20 The “Apples and Oranges” Problem
Measures to assess team knowledge often assume knowledge homogeneity among team members:
Shared knowledge = similar knowledge
Accuracy is relative to a single referent
(Diagram: Person A and Person B are each compared to the same referent.)
October 11, 2002 Nancy Cooke

21 Teams, by Definition, Consist of “Apples and Oranges”
Examples: airport incident command center, telemedicine
October 11, 2002 Nancy Cooke

22 “Shared” Knowledge
Shared = Common (diagram: Person A and Person B hold the same portion of the knowledge base)
October 11, 2002 Nancy Cooke

23 “Shared” Knowledge Shared = Complementary October 11, 2002 Nancy Cooke

24 “Shared” Knowledge Shared = Common and Complementary October 11, 2002 Nancy Cooke

25 “Shared” Knowledge Common and Complementary Knowledge and Shared Perspectives/Varied Granularity October 11, 2002 Nancy Cooke

26 “Shared” Knowledge
Common and complementary knowledge and shared perspectives, plus: conflicting knowledge, irrelevant knowledge, and areas of no coverage
October 11, 2002 Nancy Cooke

27 An Approach to the Apples and Oranges Problem
Measures of team knowledge with heterogeneous accuracy metrics October 11, 2002 Nancy Cooke

28 Experimental Context
Five studies
Two different 3-person tasks: UAV (Uninhabited Air Vehicle) and Navy helicopter rescue-and-relief
Procedure: training, several missions, knowledge measurement sessions
Manipulations: co-located vs. distributed environments, training regime, knowledge-sharing capabilities, workload
October 11, 2002 Nancy Cooke

29 Experimental Context: Measures
Team performance: composite measure
Team process: observer ratings and critical incident checklist
Knowledge: taskwork and teamwork knowledge, team situation awareness
Other: communication (flow and audio records), video, computer events, leadership, demographic questions, working memory
October 11, 2002 Nancy Cooke

30 Long-term Taskwork Knowledge
Factual tests, e.g., “The camera settings are determined by a) altitude, b) airspeed, c) light conditions, d) all of the above.”
Psychological scaling, e.g., “How related is airspeed to restricted operating zone?”
October 11, 2002 Nancy Cooke
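
As a concrete illustration of the psychological-scaling measure, the sketch below (Python) generates one relatedness query per concept pair and correlates two members' rating vectors as a simple intrateam-similarity index. The concept list, the 1-5 scale anchors, and the example ratings are assumptions for illustration, not the actual CERTT materials.

```python
from itertools import combinations
import numpy as np

# Hypothetical UAV-task concepts; the actual concept list is not given in the slides.
CONCEPTS = ["airspeed", "altitude", "camera settings",
            "restricted operating zone", "target", "waypoint"]
PAIRS = list(combinations(CONCEPTS, 2))

def scaling_queries():
    """One relatedness query per concept pair, patterned on the slide's example item."""
    return [f"How related is {a} to {b}? (1 = highly related, 5 = unrelated)"
            for a, b in PAIRS]

def intrateam_similarity(ratings_a, ratings_b):
    """Correlate two members' pairwise-rating vectors (same pair order for both):
    one simple way to quantify how similar their taskwork knowledge looks."""
    return float(np.corrcoef(ratings_a, ratings_b)[0, 1])

# Illustrative ratings only, one per concept pair in PAIRS order.
avo = [1, 2, 5, 4, 3, 1, 2, 5, 4, 3, 2, 1, 4, 5, 3]
plo = [1, 3, 5, 4, 3, 2, 2, 5, 5, 3, 2, 1, 4, 4, 3]
print(len(PAIRS), "pairs; similarity =", round(intrateam_similarity(avo, plo), 2))
```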

31 Long-term Teamwork Knowledge
Teamwork checklist: given a specific task scenario, who passes what information to whom?
___ AVO gives airspeed info to PLO
___ DEMPC gives waypoint restrictions to AVO
___ PLO gives current position to AVO
AVO = Air Vehicle Operator, PLO = Payload Operator, DEMPC = Navigator
October 11, 2002 Nancy Cooke

32 Team Situation Awareness
Assess the accuracy and similarity of team members' situation models
SPAM (Situation Present Assessment Method) queries (Durso et al., 1998): the display is not interrupted; queries concern future events
Team members are queried in random order at a designated point in the scenario, within a 5-minute interval
Example query: “How many targets are left to photograph?”
October 11, 2002 Nancy Cooke
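
A minimal sketch of how responses to one such query might be scored, assuming numeric answers, a ground-truth value taken from the mission log, and a simple tolerance-based scoring rule; the rule and the agreement index are assumptions, not the published CERTT procedure.

```python
def sa_accuracy(answers, ground_truth, tolerance=0):
    """Score each member's numeric answer to an SA query against ground truth.
    The tolerance-based scoring rule is an assumption for illustration."""
    return {member: int(abs(ans - ground_truth) <= tolerance)
            for member, ans in answers.items()}

def sa_similarity(answers):
    """Fraction of member pairs giving the same answer: one simple agreement index."""
    members = list(answers)
    pairs = [(a, b) for i, a in enumerate(members) for b in members[i + 1:]]
    return sum(answers[a] == answers[b] for a, b in pairs) / len(pairs)

# Query from the slide: "How many targets are left to photograph?"
answers = {"AVO": 3, "PLO": 3, "DEMPC": 4}
print(sa_accuracy(answers, ground_truth=3))   # {'AVO': 1, 'PLO': 1, 'DEMPC': 0}
print(round(sa_similarity(answers), 2))       # 0.33
```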

33 Traditional Accuracy Metrics
(Diagram: a single team referent; the Air Vehicle Operator team member scores .50, i.e., 50% accuracy, against that team referent.)
October 11, 2002 Nancy Cooke

34 Heterogeneous Accuracy Metrics
(Diagram: team member AVO is scored against the team referent and against each role referent, AVO, PLO, and DEMPC, yielding overall accuracy = .50, positional accuracy = 1.0, and interpositional accuracy = .17.)
AVO = Air Vehicle Operator, PLO = Payload Operator, DEMPC = Navigator
October 11, 2002 Nancy Cooke
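
The sketch below shows one way to compute the three heterogeneous accuracy scores, assuming each referent is represented as a vector of pairwise relatedness ratings and that accuracy is indexed by the correlation between a member's ratings and a referent's ratings; both assumptions are for illustration and are not necessarily the scoring used in the CERTT studies.

```python
import numpy as np

def accuracy(member_ratings, referent_ratings):
    """Illustrative accuracy index: correlation between a member's pairwise
    relatedness ratings and a referent's ratings."""
    return float(np.corrcoef(member_ratings, referent_ratings)[0, 1])

def heterogeneous_accuracy(member_ratings, role, referents):
    """Score one member against the team referent (overall), the member's own
    role referent (positional), and the other roles' referents (interpositional)."""
    overall = accuracy(member_ratings, referents["TEAM"])
    positional = accuracy(member_ratings, referents[role])
    others = [accuracy(member_ratings, referents[r])
              for r in referents if r not in ("TEAM", role)]
    interpositional = sum(others) / len(others)
    return {"overall": round(overall, 2),
            "positional": round(positional, 2),
            "interpositional": round(interpositional, 2)}

# Illustrative referent vectors only (in practice, one rating per concept pair).
referents = {"TEAM":  [1, 2, 3, 4, 5],
             "AVO":   [1, 2, 3, 5, 4],
             "PLO":   [2, 1, 4, 3, 5],
             "DEMPC": [3, 3, 2, 5, 1]}
avo_member = [1, 2, 3, 5, 4]  # happens to match the AVO referent exactly
print(heterogeneous_accuracy(avo_member, "AVO", referents))
```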

35 Results Across Studies
Taskwork knowledge is predictive of team performance, but:
This holds for psychological scaling, not factual tests
Timing of the knowledge test is critical
October 11, 2002 Nancy Cooke

36 Knowledge Profiles of Two Tasks
(Table: the knowledge metrics (overall accuracy, intrateam similarity, positional accuracy, interpositional accuracy) that predict performance differ between the common-knowledge UAV task and the distributed-knowledge Navy helicopter task.)
The knowledge profile characterizing effective teams depends on the task (UAV vs. Navy).
October 11, 2002 Nancy Cooke

37 Knowledge Profiles of Two Tasks
UAV task: command-and-control, interdependent, knowledge sharing; common-knowledge profile
Navy helicopter task: planning and execution, less interdependent, face-to-face; complementary-knowledge profile
October 11, 2002 Nancy Cooke

38 Knowledge Acquisition
(Diagram: the procedure runs from training to mission experience; taskwork knowledge is acquired in training, teamwork knowledge during missions.)
Teamwork knowledge is acquired through mission experience, and its acquisition seems dependent on a foundation of taskwork knowledge acquired in training.
October 11, 2002 Nancy Cooke

39 Results: Team Situation Awareness
Team SA mirrors the performance acquisition function and generally improves with mission experience
Team SA is generally a good predictor of team performance (especially a repeated query)
(SA and performance data are from the first UAV study.)
October 11, 2002 Nancy Cooke

40 Implications of Heterogeneous Metrics
Can deal with the “apples and oranges” issue
Can assess the knowledge underlying task performance
Knowledge profiles of tasks can inform training and design interventions
October 11, 2002 Nancy Cooke

41 Future Directions on the Apples and Oranges Problem
Apply metrics to fleeting knowledge
Embed knowledge measures in the task
Need a taxonomy of tasks and additional profile work
Need to connect the knowledge profile (symptoms) to a diagnosis of team dysfunction or excellence
October 11, 2002 Nancy Cooke

42 Questions or Comments? October 11, 2002 Nancy Cooke

43 Holistic CTA for Teams October 11, 2002 Nancy Cooke

44 Team Cognition Framework
(Diagram, repeated from earlier: at the collective level, individual knowledge and team process behaviors combine into team knowledge; team knowledge, viewed at the holistic level, feeds team performance.)
October 11, 2002 Nancy Cooke

45 The “Sum of All Team Members” Problem
The problem: measures are taken at the individual level and aggregated (the collective level), as opposed to being taken at the holistic level.
October 11, 2002 Nancy Cooke

46 The Sum of All Team Members Problem
Aggregating individual data is problematic given the apples and oranges problem
Team process behavior is missing from collective measures
Cognition at the holistic level should be more directly related to team performance
October 11, 2002 Nancy Cooke

47 Our Approach to the Sum of All Team Members Problem
Consensus assessment tasks: consensus concept ratings, consensus teamwork checklist, consensus SA queries
Communication as a measure of team cognition
October 11, 2002 Nancy Cooke

48 Consensus Assessment Tasks: An Example with Concept Ratings
Step 1: Individual concept ratings are collected. Present to each individual: airspeed – altitude (1 = related, 5 = unrelated). Responses: AVO = 4, PLO = 1, DEMPC = 5.
Step 2: Consensus ratings are collected. Present the prior responses (AVO = 4, PLO = 1, DEMPC = 5) to the team for discussion:
PLO: “Well, I said related, since my camera settings for shutter speed and focus are dependent on each of these values.”
DEMPC: “OK, let's go with that. 1 it is.”
AVO = Air Vehicle Operator, PLO = Payload Operator, DEMPC = Navigator
October 11, 2002 Nancy Cooke
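
A tiny sketch of the data contrast this procedure is after: the consensus (holistic) rating versus a simple aggregate of the individual (collective-level) ratings. Using the mean as the collective aggregate is an assumption for illustration; the values are the ones from the slide's example.

```python
def consensus_vs_collective(individual, consensus):
    """Contrast the holistic (team consensus) rating for one concept pair with
    a collective-level aggregate of the individual ratings (here, the mean)."""
    collective = sum(individual.values()) / len(individual)
    return {"collective_mean": round(collective, 2), "consensus": consensus}

# Example from the slide: airspeed-altitude; AVO = 4, PLO = 1, DEMPC = 5;
# after discussion the team went with the PLO's rating of 1.
print(consensus_vs_collective({"AVO": 4, "PLO": 1, "DEMPC": 5}, consensus=1))
# {'collective_mean': 3.33, 'consensus': 1}
```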

49 Consensus Assessment Tasks: Results
Consensus measures correlate moderately with performance compared to collective measures
Perhaps consensus does not adequately tap in-mission process behavior
Although collective measures and process behaviors predict team performance for co-located teams better than holistic measures, this is not true for distributed teams
October 11, 2002 Nancy Cooke

50 Communication as a Window to Team Cognition: The “Good”
Observable team behavior diagnostic of team performance
Think aloud “in the wild”
Reflects team cognition at the holistic level
Rich, multidimensional (amount, flow, speech acts, content)
October 11, 2002 Nancy Cooke

51 Communication as a Window to Team Cognition: The “Bad”
Analyses do not fully exploit the communication data (e.g., its dynamic, sequential aspect)
Communication data are often reduced to time spent talking
October 11, 2002 Nancy Cooke

52 Communication as a Window to Team Cognition: The “Ugly”
Labor-intensive transcription, coding, and interpretation
October 11, 2002 Nancy Cooke

53 Our Approach to Solving the Sum of All Team Members Problem Via Communication Analysis
Communication flow analysis
Content analysis using LSA
October 11, 2002 Nancy Cooke

54 Analyzing Flow: CERTT Lab ComLog Data
Team members use push-to-talk intercom buttons to communicate.
At regular intervals, speaker and listener identities are logged.
October 11, 2002 Nancy Cooke
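
A minimal sketch of what can be done with such a log, assuming each logged sample reduces to a (speaker, listener) pair; the exact ComLog record format is not given in the slides. The flow matrix tallies who talked to whom, and its row sums give a simple speaker-dominance index (one of the flow measures listed later).

```python
from collections import Counter

def flow_matrix(comlog):
    """Tally who talked to whom from push-to-talk log samples, where each
    sample is assumed to be a (speaker, listener) pair recorded at a regular
    interval while a speaker's button is down."""
    counts = Counter(comlog)
    people = sorted({s for s, _ in comlog} | {l for _, l in comlog})
    return {s: {l: counts[(s, l)] for l in people} for s in people}

# Illustrative log samples only.
log = [("AVO", "PLO"), ("AVO", "PLO"), ("PLO", "AVO"),
       ("DEMPC", "AVO"), ("AVO", "DEMPC"), ("PLO", "DEMPC")]
fm = flow_matrix(log)
for speaker, row in fm.items():
    print(speaker, row)
dominance = {s: sum(row.values()) for s, row in fm.items()}  # amount of talk per speaker
print("dominance:", dominance)
```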

55 Analyzing Flow: ProNet (Procedural Networks; Cooke, Neville, & Rowe, 1996)
Nodes define events that occur in a sequence. Example from the UAV study: 6 nodes (Abeg, Aend, Pbeg, Pend, Dbeg, Dend).
ProNet finds representative event sequences.
Quantitative: chain lengths predict performance.
Mission 2: R² = .509, F(2, 8) = 4.144, p = .058
Mission 3: R² = .275, F(1, 9) = 3.415, p = .098
Mission 5: R² = .628, F(2, 8) = 5.074, p = .051
October 11, 2002 Nancy Cooke
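
The sketch below is a simplified stand-in for the quantitative step: it extracts event chains that recur in a team's speech-event sequence and then regresses performance on chain length across teams. It is not the ProNet algorithm itself (which builds Pathfinder networks from transition data); the chain-counting rule, thresholds, and the regression data are illustrative assumptions.

```python
from collections import Counter
import numpy as np

def frequent_chains(sequence, max_len=4, min_count=3):
    """Count event n-grams (chains) in a speech-event sequence and keep those
    that recur at least min_count times; a rough proxy for ProNet chains."""
    chains = Counter()
    for n in range(2, max_len + 1):
        for i in range(len(sequence) - n + 1):
            chains[tuple(sequence[i:i + n])] += 1
    return [(chain, k) for chain, k in chains.most_common() if k >= min_count]

def longest_chain_length(sequence):
    """Length of the longest recurring chain for one team (0 if none)."""
    return max((len(chain) for chain, _ in frequent_chains(sequence)), default=0)

# Illustrative event sequence over the slide's six nodes.
seq = ["Abeg", "Aend", "Pbeg", "Pend", "Abeg", "Aend", "Pbeg", "Pend",
       "Dbeg", "Dend", "Abeg", "Aend", "Pbeg", "Pend"]
print("longest recurring chain:", longest_chain_length(seq))

# Regress team performance on chain length across teams (made-up values,
# just to show the shape of the analysis behind the slide's R-squared figures).
chain_len = [2, 3, 3, 4, 2, 4]
perf = [420, 500, 480, 560, 430, 590]
slope, intercept = np.polyfit(chain_len, perf, 1)
print("slope:", round(slope, 1), "intercept:", round(intercept, 1))
```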

56 Analyzing Flow: ProNet (Procedural Networks)
Qualitative: communication patterns are predictive of performance.
(Diagram: networks over the nodes Abeg, Aend, Pbeg, Pend, Dbeg, Dend for Team 2 before and after the PLO-DEMPC fight.)
AVO = Air Vehicle Operator, PLO = Payload Operator, DEMPC = Navigator
October 11, 2002 Nancy Cooke

57 Content Analysis with Latent Semantic Analysis (LSA)
Landauer, Foltz, & Laham, 1998
A tool for measuring cognitive artifacts based on semantic information
Provides measures of the semantic relatedness, quality, and quantity of information contained in discourse
Automatic and fast
Derives the meaning of words through analyses of large corpora: a large constraint-satisfaction problem of estimating the meaning of many passages from their contained words (like factor analysis)
Represents units of text (words, sentences, discourse, essays) as vectors in a high-dimensional semantic space based on correlations of usage across text contexts
Computes the degree of semantic similarity between any two units of text
October 11, 2002 Nancy Cooke
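
For orientation, here is a rough sketch of an LSA-style pipeline: term weighting followed by dimensionality reduction (SVD), then cosine similarity between text units in the reduced space. It uses scikit-learn as a stand-in and is not the specific LSA implementation used in the CERTT/Foltz work; the corpus, dimensionality, and function names are assumptions.

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.decomposition import TruncatedSVD
from sklearn.metrics.pairwise import cosine_similarity

def build_space(corpus_docs, n_dims=300):
    """Build a reduced semantic space from a document collection (e.g., training
    material, expert interviews, mission utterances): term weighting plus SVD,
    roughly analogous to constructing an LSA space."""
    vectorizer = TfidfVectorizer(stop_words="english")
    dtm = vectorizer.fit_transform(corpus_docs)
    n_dims = max(1, min(n_dims, dtm.shape[1] - 1, dtm.shape[0] - 1))
    svd = TruncatedSVD(n_components=n_dims).fit(dtm)
    return vectorizer, svd

def transcript_similarity(vectorizer, svd, transcript_a, transcript_b):
    """Cosine between two transcripts' vectors in the reduced space; the kind of
    quantity from which transcript-quality measures can be derived (e.g., by
    comparing a team's dialogue to the dialogue of high-scoring teams)."""
    vecs = svd.transform(vectorizer.transform([transcript_a, transcript_b]))
    return float(cosine_similarity(vecs[:1], vecs[1:])[0, 0])

# Hypothetical usage (the corpus and transcripts would come from the CERTT data):
# vectorizer, svd = build_space(corpus_docs)
# score = transcript_similarity(vectorizer, svd, team_transcript, reference_transcript)
```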

58 Content Analysis with Latent Semantic Analysis (LSA): An Example from UAV Study 1
67 transcripts from missions 1-7, XML-tagged with speaker and listener information
~2,700 minutes of spoken dialogue; 20,545 separate utterances (turns); 232,000 words (660 KB of text)
Semantic space: 22,802 documents (utterances from dialogues, training material, interviews with domain experts)
Derived several statistical measures of the quality of each transcript
October 11, 2002 Nancy Cooke

59 Content Analysis with Latent Semantic Analysis (LSA)
Example transcripts and team performance scores: Team 1, Mission 3: 750; Team 7, Mission 3: 580; Team 3, Mission 4: 620; Team 5, Mission 4: 460; Team 6, Mission 3: 490; Team 8, Mission 3: ????; Team 8, Mission 6: 560
The LSA-based communication score predicts performance (r = .79).
October 11, 2002 Nancy Cooke

60 Other Communication Analysis Approaches
Flow analyses: measure of speaker dominance, deviations from ideal flow, clustering model-based patterns
Content analyses: automatic transcript coding, coherence in team dialogue, measures of individual contributions
October 11, 2002 Nancy Cooke

61 Conclusions October 11, 2002 Nancy Cooke

62 Summary
Teams think.
Understanding team cognition is critical for diagnosis of team dysfunction or excellence and later intervention.
Measuring team cognition is critical for understanding it.
There are challenges (e.g., apples and oranges, sum of all team members).
October 11, 2002 Nancy Cooke

63 More to Do
Further application of heterogeneous metrics
Embedded, streamlined knowledge measures
Further validation
Investigate generality across tasks
Individual cognitive differences
Beyond assessment to diagnosis
October 11, 2002 Nancy Cooke

64 Contact
Nancy J. Cooke, New Mexico State University, cooke@crl.nmsu.edu
Moving to Arizona State University East in January 2003
October 11, 2002 Nancy Cooke

65 Bibliography
Methodological Reviews
Cooke, N. J. (1999). Knowledge elicitation. In F. T. Durso (Ed.), Handbook of Applied Cognition. UK: Wiley.
Cooke, N. J., Salas, E., Cannon-Bowers, J. A., & Stout, R. (2000). Measuring team knowledge. Human Factors, 42.
Cooke, N. J., Salas, E., Kiekel, P. A., & Bell, B. (in press). Advances in measuring team cognition. In E. Salas & S. M. Fiore (Eds.), Team cognition: Process and performance at the inter- and intra-individual level. Washington, DC: American Psychological Association.
Empirical Studies
Cooke, N. J., Cannon-Bowers, J. A., Kiekel, P. A., Rivera, K., Stout, R., & Salas, E. (2000). Improving teams' interpositional knowledge through cross training. Proceedings of the Human Factors and Ergonomics Society 44th Annual Meeting.
Cooke, N. J., Kiekel, P. A., & Helm, E. (2001). Measuring team knowledge during skill acquisition of a complex task. International Journal of Cognitive Ergonomics: Special Section on Knowledge Acquisition, 5.
CERTT Lab & UAV STE
Cooke, N. J., Rivera, K., Shope, S. M., & Caukwell, S. (1999). A synthetic task environment for team cognition research. Proceedings of the Human Factors and Ergonomics Society 43rd Annual Meeting.
Cooke, N. J., & Shope, S. M. (2002). The CERTT-UAV task: A synthetic task environment to facilitate team research. Proceedings of the Advanced Simulation Technologies Conference: Military, Government, and Aerospace Simulation Symposium. San Diego, CA: The Society for Modeling and Simulation International.
Cooke, N. J., & Shope, S. M. (in press). Designing a synthetic task environment. In S. G. Schiflett, L. R. Elliott, E. Salas, & M. D. Coovert (Eds.), Scaled Worlds: Development, Validation, and Application. UK: Ashgate.
Communication Analyses
Kiekel, P. A., Cooke, N. J., Foltz, P. W., & Shope, S. M. (2001). Automating measurement of team cognition through analysis of communication data. In M. J. Smith, G. Salvendy, D. Harris, & R. J. Koubek (Eds.), Usability Evaluation and Interface Design. Mahwah, NJ: Lawrence Erlbaum Associates.
Kiekel, P. A., Cooke, N. J., Foltz, P. W., Gorman, J. C., & Martin, M. J. (2002). Some promising results of communication-based automatic measures of team cognition. Proceedings of the Human Factors and Ergonomics Society 46th Annual Meeting.
October 11, 2002 Nancy Cooke

66 References
Cannon-Bowers, J. A., Salas, E., & Converse, S. (1993). Shared mental models in expert team decision making. In N. J. Castellan, Jr. (Ed.), Current issues in individual and group decision making. Hillsdale, NJ: Lawrence Erlbaum Associates.
Clark, H. H., & Schaefer, E. F. (1987). Collaborating on contributions to conversations. Language and Cognitive Processes, 2.
Cooke, N. J., Neville, K. J., & Rowe, A. L. (1996). Procedural network representations of sequential data. Human-Computer Interaction, 11.
Davis, J. H. (1973). Group decision and social interaction: A theory of social decision schemes. Psychological Review, 80.
Durso, F. T., Hackworth, C. A., Truitt, T. R., Crutchfield, J., Nikolic, D., & Manning, C. A. (1998). Situation awareness as a predictor of performance in en route air traffic controllers. Air Traffic Control Quarterly, 5, 1-20.
Fulk, J., Schmitz, J., & Ryu, D. (1995). Cognitive elements in the social construction of communication technology. Management Communication Quarterly, 8.
Hinsz, V. B. (1999). Group decision making with responses of a quantitative nature: The theory of social decision schemes for quantities. Organizational Behavior and Human Decision Processes, 80.
Hutchins, E. (1991). The social organization of distributed cognition. In L. B. Resnick, J. M. Levine, & S. D. Teasley (Eds.), Perspectives on socially shared cognition. Washington, DC: American Psychological Association.
Janis, I. L. (1972). Victims of groupthink. Boston: Houghton Mifflin.
Landauer, T. K., Foltz, P. W., & Laham, D. (1998). An introduction to Latent Semantic Analysis. Discourse Processes, 25.
Wegner, D. M. (1986). Transactive memory: A contemporary analysis of the group mind. In B. Mullen & G. Goethals (Eds.), Theories of group behavior. New York: Springer-Verlag.
Wilkes-Gibbs, D., & Clark, H. H. (1992). Coordinating beliefs in conversation. Journal of Memory and Language, 31.
October 11, 2002 Nancy Cooke

67 Questions or Comments? October 11, 2002 Nancy Cooke

