1 The Credibility of NOSS Data
Chris Henry
The University of Texas Human Factors Research Project, The University of Texas at Austin
2nd ICAO TEM & NOSS Symposium, Washington DC, February 8, 2007

2 Presentation Objectives
1. NOSS Characteristics in Action – Part I: Ensuring Consistent Data
2. NOSS Characteristics in Action – Part II: Gaining Trust
3. Relating NOSS to other sources of information
4. Discussion of the FAA Laboratory Study

3 Ensuring Consistent Data

4 What should we be looking for? Developing the TEM taxonomy for ATC
Review accidents and incidents
Consult subject matter experts
Listen to what our observers say and re-evaluate
The TEM Framework codes:
Threats – 120 threat codes to date
Errors – 80 error codes to date
Undesired States – 30 US codes to date
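To make the coding concrete, here is a minimal sketch (in Python) of how a single observation could be recorded against threat, error, and undesired-state codes. The class structure and code identifiers (e.g. "T-EQP-01") are hypothetical illustrations, not the actual TEM code set.

from collections import Counter
from dataclasses import dataclass, field
from typing import List

@dataclass
class TEMEvent:
    category: str      # "threat", "error", or "undesired_state"
    code: str          # taxonomy code, e.g. "T-EQP-01" (hypothetical)
    description: str
    managed: bool      # was the event detected and managed by the controller?

@dataclass
class Observation:
    sector: str
    events: List[TEMEvent] = field(default_factory=list)

obs = Observation(sector="Sector X")
obs.events.append(TEMEvent("threat", "T-EQP-01", "Radar data block overlap", managed=True))
obs.events.append(TEMEvent("error", "E-COM-03", "Read-back error not corrected", managed=False))

# Tally coded events by category, as an analyst might when summarizing a data set
print(Counter(e.category for e in obs.events))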

5 Ueberlingen – Threats
Threats similar to those seen in the Ueberlingen accident, as seen in normal operations:
Maintenance threats – Frequently observed in NOSS
Equipment and software threats – Frequently observed in NOSS
Non-standard traffic – Frequently observed in NOSS
Workspace / difficult-to-access information – Has been observed in NOSS
Communication difficulties with other controllers – Has been observed in NOSS
Simultaneous and blocked R/T transmissions – Frequently observed in NOSS

6 NOSS Characteristics – Ensuring Consistency
How do we know observers are reporting events consistently?
Common framework – TEM
Standardized data collection instrument
Observers are trained and calibrated

7 NOSS Characteristics – Ensuring Consistency: Reviewing the Observations
Data Verification Phase I: Analyst Review – check for omissions/inconsistencies
Data Verification Phase II: Data Verification Roundtables – organizational experts and the analyst review TEM data to ensure accuracy and consistency with procedures

8 Does NOSS Capture an Accurate Snapshot of Normal Operations?

9 NOSS Success Factors
NOSS success is dependent upon methodology and execution
Low controller trust = low-quality data, because there will be no differentiation between NOSS and proficiency checks
[Diagram: compares natural performance with nobody observing, observation by a NOSS observer, and "angel" performance under a formal check by the regulator, in terms of NOSS value and controller trust]

10 How do we know NOSS provides an accurate snapshot?
Presenting the results to, and receiving feedback from:
NOSS Observers – the final report is a good representation of what they saw
Air Traffic Controllers from the observed complexes – "That's how we move traffic", "You've got us"
Check and Audit Controllers – more surprised than other groups

11 Does NOSS provide a valid and accurate snapshot? Are observed controllers behaving normally?
Procedural non-compliance: some numbers
Non-operational conversation: 20-40% of observations
Non-standard phraseology: 20-40% of observations
Procedural non-compliance: some anecdotes
Dozing off
Walking away from position
Reading magazines

12 Does NOSS provide a valid and accurate snapshot? Would these situations be considered reportable events at your ANSP?
Clearing aircraft below MVAs
Clearing an aircraft to land on a runway being controlled by another controller
Expired SAR times not investigated
Aircraft entering the next sector (15 miles) without being handed off
Aircraft progressing through entire sectors while not on frequency

13 Contributions to SMS: A Few Concrete Examples

14 NOSS: Augmenting other sources of information – Case One
Complex A had an elevated number of incidents pertaining to strip-indicated / actual altitudes
NOSS detected an elevated number of associated USs
NOSS was able to provide additional information on the errors leading to the USs and incidents

15 NOSS: Augmenting other sources of information – Case Two
Sector X had a reputation as being a challenging piece of airspace, but there was little objective information to substantiate it
More threats, mismanaged threats, errors, and undesired states in Sector X than in other sectors
Impartial observers agree: the sector is a mess!

16 The FAA Laboratory Study

17 FAA Laboratory Study
Purpose – to assess how consistently multiple observers would record events
Scope (5 days):
Observer training (2 days)
Practice observations on a high-fidelity simulator (1 day)
Individual coding exercise (1 day)
Simulated data verification session (1 day)

18 Qualitative Findings
TEM made sense; easy to learn & apply (face validity)
Very high degree of overlap in what the observers captured during the simulated traffic sessions
QA participants reported NOSS offered a more structured method of observing than the current regime

19 Quantitative Findings
Cohen's Kappa – a measure of observer agreement:
Threats – .76
Errors – .87
Undesired States – .71
The gold standard for Cohen's Kappa is .70
Observers consistently identified and coded threats, errors, and undesired states
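For reference, Cohen's kappa corrects raw agreement between two observers for the agreement expected by chance. In standard notation, with p_o the observed proportion of agreement and p_e the proportion expected by chance:

\[ \kappa = \frac{p_o - p_e}{1 - p_e} \]

As an illustrative example (these numbers are not from the study): two observers agreeing on 90% of coded events, with 58% agreement expected by chance, would give kappa = (0.90 - 0.58) / (1 - 0.58) ≈ 0.76.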

20 Summary
NOSS was developed to discover what is going on in normal operations
NOSS is premised on data consistency and controller trust
Field trials indicate that NOSS complements and augments existing sources of information
The NOSS process serves not only as a source of safety information, but as a catalyst for safety change

