1 CEN3722 Human Computer Interaction Automation
Dr. Ron Eaglin

2 Objectives
Define the term automation
Provide examples of automated systems
Describe appropriate circumstances for using automation in human-machine systems
Describe potential problems with automation
Define the term situational awareness
Describe the levels of situational awareness
Describe how automation affects situational awareness

3 Automation
Define: when a machine assumes responsibility for a task usually performed by a human operator.
Applications:
Aviation (automatic pilot)
Nuclear power plant control
Automobiles

4 Nuclear Power Plant Control
Photo courtesy of NASA

5 Automation Purposes
Performing functions that the human operator cannot perform because of inherent limitations:
Control guidance of booster rockets
Complex nuclear reactions
Robots for manipulation of hazardous materials

6 Automation Purposes
Performing functions that the human operator can do, but performs poorly or at the cost of high workload:
Autopilots that control many aspects of flight
Ground proximity warning systems
Traffic alert and collision avoidance system (TCAS)

7 Automation Purposes
Augmenting or assisting performance in tasks in which humans show limitations:
Not used as a replacement for human operators, but as an aid in peripheral tasks
Automatic tagging (symbols, flight info) on an ATC display
Computer-displayed checklists of flight procedures

8 Air Traffic Control System

9 Automation Problems
Unreliability:
A component may fail or be mis-programmed; the more components, the more that can fail
The operator who sets up the automation may make an error (KAL 007)
The automated system does what it's supposed to do, but the logic behind the system is faulty

10 KAL 007
Korean Air Lines Flight 007 was shot down by a Soviet Su-15 interceptor on September 1, 1983.
KAL 007's route (Anchorage to Seoul) was the Romeo 20 airway, which passes 18 miles from Soviet airspace.
Ten minutes after take-off, KAL 007 began to deviate north from its assigned route, and continued to do so for 5½ hours. The autopilot was operating in HEADING rather than INS mode, and the aircraft eventually entered Russian airspace.
Part of the cause was attributed to autopilot system errors.
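The mode confusion described above can be sketched as a toy model (the function and variable names are hypothetical, not real avionics code): in HEADING mode the autopilot holds whatever heading was captured at engagement, while in INS mode it steers along the programmed route.

```python
# Toy sketch of a HEADING vs. INS mode error (illustrative names only).
# In HEADING mode the autopilot flies a fixed heading; in INS mode it
# tracks the programmed route. If the crew believes INS is engaged when
# HEADING actually is, the aircraft drifts steadily off the intended track.

def autopilot_command(mode, held_heading, ins_track):
    """Return the heading (degrees) the autopilot will fly in a given mode."""
    if mode == "HEADING":
        return held_heading      # constant heading, ignores the route
    if mode == "INS":
        return ins_track         # follows the programmed waypoints
    raise ValueError(f"unknown mode: {mode}")

intended_track = 245             # degrees, toward the next waypoint
held_heading = 246               # heading captured when the mode engaged

# The crew expects route-following (245), but HEADING mode returns 246;
# even a small error compounds into a large lateral deviation over hours.
actual = autopilot_command("HEADING", held_heading, intended_track)
```

The point of the sketch is that both modes look identical moment to moment; only over time does the constant-heading behavior diverge from the route the crew believes is being flown.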

11 Problems in Automation
Trust: calibration and mistrust
The perceived reliability of the system will affect the operator's trust in the automation
Calibration: trust the system in proportion to its reliability
Mistrust may result from failure to understand the nature of the automated algorithms
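One way to picture calibration, as a toy model not drawn from the lecture, is an operator whose trust tracks the automation's observed reliability: trust drifts up after each success and down after each failure, so over time it settles near the system's true success rate.

```python
# Toy model of trust calibration (illustrative, not from the lecture):
# trust is a value in [0, 1] nudged toward 1.0 on each observed success
# and toward 0.0 on each observed failure (an exponential moving average).

def update_trust(trust, outcome_ok, rate=0.1):
    """Move trust a fraction `rate` of the way toward the observed outcome."""
    target = 1.0 if outcome_ok else 0.0
    return trust + rate * (target - trust)

trust = 0.5                       # start with neutral trust
for ok in [True, True, True, False, True]:
    trust = update_trust(trust, ok)
# Trust ends a bit above 0.5: mostly successes, one failure.
```

A well-calibrated operator behaves like this update rule; over-trust corresponds to holding trust high regardless of observed failures, and mistrust to holding it low despite observed reliability.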

12 Problems in Automation
Over-trust and complacency: trusting the automation more than it warrants
Complacency: failure to monitor adequately; a real problem only when something fails and the human needs to intervene
Detection: a complacent operator is slower to detect failures
Situation awareness: active participants are more aware of the state of a system than passive observers
Skill loss: the long-term consequence is that the operator is deskilled, a gradual loss of skills experienced by virtue of not having been an active perceiver, decision maker, or controller

13 Problems in Automation
Automation and workload
The goal of automation is often to reduce operator workload, allowing the user to apply mental resources to other tasks
Sometimes automation reduces workload too far, leading to a low level of arousal

14 Problems in Automation
Loss of human cooperation
Non-automated, multi-person systems may allow nonverbal, subtle types of communication; e.g., ATC can tell that a pilot is in trouble by the pilot's voice
Digital datalink replaces voice air-to-ground communications with digital messages that are typed and appear on a display panel

15 Problems in Automation
Job satisfaction: goes beyond performance to issues of morale
A reduction in morale and job satisfaction may have serious implications
Etymology: the word sabotage derives from sabots (wooden clogs) reputedly used to stop the machines of automation

16 Automation Critical issues
Design from the human perspective:
Give operators purposeful, meaningful, and relevant tasks so they remain aware of system status and operation
Automated subsystems should be simple to understand and predictable (mental model)
Humans should be able to monitor the system

17 Automation Critical issues
Rapid prototyping to determine task allocation
User acceptability
Provide alternate means to carry out the task
Failure of automation should be unambiguously announced
Control automation should be used to reduce operator workload

18 Three Mile Island – Case Study

19 TMI – Case Study
The day before the accident, operators attempted to fix a blockage in the condensate polishers (filters) using compressed air (a common procedure). A small amount of water made it past a check valve (a one-way valve) into an instrument air line. This would eventually cause the feedwater pumps, condensate booster pumps, and condensate pumps to turn off around 4:00 am.

20 TMI – Case Study
At this point the reactor was scrammed (control rods inserted to shut down the nuclear reaction) and an auxiliary pump was started. Heat was still being generated (the decay heat of the reactor) but was not being removed: the steam generators were not removing heat at the same rate it was being generated. Reactor vessel pressure began to rise, so a relief valve was opened, but it failed to close when the pressure had returned to normal. In the secondary loop (which removes heat from the primary loop), no feedwater was reaching the steam generators because another valve was closed when it should have been open.

21 TMI – Case Study (Design Flaws)
The design of the pilot-operated relief valve indicator light was fundamentally flawed. The bulb was simply connected in parallel with the valve solenoid, thus implying that the pilot-operated relief valve was shut when it went dark, without actually verifying the real position of the valve.
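The commanded-state versus actual-state distinction behind this flaw can be sketched in a few lines of Python (a toy model; the class and function names are hypothetical, not from any real plant software):

```python
# Sketch of the TMI indicator flaw. The lamp was wired in parallel with
# the valve solenoid, so it reflected the *commanded* state of the valve,
# not its actual mechanical position.

class ReliefValve:
    def __init__(self):
        self.commanded_open = False   # what the operator asked for
        self.actually_open = False    # true mechanical state

    def command_close(self):
        self.commanded_open = False
        # A stuck valve ignores the command: actually_open is unchanged.

def flawed_indicator(valve):
    """Flawed design: the lamp mirrors the solenoid command only."""
    return valve.commanded_open

def sensed_indicator(valve):
    """Safer design: the lamp is driven by an independent position sensor."""
    return valve.actually_open

valve = ReliefValve()
valve.actually_open = True    # the valve has stuck open
valve.command_close()         # operator commands it shut

print(flawed_indicator(valve))  # False: lamp goes dark, "valve shut"
print(sensed_indicator(valve))  # True: the valve is really still open
```

The flawed indicator and the sensed indicator agree whenever the valve obeys its commands, which is why the design looked fine in normal operation; they diverge exactly in the failure case, when the operators most needed the truth.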

22 TMI Case Study – Human Factors
When everything was operating correctly, the indication was true and the operators became habituated to rely on it. However, when things went wrong and the main relief valve stuck open, the unlighted lamp was actually misleading the operators by implying that the valve was shut. This caused the operators considerable confusion, because the pressure, temperature and coolant levels in the primary circuit, so far as they could observe them via their instruments, were not behaving as they would have if the pilot-operated relief valve were shut.

23 TMI Case Study
When the second shift came in, they discovered the problem, but by then the damage was done.

24 TMI Case Study – Analysis
The operators had not been trained to understand the ambiguous nature of the pilot-operated relief valve indicator and to look for alternative confirmation that the main relief valve was closed.
A temperature indicator downstream of the pilot-operated relief valve, in the tail pipe between the valve and the pressurizer, could have told them the valve was stuck open, but operators had not been trained to use it as a verification indicator.
The location of the temperature indicator on the back of the seven-foot-high instrument panel also meant that it was effectively out of the operators' sight.

25 TMI Lessons
Lack of consistency: two side-by-side control panels designed with mirror-image symmetry (the slide's diagram showed each panel's Valve Control and Speed Control with OFF/Lower/Raise positions mirrored).

26 TMI Lessons
Violations of basic anthropometry: many controls were hard to reach and displays hard to see
Violations of stimulus-response compatibility: displays were not necessarily positioned in proximity to the controls they reflected (lack of collocation)
Absence of meaningful groupings

27 Situation Awareness
Define: perceiving information in order to determine a system's status
"The perception of the elements in the environment within a volume of time and space, the comprehension of their meaning, and the projection of their status in the near future."

28 Situational Awareness
Level 1 SA: perceiving critical factors in the environment (flight parameters, state of on-board systems, own location and the location of critical reference points)
Level 2 SA: understanding what those factors mean, particularly when integrated together in relation to the operator's goals (e.g., a pattern of information indicates the aircraft is at a stall point)
Level 3 SA: projecting what will happen with the system in the near future

29 Situational Awareness
Automation affects SA through three mechanisms:
Changes in vigilance and complacency associated with monitoring
Assumption of a passive role instead of an active role in controlling the system
Changes in the quality or form of feedback provided to the human operator

30 Developing and maintaining SA
Development and maintenance of SA involves keeping up with a large quantity of rapidly changing system parameters, integrating them with other parameters, active goals, and one's mental model of the system, in order to understand what is happening and project what the system is going to do.
This allows operators to behave proactively to optimize system performance and take actions to forestall potential future problems.

31 Objectives
Define the term automation
Provide examples of automated systems
Describe appropriate circumstances for using automation in human-machine systems
Describe potential problems with automation
Define the term situational awareness
Describe the levels of situational awareness
Describe how automation affects situational awareness

