Presentation: Analyzing Software Requirements Errors in Safety-Critical Embedded Systems


1 Presentation: Analyzing Software Requirements Errors in Safety-Critical Embedded Systems

2 INTRODUCTION A software error is a software-related discrepancy between a computed, observed, or measured value or condition and the true, specified, or theoretically correct value or condition. Errors fall into three severity classes: negligible, significant, and catastrophic.

3 METHODOLOGY a. The work of Nakajo and Kume on software error cause-and-effect relationships offers an appropriate framework for classifying software errors. b. Their work analyzes three points along the path from a software error backward to its source, allowing classification not only of software errors but also of the underlying human errors.

4 OVERVIEW OF CLASSIFICATION Program Faults Human Error Process Flaws

5 PROGRAM FAULTS Internal Faults: syntax, programming-language semantics. Interface Faults: interaction with other software components, interaction with hardware in the system. Functional Faults: operating faults, conditional faults, behavioral faults.
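The fault taxonomy on this slide can be sketched as a small data model. This is a hypothetical illustration of the classification scheme, not code from the paper; all class and field names are ours:

```python
from dataclasses import dataclass
from enum import Enum
from typing import Optional


class FaultClass(Enum):
    INTERNAL = "internal"      # syntax, programming-language semantics
    INTERFACE = "interface"    # interaction with other software or hardware
    FUNCTIONAL = "functional"  # operating, conditional, behavioral faults


class FunctionalSubtype(Enum):
    OPERATING = "operating"
    CONDITIONAL = "conditional"
    BEHAVIORAL = "behavioral"


@dataclass
class SoftwareError:
    """One classified software error record (hypothetical schema)."""
    description: str
    fault_class: FaultClass
    safety_related: bool
    subtype: Optional[FunctionalSubtype] = None  # only for functional faults


err = SoftwareError(
    description="Missing precondition check on a mode-change command",
    fault_class=FaultClass.FUNCTIONAL,
    safety_related=True,
    subtype=FunctionalSubtype.CONDITIONAL,
)
print(err.fault_class.value, err.subtype.value)  # functional conditional
```

A schema like this would let the error counts reported on the next slide be computed by simple filtering over a list of records.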

6 Program Faults (Contd.) Safety-related errors account for 56% of total errors in Voyager and 48% in Galileo. A few internal faults were also found during integration and system testing. The tables in the appendix of the paper provide extensive statistical data.

7 HUMAN ERRORS Coding Errors. Communication Errors: within a team, between teams. Requirements Errors: errors in recognition, errors in deployment.

8 PROCESS FLAWS Flaws in inspection and testing methods. Inadequate interface specification and communication: between software designers and programmers, between software and hardware engineers. Requirements not identified or understood: incomplete documentation, missing requirements, inadequate design.

9 RELATION BETWEEN PROGRAM FAULTS AND ROOT CAUSES: The second step is to track backward in time to find the human factors involved in program faults. For interface faults, the human factors are mainly miscommunications between development teams or departments.

10 RELATION BETWEEN PROGRAM FAULTS AND ROOT CAUSES (Contd.) Safety-related interface faults are largely due to communication errors between teams rather than within teams (67% on Voyager and 48% on Galileo). Non-safety-related errors are equally likely to stem from misunderstood hardware specifications as from misunderstood software specifications.

11 RELATION BETWEEN PROGRAM FAULTS AND ROOT CAUSES (Contd.) Safety-related functional faults are primarily due to errors in recognizing requirements. Operating faults and behavioral faults are caused by errors in recognizing requirements more often than by errors in deploying them. [Lutz 92]

12 RELATION BETWEEN PROGRAM FAULTS AND ROOT CAUSES (Contd.) BOTTOM LINE: Difficulties with safety requirements are a root cause of safety-related software errors that persist until integration and system testing.

13 RELATION BETWEEN ROOT CAUSES AND PROCESS FLAWS: The third step is to associate a pair of process flaws with each program fault. The first element identifies the process flaw or inadequacy in controlling system complexity. [Lutz 92] The second element identifies the associated process flaw in the communication or development methods. [Lutz 92]
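The two-element pairing described on this slide can be sketched as a simple mapping. The flaw names below are illustrative placeholders, not the paper's exact labels:

```python
# Each program fault is associated with a pair of process flaws:
#   element 1: a flaw in controlling system complexity
#   element 2: a flaw in the communication or development methods
# All identifiers here are hypothetical, not taken from [Lutz 92].
fault_to_flaw_pair = {
    "hw_sw_interface_fault": (
        "interface_not_identified",           # complexity-control flaw
        "inadequate_interface_specification", # communication/method flaw
    ),
    "missing_safety_check": (
        "undocumented_hw_behavior",
        "missing_requirement",
    ),
}


def flaw_pair(fault: str) -> tuple:
    """Return (complexity-control flaw, communication/method flaw)."""
    return fault_to_flaw_pair[fault]


print(flaw_pair("missing_safety_check"))
# ('undocumented_hw_behavior', 'missing_requirement')
```

Keeping the two elements as an ordered pair preserves the distinction the paper draws: the same fault is explained once in terms of complexity control and once in terms of process methods.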

14 RELATION BETWEEN ROOT CAUSES AND PROCESS FLAWS (Contd.) For safety-related interface faults, the most common process flaws are interfaces that are not adequately identified, along with undocumented hardware behavior. Misunderstood requirements can lead to design flaws and may result in imprecise specifications.

15 COMPARISON OF RESULTS WITH PREVIOUS WORK The role of interface specifications in controlling system complexity was underestimated. Previous reports analyzed fairly simple systems in well-understood domains.

16 COMPARISON OF RESULTS WITH PREVIOUS WORK (Contd.) Previous work assumed that requirements specifications are correct. The distinction between the causes of safety-critical and non-safety-critical software errors was not adequately investigated. [Lutz 92]

17 Recommendations Focus more on developing the interface between the software and the system. Identify safety-critical hazards during requirements analysis.

18 Recommendations (Contd.) Use formal specification techniques in addition to natural-language software requirements. Promote informal communication among teams.

19 Recommendations (Contd.) Teams must communicate better as requirements change Include requirements for “defensive design”

20 STRENGTHS AND WEAKNESSES STRENGTHS: Detailed analysis of safety-critical errors in spacecraft and a good classification of errors. WEAKNESSES: Does not consider the wide range of safety-critical embedded systems.

