1 Information Systems Development (IS501) Dr. Doaa Nabil.

1 Information Systems Development (IS501) Dr. Doaa Nabil

2 Part (3): Information Systems Development Life Cycle Phases

3 Methodology
The use of a methodology improves the practice of information systems development. A methodology may include:
- A series of phases.
- A series of techniques.
- A series of tools.
- A training scheme.
- A philosophy.
A method consists of the collection of data through observation and experimentation, and the formulation and testing of hypotheses.

4 What’s the Difference Between “Method” and “Methodology”?
Method: techniques for gathering evidence; the various ways of proceeding in gathering information.
Methodology: the underlying theory and analysis of how research does or should proceed, often influenced by discipline.

5 Assessing Methods
- The research question(s) is/are key.
- Methods must answer the research question(s).
- Methodology guides their application.

6 Summary
A research method is a technique for (or way of proceeding in) gathering evidence; methodology is a theory and analysis of how research does or should proceed.

7 Techniques
Techniques include ways to evaluate the costs and benefits of different solutions, and methods to formulate the detailed design necessary to develop computer applications. Examples:
- Flowcharts.
- An organization chart.
- Manual document specifications.
- Grid charts.
- Discussion records.

8 Tools
Tools are software packages that aid aspects of information systems development. Examples:
- MS Project.
- PowerDesigner.
- Visio.

9 A Simple System Development Process
Simplified system development process mapped to general problem-solving steps:
- System initiation: 1. Identify the problem.
- System analysis: 2. Analyze and understand the problem. 3. Identify solution requirements or expectations.
- System design: 4. Identify alternative solutions and choose the “best” course of action. 5. Design the chosen solution.
- System implementation: 6. Implement the chosen solution. 7. Evaluate the results. If the problem is not solved, return to step 1 or 2 as appropriate.

10 Systems Development Process Overview

11 System Development Process Overview
- System initiation – the initial planning for a project to define the initial project scope, goals, tasks, schedule, and budget.
- System analysis – the study of a business problem domain to recommend improvements and specify the business requirements and priorities for the solution.
- System design – the specification or construction of a technical, computer-based solution for the business requirements identified in a system analysis.
- System implementation – the construction, installation, testing, and delivery of a system into production, including training for the system users.

12 1- Initiation (Planning) Phase
It involves determining a solid plan for developing your information system, through three primary activities:
1- Define the system to be developed (determine which system is required to support the strategic goals of the organization).
2- Set the project scope (define the high-level system requirements by writing a one-paragraph project scope document).
3- Develop the project plan (answer the what, when, and who questions for the systems activities to be performed).

13 Part (4): System Analysis & Design

14 System Analysis
It involves end users and IT specialists working together to gather, understand, and document the business requirements for the proposed system, recorded in a Joint Application Development (JAD) report.
- A requirement is a feature that the system must have, or a constraint that it must satisfy, to be accepted by the client.
- Requirements engineering aims at defining the requirements of the system under construction.

15 Joint Application Development report
It is a highly structured workshop that brings together users, managers, and information systems specialists to jointly define and specify user requirements, technical options, and external designs (inputs, outputs, and screens).

16 Joint Application Development report
There are numerous benefits to JAD:
1- It tends to improve the relationship between users, management, and information systems professionals (it increases confidence between users and management).
2- It tends to improve the computer literacy of users and managers, as well as the business and application literacy of information systems specialists.
3- It places the responsibility for conflict resolution where it belongs.
4- It decreases the total system development time by consolidating multiple interviews into the structured workshop.
5- It lowers the cost of systems development by getting the requirements correctly defined and prioritized the first time.

17 System Analysis Phases
Systems analysis consists of three phases:
1- Survey project feasibility (survey phase).
2- Study and analyze the current system (study phase).
3- Define and prioritize users’ requirements (definition phase).

18 System Analysis Phases (Survey phase)
It answers the question: “Is this project worth looking at?” The fundamental objectives of the survey phase are:
1- To identify the problems, opportunities, and/or directives that initiated this project request.
2- To determine if solving the problems, exploiting the opportunities, and satisfying the directives will benefit the business.

19 System Analysis Phases (Survey phase)
Survey phase activities:
1- Conduct an initial interview (45-60 minutes) to record lists of people, data, activities, locations and networks, and existing technology, plus lists of problems, opportunities, constraints, ideas, and opinions (fact-finding techniques).
2- Define the scope of the proposed project by drawing a context model that determines the boundaries and scope of the system (data scope, process scope, network scope, function point analysis).
3- Classify problems, opportunities, and possible solutions (a quick fix, an enhancement, or new development, with visibility, priority, and solution recorded in matrix form).
4- Establish a proposed project plan.
5- Present survey findings and recommendations.

20 System Analysis Phases (Study phase)
It answers the questions: “Are the problems really worth solving?” and “Is a new system really worth building?” (using JAD in a one- to three-day workshop). The fundamental objectives of the study phase are:
1- To understand the business environment of the system.
2- To understand the underlying causes and effects of the problems.
3- To understand the benefits of exploiting opportunities.
4- To understand the implications of noncompliance with directives.

21 System Analysis Phases (Study phase)
Study phase activities:
1- Assign project roles.
2- Learn about the current system.
3- Model the current system.
4- Analyze problems and opportunities.
5- Establish the new system’s objectives.
6- Modify the project plan and scope.
7- Review findings and recommendations.

22 System Analysis Phases (Definition phase)
It answers the question: “What does the user need and want from a new system?” The fundamental objectives of the definition phase are:
1- To define business (nontechnical) requirements that address problems identified with the current system.
2- To define business requirements that exploit opportunities identified with the current system.
3- To define business requirements that fulfill directives.

23 System Analysis Phases (Definition phase)
Definition phase activities:
1- Identify requirements. Requirements engineering includes two main activities:
- Requirements elicitation, which results in a specification of the system that the client understands. It requires the collaboration of several groups of participants with different backgrounds (functional requirements, nonfunctional requirements, use cases, and scenarios).
- Requirements analysis, which results in an analysis model that the developers can unambiguously interpret.
2- Model system requirements by drawing:
- A data model diagram to model the data requirements for many new systems, which serves as the starting point for designing files and databases.

24 System Analysis Phases (Definition phase)
- A data flow diagram to model the processing requirements for most new systems, which serves as the starting point for designing computer-based methods and application programs.
- Connectivity diagrams that map the above people, data, and activities to geographical locations, which serve as the starting point for designing the communication systems for distributing the data, activities, and people.
3- Build a discovery prototype (if necessary).
4- Prioritize business requirements.
5- Review requirements specifications.

25 System Analysis Modeling and Techniques
1- Functional Decomposition Diagrams
2- Data Flow Diagrams
3- Unified Modeling Language (Use Case Diagrams, Sequence Diagrams)
4- Fact-Finding

26 1- Functional Decomposition Diagrams (FDD)
- A top-down representation of a function or process (structure chart).
- Breaks main business functions down into lower-level functions and processes.
Example functions: Course Administration, Course Enrolment, Course Attendance, Course Completion, Course Assessment, Course Certification, Course Payment, Course Application.
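A decomposition like this can be sketched as a nested structure in code. The sketch below is illustrative only: the grouping of sub-functions under Enrolment and Completion is an assumption, since the slide lists the functions without showing the tree.

```python
# A minimal sketch of a functional decomposition, assuming a grouping
# of the course-administration functions listed on the slide; each
# function maps to its lower-level sub-functions (leaves are empty).
fdd = {
    "Course Administration": {
        "Course Application": {},
        "Course Enrolment": {
            "Course Payment": {},
        },
        "Course Attendance": {},
        "Course Completion": {
            "Course Assessment": {},
            "Course Certification": {},
        },
    },
}

def leaf_functions(tree):
    """Collect the lowest-level functions of the decomposition."""
    leaves = []
    for name, subtree in tree.items():
        if subtree:
            leaves.extend(leaf_functions(subtree))  # recurse into sub-functions
        else:
            leaves.append(name)
    return leaves
```

Walking the tree this way recovers the bottom row of the diagram: the lowest-level processes that a DFD or program design would later refine.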

27 Functional Decomposition Diagram (FDD) example [diagram]

28 2- Data Flow Diagrams (DFD)
It shows how the system stores, processes, and transforms data.

29 DFD
- A very popular tool for describing the functions of a system in terms of processes and the data used by them.
- FDD may be done before DFD, or we may prepare DFDs directly.
- DFDs have more content than FDDs.
- Flow of data is shown, not flow of control.
- DFDs are simple pictorial representations, easily understood by users and management.
- They facilitate top-down development.

30 3- Unified Modeling Language (Use Case Diagrams)
- A widely used method of visualizing and documenting information systems from a user’s viewpoint.
- It uses object-oriented design concepts, but is independent of any specific programming language.
- It is used to describe business processes and requirements generally.
- It provides various graphical tools, such as use case diagrams and sequence diagrams.

31 Use Case Diagrams
- A use case diagram represents the interaction between users and the information system.
- A user becomes an actor, with a specific role that describes how he or she interacts with the system.

32 Example of a use case diagram [diagram]: actors Customer and Registered customer (a person); use cases Find an item, Order an item, Check order, Order fast delivery, Free search, and Structured search.

33 Example of a use case diagram [diagram]: a grade system with actors Teacher, Student, and Printing administrator; use cases Record grades, View grades, Create report cards, and Distribute report cards.

34 Example of a use case diagram [diagram]

35 Sequence Diagrams
- A sequence diagram shows the timing of interactions between objects as they occur.
- A systems analyst might use a sequence diagram to show all possible outcomes or focus on a single scenario.

36 Example sequence diagram [diagram]: Lookup Traffic Violation. The Clerk presses the view button on the :ViolationsDialog (viewButton()); the dialog obtains the violation id (id = getID(), which may be a pseudo-method) and calls getViolation(id) on the :ViolationsController; the controller calls lookup on the :ViolationsDBProxy, where the DB is queried and the result is returned as a TrafficViolation object v; finally the dialog displays it (display(v)).
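The interaction above can be sketched in code. This is a minimal illustration, not the course's implementation: the class and method names mirror the diagram, and the in-memory record store stands in for the real database.

```python
# Sketch of the "Lookup Traffic Violation" sequence, assuming a tiny
# in-memory dictionary in place of a real database.
class TrafficViolation:
    def __init__(self, vid, description):
        self.vid = vid
        self.description = description

class ViolationsDBProxy:
    def __init__(self, records):
        self._records = records              # stand-in for the real DB

    def lookup(self, vid):
        # The DB is queried and the result is returned as an object.
        return self._records[vid]

class ViolationsController:
    def __init__(self, proxy):
        self._proxy = proxy

    def get_violation(self, vid):            # getViolation(id) in the diagram
        return self._proxy.lookup(vid)

class ViolationsDialog:
    def __init__(self, controller, selected_id):
        self._controller = controller
        self._selected_id = selected_id
        self.displayed = None

    def get_id(self):                        # id = getID(), a pseudo-method
        return self._selected_id

    def view_button(self):                   # the clerk presses "view"
        v = self._controller.get_violation(self.get_id())
        self.display(v)

    def display(self, v):
        self.displayed = v
```

Reading the calls top to bottom reproduces the vertical ordering of messages in the diagram: viewButton, getID, getViolation, lookup, display.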

37 4- Fact-Finding Techniques
- Fact-finding involves answers to five familiar questions: who, what, where, when, and how.
- For each question you must ask another very important question: why.

38 Chapter (5): System Design

39 System Design
System design is the transformation of an analysis model into a system design model (building a technical blueprint of how the proposed system will work). System design focuses on the solution domain: the project team turns its attention to the system from a technical point of view. Developers also select strategies for building the system, such as:
- The hardware/software strategy.
- The persistent data management strategy.
- The global control flow.
- The access control policy.
- The handling of boundary conditions.

40 The Design Phase
- The primary objective is to convert the description of the recommended alternative solution into system specifications.
- High-level (architectural) design consists of developing an architectural structure for software programs, databases, the user interface, and the operating environment.
- Low-level (detailed) design entails developing the detailed algorithms and data structures that are required for program development.

41 System Design Output
The result of system design is a model that includes a subsystem decomposition and a clear description of each of these strategies. System design is not algorithmic (ERD). Developers have to make trade-offs among many design goals that often conflict with each other. They also cannot anticipate all the design issues they will face, because they do not yet have a clear picture of the solution domain.

42 System Design Structure
System design covers:
1. Design goals (definition, trade-offs).
2. System decomposition (layers/partitions, cohesion/coupling).
3. Concurrency (identification of threads).
4. Hardware/software mapping (special-purpose components, buy-or-build trade-off, allocation, connectivity).
5. Data management (persistent objects, files, databases, data structures).
6. Global resource handling (access control, security).
7. Software control (monolithic, event-driven, threads, concurrent processes).
8. Boundary conditions (initialization, termination, failure).

43 System Design Phases
1- Design the technical architecture (hardware, software, telecommunications equipment).
2- Design the system models.

44 The Design Phase Activities
The design phase includes seven activities:
- Design and integrate the network.
- Design the application network.
- Design the user interfaces.
- Design the system interfaces.
- Design and integrate the database.
- Prototype for design details.
- Design and integrate the system controls.

45 List of Design Goals
Reliability, modifiability, maintainability, understandability, adaptability, reusability, efficiency, portability, traceability of requirements, fault tolerance, backward compatibility, cost-effectiveness, robustness, high performance, good documentation, well-defined interfaces, user-friendliness, reuse of components, rapid development, minimum number of errors, readability, ease of learning, ease of remembering, ease of use, increased productivity, low cost, flexibility.

46 Relationship Between Design Goals [diagram]: design goals grouped by stakeholder. End user: functionality, user-friendliness, ease of use, ease of learning, fault tolerance, robustness. Client (customer, sponsor): low cost, increased productivity, backward compatibility, traceability of requirements, rapid development, flexibility. Developer/maintainer: minimum number of errors, modifiability, readability, reusability, adaptability, well-defined interfaces. Cross-cutting: reliability, portability, good documentation, runtime efficiency.

47 Typical Design Trade-offs
- Functionality vs. usability
- Cost vs. robustness
- Efficiency vs. portability
- Rapid development vs. functionality
- Cost vs. reusability
- Backward compatibility vs. readability

48 Object Design (1/2)
During analysis, we describe the application objects. During system design, we describe the system in terms of its architecture, such as its subsystem decomposition, global control flow, persistency management, and the hardware/software platform on which we build the system. This allows the selection of off-the-shelf components that provide a higher level of abstraction than the hardware. During object design, we close the gap between the application objects and the off-the-shelf components by identifying additional solution objects and refining existing objects. Object design includes:
- Reuse, during which we identify off-the-shelf components and design patterns to make use of existing solutions.
- Service specification, during which we precisely describe each class interface.
- Object model restructuring, during which we transform the object design model to improve its understandability and extensibility.
- Object model optimization, during which we transform the object design model to address performance criteria such as response time or memory utilization.

49 Object Design (2/2)
During object design, we identify and refine solution objects to realize the subsystems defined during system design. During this activity, our understanding of each object deepens: we specify the type signatures and the visibility of each of the operations, and we describe the conditions under which an operation can be invoked and those under which it raises an exception. The focus of object design is on specifying the boundaries between objects. At this stage in the project, a large number of developers concurrently refine and change many objects and their interfaces. The focus of interface specification is for developers to communicate clearly and precisely about increasingly lower-level details of the system. The interface specification activities of object design include:
- Identifying missing attributes and operations.
- Specifying type signatures and visibility.
- Specifying invariants.
- Specifying preconditions and postconditions.
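Invariants, preconditions, and postconditions from the list above can be written directly into an interface. A minimal sketch, assuming a hypothetical BankAccount class that is not from the course material:

```python
class BankAccount:
    """Hypothetical class illustrating interface specification:
    type signatures, an invariant, and pre-/postconditions."""

    def __init__(self, balance: int = 0):
        self._balance = balance
        self._check_invariant()

    def _check_invariant(self):
        # Invariant: the balance never goes negative.
        assert self._balance >= 0

    def withdraw(self, amount: int) -> int:
        # Precondition: the amount is positive and covered by the balance.
        assert 0 < amount <= self._balance
        old = self._balance
        self._balance -= amount
        # Postcondition: the balance decreased by exactly `amount`.
        assert self._balance == old - amount
        self._check_invariant()
        return self._balance
```

Writing the conditions as executable assertions makes the boundary of the object explicit: any caller that violates the precondition fails immediately, instead of corrupting the object's state.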

50 Chapter (6): System Testing

51 Testing
Testing is the process of finding differences between the expected behavior specified by system models and the observed behavior of the implemented system.
- Unit testing finds differences between the object design model and its corresponding component.
- Structure testing finds differences between the system design model and a subset of integrated subsystems.
- Functional testing finds differences between the use case model and the system.
- Performance testing finds differences between nonfunctional requirements and actual system performance.
When differences are found, developers identify the defect causing the observed failure and modify the system to correct it. In other cases, the system model is identified as the cause of the difference, and the model is updated to reflect the state of the system. The goal of testing is to design tests that exercise defects in the system and reveal problems. Testing is usually carried out by developers who were not involved with the construction of the system.

52 Terminology
- Reliability: the measure of success with which the observed behavior of a system conforms to some specification of its behavior.
- Failure: any deviation of the observed behavior from the specified behavior.
- Error: the system is in a state such that further processing by the system will lead to a failure.
- Fault (bug): the mechanical or algorithmic cause of an error.
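A toy example makes the chain concrete; the function and its off-by-one bug are invented for illustration. The fault is the wrong divisor, the error is the bad internal state it produces during the computation, and the failure is the observable deviation from the specified result:

```python
def average(values):
    # Fault (bug): dividing by len(values) + 1 instead of len(values).
    return sum(values) / (len(values) + 1)

# The fault puts the computation into an erroneous state; the deviation
# of the observed result from the specified one is the failure.
observed = average([1, 2, 3])
specified = 2.0                    # the behavior the spec demands
failure = observed != specified
```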

53 Dealing with Errors
- Verification: assumes a hypothetical environment that does not match the real environment; the proof itself might be buggy (it may omit important constraints, or simply be wrong).
- Modular redundancy: expensive.
- Declaring a bug to be a “feature”: bad practice.
- Patching: slows down performance.
- Testing: testing is never good enough.

54 Another View on How to Deal with Errors
Error prevention (before the system is released):
- Use good programming methodology to reduce complexity.
- Use version control to prevent an inconsistent system.
- Apply verification to prevent algorithmic bugs.
Error detection (while the system is running):
- Testing: create failures in a planned way.
- Debugging: start with an unplanned failure.
- Monitoring: deliver information about the system’s state; find performance bugs.
Error recovery (recover from a failure once the system is released):
- Database systems (atomic transactions).
- Modular redundancy.
- Recovery blocks.

55 Some Observations
- It is impossible to completely test any nontrivial module or any system.
- Theoretical limitations: the halting problem.
- Practical limitations: prohibitive in time and cost.
- Testing can only show the presence of bugs, not their absence (Dijkstra).

56 Testing Activities [diagram]: unit tests (based on each subsystem’s code and the system design document) produce tested subsystems; integration tests combine tested subsystems into integrated subsystems; functional tests (based on the requirements analysis document and the user manual) take the integrated subsystems to a functioning system. All of these tests are performed by the developer.

57 Testing Activities (continued) [diagram]: performance tests (by the developer, against global requirements and the client’s understanding of the requirements) take the functioning system to a validated system; the acceptance test (by the client) produces an accepted system; the installation test (in the user environment, tests (?) by the user, against the user’s understanding) takes the usable system to a system in use.

58 Types of Testing
- Unit testing: tests an individual subsystem; carried out by developers. Goal: confirm that the subsystem is correctly coded and carries out the intended functionality.
- Integration testing: tests groups of subsystems (collections of classes) and eventually the entire system; carried out by developers. Goal: test the interfaces among the subsystems.

59 Types of Testing
- System testing: tests the entire system; carried out by developers. Goal: determine if the system meets the requirements (functional and global).
- Acceptance testing: evaluates the system delivered by the developers; carried out by the client, and may involve executing typical transactions on site on a trial basis. Goal: demonstrate that the system meets customer requirements and is ready to use.
Implementation (coding) and testing go hand in hand.

60 Unit Testing
- Informal: incremental coding.
- Static analysis: hand execution (reading the source code), walk-through (informal presentation to others), code inspection (formal presentation to others), and automated tools checking for syntactic and semantic errors and departures from coding standards.
- Dynamic analysis: black-box testing (test the input/output behavior), white-box testing (test the internal logic of the subsystem or object), and data-structure-based testing (data types determine test cases).
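Black-box dynamic analysis can be illustrated with Python's unittest module. The unit under test, classify_grade, is a hypothetical example; the tests check only input/output behavior, not internal logic:

```python
import unittest

def classify_grade(score):
    """Hypothetical unit under test: map a 0-100 score to pass/fail."""
    if not 0 <= score <= 100:
        raise ValueError("score out of range")
    return "pass" if score >= 50 else "fail"

class ClassifyGradeTest(unittest.TestCase):
    # Black-box: only the observable input/output behavior is checked.
    def test_pass(self):
        self.assertEqual(classify_grade(75), "pass")

    def test_fail(self):
        self.assertEqual(classify_grade(30), "fail")

    def test_out_of_range(self):
        with self.assertRaises(ValueError):
            classify_grade(120)
```

Each test method is one test case; the runner reports a failure whenever the observed output deviates from the expected one, matching the definition of failure on the previous slide.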

61 Black-box Testing
- Focus: I/O behavior. If, for any given input, we can predict the output, then the module passes the test.
- It is almost always impossible to generate all possible inputs (“test cases”).
- Goal: reduce the number of test cases by equivalence partitioning: divide the input conditions into equivalence classes and choose test cases for each equivalence class. (Example: if an object is supposed to accept a negative number, testing one negative number is enough.)

62 Black-box Testing (Continued)
Selection of equivalence classes (no rules, only guidelines):
- Input is valid across a range of values: select test cases from three equivalence classes: below the range, within the range, and above the range.
- Input is valid if it is from a discrete set: select test cases from two equivalence classes: a valid discrete value and an invalid discrete value.
Another way to select only a limited number of test cases: get knowledge about the inner workings of the unit being tested => white-box testing.
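The range guideline above can be illustrated with a hypothetical ranged input, a month number valid from 1 to 12; one test case is drawn from each of the three equivalence classes:

```python
def valid_month(m):
    """Hypothetical unit: a month number is valid if 1 <= m <= 12."""
    return 1 <= m <= 12

# Equivalence partitioning for a ranged input: one test case per class.
test_cases = {
    0:  False,   # below the range
    6:  True,    # within the range
    13: False,   # above the range
}

all_pass = all(valid_month(m) == expected for m, expected in test_cases.items())
```

Three test cases stand in for the whole input domain: by the equivalence-class argument, any other value in the same class should behave the same way.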

63 White-box Testing
Focus: thoroughness (coverage); every statement in the component is executed at least once. Four types of white-box testing:
- Statement testing
- Loop testing
- Path testing
- Branch testing
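Branch testing can be sketched as follows; the function is a hypothetical example with two branches, and one test case per branch is enough to execute every statement at least once:

```python
def shipping_fee(weight, express):
    """Hypothetical unit with two branches to cover."""
    if express:
        fee = 10 + 2 * weight    # branch 1: express delivery
    else:
        fee = 5 + weight         # branch 2: standard delivery
    return fee

# Branch testing: one (input, expected) pair per branch, so both
# branches, and therefore all statements, are executed.
branch_tests = [
    (shipping_fee(3, True), 16),   # exercises the express branch
    (shipping_fee(3, False), 8),   # exercises the standard branch
]
```

Unlike the black-box cases earlier, these tests were chosen by reading the internal logic of the unit, which is what makes them white-box.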

64 Integration Testing: Big-Bang Approach [diagram]: the unit tests for subsystems A through F feed directly into a single system test, with no intermediate integration steps.

65 Bottom-up Testing Strategy
- The subsystems in the lowest layer of the call hierarchy are tested individually.
- Then the next subsystems to be tested are those that call the previously tested subsystems.
- This is done repeatedly until all subsystems are included in the testing.
- A special program is needed to do the testing. Test driver: a routine that calls a subsystem and passes a test case to it.

66 Bottom-up Integration [diagram]: with A in Layer I; B, C, and D in Layer II; and E, F, and G in Layer III, the lowest layer is tested first (Test E, Test F, Test G), then the middle layer (Test B with E and F; Test C; Test D with G), and finally Test A with B, C, D, E, F, G.

67 Pros and Cons of Bottom-up Integration Testing
- Bad for functionally decomposed systems.
- Useful for integrating the following systems: object-oriented systems, real-time systems, and systems with strict performance requirements.

68 System Testing
- Functional testing
- Structure testing
- Performance testing
- Acceptance testing
- Installation testing
Impact of requirements on system testing: the more explicit the requirements, the easier they are to test. The quality of the use cases determines the ease of functional testing; the quality of the subsystem decomposition determines the ease of structure testing; and the quality of the nonfunctional requirements and constraints determines the ease of performance testing.

69 Structure Testing
Essentially the same as white-box testing.
- Goal: cover all paths in the system design.
- Exercise all input and output parameters of each component.
- Exercise all components and all calls (each component is called at least once, and every component is called by all possible callers).
- Use conditional and iteration testing as in unit testing.

70 Functional Testing
Essentially the same as black-box testing.
- Goal: test the functionality of the system.
- Test cases are designed from the requirements analysis document (better: the user manual) and centered around requirements and key functions (use cases).
- The system is treated as a black box.
- Unit test cases can be reused, but new end-user-oriented test cases have to be developed as well.

71 Performance Testing
- Stress testing: stress the limits of the system (maximum number of users, peak demands, extended operation).
- Volume testing: test what happens if large amounts of data are handled.
- Configuration testing: test the various software and hardware configurations.
- Compatibility testing: test backward compatibility with existing systems.
- Security testing: try to violate security requirements.
- Timing testing: evaluate response times and the time to perform a function.
- Environmental testing: test tolerances for heat, humidity, motion, and portability.
- Quality testing: test the reliability, maintainability, and availability of the system.
- Recovery testing: test the system’s response to the presence of errors or loss of data.
- Human factors testing: test the user interface with the user.

72 Acceptance Testing
- Goal: demonstrate that the system is ready for operational use.
- The choice of tests is made by the client/sponsor; many tests can be taken from integration testing.
- The acceptance test is performed by the client, not by the developer.
- The majority of all bugs in software is typically found by the client after the system is in use, not by the developers or testers. Therefore there are two kinds of additional tests:
- Alpha test: the sponsor uses the software at the developer’s site. The software is used in a controlled setting, with the developer always ready to fix bugs.
- Beta test: conducted at the sponsor’s site (the developer is not present). The software gets a realistic workout in the target environment, but a potential customer might get discouraged.

