8.1 Lecture 8 Finalising Design Specifications; Quality in Systems Development IMS2805 - Systems Design and Implementation.

8.1 Lecture 8 Finalising Design Specifications; Quality in Systems Development IMS2805 - Systems Design and Implementation

8.2 References
HOFFER, J.A., GEORGE, J.F. and VALACICH, J.S. (2002) Modern Systems Analysis and Design, 3rd ed., Prentice-Hall, New Jersey, Chapters 5, 8, 15, 16.
HOFFER, J.A., GEORGE, J.F. and VALACICH, J.S. (2005) Modern Systems Analysis and Design, 4th ed., Prentice-Hall, New Jersey, Chapter 13.
WHITTEN, J.L., BENTLEY, L.D. and DITTMAN, K.C. (2001) Systems Analysis and Design Methods, 5th ed., Irwin/McGraw-Hill, New York, NY, Chapters 11, 17.

8.3 References
ANDERSON, P.V. (1995) Technical Writing: A Reader-Centred Approach, 3rd ed., Harcourt, Brace & Co., Fort Worth.
BROCKMAN, J.R. (1990) Writing Better Computer User Documentation: From Paper to Hypertext, John Wiley and Sons, Inc., New York.
ABEL, D.E. and ROUT, T.P. (1993) Defining and Specifying the Quality Attributes of Software Products, The Australian Computer Journal, 25:3.
DAHLBOM, B. and MATHIASSEN, L. (1993) Computers in Context: The Philosophy and Practice of Systems Design, NCC Blackwell, Cambridge, Mass.
ELLIOT, S. (1993) Management of Quality in Computing Systems Education, Journal of Systems Management, September 1993, pp 6-11 (particularly pp 6-8).

8.4 References
PERRY, W.E. (1992) Quality Concerns in Software Development: The Challenge is Consistency, Information Systems Journal, 9:2.
STANDARDS ASSOCIATION OF AUSTRALIA (1991) Australian Standard: Software Quality Management System, Part 1: Requirements.
STANDARDS ASSOCIATION OF AUSTRALIA (1991) Australian Standard: Software Quality Management System, Part 2: Implementation Guide.

8.5 Finalising Design Specifications  Need for systems to be developed more quickly today  The lines between analysis and design, and between design and implementation, are blurring  Traditional approaches allowed for a break between design and implementation  Other approaches, such as CASE and prototyping, have caused the two phases to overlap  from Hoffer, J.A., et al. (2002). Modern Systems Analysis and Design, (3rd edition) Addison-Wesley, Reading MA, USA, Chap 15.

8.6 The Process of Finalizing Design Specifications  Less costly to detect and correct errors during the design phase  Two sets of guidelines  Writing quality specification statements  The quality of the specifications themselves  Quality requirement statements  Correct  Feasible  Necessary  Prioritized  Unambiguous  Verifiable  from Hoffer, J.A., et al. (2002). Modern Systems Analysis and Design, (3rd edition) Addison-Wesley, Reading MA, USA, Chap 15.

8.7 The Process of Finalizing Design Specifications  Quality requirements  Completely described  Do not conflict with other requirements  Easily changed without adversely affecting other requirements  Traceable back to origin  from Hoffer, J.A., et al. (2002). Modern Systems Analysis and Design, (3rd edition) Addison-Wesley, Reading MA, USA, Chap 15.
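To make the checklist above concrete, here is a minimal sketch (in Python; the record fields and example values are illustrative assumptions, not taken from Hoffer et al.) of how each requirement statement could be recorded so that its quality attributes can be checked and it stays traceable back to its origin:

```python
from dataclasses import dataclass, field

@dataclass
class Requirement:
    """One requirement statement plus the quality attributes discussed above."""
    req_id: str                 # e.g. "FR-012" (hypothetical numbering scheme)
    statement: str              # the requirement text itself
    priority: int               # prioritized: 1 = highest
    source: str                 # traceable back to origin (interview, document, etc.)
    verification: str           # how it will be verified (test, inspection, demo)
    conflicts_with: list = field(default_factory=list)   # ids of conflicting requirements

    def quality_issues(self):
        """Return the simple quality problems a reviewer would still need to resolve."""
        issues = []
        if not self.statement.strip():
            issues.append("not completely described")
        if not self.source:
            issues.append("not traceable to an origin")
        if not self.verification:
            issues.append("not verifiable")
        if self.conflicts_with:
            issues.append(f"conflicts with {', '.join(self.conflicts_with)}")
        return issues

# Usage: list the issues still open for one (invented) requirement.
r = Requirement("FR-012", "The system shall e-mail a receipt within 5 minutes of payment.",
                priority=1, source="User interview, 12 March", verification="System test ST-7")
print(r.quality_issues())   # -> [] once the statement meets the checklist
```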

8.8 Finalising design specifications  Key deliverable: A set of physical design specifications  Sample contents:  Overall system description  Interface requirements  System features  Non-functional requirements  Other requirements  Supporting diagrams and models see Hoffer et al (2002) p 502

8.9 Finalising design specifications Overall system description:  Functions  Operating environment  Types of users  Constraints  Assumptions and dependencies Interface requirements  User interfaces  Hardware interfaces  Software interfaces  Communication interfaces

8.10 Finalising design specifications System features  Feature 1 description….  Feature n description Non-functional requirements  Performance  Safety  Security  Software quality  Business rules Other requirements Supporting diagrams and models
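Purely as an illustration (the nesting is an assumption; the section names come from the slides above and Hoffer et al. (2002) p 502), the same sample contents could be held as a simple machine-readable outline, which suits the requirements/CASE-tool option discussed on the next slide:

```python
# Illustrative outline of the physical design specification deliverable.
design_spec = {
    "Overall system description": ["Functions", "Operating environment", "Types of users",
                                   "Constraints", "Assumptions and dependencies"],
    "Interface requirements": ["User interfaces", "Hardware interfaces",
                               "Software interfaces", "Communication interfaces"],
    "System features": ["Feature 1 description", "Feature n description"],
    "Non-functional requirements": ["Performance", "Safety", "Security",
                                    "Software quality", "Business rules"],
    "Other requirements": [],
    "Supporting diagrams and models": [],
}

# Print the table of contents.
for section, subsections in design_spec.items():
    print(section)
    for sub in subsections:
        print("  -", sub)
```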

8.11 Finalising design specifications Representing design specifications:  Written document with diagrams and models – traditional approach uses structure charts  Computer-based requirements/CASE tool to input and manage specifications  Working prototype to capture specifications  Combinations of the above

8.12 Finalising design specifications Structure Charts  Shows how an information system is organized as a hierarchy of modules  Shows how parts of a system are related to one another  Shows the breakdown of a system into programs and the internal structure of programs written in third and fourth generation languages

8.13 Finalising design specifications Structure Charts  Module  A self-contained component of a system, defined by a function  One single coordinating module at the root of the structure chart  Single point of entry and exit  Modules communicate with each other by passing parameters  Data couple  A diagrammatic representation of the data exchanged between two modules in a structure chart  Flag  A diagrammatic representation of a message passed between two modules

8.14 Finalising design specifications Structure Charts  Pseudocode  Method used for representing the instructions inside a module  Language similar to computer programming code  Two functions  Helps analyst think in a structured way about the task a module is designed to perform  Acts as a communication tool between analyst and programmer
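As a sketch only (the payroll modules and names below are invented for illustration, not part of the lecture material), the module, data couple and flag ideas, and the kind of logic that pseudocode inside a module would describe, can be expressed in Python:

```python
# Hypothetical payroll example: one coordinating module at the root calls
# subordinate modules and communicates only through parameters (data couples)
# and status flags, as a structure chart would show.

def get_timesheet(employee_id):
    """Subordinate module: returns a data couple (hours worked)."""
    return {"employee_id": employee_id, "hours": 38}

def calculate_pay(timesheet, hourly_rate):
    """Subordinate module: returns a data couple (gross pay) and a flag."""
    gross = timesheet["hours"] * hourly_rate
    overtime_flag = timesheet["hours"] > 40      # flag: a message, not data
    return gross, overtime_flag

def print_payslip(employee_id, gross):
    """Subordinate module: single entry point, single exit point."""
    print(f"Payslip for {employee_id}: ${gross:.2f}")

def produce_payslip(employee_id, hourly_rate):
    """Coordinating module at the root of the structure chart."""
    timesheet = get_timesheet(employee_id)              # data couple passed up
    gross, overtime = calculate_pay(timesheet, hourly_rate)
    if overtime:                                        # flag controls a decision
        print("Overtime approval required")
    print_payslip(employee_id, gross)

produce_payslip("E042", hourly_rate=25.0)
```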

8.15 Finalising design specifications  Prototyping  Construction of a model of a system  Allows developers and users to  Test aspects of the overall design  Check for functionality and usability  Iterative process  Two types  Evolutionary Prototyping  Throwaway Prototyping

8.16 Finalising design specifications  Evolutionary Prototyping  Begin by modeling parts of the target system  If successful, evolve the rest of the system from the prototype  Prototype becomes the actual production system  Often, difficult parts of the system are prototyped first  Exception handling must be added to the prototype  Throwaway Prototyping  Prototype is not preserved  Developed quickly to demonstrate an unclear aspect of the system design  A fast, easy to use development environment aids this approach

8.17 Finalising design specifications: data design  Physical file and database design requires:  Normalised relations  Attribute definitions  Where and when data is entered, retrieved, updated and deleted  Response time and data integrity needs  Implementation technologies to be used  Design physical tables and file structures according to performance factors and constraints of the particular application
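As an illustration only (the customer/order tables, column names and the assumed retrieval pattern are invented, and SQLite merely stands in for whatever DBMS the implementation technology actually specifies), normalised relations might be turned into physical tables with an index chosen for a stated response-time need:

```python
import sqlite3

# Illustrative physical design for two normalised relations.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE customer (
    customer_id   INTEGER PRIMARY KEY,
    name          TEXT NOT NULL,
    phone         TEXT
);
CREATE TABLE customer_order (
    order_id      INTEGER PRIMARY KEY,
    customer_id   INTEGER NOT NULL REFERENCES customer(customer_id),  -- data integrity need
    order_date    TEXT NOT NULL,
    total         REAL NOT NULL CHECK (total >= 0)
);
-- Index chosen because the (assumed) specification says orders are usually
-- retrieved by customer and date, with a response-time requirement.
CREATE INDEX idx_order_customer_date ON customer_order (customer_id, order_date);
""")
conn.execute("INSERT INTO customer VALUES (1, 'A. Nguyen', '9905 0000')")
conn.execute("INSERT INTO customer_order VALUES (100, 1, '2005-03-12', 250.0)")
print(conn.execute(
    "SELECT order_id, total FROM customer_order WHERE customer_id = ?", (1,)).fetchall())
conn.close()
```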

8.18 Quality in Systems Development (must be embedded in the process)  [Diagram: the development life cycle phases - Initiation, Analysis, Design, Implementation, Review, Maintenance - with quality, documentation, ethics, project management and the analyst's role running across all phases]

8.19 Definitions of Quality  Degree of excellence (Oxford)  Fitness for purpose (AS1057)  includes quality of design, the degree of conformance to design, and it may include such factors as economic or perceived values  Ability to satisfy stated/implied needs (ISO8402)  Conformance to requirements (Crosby, Horch)

8.20 Determining Quality..  when having a meal in a restaurant  when purchasing a car  when buying a computer The requirements vary immensely, and some of the success measures are very hard to quantify... Quality means different things to different people.. and it varies in different situations

8.21 Information systems: quality issues The system:  does not meet the client’s business or processing needs  does not support the client’s working methods  is unstable and unreliable  does not improve productivity  is difficult to use or requires excessive training to use  is expensive to maintain

8.22 The system:  is incomplete  is expensive to operate  has a short life span  is delivered late  costs more than budget  cannot grow with the organisation  does not produce a return on investment Information systems: quality issues

8.23 Error detection in systems  “Effort spent on software maintenance is greater than that spent on software development.”  “An error is typically 100 times more expensive to correct in the maintenance phase of large projects than in the requirements phase.”  Boehm, B. (1981) Software Engineering Economics

8.24 Error Detection  The cost of detecting and correcting errors rises greatly during the systems development cycle. In addition to this is the cost to the organisation of having an incorrect system.  [Chart: cost ($) of detecting and correcting an error, rising steeply across Initiation, Analysis, Design and Implementation]
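Boehm's 100:1 figure can be made concrete with a small, purely illustrative calculation (the base cost and the intermediate phase multipliers below are assumptions made for this sketch, not figures from the lecture or from Boehm):

```python
# Illustrative only: relative cost of fixing one error, taking the
# requirements-phase cost as 1 unit and maintenance as roughly 100 units.
base_cost = 200.0   # assumed cost ($) to fix an error found during requirements
relative_cost = {"requirements": 1, "design": 5, "coding": 10,
                 "testing": 50, "maintenance": 100}   # assumed multipliers
for phase, factor in relative_cost.items():
    print(f"{phase:>12}: ${base_cost * factor:,.0f}")
```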

8.25 Quality Costs  [Iceberg diagram: the obvious upfront costs to the organisation are only the tip of the iceberg; below them lie the hidden costs of not having quality systems. Items shown include review, inspection, re-do, re-testing, re-documenting, re-training, overtime, downtime, user and customer complaints, loss of sales, financial losses and employee turnover]

8.26 Quality dimensions  Correctness - Does it accurately do what is intended?  Reliability - Does it do it right every time?  Efficiency - Does it run as well as it could?  Integrity - Is it precise and unambiguous?  Usability - Is it easy to use?

8.27 Quality dimensions  Maintainability - Is it easy to fix?  Testability - Is correctness easy to check and verify?  Flexibility - Is it easy to adapt and extend?  Portability - Can it be easily converted?  Reusability - Does it consist of general purpose modules?  Interoperability - Will it integrate easily with other systems?

8.28 The Quality Process  The quality process involves the functions of:  Quality control - monitoring a process and eliminating causes of unsatisfactory performance  Quality assurance - planning and controlling functions required to ensure a quality product or process

8.29 Implementing a Quality System  Quality must start at the top - Executive sponsorship is vital.  Everyone must be involved and motivated to realise that they have a responsibility towards the final product, its use, and its quality.  Improve job processes by using standards, and preparing better documentation (using project control methodologies).  Use a Quality Assurance group.  Use technical reviews.

8.30 Standards  Levels of standards  Industry / National / International  Organisational  Industry  Capability Maturity Model (Humphrey 1989) See Whitten et al (2001)  National / International  Standards Australia (AS 3563)  International Organization for Standardization (ISO 9000)  Organisational  The organisation may adopt or tailor industry, national or international standards.

8.31 Standards  Standards vary in their level of enforcement and in their form, depending on the organisation and on project variables. For example:  mandatory practice: must be adhered to  advisable practice: can be breached with good reason  a standard may take the form of a checklist, template, or form.

8.32 Standards - Examples Document template (form, e.g. template for these slides) Acceptance test sign off form (form) Screen standards (standard - mandatory practice) Unit test process (standard - mandatory practice) COBOL II standards (standard - mandatory practice) Post implementation review procedure (advisory practice) Note: different organisations and projects will have different views about whether a standard is mandatory or advisable.
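One possible way (an assumption for illustration, not a prescribed format) to keep such a catalogue of standards, with each entry's type and level of enforcement made explicit, is sketched below:

```python
# Illustrative catalogue of organisational standards and their enforcement level.
standards = [
    {"name": "Screen standards",                    "kind": "standard",  "enforcement": "mandatory"},
    {"name": "Unit test process",                   "kind": "standard",  "enforcement": "mandatory"},
    {"name": "Acceptance test sign-off form",       "kind": "form",      "enforcement": "mandatory"},
    {"name": "Post-implementation review procedure","kind": "procedure", "enforcement": "advisory"},
]

def must_comply(standard_name):
    """Mandatory practices must be adhered to; advisory ones may be breached with good reason."""
    entry = next(s for s in standards if s["name"] == standard_name)
    return entry["enforcement"] == "mandatory"

print(must_comply("Unit test process"))                      # True
print(must_comply("Post-implementation review procedure"))   # False
```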

8.33 Quality reviews  Reviews are used in the quality control and quality assurance functions. There are two main forms of review:  Quality Assurance: management reviews  Quality Control: technical reviews

8.34 Management or Project Review  Management must check the baseline for a deliverable to see that it meets the quality assurance requirements.  This may involve simply noting that a technical review has passed a particular deliverable. The manager can then be assured of quality (given that the manager has actively taken part in the development of the quality system).  The manager can then alter the project plan if necessary to allow for delays or early completion.

8.35 Technical Reviews  A technical review is a structured meeting where a piece of work, which has previously been distributed to participants, is checked for errors, omissions, and conformance to standards.  All deliverables need review, otherwise how do you control quality?  The review is part of quality control and must produce a report so that the quality assurance function can be satisfied.  The report may be a checklist which indicates that the deliverable passes/fails the quality requirements for that type of deliverable.  This report is part of the baseline for the deliverable.
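A sketch of such a checklist report (the deliverable name, checklist items and action entries below are invented for illustration), showing how it could carry both the pass/fail result for quality assurance and the action list the scribe records, with reference pointers back into the deliverable:

```python
from datetime import date

# Illustrative review report: one checklist line per quality requirement for
# this type of deliverable, plus an action list with pointers into the deliverable.
review_report = {
    "deliverable": "Physical design specification v0.3",
    "review_date": date(2005, 5, 2).isoformat(),
    "checklist": {
        "Conforms to document template": "pass",
        "All interfaces specified": "fail",
        "Non-functional requirements verifiable": "pass",
    },
    "actions": [
        {"ref": "Section 3.2, p. 14", "issue": "Hardware interface undefined",
         "owner": "J. Smith", "due": "2005-05-09"},
    ],
}

# Overall result becomes part of the baseline for the deliverable.
overall = "pass" if all(v == "pass" for v in review_report["checklist"].values()) else "fail"
print(f"{review_report['deliverable']}: {overall}")
```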

8.36 Technical Reviews  A technical review:  is a formal meeting of a project team which is guided by an agenda and standards  allows input from many people  produces a report which is made public  requires committed participants to be responsible and accountable for their work  is educational as it clarifies standards, and highlights strengths and weaknesses of the team’s skills and knowledge  expects all participants to be responsible for the resulting quality of the artefact

8.37 Technical Review Roles  review member  Know the review process  Be positive and supportive  Interested in improving the quality of the deliverable and the review process  review leader  Secure agreement on objectives and standards  Encourage input from all participants, and politely silence overly talkative participants  Facilitate agreement to ensure action on the deliverable  scribe  Record all issues clearly, accurately, and unambiguously. If not sure of a particular issue, seek clarification

8.38 Review Member  Pre-requisites for review member:  Know the review process  Be positive and supportive  Interested in improving the quality of the deliverable and the review process  Preparation for review:  Read and annotate material before review (be prepared)

8.39 Review Leader  Pre-requisites for review leader:  Know the review process  Know the objectives & standards for the review  Be objective & aware of the implications the review may have for certain participants  Prepare:  Agenda  Organise venue & any ancillary materials necessary  Notify participants & ensure they have review deliverable, agenda, & standards in advance.

8.40 Review Leader  During the review:  Were preparations suitable? If not, you may have to re-schedule  Secure agreement on objectives and standards  Encourage input from all participants, and politely silence overly talkative participants  Facilitate agreement to ensure action on the deliverable  Ensure participants understand the ramifications of these actions  Ensure the scribe has accurately documented the review

8.41 Review Leader  After the review:  Was the review successful?  Is the outcome likely to produce a better quality product?  Can the review process be improved? Can the quality of the quality system be improved? - this is continuous improvement.  Do the objectives and standards for the type of deliverable need review?  Is the presenter of the deliverable capable of improving the quality of the item?  When will the deliverable be reviewed again? Who is responsible? (Are they aware of the responsibility?)

8.42 Scribe  Pre-requisites for the scribe:  Know the review process  Know the objectives and standards for the current review  Be objective and aware of the implications the review may have for certain participants

8.43 Scribe  During the review:  Record all issues clearly, accurately, and unambiguously. If not sure of a particular issue, seek clarification  Be sure to use clear reference pointers between the review action list (report) and the review deliverable.  Gather copies of any materials introduced by participants to support an issue.  After the review:  Check that the review action list is accurate and promptly distributed to relevant people (who may not have been in the review)