IMS Systems Analysis and Design


1 IMS9001 - Systems Analysis and Design
System implementation and maintenance

2 Implementation (Build)
[Diagram: the Build phase in context. The Technical Design Report comes in from DESIGN; the team builds, tests, installs and delivers the new system. System users receive user documentation and user training and perform user acceptance testing; system vendors supply the hardware/software; system owners receive the project report. The outputs - the production system plus system and technical documentation - pass to MAINTENANCE.]

3 Implementing the System
Implementation planning
Build and test software
Build/modify databases, networks, etc.
Finalise documentation
Prepare the site
Convert data into the required form and media
Conduct training
Install the system
Monitor the system
Transition to maintenance mode
Post-implementation review

4 Systems Implementation
[Diagram: implementation stages with their deliverables, and a review at the end of each stage:
IMPLEMENTATION PLANNING - Acceptance Checklist, Implementation Schedule, Training Schedule, Re-estimate
FINALISE DOCUMENTATION - Training Guides, User Manuals
CONDUCT SYSTEM TESTING - Test Data Preparation, System Test: Functional & Performance, Test Conversion
CONDUCT ACCEPTANCE TESTING - Acceptance Test
OPERATIONS HANDOVER - Computer Documents, I/O Documents, Operating Guide]

5 Systems implementation cont.
[Diagram continued:
CONDUCT TRAINING - Distribute Manuals, Test Equipment, Conduct Training, Set up / Convert Files
GET SYSTEM READY FOR START-UP / CONDUCT SYSTEM ACCEPTANCE - System Installation, Monitor Operations, Secure Acceptance, Run Benchmark Tests, Tune System
WRAP UP - Hand over Technical Documentation, Post-Implementation Review (what went wrong?)]

6 Implementation Planning
The implementation stage of the project requires a great deal of co-ordination with professionals outside the development team. The implementation plan will have been developed at an earlier stage of the project; it now needs to be extended in greater detail and updated to reflect the current situation. Poor planning can cause significant delays to the deadline!
Tasks:
finalise the acceptance checklist
complete and confirm the training schedule
review and revise the implementation plan

7 Finalise Documentation
Documentation describes how a system works to a wide audience. The four main areas are:
Training documentation - used specifically during the training sessions; especially designed to put the novice user at ease
User documentation - tells users how to work with the system and perform their tasks; may be a user manual, on-line help, a quick reference guide, etc.

8 Finalise documentation cont.
System documentation - a communication tool, used to review and revise the system during development; also facilitates maintenance and enhancement of the system
Operations documentation - aimed at a centralised operations group (not on-line operators); details the tasks an operator needs to carry out for a particular program

9 Testing
Testing is "the process of exercising or evaluating a system by manual or automatic means to verify that it satisfies specified requirements or to identify differences between expected and actual results" (IEEE, 1983)

10 Testing Steps
All testing involves the following steps (Mynatt, B.T., 1990); a minimal sketch in code follows the notes:
select what is to be measured by the test
decide how it is to be tested
develop the test cases
determine the expected or correct results (expected results must be measurable - vagueness does not encourage adequate testing)
execute the test cases
compare actual results to expected results
Notes on each step:
Select what is to be measured: the tester must determine exactly what the goal of the test is - are the requirements being tested for completeness? the design for cohesion? the code for reliability?
Decide how it is to be tested: inspections, proofs, black-box and white-box testing, automated tools - the tester must decide what sort of test is suitable to measure the chosen quality.
Develop the test cases: test data or situations used to reveal something about the attribute being measured.
Determine the expected or correct results: the tester should have all expected results written down BEFORE the actual testing; otherwise it is easy to miss things.
Execute the test cases: may require special software to be written.
Compare actual results to expected results: all discrepancies need to be tracked down - sometimes the error may be in the testing process.
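Below is a minimal Python sketch of these steps; the function under test, calculate_discount, and its 10% discount rule are hypothetical, invented only to illustrate the expected-vs-actual comparison.

```python
# Hypothetical function under test: 10% discount on orders of $100 or more.
def calculate_discount(order_total):
    return order_total * 0.10 if order_total >= 100 else 0.0

# Steps 3-4: develop the test cases and write down the expected results
# BEFORE executing anything, so the results stay measurable.
test_cases = [
    {"input": 50.0,  "expected": 0.0},   # below threshold: no discount
    {"input": 100.0, "expected": 10.0},  # boundary value
    {"input": 200.0, "expected": 20.0},  # above threshold
]

# Steps 5-6: execute the test cases and compare actual to expected results.
for case in test_cases:
    actual = calculate_discount(case["input"])
    status = "PASS" if actual == case["expected"] else "FAIL"
    print(f"input={case['input']}: expected={case['expected']}, "
          f"actual={actual} -> {status}")
```

Any FAIL must be tracked down; as the notes above say, the error may be in the testing process itself rather than in the code.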

11 Stages of Testing
[Diagram: stages of testing and what each produces. Unit (module) tests of the programs, procedures and data (from systems implementation) produce tested modules; integration tests against the program specifications (from systems design) produce integrated modules; function tests and performance tests against the systems specifications (from systems analysis & design) produce a functioning system and validated software; acceptance tests against the user requirements produce an accepted system; the installation test puts the system in use.]

12 Module or Unit Testing
Each module is tested individually. The test plan:
lists what is being tested
lists the expected outcome
identifies the data to be used - all possible combinations
Who carries out module testing?
Programmer - tests at code level
Analyst - tests at application level
(Pfleeger, S.L., 1987; Burch, J.G., 1992)
Notes: so we have a cleanly compiled computer program - what should we do? Go to the pub, or unit test it before we go on to the next module. Unit testing is the most effective way to test the logic within a module, but no matter how thorough the unit testing has been, when all the modules are brought together the result may still be a shambles - hence the testing strategies that follow. A minimal unit-test sketch appears below.
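A minimal unit-test sketch using Python's unittest framework; the module under test, is_valid_account_code, and its validation rule are hypothetical:

```python
import unittest

# Hypothetical module under test: an account code is 6 characters,
# 2 letters followed by 4 digits.
def is_valid_account_code(code):
    return len(code) == 6 and code[:2].isalpha() and code[2:].isdigit()

class TestAccountCodeValidation(unittest.TestCase):
    """Code-level tests a programmer might write for this one module."""

    def test_valid_code(self):
        self.assertTrue(is_valid_account_code("AB1234"))

    def test_too_short(self):
        self.assertFalse(is_valid_account_code("A1"))

    def test_digits_before_letters(self):
        self.assertFalse(is_valid_account_code("1234AB"))

if __name__ == "__main__":
    unittest.main()
```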

13 Integration Testing
Verifies that system components work together; typical problems:
data can be lost across interfaces
a function may not perform as expected when combined with another function
one module can have an adverse effect on another
Use an incremental approach to integrating modules - it makes errors easier to detect and correct:
Top-down testing
Bottom-up testing
Sandwich testing
A stub-based sketch of top-down integration follows.
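In this sketch of top-down integration, the top-level module is exercised while a canned stand-in replaces a lower-level module that has not yet been integrated; all names here are hypothetical.

```python
def fetch_exchange_rate_stub(currency):
    """Stub for the not-yet-integrated rate-lookup module: canned values."""
    return {"USD": 1.0, "EUR": 0.9}.get(currency, 1.0)

def convert_order_total(total, currency, rate_lookup=fetch_exchange_rate_stub):
    """Top-level module under test; the rate lookup is injected so the stub
    can later be swapped for the real module without code changes."""
    return round(total * rate_lookup(currency), 2)

# Integration checks: verify data is not lost or distorted across the interface.
assert convert_order_total(100.0, "EUR") == 90.0
assert convert_order_total(100.0, "USD") == 100.0
print("top-down integration checks passed")
```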

14 System Testing
The process of testing the integrated software in the context of the total system it supports. Performed after all unit and integration testing is complete. Who carries out system testing? The analyst, designer and project leader, with the help of a user if one is on the development team. Tests conducted at this stage include:
Function tests - demonstrate that all the functions specified for the system in the requirements specification are operational
Performance tests - demonstrate that the system meets the non-functional requirements specified

15 Function Testing
Performed after all programming and integration testing is finished. Test cases:
must cover every aspect of the system's functionality
should have a high probability of detecting errors
The test plan:
should be developed from the original specification
must include expected results that are measurable (see the sketch below)
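A data-driven sketch of a function-test plan in Python, with each case traced to a requirement ID and a measurable expected result; the requirement ID and the function under test, apply_late_fee, are both hypothetical.

```python
# Assumed spec rule: a $5 flat fee once an invoice is more than 30 days late.
def apply_late_fee(balance, days_overdue):
    return balance + 5.0 if days_overdue > 30 else balance

test_plan = [
    # (requirement id, balance, days overdue, measurable expected result)
    ("REQ-17", 100.0, 10, 100.0),  # well inside the grace period: no fee
    ("REQ-17", 100.0, 30, 100.0),  # boundary: exactly 30 days, still no fee
    ("REQ-17", 100.0, 31, 105.0),  # just past the threshold: fee applied
]

for req_id, balance, days, expected in test_plan:
    actual = apply_late_fee(balance, days)
    assert actual == expected, f"{req_id}: expected {expected}, got {actual}"
print("all function tests passed")
```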

16 Performance Testing
Compares the integrated modules against the non-functional system requirements, such as speed; a minimal timing-test sketch follows the list:
Stress tests
Volume tests
Configuration tests
Compatibility tests
Security tests
Documentation tests
Timing tests
Environmental tests
Quality tests
Recovery tests
Maintenance tests
Human factors tests
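This timing-test sketch checks a non-functional requirement that a batch of transactions completes within a time budget; the workload, process_transaction, and the 0.5-second budget are all hypothetical.

```python
import time

def process_transaction(record):
    """Stand-in for the real transaction-processing routine."""
    return sum(record) % 97  # trivial placeholder work

def timing_test(n_transactions=10_000, budget_seconds=0.5):
    records = [(i, i + 1, i + 2) for i in range(n_transactions)]
    start = time.perf_counter()
    for record in records:
        process_transaction(record)
    elapsed = time.perf_counter() - start
    print(f"{n_transactions} transactions in {elapsed:.3f}s "
          f"(budget {budget_seconds}s)")
    assert elapsed <= budget_seconds, "timing test failed: budget exceeded"

timing_test()
```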

17 Acceptance Testing
Involves installing the system at user sites; it is required when acceptance testing has not already been performed on site. The test focuses on the completeness of the installed system and verification of any functional or non-functional characteristics that may be affected by site conditions. Testing is complete when the customer is satisfied with the results; the system can then be formally delivered.

18 Prepare the Site
Ensure that facilities are adequate:
adequate space for all resources, ergonomic furniture, noise reduction, privacy, security, appropriate electrical connections, uninterrupted power, etc.
Install the hardware and software required to run the system; it must be tested to ensure that no damage occurred during transportation, that the product is not defective, that any product changes between purchase and delivery are acceptable, etc.
People responsible: vendor engineer, technical support group

19 Data Conversion
Current production data needs to be converted in format, content and storage medium, according to the conversion plan. Manual file conversion is a time-consuming task; conversion often needs specially written programs, e.g. a database load program or a record transformation program (a sketch follows). The converted data must be confirmed to be correct. Both the old and the new systems' files may need to be supported for a time, which can introduce a time lag, and the files may get out of step.
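A minimal record-transformation sketch: it reads records in an assumed old-format CSV and writes them in the layout an assumed new system expects. The field names and the date reformatting are hypothetical.

```python
import csv

def transform_record(old):
    """Convert one old-format record to the new layout."""
    day, month, year = old["DOB"].split("/")       # assumed old format: DD/MM/YYYY
    return {
        "customer_id": old["CUSTNO"].strip(),
        "full_name": f'{old["FNAME"].strip()} {old["SNAME"].strip()}',
        "date_of_birth": f"{year}-{month}-{day}",  # new format: ISO 8601
    }

def convert_file(old_path, new_path):
    with open(old_path, newline="") as src, open(new_path, "w", newline="") as dst:
        reader = csv.DictReader(src)
        writer = csv.DictWriter(
            dst, fieldnames=["customer_id", "full_name", "date_of_birth"])
        writer.writeheader()
        for old_record in reader:
            writer.writerow(transform_record(old_record))

# convert_file("customers_old.csv", "customers_new.csv")
```

Confirming the converted data could then be as simple as comparing record counts and spot-checking a sample of records against the source file.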

20 Conduct Training
Need to consider:
who is the audience?
what level of detail should be imparted to the audience?
who should conduct the training?
where should the training be conducted?
when should the training be conducted?

21 User Training
Training: a complete and concentrated course in system use at the time of delivery. There are also ongoing training needs: new staff, staff changes, etc. Training must be planned (methods, resources) and should also cover help during and after installation for new users, infrequent users and users who want to "brush up".

22 User Training
Training aids must be easy to use and reliable:
demonstrations and classes
documentation
on-line help and icons
expert users
a supportive user manager who provides training, motivation and support

23 Install the System
The method of installation depends on several criteria:
Cost - if there are cost constraints, certain choices are not viable
System criticality - if system failure would be disastrous, the safest approach should be selected regardless of cost
User computer experience - the more experience the users have, the less necessary it is to delay changeover
System complexity - the more complex the system, the greater the chance of flaws, so a safer approach is better
User resistance - consider what the users are best able to cope with

24 Install the System - Alternatives
Direct installation (abrupt cut-over)
Parallel installation
Phased installation (staged installation)
Pilot installation (single-location conversion)

25 Direct Installation (Abrupt Cutover)
The old system stops and the new system starts: a total cutover.
[Diagram: the old system runs until the cutover point, then the new system takes over.]

26 Direct Installation (Abrupt Cutover)
This approach is appropriate when:
the system is not replacing any other system
the old system is judged absolutely without value
the old system is very small and/or very simple
the new system is completely different from the old, and comparisons would be meaningless
Advantages: costs are minimised
Disadvantages: high risk

27 Parallel Installation
Old and new systems are operated concurrently.
[Diagram: the old and new systems run in parallel until a total cutover to the new system.]

28 Parallel Installation
Old and new systems are operated concurrently, with cut-over at the end of a business cycle and balancing between both systems.
Advantages: risk is low - if problems occur, the old system is still running
Disadvantages: the cost of operating both systems - 2.5 times the resources

29 Phased Installation (Staged Installation)
The system is installed in stages.
[Diagram: the old system is replaced by the new system stage by stage until total cutover.]

30 Phased Installation (Staged Installation)
The system is installed in stages, with subsequent stages providing more features. The phases or stages need to be identified at general design.
Advantages:
lower costs for earlier results
benefits can be realised earlier
the rate of change for users is minimised

31 Phased Installation (Staged Installation)
Disadvantages:
close control of systems development is essential
costs associated with developing temporary interfaces to the old systems
limited applicability
demoralising - no sense of completing a system

32 Pilot Installation
Old and new systems are operated concurrently.
[Diagram: one site cuts over to the new system as a pilot while the other sites continue on the old system, followed by total cutover everywhere.]

33 Pilot Installation
Old and new systems are operated concurrently, but only part of the organisation tries out the new system. The pilot system must prove itself at the test site.
Advantages:
risks are relatively low if problems occur
errors are localised
can be used to train users before implementation at their own site
Disadvantages:
lack of consistency between different parts of the organisation

34 Monitor Operations
Monitor user satisfaction:
with the functional requirements
with system performance
Run benchmark tests
Tune the system

35 Transition to Maintenance
Most organisations have formal procedures set up, with a "maintenance" section responsible. Procedures should be established for requesting maintenance, and the owners of the new system must be informed of the relevant procedures.

36 Maintenance - Fix it / Make it better
[Diagram: maintaining the new system. Users of the production system report problems and new ideas and receive fixes and enhancements; project staff provide additional training and documentation and handle technical problems, new technology and modifications. Escalating maintenance leads back to project INITIATION.]

37 Maintenance
Corrective - fix errors
Adaptive - satisfy changing needs
Perfective - enhance performance
Preventative - fix potential problems
If the cost of maintenance is too high, consider other options: new development, purchasing a software package, or re-engineering/modifying the system

38 Corrective Maintenance
Corrects analysis, design and implementation errors. Can be the most expensive kind of maintenance:
the costs of functions not working correctly
having to undo what has been developed
Requires immediate attention - typically urgent, and interferes with normal operations. Needs skilled maintenance staff to ensure rapid diagnosis and correction of errors; they must have, or quickly develop, a high level of familiarity with the system. May use software tools for diagnosis.

39 Adaptive Maintenance
Satisfies changes in the environment, changing business needs or new user requirements:
changes in tax laws, takeovers and mergers, a new operating system, etc.
a new type of report, a new class of customer, etc.
Less urgent - changes occur over time. Adaptive maintenance is inevitable and does add value. Maintenance staff need strong analysis and design skills as well as programming skills - changes often require a complete SDLC - along with a good understanding of the system.

40 Preventative Maintenance
Pay now or pay more later: defects or potential problems are found and corrected before they cause any damage, reducing the chance of future system failure - e.g. expanding the number of records beyond current needs, or standardising formats across platforms. It is a natural by-product of maintenance work: identify and fix potential problems noticed while fixing other errors. Ideally, hold periodic (monthly / half-yearly / annual) reviews of the system to uncover and anticipate problems.

41 Perfective Maintenance
Enhances performance, maintainability or usability:
adds desired features rather than required ones
better run times, faster transaction processing, etc.
Meets user requirements not previously recognised or given high priority - missed in development, not known about, or considered unimportant. Legacy systems (old systems that have been running for at least 10 years) are likely candidates for perfective maintenance. May involve technical systems specialists as well as general maintenance staff, e.g. a network specialist to change the network design for improved performance.

42 Cost Elements of Maintenance
Maintainability: the ease with which software can be understood, corrected, adapted and enhanced. Low maintainability results in uncontrollable maintenance expenses. The following factors affect maintainability:
Defects
Customers
Documentation
Personnel
Tools
Software structure

43 Cost Elements of Maintenance
Defects - the number of latent or unknown errors existing after system installation; this influences most maintenance costs and drives all the other cost factors - few errors mean low maintenance costs
Customers - the number of customers/users of the system; more customers mean more maintenance effort and cost, and a greater need for high maintainability

44 Cost Elements of Maintenance
Documentation - the quality of the system documentation; it has an exponential effect on maintenance costs
Personnel - the quality of the maintenance personnel; highly skilled programmers (typically not the original programmers) are needed to quickly understand and carefully change the system; should they be separate from development? in-house? dedicated end-user support?

45 Cost Elements of Maintenance
Tools - appropriate automated development tools: programming tools, code generators, debuggers, hardware, CASE, diagnostics, etc.; reverse engineering where there is no documentation
Software structure - the quality and maintainability of the software structure: formalisation of code, comments, versioning, structure charts, OO

46 Measuring Maintenance Effectiveness
Maintenance needs to be measured in order to understand the quality of the development/maintenance effort. We measure:
the number of failures
the time between failures
the type of each failure
Mean Time Between Failures (MTBF), calculated from the number of failures and the time between them, is a widely used measure of quality; a worked sketch follows.
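A worked MTBF sketch in Python; the failure log and the 1200-hour observation period are invented numbers, purely for illustration.

```python
# Elapsed hours at which each failure occurred (hypothetical log).
failure_times_hours = [120, 340, 610, 1005]
total_operating_hours = 1200  # observation period

n_failures = len(failure_times_hours)
mtbf = total_operating_hours / n_failures
print(f"{n_failures} failures over {total_operating_hours}h -> MTBF = {mtbf:.0f}h")
# 4 failures over 1200h -> MTBF = 300h

# The time between consecutive failures can also be inspected directly:
gaps = [b - a for a, b in zip([0] + failure_times_hours, failure_times_hours)]
print("hours between failures:", gaps)  # [120, 220, 270, 395]
```

A rising MTBF across successive releases suggests maintenance is improving quality; a falling one suggests fixes are introducing new defects.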

47 Maintenance Life Cycle
Software Maintenance Life Cycle (SMLC):
receive a maintenance request
transform the maintenance request into a change (analysis)
specify the change (design)
develop the change (code)
test the change
train users and run an acceptance test
convert and release to operations
update the documentation
conduct a post-maintenance review
(Chapin, 1988)
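One way to picture the SMLC is as an ordered workflow in which a maintenance request can only advance through the stages in sequence; the stage names below paraphrase the list above, and the class itself is a hypothetical sketch.

```python
SMLC_STAGES = [
    "received", "analysed", "designed", "coded", "tested",
    "accepted", "released", "documented", "reviewed",
]

class MaintenanceRequest:
    """Tracks one maintenance request through the SMLC stages in order."""

    def __init__(self, description):
        self.description = description
        self.stage_index = 0  # starts at "received"

    @property
    def stage(self):
        return SMLC_STAGES[self.stage_index]

    def advance(self):
        if self.stage_index + 1 >= len(SMLC_STAGES):
            raise ValueError("request already at final stage")
        self.stage_index += 1
        return self.stage

req = MaintenanceRequest("fix rounding error in invoice totals")
print(req.stage)      # received
print(req.advance())  # analysed
```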

48 Review - What went wrong/right? Why?
[Diagram: reviewing the system and the project during MAINTENANCE. Users raise problems and new ideas and receive fixes and enhancements; an auditor produces the System Audit Report; project staff and the steering committee deal with project issues and system bugs and produce the Project Review Report.]

49 Post-Implementation Review
A PIR analyses what went right and wrong with a project. It is conducted 2 to 6 months after conversion by a team which includes user representatives, development staff, internal auditors and sometimes external consultants; the development team is not in charge. The review:
looks at the original requirements and objectives
evaluates how well they were met
compares the costs of development and operation against the original estimates (including maintenance costs?)
compares the original and actual benefits
reviews the new system to see whether more of the original benefits, or additional benefits, can be realised

50 References
HOFFER, J.A., GEORGE, J.F. and VALACICH, J.S. (2005) Modern Systems Analysis and Design, 4th edition, Pearson Education Inc., Upper Saddle River, New Jersey, USA. Chapters 15, 16.
WHITTEN, J.L., BENTLEY, L.D. and DITTMAN, K.C. (2001) Systems Analysis and Design Methods, 5th edition, Irwin/McGraw-Hill, New York, NY. Chapters 16, 17.

