Development Methodologies
- Traditional Waterfall Model
- Systems Development Life Cycle (SDLC)
- Structured Systems Analysis and Design
- Rapid Applications Development (RAD)
- Spiral Model
- Agile Methodologies
Development Methodologies (1/2)
- Agile software development
- Agile Unified Process (AUP)
- Open Unified Process
- Best practice
- Cathedral and the Bazaar, Open source
- Constructionist design methodology (CDM)
- Cowboy coding
- Design by Use (DBU)
- Design-driven development (D3)
- Don't repeat yourself (DRY) or Once and Only Once (O3)
- Dynamic Systems Development Method (DSDM)
- Extreme Programming (XP)
Development Methodologies (2/2)
- Test-driven development (TDD)
- Unified Process
- Waterfall model
- Worse is better (New Jersey style)
- You Ain't Gonna Need It (YAGNI)
- Iterative and incremental development
- KISS principle (Keep It Simple, Stupid)
- MIT approach
- Quick-and-dirty
- Rational Unified Process (RUP)
- Scrum (management)
- Spiral model
- Software Scouting
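To make one of the items in the list concrete, here is a minimal sketch of test-driven development in Python. The leap-year example and function name are invented for illustration; the point is the TDD order of work: the tests are written first, against behaviour that does not yet exist, and the implementation is then written to make them pass.

```python
# Step 1 (TDD): specify the desired behaviour as tests before any code exists.
# At this point is_leap_year does not exist, so the tests would fail.
def test_is_leap_year():
    assert is_leap_year(2024)        # ordinary leap year
    assert not is_leap_year(1900)    # century years are not leap years...
    assert is_leap_year(2000)        # ...unless divisible by 400

# Step 2: write the simplest implementation that makes the tests pass.
def is_leap_year(year: int) -> bool:
    return year % 4 == 0 and (year % 100 != 0 or year % 400 == 0)

# Step 3: run the tests; refactor only once they pass.
test_is_leap_year()
```

In a larger project the same cycle (red test, green test, refactor) would normally use a test runner such as `unittest` or `pytest` rather than bare assertions.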
The Waterfall Model
Whatever means of software acquisition you choose, all the stages of the development life cycle are followed. However, there are some differences in what happens at each stage depending on whether you opt for bespoke development, off-the-shelf purchase, or end-user development.
Drawbacks of SDLC: Sequential nature of the life cycle
- Bureaucratic, long-winded and expensive
- Minor changes can cause problems
- Cost of correcting errors
- Misunderstandings/omissions may not come to light until the user acceptance test stage, by which time it may be too late to make significant changes
- Change may be needed after sign-off by the user
Drawbacks of SDLC: User dissatisfaction
- Early sign-off
- Incorrect functionality
- Incomplete functionality
- Lack of user friendliness
- Bugs
- Lack of participation
Drawbacks of SDLC: Applications backlog
- Visible
- Invisible
- Failure to meet the needs of management
- Strategic/tactical potential ignored
- Unambitious systems design
Drawbacks of SDLC: Documentation and flexibility
- Problems with documentation: user acceptance; restrictive; slow
- Maintenance workload
- Inflexibility in coping with a rapidly changing business climate
'V' Model [diagram]
- Left-hand (specification and design) branch, descending: Project initiation → Requirements specification → Detailed requirements specification → Architectural software design → Detailed software design → Module design → Code & unit test.
- Right-hand (integration and testing) branch, ascending: Software integration & test (debugged modules) → System integration & test (integrated software) → Acceptance testing (verified system) → Evolution → Product phase-out.
- QA links each design stage to the test stage at the same level: module design against code & unit test, detailed design against software integration & test, architectural design against system integration & test, and the specification against acceptance testing.
SSADM
- Only covers part of the system development process, i.e. analysis and design.
- It emphasises the importance of the correct determination of system requirements.
SSADM Stages
Feasibility Study
- Stage 0 – Feasibility
Requirements Analysis
- Stage 1 – Investigation of current requirements
- Stage 2 – Business Systems Options
Requirements Specification
- Stage 3 – Definition of Requirements
Rapid Applications Development (RAD)
A method of developing information systems which uses prototyping to achieve user involvement and faster development than traditional methodologies such as SSADM. A prototype is a preliminary version of part, or a framework of all, of an information system which can be reviewed by end-users. Prototyping is an iterative process in which users suggest modifications before further prototypes and, eventually, the final information system are built.
The Capability Maturity Model for Software Development
A five-stage model for judging the maturity of an organisation's software processes and for identifying the key practices required to increase the maturity of those processes.
- Many large specialist organisations (e.g. NASA) have achieved the higher levels.
- Many smaller companies have processes at stage 1 or 2.
Dynamic Systems Development Method (DSDM)
- A methodology that describes how RAD can be approached.
- The focus of this approach is on delivering the business functionality on time.
- Testing is integrated throughout the life cycle and not treated as a separate activity.
- For further information refer to:
Basic research methods
- Quantitative research (e.g. surveys)
- Qualitative research (e.g. face-to-face interviews; focus groups; site visits)
- Case studies
- Participatory research
Quantitative research
- Involves information or data in the form of numbers
- Allows us to measure or quantify things
- Respondents don't necessarily give numbers as answers; answers are analysed as numbers
- A good example of quantitative research is the survey
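The point that answers need not be numbers but are analysed as numbers can be sketched in a few lines of Python. The survey question, responses and coding scheme below are invented for illustration; a Likert-style category is mapped to a score and summarised numerically.

```python
# Hypothetical responses to "The new system is easy to use" (Likert scale).
responses = ["agree", "strongly agree", "neutral", "agree", "disagree"]

# Coding scheme: map each answer category to a number for analysis.
scale = {
    "strongly disagree": 1,
    "disagree": 2,
    "neutral": 3,
    "agree": 4,
    "strongly agree": 5,
}

scores = [scale[r] for r in responses]
mean_score = sum(scores) / len(scores)

print(scores)      # [4, 5, 3, 4, 2]
print(mean_score)  # 3.6
```

The coding step is where a quantitative survey quietly makes design decisions: the choice of scale and categories constrains what the analysis can later say, which is why the next slide stresses constraining answers as much as possible.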
Surveys
- Think clearly about questions (constrain answers as much as possible)
- Make sure the results will answer your research question
- The Internet can be used to conduct surveys that need a wide geographic reach
Qualitative research
- Helps us flesh out the story and develop a deeper understanding of a topic
- Often contrasted with quantitative research
- Together they give us the 'bigger picture'
- Good examples of qualitative research are face-to-face interviews, focus groups and site visits
Face-to-face interviews
- Prepare your questions
- It is a good idea to record your interviews
- Interviews take up time, so plan for an hour or less (roughly 10 questions)
- Stick to your questions, but be flexible if relevant or interesting issues arise during the interview
Focus groups
- Take time to arrange, so prepare in advance (use an intermediary to help you if you can)
- Who will be in your focus group? (e.g. age, gender)
- Size of focus group (8–10 is typical)
- Consider whether or not to have separate focus groups for different ages or genders (e.g. when discussing sex and sexuality)
Site visits and observation
- Site visits involve visiting an organisation, community project, etc.
- Consider using a guide
- Observation is when you visit a location, observe what is going on and draw your own conclusions
- Both help make your research more relevant and concrete
Case studies
- A method of capturing and presenting concrete details of real or fictional situations in a structured way
- Good for comparative analysis
Participatory research
- Allows the community being researched to participate in the research process (e.g. developing the research question; choosing the methodology; analysing the results)
- A good way to ensure the research does not simply reinforce the prejudices and presumptions of the researcher
- Good for raising awareness in the community and developing appropriate action plans
Planning your research: key questions
- What do you want to know?
- How do you find out what you want to know?
- Where can you get the information?
- Who do you need to ask?
- When does your research need to be done?
- Why? (Getting the answer)
Step 1: What? What do I want to know?
When developing your research question, keep in mind:
- who your research is for;
- what decisions your research will inform;
- what kind of information is needed to inform those decisions.
Conduct a local information scan, then take another look at your research question.
Step 2: How? Where? Who?
How do I find out what I want to know? Where can I get the information I need? Who do I need to ask?
Choose your methodology:
- quantitative (numbers) information
- qualitative, in-depth explanatory information
- case studies
- site visits or observation
- participatory research
Step 3: When?
When do all the different parts of the research need to be done?
- List all your research work areas
- Map them against a timeline
- Develop a work plan
Step 4: Why? Getting the answer
- Collect your data
- Keep returning to your research question
- Organise your research results to answer the question
- Keep in mind who you are doing the research for
- Focus on what the research results do tell you
- Be creative, methodical and meticulous
Requirements Elicitation
Information to elicit:
- Description of the problem domain
- List of problems/opportunities requiring a solution (the requirements)
- Any client-imposed constraints upon the system
Requirements Elicitation
Requirements elicitation techniques:
- Background reading
- Hard data collection
- Interviews
- Questionnaires
- Group techniques
- Participant observation
- Ethnomethodology
- Knowledge elicitation techniques
Sources of Information
- Clients (actual and potential)
- Users of systems (actual and potential)
- Domain experts
- Pre-existing systems (within the problem domain)
- Other relevant products
- Documents
- Technical standards and legislation
Challenges of Elicitation (1/2)
- Thin spread of domain knowledge
  - The knowledge might be distributed across many sources. It is rarely available in an explicit form (i.e. not written down).
  - There will be conflicts between knowledge from different sources.
- Tacit knowledge (the "say-do" problem)
  - People find it hard to describe knowledge they regularly use.
Challenges of Elicitation (2/2)
- Limited observability
  - The problem owners might be too busy coping with the current system.
  - The presence of an observer may change the problem, e.g. the Probe Effect or the Hawthorne Effect.
- Bias
  - People may not be free to tell you what you need to know.
  - People may not want to tell you what you need to know.
  - The outcome will affect them, so they may try to influence you (hidden agendas).