1 Iteration I1 Demo – Software Trickery
2 Agenda
– Project status (10 min)
  – Achieving the goals of the iteration
  – Project metrics
– Work results (25 min)
  – Presenting the iteration's results
    – Architecture
    – Implemented use cases
    – Demo
  – Quality Assurance
– Used work practices (5 min)
3 Introduction to the project
– Party management system v2.0 (PMS)
  – Modular web-based system
  – In use since last year
– Tournament management system module (TMS)
  – Replaces the existing system (a non-PMS module)
  – Extension module for PMS
  – A solution for managing game tournaments
  – Main user groups: administrators, tournament players, outside spectators
4 Status of the iteration's goals
– Goal 1: Creating a working skeleton of the system – OK
– Goal 2: Test cases designed for the system – OK
– Goal 3: Development practices work efficiently – OK
– Goal 4: Successfully reach the goals at the end of both iterations – OK
5 Sprint goals
– Sprint S1.1
  – Functionalities to add and modify tournaments – OK
– Sprint S1.2
  – Functionalities to add players and register for tournaments – OK
  – Implementing matches – OK
  – Running basic tournaments – OK
6 Status of the iteration's deliverables
– Project plan – OK
– Requirements document – OK, all important requirements documented at a general level
– Progress report – OK
– SEPA diaries – OK
– Architectural documentation – OK
– Test cases – automated acceptance-level test suites (Selenium framework)
7 Resource usage
– *SEPA resourcing mostly OK
– [Table: original plan (at the beginning of the iteration) vs. realized hours and updated plan, per team member (PM, QM, SA, MaG, MiA, ToA, JaL, MaH, ErH, SUM) and per phase (PP, I1, I2)]
8 Resource usage S1.1
– The sprint went according to plan, with acceptable estimation error
9 Resource usage S1.2
– Estimates were revised upward
– Project reporting tasks mostly fall at the end of the period
10 Changes to the project
– Changes to processes from reflection workshops
  – "Development-task cards moved to the wiki"
– Requirements
  – Minor changes due to issues discussed in the weekly meeting, e.g. all clan players are PMS users
11 Risks
– Risks identified in PP
  – The risk regarding communication was partially realized, resulting in errors during testing: a database update was not communicated to the QM
– New risks identified during I1
  – Members of the project group have schedule problems which affect development practices
12 Results of the iteration
– Project plan
  – Project Plan document (updated)
– Requirements
  – Requirements document (updated)
  – Quality Assurance Plan
– Architecture
  – Architecture document
  – Design guidelines ?
– SEPA diaries (Continuous Integration and Automated Testing)
  – Pyry Lahti and Markus Granström
– Software
  – Demo
13 System overview
14 Tournament data hierarchy
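The tournament data hierarchy shown on this slide (the diagram itself did not survive extraction) can be sketched as a minimal data model. The class and field names below are illustrative assumptions based on the entities named elsewhere in the deck (tournament, phase, match, participant), not the actual TMS schema.

```python
from dataclasses import dataclass, field
from typing import List, Optional

# Illustrative Tournament -> Phase -> Match hierarchy; names are
# assumptions, not the real TMS database schema.

@dataclass
class Match:
    player_a: Optional[str] = None   # slot may be empty until a winner advances
    player_b: Optional[str] = None
    winner: Optional[str] = None

@dataclass
class Phase:
    name: str                        # e.g. "Qualifiers", "Finals"
    bracket_type: str = "single_elimination"
    matches: List[Match] = field(default_factory=list)

@dataclass
class Tournament:
    name: str
    participants: List[str] = field(default_factory=list)
    phases: List[Phase] = field(default_factory=list)
```

A tournament then owns its phases, and each phase owns its matches, which mirrors the hierarchy the slide title describes.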
15 Database structure, participants
16 TMS Status
– Implemented
  – Tournament and phase create/modify
  – Inserting match results (automatic advancement in tournaments)
  – Registering to tournaments
  – Tournament tree visualizations (single elimination)
– To be implemented
  – Additional bracket types (double elimination, round robin)
  – Clan functionalities and management
  – Additional administrative functionalities
  – UI refinement
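The "automatic advancement" listed above can be illustrated with a minimal single-elimination sketch: the winner of match i in one round feeds into slot i // 2 of the next round. Function names and the list-based representation are assumptions for illustration, not the actual TMS code.

```python
# Minimal single-elimination bracket with automatic advancement.
# Each match is a 3-element list: [player_a, player_b, winner].

def build_bracket(players):
    """Pair up players into the first round; size must be a power of two."""
    assert len(players) & (len(players) - 1) == 0, "pad with byes first"
    return [[players[i], players[i + 1], None]
            for i in range(0, len(players), 2)]

def make_rounds(players):
    """Create all rounds of an empty single-elimination tree."""
    rounds = [build_bracket(players)]
    n = len(players) // 2
    while n > 1:
        n //= 2
        rounds.append([[None, None, None] for _ in range(n)])
    return rounds

def report_result(rounds, round_idx, match_idx, winner):
    """Record a winner and automatically advance them to the next round."""
    match = rounds[round_idx][match_idx]
    assert winner in match[:2], "winner must be a participant of the match"
    match[2] = winner
    if round_idx + 1 < len(rounds):
        # Match i feeds slot i // 2; even matches fill slot a, odd fill slot b.
        nxt = rounds[round_idx + 1][match_idx // 2]
        nxt[match_idx % 2] = winner
```

For four players, `make_rounds(["A", "B", "C", "D"])` yields two semifinals and one final; reporting both semifinal results automatically populates the final.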
17 Demo
18 Quality Assurance (1/3): Quality palette
Quality goals: Functionality, Usability, Code correctness, Security, Maintainability
– Automated unit testing: * ***
– Test case based testing: realized by "Automated acceptance testing"
– Automated acceptance testing: * **
– Exploratory testing: ** *
– Pair programming: ***
– Peer testing / Informal code reviewing: ***
– Refactoring: * *
– Coding guidelines: * *
– Document reviewing: * *
– Static code analysis: * *
Legend: * = practice was planned; effect on quality: red / yellow / green
[Mapping of the star markers to individual quality-goal columns was lost in extraction]
19 Quality Assurance (2/3): Quality dashboard
Part of the system, with quality rating and comments (confidence column not recoverable):
– Administrative functionality (quality 3)
  – Thorough acceptance-level test suite
  – Some usability issues
  – Features still missing: register/unregister participants
– Participant functionality (quality 2)
  – Basic acceptance-level tests
  – Usability issues, not very well tested
  – Clan-related features still missing
– Tournament phases and progression (quality 3)
  – Unit tests and acceptance tests for the most important modules
  – No known defects
– Spectator functionality (quality 2)
  – Basic acceptance-level tests
  – Many usability issues
20 Quality Assurance (3/3): CruiseControl & automated testing
– Automated unit tests
– Automated acceptance tests
– Code metrics and analysis
– Linking between failed acceptance tests and defects
– See
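A CruiseControl setup of the kind described above is driven by a `config.xml` on the build server. The fragment below is a generic sketch of such a configuration, assuming an SVN-hosted Ant project; the project name, paths, interval, and build target are illustrative assumptions, not the team's actual configuration.

```xml
<cruisecontrol>
  <project name="tms">
    <!-- Poll version control for changes (quiet period in seconds). -->
    <modificationset quietperiod="30">
      <svn localWorkingCopy="checkout/tms"/>
    </modificationset>
    <!-- Run the build (unit + acceptance tests) at most every 5 minutes. -->
    <schedule interval="300">
      <ant buildfile="checkout/tms/build.xml" target="test"/>
    </schedule>
  </project>
</cruisecontrol>
```

With a setup along these lines, every commit triggers the automated unit and acceptance tests, which is what gives the transparency into project status mentioned on the next slide.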
21 Experiences on work practices
– Development days
  – Facilitate the communication needed for development; work great
– Side-by-side coding
  – Proven to work
– Continuous integration
  – Transparency into project status
– Early victory
  – Working skeleton of the software in the S1.1 demo
22 Any questions? Thank you!