JTAMS PRE-CDR IT/SIS ANALYSIS


1 JTAMS PRE-CDR IT/SIS ANALYSIS
PRACTICUM 3
JTAMS PRE-CDR IT/SIS ANALYSIS
19 October FY3
PM JTAMS / PM JUGGERNAUT
Student Name: PM JTAMS Lead
Student Name: PM JUGGERNAUT Lead

Agenda:
- Presentation Intro/Overview
- Key Software-Related Questions for CDR
- Measures Assessment
- Data Rights Analysis
- Award Fee Criteria Analysis (JTAMS)
- Juggernaut's Award Fee Analysis
- Software Development Planning
- AMRU's Development Approach
- MRES Risk Assessment
- MRES Measures Assessment
- Summary – Are We Ready for CDR? (Joint Brief)

Note: P1 and P2 are Milestone-centric; P3 is CDR-focused.

Student Instructions:
- Each team will develop the practicum solution slide package using the templated slides provided, the given scenario, and the artifacts provided in the Job Aids folder.
- All slides will be submitted to the instructors electronically (Blackboard/shared drive) and in hard copy (printed two slides per page, double-sided).
- Each team will designate a PM JTAMS and a PM JUGGERNAUT to present the results of the analysis prepared by the team. The student templates identify the specific slides to be included in the formal student presentation (the additional slides are hidden).
- You will have 1.5 hours for research and development of the practicum solution.
- The PM JTAMS and PM JUGGERNAUT will have a maximum of 8 minutes to present their designated slides.

Instructor Notes:
- These are the required slides the students will prepare. All sections of the practicum requirement will be submitted for grading, electronically and in hard copy.
- The sections highlighted in red will be presented (see student templates) by the students designated as PM JTAMS and PM JUGGERNAUT. Instructors are free to modify this requirement as they see fit.
- The instructor slide template has suggested answers, and the grading rubric identifies the grading approach.
- The 5 teams will each brief the selected slides, and the instructor will grade the individual briefers for an individual "PM" grade according to the rubric. Each team will have 2 briefers per practicum.
- Instructors will also grade the entire package for the team grade according to the rubric; this will be done after the presentation time. All practicums are followed by in-class homework time for the students, so the instructors can use that 30 minutes to complete the team and individual grading.
- The 5 teams will be divided so that both instructors take briefings: since there are 5 teams and normally 2 instructors, one instructor will take 3 teams (6 briefers) and the other will take 2 teams (4 briefers). I recommend that the instructor taking the 3 teams stay in the classroom and condense their feedback during the actual session. The instructor taking the 2 teams can use the required team room and take that opportunity to elaborate on the major takeaways and other learning points.
- There is no penalty IAW the rubric for students exceeding the 8 minutes per brief; however, it is important for the instructors to enforce this time limit, especially if you do not have a separate team room and/or a second instructor. There is sufficient time in the schedule to conduct all briefings in one classroom.
- Additionally, there is the option of not having some students brief. If you elect this option, you can designate the students who did not brief P1-P3 to brief during P4, or grade the student during the Discussion Panel brief (the DP option should be the option of last resort).

2 Key Software-Related Questions for CDR
Requirements Management:
- Are requirements being managed and traced from higher-level (parent) requirements to lower-level (offspring) requirements? Are there any "orphan" or "childless" requirements? (A simple scan sketch follows this slide.)
- Is there full traceability from the system requirements allocated to software down through the software requirements, software design, interface requirements, interface design, source code, and test procedures?

PRACTICUM 3
See: Job Aids\Technical Review. Also use the NOTE document in the TECHNICAL REVIEW folder.
NOTE: Research the DoD CDR Checklist for Requirements Management and identify three (3) key questions. (Any 3 key questions can be chosen; explain why you picked the ones you did.)
This paragraph from the student instructions explains the current situation with respect to potential requirements volatility: "Also, based on requests from field experiences cited by Combatant Commanders, the J-8 is making substantial changes in UAV/UGV robotic employment doctrine. This has resulted in the addition of new mission profiles to the JRATS program and revisions in the JRATS requirements baseline."
PM JTAMS SITUATION: The J-8 is making substantial changes to UAV/UGV robotic employment doctrine; this will result in requirements creep.
EXPLAIN WHY YOU CHOSE THOSE QUESTIONS AND HOW THEY RELATE TO THE PM JTAMS SITUATION.
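The orphan/childless check above lends itself to a simple automated scan of the trace data. Below is a minimal, illustrative Python sketch; the requirement IDs and parent links are hypothetical examples, not drawn from the JTAMS baseline.

```python
# Illustrative sketch: detecting "orphan" and "childless" requirements in a
# parent->child trace. IDs and links are hypothetical, not JTAMS data.

# Map each requirement to the parent it traces up to
# (None = top-level system requirement, which has no parent).
trace = {
    "SYS-001": None,        # system requirement
    "SYS-002": None,        # system requirement with no offspring -> childless
    "SW-101": "SYS-001",    # software requirement traced to its parent
    "SW-102": "SYS-001",
    "SW-199": "SYS-042",    # claimed parent does not exist -> orphan
}

known_ids = set(trace)
parents_with_children = {p for p in trace.values() if p is not None}

# Orphans: non-top-level requirements whose claimed parent is not in the baseline.
orphans = [r for r, p in trace.items() if p is not None and p not in known_ids]

# Childless: top-level requirements that nothing traces down from.
childless = [r for r, p in trace.items() if p is None and r not in parents_with_children]

print("Orphan requirements:   ", orphans)    # ['SW-199']
print("Childless requirements:", childless)  # ['SYS-002']
```

The same pass extends naturally to each link in the chain (requirements to design, design to code, code to test procedures) by running it once per trace relationship.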

3 Key Software-Related Questions for CDR
Management metrics relevant to the life cycle phase (latest cost estimate):

Q: Has the software cost estimate been updated based upon actual measured project software development performance and productivity to date? If not, what are the implications?
A: There is no evidence that any updated cost estimates have been produced. At this point in the development, we should be able to adjust the parameters of our estimate to provide a more accurate estimate of SW costs. (A sketch of this adjustment follows this slide.)

Q: Has the cost of acquiring, licensing, and configuring COTS and/or GOTS computer hardware and software been considered? Was this included in the original estimate? Identify any implications.
A: Again, while we may initially have included aspects of using legacy GOTS SW, we have not updated our SW cost estimates to address any changes, updates, etc. that may be required. Since we likely leveraged some benefits of using legacy systems and SW, we should at this point have identified the actual implications of using the existing GOTS.

Q: What caused (or could be the cause of) a change in the software cost since the beginning of the project, if any? (Based on the current scenario.)
A: Many factors could impact SW costs; students should look at the parameters they originally used in cost estimation to identify areas that may have changed. Factors such as the potential need to replace some legacy SW, the use of OSA, and requirements for cybersecurity could impact our original cost estimate.

PRACTICUM 3
See: Job Aids\Technical Review. Also use the NOTE document in the TECHNICAL REVIEW folder.
NOTE: Apply the questions identified to the JTAMS scenario. You may have to refer back to the cost estimating portions of Practicums 1 and 2. Student answers will vary; leverage the various factors included in SW cost estimating from Lesson 3 and the various factors used to generate the original cost estimate. See the Cost Factor Cheat Sheet for review (double-click on it).
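As a hedged illustration of what "adjusting the parameters based on measured performance" means in practice, the sketch below re-anchors a simple productivity-based estimate on actuals to date. All numbers (SLOC counts, productivity, labor rate) are hypothetical placeholders, not JTAMS data, and a real update would use the program's calibrated estimating model rather than this one-parameter view.

```python
# Sketch of re-anchoring a software cost estimate on measured performance.
# All numbers are hypothetical; the idea is to replace the planned
# productivity assumption with productivity actually demonstrated to date.
planned_productivity = 250      # SLOC per staff-month, from the original estimate
completed_sloc = 60_000         # code completed to date
actual_staff_months = 300       # effort actually expended to date
remaining_sloc = 140_000        # scope still to be built

actual_productivity = completed_sloc / actual_staff_months   # 200 SLOC/SM here
labor_rate = 20_000             # dollars per staff-month (placeholder)

original_cost_to_go = remaining_sloc / planned_productivity * labor_rate
updated_cost_to_go = remaining_sloc / actual_productivity * labor_rate

print(f"Cost-to-go, planned productivity: ${original_cost_to_go:,.0f}")
print(f"Cost-to-go, actual productivity:  ${updated_cost_to_go:,.0f}")
```

When demonstrated productivity lags the plan, as in this example, the cost-to-go grows accordingly; that gap is exactly the implication the CDR question is asking the team to surface.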

4 JTAMS MEASURES ASSESSMENT – FOR INSTRUCTORS ONLY – STUDENTS DO NOT HAVE TO PROVIDE THIS LEVEL OF DETAIL – SEE NEXT SLIDE FOR STUDENT REQUIREMENTS
Provide an assessment given the key indicators below (Worksheet #1).

Indicator: JRATS External Interfaces
Interpretation: The total number of identified external interfaces has remained fairly steady (with a slight increase). As of May FY2, a large number were Approximated and relatively few were Approved. The approximated interfaces introduce a huge area of risk in the program. As of Oct FY2, there still remain over 100 Approximated interfaces, and only about half of them have been Approved. There are still 42 undefined interfaces. A lot of work remains to finalize the interfaces, and a lot of risk.
Recommendations: From a traditional perspective these measures indicate, at best, slow progress; a significant backlog remains. Determine the reason for the slow progress; it is most likely due to non-dedicated government user involvement. All external interfaces should be specified (or mostly specified) by CDR. The current trend indicates that all interfaces will NOT be specified by that time.

Indicator: JTAMS Subsystem Interfaces
Interpretation: The majority of Approximated interfaces are associated with MRES. PAMS and THS are both the furthest along in having Approved interfaces.
Recommendations: Trending toward goodness. Work to reduce the MRES backlog. All internal interfaces should be specified (or mostly specified) by CDR.

Indicator: Requirements Volatility
Interpretation: Across all JTAMS subsystems, there continues to be requirements growth and change. In Dec FY1, most of the new requirements were identified, although additional ones have continued to be defined.
Recommendations: If overall requirements volatility is reasonable, why do the INTERFACES have a significant backlog? Determine!

Indicator: Drill-Down – Requirements Volatility by Subsystem
Interpretation: From the drill-down, it is clear that much of the growth in requirements is confined to MRES. Both MRES and THS are experiencing changing requirements, most likely as a result of the changing battlefield doctrine. As MRES changes, THS must change to keep the training in alignment with the mission rehearsal capability. PAMS has shown the least volatility.
Recommendations: Again, MRES is the problem child. Focus on resolving MRES delays. MRES requirements need to be finalized and monitored.

Indicator: Defects – Open, Closed, Backlog
Interpretation: The number of defects found in peer reviews is increasing, as is the backlog. Defects are being opened faster than they are being closed, and the closure rate needs to increase significantly.
Recommendations: The shortfall in the closure rate is increasing the backlog; recommend determining the reason. (A sketch of this backlog arithmetic follows this slide.)

Indicator: Defects by Priority
Interpretation: The majority of defects being found are associated with the lowest priority. This should raise concerns about the effectiveness of the peer reviews. The program will NOT reach 70% defect removal with this approach.
Recommendations: The overall defect backlog is not reasonable; the PRI-5 emphasis is kicking the can down the road. Watch out for PRI 1-2; focus!

PRACTICUM 3
Instructions:
1. Review the set of measures and complete Worksheet #1.
2. Do an integrated analysis. Perform a causal analysis to identify root causes and underlying problems. Provide a summary for the two or three major issues, describing the problems, impacts, risks, potential outcomes, and alternatives. Document your analysis on Worksheet #2.
See: Job Aids\Measures\JTAMS Measures for Section 3.
This is the "school solution" to the exercise. The students will likely identify a different set of answers but should be in the ballpark. The answers are gleaned from the updated scenario and from information in the measures portion of the Job Aids.
Student Instructions Objective: Given the JRATS/JTAMS measurement scenario, evaluate status and recommend a course of action during the Manufacturing phase of EMD for the JTAMS software effort.
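The defect indicator reduces to simple running arithmetic on open/close counts. The sketch below uses made-up monthly counts, not the actual JTAMS measures, to show how a closure rate below the find rate grows the backlog and how removal-to-date compares against a goal such as 70% removal by CDR.

```python
# Illustrative sketch of the backlog arithmetic behind the defect indicator.
# The monthly counts are hypothetical, not the actual JTAMS measures.
opened = [30, 45, 60, 80]   # defects opened per reporting period
closed = [20, 25, 30, 35]   # defects closed per reporting period

backlog = 0
for o, c in zip(opened, closed):
    backlog += o - c        # backlog grows whenever closures lag finds
    print(f"opened={o:3d} closed={c:3d} backlog={backlog:3d} "
          f"closure ratio={c / o:.0%}")

# To meet a goal such as 70% of defects removed by CDR, closed-to-date
# must reach 70% of total opened-to-date.
removed = sum(closed) / sum(opened)
print(f"Removed to date: {removed:.0%} of defects found")
```

With these placeholder numbers the program has removed only about half the defects found, which is the kind of gap that should trigger the causal analysis the worksheet asks for.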

5 JTAMS MEASURES ASSESSMENT
Summarize your analysis (Worksheet #2 – Part 1)

Problems: Summarize the problems in the areas of JRATS External Interfaces, JTAMS Subsystem Interfaces, Requirements Volatility, and Defects by Priority.
1. Interfaces are not yet completely specified. Too many are still Approximated, and some are still not defined.
2. Defect identification is significantly behind where it should be in order to meet the goal of 70% defect removal by CDR. The defects identified are primarily priority 5 defects, which are generally of minor consequence (typos and formatting issues found during peer review). The peer review process does not seem to be identifying significant technical issues.
3. Requirements continue to be volatile, especially in MRES and THS.

Impacts: Describe the root causes and assess the potential impacts, including the implications for software quality.
Incomplete interfaces and requirements volatility at this point in the development timeline are indicators of significant risk in the program. There are likely to be continued changes, which will necessitate rework to design artifacts and the developed code. The later these changes occur, the larger the projected schedule, cost, and quality impacts. If no action is taken, the program will likely overrun cost estimates and the schedule will slip. Also, late changes to the code will create a difficult-to-manage simultaneous development environment for interfacing software components, negatively impacting the system's software capability, reliability, and quality. If no further action is taken, the project will not make it to CDR on time.

PRACTICUM 3
See: Job Aids\Measures\JTAMS Measures for Section 3.

6 JTAMS MEASURES ASSESSMENT
Provide alternatives given your analysis (Worksheet #2 – Part 2: Alternatives)

Alternatives: List the alternative courses of action you would consider for this project.
1. Put together an expert technical team to finalize and approve interfaces and requirements. The team should have representatives from each system component. This team should assess the criticality of any undefined items based on required functionality and cost/schedule impacts. Consider delaying implementation of interfaces or requirements that cannot be finalized, and those that are less critical.
2. Put in place a more rigorous control process for requirements additions and changes. Carefully evaluate any proposed requirements additions or changes for impacts to cost, schedule, and quality. Defer non-critical additions and changes to a future release of the system. Possibly adopt some Agile processes as appropriate.
3. Assess the peer review process, and the other processes that help to identify technical problems, such as unit testing, CM, and QA. Identify improvements that focus on finding significant technical issues. Staff the review teams with more experienced project personnel who have domain and program expertise. Emphasize that significant technical issues should be identified earlier.

PRACTICUM 3
See: Job Aids\Measures\JTAMS Measures for Section 3.
Presentation slide: note the students have only one slide to present, which includes Problems, Impacts, and Alternatives.

7 Real-time Engine Data Rights Analysis
COA comparison (Cost: $ or H/M/L; Schedule: months or H/M/L; DR: Restricted, Government Purpose, or Unlimited; Risk and risk level: L/M/H):
1. Challenge the V-Robotics assertion (do nothing). Cost: Low. Schedule: High. DR: GP.
2. Negotiate for Government Purpose Rights. Cost: $15M. Schedule: N/A.
3. Develop the real-time software engine to support MRES. Cost: $5M. Schedule: 9 months. DR: U.
4. Accept Juggernaut's assertion and agree to the associated limited rights. DR: R.
5. Allow another contractor to do the work. Cost: $6M. Schedule: 4 months. DR: U/GP.
6. Student-identified COA: look for a COTS product that will perform the task. Cost: Moderate. Risk: M (we may not be able to find one, or we may not be able to acquire the data rights).

Which option is the best from a Government perspective, and why? Which option is best from Juggernaut's perspective, and why? Which option is your recommended option, and why? (A scoring sketch follows this slide.)

PRACTICUM 3
See: Job Aids\Data Rights
Instructions:
- Identify and assess COAs for addressing the data rights issue for the real-time engine. You are required to develop another COA (row 6 on the slide).
- There is data in the contract that can give you cost and schedule impacts for some of the COAs; estimate, or use the H/M/L approach, if you do not have cost or schedule information.
- Do not just identify risk as H/M/L; explain the factors that went into your risk assessment (what is the risk, etc.?).
- Use the lesson material from Lesson 16 and any references for that lesson to understand "Restricted Rights," "Government Purpose Rights," and "Unlimited Rights."
- Identify the risks the Government might incur: the Government would assume the risk of integrating the new real-time engine with the other software, and must weigh the schedule and cost risk of acquiring IP against the risk of not acquiring it.
- Assess the risks to the program, and to the acquiring Service/Agency, of not acquiring the desired product data or data rights due to cost or other considerations.
- Explore alternatives for acquiring the needed data rights for life-cycle support from the OEM.
- Determine whether, by not acquiring the data and rights, the program has locked the Government into a course of action that would be costly or impossible to change at a later date.
- One option could be to investigate the possibility of leaving this to another increment.
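One way for a team to make its recommendation defensible is a simple weighted-scoring pass over the COA table. The sketch below is purely illustrative: the 1 (poor) to 3 (good) scores are placeholders for the team's own judgment, and equal weighting of cost, schedule, rights, and risk is an assumption, not part of the scenario.

```python
# Illustrative COA scoring sketch. Scores (1=poor, 3=good) are hypothetical
# placements of the table data, not an official assessment; equal weighting
# is an assumption the team should revisit and justify.
coas = {
    "1 Challenge assertion (do nothing)":    {"cost": 3, "schedule": 1, "rights": 2, "risk": 1},
    "2 Negotiate for GPR ($15M)":            {"cost": 1, "schedule": 2, "rights": 2, "risk": 2},
    "3 Develop engine ($5M, 9 mo)":          {"cost": 2, "schedule": 1, "rights": 3, "risk": 2},
    "4 Accept limited rights":               {"cost": 3, "schedule": 3, "rights": 1, "risk": 1},
    "5 Other contractor ($6M, 4 mo)":        {"cost": 2, "schedule": 2, "rights": 3, "risk": 2},
    "6 Find a COTS product (Mod cost)":      {"cost": 2, "schedule": 2, "rights": 1, "risk": 2},
}

# Rank COAs by total score, highest first.
for name, s in sorted(coas.items(), key=lambda kv: -sum(kv[1].values())):
    print(f"{sum(s.values()):2d}  {name}")
```

The point of the exercise is the justification, not the arithmetic: the Government and Juggernaut would weight the same criteria very differently, which is exactly why the slide asks for both perspectives.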

8 Award Fee Criteria Analysis (JTAMS)
Award Fee Weights for Period 1 (25 Jun FY2, after SRR):
- Risk Management = .5. Why? Because this must be done up front.
- Configuration Management = .2
- Interface Control = 0. Why? We don't lock in until PDR, so it is not relevant now.
- Formal Inspections = 0. Why? We have nothing to inspect at this time.
- End-to-End Requirements Traceability = .3. Why? Controlling requirements is key to a successful program.

Award Fee Weights for Period 2 (16 Nov FY3, after PDR):
- Risk Management = .3. Why? Must be maintained.
- Configuration Management = .3. Why? Must increase now that we have products (the design) to control.
- Interface Control = .15. Why? We must emphasize definition and control of interfaces now.
- Formal Inspections = .1. Why? We must inspect the SSS, SRS, IRS, SDD, IDD, and DBDD.
- End-to-End Requirements Traceability = .15. Why? Controlling requirements is key to a successful program.

(A sketch of the weight arithmetic follows this slide.)

PRACTICUM 3
NOTE: PM Juggernaut was provided the contract modification that added our five Award Fee Areas.
See: Job Aids\Contract. Refer to the updated scenario and to the contract information in the Job Aids.
SEE JTAMS CONTRACT, SECTION H.6; AWARD FEE DETERMINATION
SECTION 4 – We have awarded a Cost-Plus-Award-Fee (CPAF) contract to JUGGERNAUT (see Job Aids) for the JTAMS effort. Using the contract, Task 2 is to: (1) identify the Government Software Quality Performance Statement (the goals of the program); (2) given Section H and the Section H.6.1 Risk Management Award Fee Determination (AFD) as an example, determine the Award Fee Area for Sections H.6.2, H.6.3, and H.6.4 based on the criteria listed; (3) for Section H.6.5, come up with your own Award Fee Area based on the Government software quality goals; (4) fill in the award fee table weights for Contract Performance Period 1 (some already completed) and Contract Performance Period 2, and explain why you gave the areas the weights you did. See the contract in the Job Aids. (LEAD ROLE: DPM JTAMS)
ANSWER: The weights will vary and there is no single correct answer, as long as the students can justify their weights.
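A minimal sketch of the weight arithmetic, assuming each period's five area weights must sum to 1.0 (Period 1's do, which is how the Period 2 traceability weight of .15 shown above is derived). The fee pool and evaluation scores are hypothetical placeholders, not values from the JTAMS contract.

```python
# Sketch of award-fee weight checks and fee computation. The assumption that
# weights sum to 1.0 matches Period 1; pool and scores are hypothetical.
period1 = {"Risk Mgmt": 0.5, "CM": 0.2, "Interface Ctl": 0.0,
           "Formal Insp": 0.0, "Reqts Trace": 0.3}
period2 = {"Risk Mgmt": 0.3, "CM": 0.3, "Interface Ctl": 0.15,
           "Formal Insp": 0.1, "Reqts Trace": 0.15}

for name, weights in (("Period 1", period1), ("Period 2", period2)):
    assert abs(sum(weights.values()) - 1.0) < 1e-9, f"{name} weights must sum to 1.0"

# Earned fee for a period = pool * sum(weight * area score), scores in [0, 1].
pool = 1_000_000                      # hypothetical award-fee pool for the period
scores = {area: 0.8 for area in period2}   # placeholder evaluation scores
earned = pool * sum(period2[area] * scores[area] for area in period2)
print(f"Period 2 earned fee: ${earned:,.0f}")
```

The check makes the shift in emphasis visible: weight moved out of up-front risk management and into configuration management, interface control, and inspections once there were post-PDR products to control.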

9 JTAMS PRE-CDR IT/SIS ANALYSIS
PRACTICUM 3
JTAMS PRE-CDR IT/SIS ANALYSIS
19 October FY3
PM JTAMS / PM JUGGERNAUT
Agenda: Presentation Intro/Overview; Key Software-Related Questions for CDR; Measures Assessment; Data Rights Analysis; Award Fee Criteria Analysis (JTAMS); Juggernaut's Award Fee Analysis; Software Development Planning; AMRU's Development Approach; MRES Risk Assessment; MRES Measures Assessment; Summary – Are We Ready for CDR? (Joint Brief)
(The student instructions and instructor notes for this slide are the same as on slide 1.)

10 Juggernaut’s Award Fee Analysis
Section H.6.1 Risk Management – JSSD Plan to Support:
- Create a Risk Management Plan approved by PM JTAMS and placed under JSSD CM control.
- Emphasize our Employee Risk Identification Award Plan.
- Assign a senior JSSD employee as Risk Manager.
- Assign a JSSD lead for each risk area.
- Critical-path risks identified and reported daily.
- Risk mitigation activities are part of all of our processes.
- A Risk Control Profile is filled out for all medium and high risks.
- Weekly updates to cost and schedule plans based on risk assessments.
- The formal Risk Management Plan includes a provision for rewarding employees who directly help minimize risk.

PRACTICUM 3
ANSWER: SEE JTAMS CONTRACT, SECTION H.6.1; RISK MANAGEMENT

11 Software Development Plan
Identify the purpose of the Software Development Plan (SDP) and who produces it.
Purpose: Provides the acquirer insight into, and a tool for monitoring, the processes to be followed for software development, the methods to be used, the approach to be followed for each activity, and the project schedules, organization, and resources.
Produced by: The developer produces the SDP and aligns it with the developer's Systems Engineering Management Plan (SEMP).

Provide an overview of AMRU's approach to software development:
- AMRU uses Agile methods (a combination of Scrum and XP, in particular) for all of its software development activities, including those for building the STIM modeling product for MRES.
- We break the work into small batches; for each batch of work (called a release), we use time boxes (three weeks, in our case) to focus on evolving a small part of the product from requirements to tested code. We call the time box a sprint.
- We don't specify the details of ALL the requirements for the product at the beginning of the project. We lay out the "big" requirements (we call them epics) in a Release Plan; a release is made up of 3 sprints of 3 weeks each (a release-cadence sketch follows this slide).
- Our teams are cross-functional, and our teams work on only one project at a time. An exception is certain specialty functions, like user experience designers, who join a team when needed.
- We build the evolving system every night and run automated regression tests to ensure that the prior day's work hasn't "broken" anything.
- We use automated tools tuned to Agile practices for recording requirements (stories), managing tasking, and visualizing progress.

PRACTICUM 3
See: Job Aids\Software Development
This is the software architecture as identified in the SEP.
Tasks: Identify the purpose of the SDP and who produces it. Provide an overview of AMRU's approach to software development. Explain AMRU's expected benefits or rationale for using this approach. Answer the following questions concerning AMRU's approach to development:
- What concerns does V-Robotics have concerning its subcontractor AMRU?
- How is requirements management affected by AMRU's software methodology?
- How is requirements traceability affected by AMRU's software methodology?
MRES Risk Assessment: Assess the risks AMRU has identified. Using the risk management cube methodology, identify 2 additional risks to JTAMS as a result of AMRU's software development approach.
MRES Measures Analysis: Using the cumulative flow diagram contained in AMRU's release report, identify the work completed, work in progress, and the work remaining on 8/1/02.

Explain AMRU's expected benefits or rationale for using this approach:
Rationale/benefit: The result of this way of working is that we produce working software earlier than is typical in traditional DoD settings. We know that customers need visualizations of progress. To that end, we use a visualization called a Cumulative Flow Diagram. It tracks progress from requirements receipt to completion of integration testing.
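AMRU's stated cadence (3-week sprints, 3 sprints per release) reduces to simple date arithmetic; the sketch below lays out one release, with a hypothetical start date standing in for whatever the real Release Plan uses.

```python
# Date arithmetic for AMRU's stated cadence: 3-week sprints, 3 sprints per
# release. The start date is a hypothetical example, not from the scenario.
from datetime import date, timedelta

sprint_len = timedelta(weeks=3)
sprints_per_release = 3
start = date(2002, 6, 3)  # hypothetical release start

for s in range(sprints_per_release):
    begin = start + s * sprint_len
    end = begin + sprint_len - timedelta(days=1)
    print(f"Sprint {s + 1}: {begin} .. {end}")

print(f"Release complete: {start + sprints_per_release * sprint_len}")
```

The fixed nine-week rhythm is what lets AMRU promise working software at predictable intervals, and it is also why requirements further down the queue remain at a higher level of abstraction until their sprint approaches.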

12 AMRU’s approach to development
What concerns does V-Robotics have concerning its subcontractor AMRU?
- Uncertainty that AMRU can "keep to our promised schedule" (A).
- V-Robotics has not "received any input from AMRU" (A).
- Lack of visibility into "how requirements are being managed and traced" (B).

How is requirements management affected by AMRU's software methodology? How is requirements traceability affected by AMRU's software methodology?
- High-level requirements are identified and incorporated into a release plan. Requirements are decomposed close to the time scheduled for them to be incorporated into a development cycle (sprint). The order in which the requirements are worked depends on their value to the customer and on technical dependencies. (White Paper)
- Typical documentation, such as a traceability matrix and Software Requirements Specifications, is updated each sprint to reflect the software completed to date, but is not finalized until the last release.

PRACTICUM 3
See: Job Aids\Measures\Agile – MRES Measures for Section 8. The answers to these questions come from a review of the correspondence in the Job Aids.
Instructor notes: One of the biggest issues in Agile software development is that, because of the incremental, iterative nature of the small batches of software that move through the development system, requirements are at different levels of abstraction depending on where they are in the build queue. Software elements that are scheduled for near-term development are at lower levels of detail. Software elements that are lower in the development queue are at higher levels of abstraction than is typical during the earlier phases of a waterfall development. This is a normal aspect of Agile development, but one that is difficult to deal with if the rest of the development is using waterfall approaches. This section of the questions focuses on some of the issues that acquirers face when one of the elements being developed is using Agile while the main acquisition is waterfall. If you only read the correspondence, without the understanding provided by the Release Report, you may think that AMRU is trying to avoid meeting V-Robotics' requirements. However, during this time period, AMRU is already producing early iterations of software that is available for V-Robotics to review, integrate, and test.

13 Question 3 - MRES Risk Assessment
Description of Risks – Question 3: MRES Risk Assessment
Identify the risks AMRU has identified. Using the risk management cube methodology, discuss AMRU's likelihood and consequence assessment; then provide your own risk statement and assessment (place an additional star on the cube(s)), along with your rationale. (Note: see the Release Report.)

AMRU-Identified Risks and Risk Cubes (revised if-then statement and rationale for "star" placement):

Risk FY02-1
If-Then Statement: If V-Robotics and the Program Office are unfamiliar or uncomfortable with our Agile development approach, then more time may be required to explain the process and how it fits into the bigger acquisition picture.
Rationale for "star" placement on the cube:
- Training will be required for all involved.
- Schedule slippage from new involvement and no prior experience.
- However, this is an administrative risk.

Risk FY02-13
If-Then Statement: If OBD interface specs are not provided to AMRU, then rework may be needed if assumptions prove to be wrong.
Rationale for "star" placement on the cube:
- This is very likely, given the interface challenges.
- We are fairly confident there will be only minor mods required to the interface.

PRACTICUM 3
See: Job Aids\Measures\Agile – MRES Measures for Section 8. The answers are included in the Job Aids. The intent is to have the students explain their rationale for where they placed the risk assessment star on the risk cube. One learning point is that the interface risk assessment for the STIM may be different than the assessment for the overall program. There is no right answer (on the risk cube) as long as the students can justify their answers.

14 Question 4 (MRES Risk Assessment)
Description of Risks – Question 4: MRES Risk Assessment
Using the risk management cube methodology, identify 2 additional risks to JTAMS as a result of AMRU's software development approach, with a likelihood/consequence assessment for each. (A cube-mapping sketch follows this slide.)

Risk 1
If-Then Statement: If AMRU delivers software prior to other development activities, then rework may be required as requirements change, impacting cost and schedule.
Rationale for "star" placement:
- Rework will be needed, causing delay.
- Postponing overall capability delivery may be necessary.
- Additional time to test the software requirements will be needed to make sure that all requirements are met.

Risk 2
If-Then Statement: If interface definition decisions continue to be delayed, then development of the STIM model will be further delayed, resulting in cost, schedule, and performance impacts.
Rationale for "star" placement:
- Rework time will be increased, causing schedule variance.
- Testing of the software requirements may cause schedule delays if the interfaces are not compatible.
- Overall performance of the MRES capability may be impacted.

[Figure: two risk cubes, one per risk, each with likelihood and consequence axes.]

PRACTICUM 3
See: Job Aids\Measures\Agile – MRES Measures for Section 8. The answers are included in the Job Aids. The intent is to have the students identify some potential additional risks and explain their rationale for where they placed the risk assessments on the risk cube. There is no right answer (on the risk cube) as long as the students can justify their answers. The additional risks can be identified in the Job Aid "Programmatic Implications of AMRU CDRL Tailoring":
- AMRU's measurements on definition, inspection, and delivery of software requirements statements will be out of synchronization with V-Robotics and Juggernaut.
- Some of their requirements will be at the expected level of detail, but others will be at a much higher level of abstraction at any particular point in time.
- AMRU will have prototype software prior to other elements of MRES, which increases the risk of AMRU having rework due to decisions made later in other elements of the program.
- Late decisions on interface definition will have a greater impact on AMRU than on other elements, because a large number of the interfaces affect the STIM model.
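The "star placement" discussion boils down to mapping a (likelihood, consequence) pair onto a cell of the cube. The sketch below encodes one common reading of a 5x5 DoD-style risk matrix; programs tailor the exact cell boundaries, so treat the thresholds as an assumption, and the example placements for Risks 1 and 2 are hypothetical illustrations, not the school solution.

```python
# One common reading of a 5x5 likelihood/consequence risk matrix. Programs
# tailor cell boundaries, so these thresholds are an assumption.
def risk_level(likelihood: int, consequence: int) -> str:
    """likelihood and consequence rated 1 (lowest) to 5 (highest)."""
    score = likelihood + consequence
    if (likelihood >= 4 and consequence >= 4) or score >= 9:
        return "High"
    if score >= 6:
        return "Moderate"
    return "Low"

print(risk_level(2, 3))  # Low
print(risk_level(3, 4))  # Moderate  (hypothetical placement of Risk 1)
print(risk_level(4, 5))  # High      (hypothetical placement of Risk 2)
```

Encoding the cube this way also makes the learning point concrete: the same risk can land in different cells depending on whose consequence scale is used, which is why the STIM-level assessment may differ from the program-level one.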

15 Question 5 (MRES Measures Analysis)
Using the cumulative flow diagram contained in AMRU's release report, identify the work completed, work in progress, and work remaining on 8/1/02. (A CFD-construction sketch follows this slide.)
[Figure: "AMRU Block 1 Implementation Progress" cumulative flow diagram, annotated with work remaining, work in progress, and work complete.]

PRACTICUM 3
See: Job Aids\Measures\Agile – MRES Measures for Section 8. The students have the diagram in the Job Aids, along with an explanation of what each part of the graph indicates; the goal is to ensure they understand by pointing out the areas indicated.
Instructor notes: See the two job aids in the student files – the Agile & Metrics presentation from the SEI, and the "How to Build a Cumulative Flow Diagram" PDF – for the basics of how a cumulative flow diagram represents progress over time. In the diagram above, the blue band indicates the stories that are "ready" for development; while they are in the blue state, they are waiting to be worked. The red band represents the time that stories are in development. The green band represents time spent in test, and the purple band represents the work that is completed and ready to be delivered to AMRU's customer, V-Robotics. There are many useful questions that CFDs stimulate. For example, what do the dips and hills in Ready (for development) stories indicate? In this case, it might have something to do with waiting for the interface data and then deciding to build stories with proxy data, knowing that rework may result from this decision (technical debt). When you look from left to right, at any particular point you can measure the life cycle of a story from ready-for-development through delivery to V-Robotics. Questions that arise from this viewpoint include: why is there variation in the time that stories spend in development and in test? The main purpose of this representation is to generate conversations about progress. Like most progress visualizations, it generates more questions than answers. Answering the questions is part of the dialogue that should be occurring between developers and acquirers.
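For readers who want to see how the bands of a CFD are constructed, here is a minimal matplotlib sketch using made-up story counts; the real counts come from AMRU's release report. Stacking completed work at the bottom means a vertical slice at any date (such as 8/1/02) reads directly, bottom to top, as work complete, work in progress (development and test), and work remaining.

```python
# Minimal cumulative flow diagram sketch. Story counts per state are
# hypothetical; AMRU's release report supplies the real data.
import matplotlib.pyplot as plt

weeks = list(range(1, 9))
# Stories in each state at the end of each week (hypothetical data).
ready    = [20, 18, 15, 14, 12, 10,  8,  6]   # waiting to be worked
in_dev   = [ 5,  7,  8,  8,  7,  7,  6,  5]   # in development
in_test  = [ 0,  2,  4,  5,  6,  6,  6,  5]   # in test
complete = [ 0,  1,  3,  5,  8, 12, 17, 22]   # done, ready to deliver

# Stack bottom-up, completed work first, so a vertical slice at any date
# shows done / in progress / remaining, and the top edge is total scope.
plt.stackplot(weeks, complete, in_test, in_dev, ready,
              labels=["Complete", "Test", "Development", "Ready"])
plt.xlabel("Week")
plt.ylabel("Stories")
plt.legend(loc="upper left")
plt.title("Cumulative flow (illustrative)")
plt.show()
```

Band thickness is the work-in-process in that state, and the horizontal distance across the in-progress bands approximates a story's cycle time, which is what the instructor notes' "look from left to right" reading exploits.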

16 Summary – Are We Ready for CDR?
Completion of the CDR should provide the following:
(1) An established system initial product baseline. Status: Not complete yet; still waiting on interfaces.
(2) An updated risk assessment for EMD. Status: Risk mitigation plans in place.
(3) An updated CARD (or CARD-like document) based on the system product baseline. Status: CARD estimate completed.
(4) An updated program development schedule. Status: Done.
(5) An approved Life-Cycle Sustainment Plan.

Government and Juggernaut Assessment: Are we ready for CDR? NO (or Yes, depending on the level of risk we are willing to accept with the interface problem; students could argue that we could move forward, since the scenario says we have a build-to design).

PRACTICUM 3 – (JOINT TASK: GOV + CONTRACTOR)
For this point in the development effort, summarize our readiness for CDR. (LEAD ROLE: PM JTAMS, in close coordination with the JUGGERNAUT PM)
ANSWER: Read the student situation: we are waiting on interface definitions; that is the showstopper.
Completion of the CDR should provide the following:
(1) An established system initial product baseline. Status: Still being worked. Note: external interfaces are pending final decision.
(2) An updated risk assessment for EMD. Status: Completed.
(3) An updated CARD (or CARD-like document) based on the system product baseline. Status: Submitted for approval.
(4) An updated program development schedule.
(5) An approved Life-Cycle Sustainment Plan. Status: Approved.
Juggernaut Assessment: Are we ready for CDR? No; we need another risk review of the initial product baseline.

17 PRACTICUM 3 LO’S: Description LO’s Slide
TLO 2.1.X TLO: Given a notional Information Technology (IT) system, analyze the needed documentation and decisions necessary for an Information Technology (IT)/Software Intensive System (SIS) to support a Milestone decision. ELO Given a DoD acquisition scenario, asses required aspects of DoD SE technical reviews, technical processes, and technical management processes. 2, 3, 11 ELO Given a DoD acquisition scenario, apply System Engineering Lessons Learned and Best Practices when dealing with software. ELO Apply the five components of the DoD Risk Management Process Model to a given IT acquisition scenario 11-13 ELO Given a description of an IT acquisition need, assess documentation to ensure specific information provides a clear understanding of the intended IT acquisition objectives from program outset. 6, 8-10, 11 ELO Given a DoD acquisition scenario, apply laws and DoD directives to ensure successful IT systems management throughout the acquisition life cycle. 6, 8-10 ELO Given an IT acquisition scenario, identify the most appropriate software methodology (or a combination of methodologies) to meet the expectations of the government. 9, 10 ELO Identify the key principles of an agile software development process ELO Given an acquisition scenario, identify key issues with using agile software development processes in a DoD environment 9-12 ELO Given an IT acquisition scenario, assess the components of the contractor’s Software Development Plan (SDP). ELO Given an IT acquisition scenario, apply the measurement results to support IT program decisions. 4 ELO Given a scenario, identify program IT decision information requirements. ELO Create a set of program IT measures that are linked to the program decision information requirements. ELO Given an IT acquisition scenario, recommend an incentive structure. 6 ELO Given an IT acquisition scenario analyze different types of data rights 5 ELO Given a software acquisition scenario, recognize the preferred method for identifying and tracking defects ELO Given acquisition scenarios with several software-reliant program issues, analyze how each issue may affect achieving the program's software quality objectives 4, 6 PRACTICUM 3

