Critical Design Review November 2015


SAM 301 Critical Design Review, November 2015

Critical Design Review

#  Question                                                        Y/N  Slide?
1  Are action items from PDR complete?                             Y
2  Are detailed summaries of each module/storyboards complete?          *
3  Have any copyright issues developed?                            NA
4  Is all required media (articles, videos, etc.) available?
5  Are draft quizzes, tests and associated rubrics complete?
6  What is the course transition strategy?
7  Is the instructor pilot scheduled? Who will attend and where?   TBD

CDR Question 2: Are detailed summaries of each module/storyboards complete? Y

Focused on the new LOs contained in the Instructor ISP.

New ELO Mapping to Modules

Update Module 04 (Case Study):
- 10.1.1.4: Given an IT acquisition scenario, modify a software development capability release plan to increase the likelihood of success (on-time, on-schedule, and with the required functionality) in delivering an IT system. (BL 3)

Module 14:
- 18.1.1.5: Given a software system lifecycle approach, evaluate the effectiveness and efficiency of the approach over its lifecycle. (BL 5)

Build New Topic 07:
- 19.2.2.4: Adapt and improve the IT measurement process. (BL 3) (IRM304 Share)
- 19.2.2.1: Given a scenario, collect and process measurement and context-relevant IT management and technical data. (BL 3) (IRM304 Share)

Module 09:
- 19.2.2.3: Given a scenario, make actionable recommendations. (BL 3) (IRM304 Share)
- 19.2.2.2: Given a scenario, analyze the collected IT data with respect to the defined information needs. (BL 4) (IRM304 Share)
- 19.1.2.5: Given a scenario, evaluate the effectiveness of an IT measurement and analysis program. (BL 5)

CDR Question 6: What is the course transition strategy? TBD *

Primary changes:
- Mod 4: F18 Case Study
- Mod 7: New Measurement Lesson
- Mod 9: UUV Case Study
- Mod 14: GBS Case Study

Other changes:
- Mod 2: Software Acquisition Initiatives
- Mod 6: EFV Case Study
- Mod 5, 9, 10, 11, 13: Review and Update Existing Topic
- Mod 12: TPM Case Study

Transition strategy for targeted updates

Monday
  Pre-Assignment  (change welcome message)
  0800-1000  01-Introduction
  1000-1130  02-Topic: Software Acquisition Challenges (combine old Mods 2 & 3, update)
  1130-1230  Lunch
  1230-1300  03-Topic: Critical Thinking and Problem Solving
  1300-1400  Learning Team Discussion (F18)
  1400-1530  04-Case: F18 Software Release Plan (update/clean up file)
  1530-1700  Reflection & Reading Assignments

Tuesday
  0800-0830  Daily Startup
  0830-0900  Learning Team Discussion (UUV)
  0900-1130  05-Topic: Human Capital Issues (review/update topic for currency)
  1230-1400  06-Case: EFV Training (clean up case study file)
             07-Topic: SW Program Success (build new lesson)

*Red indicates primary changes affected by the new ELOs; green indicates secondary changes.

Transition strategy for targeted updates

Wednesday
  0800-0830  Daily Startup
  0830-0900  Learning Team Discussion (EFV)
  0900-1130  08-Topic: Software Quality (review/update topic for currency)
  1130-1230  Lunch
  1230-1400  09-Case: UUV Metrics (update/clean up file)
  1400-1530  10-Topic: Requirements Management
  1530-1700  Reflection & Reading Assignments

Thursday
             Learning Team Discussion (TPM)
             11-Topic: Software Development
             12-Case: TPM (clean up case study file)
             13-Topic: Technology Advancement

Friday
             Learning Team Discussion (GBS)
  0900-1030  14-Case: GBS Lifecycle (update/clean up file)
  1130-1200  15-Topic: Next Steps
  1200-1230  Graduation

*Red indicates primary changes affected by the new ELOs; green indicates secondary changes.

CDR Question 5: Are draft quizzes, tests and associated rubrics complete? Y

Rubrics:

SAM-301 Grading

- Case Study Analysis (40 points): questions at the end of each case study help guide the analysis
- Learning Team Participation (30 points): peer evaluation form (Friday AM)
- Large Group Participation (30 points): instructor evaluation

70 points required to pass.
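For illustration only, a minimal sketch of this grading arithmetic as a pass/fail check; the function name and example scores are hypothetical, not part of the course materials.

```python
# Illustrative sketch of the SAM-301 grading arithmetic described above.
# The helper function and example scores are hypothetical.

PASS_THRESHOLD = 70  # points required to pass, out of 100

def course_result(case_study: int, learning_team: int, large_group: int) -> str:
    """Sum the three graded components (max 40 + 30 + 30 = 100 points)
    and compare the total against the 70-point passing threshold."""
    assert 0 <= case_study <= 40
    assert 0 <= learning_team <= 30
    assert 0 <= large_group <= 30
    total = case_study + learning_team + large_group
    return f"{total}/100 -> {'PASS' if total >= PASS_THRESHOLD else 'FAIL'}"

# Example: strong case study analysis, average participation scores.
print(course_result(case_study=30, learning_team=20, large_group=20))  # 70/100 -> PASS
```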

1-Case Study Analysis (40 points)

How you answer the homework questions; 1-2 pages submitted at the end of the learning team discussion. +10 points for completion and submitting on time.

Case Study Analysis Rubric:
- +10 points: weakly applies functional knowledge relevant to the case, making incorrect statements; shows a lack of familiarity with basic facts by making incorrect statements.
- +20 points: correctly applies functional knowledge to case parameters; evidences understanding of basic facts and key issues.
- +30 points: in addition to the +20-point requirements, integrates facts and identifies implications; explicates relationships among issues.

2-Learning Team Participation (30 points)

Your contribution to the Learning Team effort. Peer evaluation assessment on Friday AM.

Learning Team: CONFIDENTIAL STUDENT RECOGNITION

3-Large Group Participation (30 points)

[Instructor scoring rubric graphic: three 10/20/30-point scales]

Risk Register

Item  Risk              Severity  Mitigation
1     SME Availability            Scheduled deliverables
2

Action Register

Item  Action                                               Due       Assigned
1     Agreement by sponsor on course description                     Cooke/Skertic
2     Projected course throughput                                    Skertic
4     Allocate LO's to modules                             Complete  Cooke
3     Validate File Structure on Blackboard (stud focus)   10/31     Cooke/Debra
5     Find DBS Case Study                                  11/31
6     Update Course Intro Message
7     Build Measurement Topic Lesson                       11/30     TBD
8     Build SW Acquisition Challenge Topic Lesson

Backup

Course Description Software Acquisition Management (SAM) 301 is a case-based course for senior managers who acquire, engineer, test, and evaluate DoD software-intensive systems. SAM 301 is also for acquisition professionals interested in obtaining comprehensive insight into the risks and issues associated with developing and implementing complex DoD software systems. Case study analysis, topical area discussion, and subject matter expert presentations are used to cover topics related to the planning, management and sustainment of software systems.

ELO 10.1.1.4: Given an IT acquisition scenario, modify a software development capability release plan to increase the likelihood of success (on-time, on-schedule, and with the required functionality) in delivering an IT system. (BL 3)

MAJOR TAKEAWAYS
- There is rarely a single "correct" release plan.
- Establish time-boxed release schedules (fix cost and schedule, and push capability to the next release).
- As more information is gathered on the project, your release plan will likely change to meet the needs of the technology and/or the customer; this is a good thing.
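To make the time-boxing takeaway concrete, here is a minimal hypothetical sketch: each release's cost and schedule are fixed, so capabilities that do not fit are pushed to the next release rather than slipping the date. The backlog items, effort figures, and capacity are all invented for illustration.

```python
# Hypothetical sketch of time-boxed release planning: cost and schedule are
# fixed per release, so capabilities that do not fit are deferred to the
# next release instead of slipping the date. All data here is invented.

def plan_releases(backlog, capacity_per_release):
    """Assign (name, effort) capabilities, in priority order, to fixed-size
    releases; anything that does not fit spills into the next time box."""
    releases, current, remaining = [], [], capacity_per_release
    for name, effort in backlog:
        if effort > remaining and current:
            releases.append(current)          # close out the full time box
            current, remaining = [], capacity_per_release
        current.append(name)
        remaining -= effort
    if current:
        releases.append(current)
    return releases

backlog = [("nav display", 8), ("datalink", 6), ("self-test", 5), ("logging", 3)]
for i, release in enumerate(plan_releases(backlog, capacity_per_release=12), 1):
    print(f"Release {i}: {release}")
# Release 1: ['nav display']; lower-priority items move to later releases.
```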

ELO 18.1.1.5: Given a software system lifecycle approach, evaluate the effectiveness and efficiency of the approach over its lifecycle. (BL 5)

MT 8. Considerations for recommended changes could include: the Software Development Plan (SDP), Post-Deployment Software Support (PDSS), Data Protection and Software Assurance, Software Data Management and Technical Data Rights, Software Reuse, Software Acquisition and Sustainment Costs, Software Safety, the use of Modular Open Systems, and a documented software architecture.

ELO 19.2.2.1: Given a scenario, collect and process measurement and context-relevant IT management and technical data. (BL 3) (IRM304 Share)

MAJOR TAKEAWAYS
- Data must be collected, in accordance with the specified measures, to support measurement analysis based on the established information needs. Data is usually collected on a periodic basis.
- Only the data necessary to satisfy the defined information needs should be collected. Both quantitative attribute data and relevant program/technical context data are required.
- The data comes from multiple sources, in multiple formats. It must be evaluated for availability, integrity, and usability.
- Data must be normalized and aggregated in accordance with the measurement specifications and measurement plan.
- Data should be stored in an accessible data repository; an Excel workbook is sufficient for many applications.
- Historical data, plans, and actual data are collected. Attention should be paid to changes in plans over time.
- Low-level data should be collected to allow localization of problems and detailed analysis.
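A minimal sketch of the collect/normalize/aggregate flow described above, assuming two hypothetical data sources with different record formats; the source names, field names, and normalization rule are invented, not taken from the course or any specific measurement toolset.

```python
# Hypothetical sketch of the collection step described above: raw records
# arrive from multiple sources in different formats, are normalized to a
# common schema, then aggregated per period into a simple repository
# (a dict standing in for the Excel workbook mentioned in the slide).
from collections import defaultdict

raw_records = [
    {"src": "tool_a", "month": "2015-09", "defects": "14"},   # counts as strings
    {"src": "tool_b", "period": "Sep-2015", "defect_count": 9},
    {"src": "tool_a", "month": "2015-10", "defects": "11"},
]

MONTHS = {"Sep": "09", "Oct": "10"}

def normalize(rec):
    """Map each source's format onto a common (month, defects) schema."""
    if rec["src"] == "tool_a":
        return rec["month"], int(rec["defects"])
    mon, year = rec["period"].split("-")          # "Sep-2015" -> "2015-09"
    return f"{year}-{MONTHS[mon]}", rec["defect_count"]

repository = defaultdict(int)                     # period -> aggregated count
for rec in raw_records:
    month, defects = normalize(rec)
    repository[month] += defects

print(dict(repository))   # {'2015-09': 23, '2015-10': 11}
```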

ELO 19.2.2.2: Given a scenario, analyze the collected IT data with respect to the defined information needs. (BL 4) (IRM304 Share)

MAJOR TAKEAWAYS
- Analysis includes estimation, feasibility analysis, and performance analysis.
- An indicator is a primary analysis product. It is a measure that provides an estimate or evaluation of specified attributes with respect to an information need. It includes one or more values of base and/or derived measures, along with the decision criteria used to assess the indicator value. Indicators support all three types of analysis.
- Indicators are systematically generated, analyzed, and reviewed to: (1) produce assessments relative to known information needs, and (2) identify new information needs (problems, risks, lack of information).
- Indicators include: (1) pre-defined "recurring" indicators that address identified information needs, and (2) "as required" indicators needed to address new questions or to localize problems.
- Estimation provides expectations of key project and enterprise performance parameters, allows evaluation of the feasibility of plans, projects end-item results based on performance to date, helps evaluate risk, and establishes enterprise performance baselines.
- Feasibility analysis is an evaluation of whether plans are realistic and achievable. It helps to define alternatives and identify risks. Feasibility analysis includes comparisons of project parameters and consistency of assumptions and adjustments. It establishes confidence in the plans.
- Performance analysis uses plans and actual data to monitor status and answer questions such as: Is the work tracking to the plan(s)? Is the variance significant? Performance analysis produces status information and exposes problems and risks. It includes analysis of leading indicators, critical path items, and inconsistent trends.
- Analysis must take into account the cause-and-effect relationships between key measurement information categories (integrated analysis).
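As a concrete illustration of an indicator with decision criteria, here is a minimal hypothetical sketch of a performance-analysis indicator comparing actuals to plan; the thresholds, color bands, and numbers are invented, not course-defined values.

```python
# Hypothetical sketch of a performance-analysis indicator: compare actual
# progress to plan and apply decision criteria (thresholds) to flag whether
# the variance is significant. Thresholds and data are invented.

def schedule_indicator(planned: float, completed: float,
                       caution: float = 0.10, alarm: float = 0.20) -> str:
    """Variance = shortfall against plan, as a fraction of the plan;
    the decision criteria turn the number into an assessment."""
    variance = (planned - completed) / planned
    if variance >= alarm:
        return f"RED ({variance:.0%} behind plan): action required"
    if variance >= caution:
        return f"YELLOW ({variance:.0%} behind plan): investigate"
    return f"GREEN ({variance:.0%}): tracking to plan"

# Example: 120 requirements planned complete to date, 98 actually complete.
print(schedule_indicator(planned=120, completed=98))  # YELLOW (18% behind plan)
```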

ELO 19.2.2.3: Given a scenario, make actionable recommendations. (BL 3) (IRM304 Share)

MAJOR TAKEAWAYS
- Insight into issues and objectives is generally improved by reviewing multiple related indicators together.
- The measurement results must be clearly presented to the decision makers in an understandable format.
- The measures and analysis results should be communicated to the stakeholder team.
- The measurement results must be interpreted within the context of the program: the objectives, assumptions, and constraints.
- The actions dictated by the measurement results may not be possible; recommendations may have to optimize within project or enterprise constraints.

ELO 19.2.2.4: Adapt and improve the IT measurement process. (BL 3) (IRM304 Share)

MAJOR TAKEAWAYS
- Both the measures and the measurement process must be regularly evaluated and updated.
- Measures and indicators must be evaluated to see if they provide usable decision information and are being used.
- The measurement process should also be evaluated for effectiveness and efficiency, and to see if the defined process is sufficient and being followed.
- Artifacts and observations from the measurement process should be shared. Lessons learned may be used to change the process or as a basis for additional training.
- Lessons learned may be implemented as improvements to the current project's or enterprise's measurement implementation, or for future projects.

ELO 19.1.2.5: Given a scenario, evaluate the effectiveness of an IT measurement and analysis program. (BL 5)

MT 26: Measurement analysis focuses on taking the measured IT parameter results and transforming them, through the use of various constructs, into information products that relate directly to one or more information needs.
- All information products are based on the measurement of key program/system attributes at a low level: base measures.
- Base measures are systematically combined, using consistently defined relationships, to quantify an IT activity or product. The measurement result is then compared against established decision criteria and presented as a measurement indicator.
- The measurement indicator conveys the measurement results to the decision maker; it usually compares the measured results to pre-established thresholds that determine the need for action.
- Integrated analysis combines multiple indicators and focuses on the cause-and-effect relationships inherent between IT performance parameters; integrated analysis helps to identify and correct performance-factor inconsistencies.

MT 27: Measurement-derived information must be coupled with program context information to interpret the numbers correctly.
- Measurement information products need to be understood and "usable" by both program and enterprise decision makers.
- Decision makers must understand the measures presented to them and the associated data and analysis. They have to be able to evaluate the limitations of the measurement results.
- Most program decisions are supported by multiple measures (tightly coupled attributes) and different types of information. There is no single measure that indicates IT program performance; there is no single number that indicates "good" or "bad".

MT 28: The decision maker may not always be able to make "fact-based" decisions; there are inherent limitations in every program environment.
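To make the base-measure/derived-measure/indicator construct of MT 26 concrete, a minimal hypothetical sketch using defect density as the derived measure; the counts and the decision criterion are invented for illustration.

```python
# Hypothetical sketch of the measurement construct in MT 26: two base
# measures are combined into a derived measure (defect density), which is
# compared against a decision criterion to produce an indicator. All
# numbers here are invented.

defects_found = 45          # base measure: counted from a defect tracker
size_ksloc = 30.0           # base measure: software size in KSLOC

defect_density = defects_found / size_ksloc   # derived measure: defects/KSLOC

THRESHOLD = 1.2             # decision criterion: invented, program-specific
indicator = "investigate quality" if defect_density > THRESHOLD else "within expectations"
print(f"Defect density: {defect_density:.2f} defects/KSLOC -> {indicator}")
# -> Defect density: 1.50 defects/KSLOC -> investigate quality
```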

Assessment Strategy
- Student homework graded for quality of analysis.
- Students evaluated by instructors for participation and quality of input during case study large-group discussions.
- Students evaluated by peers for quality of participation during small-group discussions and presentations.
- Students evaluated by instructors during daily reflection on guest speaker and discussion takeaways and their application to the students' jobs.