Overview of Fraunhofer Competencies in System Engineering Research
Dr. Forrest Shull, Division Director, Experience and Knowledge Management Division
Fraunhofer Center Maryland
October 2009. Copyright © Fraunhofer USA 2009.

Fraunhofer Center Maryland (FC-MD): Center for Experimental Software Engineering
A not-for-profit applied research & technology transfer organization
Founded in 1997 by Dr. Victor Basili and Dr. Marv Zelkowitz
Affiliated with the University of Maryland at College Park
One of six Fraunhofer USA Centers
Vision:
– Apply a scientific/engineering method to system and software engineering
– Utilize past results to guide development choices
– Use organizational learning as the key to improvement

Fraunhofer Center Maryland (FC-MD): Our Staff
~25 technical personnel:
– Scientists, many with significant track records of research and tech transfer
– Applied Technology Engineers, many with 20+ years of experience in government and commercial organizations
– Part-time university faculty (associate and adjunct professors), maintaining a tight connection with local university departments
– Interns: university students who want experience on cutting-edge problems as part of their studies

Fraunhofer Center Maryland (FC-MD): Partners & Customers
Mission: advance real-world practices via empirically validated research into system- and software-engineering technologies and processes.
We bridge research and application, working with:
– Government organizations (DoD, NASA, …)
– Large industrial companies (Boeing, Motorola, …)
– Small/medium-sized organizations (KeyMind, …)
– National labs (Los Alamos, …)
– Universities and research centers (CMU, USC, MIT, …)
– …and others (SEI, MITRE, JHU/APL, …)

Example of Competencies: SERC EM Task
Goals:
– Compare SEPAT effectiveness measures with the DAPS assessment methodology to determine gaps and coverage
– Provide constructive feedback to both model owners as appropriate
– Determine the significance of SEPAT effectiveness measures with respect to SADB negative findings
Example research methods: use highly experienced SMEs to
– Identify and select key topic areas
– Analyze both reference documents and create a comparison map
– Request SADB findings on key queries and map them to SEPAT

Example of Competencies: SERC EM Task
Results:
– SEPAT has "rolled up" many subtopics into fewer effectiveness areas; DAPS is both broader and more specific
– Several key concepts found in SEPAT seem to be absent, or used in a different manner, in the DAPS context, such as: negotiating roles & responsibilities; nominal/off-nominal; stakeholder concurrence/acceptance; feasibility evidence; timeboxing
– DAPS is very DoD-process oriented; as a result, its guidance on what the program should do, and when, is more specific (e.g., each section addresses Milestones A, B, and C as appropriate)
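The comparison-map analysis described above was performed by SMEs, not software, but its core logic can be pictured as a set-coverage computation. The sketch below is a minimal illustration with entirely hypothetical topic names and mappings; nothing in it reflects the actual SEPAT or DAPS content.

```python
# Illustrative sketch: gap/coverage analysis between two assessment frameworks.
# All topic names and mappings are hypothetical placeholders, not the real
# SEPAT/DAPS content; the actual comparison map was built by experienced SMEs.

# Hypothetical mapping from SEPAT effectiveness areas to DAPS subtopics.
sepat_to_daps = {
    "stakeholder concurrence": [],          # no DAPS counterpart found
    "feasibility evidence": [],             # no DAPS counterpart found
    "requirements management": ["req. elicitation", "req. traceability"],
    "risk handling": ["risk identification", "risk mitigation planning"],
}

# Hypothetical DAPS topic list, including DoD-process-specific items.
daps_topics = {
    "req. elicitation", "req. traceability",
    "risk identification", "risk mitigation planning",
    "Milestone A entry criteria",
}

covered = {s for s, d in sepat_to_daps.items() if d}   # SEPAT areas with a DAPS match
gaps_in_daps = set(sepat_to_daps) - covered            # SEPAT concepts absent from DAPS
mapped = {t for ds in sepat_to_daps.values() for t in ds}
gaps_in_sepat = daps_topics - mapped                   # DAPS topics SEPAT rolls up or omits

print(f"SEPAT concepts absent from DAPS: {sorted(gaps_in_daps)}")
print(f"DAPS topics unmapped in SEPAT: {sorted(gaps_in_sepat)}")
```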

Example of Competencies: SERC MPT Task
Goals:
– Explore MPTs & gaps with respect to agility and responsiveness to change, in context
– Conceptualize agile challenges along with:
  Themes: important components of a solution for each challenge
  Elements: key activities that must be completed to achieve each theme
  MPTs: existing sets of methods, processes, and tools that together provide an actionable way of achieving an element
Example research methods:
– Conducted a survey of organizations with relevant experience
– Qualitative analysis of responses to identify concrete suggestions, contexts where appropriate, and areas still missing concrete MPTs
Results: the bridge maps we are creating provide a way of encapsulating practical experience, both as to which MPTs can be recommended and which further research remains to be done.

Example of Competencies: SERC MPT Task
Results (cont.):
– Worked with the sponsor to understand context
– 100+ surveys analyzed
– Initial "bridge diagrams" allow recommending initial MPTs for a context
– Gap analysis allows discovery of areas requiring research to develop new MPTs
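As a rough illustration of the challenge → theme → element → MPT "bridge" structure described above, here is a minimal sketch in Python; all names are hypothetical placeholders. It shows how gap analysis falls directly out of the structure: any element with no MPTs attached is a candidate for new research.

```python
# Illustrative sketch of a challenge -> theme -> element -> MPT "bridge" map.
# All names below are hypothetical placeholders, not the study's actual content.
from dataclasses import dataclass, field

@dataclass
class Element:
    name: str
    mpts: list = field(default_factory=list)     # actionable MPTs, if any exist

@dataclass
class Theme:
    name: str
    elements: list = field(default_factory=list)

@dataclass
class Challenge:
    name: str
    themes: list = field(default_factory=list)

bridge = Challenge(
    "Responding to late requirements change",
    themes=[Theme(
        "Keep verification incremental",
        elements=[
            Element("Continuously re-baseline test plans",
                    mpts=["regression test selection", "automated test suites"]),
            Element("Re-verify safety properties after each change"),  # gap: no MPT
        ],
    )],
)

# Gap analysis: elements not covered by any existing MPT are research targets.
gaps = [e.name for t in bridge.themes for e in t.elements if not e.mpts]
print(f"Elements lacking MPTs (research gaps): {gaps}")
```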

Example of Competencies: DoD Best Practices Clearinghouse
Goals:
– Provide the systems engineering workforce with recommendations about applicable MPTs, based on experiential information from practice
– Communicate the effectiveness of SE MPTs based on cost, schedule, and quality impact on real projects
Example research methods:
– Exploring the use of multiple methods (interviews, surveys, questionnaires) to elicit experiential information
– Application of qualitative and quantitative reasoning to aggregate information about a practice's costs and benefits
– Developing information tagging by context, keywords, and ontologies
Results:
– A published website and a growing support team of contributors and SMEs (e.g., from DAU campuses, AFIT)
– ~10K page views / month

Example of Competencies: DoD Best Practices Clearinghouse
Overview of building content:
– Clearinghouse content starts with practices recommended by government and industry experts
– BPCh recommendations must be based on evidence from real programs: from publications; from interviews & feedback with users; from vetted expert guidebooks & standards
[Diagram: a practice (e.g., "Practice X") accumulates evidence entries, each recording its source, context, and results; the aggregated evidence determines the practice's maturity rating (Bronze, Silver, or Gold) and produces guidance of the form "Practice X has been successfully applied … Use it to … For more information click on the following links: …"]
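One way to read the diagram is as an evidence-aggregation rule: each practice carries evidence records (source, context, results), and the amount of vetted evidence drives the Bronze/Silver/Gold rating. The thresholds and field names in the sketch below are hypothetical assumptions for illustration; the Clearinghouse's actual vetting criteria were richer than a simple count.

```python
# Illustrative sketch of aggregating practice evidence into a maturity rating.
# The thresholds and fields are hypothetical, not the BPCh's actual criteria.
from dataclasses import dataclass

@dataclass
class Evidence:
    source: str    # e.g., publication, user interview, vetted guidebook
    context: str   # program/domain where the practice was applied
    results: str   # observed cost/schedule/quality impact

def maturity(evidence):
    """Map the amount of vetted evidence to a rating (hypothetical rule)."""
    if len(evidence) >= 4:
        return "Gold"
    if len(evidence) >= 2:
        return "Silver"
    return "Bronze"

practice_x = [
    Evidence("journal publication", "DoD acquisition program", "fewer late defects"),
    Evidence("user interview", "NASA flight software", "earlier issue discovery"),
]
print(f"Practice X maturity: {maturity(practice_x)}")   # -> Silver
```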

Example of Competencies: Software Safety Metrics
Goals:
– Provide quantifiable evidence of potential risk with respect to software safety, and of whether projects are addressing that risk
– The approach is applicable to other key properties of the system early in the lifecycle
Example research methods:
– Developing models that isolate specific deficiencies, codify the risks, and offer possible responses: identify data in the available artifacts to address the problem; identify gaps in the data and propose responses to those gaps
– Evaluating artifacts produced by the process over time, e.g., software-related hazard reports from multiple Constellation elements (Orion, Ares US, J-2X); monitoring areas of risk over time; not depending on designs or code alone to evaluate safety

Example of Competencies: Software Safety Metrics
Results:
– Modeling hazards: models of deficiencies in software risk causes (e.g., lack of specificity); currently extending the model to characterize software controls and verifications
– Modeling the evolution of software-related hazards to: provide snapshots of software risk areas; identify the highest-risk elements, systems, and subsystems; find the mission drivers with the highest software-related risk
– Developing metric templates for SR&QA personnel and vendors
– The analysis of hazard reports is applicable and relevant to NASA projects with a similar hazard analysis and reporting structure
– Future work will expand this approach to artifacts other than hazard reports
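As a toy illustration of the kind of metric such templates could define, the sketch below computes the share of software-related hazard reports whose cause is flagged as nonspecific, tracked as one indicator of risk. The record layout and the flag are hypothetical assumptions, not the actual Constellation data model.

```python
# Toy illustration of a hazard-report safety metric: the fraction of
# software-related hazard causes documented too vaguely to verify.
# The record layout and flags are hypothetical, not the actual data model.

hazard_reports = [
    {"id": "HR-001", "software_cause": True,  "cause_is_specific": False},
    {"id": "HR-002", "software_cause": True,  "cause_is_specific": True},
    {"id": "HR-003", "software_cause": False, "cause_is_specific": True},
]

sw_reports = [r for r in hazard_reports if r["software_cause"]]
vague = [r for r in sw_reports if not r["cause_is_specific"]]

# A rising ratio over successive report snapshots would flag a risk area.
ratio = len(vague) / len(sw_reports) if sw_reports else 0.0
print(f"Nonspecific software causes: {ratio:.0%} ({len(vague)}/{len(sw_reports)})")
```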

Example of Competencies: Improving Sys & SW Tech Reviews
Goals:
– Fundamental research aimed at applying a proven systems engineering practice to contemporary constraints and problems
– Work with many NASA missions aimed at integrating inspection best practices with program execution activities
– Further deployment to reach the NASA systems engineering workforce
Example research methods:
– Quantitative statistical testing based on data points
– Tailoring of the method to project environments, handling software- and system-level technical documents
– Encoding research results into tool support for use by NASA teams
Results:
– Concrete, practical information formulated and infused with teams
– Tool support developed and tested
– Inclusion in NASA's safety curriculum

Example of Competencies: Improving Sys & SW Tech Reviews
Applications across NASA:
– SSC: LH Barge Ladder Logic
– Wallops: SIPS Core Slaving Computation Process
– JSC: LIDS
– GSFC & MSFC: NPR procedural requirements for software engineering
– ARC: Simulink models for smallsat
– IV&V: James Webb Space Telescope

Example of Competencies: Model-Based V&V of Medical Devices
Goals:
– Demonstrate the feasibility & cost-effectiveness of an advanced MPT (model-based checking) for complex, safety-critical systems
Example research methods:
– Work with the customer on a real system to address scalability concerns
– Ensure integration with industry-standard modeling tools
Results:
– The practice is currently being piloted as part of the Generic Infusion Pump (GIP) project
– Models of a GIP have been constructed
– Safety requirements created from hazard analysis
– Safety requirements converted into monitor models (26 done so far)
– Verified against the GIP model

Example of Competencies: Model-Based V&V of Medical Devices
[Diagram: a monitor model is driven by signals obtained from the model under verification. It determines when the requirement is valid, computes the ideal output according to the requirement, and checks whether the ideal and expected outputs are equal; the requirement holds if this check yields the value 1 at all simulation time instants.]
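The diagram's check can be read as a simple temporal monitor: whenever the requirement is valid, the observed output must equal the ideal output the requirement prescribes, and the resulting verdict must be 1 (true) at every simulation instant. The sketch below shows that logic over discrete traces using hypothetical signal names; the actual monitors were models integrated with industry-standard modeling tools and checked against the GIP model, not Python code.

```python
# Minimal sketch of the monitor logic in the diagram: at every time step,
# if the requirement is applicable, the observed output must match the
# ideal output; the verdict must be 1 (true) at all simulation instants.
# Signal names and the example requirement are hypothetical placeholders.

def monitor(valid, ideal, observed):
    """Return per-step verdicts: 1 if the requirement holds (or is inactive)."""
    return [
        1 if (not v) or (i == o) else 0
        for v, i, o in zip(valid, ideal, observed)
    ]

# Hypothetical trace: e.g., "while an occlusion alarm is raised (valid),
# the pump's commanded flow rate (observed) must be 0 (ideal)".
valid    = [0, 1, 1, 1, 0]
ideal    = [0, 0, 0, 0, 0]
observed = [5, 0, 0, 2, 0]   # the fourth step violates the requirement

verdicts = monitor(valid, ideal, observed)
print(f"Verdicts: {verdicts}")                                # [1, 1, 1, 0, 1]
print(f"Requirement holds: {all(v == 1 for v in verdicts)}")  # False
```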

Summary
– Proven results in a variety of contexts, including safety- and mission-critical ones, and for a number of sponsors
– Effective at working with both researchers and engineers, and at communicating results that improve work in progress
– Central theme of results: learning from analysis, measurement, and experience

For More Info…
Contact info: Forrest Shull (240)

For More Info… Key pubs (excerpts)
SERC UARC:
– Pennotti, M., Turner, R., and Shull, F., "Evaluating the Effectiveness of Systems and Software Engineering Methods, Processes and Tools for Use in Defense Programs," Proc. IEEE International Systems Conference, Vancouver, BC, Canada, March 23-26.
DoD Best Practices Clearinghouse:
– Feldmann, R., Shull, F., and Shaw, M., "Decision Support for Best Practices: Lessons Learned on Bridging the Gap between Research and Applied Practice," Acquisition Review Journal, vol. 14, no. 1, February.
NASA Safety Metrics for Constellation:
– Contact Dr. Victor Basili
NASA Software & System Technical Reviews:
– Denger, C., and Shull, F., "A Practical Approach for Quality-Driven Inspections," IEEE Software, vol. 24, no. 2, March/April.
FDA Model-Based V&V:
– Penn Engineering Technical Report