Value of Systems Engineering


Value of Systems Engineering
Summary Report: SECOE and Related Projects
Eric Honour, INCOSE Director, Sponsored Research
Value of Systems Engineering; Summary Report 1/04

Agenda
Background
Heuristic claims of SE value
Gathered results on the value of SE:
NASA tracking, 1980s
"Boundary Management" study
"Large Engineering Projects" MIT study
"Impact of SE at NASA" (SECOE 02-02)
"Impact of SE on Quality & Schedule" Boeing study
"SE Effectiveness" IBM study
"Value of SE" research (SECOE 01-03)

"System Thinking" Design
Heuristic claim of SE: better systems engineering leads to better system quality/value, lower cost, and shorter schedule.
[Chart: risk vs. time for "Traditional Design" and "System Thinking" Design across the system design, detail, production, integration, and test phases; the gap between the two curves is labeled "Saved Time/Cost."]
These are well-known claims, but never supported to date with any real numbers; they are very similar to the concepts of the MIT Lean Aircraft Initiative. Traditional design allows risk at "acceptable" levels; the risk is reduced during integration and testing, and goes to zero at the end of the project. "System thinking" design spends more time at the systems engineering level, thereby reducing risk earlier and at less cost and time. No proof exists for these claims.

NASA Tracking 1980s
Source: Werner Gruhl, NASA Comptroller's Office
[Chart: cost overrun vs. definition-cost percentage for major NASA projects; data points are labeled with program names such as STS (Shuttle), HST, GOES I-M, COBE, SEASAT, Voyager, Ulysses, Pioneer/Venus, IUE, and ISEE, among others.]
The chart is transcribed from a manual chart created by Werner Gruhl in 1992, prior to his retirement. Each point is a major NASA project. The "definition" percent is based on phases A and B of the five-phase NASA project model. This chart shows how important it is to define the project before starting the detail work; the trend line shows that about 15% project definition appears to be optimum. Note the distinction from SE: this is "project definition," not SE. Project definition includes some tasks that are not SE, such as model prototyping and program management, while SE includes tasks in phases C, D, and E that are not covered in this chart.
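The trend the notes describe, overrun falling as the definition share grows toward an optimum, can be sketched as a simple binning analysis. This is a toy illustration with made-up numbers, not Gruhl's data or method:

```python
# Toy sketch of the kind of analysis behind the Gruhl chart: bin
# projects by the share of total cost spent on project definition,
# then compare the average cost overrun in each bin. All numbers
# here are made up for illustration; this is not NASA's data.

def bin_label(def_pct):
    """Map a definition-cost percentage to a coarse bin."""
    if def_pct < 5:
        return "<5%"
    if def_pct < 10:
        return "5-10%"
    if def_pct < 15:
        return "10-15%"
    return ">=15%"

def average_overrun_by_bin(projects):
    """projects: iterable of (definition_pct, cost_overrun_pct)."""
    groups = {}
    for def_pct, overrun in projects:
        groups.setdefault(bin_label(def_pct), []).append(overrun)
    return {b: sum(v) / len(v) for b, v in groups.items()}

# Hypothetical sample: (definition % of project cost, cost overrun %)
sample = [(2, 120), (4, 85), (7, 60), (9, 45),
          (12, 25), (14, 15), (16, 22), (20, 30)]
averages = average_overrun_by_bin(sample)
best = min(averages, key=averages.get)
print(best, averages[best])  # bin with the lowest average overrun
```

With this invented sample the lowest-overrun bin is 10-15%, echoing the chart's roughly 15% optimum; a real analysis would fit a trend line through the individual project points.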

"Boundary Management" Study
Study of 45 high-tech new product development teams (Ancona and Caldwell, Research Technology Management, 1990). A significant portion of time is spent at team boundaries.
Individual time spent: alone 48%, within team 38%, outside team 14% (the outside-team time was typically limited to a few individuals).
The study was performed in the late 1980s. The 45 teams were small teams, on the order of 5-15 people, developing specific commercial technical products. The purpose of the study was to determine how people spent their time, and how much of that time was spent working at the boundaries of the project.

Boundary Management in Technical Teams
Boundary management occurs in four different roles, and the level of effort in each role changes over time across the creation, development, and diffusion phases:
Ambassador: buffering, building support, reporting, strategy
Task Coordinator: lateral group coordination, information transfer, planning, negotiating
Scout: obtaining possibilities from outside; interface with marketing
Guard: withholding information, preventing disclosure
All of these roles coincide with many systems engineering management tasks.

"Boundary Management" Study
Significant findings (each followed by SECOE's SE-oriented interpretation, shown in red on the slide):
High-performing teams did more external interaction than low-performing teams: system technical leadership creates greater success.
Internal team dynamics (goals, processes, individual satisfaction) did not correlate with performance: process definition is important but not sufficient.
There is no surprise in the first finding, although it was generally a surprise to the rank-and-file team members when it was presented back to them; the typical response was to decry that "politics" seems to be more important than technical merit. The project leaders, however, who did most of the boundary management, were not at all surprised. The second finding is rather alarming to today's process-oriented world, for it seems to say that the processes were not as important as the right leadership. This echoes the primary criticism made of CMMs: that their focus is on the processes rather than on the results of the processes (i.e., the quality of the products themselves). Note, however, that this study looked only at small teams, where process can be assumed to be less important than on larger teams.

"Large Engineering Projects"
Study of 60 LEPs (power generation, transportation, oil production, technology), from The Strategic Management of Large Engineering Projects, MIT Press, 2000. Evaluation was by interviews and by objective and subjective quality measures.
[Chart: percent of projects meeting cost targets (82%), schedule targets (72%), and objective targets (45%); bands of 18% and 37% are marked "Failed!".]
The study was primarily in the engineering management realm. Note that most projects met cost goals, and a lesser "most" met schedule goals; note how few of the projects actually met their objectives. This shows the extreme influence of program management in large projects, the ultimate use of CAIV: an unbalanced set of priorities in which cost and schedule are far more important than achieving objectives.

"Large Engineering Projects"
Significant findings (each followed by SECOE's SE-oriented interpretation, shown in red on the slide):
The most important determinant was a coherent, well-developed organizational/team structure: a structure of leadership creates greater success.
Technical difficulties, social disturbance, and size were not statistically linked to performance; all projects had turbulent events.
Technical excellence could not save a socially unacceptable project: process definition is important but not sufficient.
The findings were mostly in the management realm rather than on technical processes. The "organization" referred to in the first finding is the macro-organization for the LEP, not the project technical organization. For LEPs, politics was *the* most important factor: getting the right buy-in from the right people made the LEP successful. Success of the projects was independent of the technical difficulties; it was far more dependent on getting the politics right. Contrast the last finding with the last finding of the "Boundary Management" study: for both very small projects (5-15 people) and very large projects (LEPs), it appears that technical process definition is less important. Speculation: perhaps it is only in the middle that it matters? Perhaps the individual phase work on an LEP *is* subject to process definition? Unknown, but there is certainly a large current momentum toward process solutions to quality/product problems that is so far contradicted by these findings.

Impact of SE at NASA (SECOE 02-02)
Survey research within NASA: a form with 38 questions, with answers on a graded scale. Typical questions:
On your most recent project, what percent of your total project cost was spent on systems engineering tasks?
On your most recent project, did systems engineering shorten or lengthen the planned schedule?
Aggressive survey pursuit with management push: NASA, 250 sent, 54% valid response; INCOSE, 650 sent, 38% valid response.
Engineering of Complex Systems: The Impact of Systems Engineering at NASA, A. K. P. Kludze, Jr., doctoral dissertation, George Washington University, 2003.

Response Demographics
Participating organization: NASA 136, INCOSE 243, total 379.
Education level: Bachelor 34%, Master 55%, Doctor 11%.
Work experience (years): 0-10 41%, 11-20 37%, 21+ 22%.
Age groups: 20-39 29%, 40-59 61%, 60+ 10%.
Job titles: SE 56%, PM 17%, other 27%.
There were significant differences by organization. The project had a good representation of demographics. Responses were tracked separately for NASA and INCOSE sources, seeking differences between the two populations. There were some significant differences in the demographics for work experience, age, education level, and job titles between NASA and INCOSE sources; these differences may affect the perceptive results as reported.

Key Survey Results - Cost
Percent spent on SE: respondents marked a bracket (0-5%, 6-10%, 11-15%, 16%+) to show the percent of total cost spent on SE on their last project. The mode was at 6-10% of project cost; few projects spent 11-15%; and there was an unexplained second mode at 16% or more (perhaps an interpretation of "project").
Cost benefit of SE: rated on a scale from Very Poor to Excellent. Respondents believed strongly in the cost benefit of SE, but in a secondary question few could quantify it.
All questions were multiple choice, most with suggested brackets. Respondents believed that their last projects typically spent 6-10% of total project cost on SE, although there is an unexplained second mode at "16% or more." (It is possible that this second mode resulted from an interpretation of "last project" that included many system definition projects; it is unlikely that a large number of end-to-end development projects spent over 16% on SE.) Respondents believed strongly that SE provided a good or excellent cost benefit to their projects, but few could quantify that benefit sufficiently to respond to a secondary question.

Key Survey Results - Schedule
At what stage is SE most effective? (Very early / Midway / End / No matter / No need): the vast majority of respondents believed that SE is most effective very early in a project.
Impact of SE on schedule (Shorten / Lengthen / Don't know): INCOSE respondents believed SE shortened the schedule on their most recent project; NASA respondents were uncertain. Secondary questions were uncertain in the quantification. Other results are available in the dissertation.
Respondents believed overwhelmingly that systems engineering is most effective when applied very early in a project; other effects on schedule were uncertain. The survey explored many other facets of systems engineering, including the documentation produced, overall perceptive impact, impact on meeting objectives, satisfaction, CASE tools, risk impact, and technical impact. The survey also includes a summary of several NASA case studies.

Impact of Systems Engineering on Quality and Schedule
Empirical evidence was obtained from three parallel (same-time) projects. Each developed a complex, robotic Universal Holding Fixture (UHF); each used a different level of SE; and the results are compared.

Trait              UHF1        UHF2       UHF3
Size               10' x 40'   8' x 50'   6' x 14'
Accuracy           ±0.005"     ±0.003"
Contact sensors    None        57         108
Vacuum sensors     1           70
Real-time checks   No          Yes
Probe contours
NC interface

Impact of Systems Engineering on Quality and Schedule: Empirical Evidence, W. Forrest Frantz, Boeing Corp., 1995.

Project Differences
Project trait (values across UHF1 / UHF2 / UHF3):
Sys mgmt experience: Low; Low-Medium
Subcontract approach: Design reviews; full-time SE on site
Access to SE support: High, but not used; High, used
Requirements approach: Token req's; Complete, detailed, integrated req's
Design approach: HW/SW specs, multi-org approach; Functional specs fully address HW/SW processes and interfaces
Functional adherence: Design docs took precedence, specs updated per design; Specs followed, CCB control
Design reviews: Weekly team reviews; Formal internal, little external; Formal internal and external
Integration approach: Patterned after design; Driven by functional specs, defined early in life cycle
Acceptance testing: Defined in high-level plan; Formal tests based on req's and functional specs

Impacts
Use of better SE reduced requirements-to-RFP time (weeks), design-to-production time (weeks), and overall development time (weeks). In other words, it reduced overall cycle time, time to create requirements, time to design and produce, and time to test, even in the face of more complex, higher quality systems.

Systems Engineering Effectiveness
Study of 8 software product development projects during an upgrade of SE processes. Determining Systems Engineering Effectiveness, Bruce Barker, IBM Commercial Products, Conference on Systems Integration, Stevens Institute, 2003. Evaluation was by cost and schedule against a standard estimating method:
New product concept -> identify affected components (per the line architecture) -> evaluate impact and complexity -> convert to "points" -> estimate cost and schedule from a historical database of cost per "point."
The costing method applies only to project management, business management, systems engineering, system integration, and delivery into production; application development costs are not included. The IBM Commercial Products division reported this study in early 2003. The division implemented new SE processes in a largely software-oriented group. The prior costing method for non-SW tasks is based on a "point" system, where points are determined by the attributes of the product to be created or modified; the division maintains a proprietary historical database of cost per "point."
© Copyright IBM Corp 2003. Used with permission.
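The estimating flow above can be sketched roughly as follows. The impact and complexity scales, their weights, and the component ratings are all hypothetical; only the points-times-historical-rate structure comes from the slide:

```python
# Rough sketch of point-based parametric costing as outlined on the
# slide: rate each affected component's impact and complexity,
# convert the ratings to "points", then price the points with a
# historical cost-per-point rate. The rating scales and weights
# below are invented for illustration.

IMPACT_WEIGHT = {"low": 1, "medium": 2, "high": 3}      # assumed scale
COMPLEXITY_WEIGHT = {"low": 1, "medium": 2, "high": 4}  # assumed scale

def points_for(components):
    """components: list of (impact, complexity) rating pairs."""
    return sum(IMPACT_WEIGHT[i] * COMPLEXITY_WEIGHT[c]
               for i, c in components)

def estimate_cost_k(components, k_dollars_per_point):
    """Estimated cost in $K from points and a historical rate."""
    return points_for(components) * k_dollars_per_point

# Hypothetical new-product concept touching three components.
concept = [("high", "high"), ("medium", "low"), ("low", "medium")]

# The slide's 2002 with-SE average of $944 per point is $0.944K/point.
print(points_for(concept))              # total points
print(estimate_cost_k(concept, 0.944))  # estimated cost in $K
```

The point here is only the shape of the method: points abstract the product's attributes, and the historical database supplies the rate, so the estimate is available very early in the project.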

Project Data
Project data covers eight projects that spanned the introduction of the new SE procedures:

Points    Cost ($K)   $K/Point
12,934    18,191      1,406
 1,223     2,400      1,962
10,209    11,596      1,136
 8,707    10,266      1,179
 4,678     5,099      1,090
 5,743     5,626        980
14,417    10,026        695
   929     1,600      1,739

SE costs on the projects that used the new SE procedures were 9.2%, 10.7%, 14.4%, 10.2%, and 16.0% of project cost.
$K/Point averages by year: 2000, $1,454/pt; 2001, $1,142/pt; 2002, $818/pt. Without SE: $1,350/pt; with SE: $944/pt.
The data shows a significant improvement in cost per "point" with the inclusion of the SE procedures.

Timeline of Projects
[Chart: cost per "point" by year, 1999-2002, for projects with and without SE, with yearly averages; milestones mark where the SE organization was created, the SE process was documented, and SE formal training started.]
As the systems engineering process was enabled and integrated through the organization, productivity increased. The chart shows graphically the improvement in cost per "point" over time as the SE procedures were implemented.

Systems Engineering Effectiveness
Significant findings:
Impact and complexity provide an effective method for parametric costing: early parametric costing works.
Preliminary data indicates that the use of systems engineering improves project productivity when effectively combined with the project management and test processes: systems engineering improves productivity.
$K/Point averages: 2000, $1,454/pt; 2001, $1,142/pt; 2002, $818/pt. Without SE: $1,350/pt; with SE: $944/pt.
The study evaluated the "point" costing method and found it to be reasonably reliable within risk bounds. Of more interest to SE, the costing method showed a significant reduction in cost per "point" when the SE processes were implemented. Note that this is the cost of the non-SW tasks, indicating that a good SE process significantly reduces the costs incurred in program management, business management, systems engineering, system integration, and delivery into production. On further study the authors expect that SW costs will also be found to have been reduced, adding to the benefit. The study includes only eight projects (eight data points) and so is possibly subject to variability issues; nonetheless, the improvement numbers are significant enough to outweigh any expected variability.

"Value of SE" (SECOE 01-03)
Multi-year effort to obtain statistical data correlating the amount and quality of SE with project quality and success, parameterized by technical "size," technical complexity, risk level, and development quality (a function of technical value, cost, schedule, and risk).
The effort started in 2001 with small seed money funded by INCOSE, gathering project data to quantify the value of SE in terms of quality, cost, schedule, and risk, parameterized by the amount and types of SE effort. The original goal was to obtain sufficient data to parameterize the systems engineering efforts by phase and by type of task. The reality has been that gathering even the basic top-level data has been so difficult as to preclude any effective stratification by phase or task.

Respondent Data
43 respondents (1 project not completed). Project values ranged from $1.1M to $5.6B; SE cost ranged from 0.3% to 26% of project cost.
Cost, schedule, and quality correlate better with "systems engineering effort": SEE = SE Qual * (SE Cost %).
Respondent data so far has all been subjective and anonymous, submitted in response to widely distributed forms. The data includes basic project data on project size, planned cost and schedule, actual cost and schedule, percent SE effort, and quality of SE effort. The median expenditure is 5% SE. The data correlation is better, however, when SE effort is qualified by the perceived subjective quality; when so qualified, the median expenditure is 2%.
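The slide's correlation metric can be written out directly. The 0.0-1.0 quality scale used here is an assumption; the source does not state the units of "SE Qual":

```python
# The slide's metric, as stated: SEE = SE Qual * (SE Cost %).
# The 0.0-1.0 quality scale is an assumption for illustration.

def systems_engineering_effort(se_quality, se_cost_pct):
    """SEE = SE quality rating times percent of cost spent on SE."""
    return se_quality * se_cost_pct

# A project spending 10% of total cost on SE, judged 0.8 SE quality:
print(systems_engineering_effort(0.8, 10.0))  # 8.0
```

The effect of the qualification is that a project spending heavily on poorly executed SE scores no better than one spending lightly on good SE, which is why the qualified metric correlates better with outcomes.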

Cost Overrun vs. SE Effort
[Chart: average cost overrun and 90% assurance (1.6) curves vs. SE effort.]
This is the companion data to the NASA study, but based on SE effort rather than on project definition. The NASA chart shows that 15% of the project should be spent on definition; this chart shows that about 15% of the project should be spent on SE tasks. Despite the similarity of the numbers, these are two completely different findings. Note the opportunity for Skinnerian (a la B. F. Skinner) behavior at low levels of SE effort: some projects with little SE do finish on or near cost goals, leading program managers to conclude that it *is* possible and that <2% *is* an appropriate level of SE effort. Ask any golfer: intermittent reinforcement is powerful training.
Source: SECOE 01-03, INCOSE 2003
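The two curves the chart pairs, an average overrun and a "90% assurance" bound, might be computed per SE-effort level along these lines; the method and numbers below are assumptions for illustration, not SECOE's published procedure:

```python
# Sketch of the two curves at one SE-effort level: the average cost
# overrun, and a "90% assurance" bound, taken here as the
# 90th-percentile overrun. Method and numbers are illustrative
# assumptions, not SECOE's published analysis.

def average(xs):
    return sum(xs) / len(xs)

def percentile(xs, p):
    """Nearest-rank percentile for 0 < p <= 100."""
    ordered = sorted(xs)
    rank = max(1, round(p / 100 * len(ordered)))
    return ordered[rank - 1]

# Hypothetical overrun factors for projects at one SE-effort level
# (1.0 = on budget, 1.6 = 60% over budget):
overruns = [1.0, 1.1, 1.2, 1.2, 1.3, 1.4, 1.5, 1.6, 1.6, 2.1]
print(average(overruns))         # mean overrun at this effort level
print(percentile(overruns, 90))  # 90%-assurance bound
```

Repeating this at each SE-effort level would trace out both curves; the assurance bound matters because the chart's notes emphasize the wide dispersion of the data, not just its average.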

Schedule Overrun vs. SE Effort
[Chart: average schedule overrun and 90% assurance (1.6) curves vs. SE effort.]
Companion data chart for schedule overruns.
Source: SECOE 01-03, INCOSE 2003

Test Hypothesis: Quality
Hypothesis: SE effort improves development quality.
The upper left chart shows that the data supports the original hypothesis: cost and schedule performance are improved (up to a point) by the use of more SE effort. (The vertical axis on this chart is the inverse average of cost and schedule overruns.) The lower right chart is an independent, subjective check on the hypothesis, based on the subjective inputs of respondents: their evaluation of "comparative success" on a scale of 0-10 also shows higher project success with greater SE effort.
Source: SECOE 01-03, INCOSE 2003
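The notes describe the vertical axis as "the inverse average of cost and schedule overruns." Reading that description literally as a formula (an assumption, not necessarily the exact published metric):

```python
# "Inverse average of cost and schedule overruns" read literally:
# quality = 1 / mean(cost_overrun, schedule_overrun). This reading
# is an assumption based on the notes, not a published definition.

def development_quality(cost_overrun, schedule_overrun):
    """Higher is better; 1.0 means exactly on cost and on schedule."""
    return 2.0 / (cost_overrun + schedule_overrun)

print(development_quality(1.0, 1.0))  # 1.0: on cost and on schedule
print(development_quality(1.5, 1.3))  # overruns pull quality below 1
```

A metric of this shape makes the hypothesis chart readable at a glance: a project on cost and on schedule scores 1.0, and any overrun on either axis drags the score down.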

Conclusions: "Value of SE"
SE effort improves development quality (cost, schedule, and subjective measures); the hypothesis is supported by the data.
The optimum SE effort is 10-15% or more: cost and schedule overruns are minimized there. However, note the wide dispersion of the data, and note that there are few data points at this level; most projects spent far less.
The quality of the SE effort matters: lower quality SE reduces effectiveness.
The findings are reasonably self-evident, but see the 2002 INCOSE symposium paper for a discussion of the limitations of the data.

Company Participation
A project benchmarking service funded by participants. Aggregated data is shared among participants; raw data is protected by data blinding.
[Diagram: participating companies submit blinded raw data to SECOE quarterly; SECOE returns project benchmarking data within 1 month and reported results to participating companies quarterly; aggregated "Value of SE" statistical data is released publicly every two years.]
This slide defines a mutually beneficial company participation in SECOE 01-03. Companies choose to participate in a project benchmarking service. For their participation, they receive comparative benchmarking reports on the selected projects as well as quarterly reports of the aggregated data from all participants. All raw data is heavily protected by a blinding system that prevents identification of the gathered data with any specific project or company. Aggregated data will not be provided until sufficient data is obtained to mask the effects of individual projects.

Company Participation
Data gathering: select 4-6 programs; one day session per participating company every 4 months; 1½-hour sessions with the PM and SE of each program; data gathered by two SECOE researchers; forms and notes do not identify programs.
Reports: a benchmark report within 30 days of each quarterly session, comparing against all prior data; 4-month reports to all participants with aggregated results from all data and all sources.
Participation price: $??K per year.
Participation involves quarterly data gathering by interviews with the program manager and systems engineer of 4-6 projects. Interviews take one day for each participant company, with comparative results provided within 30 days. The participation price is not yet established, but would be on the order of $15-25K per year.

Questions?
Eric Honour
INCOSE Director for Sponsored Research
Pensacola, FL, USA
+1 (850) 479-1985
ehonour@hcode.com