Data Applicability of Heritage and New Hardware For Launch Vehicle Reliability Models


Data Applicability of Heritage and New Hardware For Launch Vehicle Reliability Models
Huntsville Society of Reliability Engineers, RAM VIII Training Summit, November 3-4, 2015
Mohammad AL Hassan, Steven Novack, Rob Ring (Bastion Technologies, Inc.)

Purpose
- Introduce heuristic guidelines for launch vehicles (LV) to apply a consistent method for estimating uncertainty across all LV elements
- Define uncertainty and applicability
- Discuss the heuristic approach
- Provide a brief method for uncertainty reduction

Uncertainty
- A point estimate is a single statistical value. Generic failure data are often provided in reliability databases as point estimates (mean or median).
- A random variable is a statistical distribution of probable values. A component's failure rate is considered a random variable whose possible values are represented by a probability distribution.
- Uncertainty is represented as the bounds of a probability distribution for a parameter.
[Figure: lognormal probability density function (PDF) representing random-variable uncertainty, with the 5th, 50th (median), and 95th percentiles and the mean marked]
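The lognormal relationships behind the figure can be sketched in a few lines. The median and error factor below are illustrative values only (not from any reliability database); the percentile formulas follow from the standard lognormal parameterization, where the error factor EF = 95th percentile / median = exp(1.645 * sigma).

```python
import math

# Hypothetical lognormal failure-rate parameters (illustrative values only).
median = 1.0e-5          # 50th percentile of the failure rate
error_factor = 10.0      # EF = 95th percentile / median

# For a lognormal distribution, EF = exp(1.645 * sigma), where sigma is the
# standard deviation of ln(rate).
sigma = math.log(error_factor) / 1.645

p5 = median / error_factor               # 5th percentile
p95 = median * error_factor              # 95th percentile
mean = median * math.exp(sigma ** 2 / 2) # mean exceeds median (right skew)

print(f"5th: {p5:.2e}  median: {median:.2e}  95th: {p95:.2e}  mean: {mean:.2e}")
```

Note that the mean lies above the median, as in the skewed PDF on the slide.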

Background on Uncertainty
Understanding the sources of uncertainty, how to estimate it, and methods for reducing it supports better decision making in:
- Design and development of complex launch vehicles
- Launch readiness decisions
- Scenario and system trade studies
Two types of uncertainty:
- Aleatory (inherent physical random variability): typically inherent in every system and cannot be reduced
- Epistemic (lack of knowledge or ignorance): can be reduced by increasing knowledge. Epistemic uncertainty includes:
  - Completeness (missing scope/scenarios)
  - Parameter (component/subsystem)
  - Model (assumptions and development)
This presentation focuses on the epistemic uncertainty associated with the parameters of reliability data.

Parameter Data Sources and the Concern of Applicability
New launch vehicles (LV) comprise heritage and new hardware. Generic reliability data are collected from a wide spectrum of sources:
- Component databases (NPRD, EPRD, NUCLARR, etc.)
- Aerospace historical data
- Other industry historical data
- Piece-part count method (MIL-HDBK-217F)
- Engineering judgment
This variety of data sources raises the concern of how applicable the source data are to the target parameter.

What Is Data Applicability and Why Is It Important?
- Applicability is the degree of credibility and relevance of the source data.
- Data applicability is a significant source of epistemic uncertainty that affects the spread of the distribution.
- The heuristic approach is one way to consistently incorporate subjective judgments about applicability into the modeling process; it fits well with Bayesian methods, which permit the use of subjective information.
- Evaluating applicability with a heuristic approach is most useful:
  - when other Bayesian statistical methods are not available or yield poor results
  - to ensure consistent judgments throughout the model

An Approach for Quantifying Data Applicability
Source applicability is divided into two categories:
- Data source application: rank data sources from most applicable to least applicable
- Data source environment: uncertainty increases when converting from one type of environment to another (e.g., GF to AUF, or AUF to AIF)
Steps:
1. Classify the applicability of the reliability data source.
2. Assess the environmental conditions between the source and the target parameter.
3. Apply the appropriate K factor (logistic performance factor) to adjust the source data mean from its environment to predict the mean failure rate of the component in the application environment.
4. For each data source, apply the heuristic guidelines to quantify the uncertainty due to applicability.
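The K-factor adjustment in step 3 can be sketched as a simple lookup-and-scale. The environment pairs and factor values below are hypothetical placeholders for illustration, not values from MIL-HDBK-217F or any NASA source.

```python
# Hypothetical K factors for environment conversions (assumed values only):
# GF = ground fixed, AUF = airborne uninhabited fighter, AIF = airborne
# inhabited fighter.
K_FACTORS = {
    ("GF", "AUF"): 4.0,   # assumed conversion factor
    ("AUF", "AIF"): 1.5,  # assumed conversion factor
}

def adjust_mean_failure_rate(source_mean, source_env, target_env):
    """Scale a source-environment mean failure rate to the target environment."""
    if source_env == target_env:
        return source_mean
    return source_mean * K_FACTORS[(source_env, target_env)]

adjusted = adjust_mean_failure_rate(2.0e-6, "GF", "AUF")  # 2.0e-6 * 4.0 = 8.0e-6
```

Only the mean is adjusted here; per the slide, the conversion itself adds epistemic uncertainty, which the heuristic guidelines capture separately as a wider error factor.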

Data Source Classification Approach
[Table: the complete data source classification table]

Process to Reduce Uncertainty
Iterative loop: data sources → assign applicability values (uncertainty) → assign correlation → uncertainty analysis → uncertainty-importance analyses.
1. Collect failure rate data for the components.
2. Assign lognormal prior distributions to failure rates by selecting the appropriate error factor for each unique reliability block.
3. Correlate reliability blocks.
4. Run uncertainty analyses (e.g., Monte Carlo).
5. Run uncertainty-importance analyses and determine which blocks drive the lower and upper bounds.
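Steps 2 and 4 can be sketched with a minimal Monte Carlo propagation. The priors below are illustrative placeholders (not NASA data), the sampled values are treated as failure probabilities for simplicity, and the example system is just two blocks in series; step 3 (correlating blocks) is omitted.

```python
import math
import random

random.seed(1)

def sample_lognormal(median, error_factor):
    """Draw one failure-probability sample from a lognormal prior (EF = 95th/median)."""
    sigma = math.log(error_factor) / 1.645
    return median * math.exp(random.gauss(0.0, sigma))

# Steps 1-2: assign a lognormal prior (median, EF) to each reliability block.
# Values are illustrative placeholders only.
priors = {"block_a": (1e-4, 10.0), "block_b": (2e-4, 3.0)}

# Step 4: Monte Carlo uncertainty analysis over an example series system
# (the system fails if either block fails).
results = sorted(
    1 - (1 - sample_lognormal(*priors["block_a"]))
      * (1 - sample_lognormal(*priors["block_b"]))
    for _ in range(50000)
)
median_result = results[len(results) // 2]
p95_result = results[int(0.95 * len(results))]
model_ef = p95_result / median_result  # model error factor, as defined later
```

The sorted sample vector gives the percentiles directly, and the spread of the top-level result (its model EF) is what the later case-study slides report.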

Case Study: Simple Fault Tree
A simple model consists of four components: 1, 2, 3, and 4.
- Component 1 is in parallel with components 2, 3, and 4.
- Components 2, 3, and 4 are connected in a series configuration.
Note: numbers shown on this slide are examples only and do not represent data from NASA systems.
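The point-estimate top-event probability for this structure can be written out directly. The failure probabilities below are made-up illustrative numbers, in the same spirit as the slide's disclaimer.

```python
# Illustrative component failure probabilities (examples only, not NASA data).
p = {1: 1e-3, 2: 5e-4, 3: 5e-4, 4: 5e-4}

# Components 2, 3, and 4 form a series string: the string fails if ANY one fails.
series_fail = 1 - (1 - p[2]) * (1 - p[3]) * (1 - p[4])

# Component 1 is in parallel with that string: the system fails only if
# component 1 fails AND the series string fails.
top = p[1] * series_fail  # roughly 1.5e-6 with these inputs
```

This assumes independent components; correlated blocks (step 3 of the process) would change the product terms.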

Case Study: Uncertainty Quantification Results, Run 1
An error factor (EF) is a measure of uncertainty: EF = 95th percentile / median.
Model EF = 95th / median = 16.2

Case Study: Uncertainty-Importance Analysis
- The uncertainty-importance routine identified component 1 as a major driver of the model uncertainty.
- A data search on component 1 yielded more applicable data: historical data were found for a like component from the aerospace industry.
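One common way to implement such an uncertainty-importance routine (a sketch, not necessarily the authors' method) is to rank-correlate each block's sampled failure probability with the sampled top-event probability. The priors below are hypothetical, with component 1 deliberately given the widest distribution so that it emerges as the driver, mirroring the case study.

```python
import math
import random

random.seed(2)

def sample_lognormal(median, ef):
    """One lognormal draw, with error factor EF = 95th percentile / median."""
    return median * math.exp(random.gauss(0.0, math.log(ef) / 1.645))

# Hypothetical priors (median, EF); component 1 has the widest distribution.
priors = {1: (1e-3, 10.0), 2: (5e-4, 3.0), 3: (5e-4, 3.0), 4: (5e-4, 3.0)}

draws, tops = [], []
for _ in range(5000):
    p = {k: sample_lognormal(m, ef) for k, (m, ef) in priors.items()}
    series_fail = 1 - (1 - p[2]) * (1 - p[3]) * (1 - p[4])
    draws.append(p)
    tops.append(p[1] * series_fail)  # component 1 in parallel with the 2-3-4 string

def rank(xs):
    """Return the rank (0..n-1) of each element; ties are effectively absent here."""
    order = sorted(range(len(xs)), key=xs.__getitem__)
    r = [0] * len(xs)
    for i, j in enumerate(order):
        r[j] = i
    return r

def spearman(xs, ys):
    """Spearman rank correlation (Pearson correlation of the rank vectors)."""
    rx, ry = rank(xs), rank(ys)
    mean = (len(xs) - 1) / 2
    cov = sum((a - mean) * (b - mean) for a, b in zip(rx, ry))
    var = sum((a - mean) ** 2 for a in rx)  # equal for rx and ry (permutations)
    return cov / var

importance = {k: spearman([d[k] for d in draws], tops) for k in priors}
# Component 1, with the widest prior, shows the strongest rank correlation.
```

Blocks with the highest correlation to the top-event spread are the ones where better (more applicable) data buys the most uncertainty reduction.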

Case Study: Uncertainty Quantification Results, Run 2
An error factor (EF) is a measure of uncertainty: EF = 95th percentile / median.
Model EF = 95th / median = 5.6

Conclusion
- Uncertainty represents the spread of the parameter estimate: how confident are we that the estimate is correct?
- Uncertainty estimates are useful for decision makers; they communicate credibility or the lack of it.
- Highly applicable data improve the certainty of model estimates, a crucial step that increases confidence in the component estimates and the subsequent uncertainty estimate.
- Translating between environments may be a significant source of parameter uncertainty.
- Uncertainty-importance routines can be a basis for data analysis efforts by prioritizing the need to collect additional parameter data.

Questions? POC: Mohammad AL Hassan (Mo) Mohammad.i.alhassan@nasa.gov 205-544-2410