What to look at in fire engineering analysis

What to look at in fire engineering analysis
Jamie Vistnes, Manager, Fire Safety Policy Unit

Approaches and Methods of Analysis
- Absolute vs comparative
- Qualitative vs quantitative
- Deterministic vs probabilistic

1.2.9.4 Methods of analysis
There are many forms of analysis method:
- formulas, equations and hand calculations
- spreadsheet calculations
- statistical studies
- experiments with physical scale models
- full-scale experimental tests, such as fire tests or trial evacuations of real buildings
- computer simulation of fire development and smoke spread
- computer simulation of people movement.
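As a minimal illustration of the "formulas and hand calculations" category, the widely used t-squared design fire (heat release rate Q = αt²) can be sketched in a few lines. The growth-rate coefficients below are the commonly tabulated slow/medium/fast/ultrafast values in kW/s²; treat them as an assumption and confirm against the design guide agreed for the analysis.

```python
# Hand-calculation sketch: t-squared design fire growth, Q = alpha * t^2.
# The alpha values are the commonly tabulated growth-rate coefficients
# (kW/s^2); verify them against the applicable design guide before use.
ALPHA = {"slow": 0.00293, "medium": 0.01172, "fast": 0.0469, "ultrafast": 0.1876}

def hrr_kw(t_s: float, growth: str = "medium") -> float:
    """Heat release rate (kW) of a t-squared design fire at time t_s (seconds)."""
    return ALPHA[growth] * t_s ** 2

def time_to_hrr(q_kw: float, growth: str = "medium") -> float:
    """Time (seconds) for a t-squared design fire to reach a target HRR (kW)."""
    return (q_kw / ALPHA[growth]) ** 0.5
```

A medium-growth fire, for instance, reaches roughly 1,055 kW at 300 s; the inverse function gives the time to reach any target HRR used as a trigger in the analysis.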

1.2.9.4 Methods of analysis
The methods chosen should:
- be well documented (especially their limitations and assumptions), either in the literature or by the fire engineer
- be well validated
- be suitable for the task
- generate outputs that can be compared with the acceptance criteria agreed for the analysis
- have clearly defined limitations and assumptions that are well documented.
"The FEB report should record, as appropriate, the above information for each method chosen."

1.2.9.4 Methods of analysis
These requirements can be summarised: the methods chosen should be
- well documented and well validated
- suitable for the task
- used within their limitations and assumptions
- able to generate suitable outputs to address the acceptance criteria.
http://www.nist.gov/el/fire_grants.cfm

Well Documented and Well Validated
Standards available for assessing and verifying fire models:
- ATS 5387.3—2006 (Australian Technical Specification) / ISO/TR 13387-3:1999, Guidelines — Fire safety engineering, Part 3: Assessment and verification of mathematical fire models
- ASTM E 1355-97, Standard Guide for Evaluating the Predictive Capability of Deterministic Fire Models

A.6 Review of FRIEDMAN, R., "An International Survey of Computer Models for Fire and Smoke", Journal of Fire Protection Engineering, 4, pp. 81-92, 1992
Reasons why a model may not yield results in complete accord with actual fire behaviour are:
- idealizations and simplifications on which the model is based deviate significantly from reality;
- input parameters supplied to the model are inaccurate;
- "default" values of coefficients used internally in the model are incorrect;
- the computation process yields a wrong result (due to poor choice of time steps or mesh size, or due to mathematical singularities or instabilities);
- the experimental measurements are incorrect or non-repeatable.

A.6 Review of FRIEDMAN, R., "An International Survey of Computer Models for Fire and Smoke", Journal of Fire Protection Engineering, 4, pp. 81-92, 1992
It is generally risky to apply a model to conditions which are drastically different from those for which the model has been validated.
Conclusion: just because a model exists does not mean it is necessarily accurate and appropriate for use.

Well Documented and Well Validated
- How much effort goes into reviewing validation and verification documentation?
- How much effort goes into investigating whether such documentation even exists?
- How much of this review is then documented in the FEB/FER?

Well Documented and Well Validated
FDS may be considered generally well validated for predicting smoke movement in a building. It can also be used to predict fire spread and sprinkler operation, but questions remain as to how accurately these are predicted.
Therefore, just because something is included in a model does not mean it can always be used.
http://www.tyco-fire.com/TFP_common/DoorWayFlowsWP.pdf

Suitable for the Task
Some typical examples not considered suitable for the task:
- Egress modelling may be undertaken to demonstrate that the evacuation time is short compared to the prescribed FRL; however, these times do not correlate.
- Fire brigade intervention is often analysed and quantified (to some extent), but not linked to the acceptance criteria.
While quantitative modelling and robust technical analysis are encouraged, they need to be appropriate to the agreed acceptance criteria and not used as a "smoke screen" to fill out the report.

Within Limitations and Assumptions
- How well do you know the limitations and assumptions of a tool?
- Have you looked at the basis for deriving the tool? For example, fire severity calculations derived from experiments in relatively small compartments.
- When is it OK to move outside these limitations? For example, using a zone model for a 2,500 m² compartment?
- Again, are these clearly documented in the FEB/FER?
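One way to make such limitations operational is to screen the analysis inputs against the tool's documented validation range before running it. A minimal sketch follows; the limit values shown are hypothetical placeholders, and the real ranges must come from the specific model's own documentation.

```python
# Sketch: flag inputs outside a tool's documented validation range.
# These limits are HYPOTHETICAL placeholders, not any real model's ranges;
# substitute the ranges stated in the tool's validation documentation.
LIMITS = {
    "compartment_area_m2": (0.0, 1000.0),
    "ceiling_height_m": (2.0, 6.0),
}

def check_limits(inputs: dict, limits: dict = LIMITS) -> dict:
    """Return {input_name: (value, (lo, hi))} for every input that falls
    outside its documented range; an empty dict means all inputs are inside."""
    out_of_range = {}
    for name, value in inputs.items():
        lo, hi = limits.get(name, (float("-inf"), float("inf")))
        if not (lo <= value <= hi):
            out_of_range[name] = (value, (lo, hi))
    return out_of_range
```

Any flagged input is exactly the kind of departure (e.g. a zone model applied to a 2,500 m² compartment) that should be justified and documented in the FEB/FER rather than passed over silently.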

Generate Suitable Outputs to Address Acceptance Criteria
Can the results generated be used to demonstrate that the acceptance criteria have been achieved? Without sufficient outputs:
- the conclusions drawn from the analysis cannot be appropriately documented
- it is difficult for a reviewer to determine how much engineering judgement has been used.

Use of Technical Tools - Inputs
- Inputs obviously affect the outcome, so they are critical.
- FRNSW experiences a general lack of detail in FEBs; without that detail, how can the inputs be agreed?
- Further work is needed on which inputs are considered critical to the analysis.
- All inputs and assumptions should be clearly justified and documented.
- Changes also often occur between the FEB and FER, but are not documented.
- Note: the New Zealand Verification Method is not considered an appropriate reference, as it is a VM for a different country's building code with no specific supporting references.

Use of Technical Tools - Inputs
- Sensitivities need to be considered; these may vary according to the tool being used.
- Sensitivities should also be identified at the FEB stage.
- FRNSW experiences wide variation in how this is considered.
https://pages.nist.gov/cfast/
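A basic way to consider sensitivities is a one-at-a-time sweep: rerun the analysis with each uncertain input at its low and high plausible values and record how the output shifts relative to the baseline run. A generic sketch, where the model callable and parameter names are placeholders rather than any particular tool's API:

```python
def one_at_a_time(model, baseline: dict, spans: dict) -> dict:
    """One-at-a-time sensitivity sweep.

    Run `model` at the baseline inputs, then perturb each input named in
    `spans` to its (low, high) value in turn, returning for each input the
    pair of output changes relative to the baseline run."""
    base = model(**baseline)
    deltas = {}
    for name, (lo, hi) in spans.items():
        deltas[name] = tuple(
            model(**{**baseline, name: value}) - base for value in (lo, hi)
        )
    return deltas
```

Tabulating these deltas at the FEB stage makes it explicit which inputs actually drive the result and therefore need agreement and justification.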

Use of Technical Tools - Outputs
Outputs need to be sufficient to justify the results and conclusions drawn. FRNSW experiences wide variation in both what outputs are shown and how they are used, for example:
- no results, only a reference to modelling having been undertaken
- a random sample of CFD results that may not directly correlate with the times concerned
- CFD results at certain locations only
- a full set of CFD results for each criterion and time of interest
- only thermocouple readings.
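Whatever outputs are reported, they ultimately need to be checked against the agreed acceptance criteria at the times of interest. A minimal sketch of such a comparison follows; the quantity names and limit values are hypothetical examples, not criteria from any particular project.

```python
def check_acceptance(results: dict, criteria: dict) -> list:
    """Compare model outputs against agreed acceptance criteria.

    results:  {quantity: {time_s: value}} extracted from the model outputs.
    criteria: {quantity: (limit, mode)} where mode is "max" (value must stay
              at or below the limit, e.g. temperature) or "min" (value must
              stay at or above the limit, e.g. visibility).
    Returns a list of (quantity, time_s, value, limit) failures."""
    failures = []
    for quantity, (limit, mode) in criteria.items():
        for time_s, value in sorted(results.get(quantity, {}).items()):
            ok = value <= limit if mode == "max" else value >= limit
            if not ok:
                failures.append((quantity, time_s, value, limit))
    return failures
```

An empty failure list documents directly that the criteria were met at every reported time; a non-empty one shows a reviewer exactly where and when they were not, with no engineering judgement hidden in between.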

Use of Technical Tools
The most frequently used "technical tool" is arguably the fire engineer themselves! More development in models, yet less use in practice? Is this because of the types of issues being addressed by fire engineering, or because fire engineers are changing their methods of analysis to reduce costs?

What is Needed for the Future?
- Encourage more robust quantitative analysis using technical tools, rather than qualitative comparisons to obscure DtS designs that are far from what is proposed.
- Full probabilistic risk assessments, utilising technical tools to determine the risk.
- Individual and societal risk is likely to drive policy development and direction in the built environment.
- AFAC is part of the BCC working group in this space; wider consultation is likely to occur.
- Quantifiable risk is likely to be the underpinning structure on which building codes are developed, hence the need for proper risk assessment.

Questions
Jamie Vistnes, Manager - Fire Safety Policy Unit | E Jamir.Vistnes@fire.nsw.gov.au | T (02) 9742 7434 | www.fire.nsw.gov.au