Filtering and Control of Flow in Hypersonic Engines
Nick West, Peter Glynn, George Papanicolaou and Gianluca Iaccarino
Institute for Computational and Mathematical Engineering, Stanford University

Motivation and Background

The fueling of a hypersonic engine causes a shock to develop in the inlet of the engine. If this shock reaches the entrance of the engine, the engine stalls and cannot be restarted; this event is called unstart. The 1D compressible Euler equations capture the same phenomenon, a developing shock under fueling, so we use them as our reduced model. Two questions drive this work. Given observations of the flow in the inlet, are filtering algorithms effective at predicting unstart in an actionable time? And is the output of the filter of high enough quality that a control can be built around its predictions to prevent unstart?

Reduced Model

We model the flow of air through a scramjet engine with the quasi-one-dimensional compressible Euler equations, which in standard form read

    (ρA)_t + (ρuA)_x = 0
    (ρuA)_t + ((ρu² + p)A)_x = p A_x
    (EA)_t + ((E + p)uA)_x = A f(x,t)

where ρ, u, p, and E = p/(γ−1) + ρu²/2 are the density, velocity, pressure, and total energy. The geometry of the engine enters through the prescribed cross-sectional area A(x) (shown in a poster figure), and the fueling term f(x,t) is a heat-release source supported on the fueling region of the engine; its random model is given under Sources of Uncertainty below.

Sources of Uncertainty

Inflow Uncertainty: The values of the inflow variables are never known precisely. We model this by specifying a mean value μ_u(i) for the i-th inflow variable and a relative standard deviation σ, so that

    u(i) = μ_u(i) · (1 + σ ξ_i),

where the ξ_i are independent uniform random variables. The realized values are unknown a priori; however, they are held constant through a single simulation.

Fueling Uncertainty: Due to uncertainty in how well the fuel mixes with the air and other imprecision in the chemistry, it is never clear how much heat is being released. We model this by making the fueling term random, modulating it by a random field θ(x,t) with given correlation lengths ℓ_x and ℓ_t in space and time. Two approaches are used to model θ(x,t):
- Discrete Field: the field is discretized on cells of size ℓ_x × ℓ_t, and the value on each cell is an independent uniform random variable.
- Karhunen-Loève Expansion: the field is approximated by an expansion in basis functions with independent uniform random variables as weights; the basis functions are taken to be sinusoids over the fueling region (a sampling sketch appears after the control loop below).

Observation Uncertainty: Observations of the mass flow ρu are taken at discrete locations x_i and times t_j. The observation noise is proportional to the mass flow, so an observation has the form

    y_ij = (ρu)(x_i, t_j) · (1 + σ_m η_ij),

with η_ij independent unit-variance noise, so that the standard deviation of the observation is σ_m × 100% of the true value.

Effects of Uncertainty

The estimate of the shock location can vary significantly due to the uncertainty in both the fueling and the inflow conditions. Moreover, the fueling fluctuations can make the difference between an unstarted engine and a functional one: in one realization, variability in fueling leads to unstart after a second fuel burst.

Figure: Effect of increasing the mixing coefficient: (a) low; (b) moderate; (c) high, which results in unstart.
Figure: Variance in shock location due to (a) inflow uncertainty; (b) mixing-coefficient uncertainty.

Flow Control

Control Loop: Specify the initial fueling period fuel_time, the predict-ahead time ∆, the back-off and increase percentages, and the probability thresholds upt for unstart and spt for "normal" engine operation. The loop is:

while (the engine is not unstarted) {
    start fueling from now until now + fuel_time
    while (fueling) {
        observe the mass flow at the inflow
        update the filter
        predict the unstart probability ∆ time ahead
        if (unstart prob. > upt) { stop fueling; fuel_time *= back-off-pct }
    }
    if (unstart was predicted) {
        while (the engine still has a shock) {
            observe the mass flow; update the filter
            estimate the probability of shocking
            if (prob. shocking < 1 - spt) { break }
        }
    } else {
        fuel_time *= increase-pct
    }
}

Control Results

Setup: The particle filter was run with 200 particles; 10% uncertainty in the inflow velocity; 20% uncertainty in the mixing coefficient, with a spatial correlation length of … m and a temporal correlation length of … s; the observation noise was 10%. The initial fueling period was 0.001 s; the back-off fraction was 50%, the increase fraction was 10%, upt = 0.66, and spt = 0.9.

Result: Under these initial conditions, unstart should have occurred by … s (see the figure above). We achieved sustained operation of the engine for about 0.01 seconds (the engine did not unstart).

Figure: Engine flow resulting from the control; the fueling periods are visible as the dark blue regions in x = [0, 0.1].
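A minimal runnable sketch of the control loop above, in Python. The engine and filt objects are hypothetical interfaces standing in for the quasi-1D simulation and the particle filter (neither is part of the poster); the parameter names mirror the loop's.

# Sketch of the fueling control loop; `engine` and `filt` are hypothetical
# duck-typed interfaces, not the authors' code.
def control_loop(filt, engine, fuel_time=1e-3, predict_ahead=5e-4,
                 back_off=0.5, increase=0.1, upt=0.66, spt=0.9):
    while not engine.unstarted():
        engine.start_fueling(duration=fuel_time)
        unstart_predicted = False
        while engine.fueling():
            filt.update(engine.observe_mass_flow())   # assimilate inflow observation
            if filt.prob_unstart(predict_ahead) > upt:
                engine.stop_fueling()
                fuel_time *= back_off                  # back off: shorten the next burst
                unstart_predicted = True
        if unstart_predicted:
            # hold off refueling until the filter says normal operation is likely
            while engine.has_shock():
                filt.update(engine.observe_mass_flow())
                if filt.prob_shock() < 1 - spt:
                    break
        else:
            fuel_time *= 1 + increase                  # success: lengthen the next burst

The only state carried between bursts is fuel_time, shortened after a predicted unstart and lengthened otherwise; reading "increase fraction 10%" as multiply-by-1.1 is our interpretation of the poster's fuel_time *= increase-pct.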
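The Karhunen-Loève fueling model under Sources of Uncertainty above can be sampled as follows. This is a sketch under assumed values: the mode counts, the 1/(mn) decay of the weights, the fueling region, and the time window are illustrative choices, not the poster's.

# One realization of a multiplicative fueling perturbation built from
# sinusoid modes with independent uniform weights (illustrative parameters).
import numpy as np

def sample_fueling_field(n_x=6, n_t=6, sigma=0.2, x_fuel=(0.0, 0.1), T=0.01, rng=None):
    rng = np.random.default_rng() if rng is None else rng
    a, b = x_fuel
    L = b - a
    # zero-mean, unit-variance uniform weight per space-time mode;
    # the 1/(m*n) decay makes long wavelengths dominate
    xi = rng.uniform(-np.sqrt(3.0), np.sqrt(3.0), size=(n_x, n_t))
    def theta(x, t):
        x = np.asarray(x, dtype=float)
        val = np.zeros_like(x)
        for m in range(1, n_x + 1):
            for n in range(1, n_t + 1):
                val += (xi[m - 1, n - 1] / (m * n)
                        * np.sin(m * np.pi * (x - a) / L)
                        * np.cos(n * np.pi * t / T))
        return 1.0 + sigma * val                # multiplies the fueling term
    return theta

# usage: evaluate one realization on a grid in the fueling region
theta = sample_fueling_field(rng=np.random.default_rng(0))
print(theta(np.linspace(0.0, 0.1, 5), t=0.002))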
Deterministic Solver and Solutions

The PDE is solved in two steps (a simplified runnable sketch appears after the Filter Accuracy panel below):

1. Solve U_t = −F(U)_x via U* = U^n − Δt F(U^n)_x, where the flux derivative F(U)_x is evaluated with a second-order ENO-LLF (essentially non-oscillatory, local Lax-Friedrichs) method.
2. Solve U_t = f_rhs(U) via fourth-order Runge-Kutta, with initial condition U*, for one time step Δt.

Filtering: Algorithms and Results

Particle Filters: Particle filters use a collection of sample trajectories {X_k(x,t)} to approximate the distribution of the random trajectory X(x,t). Each sample X_k carries a weight w_k, the likelihood that X is X_k. The weights evolve over time to account for the new observations y(t_j). The algorithm is as follows:

initialize {X_k(x,0)} according to the inflow distribution
while (there is another observation, at time t_j) {
    advance each X_k(x, t_{j-1}) to X_k(x, t_j) by solving the model equations
    compute the likelihood l_kj of y(t_j) given X_k(x, t_j)
    update the weights: w_k ← w_k · l_kj
    normalize the weights so they sum to 1
}

To predict a function f of the state at a time t_j ≤ t < t_{j+1}, the weighted "sample" mean is used:

    E[f(X(t))] ≈ Σ_k w_k f(X_k(·, t)).

Some variants of the algorithm resample the particles after the update, so that particle X_k makes up approximately w_k × 100% of the post-update ensemble; the weights are then reset to be uniform. (A runnable sketch on a toy model appears below.)

Filter Accuracy: The Mean Relative Error (MRE) is used to measure the performance of the filtering algorithm: M trajectories, each tracked by an independent approximate filter, are averaged to estimate the MRE, i.e. the average over the M runs of the relative error |X̂ − X| / |X| of the filter estimate. As the dynamical noise increases, the filter performance gets better; as the observation noise increases, it gets worse. The error is largest during fueling and around the shock.

Figure: Effect of noise on filter performance: (a) 100% observation noise, 1% dynamical noise; (b) 10% observation noise, 1% dynamical noise; (c) 10% observation noise, 30% fueling noise, 10% inflow uncertainty.
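A runnable sketch of the particle filter loop above on a toy scalar model: the engine PDE is replaced by a one-line random walk, but the weight update w_k ← w_k · l_kj, the normalization, and the weighted-mean prediction follow the algorithm as stated. Resampling is triggered here by a low effective sample size, an assumption; the poster's variant resamples after every update.

# Bootstrap particle filter on a toy scalar state (the PDE is stubbed out).
import numpy as np

def particle_filter(observations, n_particles=200, obs_sigma=0.1, rng=None):
    rng = np.random.default_rng() if rng is None else rng
    X = rng.normal(0.0, 1.0, n_particles)              # initialize from the prior
    w = np.full(n_particles, 1.0 / n_particles)
    estimates = []
    for y in observations:                             # one pass per observation y(t_j)
        X = X + rng.normal(0.0, 0.05, n_particles)     # advance each particle (stub dynamics)
        lik = np.exp(-0.5 * ((y - X) / obs_sigma) ** 2)  # likelihood l_kj of y given X_k
        w = w * lik
        w = w / w.sum()                                # normalize so weights sum to 1
        estimates.append(np.sum(w * X))                # weighted "sample" mean, f(X) = X
        if 1.0 / np.sum(w ** 2) < n_particles / 2:     # resample if effective size is low
            idx = rng.choice(n_particles, size=n_particles, p=w)
            X = X[idx]
            w = np.full(n_particles, 1.0 / n_particles)
    return np.array(estimates)

# usage: track a slowly drifting signal from noisy observations
rng = np.random.default_rng(0)
truth = np.cumsum(rng.normal(0.0, 0.05, 50))
obs = truth + rng.normal(0.0, 0.1, 50)
est = particle_filter(obs, rng=rng)
print(f"final truth {truth[-1]:+.3f}, filter estimate {est[-1]:+.3f}")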
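A simplified sketch of the two-step splitting used by the deterministic solver above, applied to a scalar conservation law with a relaxation source. A first-order local Lax-Friedrichs flux stands in for the second-order ENO-LLF scheme, and periodic boundaries replace the engine's inflow/outflow conditions; the structure, a flux step producing U* followed by a fourth-order Runge-Kutta source step, is the same.

# Operator-split step for u_t + F(u)_x = f_rhs(u) with Burgers flux F(u) = u^2/2.
import numpy as np

def step(u, dx, dt, source):
    # step 1: u* = u^n - dt * F(u^n)_x via a local Lax-Friedrichs interface flux
    F = 0.5 * u ** 2
    alpha = np.max(np.abs(u)) + 1e-12                  # max wave speed
    flux = 0.5 * (F + np.roll(F, -1)) - 0.5 * alpha * (np.roll(u, -1) - u)
    ustar = u - dt / dx * (flux - np.roll(flux, 1))
    # step 2: classical fourth-order Runge-Kutta on u_t = f_rhs(u), starting from u*
    k1 = source(ustar)
    k2 = source(ustar + 0.5 * dt * k1)
    k3 = source(ustar + 0.5 * dt * k2)
    k4 = source(ustar + dt * k3)
    return ustar + dt / 6.0 * (k1 + 2 * k2 + 2 * k3 + k4)

# usage: advect a bump with periodic boundaries and a weak damping source
x = np.linspace(0.0, 1.0, 200, endpoint=False)
u = 1.0 + 0.5 * np.exp(-200.0 * (x - 0.3) ** 2)
for _ in range(100):
    u = step(u, dx=x[1] - x[0], dt=0.002, source=lambda v: -0.1 * (v - 1.0))
print(f"mean after 100 steps: {u.mean():.4f}")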
Predicting Unstart

The filter was used to predict the occurrence of unstart. The plots below show the probability of unstart (at a given time) as a function of the amount of information used, i.e. the time up to which observations were included. The results show that filtering allows unstart to be predicted with enough lead time to adjust the fueling. Both filters (Bayesian and particle) converge to the correct value of the "probability" of unstart faster than the Monte Carlo estimate. Occasionally the particle filter gets it wrong at one prediction time (near the transition to unstart); however, it always gets it right at the next prediction time, so this is of little concern. (A sketch of how such probabilities are read off a particle ensemble appears after Future Work below.)

Figure: Probability of unstart as a function of the last observation included: (a) t = …; (b) t = …; (c) t = …. Dark blue: Monte Carlo; red: particle filter; light blue: Bayesian filter.

Future Work

Filtering:
- Investigation of other filtering algorithms: how well do filters based on Kalman filtering (e.g. the Ensemble Kalman Filter) behave on this problem?
- Theoretical work on the behavior of filters for unstable dynamical systems and for systems with rapid fluctuations.
- How does the ability to predict unstart depend on the number of particles, the size of the observation noise, and the quality of the numerical solution used for each particle?
- Filter divergence: can we detect when the filter has stopped working?

Control:
- What are practical values of the parameters used in the control loop?
- Can an optimal control (in the sense of thrust produced), or a practical approximation to the HJB solution, be derived?
- How do the noise levels affect the control, and how robust is the control?
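Finally, a sketch of how the unstart probabilities plotted under Predicting Unstart can be read off a weighted particle ensemble: the probability is the total weight of the particles whose forward simulation unstarts within the look-ahead window. The predictor below, with its reduction of a particle to a shock position, the inlet at x = 0, and a 50 m/s shock speed, is an illustrative stand-in for running each particle's PDE model ahead by ∆.

# Unstart probability as the total weight of "unstarting" particles.
import numpy as np

def prob_unstart(particles, weights, unstarts_within, dt_ahead):
    flags = np.array([unstarts_within(p, dt_ahead) for p in particles], dtype=float)
    return float(np.dot(weights, flags))

# usage with a toy predictor: a "particle" is reduced to its shock location,
# and unstart means the shock reaches the inlet at x = 0 within the window
rng = np.random.default_rng(1)
shock = rng.uniform(0.0, 0.3, 200)        # shock positions (m)
w = np.full(200, 1.0 / 200)
print(prob_unstart(shock, w, lambda s, dt: s - 50.0 * dt < 0.0, dt_ahead=5e-4))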