Use of Latent Variables in the Parameter Estimation Process
Jonas Sjöblom, Energy and Environment, Chalmers University of Technology
MODEGAT, 2009-09-14

Presentation transcript:

Use of Latent Variables in the Parameter Estimation Process
Jonas Sjöblom, Energy and Environment, Chalmers University of Technology

Introduction: NOx reduction catalysis
(Figure: catalyst length scales, ~mm, ~µm and ~nm.)

Outline: Use of Latent Variables (LV)
– What is LV?
– How does it work?
– How can it be applied in the parameter estimation process? (3 case studies)
– Why is it good?

Latent Variable modelling (What is LV modelling?)
– Reduces a data matrix (using projections) to new, few and independent components (Latent Variables).
– Latent Variable (LV) model:
  – P: loadings (linear combinations of the original variables)
  – T: scores (projections onto the subspace defined by P)
  – # components: number of linearly independent directions
– Different types of Latent Variable (LV) models:
  – Principal Component Analysis (PCA)
  – Partial Least Squares (PLS)
(Figure: sensitivity vectors x1 = dY/dθ1, x2 = dY/dθ2, x3 = dY/dθ3 projected onto loading directions p1 and p2.)
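
As a minimal illustration of the decomposition described above (scores T, loadings P), the sketch below builds a synthetic, nearly rank-two data matrix and extracts scores and loadings with PCA and PLS. All data, sizes and component counts are illustrative assumptions, not values from the talk.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.cross_decomposition import PLSRegression

rng = np.random.default_rng(0)

# Synthetic data matrix X (n observations x m variables) built from only
# two independent directions, so its effective rank is ~2 despite m = 6.
n, m = 50, 6
latent = rng.normal(size=(n, 2))
X = latent @ rng.normal(size=(2, m)) + 0.01 * rng.normal(size=(n, m))

# PCA: X ~ T P^T, with T the scores and P the loadings.
pca = PCA(n_components=2)
T = pca.fit_transform(X)           # scores, shape (n, 2)
P = pca.components_.T              # loadings, shape (m, 2)
print("explained variance ratio:", pca.explained_variance_ratio_)

# PLS: same projection idea, but the directions are chosen to be
# predictive of a response Y (here a synthetic single response).
Y = latent @ np.array([1.0, -0.5]) + 0.01 * rng.normal(size=n)
pls = PLSRegression(n_components=2).fit(X, Y)
T_pls = pls.transform(X)           # PLS scores
P_pls = pls.x_loadings_            # PLS loadings, shape (m, 2)
print("PLS scores/loadings shapes:", T_pls.shape, P_pls.shape)
```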

Parameter Estimation Process (How can LV models be applied?)
(Flowchart: define the model and model assumptions; define the "experimental space"; choose experiments to perform; evaluate the design (perform experiments); fit parameters; check whether the results are satisfactory and, if not, iterate. LV models support this loop, e.g. by evaluating the design through an LV model (experimental rank).)

Application 1: LV models during the fitting process
– NOx Storage and Reduction (NSR) mechanism: 62 parameters
– Poor experimental design
– Jacobian ∂f/∂θ used in the gradient search: ill-conditioned, local minima (see the sketch below)
– Objective: improve the parameter fitting by analysing parameter correlations and making the parameters more orthogonal
Ref: Sjöblom et al., Comput. Chem. Eng. 31 (2007)
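
The ill-conditioning problem can be made concrete with a small, hypothetical surrogate model (not the NSR kinetics): when two parameters enter the model almost identically, the columns of the finite-difference Jacobian ∂f/∂θ become nearly collinear and the condition number explodes.

```python
import numpy as np

def simulate(theta, t):
    """Toy surrogate model; k1 and k2 enter almost identically."""
    k1, k2, k3 = theta
    return np.exp(-(k1 + 0.98 * k2) * t) + k3 * t

def jacobian_fd(f, theta, t, h=1e-6):
    """Forward-difference Jacobian df/dtheta, one column per parameter."""
    theta = np.asarray(theta, dtype=float)
    f0 = f(theta, t)
    J = np.empty((f0.size, theta.size))
    for j in range(theta.size):
        perturbed = theta.copy()
        perturbed[j] += h
        J[:, j] = (f(perturbed, t) - f0) / h
    return J

t = np.linspace(0.0, 5.0, 100)
J = jacobian_fd(simulate, [0.8, 0.5, 0.1], t)

# The singular values show how many directions the data really support;
# a large condition number means the gradient search is ill-conditioned.
s = np.linalg.svd(J, compute_uv=False)
print("singular values:", s)
print("condition number:", s[0] / s[-1])
```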

Parameter assessment (How can LV models be used? – appl. 1)
– Jacobian ∂f/∂θ: evaluated for ALL adjustable parameters (not only the fitted ones)
– Latent Variable (LV) method: Partial Least Squares (PLS) using the Jacobian as "X" and f (residual: simulated minus observed gas-phase concentrations) as "Y" (sketched below)
– Outcomes:
  1. Correlation structure!
  2. Number of independent directions (# parameters to fit)!
  3. Which parameters to choose! (method 1)
  4. Parameter fit in LV space (method 2)
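
A sketch of how this assessment might look in code, with a synthetic low-rank Jacobian standing in for ∂f/∂θ. The selection rule (sum of squared X-weights) and the score-space step are plain interpretations of "method 1" and "method 2", not the exact procedure from the paper.

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression

rng = np.random.default_rng(1)

# Placeholder Jacobian J (n_obs x n_params, effective rank ~3) and residual r.
n_obs, n_par = 200, 12
J = rng.normal(size=(n_obs, 3)) @ rng.normal(size=(3, n_par))
r = J @ rng.normal(size=n_par) + 0.05 * rng.normal(size=n_obs)

# PLS with the Jacobian as "X" and the residual as "Y".
pls = PLSRegression(n_components=3).fit(J, r)

# Method 1 (interpretation): rank parameters by their weight in the LV
# directions and pick the top few to fit directly.
importance = np.sum(pls.x_weights_ ** 2, axis=1)
selected = np.argsort(importance)[::-1][:3]
print("parameters selected for fitting:", selected)

# Method 2 (interpretation): work in score space instead of raw parameter
# space, i.e. solve for a few orthogonal score directions and map the step
# back to all parameters through the rotation matrix.
R = pls.x_rotations_                             # (n_par, n_components)
delta_scores = np.linalg.lstsq(J @ R, r, rcond=None)[0]
delta_theta = R @ delta_scores                   # step for all parameters
print("proposed parameter step:", np.round(delta_theta, 3))
```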

LV example: "loading" plot (How can LV models be used? – appl. 1)
(Figure: loading plot showing the X (Jacobian) and Y (residual) loadings.)

Results (How can LV models be used? – appl. 1)
– Methods compared: Method I (9 selected parameters), Method II (fitting of 9 scores), "brute force" (all 62 parameters)
– Fitting results are comparable, but the fitting is more efficient (faster) due to fewer and more independent parameters, adapted to the data set at hand

Application 2: Model-based DoE for precise parameter estimation
– "Simple" but realistic system: NO oxidation on Pt; model from Olsson et al. (1999); simulated data (with added noise) used as experiments
– Objective: how to find the experiments that enable precise estimation of the kinetic parameters
Ref: Sjöblom et al., Comput. Chem. Eng. 32 (2008)

Experiment assessment (How can LV models be applied? – appl. 2)
– Jacobian ∂f/∂θ: evaluated for ALL "possible" experiments (3 iterations)
– Latent Variable (LV) method:
  – Principal Component Analysis (PCA) of J (unfolded 3-way matrix)
  – D-optimal design to select experiments (sketched below)
– Outcomes:
  – Correlation structure!
  – Number of independent directions (# parameters to fit)!
  – Which experiments to choose!
(Flowchart: the same workflow as before; the choice of experiments to perform is made D-optimally using X or T from the LV model, and the design is evaluated by the LV model (experimental rank).)
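
A rough sketch of this assessment: unfold a hypothetical 3-way sensitivity array over candidate experiments, build a PCA model, and pick experiments with a simple greedy D-optimality criterion on the scores. The array sizes and the greedy rule are illustrative stand-ins; the paper used a proper D-optimal design algorithm.

```python
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(2)

# Hypothetical 3-way sensitivity array: candidate experiments x time points
# x parameters (sizes made up for illustration).
n_exp, n_time, n_par = 40, 30, 8
J3 = rng.normal(size=(n_exp, n_time, n_par))

# Unfold to a 2-way matrix with one row per candidate experiment,
# then summarise each candidate by its PCA scores T.
X = J3.reshape(n_exp, n_time * n_par)
T = PCA(n_components=4).fit_transform(X)

def log_d_criterion(rows, scores, eps=1e-8):
    """log det of the (regularised) information matrix of the chosen rows."""
    M = scores[rows, :]
    info = M.T @ M + eps * np.eye(scores.shape[1])
    return np.linalg.slogdet(info)[1]

# Greedy stand-in for a D-optimal exchange algorithm: repeatedly add the
# candidate experiment that increases the D-criterion the most.
selected = []
for _ in range(6):
    remaining = [i for i in range(n_exp) if i not in selected]
    best = max(remaining, key=lambda i: log_d_criterion(selected + [i], T))
    selected.append(best)

print("experiments chosen to run:", selected)
```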

Results (How can LV models be applied? – appl. 2)
– Dimensional reduction of the Fisher information matrix is handled by PCA (an LV model of the unfolded 3-way matrix)
– An almost perfect fit was obtained, but the parameter values were different (J not of full rank)
– Using X (as is) or an LV approximation of X performs equally well, but the LV approach is more efficient since it requires fewer experiments
– The LV model gives additional information about the dimensionality of the selected experiments before they are performed
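
The claim that X "as is" and an LV approximation of X perform equally well can be illustrated with a small synthetic case: when X is effectively low rank, least squares on X and on its rank-r PCA reconstruction give essentially the same residuals. Everything below is a made-up example, not the NO-oxidation data.

```python
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(5)

# Low-rank X (e.g. an unfolded Jacobian) plus a response y that depends
# only on the low-rank part; purely illustrative numbers.
n, m, r = 120, 40, 3
X = rng.normal(size=(n, r)) @ rng.normal(size=(r, m))
y = X @ rng.normal(size=m) + 0.01 * rng.normal(size=n)

# Least squares on X "as is" versus on a rank-r LV approximation of X.
pca = PCA(n_components=r).fit(X)
X_lv = pca.inverse_transform(pca.transform(X))

for name, A in [("X as is", X), ("LV approximation of X", X_lv)]:
    beta, *_ = np.linalg.lstsq(A, y, rcond=None)
    rmse = np.sqrt(np.mean((A @ beta - y) ** 2))
    print(f"{name}: residual RMSE = {rmse:.4f}")
```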

Application 3: Extended Sensitivity Analysis for targeted Model Improvements
– H2-assisted HC-SCR over Ag-Al2O3
– Detailed model (23 reactions, heat balance)
– Acceptable fit, but still significant lack-of-fit
– Objectives: verify (falsify) model assumptions; get indications on how to improve the model fit
Refs: Creaser et al., Appl. Catal. B 90 (2009) 18-28; Sjöblom, PhD Thesis (2009), Chalmers. Thesis available at:
(Figure: reaction scheme with species such as NO2, NO3, CH2NO2, O, C8H18, CO2, ΔH, N2, NO and H2.)

Experimental (How can LV models be applied? – appl. 3)
– Sensitivity analysis of 62 model parameters (not only the fitted ones, not only kinetic parameters):
  – 46 kinetic parameters
  – 10 mass and heat transport parameters
  – 6 other parameters
– Scaled local sensitivities: unfold the 3-way matrix to size n x pk, where n = 26025 time points, p = 62 parameters and k = 5 responses (sketched below)
– Univariate analysis as well as LV modelling
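
A sketch of the scaling and unfolding step, using placeholder sizes instead of the full n = 26025, p = 62, k = 5 case. The particular scaling (multiply by the nominal parameter value, divide by a typical response magnitude) is one common convention and is assumed here, not quoted from the thesis.

```python
import numpy as np

rng = np.random.default_rng(3)

# Raw local sensitivity array S of shape (n, p, k):
# n time points x p parameters x k responses (placeholder sizes).
n, p, k = 500, 62, 5
S = rng.normal(size=(n, p, k))
theta = np.abs(rng.normal(size=p)) + 0.1      # nominal parameter values
y_scale = np.abs(rng.normal(size=k)) + 0.1    # typical response magnitudes

# Scaled local sensitivities: make parameters and responses comparable.
S_scaled = S * theta[None, :, None] / y_scale[None, None, :]

# Unfold the 3-way array to a 2-way matrix of size n x (p*k),
# ready for univariate summaries or an LV (PCA) model.
X = S_scaled.reshape(n, p * k)
print("unfolded sensitivity matrix:", X.shape)
```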

LV-model and univariate measures (How can LV models be applied? – appl. 3)
– PCA model: scores plot, loadings plot, 25 components
– Univariate table data: confidence intervals; sensitivity average, std, max; correlations
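
Continuing the same sketch, the univariate table entries (average, standard deviation and maximum sensitivity per column, plus their correlations) and the PCA model can be computed directly from the unfolded matrix; the confidence intervals mentioned on the slide come from the parameter fit itself and are not reproduced here.

```python
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(4)
n, p, k = 500, 62, 5
X = rng.normal(size=(n, p * k))   # placeholder for the scaled sensitivities

# Univariate table data per column (one column per parameter/response pair).
sens_avg = np.abs(X).mean(axis=0)
sens_std = X.std(axis=0)
sens_max = np.abs(X).max(axis=0)
corr = np.corrcoef(X, rowvar=False)
print("correlation matrix:", corr.shape)

# LV model: PCA with a generous number of components (25 in the talk);
# scores and loadings plots would then be inspected per component.
pca = PCA(n_components=25).fit(X)
print("cumulative explained variance:", pca.explained_variance_ratio_.cumsum()[-1])
```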

Sensitivity Analysis results, examples (How can LV models be applied? – appl. 3)
– Mass transfer model needs attention: include diffusivities in the fitting? include internal mass transport? targeted transients?
– Heat transfer model needs attention: improve/extend temperature measurements? consider additional sensors (HC, H2)? modify the heat transfer model? targeted experiments?
(For more details, see the poster.)

Factors for successful parameter estimation (Why are LV models good?)
– Ability to master different parts of the process: the model (assumptions), the available data (experiments), the parameter values (which to fit)
– Ability to "change focus" in the process as the fit develops
(Flowchart: model assumptions; "experimental space"; choice of experiments; evaluate the design; fit parameters; happy? yes/no.)
New PhD project: "Improved methods for parameter estimation". Advertisement out now! Application deadline 20th September.

LV components: Few, New & linearly Independent (Why are LV models good?)
– Few: improved efficiency
– Linear: for non-linear systems, LV models provide more robust linearisations
– Independent: orthogonal sensitivities fulfil the statistical requirements

Conclusions
– The LV concept is a viable approach in the parameter estimation process
– Widely applicable: during fitting, DoE and evaluation
– Proven more efficient (due to fewer dimensions); superior? Yet to be "proven"...

Acknowledgements
– The Swedish Research Council for financial support
– The Competence Centre for Catalysis (KCK) for good collaboration
– Derek Creaser & Bengt Andersson for fruitful supervision
Thank you for your attention!