Probabilistic Modeling, Multiscale and Validation Roger Ghanem University of Southern California Los Angeles, California
Outline
– Introduction and Objectives
– Representation of Information
– Model Validation
– Efficiency Issues
Introduction
Objectives:
– Determine certifiable confidence in model-based predictions.
  Certifiable = amenable to analysis.
  Accept the possibility that certain statements, given available resources, cannot be certified.
– Compute actions to increase confidence in model predictions: change the information available to the prediction.
  More experimental/field data, more detailed physics, more numerical resolution, …
Stochastic models package information in a manner suitable for analysis:
– Adapt this packaging to the needs of the decision-maker.
– Craft a mathematical model that is parameterized with respect to the relevant uncertainties.
Two meaningful questions
Nothing new here. What is new:
– sensor technology
– computing technology
We can, and must, adapt our "packaging" of information and knowledge accordingly.
Theoretical basis: Cameron-Martin Theorem
The polynomial chaos decomposition of any square-integrable functional of Brownian motion converges in mean square as the order N goes to infinity. For a finite-dimensional representation, the coefficients are functions of the missing dimensions: they are random variables (Slud, 1972).
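As a concrete, hypothetical illustration of this mean-square convergence, the sketch below expands the lognormal functional exp(ξ) of a single Gaussian germ in probabilists' Hermite chaos, using the classical closed-form coefficients e^{1/2}/k!. The choice of functional, sample size, and variable names are illustrative, not from the presentation.

```python
import math
import numpy as np
from numpy.polynomial.hermite_e import hermeval

rng = np.random.default_rng(0)
xi = rng.standard_normal(200_000)     # samples of the Gaussian germ
y = np.exp(xi)                        # a square-integrable functional: lognormal

def pc_approx(xi, order):
    # Hermite-chaos coefficients of exp(xi): c_k = exp(1/2) / k! (closed form)
    c = np.array([math.exp(0.5) / math.factorial(k) for k in range(order + 1)])
    return hermeval(xi, c)            # evaluates sum_k c_k He_k(xi)

# Empirical mean-square truncation error for increasing chaos order
errors = [np.mean((y - pc_approx(xi, N)) ** 2) for N in (1, 2, 4, 8)]
```

The monotone decay of `errors` is the finite-dimensional counterpart of the Cameron-Martin convergence statement.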
Representation of Uncertainty
The random quantities are resolved as surfaces in a normalized space, built from multidimensional orthogonal polynomials in independent random variables; the dimension of the vector of variables reflects the complexity of the representation. The random quantities could be, for example:
– parameters in a PDE
– boundaries in a PDE (e.g., geometry)
– field variables in a PDE
Uncertainty in the model parameters may be due to a small experimental database, or to anything else.
Characterization of Uncertainty
– Galerkin projections
– Maximum likelihood
– Maximum entropy
– Bayes' theorem
– Ensemble Kalman filter
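Of the methods listed above, the ensemble Kalman filter lends itself to a compact sketch. Below is a minimal perturbed-observations EnKF analysis step in NumPy, assuming a linear observation operator; the toy 1-D state and all names are illustrative, not from the presentation.

```python
import numpy as np

rng = np.random.default_rng(1)

def enkf_update(ensemble, obs, obs_std, H):
    """One ensemble Kalman filter analysis step (perturbed-observations form).

    ensemble : (n_state, n_members) forecast ensemble
    obs      : (n_obs,) observation vector
    obs_std  : observation-noise standard deviation
    H        : (n_obs, n_state) linear observation operator
    """
    n_obs, n_mem = len(obs), ensemble.shape[1]
    X = ensemble - ensemble.mean(axis=1, keepdims=True)    # state anomalies
    HX = H @ ensemble
    HXp = HX - HX.mean(axis=1, keepdims=True)              # observed anomalies
    # Sample covariances and Kalman gain
    Pyy = HXp @ HXp.T / (n_mem - 1) + (obs_std ** 2) * np.eye(n_obs)
    Pxy = X @ HXp.T / (n_mem - 1)
    K = Pxy @ np.linalg.inv(Pyy)
    # One perturbed observation per ensemble member
    obs_pert = obs[:, None] + obs_std * rng.standard_normal((n_obs, n_mem))
    return ensemble + K @ (obs_pert - HX)

# Prior ensemble for a 1-D state; a noisy direct observation of value 2.0
prior = rng.standard_normal((1, 500))
posterior = enkf_update(prior, np.array([2.0]), 0.5, np.eye(1))
```

With prior N(0,1) and observation noise 0.5, the analytic posterior mean is 1.6; the ensemble estimate lands close to it while the spread contracts.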
Characterization of Uncertainty: Maximum Entropy Estimation / Spatio-Temporal Processes
Temperature is measured as a function of time along cables; temperature fluctuations affect the speed of sound in the ocean.
Temperature time histories at various depths.
Characterization of Uncertainty: Maximum Entropy Estimation with Histogram Constraints
A reduced-order model is built from the Karhunen-Loève (KL) expansion; a typical plot shows the marginal pdf of a KL variable. The Spearman rank correlation coefficient is also matched.
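A minimal sketch of the two ingredients named above, a reduced-order Karhunen-Loève expansion and a Spearman rank-correlation check, assuming a synthetic 1-D process with exponential covariance. The grid, correlation length, and truncation order are illustrative choices, not values from the study.

```python
import numpy as np
from scipy.stats import spearmanr

rng = np.random.default_rng(2)

# Synthetic stationary process on a 1-D grid with exponential covariance
x = np.linspace(0.0, 1.0, 50)
C = np.exp(-np.abs(x[:, None] - x[None, :]) / 0.3)   # covariance matrix

# Discrete Karhunen-Loeve expansion = eigendecomposition of the covariance
vals, vecs = np.linalg.eigh(C)
idx = np.argsort(vals)[::-1]                         # sort modes by energy
vals, vecs = vals[idx], vecs[:, idx]

m = 5                                                # reduced-order truncation
samples = vecs[:, :m] @ (np.sqrt(vals[:m])[:, None]
                         * rng.standard_normal((m, 1000)))

energy = vals[:m].sum() / vals.sum()                 # fraction of variance kept
rho, _ = spearmanr(samples[0], samples[10])          # rank correlation, 2 points
```

The truncated expansion keeps most of the variance with a handful of modes, and the rank correlation between nearby grid points can be compared against a target, as on the slide.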
Example Application: W76 Foam Study
Built-up structure with shell, foam, and devices. The foam domain is:
1. modeled as a non-stationary random field,
2. accounting for both random and structured variations,
3. observed at a limited number of locations: 30 selected points on the foam.
Limited statistical observations: the correlation estimated from a small sample size yields interval bounds on the correlation matrix. The system has 10320 HEX elements; the stochastic block has 2832 elements.
Example Application: W76 Foam Study
– Polynomial chaos representation of epistemic information
– Constrained polynomial chaos construction
– Radial-basis-function consistent spatial interpolation
– Cubature integration in high dimensions
Foam Study: Statistics of Maximum Acceleration
[Figures: histogram of the average maximum acceleration; density functions of the maximum acceleration]
Estimate of the 95% probability box
Remark: the confidence intervals are due to the finite sample size. [Figure: CDFs of the calibrated stochastic parameters (3 of 9 shown)]
Validation Challenge Problem
The quantity of interest is treated as a random variable. Sample mean: 0.0835; sample variance: 0.000830. Remark: based on only 25 samples. The criterion for certifying a design is one we would like to assess without full-scale experiments.
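With only 25 samples, the sample mean itself carries uncertainty. A generic Student-t confidence interval for the mean, built from the statistics quoted on the slide, makes this concrete; this is a standard textbook construction, not necessarily the interval used in the challenge problem.

```python
import math
from scipy.stats import t

n = 25
mean, var = 0.0835, 0.000830          # sample statistics from the slide
sem = math.sqrt(var / n)              # standard error of the mean

# Two-sided 95% confidence interval for the mean, Student-t with n-1 dof
half = t.ppf(0.975, n - 1) * sem
lo, hi = mean - half, mean + half     # roughly (0.072, 0.095)
```

The width of this interval is one way to quantify the remark that the confidence bounds are due to finite sample size alone.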
Efficiency Issues: Basis Enrichment
[Figure: exact solution compared with the 3rd-order chaos solution, with and without basis enrichment]
OBJECTIVE (semiconductor/conductor device):
1. Determine the requirements on manufacturing tolerance.
2. Determine the relationship between manufacturing tolerance and performance reliability.
APPROACH
– Define the problem on some underlying deterministic geometry.
– Define a random mapping from the deterministic geometry to the random geometry.
– Approximate this mapping using a polynomial chaos decomposition.
– Solve the governing equations using coupled FEM-BEM.
– Compare various implementations.
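A toy 1-D analogue of the random-mapping step, not the coupled FEM-BEM implementation: assume the uncertain boundary of the interval [0, 1] is modeled with a first-order Hermite chaos in a single Gaussian variable, and compute statistics of the mapped geometry by Gauss-Hermite quadrature. All numerical values are illustrative.

```python
import numpy as np
from numpy.polynomial.hermite_e import hermegauss

# Reference (deterministic) geometry: the interval [0, 1].
# Random geometry: [0, b(xi)] with b(xi) = 1 + 0.1*xi, a first-order
# Hermite-chaos model of the uncertain endpoint (xi ~ N(0,1)).
s = np.linspace(0.0, 1.0, 11)          # reference coordinates

def mapping(s, xi):
    return s * (1.0 + 0.1 * xi)        # random map: reference -> physical

# Statistics of the mapped geometry via Gauss-Hermite quadrature
x, w = hermegauss(20)
w = w / w.sum()                        # normalize weights to a probability measure
mean_geom = sum(wi * mapping(s, xi) for wi, xi in zip(w, x))
```

Since the chaos model of the endpoint has zero-mean fluctuation, the mean mapped geometry coincides with the reference geometry, while higher moments carry the geometric uncertainty into the governing equations.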
TREATMENT OF RANDOM GEOMETRY
Ref: Tartakovsky & Xiu, 2006.
Comparison of Monte Carlo, Quadrature and Exact Evaluations of the Element Integrations
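The kind of comparison described here can be sketched on a scalar expectation, where Gauss-Hermite quadrature converges far faster than Monte Carlo for a smooth integrand. The integrand and sample sizes below are illustrative, not those of the element integrations in the study.

```python
import math
import numpy as np
from numpy.polynomial.hermite_e import hermegauss

rng = np.random.default_rng(3)
f = lambda x: np.exp(0.3 * x)          # smooth integrand in the Gaussian germ

# Exact expectation: E[exp(a*xi)] = exp(a^2 / 2) for xi ~ N(0, 1)
exact = math.exp(0.3 ** 2 / 2)

# Monte Carlo with 10^5 samples
mc = f(rng.standard_normal(100_000)).mean()

# 5-point Gauss-Hermite quadrature (probabilists' weight exp(-x^2 / 2))
x, w = hermegauss(5)
quad = (w @ f(x)) / math.sqrt(2 * math.pi)
```

Five quadrature points already reproduce the exact value to near machine precision, while the Monte Carlo estimate still carries sampling noise of order 1/sqrt(N).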
Using Components of Existing Analysis Software
– Only one deterministic solve required.
– Minimal change to existing codes.
– Requires iterative solutions with multiple right-hand sides.
– Integrated into ABAQUS (not commercially).