Binary Stochastic Fields: Theory and Application to Modeling of Two-Phase Random Media. Steve Koutsourelakis, University of Innsbruck; George Deodatis, Columbia University.

Presentation transcript:

Binary Stochastic Fields: Theory and Application to Modeling of Two-Phase Random Media. Steve Koutsourelakis, University of Innsbruck; George Deodatis, Columbia University. Presented at "Probability and Materials: From Nano- to Macro-Scale," Johns Hopkins University, Baltimore, MD, January 5–7, 2005.

What is a two-phase medium? A continuum which consists of two materials (phases) that have different properties. What is a random two-phase medium? A two-phase medium in which the distribution of the two phases is so intricate that it can only be characterized statistically. Examples: Synthetic: fiber composites, colloids, particulate composites, concrete. Natural: soils, sandstone, wood, bone, tumors.

Characterization of Two-Phase Random Media Through Binary Fields. Black: phase 1; white: phase 2. Complementarity condition: the indicator fields of the two phases sum to one at every point. The binary fields are assumed statistically homogeneous, so only one of the two phases (j = 1 or 2) is needed to describe the medium.

Random Field Description. First-order moments: volume fraction. Second-order moments: autocorrelation. Properties of the autocorrelation: if no long-range correlation exists, the autocorrelation decays to the square of the volume fraction at large lags; it must also be positive definite (Bochner's theorem).
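
As a concrete illustration of these first- and second-order descriptors, the sketch below estimates the volume fraction and the autocorrelation of a discretized 1-D binary sample by spatial averaging (homogeneity is assumed; the function name and the 0/1 indicator convention are illustrative, not part of the original formulation):

```python
import numpy as np

def binary_statistics(field, max_lag):
    """Estimate first- and second-order statistics of a 1-D binary field.

    field   : 1-D array of 0s and 1s (indicator of phase 1)
    max_lag : largest lag (in grid points) at which to estimate the
              autocorrelation R_bb(k) = E[b(x) b(x + k)].
    """
    field = np.asarray(field, dtype=float)
    volume_fraction = field.mean()           # first-order moment
    autocorr = np.empty(max_lag + 1)
    for k in range(max_lag + 1):
        # spatial average of the product at lag k (homogeneity assumed)
        autocorr[k] = np.mean(field[: len(field) - k] * field[k:])
    return volume_fraction, autocorr
```

At zero lag the estimate reduces to the volume fraction itself, and for a field without long-range correlation it approaches the squared volume fraction at large lags, matching the properties listed above.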

Simulation of Homogeneous Binary Fields based on 1st- and 2nd-order information. Available Methods: 1) Memoryless transformation of homogeneous Gaussian fields (translation fields) (Berk 1991; Grigoriu 1988, 1995; Roberts 1995). Advantage: computationally efficient. Disadvantage: limited applicability.
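
The translation-field idea can be sketched as follows: generate a correlated Gaussian sequence and cut it at the level that yields the target volume fraction. The squared-exponential autocorrelation model, the Cholesky-based sampling, the small nugget, and the function name are assumptions chosen to keep the example self-contained:

```python
import numpy as np
from statistics import NormalDist

def translation_binary_field(n, target_p1, corr_len, rng=None):
    """Memoryless translation of a homogeneous Gaussian sequence into a
    binary one (sketch of the translation-field approach).

    n         : number of grid points
    target_p1 : desired volume fraction of phase 1
    corr_len  : correlation length of the assumed Gaussian model
                R_YY(k) = exp(-(k / corr_len)**2)
    """
    rng = np.random.default_rng(rng)
    # covariance matrix of the Gaussian sequence on the grid
    lags = np.abs(np.subtract.outer(np.arange(n), np.arange(n)))
    cov = np.exp(-(lags / corr_len) ** 2)
    # small nugget for numerical positive definiteness
    L = np.linalg.cholesky(cov + 1e-8 * np.eye(n))
    y = L @ rng.standard_normal(n)
    # cut level t chosen so that P(Y > t) = target_p1
    t = NormalDist().inv_cdf(1.0 - target_p1)
    return (y > t).astype(int)
```

The memoryless cut controls the volume fraction exactly in expectation; the limitation noted on the slide is that not every target binary autocorrelation is reachable this way.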

Simulation of Homogeneous Binary Fields based on 1st- and 2nd-order information. Available Methods: 2) Yeong and Torquato 1996. Using a stochastic optimization algorithm, samples are generated one at a time so that their spatial averages match the target. Advantage: able to incorporate higher-order probabilistic information. Disadvantage: computationally costly when a large number of samples must be generated.

Modeling the Two-Phase Random Medium in 1D Using Zero Crossings. The sampling points are equidistant locations of a stationary Gaussian stochastic process Y(x) with zero mean, unit variance and a prescribed autocorrelation; the sign of Y(x) determines which phase occupies each point of the medium.

Modeling the Two-Phase Random Medium in 1D. The resulting binary sequence is also statistically homogeneous, with an autocorrelation that follows from the 2nd-order joint Gaussian p.d.f. of the pair (Y(x), Y(x + ξ)).

Modeling the Two-Phase Random Medium in 1D. For any pair of points, the correlation matrix of the underlying Gaussian values is always positive definite.

Modeling the Two-Phase Random Medium in 1D. The mapping H from the Gaussian to the binary autocorrelation does not have an explicit form, except for special cases. It can be calculated numerically with great computational efficiency (Genz 1992). Higher-order information involves the 4th-order joint Gaussian p.d.f.
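
One such special case is the zero-level cut with equal volume fractions: the bivariate normal orthant probability then gives the map from Gaussian to binary autocorrelation in closed form. The sketch below shows this case only; for other cut levels the mapping must be evaluated numerically (e.g. with Genz's algorithm for the bivariate normal CDF):

```python
import numpy as np

def binary_autocorr_zero_level(rho):
    """Closed-form special case of the Gaussian-to-binary map: for a
    zero-mean, unit-variance Gaussian process cut at level 0 (both
    phases occupying 50% volume), the orthant probability
    P(Y1 > 0, Y2 > 0) yields

        R_bb = 1/4 + arcsin(R_YY) / (2*pi).
    """
    rho = np.asarray(rho, dtype=float)
    return 0.25 + np.arcsin(rho) / (2.0 * np.pi)
```

The limiting values are easy to check: uncorrelated Gaussian values give R_bb = 1/4 (the squared volume fraction), while perfect correlation gives R_bb = 1/2 (the volume fraction itself).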

Three Gaussian autocorrelations and the corresponding binary autocorrelations.

Sample Realizations of Three Cases with Different Clustering (but the same volume fraction): case 1 – strong clustering; case 2 – medium clustering; case 3 – weak clustering.

Simulation: Inversion Algorithm. For simulation purposes, the inverse path has to be followed: the goal is to find a Gaussian autocorrelation that can produce the target binary autocorrelation. Questions:  existence of such a Gaussian autocorrelation for an arbitrary binary target  its uniqueness. Approximate solutions – optimization formulation: find the Gaussian autocorrelation producing the binary autocorrelation that minimizes the error with respect to the target.

Iterative Inversion Algorithm – Basic Concept
Step 1: Start with an arbitrary admissible Gaussian autocorrelation. Calculate the resulting binary autocorrelation and the error e.
Step 2: Perturb the values of the Gaussian autocorrelation by small amounts, keeping the unit value at zero lag unchanged. Calculate the new binary autocorrelation and the new error e. If the error is smaller, keep the changes; otherwise reject them.
Step 3: Repeat Step 2 until the error e becomes smaller than a prescribed tolerance, or stop if a large number of iterations no longer reduces the error e.
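
The accept/reject loop above can be sketched as follows. To keep the example self-contained, the closed-form 50/50 mapping R_bb = 1/4 + arcsin(R_YY)/(2π) stands in for the numerically evaluated map, and the discretization, step size, and stopping rule are illustrative choices rather than those of the presentation:

```python
import numpy as np

def invert_autocorrelation(target_bb, n_iter=20000, step=0.02, seed=0):
    """Greedy random-perturbation inversion: find a discretized Gaussian
    autocorrelation whose induced binary autocorrelation approximates
    the target (zero-level cut special case used as the mapping).
    """
    rng = np.random.default_rng(seed)
    m = len(target_bb)
    r = np.linspace(1.0, 0.0, m)            # arbitrary admissible start, r[0] = 1

    def err(r):
        bb = 0.25 + np.arcsin(r) / (2.0 * np.pi)
        return np.sum((bb - target_bb) ** 2)

    e = err(r)
    for _ in range(n_iter):
        trial = r.copy()
        i = rng.integers(1, m)              # keep the unit value r[0] fixed
        trial[i] = np.clip(trial[i] + rng.uniform(-step, step), -1.0, 1.0)
        e_trial = err(trial)
        if e_trial < e:                     # accept only error-reducing moves
            r, e = trial, e_trial
    return r, e
```

Because each lag enters the error independently and the mapping is monotone, the greedy search recovers the underlying Gaussian autocorrelation to within the perturbation resolution.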

Example – Known Gaussian Autocorrelation: binary autocorrelation and recovered Gaussian autocorrelation. Observe the stability of the mapping.

Example – Debye Medium: binary autocorrelation.

Example – Debye Medium: Gaussian autocorrelation and Gaussian spectral density function.

Example – Debye Medium: progression of the error; sample realization.

Simulation: Inversion Algorithm. Advantage of the method proposed: the inversion procedure has to be performed only once. Once the underlying Gaussian autocorrelation is determined, samples of the corresponding Gaussian process can be generated very efficiently using the Spectral Representation Method (Shinozuka & Deodatis 1991). These Gaussian samples are then mapped to binary values in order to produce the samples of the binary sequence.
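
A minimal sketch of this two-step simulation: a Gaussian sample path is synthesized as a sum of cosines with random phases (the Spectral Representation Method) and then cut at zero to obtain the binary sample. The Gaussian-shaped spectral density, the wavenumber discretization, and the function name are assumed for illustration; in practice the spectrum would come from the inversion step:

```python
import numpy as np

def spectral_representation_binary(x, b=1.0, n_waves=512, kappa_max=8.0, seed=None):
    """Simulate a zero-mean, unit-variance Gaussian sample path by the
    spectral representation (sum of cosines with random phases) and map
    it through the zero-level cut to a binary sample.

    Assumed model: R_YY(xi) = exp(-(xi/b)**2) with one-sided spectrum
    G(k) = (b / sqrt(pi)) * exp(-(k*b/2)**2), which integrates to 1.
    """
    rng = np.random.default_rng(seed)
    dk = kappa_max / n_waves
    kappa = (np.arange(n_waves) + 0.5) * dk
    G = (b / np.sqrt(np.pi)) * np.exp(-(kappa * b / 2.0) ** 2)
    amps = np.sqrt(2.0 * G * dk)            # wave amplitudes
    phases = rng.uniform(0.0, 2.0 * np.pi, n_waves)
    y = np.cos(np.outer(x, kappa) + phases) @ amps   # Gaussian sample path
    return y, (y > 0).astype(int)                    # mapped binary sample
```

Since the inversion is done once and each new sample costs only one sum of cosines plus a thresholding pass, generating many realizations is cheap, which is the advantage claimed on this slide.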

Example – Anisotropic Medium: target and inversion.

Example – Anisotropic Medium: Gaussian autocorrelation and Gaussian spectral density function.

Sample Realization

Example – Fontainebleau Sandstone: target and inversion.

Example – Fontainebleau Sandstone: Gaussian autocorrelation and Gaussian spectral density function.

Actual image vs. simulated image.

Generalized Formulation. Consider a homogeneous, zero-mean, unit-variance Gaussian random field with a prescribed autocorrelation; the binary field obtained from it will also be homogeneous.

Generalized Formulation – Autocorrelation (general expression).

Generalized Formulation: Discretization in 1D. Properties of the autocorrelation: for a particular choice of the additional parameters, the previous formulation is recovered. The surplus of parameters makes the method more flexible and able to describe a wider range of binary autocorrelation functions.

Conclusions
 The method proposed is shown capable of generating samples of a wide range of binary fields using nonlinear transformations of Gaussian fields.
 It takes advantage of existing methods for the generation of Gaussian samples and requires minimal computational cost, especially when a large number of samples is needed.
 The generalized formulation increases the range of binary fields that can be modeled.
Possible extensions:
 Extension to higher-order probabilistic information.
 Extension to more than two phases.
 Extension to three dimensions.