Nonstationary covariance structures II
Paul D. Sampson and Peter Guttorp
Nonstationary covariance structures II NRCSE

Drawbacks with the deformation approach
- Oversmoothing of large areas
- Local deformations are not a local part of the global fit
- Covariance shape does not change over space
- Limited class of nonstationary covariances

Thetford revisited
Features depend on spatial location.

Kernel averaging
Fuentes (2000): introduce uncorrelated stationary processes Z_k(s), k = 1,...,K, defined on disjoint subregions S_k, and construct
Z(s) = Σ_{k=1}^K w_k(s) Z_k(s),
where w_k(s) is a weight function related to dist(s, S_k). Then
Cov(Z(s), Z(s')) = Σ_{k=1}^K w_k(s) w_k(s') C_k(s − s').
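A minimal numerical sketch of this construction on a 1-D transect. The two subregion centers, the Gaussian distance weights, and the exponential covariances C_k are illustrative assumptions, not choices from the slides:

```python
import numpy as np

rng = np.random.default_rng(0)
s = np.linspace(0.0, 10.0, 200)              # observation locations
centers = np.array([2.5, 7.5])               # centers of K = 2 subregions S_k
ranges = np.array([0.5, 3.0])                # each Z_k has its own range

def weights(s, h=2.0):
    """Kernel weights w_k(s) decaying with distance to the subregion centers."""
    w = np.exp(-0.5 * ((s[:, None] - centers[None, :]) / h) ** 2)
    return w / w.sum(axis=1, keepdims=True)  # normalize so sum_k w_k(s) = 1

W = weights(s)                               # shape (200, K)
d = np.abs(s[:, None] - s[None, :])          # distance matrix
# C(s, s') = sum_k w_k(s) w_k(s') C_k(s - s'), each C_k stationary exponential
C = sum(np.outer(W[:, k], W[:, k]) * np.exp(-d / ranges[k])
        for k in range(len(centers)))

# Each summand is a Schur product of PSD matrices, so C is a valid
# (positive semidefinite) nonstationary covariance:
eigvals = np.linalg.eigvalsh(C)
print(eigvals.min() > -1e-8)                 # True: PSD up to rounding
# One draw from the resulting nonstationary Gaussian process:
z = np.linalg.cholesky(C + 1e-8 * np.eye(len(s))) @ rng.standard_normal(len(s))
```

The draw `z` has short-range correlation near the first center and long-range correlation near the second, which is exactly the kind of spatially varying structure the construction is meant to capture.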

Spectral version
Each Z_k has spectral representation Z_k(s) = ∫ exp(i s^T ω) dY_k(ω) with spectral density f_k, so
Z(s) = Σ_k w_k(s) ∫ exp(i s^T ω) dY_k(ω),
where the Y_k are uncorrelated. Hence
f(s, ω) = Σ_k w_k²(s) f_k(ω)
is the space-varying spectral density.

Estimating the spectrum
Compute the periodogram from the gridded data in a neighborhood of s. Asymptotically it is unbiased for f(s, ω), and periodogram values at sufficiently separated Fourier frequencies are approximately independent.

Details
K = 9; h = 687 km. Mixture of Matérn spectra.

An example
Models-3 output, an 81 × 87 grid of 36 km × 36 km cells. Hourly averaged nitric acid concentrations over one week.

Models-3 fit

A spectral approach to nonstationary processes
Spectral representation: Z(s) = ∫ exp(i ω^T s) φ_s(ω) dY(ω), with φ_s slowly varying in s and square integrable, and Y having uncorrelated increments. Hence f_s(ω) = |φ_s(ω)|² is the space-varying spectral density. Observe the process on a grid; use the FFT to estimate f_s(ω) in a neighborhood of s.
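A sketch of the "estimate locally by FFT" idea in one dimension. The slowly varying AR(1) test series, the window width, and the low-frequency band are all illustrative choices:

```python
import numpy as np

rng = np.random.default_rng(5)
n = 4096
z = np.zeros(n)
for t in range(1, n):                        # AR(1) with slowly varying phi
    phi = 0.2 + 0.6 * t / n                  # "slowly varying" dynamics
    z[t] = phi * z[t - 1] + rng.standard_normal()

def local_periodogram(z, center, width=512):
    """Periodogram of the series over a window (neighborhood) around center."""
    seg = z[center - width // 2: center + width // 2]
    seg = seg - seg.mean()
    I = np.abs(np.fft.rfft(seg)) ** 2 / len(seg)
    freqs = np.fft.rfftfreq(len(seg))
    return freqs, I

# The local spectrum differs between neighborhoods: low-frequency power is
# larger late in the series, where phi (hence the correlation range) is larger.
f_lo, I_lo = local_periodogram(z, 400)
f_hi, I_hi = local_periodogram(z, 3600)
low_band = f_lo < 0.05
print(I_hi[low_band].mean() > I_lo[low_band].mean())
```

Comparing such windowed periodograms across locations is the raw material for the stationarity test on the next slide.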

Testing for nonstationarity
U(s, ω) = log Î(s, ω) has approximately mean f(s, ω) = log f_s(ω) and constant variance in s and ω. Taking s_1,...,s_n and ω_1,...,ω_m sufficiently well separated, we have approximately U_ij = U(s_i, ω_j) = f_ij + ε_ij with the ε_ij iid. We can test for interaction between location and frequency (nonstationarity) using ANOVA.

Details
The general model has U_ij = μ + α_i + β_j + γ_ij + ε_ij. The hypothesis of no interaction (γ_ij = 0) corresponds to additivity on the log scale: f_s(ω) = σ²(s) f_0(ω) (a uniformly modulated process Z(s) = σ(s) Z_0(s) with Z_0 stationary). Stationarity corresponds in addition to α_i = 0. Tests are based on an approximate χ²-distribution (the variance is known).
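A numerical sketch of the interaction test. The two-way layout and the known error variance follow the slides; the simulated data, the layout sizes, and the 5% critical value for χ² with 35 df (about 49.8) are illustrative details:

```python
import numpy as np

rng = np.random.default_rng(1)
n, m = 6, 8                                  # locations s_i and frequencies w_j
var = np.pi ** 2 / 6                         # known variance of a log-periodogram value
# Simulate under stationarity: U_ij = mu + beta_j + eps_ij
# (frequency effect only, no location effect and no interaction).
beta = rng.normal(size=m)
U = 1.0 + beta[None, :] + rng.normal(scale=np.sqrt(var), size=(n, m))

row = U.mean(axis=1, keepdims=True)          # location means
col = U.mean(axis=0, keepdims=True)          # frequency means
grand = U.mean()
ss_int = ((U - row - col + grand) ** 2).sum()  # interaction sum of squares
df = (n - 1) * (m - 1)
T = ss_int / var                             # approx. chi^2_df if no interaction
# Known variance means no F ratio is needed: compare T to a chi^2 quantile.
print("evidence of nonstationarity" if T > 49.8 else "consistent with stationarity")
```

The same statistic, computed from windowed log-periodograms of real gridded data rather than simulated U_ij, gives the test applied to the Models-3 output below.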

Models-3 revisited
ANOVA table (source, df, sum of squares, χ²) with rows: between spatial points, between frequencies, interaction, and total.

Moving averages
A simple way of constructing stationary sequences is to average an iid sequence. A continuous-time counterpart is Z(t) = ∫ b(t − s) dΛ(s), where Λ is a random measure which is stationary and assigns independent random variables to disjoint sets, i.e., has stationary and independent increments.
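The discrete version of this is a two-line check: averaging an iid sequence with a fixed kernel b produces a stationary series whose autocovariance is C(h) = Σ_k b_k b_{k+h}. The kernel and sample size below are arbitrary illustrative choices:

```python
import numpy as np

rng = np.random.default_rng(2)
b = np.array([0.5, 1.0, 0.5])                # averaging kernel
eps = rng.standard_normal(100_000)           # iid increments of the "random measure"
z = np.convolve(eps, b, mode="valid")        # Z_t = sum_k b_k eps_{t-k}

# Theoretical autocovariances: C(0) = sum b_k^2 = 1.5, C(1) = sum b_k b_{k+1} = 1.0
c0, c1 = (b * b).sum(), (b[:-1] * b[1:]).sum()
emp0 = z.var()
emp1 = ((z[:-1] - z.mean()) * (z[1:] - z.mean())).mean()
print(abs(emp0 - c0) < 0.05, abs(emp1 - c1) < 0.05)
```

The empirical lag-0 and lag-1 autocovariances match the kernel convolution formula, and they do not depend on t, which is the stationarity claimed above.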

Lévy-Khinchine
log E exp(iuΛ_t) = t( iδu − σ²u²/2 + ∫ (e^{iux} − 1 − iux 1(|x| ≤ 1)) ν(dx) ),
where ν is the Lévy measure and Λ_t is the Lévy process. We can construct it from a Poisson measure H(du,ds) on R² with intensity E(H(du,ds)) = ν(du)ds and a standard Brownian motion B_t as
Λ_t = δt + σB_t + ∫_0^t ∫ u ( H(du,ds) − 1(|u| ≤ 1) ν(du)ds ).

Examples
Brownian motion with drift: Λ_t ~ N(δt, σ²t); ν(du) = 0.
Poisson process: Λ_t ~ Po(λt); δ = σ² = 0, ν(du) = λ δ_{1}(du).
Gamma process: Λ_t ~ Γ(αt, β); δ = σ² = 0, ν(du) = α e^{−βu} 1(u > 0) du/u.
Cauchy process: δ = σ² = 0, ν(du) = u^{−2} du/π.
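Two of these processes are easy to simulate directly from their independent, stationary increments; the time grid, α, β, and λ below are illustrative choices:

```python
import numpy as np

rng = np.random.default_rng(3)
t_grid = np.linspace(0.0, 10.0, 1001)
dt = t_grid[1] - t_grid[0]

# Gamma process: Lambda_t ~ Gamma(alpha*t, beta), so increments over a step
# of length dt are independent Gamma(alpha*dt, beta) variables.
alpha, beta = 2.0, 1.0
gamma_incr = rng.gamma(shape=alpha * dt, scale=1.0 / beta, size=len(t_grid) - 1)
gamma_path = np.concatenate([[0.0], np.cumsum(gamma_incr)])

# Poisson process: Lambda_t ~ Po(lam*t); increments are Po(lam*dt).
lam = 3.0
pois_incr = rng.poisson(lam * dt, size=len(t_grid) - 1)
pois_path = np.concatenate([[0.0], np.cumsum(pois_incr)])

# Both have nonnegative increments, so the paths are nondecreasing:
print(np.all(np.diff(gamma_path) >= 0), np.all(np.diff(pois_path) >= 0))
# prints: True True
```

Brownian motion with drift can be added the same way with normal increments; the Cauchy process needs the jump (Poisson-measure) construction, since its Lévy measure is not finite.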

Spatial moving averages
- We can replace R for t and s with R² (or a general metric space)
- We can replace b(t − s) by b_t(s) to relax stationarity
- We can let the intensity measure for H be an arbitrary measure ν(ds, du)

Gaussian moving averages
Higdon (1998), Swall (2000): let Λ be a Brownian motion without drift, and Z(s) = ∫ b(s − u) dΛ(u). This is a Gaussian process with correlogram
ρ(s, s') = ∫ b(s − u) b(s' − u) du / ∫ b(u)² du.
Account for nonstationarity by letting the kernel b vary with location: Z(s) = ∫ b_s(s − u) dΛ(u).

Details
A Gaussian kernel b_s yields an explicit covariance function which is squared exponential, with parameters changing with spatial location. The prior distribution describes "local ellipses," i.e., smoothly changing random kernels.
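A 1-D sketch of this closed form. If the Gaussian kernel at s has standard deviation σ(s), convolving white noise gives the correlation C(s,s') = sqrt(2 σ_s σ_{s'} / (σ_s² + σ_{s'}²)) exp(−(s−s')² / (σ_s² + σ_{s'}²)); the bandwidth function σ(s) below is an illustrative assumption:

```python
import numpy as np

s = np.linspace(0.0, 10.0, 150)
sigma = 0.3 + 0.2 * s                        # kernels widen with location

sig2 = sigma[:, None] ** 2 + sigma[None, :] ** 2
C = (np.sqrt(2.0 * np.outer(sigma, sigma) / sig2)
     * np.exp(-(s[:, None] - s[None, :]) ** 2 / sig2))

# A valid covariance (PSD) with unit variance, but with a correlation
# length that grows with s:
print(np.linalg.eigvalsh(C).min() > -1e-8)   # True
print(np.allclose(np.diag(C), 1.0))          # True
print(C[10, 20] < C[120, 130])               # True: shorter range at small s
```

Note the variance is constant here; in the slides' model the parameters of the local ellipses (the kernel covariance matrices) are what carry the spatial variation.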

Local ellipses Foci

Prior parametrization
The foci are chosen independently, Gaussian with isotropic squared exponential covariance. Another parameter describes the range of influence of a given ellipse; its prior is gamma.

Example
Piazza Road Superfund cleanup: dioxin applied to the road seeped into the groundwater.

Posterior distribution

Estimating nonstationary covariance using wavelets
A 2-dimensional wavelet basis is obtained from two functions φ and ψ through the four products φ(x)φ(y), φ(x)ψ(y), ψ(x)φ(y), and ψ(x)ψ(y). The first generation consists of scaled translates of all four; subsequent generations, on finer grids, consist of scaled translates of the three detail functions.

W-transform

Karhunen-Loeve expansion
C(s, s') = Σ_i λ_i φ_i(s) φ_i(s') and Z(s) = Σ_i λ_i^{1/2} A_i φ_i(s), where the φ_i are the eigenfunctions of the covariance and the A_i are iid N(0,1).
Idea: use a wavelet basis instead of the eigenfunctions, and allow for dependent A_i.
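A finite-dimensional sketch of the expansion, using an exponential covariance on a grid as an illustrative example (eigenvectors play the role of the eigenfunctions):

```python
import numpy as np

rng = np.random.default_rng(4)
s = np.linspace(0.0, 1.0, 64)
C = np.exp(-np.abs(s[:, None] - s[None, :]) / 0.2)  # exponential covariance

lam, phi = np.linalg.eigh(C)                 # columns of phi are eigenvectors
lam = lam.clip(min=0.0)                      # guard against tiny negative rounding

# The expansion reconstructs C exactly: C = Phi diag(lam) Phi^T
assert np.allclose(phi @ np.diag(lam) @ phi.T, C)

# Draws Z = sum_i sqrt(lam_i) A_i phi_i with iid N(0,1) coefficients
# have covariance C (checked by Monte Carlo):
A = rng.standard_normal((len(s), 20_000))    # each column: one set of A_i
Z = phi @ (np.sqrt(lam)[:, None] * A)        # each column: one realization
emp = Z @ Z.T / Z.shape[1]
print(np.abs(emp - C).max() < 0.1)
```

Swapping the eigenvector matrix `phi` for a wavelet matrix, and the diagonal `diag(lam)` for a near-diagonal matrix D of dependent-coefficient covariances, gives exactly the covariance expansion on the next slide.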

Covariance expansion
For a covariance matrix Σ, write Σ = W^T D W, where W is the matrix of the 2-dimensional wavelet transform. This is useful if D is close to diagonal; enforce that by thresholding off-diagonal elements (setting them all to zero on the finest scales).
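A small sketch of this with a one-level orthonormal Haar transform (the slides use a full multiresolution basis; the single level, the exponential Σ, and the 1% threshold here are illustrative simplifications):

```python
import numpy as np

def haar_matrix(n):
    """Orthonormal single-level Haar transform matrix for even n."""
    W = np.zeros((n, n))
    for k in range(n // 2):
        W[k, 2 * k] = W[k, 2 * k + 1] = 1.0 / np.sqrt(2.0)     # averages
        W[n // 2 + k, 2 * k] = 1.0 / np.sqrt(2.0)              # details
        W[n // 2 + k, 2 * k + 1] = -1.0 / np.sqrt(2.0)
    return W

n = 32
s = np.arange(n)
Sigma = np.exp(-np.abs(s[:, None] - s[None, :]) / 5.0)

W = haar_matrix(n)
assert np.allclose(W @ W.T, np.eye(n))       # W is orthonormal
D = W @ Sigma @ W.T                          # so Sigma = W^T D W exactly

# Threshold: zero small off-diagonal entries of D, keep the diagonal
D_thr = np.where(np.abs(D) < 0.01 * np.abs(D).max(), 0.0, D)
np.fill_diagonal(D_thr, np.diag(D))
Sigma_hat = W.T @ D_thr @ W

print(np.abs(Sigma_hat - Sigma).max())       # small, bounded by the threshold
```

Because W is orthonormal, the reconstruction error equals the norm of the zeroed entries of D, so a near-diagonal D yields a cheap, accurate compressed covariance.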

Surface ozone model
ROM daily average ozone on a 48 × 48 grid of 26 km × 26 km cells centered on Illinois and Ohio, for 79 summer days. The coarsest level is 3 × 3 (correlation length is about 300 km). Decimate the leading 12 × 12 block of D by 90%, and retain only diagonal elements for the remaining levels.

ROM covariance