Geo597 Geostatistics Ch9 Random Function Models

The Necessity of Modeling
- Estimation requires a model of how the phenomenon behaves at locations where it has not been sampled.
- Geostatistics emphasizes the underlying model in order to infer the unknown values at locations where the phenomenon has not been sampled.

The Necessity of Modeling...
- If we know the physical or chemical processes that generated the data, deterministic models can describe the behavior of the phenomenon from only a few samples.
- In most earth science applications, the data are the result of a vast number of processes whose complex interactions we are not yet able to describe quantitatively.
- Random function models recognize this uncertainty and estimate values at unsampled locations based on assumptions about the statistical characteristics of the phenomenon.

The Necessity of Modeling...
- Without an exhaustive data set against which to check the estimates, it is impossible to prove whether a model is right or wrong.
- The judgment of goodness is largely qualitative and depends on the appropriateness of the underlying model.
- This judgment, which must take into account the goals of the study, benefits considerably from a clear statement of the model.

Deterministic Models
- Examples: the height of a bouncing ball vs. interest rates of a bank (Fig 9.2, 9.3).
- Based on simplifying assumptions, deterministic models can capture the overall character of a phenomenon and extrapolate beyond the available samples.
- Deterministic modeling is possible only if the context of the data values is well understood. The data values, by themselves, do not reveal what the appropriate model should be.

Probabilistic Models
- In the earth sciences, the available sample data are viewed as the result of some random process. Although the data may not actually be generated by a random process, this view is a useful way to predict unknown values.
- The geostatistical approach to estimation is therefore based on a probabilistic model.
- The probabilistic model also enables us to gauge the accuracy of our estimates and to assign confidence intervals to them.

Probabilistic Models...
- The most commonly used geostatistical estimation methods require only certain parameters of the random process.
- Most frequently used: the mean and variance of a linear combination of random variables.

Random Variables
- A random variable is a variable whose values are randomly generated according to some probabilistic mechanism.
- The random variable V vs. its actual outcomes v.
- All possible outcomes: v^(1), v^(2), ..., v^(n)
- Actually observed outcomes: v_1, v_2, ..., v_m
- A set of corresponding probabilities: p_1, p_2, ..., p_n

Random Variables...
Results of throwing a die
- Random variable: D
- Possible outcomes: d^(1)=1, d^(2)=2, d^(3)=3, d^(4)=4, d^(5)=5, d^(6)=6
- Probability of each outcome: p_1 = p_2 = p_3 = p_4 = p_5 = p_6 = 1/6
- Observed outcomes: 4, 5, 3, 3, 2, 4, 3, 5, 5, 6, 6, 2, 5, 2, 1, 4
  (d_1 = 4, d_2 = 5, ..., d_10 = 6, ..., d_16 = 4)

Random Variables...
- The possible outcomes of a random variable need not all have equal probability.
- Example: throw two dice and take the larger of the two.
  First die:  4, 5, 3, 3, 2, 4, 3, 5, 5, 6, 6, 2, 5, 2, 1, 4
  Second die: 1, 4, 4, 3, 1, 3, 5, 2, 3, 2, 3, 3, 4, 6, 5, 4
- Observed outcomes l_i: 4, 5, 4, 3, 2, 4, 5, 5, 5, 6, 6, 3, 5, 6, 5, 4
- Probability of each outcome l^(i) (p_i): 1 (1/36), 2 (3/36), 3 (5/36), 4 (7/36), 5 (9/36), 6 (11/36)
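
A quick way to verify these probabilities is to enumerate all 36 equally likely pairs of throws. The following Python sketch is illustrative only and is not part of the original slides:

```python
from itertools import product

# Enumerate all 36 equally likely (die1, die2) pairs and count how often
# each value k turns out to be the larger of the two.
counts = {k: 0 for k in range(1, 7)}
for d1, d2 in product(range(1, 7), repeat=2):
    counts[max(d1, d2)] += 1

for k, c in counts.items():
    print(f"P(L = {k}) = {c}/36")
# P(L = 1) = 1/36, P(L = 2) = 3/36, ..., P(L = 6) = 11/36
```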

Functions of Random Variables
- It is also possible to define other random variables by performing mathematical operations on the outcomes of a random variable.
  e.g. 2D: d = {1, 2, 3, 4, 5, 6}, 2d = {2, 4, 6, 8, 10, 12}, p_i = 1/6
  e.g. L^2 + L: l = {1, 2, 3, 4, 5, 6}, l^2 + l = {2, 6, 12, 20, 30, 42}
  l^2 + l (p_i): 2 (1/36), 6 (3/36), 12 (5/36), 20 (7/36), 30 (9/36), 42 (11/36)
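
As a sketch (assumed Python, not from the slides), the distribution of f(L) = L^2 + L can be obtained by transforming each outcome while keeping its probability:

```python
from itertools import product

# Distribution of f(L) = L**2 + L, where L is the larger of two dice.
# Outcomes are transformed; their probabilities carry over unchanged.
counts = {}
for d1, d2 in product(range(1, 7), repeat=2):
    l = max(d1, d2)
    counts[l * l + l] = counts.get(l * l + l, 0) + 1

print(sorted(counts.items()))
# [(2, 1), (6, 3), (12, 5), (20, 7), (30, 9), (42, 11)]  -> counts out of 36
```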

Functions of Random Variables
- Or on the outcomes of several random variables.
  e.g. T = D_1 + D_2
  Observed outcomes t_i: 5, 9, 7, 6, 3, 7, 8, 7, 8, 8, 9, 5, 9, 8, 6, 8
  Possible outcomes t^(i) (p_i): 2 (1/36), 3 (2/36), 4 (3/36), 5 (4/36), 6 (5/36), 7 (6/36), 8 (5/36), 9 (4/36), 10 (3/36), 11 (2/36), 12 (1/36)
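
A minimal sketch (not part of the slides) that reproduces the distribution of T = D_1 + D_2 by enumeration:

```python
from itertools import product

# Count how often each sum t = d1 + d2 occurs among the 36 equally likely pairs.
counts = {}
for d1, d2 in product(range(1, 7), repeat=2):
    counts[d1 + d2] = counts.get(d1 + d2, 0) + 1

for t in range(2, 13):
    print(f"P(T = {t}) = {counts[t]}/36")
# 1/36, 2/36, 3/36, 4/36, 5/36, 6/36, 5/36, 4/36, 3/36, 2/36, 1/36
```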

Functions of Random Variables...
- For a random variable V with possible outcomes v^(1), ..., v^(n) and corresponding probabilities p_1, ..., p_n, the random variable f(V) has possible outcomes f(v^(1)), ..., f(v^(n)) with the same probabilities.
- It is difficult to define the complete set of possible outcomes for random variables that are functions of other random variables. Fortunately, we never have to deal with anything more complicated than a sum of several random variables.

Functions of Random Variables...
- We often use transformation functions so that the underlying distribution of the random variable of interest satisfies the assumption of being close to a normal distribution.

Parameters of a Random Variable
- The set of possible outcomes and their corresponding probabilities is referred to as the probability distribution of a random variable.
- If the probability distribution is known, one can calculate parameters that describe features of the random variable.
- Examples of parameters: minimum, maximum, mean, and standard deviation.

Parameters of a Random Variable...
- In general, the complete distribution cannot be determined from a few parameters, but a Gaussian distribution is completely determined by its mean and variance.
- Parameters are properties of the model; they are not obtained by calculating sample statistics from the observed outcomes of a random variable.
- The statistical mean of the 16 observed die throws is 3.75, but the mean, as a parameter of the random variable D, is 3.5.
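
A short sketch (illustrative Python, not from the slides) contrasting the sample statistic with the model parameter:

```python
# Sample mean of the 16 observed throws vs. the model parameter E{D}.
observed = [4, 5, 3, 3, 2, 4, 3, 5, 5, 6, 6, 2, 5, 2, 1, 4]
sample_mean = sum(observed) / len(observed)
parameter_mean = sum(d * (1 / 6) for d in range(1, 7))
print(sample_mean)     # 3.75 -> a statistic of this particular sample
print(parameter_mean)  # 3.5  -> a parameter of the random variable D
```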

Parameters of a Random Variable...
- Parameters of a conceptual model: e.g., the expected value E{V} and the variance Var{V}.
- Statistics from a set of observations: e.g., the sample mean m and the sample variance s^2.

Parameters of a Random Variable...
- Expected value: E{V} = sum_i p_i v^(i)
- Expected value of L:
  E{L} = (1/36)(1) + (3/36)(2) + (5/36)(3) + (7/36)(4) + (9/36)(5) + (11/36)(6) = 4.47
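
The arithmetic can be checked with a small sketch (assumed Python, not part of the slides):

```python
# E{L} = sum_i p_i * l^(i) for L = the larger of two dice.
probs = {1: 1/36, 2: 3/36, 3: 5/36, 4: 7/36, 5: 9/36, 6: 11/36}
expected_L = sum(p * l for l, p in probs.items())
print(round(expected_L, 2))  # 4.47
```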

Parameters of a Random Variable...
- Variance: Var{V} = E{V^2} - (E{V})^2
- Variance of L:
  Var{L} = (1/36)(1^2) + (3/36)(2^2) + ... - [(1/36)(1) + (3/36)(2) + ...]^2 = 1.97
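
A matching sketch (illustrative only) for the variance:

```python
# Var{L} = E{L^2} - (E{L})^2 for L = the larger of two dice.
probs = {1: 1/36, 2: 3/36, 3: 5/36, 4: 7/36, 5: 9/36, 6: 11/36}
e_l = sum(p * l for l, p in probs.items())
e_l2 = sum(p * l ** 2 for l, p in probs.items())
print(round(e_l2 - e_l ** 2, 2))  # 1.97
```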

Joint Random Variables
- Random variables can be generated in pairs by some probabilistic mechanism; the outcome of one may influence the outcome of the other.
- The possible outcomes of the pair (U, V): (u^(i), v^(j))
- With the corresponding probabilities: p_ij = P{U = u^(i) and V = v^(j)}
- where there are n possible outcomes for U and m possible outcomes for V.

Joint Random Variables...
- e.g. (L, S), where L is the larger of two throws and S is the smaller of the two.
- Observed pairs (l_i, s_i): (4,1) (5,4) (4,3) (3,3) (2,1) (4,3) (5,3) (5,2) (5,3) (6,2) (6,3) (3,2) (5,4) (6,2) (5,1) (4,4)

Joint Random Variables...
- Observed pairs (l_i, s_i): (4,1) (5,4) (4,3) (3,3) (2,1) (4,3) (5,3) (5,2) (5,3) (6,2) (6,3) (3,2) (5,4) (6,2) (5,1) (4,4)
- Joint probabilities p_ij = P{L = l^(i), S = s^(j)}:

                 possible outcomes of S: s^(j)
                   1     2     3     4     5     6
  l^(i) = 1      1/36    0     0     0     0     0
  l^(i) = 2      2/36  1/36    0     0     0     0
  l^(i) = 3      2/36  2/36  1/36    0     0     0
  l^(i) = 4      2/36  2/36  2/36  1/36    0     0
  l^(i) = 5      2/36  2/36  2/36  2/36  1/36    0
  l^(i) = 6      2/36  2/36  2/36  2/36  2/36  1/36
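
The whole joint table can be regenerated by enumeration; the sketch below (illustrative Python, not from the slides) prints the counts out of 36:

```python
from itertools import product

# Joint counts for (L, S) = (larger, smaller) of two dice over the 36 pairs.
counts = {}
for d1, d2 in product(range(1, 7), repeat=2):
    key = (max(d1, d2), min(d1, d2))
    counts[key] = counts.get(key, 0) + 1

for i in range(1, 7):
    row = [counts.get((i, j), 0) for j in range(1, 7)]
    print(f"l = {i}:", [f"{c}/36" for c in row])
# Each row has 2/36 for s < l, 1/36 for s = l, and 0 for s > l.
```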

Marginal Distribution
- The marginal distribution is the distribution of a single random variable irrespective of the other random variable.
- Discrete case: P{U = u^(i)} = sum_j p_ij
- P{L = 5} = p_5 = sum_j p_5j = 2/36 + 2/36 + 2/36 + 2/36 + 1/36 = 9/36
- The same as Table 9.1 (p. 204).
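
A sketch (not from the slides) obtaining this marginal by summing a row of the joint table:

```python
from itertools import product

# Marginal P{L = 5} = sum over j of P{L = 5, S = j}.
counts = {}
for d1, d2 in product(range(1, 7), repeat=2):
    key = (max(d1, d2), min(d1, d2))
    counts[key] = counts.get(key, 0) + 1

numerator = sum(counts.get((5, j), 0) for j in range(1, 7))
print(f"P(L = 5) = {numerator}/36")  # 9/36
```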

Conditional Distributions
- Using the joint distribution of two random variables, we can calculate the distribution of one variable given a particular outcome of the other.
- Discrete case: P{U = u^(i) | V = v^(j)} = p_ij / sum_i p_ij

Conditional Distributions...
- Conditional distribution, discrete case:
  P{L = 3 | S = 3} = P{L = 3, S = 3} / P{S = 3}
                   = (1/36) / (2/36 + 2/36 + 2/36 + 1/36 + 0 + 0) = 1/7
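
The same value can be checked by enumeration (illustrative sketch, not part of the slides):

```python
from fractions import Fraction
from itertools import product

# P{L = 3 | S = 3} = P{L = 3, S = 3} / P{S = 3}.
joint = {}
for d1, d2 in product(range(1, 7), repeat=2):
    key = (max(d1, d2), min(d1, d2))
    joint[key] = joint.get(key, 0) + 1

p_joint = Fraction(joint.get((3, 3), 0), 36)                          # 1/36
p_s3 = Fraction(sum(joint.get((i, 3), 0) for i in range(1, 7)), 36)   # 7/36
print(p_joint / p_s3)  # 1/7
```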

Parameters of Joint Random Variables
- Covariance:
  Cov{UV} = E{(U - E{U})(V - E{V})} = E{UV} - E{U} E{V}
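
For the dice example, the covariance of L and S can be computed by enumeration (sketch, not from the slides; the numerical value is an illustration, not a figure quoted in the text):

```python
from itertools import product

# Cov{LS} = E{LS} - E{L}E{S}, computed over the 36 equally likely pairs
# (L, S) = (larger, smaller) of two dice.
pairs = [(max(d1, d2), min(d1, d2)) for d1, d2 in product(range(1, 7), repeat=2)]
e_l = sum(l for l, s in pairs) / 36
e_s = sum(s for l, s in pairs) / 36
e_ls = sum(l * s for l, s in pairs) / 36
print(round(e_ls - e_l * e_s, 3))  # ~0.945
```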

Parameters of Joint Random Variables...
- Correlation coefficient:
  rho_UV = Cov{UV} / (sigma_U sigma_V)
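
A companion sketch (illustrative only) for the correlation coefficient of L and S:

```python
from itertools import product

# rho_LS = Cov{LS} / (sigma_L * sigma_S) for the dice example above.
pairs = [(max(d1, d2), min(d1, d2)) for d1, d2 in product(range(1, 7), repeat=2)]
n = len(pairs)
e_l = sum(l for l, s in pairs) / n
e_s = sum(s for l, s in pairs) / n
cov = sum((l - e_l) * (s - e_s) for l, s in pairs) / n
sigma_l = (sum((l - e_l) ** 2 for l, s in pairs) / n) ** 0.5
sigma_s = (sum((s - e_s) ** 2 for l, s in pairs) / n) ** 0.5
print(round(cov / (sigma_l * sigma_s), 2))  # ~0.48
```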

Weighted Linear Combinations of Random Variables
- Expected value of a weighted linear combination (9.14, p. 216):
  E{ sum_{i=1}^{n} w_i V_i } = sum_{i=1}^{n} w_i E{V_i}
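
A small numerical check (sketch, not from the slides; the weights are arbitrary illustrative values) using two fair dice:

```python
from itertools import product

# Check E{w1*D1 + w2*D2} = w1*E{D1} + w2*E{D2} for two fair dice.
w1, w2 = 0.25, 0.75  # arbitrary example weights, not taken from the notes
lhs = sum((w1 * d1 + w2 * d2) / 36 for d1, d2 in product(range(1, 7), repeat=2))
rhs = w1 * 3.5 + w2 * 3.5
print(round(lhs, 6), round(rhs, 6))  # both 3.5
```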

Weighted Linear Combinations of Random Variables...
- Variance of a weighted linear combination:
  Var{ sum_{i=1}^{n} w_i V_i } = sum_i sum_j w_i w_j Cov{V_i V_j}
- Cov{UV} = 0 if U and V are independent.
- If the V_i are independent:
  Var{ sum_{i=1}^{n} w_i V_i } = sum_i w_i^2 Var{V_i}
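
The independent case can be verified with the same dice example (sketch, illustrative weights):

```python
from itertools import product

# Check Var{w1*D1 + w2*D2} = w1^2*Var{D1} + w2^2*Var{D2} for two
# independent fair dice (variance of one fair die = 35/12).
w1, w2 = 0.25, 0.75  # arbitrary example weights, not taken from the notes
values = [w1 * d1 + w2 * d2 for d1, d2 in product(range(1, 7), repeat=2)]
mean = sum(values) / 36
lhs = sum((v - mean) ** 2 for v in values) / 36
rhs = (w1 ** 2 + w2 ** 2) * 35 / 12
print(round(lhs, 6), round(rhs, 6))  # both ~1.822917
```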