
Presentation transcript:

Vamsi Sundus Shawnalee

“Data collected under different conditions (i.e., treatments) […] whether the conditions are different from each other and […] how the differences manifest themselves.” The data here concern soil.

Soils were first chisel-plowed in the spring, and samples from the top 0-2 inches were collected. For each sample, the total nitrogen percentage (TN) and total carbon percentage (TC) were measured, and the C/N ratio was calculated for comparison between the two treatments. Looking at the sampling layout on page 674, we see that the treatments were not randomly allocated, so we expect some spatial autocorrelation among the sample sites.

The author calculated a simple pooled t-test: p = .809. Since p > α, we fail to reject the hypothesis that the treatment means are equal. But this test does not account for spatial autocorrelation among the 195 chisel-plow and 200 no-till strips, and it says nothing about differences in the spatial structure of the treatments.
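The pooled t-test is easy to reproduce. A minimal sketch in Python (the book uses SAS and the class uses R); the C/N values below are made up for illustration, not the actual soil data:

```python
# Pooled two-sample t-test, as applied to the C/N ratios.
# NOTE: the data here are simulated stand-ins, not the textbook's soil data.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
chisel = rng.normal(11.0, 0.8, size=195)    # hypothetical chisel-plow C/N values
no_till = rng.normal(11.0, 0.8, size=200)   # hypothetical no-till C/N values

# equal_var=True gives the pooled test; it also assumes INDEPENDENT
# observations, which spatial autocorrelation among nearby strips violates.
t_stat, p_value = stats.ttest_ind(chisel, no_till, equal_var=True)

# Manual pooled computation for comparison
n1, n2 = len(chisel), len(no_till)
sp2 = ((n1 - 1) * chisel.var(ddof=1) + (n2 - 1) * no_till.var(ddof=1)) / (n1 + n2 - 2)
t_manual = (chisel.mean() - no_till.mean()) / np.sqrt(sp2 * (1 / n1 + 1 / n2))
```

A large p-value under this test only means "no mean difference detected under independence"; it cannot speak to the spatial structure.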

They used SAS to fit semivariograms by least squares and by restricted maximum likelihood (REML); a common nugget effect was fit for both treatments. A considerable share of the variability in the C/N ratios is due to the nugget effect. Using proc mixed, we obtain predictions of the C/N ratio.

With proc mixed, the C/N ratios are assumed to depend on the tillage treatments. The SAS program is included in the section but omitted here, since this is a class in R. One programming detail matters for the semivariograms: both treatments must be constrained to have the same nugget effect.
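The shared-nugget constraint can be sketched outside SAS. Below, a spherical semivariogram is fit to two treatments by least squares with one common nugget but treatment-specific sills and ranges; the empirical semivariogram values are hypothetical, chosen only to make the sketch runnable:

```python
# Fitting spherical semivariograms to two treatments with a shared nugget.
# The (lag, gamma) values are invented for illustration.
import numpy as np
from scipy.optimize import least_squares

def spherical(h, nugget, sill, rng_):
    """Spherical semivariogram: 0 at h=0, nugget jump, sill reached at the range."""
    h = np.asarray(h, dtype=float)
    g = np.where(h < rng_,
                 nugget + (sill - nugget) * (1.5 * h / rng_ - 0.5 * (h / rng_) ** 3),
                 sill)
    return np.where(h == 0, 0.0, g)

# Hypothetical empirical semivariogram values for each treatment
lags = np.array([10., 20., 30., 40., 60., 80.])
gam_chisel = np.array([0.30, 0.42, 0.50, 0.54, 0.55, 0.56])
gam_notill = np.array([0.28, 0.38, 0.48, 0.55, 0.60, 0.61])

def resid(p):
    # One common nugget; separate sill and range per treatment
    nugget, s1, r1, s2, r2 = p
    return np.concatenate([spherical(lags, nugget, s1, r1) - gam_chisel,
                           spherical(lags, nugget, s2, r2) - gam_notill])

fit = least_squares(resid, x0=[0.2, 0.6, 50., 0.6, 50.],
                    bounds=([0, 0, 1, 0, 1], np.inf))
nugget_hat = fit.x[0]   # the single nugget shared by both treatments
```

Stacking both treatments' residuals into one objective is what forces the common nugget; fitting each treatment separately would not.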

Looking at the surface plots in the SAS output, one surface (no-tillage) looks smoother and more predictable. This indicates greater spatial continuity, i.e., a larger range: positive autocorrelation stays stronger over the same distance.

At this point in the analysis: there is no difference in the average C/N values in the study (pooled t-test, sampling two months after the treatments were installed), but there are differences in the spatial structure of the treatments (3D plots). A sum-of-squares reduction (SSR) test shows it is highly significant that a single spherical semivariogram cannot serve for both treatments (Ha). Ordinary least squares also finds this significant, though less strongly than REML.
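The SSR test compares a reduced fit (one semivariogram forced on both treatments) to a full fit (separate parameters) via the usual F-statistic. A sketch with illustrative numbers, not the book's actual sums of squares:

```python
# Sum-of-squares reduction (SSR) F-test: reduced model = single spherical
# semivariogram for both treatments; full model = treatment-specific fits.
# All numbers below are made up for illustration.
import numpy as np
from scipy import stats

ssr_reduced = 1.85    # residual SS when one semivariogram is forced on both
ssr_full = 0.60       # residual SS with separate sills and ranges
df_extra = 2          # extra parameters in the full model (sill, range)
n_lags = 24           # total empirical semivariogram points fit
df_full = n_lags - 6  # full model uses 2 x (nugget, sill, range) = 6 params

# F = (reduction in SS per extra parameter) / (full-model mean squared error)
F = ((ssr_reduced - ssr_full) / df_extra) / (ssr_full / df_full)
p_value = stats.f.sf(F, df_extra, df_full)
```

A small p-value here is evidence against the reduced model, i.e., evidence that the two treatments need different semivariograms.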

Next section

What if only one variable is important (i.e., either C or N), but not the combination of the two (the C/N or N/C ratio)? Here we consider predicting soil carbon as a function of soil nitrogen. The scatterplot of TC versus TN shows an extremely strong linear relationship. [pg. 679]

If we want a more accurate model, though, we have to include the spatial structure. In the linear model TC(s_i) = β0 + β1*TN(s_i) + e(s_i), the errors e(s_i) are spatially correlated, and that correlation needs to be modeled.

We need to model the semivariogram, in two steps. (1) Fit the model by ordinary least squares, and compute the empirical semivariogram of the OLS residuals to suggest a theoretical semivariogram model; the theoretical model supplies initial semivariogram parameter values. (2) Estimate the mean and the autocorrelation structure together by restricted maximum likelihood. Here, proc mixed estimates both the mean function and the autocorrelation structure, and produces predictions at unobserved locations.
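Step (1) can be sketched directly. Below, TC is regressed on TN by OLS and the empirical (Matheron) semivariogram of the residuals is computed; the locations and measurements are simulated stand-ins for the soil samples, not the textbook data:

```python
# Step 1 of the two-step procedure: OLS fit of TC ~ TN, then the empirical
# semivariogram of the OLS residuals (used to pick a theoretical model and
# starting values for REML). Data are simulated for illustration.
import numpy as np

rng = np.random.default_rng(1)
n = 200
coords = rng.uniform(0, 100, size=(n, 2))        # hypothetical sample locations
tn = rng.uniform(0.05, 0.25, size=n)             # total nitrogen (%)
tc = 0.1 + 10.0 * tn + rng.normal(0, 0.05, n)    # total carbon (%), linear in TN

# OLS fit of TC(s) = b0 + b1 * TN(s) + e(s)
X = np.column_stack([np.ones(n), tn])
b0, b1 = np.linalg.lstsq(X, tc, rcond=None)[0]
resid = tc - (b0 + b1 * tn)

def empirical_semivariogram(coords, z, bins):
    """Matheron estimator: average of 0.5*(z_i - z_j)^2 within each lag bin."""
    d = np.sqrt(((coords[:, None, :] - coords[None, :, :]) ** 2).sum(-1))
    sq = 0.5 * (z[:, None] - z[None, :]) ** 2
    iu = np.triu_indices(len(z), k=1)             # each pair counted once
    d, sq = d[iu], sq[iu]
    idx = np.digitize(d, bins)
    return np.array([sq[idx == k].mean() for k in range(1, len(bins))])

gamma = empirical_semivariogram(coords, resid, bins=np.arange(0, 60, 10))
```

Plotting `gamma` against the bin midpoints is what suggests a theoretical shape (spherical, exponential, ...) and rough nugget/sill/range starting values for the REML step.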

1 − (residual sum of squares)/(corrected total sum of squares) = .92, an estimate of R². The proc mixed procedure generates a lot of output (Output 9.17, pg. 682-683). In that output, the "solutions for fixed effects" table gives estimates of the parameters we're interested in: β0 (the intercept) and β1 (the TN coefficient).
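That R² formula in miniature, with made-up fitted values rather than the textbook's:

```python
# R^2 = 1 - RSS / (corrected total SS); tiny illustration with invented numbers.
import numpy as np

y = np.array([1.0, 2.0, 3.0, 4.0, 5.0])      # observed
yhat = np.array([1.1, 1.9, 3.2, 3.8, 5.0])   # fitted

rss = ((y - yhat) ** 2).sum()
tss = ((y - y.mean()) ** 2).sum()   # "corrected" = deviations from the mean
r2 = 1 - rss / tss
```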

For every additional percentage point of N, predicted C increases by β̂1 percentage points (the slope estimate from the output). Comparing the two panels of Figure 9.50 is a short game of "find the differences": the estimates of the expected value of TC and the predictions of TC show nearly the same pattern. [pg. 684]