“Research Design, Sources of Variance & Initial Discussion of Intent to Treat”
Melinda K. Higgins, Ph.D. – 11 April 2008

Sources of Variance – 1

Variation in all the data can be broken down into: the overall mean (“averages”), relationships (the model), and residuals, which split further into lack of fit and pure error.

Collection of Data

Breakdown Sources of Variance (part 1)

For each observation Y, with Ȳ = average of all the data, Ŷ = model prediction, and J = average response at that factor level:
C = corrected for the mean = Y - Ȳ
R = residual = Y - Ŷ
F = factor (model) = Ŷ - Ȳ
L = “lack of fit” = J - Ŷ
P = “pure error” = Y - J
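A minimal sketch of this breakdown in Python (the data values below are made up for illustration, not the dataset from the slides; NumPy is assumed):

```python
import numpy as np

# Hypothetical data: 3 levels of X with 2 replicates each (n = 6, p = 2, f = 3).
x = np.array([1.0, 1.0, 2.0, 2.0, 3.0, 3.0])
y = np.array([2.1, 2.4, 3.9, 4.2, 6.1, 5.8])

# Simple linear regression fit: yhat = b0 + b1 * x
X = np.column_stack([np.ones_like(x), x])
b, *_ = np.linalg.lstsq(X, y, rcond=None)
yhat = X @ b                                    # model prediction (Y-hat)

ybar = y.mean()                                 # average of all data (Y-bar)
J = np.array([y[x == xi].mean() for xi in x])   # average response at each factor level

C = y - ybar     # corrected for the mean
R = y - yhat     # residual
F = yhat - ybar  # factor (model) component
L = J - yhat     # "lack of fit"
P = y - J        # "pure error"

# The pieces add up for every observation: C = F + R and R = L + P.
assert np.allclose(C, F + R) and np.allclose(R, L + P)
```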

Solving Least-Squares Equations (1)

n = number of data points (samples = 6)
p = number of parameters in the model (intercept and slope = 2)
f = number of factor levels (3 “levels” of X)

Solving Least-Squares Equations (2)
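The matrix equations from this slide are not reproduced in the transcript; as a sketch, the usual normal-equations solution b = (X'X)^(-1) X'y for an intercept-and-slope model can be computed as below (NumPy assumed, same made-up data as above):

```python
import numpy as np

def solve_normal_equations(x: np.ndarray, y: np.ndarray) -> np.ndarray:
    """Return [intercept, slope] by solving (X'X) b = X'y directly."""
    X = np.column_stack([np.ones_like(x), x])   # design matrix: intercept + slope
    return np.linalg.solve(X.T @ X, X.T @ y)

x = np.array([1.0, 1.0, 2.0, 2.0, 3.0, 3.0])
y = np.array([2.1, 2.4, 3.9, 4.2, 6.1, 5.8])
b0, b1 = solve_normal_equations(x, y)
```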

Sources of Variance (“SS/df tree”) – 2

SStotal (df = n)
  SSmean (df = 1)
  SScorr (df = n-1): what you usually see as the “SSt”
    SSfactor/model (df = p-1)
    SSresid (df = n-p)
      SSlof (df = f-p)
      SSpe (df = n-f)
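Reading the tree as equations, each level partitions the one above it, for both the sums of squares and their degrees of freedom:
SStotal = SSmean + SScorr, with n = 1 + (n-1)
SScorr = SSfactor + SSresid, with n-1 = (p-1) + (n-p)
SSresid = SSlof + SSpe, with n-p = (f-p) + (n-f)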

Finish SS/df tree for dataset (n = 6; p = 2; f = 3)

SStotal = Y'Y = 119.00 (df = n = 6)
  SSmean = 104.17 (df = 1)
  SScorr = C'C = 14.833 (df = n-1 = 5)
    SSfact = F'F = 6.2500 (df = p-1 = 1)
    SSresid = R'R = 8.5833 (df = n-p = 4)
      SSlof = L'L = 0.0833 (df = f-p = 1)
      SSpe = P'P = 8.5000 (df = n-f = 3)
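A quick sketch in Python (SS values copied straight from the slide) confirms that each branch of the tree adds up, apart from rounding in SSmean:

```python
# Sums of squares as reported on the slide (n = 6, p = 2, f = 3).
SStotal, SSmean, SScorr = 119.00, 104.17, 14.833
SSfact, SSresid = 6.2500, 8.5833
SSlof, SSpe = 0.0833, 8.5000

assert abs(SStotal - (SSmean + SScorr)) < 0.01   # 119.00  ~ 104.17 + 14.833
assert abs(SScorr - (SSfact + SSresid)) < 0.01   # 14.833  ~ 6.25 + 8.5833
assert abs(SSresid - (SSlof + SSpe)) < 0.01      # 8.5833  ~ 0.0833 + 8.5

# Degrees of freedom partition the same way.
n, p, f = 6, 2, 3
assert 1 + (n - 1) == n and (p - 1) + (n - p) == n - 1 and (f - p) + (n - f) == n - p
```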

SPSS Results – Part 1 (Variance Components)

Variance Components Estimation: /Analyze/General Linear Model/Variance Components
The output reports SSmean, SSpe, SStotal, and SScorr. But where are SSfact, SSresid, and SSlof?

SPSS Results – Part 2 (Regression)

The regression output reports SSfact, SSresid, and SScorr. ... and SSlof?
SSlof = SSresid - SSpe = 8.5833 - 8.5000 = 0.0833
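Not shown on the slides, but once SSpe and SSlof have been separated the standard follow-up is the lack-of-fit F test, which compares MSlof against MSpe. A sketch using the slide's values (SciPy assumed for the p-value):

```python
from scipy.stats import f

SSlof, df_lof = 0.0833, 1   # df = f - p
SSpe,  df_pe  = 8.5000, 3   # df = n - f

F_lof = (SSlof / df_lof) / (SSpe / df_pe)   # about 0.03
p_value = f.sf(F_lof, df_lof, df_pe)        # large p-value: no evidence of lack of fit
```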

VIII. Statistical Resources and Contact Info

SON shared drive: S:\Shared\Statistics_MKHiggins
A shared resource for all of SON, faculty and students. Will be continually updated with tip sheets (for SPSS, SAS, and other software), lectures (PPTs and handouts), datasets, and other resources and references.

Statistics At Nursing website [moving to main website]: S:\Shared\Statistics_MKHiggins\website2\index.htm

Blackboard site (in development): “Organization: Statistics at School of Nursing”

Contact: Dr. Melinda Higgins, Melinda.higgins@emory.edu
Office: 404-727-5180 / Mobile: 404-434-1785