General Linear Model. Instructional Materials: http://core.ecu.edu/psyc/wuenschk/PP/PP-MultReg.htm

Presentation transcript:

General Linear Model

Instructional Materials: http://core.ecu.edu/psyc/wuenschk/PP/PP-MultReg.htm

aka, Introducing the General

Linear Models
As noted by the General, the GLM can be used to relate one set of things (Ys) to another set of things (Xs). It can also be used with only one set of things.

Bivariate Linear Function
Y = a + bX + error
This is probably what you have in mind when thinking of a linear model. Spatially, it is represented in two-dimensional (Cartesian) space.
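A minimal sketch of this function in practice, assuming NumPy is available and using made-up numbers (none of this is from the slides):

```python
import numpy as np

x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])   # predictor X (hypothetical)
y = np.array([2.1, 3.9, 6.2, 8.1, 9.8])   # criterion Y (hypothetical)

b, a = np.polyfit(x, y, deg=1)             # slope b and intercept a of the fitted line
y_hat = a + b * x                          # predicted Y
error = y - y_hat                          # the "error" term in Y = a + bX + error
print(f"a = {a:.3f}, b = {b:.3f}")
```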

Least Squares Criterion
Linear models produce parameter estimates (intercepts and slopes) such that the sum of squared deviations between Y and predicted Y is minimized.
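To make the criterion concrete, here is a sketch (same made-up data as above) of the closed-form least-squares estimates, with a check that nudging the slope away from the estimate can only increase the sum of squared errors:

```python
import numpy as np

x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([2.1, 3.9, 6.2, 8.1, 9.8])

# closed-form least-squares estimates
b = np.sum((x - x.mean()) * (y - y.mean())) / np.sum((x - x.mean()) ** 2)
a = y.mean() - b * x.mean()

def sse(a_, b_):
    """Sum of squared deviations between Y and predicted Y."""
    return np.sum((y - (a_ + b_ * x)) ** 2)

print(sse(a, b))        # SSE at the least-squares estimates
print(sse(a, b + 0.1))  # any other slope/intercept pair gives a larger SSE
```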

Univariate Regression
The mean is a univariate least squares predictor. The prediction model is predicted Y = mean Y. The sum of the squared deviations between Y and mean Y is smaller than it would be for any other single reference value of Y.
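A quick numerical check of this claim, using made-up scores and a grid of candidate constant predictors (a sketch, not part of the original materials):

```python
import numpy as np

y = np.array([2.0, 4.0, 4.0, 6.0, 9.0])                  # hypothetical scores
candidates = np.linspace(y.min(), y.max(), 2001)          # candidate constant predictors
sse = np.array([np.sum((y - c) ** 2) for c in candidates])
best_c = candidates[np.argmin(sse)]
print(best_c, y.mean())   # the SSE-minimizing constant is (to grid precision) the mean
```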

Fixed and Random Variables
A FIXED variable is one for which you have every possible value of interest in your sample.
– Example: subject sex, female or male.
A RANDOM variable is one where the sample values are randomly obtained from the population of values.
– Example: height of subject.

Correlation & Regression
If Y is random and X is fixed, the model is a regression model. If both Y and X are random, the model is a correlation model. Researchers generally think of it this way:
– Correlation = compute the correlation coefficient, r
– Regression = find an equation to predict Y from X
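A small sketch contrasting the two habits the slide describes, assuming SciPy is available and reusing the made-up data from the earlier examples:

```python
import numpy as np
from scipy import stats

x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([2.1, 3.9, 6.2, 8.1, 9.8])

r = np.corrcoef(x, y)[0, 1]        # "correlation": a single index of linear association
fit = stats.linregress(x, y)       # "regression": an equation for predicting Y from X
print(f"r = {r:.3f}")
print(f"Y-hat = {fit.intercept:.3f} + {fit.slope:.3f} * X")
```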

Assumptions, Bivariate Correlation
1. Homoscedasticity across Y|X
2. Normality of Y|X
3. Normality of Y ignoring X
4. Homoscedasticity across X|Y
5. Normality of X|Y
6. Normality of X ignoring Y
The first three should look familiar; you make them with the pooled-variances t.

Bivariate Normal

When Do Assumptions Apply?
Only when employing t or F, that is, when obtaining a p value or constructing a confidence interval. With regression analysis, only the first three assumptions (regarding Y) are made.

Sources of Error
Y = a + bX + error
– Error in the measurement of X and/or Y, or in the manipulation of X.
– The influence upon Y of variables other than X (extraneous variables), including variables that interact with X.
– Any nonlinear influence of X upon Y.

The Regression Line
r² < 1 ⇒ predicted Y regresses towards mean Y. In univariate regression, it regresses all the way to the mean for every case.
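One way to see this (a standard identity, not spelled out on the slide): in standardized (z-score) form the least-squares regression line is

```latex
\hat{z}_Y = r \, z_X ,
```

so whenever |r| < 1 the predicted score sits closer to the mean (z = 0) than the observed z_X, and when r = 0 (the univariate case, no predictor) the prediction is the mean itself for every case.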

Uses of Correlation/Regression Analysis
– Measure the degree of linear association
– Correlation does imply causation
  – Necessary but not sufficient
  – Third-variable problems
– Reliability
– Validity
– Independent samples t = point-biserial r
  – Y = a + b·Group (Group is coded 0 or 1); see the sketch below
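A hedged sketch of that last equivalence, assuming SciPy and using made-up scores: regressing Y on a 0/1 group code gives the same t as the pooled-variances independent-samples t test, and the regression r is the point-biserial correlation.

```python
import numpy as np
from scipy import stats

group = np.array([0, 0, 0, 0, 0, 1, 1, 1, 1, 1])                    # 0/1 dummy code
y = np.array([3.1, 2.8, 3.5, 2.9, 3.3, 4.2, 4.8, 4.1, 4.6, 4.4])    # hypothetical scores

t_test = stats.ttest_ind(y[group == 1], y[group == 0])  # pooled-variances t (default)
fit = stats.linregress(group, y)                          # Y = a + b*Group

print(t_test.statistic)          # the independent-samples t ...
print(fit.slope / fit.stderr)    # ... equals the t for the regression slope
print(fit.rvalue)                # the point-biserial r
```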

Uses of Correlation/Regression Analysis
– Contingency tables: Rows = a + b·Columns
– Multiple correlation/regression
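A brief sketch of the contingency-table case with made-up 0/1 data: correlating two dichotomous variables gives the phi coefficient for the 2×2 table, which is the r behind "Rows = a + b·Columns".

```python
import numpy as np

rows = np.array([0, 0, 0, 1, 1, 1, 1, 0, 1, 0])   # hypothetical row variable (0/1)
cols = np.array([0, 0, 1, 1, 1, 0, 1, 0, 1, 1])   # hypothetical column variable (0/1)

phi = np.corrcoef(rows, cols)[0, 1]                # Pearson r of two 0/1 variables = phi
print(phi)
```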

Uses of Correlation/Regression Analysis
– Analysis of variance (ANOVA): PolitConserv = a + b1·Republican? + b2·Democrat?
  – k = 3; the third group is all others (see the sketch below)
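A sketch of that equivalence, assuming NumPy/SciPy and invented group scores: regressing the outcome on k − 1 dummy codes reproduces the one-way ANOVA F.

```python
import numpy as np
from scipy import stats

y   = np.array([5., 6., 7., 9., 10., 11., 3., 4., 5.])               # hypothetical PolitConserv
rep = np.array([1, 1, 1, 0, 0, 0, 0, 0, 0], dtype=float)             # Republican? dummy
dem = np.array([0, 0, 0, 1, 1, 1, 0, 0, 0], dtype=float)             # Democrat? dummy ("others" = reference)

X = np.column_stack([np.ones_like(y), rep, dem])                      # intercept + 2 dummies
b, *_ = np.linalg.lstsq(X, y, rcond=None)
y_hat = X @ b

ss_reg = np.sum((y_hat - y.mean()) ** 2)
ss_err = np.sum((y - y_hat) ** 2)
k, n = 3, len(y)
F_regression = (ss_reg / (k - 1)) / (ss_err / (n - k))

F_anova = stats.f_oneway(y[:3], y[3:6], y[6:]).statistic
print(F_regression, F_anova)    # the two F values are identical
```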

Uses of Correlation/Regression Analysis
– Canonical correlation/regression: (homophobia, homo-aggression) = (psychopathic deviance, masculinity, hypomania, clinical defensiveness)
  – High homonegativity = hypomanic, unusually frank, stereotypically masculine, psychopathically deviant (antisocial)
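An illustrative sketch of the mechanics only (not the study's actual analysis or data), assuming scikit-learn is available and using random numbers as stand-ins for the two variable sets:

```python
import numpy as np
from sklearn.cross_decomposition import CCA

rng = np.random.default_rng(0)
Y_set = rng.normal(size=(100, 2))   # stand-in for homophobia, homo-aggression
X_set = rng.normal(size=(100, 4))   # stand-in for Pd, Mf, Ma, K scales

cca = CCA(n_components=2).fit(X_set, Y_set)
U, V = cca.transform(X_set, Y_set)  # canonical variates for each set
canonical_rs = [np.corrcoef(U[:, i], V[:, i])[0, 1] for i in range(2)]
print(canonical_rs)                 # correlations between paired canonical variates
```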