Gu Yuxian and Wang Weinan, Beijing National Day School.


Part 1: The Simple Linear Regression. Given two variables X and Y, suppose X_1, …, X_n are measured without error, while Y_1, …, Y_n are measured with error. So we can let Y_i = a + bX_i + ε_i, where ε_i is the random error. We can use the least squares estimator and the maximum likelihood estimator to estimate the parameters a and b.

The Least Squares Estimators. Let Δ = Σ_{i=1}^n (Y_i − a − bX_i)². All we need to do is to minimize Δ. Setting ∂Δ/∂a = 0 and ∂Δ/∂b = 0 and solving the equations gives b̂ = S_XY / S_XX and â = Ȳ − b̂X̄, where S_XY = Σ(X_i − X̄)(Y_i − Ȳ) and S_XX = Σ(X_i − X̄)².
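As a quick illustration, here is a minimal sketch of these least squares formulas in Python. The data and variable names are made up for the example, not taken from the slides:

```python
# Least squares for the simple linear model Y_i = a + b*X_i + e_i.
# Illustrative sketch with made-up data.

def ols(x, y):
    n = len(x)
    xbar = sum(x) / n
    ybar = sum(y) / n
    sxx = sum((xi - xbar) ** 2 for xi in x)                       # S_XX
    sxy = sum((xi - xbar) * (yi - ybar) for xi, yi in zip(x, y))  # S_XY
    b = sxy / sxx          # slope minimizing the sum of squared residuals
    a = ybar - b * xbar    # intercept: the line passes through (xbar, ybar)
    return a, b

x = [1, 2, 3, 4, 5]
y = [2.1, 3.9, 6.2, 8.0, 9.8]   # roughly y = 2x
a, b = ols(x, y)
print(a, b)
```

Note that forcing â = Ȳ − b̂X̄ means the fitted line always passes through the centroid of the data.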

The Maximum Likelihood Estimator. Assume that ε_1, …, ε_n are independent with ε_i ~ N(0, σ²), so Y_i ~ N(a + bX_i, σ²).

The likelihood function is L(a, b, σ²) = Π_{i=1}^n (2πσ²)^(−1/2) exp(−(Y_i − a − bX_i)² / (2σ²)). Compute ∂ ln L/∂a, ∂ ln L/∂b and ∂ ln L/∂σ², and solve the resulting equations. We get the same â and b̂ as the least squares estimators, together with σ̂² = (1/n) Σ_{i=1}^n (Y_i − â − b̂X_i)².
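A small numeric check of this claim, under the normal model above: at the closed-form estimates the gradient of the log-likelihood (the score) should vanish in all three parameters. The data are made up for illustration:

```python
# Verify that the closed-form least squares estimates also satisfy the
# maximum-likelihood score equations under e_i ~ N(0, sigma^2).
# Illustrative data; not from the slides.

x = [1, 2, 3, 4, 5]
y = [2.1, 3.9, 6.2, 8.0, 9.8]
n = len(x)
xbar, ybar = sum(x) / n, sum(y) / n
b_hat = (sum((xi - xbar) * (yi - ybar) for xi, yi in zip(x, y))
         / sum((xi - xbar) ** 2 for xi in x))
a_hat = ybar - b_hat * xbar
resid = [yi - a_hat - b_hat * xi for xi, yi in zip(x, y)]
s2_hat = sum(r * r for r in resid) / n   # MLE of sigma^2 (divides by n)

# Score = gradient of ln L; all three components should be ~0 at the MLE.
score_a = sum(resid) / s2_hat
score_b = sum(xi * r for xi, r in zip(x, resid)) / s2_hat
score_s2 = -n / (2 * s2_hat) + sum(r * r for r in resid) / (2 * s2_hat ** 2)
print(score_a, score_b, score_s2)
```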

Efficiency Analysis. The estimators â and b̂ are unbiased. (The MLE σ̂², which divides by n, is biased; dividing by n − 2 instead gives an unbiased estimator of σ².)

Part 2: Errors-in-Variables (EIV) Regression Model. When the measurements of X are also not accurate, there are two ways to measure the errors: the orthogonal regression and the geometric mean regression.

The Orthogonal Regression (OR). The distance between the regression line Y = a + bX and the point (X_i, Y_i) is |Y_i − a − bX_i| / √(1 + b²). To minimize Δ = Σ_{i=1}^n (Y_i − a − bX_i)² / (1 + b²), compute the partial derivatives and solve ∂Δ/∂a = 0, ∂Δ/∂b = 0. We are supposed to get â = Ȳ − b̂X̄ and b̂ = [S_YY − S_XX + √((S_YY − S_XX)² + 4S_XY²)] / (2S_XY).
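A sketch of this closed-form OR slope, with a spot check that the fitted line really does minimize the perpendicular sum of squares. The data are made up and the names are our own:

```python
import math

# Orthogonal regression: minimize the sum of squared PERPENDICULAR
# distances, sum_i (y_i - a - b*x_i)^2 / (1 + b^2). Illustrative data.

x = [1, 2, 3, 4, 5]
y = [2.1, 3.9, 6.2, 8.0, 9.8]
n = len(x)
xbar, ybar = sum(x) / n, sum(y) / n
sxx = sum((xi - xbar) ** 2 for xi in x)
syy = sum((yi - ybar) ** 2 for yi in y)
sxy = sum((xi - xbar) * (yi - ybar) for xi, yi in zip(x, y))

# Closed-form OR slope (the root of the quadratic that gives the minimum).
b_or = (syy - sxx + math.sqrt((syy - sxx) ** 2 + 4 * sxy ** 2)) / (2 * sxy)
a_or = ybar - b_or * xbar

def perp_ss(a, b):
    """Sum of squared perpendicular distances to the line y = a + b*x."""
    return sum((yi - a - b * xi) ** 2 for xi, yi in zip(x, y)) / (1 + b * b)

print(b_or, perp_ss(a_or, b_or))
```

Perturbing either â or b̂ should only increase the perpendicular sum of squares, which gives a cheap sanity check on the formula.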

The Geometric Mean Regression (GMR). Each point and the line bound a right triangle with vertical leg |Y_i − a − bX_i| and horizontal leg |Y_i − a − bX_i| / |b|, so the area is (Y_i − a − bX_i)² / (2|b|). To minimize Δ = Σ_{i=1}^n (Y_i − a − bX_i)² / |b|, compute the partial derivatives and solve; we get â = Ȳ − b̂X̄ and b̂ = sign(S_XY) √(S_YY / S_XX).
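A sketch of the GMR slope on the same kind of made-up data. It also checks the fact behind the name: the GMR slope is the geometric mean of the Y-on-X slope and the reciprocal of the X-on-Y slope:

```python
import math

# Geometric mean regression: minimize sum_i (y_i - a - b*x_i)^2 / |b|
# (the total triangle area up to a factor of 2). Closed form:
#   b = sign(S_XY) * sqrt(S_YY / S_XX),  a = ybar - b * xbar.
# Illustrative data.

x = [1, 2, 3, 4, 5]
y = [2.1, 3.9, 6.2, 8.0, 9.8]
n = len(x)
xbar, ybar = sum(x) / n, sum(y) / n
sxx = sum((xi - xbar) ** 2 for xi in x)
syy = sum((yi - ybar) ** 2 for yi in y)
sxy = sum((xi - xbar) * (yi - ybar) for xi, yi in zip(x, y))

b_gmr = math.copysign(math.sqrt(syy / sxx), sxy)
a_gmr = ybar - b_gmr * xbar

# "Geometric mean": b_gmr^2 = (slope of y on x) * (1 / slope of x on y).
b_yx = sxy / sxx        # ordinary regression of Y on X
b_xy_inv = syy / sxy    # reciprocal of the X-on-Y slope, in y-x coordinates
print(b_gmr, math.sqrt(b_yx * b_xy_inv))
```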

Parametric Method. Assume X and Y follow a bivariate normal distribution. We use the moment generating function (mgf) to derive the joint distribution of X and Y:

Since the underlying components are independent, we can factor the mgf. From the bivariate normal distribution we then obtain the method of moments estimator (MOME).

We get:

Special Situations for MLE: the Orthogonal Regression (OR) and the Geometric Mean Regression (GMR).

– This is the case where Y has no error. – This is the case where X has no error, so we get the same answer as in our first discussion.

Another Estimator. We want an estimator that (1) uses all the available information (like the MLE) and (2) requires no distributional assumptions (like OR and GMR). Calculate:

Let the estimating function be defined as above. It is strictly increasing, so it is one-to-one.

Since the function changes sign, the estimating equation has at least one root.

So the estimating equation has ONLY one root (under the stated condition), and that root gives the estimator, as we can prove.

Another Estimator Again. This one is based on the angle between the line and each data point. Define the objective from the angles, compute the partial derivatives, and solve; we get the estimator.

Part 3: Multiple Linear Regression. The Least Squares Estimators: similar to simple linear regression, minimize Δ = Σ_{i=1}^n (Y_i − b_0 − b_1X_{i1} − … − b_kX_{ik})². Computing the partial derivatives and setting ∂Δ/∂b_j = 0 for j = 0, 1, …, k, we will get a group of linear equations (the normal equations):

Their coefficient matrix is X^T X, where X is the n × (k+1) design matrix whose first column is all ones. When X^T X is invertible, the solution is b̂ = (X^T X)^(−1) X^T Y.
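A self-contained sketch of this matrix solution, assuming nothing beyond the standard library: it builds the normal equations and solves them with a small Gaussian elimination. The data are an exact linear relation so the recovered coefficients are known:

```python
# Multiple linear regression via the normal equations (X^T X) b = X^T Y,
# solved with a small Gaussian elimination. Illustrative data.

def solve(A, v):
    """Solve A m = v by Gaussian elimination with partial pivoting."""
    n = len(A)
    M = [row[:] + [v[i]] for i, row in enumerate(A)]  # augmented matrix
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        for r in range(col + 1, n):
            f = M[r][col] / M[col][col]
            for c in range(col, n + 1):
                M[r][c] -= f * M[col][c]
    m = [0.0] * n
    for r in range(n - 1, -1, -1):
        m[r] = (M[r][n] - sum(M[r][c] * m[c] for c in range(r + 1, n))) / M[r][r]
    return m

# Design matrix: intercept column plus two regressors.
X = [[1, 1, 2], [1, 2, 1], [1, 3, 4], [1, 4, 3], [1, 5, 5]]
Y = [3 + 2 * x1 - x2 for _, x1, x2 in X]   # exact relation, no noise

XtX = [[sum(r[i] * r[j] for r in X) for j in range(3)] for i in range(3)]
XtY = [sum(r[i] * yi for r, yi in zip(X, Y)) for i in range(3)]
b_hat = solve(XtX, XtY)
print(b_hat)
```

In practice one would use a numerical library (e.g. a least squares routine) rather than forming X^T X explicitly, which can be ill-conditioned; the hand-rolled solver here just keeps the sketch dependency-free.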

Errors-in-Variables (EIV) Regression Model (Two Explanatory Variables): the Orthogonal Regression (OR); the Geometric Mean Regression GMR1 (minimizing the total volume); the Geometric Mean Regression GMR2 (minimizing the sum of areas).