Stats & Summary
The Woodbury Theorem
(A + UCV)^{-1} = A^{-1} − A^{-1}U(C^{-1} + VA^{-1}U)^{-1}VA^{-1}, where the inverses A^{-1} and C^{-1} are assumed to exist.
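A minimal numpy check of the identity; the test matrices and sizes below are illustrative, not from the slides:

import numpy as np

rng = np.random.default_rng(0)
n, k = 5, 2
A = np.diag(rng.uniform(1.0, 2.0, n))        # invertible n x n matrix
U = rng.normal(size=(n, k))
C = np.eye(k)
V = rng.normal(size=(k, n))

# Woodbury: (A + UCV)^{-1} = A^{-1} - A^{-1} U (C^{-1} + V A^{-1} U)^{-1} V A^{-1}
Ainv = np.linalg.inv(A)
lhs = np.linalg.inv(A + U @ C @ V)
rhs = Ainv - Ainv @ U @ np.linalg.inv(np.linalg.inv(C) + V @ Ainv @ U) @ V @ Ainv
print(np.allclose(lhs, rhs))   # True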
Block Matrices
Let the n × m matrix A be partitioned into sub-matrices A11, A12, A21, A22:
A = [ A11 A12 ]
    [ A21 A22 ]
Similarly partition the m × k matrix B into B11, B12, B21, B22, so that the row partition of B matches the column partition of A.

Product of Blocked Matrices
Then
AB = [ A11B11 + A12B21   A11B12 + A12B22 ]
     [ A21B11 + A22B21   A21B12 + A22B22 ]
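A small numpy sketch (random test matrices, illustrative partition sizes) confirming the blockwise product formula:

import numpy as np

rng = np.random.default_rng(1)
n, m, k = 4, 5, 3
A = rng.normal(size=(n, m))
B = rng.normal(size=(m, k))

# Partition A (rows 2+2, columns 3+2) and B (rows 3+2, columns 2+1) conformably
A11, A12 = A[:2, :3], A[:2, 3:]
A21, A22 = A[2:, :3], A[2:, 3:]
B11, B12 = B[:3, :2], B[:3, 2:]
B21, B22 = B[3:, :2], B[3:, 2:]

# Blockwise product formula
AB = np.block([[A11 @ B11 + A12 @ B21, A11 @ B12 + A12 @ B22],
               [A21 @ B11 + A22 @ B21, A21 @ B12 + A22 @ B22]])
print(np.allclose(AB, A @ B))   # True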
The Inverse of Blocked Matrices
Let the n × n matrix A be partitioned into sub-matrices A11, A12, A21, A22, and similarly partition the n × n matrix B into B11, B12, B21, B22. Suppose that B = A^{-1}.

Summarizing
Let
A = [ A11 A12 ]   and   B = [ B11 B12 ]
    [ A21 A22 ]             [ B21 B22 ]
Suppose that A^{-1} = B. Then (provided the required inverses exist)
B11 = (A11 − A12A22^{-1}A21)^{-1}
B12 = −B11A12A22^{-1}
B21 = −A22^{-1}A21B11
B22 = A22^{-1} + A22^{-1}A21B11A12A22^{-1}
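A small numpy sketch (test matrix and partition sizes are illustrative) verifying these block-inverse formulas against a direct inverse:

import numpy as np

rng = np.random.default_rng(2)
n = 6
A = rng.normal(size=(n, n)) + n * np.eye(n)   # well-conditioned, invertible
A11, A12 = A[:3, :3], A[:3, 3:]
A21, A22 = A[3:, :3], A[3:, 3:]

inv = np.linalg.inv
A22i = inv(A22)
B11 = inv(A11 - A12 @ A22i @ A21)             # Schur-complement form
B12 = -B11 @ A12 @ A22i
B21 = -A22i @ A21 @ B11
B22 = A22i + A22i @ A21 @ B11 @ A12 @ A22i

B = np.block([[B11, B12], [B21, B22]])
print(np.allclose(B, inv(A)))   # True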
Symmetric Matrices
An n × n matrix, A, is said to be symmetric if A' = A (equivalently, a_ij = a_ji for all i and j). Note: if A is symmetric and nonsingular, then A^{-1} is also symmetric.
The trace and the determinant of a square matrix
Let A denote the n × n matrix (a_ij). Then
tr(A) = a11 + a22 + … + ann (the sum of the diagonal elements),
and the determinant can be computed by cofactor expansion along any row i:
det(A) = |A| = Σ_j a_ij A_ij, where A_ij = (−1)^{i+j} M_ij is the cofactor of a_ij and M_ij is the determinant of the matrix obtained by deleting row i and column j of A.

Some properties
tr(A + B) = tr(A) + tr(B), tr(AB) = tr(BA),
|AB| = |A||B|, |A'| = |A|, and |A^{-1}| = 1/|A| when A is nonsingular.
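A quick numpy check of these trace and determinant properties on random matrices (illustrative only):

import numpy as np

rng = np.random.default_rng(3)
A = rng.normal(size=(4, 4))
B = rng.normal(size=(4, 4))

print(np.isclose(np.trace(A + B), np.trace(A) + np.trace(B)))   # tr(A+B) = tr(A) + tr(B)
print(np.isclose(np.trace(A @ B), np.trace(B @ A)))             # tr(AB) = tr(BA)
print(np.isclose(np.linalg.det(A @ B),
                 np.linalg.det(A) * np.linalg.det(B)))          # |AB| = |A||B|
print(np.isclose(np.linalg.det(A.T), np.linalg.det(A)))         # |A'| = |A|
print(np.isclose(np.linalg.det(np.linalg.inv(A)),
                 1.0 / np.linalg.det(A)))                       # |A^{-1}| = 1/|A|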
Special Types of Matrices
1. Orthogonal matrices
– A matrix P is orthogonal if P'P = PP' = I.
– In this case P^{-1} = P'.
– Also, the rows (columns) of P have length 1 and are orthogonal to each other.
Special Types of Matrices (continued)
2. Positive definite matrices
– A symmetric matrix, A, is called positive definite if x'Ax > 0 for all x ≠ 0.
– A symmetric matrix, A, is called positive semi-definite if x'Ax ≥ 0 for all x.
Theorem
A symmetric matrix A is positive definite if and only if all of its eigenvalues are positive (equivalently, if and only if all of its leading principal minors are positive).
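A small numpy sketch of the eigenvalue criterion; the matrix A = X'X below is an illustrative positive definite example:

import numpy as np

rng = np.random.default_rng(4)
X = rng.normal(size=(10, 3))
A = X.T @ X                      # symmetric and (almost surely) positive definite

eigvals = np.linalg.eigvalsh(A)  # eigenvalues of a symmetric matrix
print(np.all(eigvals > 0))       # True: all eigenvalues positive => positive definite

z = rng.normal(size=3)
print(z @ A @ z > 0)             # the quadratic form is positive for this nonzero z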
Special Types of Matrices (continued)
3. Idempotent matrices
– A symmetric matrix, E, is called idempotent if EE = E (that is, E² = E).
– Idempotent matrices project vectors onto a linear subspace, as illustrated in the sketch below.
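A standard example of such a projection is E = X(X'X)^{-1}X', which projects onto the column space of X; the numpy sketch below (with an illustrative X) checks symmetry and idempotence:

import numpy as np

rng = np.random.default_rng(5)
X = rng.normal(size=(8, 2))
E = X @ np.linalg.inv(X.T @ X) @ X.T    # projection onto the column space of X

print(np.allclose(E, E.T))              # symmetric
print(np.allclose(E @ E, E))            # idempotent: EE = E

y = rng.normal(size=8)
print(np.allclose(E @ (E @ y), E @ y))  # projecting twice changes nothing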
Eigenvectors, Eigenvalues of a matrix
Definition
Let A be an n × n matrix. If Ax = λx for some scalar λ and some nonzero vector x, then λ is called an eigenvalue of A and x is called an eigenvector of A associated with λ.
Note:
Ax = λx has a nonzero solution x if and only if (A − λI)x = 0 has a nonzero solution, i.e. if and only if det(A − λI) = 0.
det(A − λI) = polynomial of degree n in λ. Hence there are n possible eigenvalues λ1, …, λn.
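A minimal numpy example of the definition (the 2 × 2 matrix is illustrative):

import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 3.0]])

# det(A - lambda*I) = 0 gives the eigenvalues; numpy solves this directly
eigvals, eigvecs = np.linalg.eig(A)
for lam, x in zip(eigvals, eigvecs.T):
    print(np.allclose(A @ x, lam * x))   # A x = lambda x for each eigen-pair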
Theorem
If the matrix A is symmetric with distinct eigenvalues λ1, …, λn and corresponding eigenvectors x1, …, xn (assume each xi is normalized so that xi'xi = 1), then the eigenvectors are mutually orthogonal and
A = λ1 x1x1' + … + λn xnxn' = PΛP',
where P = [x1, …, xn] is an orthogonal matrix and Λ = diag(λ1, …, λn).
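A numpy sketch of this spectral decomposition for a random symmetric matrix (illustrative):

import numpy as np

rng = np.random.default_rng(6)
M = rng.normal(size=(4, 4))
A = (M + M.T) / 2                        # a symmetric matrix

lam, P = np.linalg.eigh(A)               # eigenvalues and orthonormal eigenvectors
print(np.allclose(P.T @ P, np.eye(4)))   # eigenvectors are orthonormal: P'P = I
print(np.allclose(A, P @ np.diag(lam) @ P.T))           # A = P Lambda P'
print(np.allclose(A, sum(l * np.outer(x, x)             # A = sum lambda_i x_i x_i'
                         for l, x in zip(lam, P.T))))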
The Generalized Inverse of a matrix
Definition
B (denoted by A⁻) is called the generalized inverse (Moore–Penrose inverse) of A if
1. ABA = A
2. BAB = B
3. (AB)' = AB
4. (BA)' = BA
Note: A⁻ is unique.
To see this, suppose B1 and B2 both satisfy conditions 1–4. Hence
B1 = B1AB1 = B1AB2AB1 = B1(AB2)'(AB1)' = B1B2'A'B1'A' = B1B2'A'
   = B1AB2 = B1AB2AB2 = (B1A)(B2A)B2 = (B1A)'(B2A)'B2 = A'B1'A'B2'B2
   = A'B2'B2 = (B2A)'B2 = B2AB2 = B2.

The general solution of a system of equations
If the system Ax = b is consistent, the general solution is
x = A⁻b + (I − A⁻A)z, where z is arbitrary.
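A numpy sketch checking the four Moore–Penrose conditions and the form of the general solution; numpy's pinv is used as the generalized inverse, and the test system is illustrative:

import numpy as np

rng = np.random.default_rng(7)
A = rng.normal(size=(5, 3)) @ rng.normal(size=(3, 4))   # 5 x 4 matrix of rank 3
B = np.linalg.pinv(A)                                    # Moore-Penrose inverse

print(np.allclose(A @ B @ A, A))        # 1. ABA = A
print(np.allclose(B @ A @ B, B))        # 2. BAB = B
print(np.allclose((A @ B).T, A @ B))    # 3. (AB)' = AB
print(np.allclose((B @ A).T, B @ A))    # 4. (BA)' = BA

# General solution of a consistent system Ax = b:  x = A^- b + (I - A^- A) z
b = A @ rng.normal(size=4)              # b lies in the column space of A, so consistent
z = rng.normal(size=4)                  # arbitrary vector
x = B @ b + (np.eye(4) - B @ A) @ z
print(np.allclose(A @ x, b))            # every such x solves the system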
Let C be a p×q matrix of rank k < min(p,q). Then C = AB, where A is a p×k matrix of rank k and B is a k×q matrix of rank k.
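One way to obtain such a factorization numerically is from a truncated singular value decomposition; this construction is an illustration, not necessarily the slides' method:

import numpy as np

rng = np.random.default_rng(8)
p, q, k = 6, 5, 2
C = rng.normal(size=(p, k)) @ rng.normal(size=(k, q))   # p x q matrix of rank k

U, s, Vt = np.linalg.svd(C)
A = U[:, :k] * s[:k]        # p x k, rank k
B = Vt[:k, :]               # k x q, rank k
print(np.allclose(C, A @ B))                                  # C = AB
print(np.linalg.matrix_rank(A), np.linalg.matrix_rank(B))     # k, k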
The General Linear Model
Y = Xβ + ε, where Y is an n × 1 vector of observations, X is an n × p design matrix of known values, β is a p × 1 vector of unknown parameters, and ε is an n × 1 vector of errors with ε ~ N(0, σ²I).
Geometrical interpretation of the General Linear Model
Estimation in the General Linear Model
The least squares (maximum likelihood) estimate of β minimizes the residual sum of squares S(β) = (Y − Xβ)'(Y − Xβ). Setting the derivatives ∂S/∂β = −2X'Y + 2X'Xβ equal to zero gives the Normal Equations.

The Normal Equations
X'Xβ̂ = X'Y

Solution to the normal equations
If X'X is nonsingular, β̂ = (X'X)^{-1}X'Y; otherwise β̂ = (X'X)⁻X'Y for any generalized inverse (X'X)⁻.
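A minimal numpy sketch solving the normal equations for simulated data (design matrix, parameters, and noise level are illustrative):

import numpy as np

rng = np.random.default_rng(9)
n, p = 30, 3
X = rng.normal(size=(n, p))
beta = np.array([1.0, -2.0, 0.5])
Y = X @ beta + rng.normal(scale=0.3, size=n)

# Solve the normal equations X'X beta_hat = X'Y
beta_hat = np.linalg.solve(X.T @ X, X.T @ Y)
print(beta_hat)
print(np.allclose(beta_hat, np.linalg.lstsq(X, Y, rcond=None)[0]))  # same as least squares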
Estimate of σ²
The maximum likelihood estimate is σ̂² = (Y − Xβ̂)'(Y − Xβ̂)/n; the bias-corrected estimate is s² = (Y − Xβ̂)'(Y − Xβ̂)/(n − p) = RSS/(n − p).
Properties of the Maximum Likelihood Estimates: Unbiasedness, Minimum Variance

Unbiasedness
E(β̂) = β, and s² is an unbiased estimator of σ².
Distributional Properties of the Least Squares Estimates (Maximum Likelihood estimates)
β̂ ~ N(β, σ²(X'X)^{-1}), (n − p)s²/σ² has a χ² distribution with n − p degrees of freedom, and β̂ and s² are statistically independent.
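A short simulation (illustrative settings) consistent with these properties, checking that β̂ and s² average to β and σ² over repeated samples:

import numpy as np

rng = np.random.default_rng(10)
n, p, sigma = 25, 3, 2.0
X = rng.normal(size=(n, p))
beta = np.array([1.0, 0.0, -1.0])

beta_hats, s2s = [], []
for _ in range(5000):
    Y = X @ beta + rng.normal(scale=sigma, size=n)
    bh = np.linalg.solve(X.T @ X, X.T @ Y)
    resid = Y - X @ bh
    beta_hats.append(bh)
    s2s.append(resid @ resid / (n - p))     # s^2 = RSS / (n - p)

print(np.mean(beta_hats, axis=0))   # approximately beta: E(beta_hat) = beta
print(np.mean(s2s))                 # approximately sigma^2 = 4: E(s^2) = sigma^2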
The General Linear Model and The Estimates
Theorem
The General Linear Model with an intercept
The matrix formulation (intercept included)
Then the model becomes Y = Xβ + ε, where the first column of X is a column of 1's and the first element of β is the intercept. Thus, to include an intercept, add an extra column of 1's to the design matrix X and include the intercept in the parameter vector β.
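A small numpy example (illustrative data) showing the intercept handled as an extra column of 1's:

import numpy as np

rng = np.random.default_rng(11)
n = 20
x1 = rng.normal(size=n)
x2 = rng.normal(size=n)
Y = 4.0 + 2.0 * x1 - 1.0 * x2 + rng.normal(scale=0.5, size=n)

# Add a column of 1's so the first element of beta_hat is the intercept
X = np.column_stack([np.ones(n), x1, x2])
beta_hat = np.linalg.solve(X.T @ X, X.T @ Y)
print(beta_hat)   # approximately [4, 2, -1]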
The Gauss-Markov Theorem
An important result in the theory of linear models. It proves optimality of least squares estimates in a more general setting, where the errors are not assumed to be normally distributed.
The Gauss-Markov Theorem
Assume E(Y) = Xβ and Var(Y) = σ²I (normality is not assumed). Consider the least squares estimate a'β̂ of a'β, which is an unbiased linear estimator of a'β, and let c'Y denote any other unbiased linear estimator of a'β. Then Var(a'β̂) ≤ Var(c'Y); that is, the least squares estimator is the Best Linear Unbiased Estimator (BLUE).
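A numerical sketch of the idea behind the theorem: any other linear unbiased estimator c'Y of a'β can be written as the least squares one plus a component orthogonal to the columns of X, which only adds variance. The data and vectors below are illustrative:

import numpy as np

rng = np.random.default_rng(12)
n, p, sigma2 = 15, 3, 1.0
X = rng.normal(size=(n, p))
a = np.array([1.0, 1.0, 0.0])

# Least squares estimator of a'beta is c0'Y with c0 = X (X'X)^{-1} a
c0 = X @ np.linalg.solve(X.T @ X, a)

# Any other linear unbiased estimator is c'Y with X'c = a, i.e. c = c0 + w, X'w = 0
w = rng.normal(size=n)
w -= X @ np.linalg.solve(X.T @ X, X.T @ w)    # project w onto the null space of X'
c = c0 + w
print(np.allclose(X.T @ c, a))                 # still unbiased: X'c = a

# Var(c'Y) = sigma^2 c'c >= sigma^2 c0'c0 = Var(a' beta_hat)
print(sigma2 * c0 @ c0, sigma2 * c @ c)        # the second value is larger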
Hypothesis testing for the GLM
The General Linear Hypothesis
Testing the General Linear Hypothesis
The General Linear Hypothesis
H0: h11β1 + h12β2 + h13β3 + … + h1pβp = h1
    h21β1 + h22β2 + h23β3 + … + h2pβp = h2
    ...
    hq1β1 + hq2β2 + hq3β3 + … + hqpβp = hq
where h11, h12, h13, …, hqp and h1, h2, h3, …, hq are known coefficients. In matrix notation, H0: Hβ = h, where H is the q × p matrix of coefficients (hij) and h = (h1, …, hq)'.
Testing
To test H0: Hβ = h, use the F statistic
F = (Hβ̂ − h)'[H(X'X)^{-1}H']^{-1}(Hβ̂ − h) / (q s²).
When H0 is true, F has an F distribution with q and n − p degrees of freedom; H0 is rejected if F > F_α(q, n − p).
An Alternative form of the F statistic
F = [(RSS(reduced) − RSS(complete)) / q] / [RSS(complete) / (n − p)],
where RSS(complete) is the residual sum of squares of the full model and RSS(reduced) is the residual sum of squares of the model fitted subject to the restriction Hβ = h.
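A numpy sketch (simulated data, an illustrative hypothesis H0: β3 = β4 = 0) confirming that the two forms of the F statistic agree:

import numpy as np

rng = np.random.default_rng(13)
n, p = 40, 4
X = rng.normal(size=(n, p))
beta = np.array([1.0, 0.5, 0.0, 0.0])
Y = X @ beta + rng.normal(size=n)

# Test H0: beta3 = beta4 = 0, i.e. H beta = h with
H = np.array([[0.0, 0.0, 1.0, 0.0],
              [0.0, 0.0, 0.0, 1.0]])
h = np.zeros(2)
q = H.shape[0]

XtX_inv = np.linalg.inv(X.T @ X)
beta_hat = XtX_inv @ X.T @ Y
resid = Y - X @ beta_hat
s2 = resid @ resid / (n - p)

# Form 1: quadratic form in H beta_hat - h
d = H @ beta_hat - h
F1 = d @ np.linalg.solve(H @ XtX_inv @ H.T, d) / (q * s2)

# Form 2: compare residual sums of squares of reduced and complete models
X_r = X[:, :2]                                   # model with beta3 = beta4 = 0
b_r = np.linalg.solve(X_r.T @ X_r, X_r.T @ Y)
RSS_r = np.sum((Y - X_r @ b_r) ** 2)
RSS_f = resid @ resid
F2 = ((RSS_r - RSS_f) / q) / (RSS_f / (n - p))

print(np.isclose(F1, F2))   # the two forms agree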
Confidence Intervals, Prediction Intervals, and Confidence Regions for the General Linear Model
One-at-a-time (1 − α)100% confidence interval for a'β:
a'β̂ ± t_{α/2}(n − p) s √(a'(X'X)^{-1}a).
A (1 − α)100% confidence interval for σ² is
( (n − p)s²/χ²_{α/2}(n − p) , (n − p)s²/χ²_{1−α/2}(n − p) ),
and taking square roots gives a (1 − α)100% confidence interval for σ.
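A sketch of the one-at-a-time intervals using numpy and scipy.stats (assumed available); the data and the chosen vector a are illustrative:

import numpy as np
from scipy import stats

rng = np.random.default_rng(14)
n, p, alpha = 30, 3, 0.05
X = rng.normal(size=(n, p))
beta = np.array([1.0, -1.0, 0.5])
Y = X @ beta + rng.normal(size=n)

XtX_inv = np.linalg.inv(X.T @ X)
beta_hat = XtX_inv @ X.T @ Y
resid = Y - X @ beta_hat
s2 = resid @ resid / (n - p)

# (1 - alpha)100% confidence interval for a'beta
a = np.array([1.0, 0.0, 0.0])          # here: the first coefficient
se = np.sqrt(s2 * a @ XtX_inv @ a)
t = stats.t.ppf(1 - alpha / 2, n - p)
print(a @ beta_hat - t * se, a @ beta_hat + t * se)

# (1 - alpha)100% confidence interval for sigma^2
lo = (n - p) * s2 / stats.chi2.ppf(1 - alpha / 2, n - p)
hi = (n - p) * s2 / stats.chi2.ppf(alpha / 2, n - p)
print(lo, hi)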
Multiple Confidence Intervals associated with the test
Theorem: Let H be a q × p matrix of rank q. Then the intervals
d'Hβ̂ ± √(q F_α(q, n − p)) s √(d'H(X'X)^{-1}H'd), for all q × 1 vectors d,
form a set of (1 − α)100% simultaneous confidence intervals for d'Hβ.