# Statistical Analysis, Professor Lynne Stokes, Department of Statistical Science. Lecture 5QF: Introduction to Vector and Matrix Operations Needed for the Theory of Quadratic Forms


1 Statistical Analysis Professor Lynne Stokes Department of Statistical Science Lecture 5QF Introduction to Vector and Matrix Operations Needed for the Theory of Quadratic Forms

2 Linear Statistical Models. Regression model: y_i = β0 + β1 x_i1 + … + βk x_ik + ε_i. One-factor (fixed effects) model: y_ij = μ + α_i + ε_ij. Both are special cases of the general linear model, with common matrix form y = Xβ + ε. Regression: X has full column rank. GLM: X has less than full column rank.
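As a minimal NumPy sketch (NumPy is not part of the original slides, and the small data values here are illustrative), the two design matrices below show the rank distinction: the regression matrix has full column rank, while the one-factor matrix with an overall-mean column does not.

```python
import numpy as np

# Regression design matrix: intercept column plus one regressor.
x = np.array([1.0, 2.0, 3.0, 4.0])
X_reg = np.column_stack([np.ones(4), x])          # shape (4, 2)

# One-factor (fixed effects) design matrix, 2 levels with 2 replicates each:
# overall-mean column plus one indicator column per level. The mean column
# equals the sum of the two indicator columns, so the columns are dependent.
X_glm = np.array([[1, 1, 0],
                  [1, 1, 0],
                  [1, 0, 1],
                  [1, 0, 1]], dtype=float)        # shape (4, 3)

print(np.linalg.matrix_rank(X_reg))   # 2: full column rank
print(np.linalg.matrix_rank(X_glm))   # 2: less than full column rank (3 columns)
```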

3 Notation. Response vector: y. Design/regressor matrix: X. Error vector: ε. General matrices: A, B, …

4 Matrix Rank. Linear independence: a set of vectors is linearly independent if none of the vectors can be expressed as a linear combination of the others. Rank of a matrix: the maximum number of linearly independent columns (row rank = column rank). Note: a square matrix with a nonzero determinant is full rank, or nonsingular.
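A short NumPy check of these definitions (the matrices are illustrative): one matrix with a dependent column, and the full-rank identity for contrast.

```python
import numpy as np

# Third column is col1 + col2, so the columns are linearly dependent.
A = np.array([[1., 0., 1.],
              [0., 1., 1.],
              [2., 3., 5.]])
print(np.linalg.matrix_rank(A))   # 2: not full rank
print(np.linalg.det(A))           # ~0: singular

B = np.eye(3)                     # nonzero determinant -> full rank, nonsingular
print(np.linalg.matrix_rank(B))   # 3
```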

5 Special Matrices. Diagonal matrix: square, with all off-diagonal elements zero, D = diag(d1, …, dn). Identity matrix: the diagonal matrix with every diagonal element equal to 1, denoted I.

6 Special Matrices. Matrix of ones: every element equal to 1. Null matrix: every element equal to 0 (any dimensions).

7 Matrix Operations. Addition: A + B requires that A and B have the same dimensions; elements add componentwise. Matrix multiplication: AB requires that A have the same number of columns as B has rows; A (n x s) times B (s x k) gives an n x k matrix. Vector multiplication: the inner product a'b of two n x 1 vectors is the scalar Σ a_i b_i.
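These dimension rules can be sketched in NumPy (the numeric values are illustrative):

```python
import numpy as np

A = np.array([[1., 2., 3.],
              [4., 5., 6.]])     # 2 x 3
B = np.array([[1., 0.],
              [0., 1.],
              [1., 1.]])         # 3 x 2

# Addition requires identical dimensions: A + A works, A + B would raise an error.
S = A + A                        # 2 x 3

# Multiplication: A (2 x 3) times B (3 x 2) gives a 2 x 2 matrix.
C = A @ B
print(C)         # [[ 4.  5.], [10. 11.]]

# Vector (inner) product: a scalar.
a = np.array([1., 2., 3.])
b = np.array([4., 5., 6.])
print(a @ b)     # 1*4 + 2*5 + 3*6 = 32
```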

8 Matrix Operations. Transpose: A', obtained by interchanging rows and columns. Inverse: matrix A has an inverse, denoted A⁻¹, if and only if (a) A is a square (n x n) matrix and (b) A is of full (row, column) rank. Then AA⁻¹ = A⁻¹A = I. A matrix inverse is unique. Symmetric matrix: A (n x n) with A = A', i.e., a_ij = a_ji.
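A quick NumPy verification of these properties, using an illustrative 2 x 2 symmetric matrix:

```python
import numpy as np

A = np.array([[2., 1.],
              [1., 3.]])                    # square, det = 5 != 0 -> invertible

A_inv = np.linalg.inv(A)
print(np.allclose(A @ A_inv, np.eye(2)))    # True: A A^-1 = I
print(np.allclose(A_inv @ A, np.eye(2)))    # True: A^-1 A = I

# A is symmetric: A = A', i.e. a_ij = a_ji.
print(np.array_equal(A, A.T))               # True
```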

9 Special Vector and Matrix Properties. Orthogonal vectors: a'b = 0. Normalized vector: a'a = 1. Orthonormal matrix: A'A = I; then A⁻¹ = A'. Symmetric idempotent matrix: A = A' and AA = A. The only full-rank symmetric idempotent matrix is I. Note: a matrix all of whose columns are mutually orthogonal is called an orthogonal matrix; often "orthogonal" is used in place of "orthonormal."
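A NumPy sketch of both properties (the rotation matrix and the small design matrix are illustrative choices, not from the slides). The projection "hat" matrix from regression is the standard example of a symmetric idempotent matrix that is not full rank:

```python
import numpy as np

# Orthonormal matrix: columns are mutually orthogonal with unit length,
# so Q'Q = I and therefore Q^-1 = Q'.
theta = 0.3
Q = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])
print(np.allclose(Q.T @ Q, np.eye(2)))       # True
print(np.allclose(np.linalg.inv(Q), Q.T))    # True: Q^-1 = Q'

# Symmetric idempotent matrix: projection onto the column space of X.
X = np.column_stack([np.ones(4), np.arange(4.)])
H = X @ np.linalg.inv(X.T @ X) @ X.T
print(np.allclose(H, H.T))                   # symmetric: H = H'
print(np.allclose(H @ H, H))                 # idempotent: HH = H
print(np.linalg.matrix_rank(H))              # 2: less than full rank (4)
```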

10 Eigenvalues and Eigenvectors. If A is square (n x n) and symmetric, all eigenvalues and eigenvectors are real-valued. Eigenvalues: λ1, λ2, …, λn (solve the nth-degree polynomial equation det(A − λI) = 0 in λ). Eigenvectors: v1, v2, …, vn, satisfying A v_i = λ_i v_i. Note: if all eigenvalues are distinct, the eigenvectors are mutually orthogonal and can, without loss of generality, be normalized. If some eigenvalues have multiplicities greater than 1, the corresponding eigenvectors can be chosen to be orthogonal. Eigenvectors are unique up to a multiple of −1.
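A NumPy illustration with a small symmetric matrix (`np.linalg.eigh` is the routine for symmetric matrices and returns real eigenvalues with already-orthonormal eigenvectors):

```python
import numpy as np

A = np.array([[2., 1.],
              [1., 2.]])
vals, vecs = np.linalg.eigh(A)      # eigenvalues in ascending order
print(vals)                          # [1. 3.]
for i in range(2):
    # Each column of vecs satisfies A v = lambda v.
    print(np.allclose(A @ vecs[:, i], vals[i] * vecs[:, i]))  # True
print(np.allclose(vecs.T @ vecs, np.eye(2)))  # True: eigenvectors orthonormal
```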

11 Eigenvalues and Rank. The rank of a symmetric matrix equals the number of nonzero eigenvalues. All the eigenvalues of an idempotent matrix are 0 or 1; its rank equals the number of eigenvalues that are 1, and the sum of its diagonal elements (its trace) equals its rank. A diagonal matrix has eigenvalues equal to its diagonal elements. The identity matrix has all eigenvalues equal to 1, and any set of mutually orthonormal vectors can be used as its eigenvectors.
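These facts can be checked on the centering matrix C = I − (1/n)J (an illustrative choice of symmetric idempotent matrix, using J for the matrix of ones):

```python
import numpy as np

n = 4
J = np.ones((n, n))                 # matrix of ones
C = np.eye(n) - J / n               # centering matrix: symmetric and idempotent

vals = np.linalg.eigvalsh(C)        # eigenvalues of a symmetric matrix
print(np.allclose(np.sort(vals), [0., 1., 1., 1.]))  # all eigenvalues 0 or 1
print(np.linalg.matrix_rank(C))                      # 3: number of unit eigenvalues
print(round(np.trace(C)))                            # 3: trace equals rank
```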

12 Quadratic Forms. A quadratic form in x is x'Ax, where A can always be assumed to be symmetric: for any B, x'Bx = x'Ax with a_ij = (b_ij + b_ji)/2.
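The symmetrization identity can be verified numerically (the random matrix and vector are illustrative):

```python
import numpy as np

rng = np.random.default_rng(0)
B = rng.standard_normal((3, 3))      # arbitrary, generally not symmetric
A = (B + B.T) / 2                    # a_ij = (b_ij + b_ji) / 2, symmetric

x = rng.standard_normal(3)
print(np.allclose(x @ B @ x, x @ A @ x))   # True: same quadratic form
print(np.allclose(A, A.T))                 # True: A is symmetric
```

The antisymmetric part (B − B')/2 contributes zero to x'Bx, which is why only the symmetric part matters.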

13 Assignment 3
1. Determine the rank of each of these matrices.
2. For each full-rank matrix, find its inverse.
3. Determine whether any of these matrices are orthogonal.
4. Determine whether any of these matrices are idempotent.
5. Find the eigenvalues and eigenvectors of A and B.
