HOMOGENEOUS LINEAR SYSTEMS (A different focus) Until now we have looked at the equation $A\mathbf{x} = \mathbf{0}$ with the sole aim of computing its solutions, and we have been quite successful at it: we can describe precisely what we have called its solution set. We shift our focus now, away from the solution set, and concentrate instead on the column vectors $\mathbf{a}_1, \mathbf{a}_2, \ldots, \mathbf{a}_n$ of the matrix $A$. Let's name a few things and define some words.
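As a note of ours (the standard column expansion of the matrix-vector product), the homogeneous equation can be rewritten directly in terms of the columns of $A$, which is exactly the identity behind this change of focus:

```latex
% Column view: Ax is a linear combination of the columns of A,
% weighted by the entries of x, so Ax = 0 is a statement about them.
A\mathbf{x} \;=\; x_1\mathbf{a}_1 + x_2\mathbf{a}_2 + \cdots + x_n\mathbf{a}_n \;=\; \mathbf{0}
```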

We will call the solution $\mathbf{x} = \mathbf{0}$ the trivial solution. Obviously the fact that $A\mathbf{x} = \mathbf{0}$ has a non-trivial solution says something about the vectors $\mathbf{a}_1, \ldots, \mathbf{a}_n$. Not having a word for that property yet, we give it a name, namely the following Definition. We say the set of vectors $\{\mathbf{v}_1, \ldots, \mathbf{v}_p\}$ is linearly dependent if the equation $x_1\mathbf{v}_1 + \cdots + x_p\mathbf{v}_p = \mathbf{0}$ has a non-trivial solution. Otherwise we say the set is linearly independent.

In order to follow the textbook, let's turn the definition around and change the wording somewhat. Definition. We say the set of vectors $\{\mathbf{v}_1, \ldots, \mathbf{v}_p\}$ is linearly independent if the vector equation $x_1\mathbf{v}_1 + x_2\mathbf{v}_2 + \cdots + x_p\mathbf{v}_p = \mathbf{0}$ has only the trivial solution $x_1 = x_2 = \cdots = x_p = 0$. We say that the set is linearly dependent if there are weights $c_1, \ldots, c_p$, not all zero, such that $c_1\mathbf{v}_1 + c_2\mathbf{v}_2 + \cdots + c_p\mathbf{v}_p = \mathbf{0}$.
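A quick sanity check of the definition with pairs in $\mathbb{R}^2$ (our own example, not from the slides):

```latex
% Dependent pair: the second vector is twice the first, so the
% weights 2 and -1 give a non-trivial solution.
2\begin{pmatrix}1\\2\end{pmatrix} - \begin{pmatrix}2\\4\end{pmatrix} = \begin{pmatrix}0\\0\end{pmatrix}
% Independent pair: the equation below forces x_1 = x_2 = 0,
% so only the trivial solution exists.
x_1\begin{pmatrix}1\\0\end{pmatrix} + x_2\begin{pmatrix}0\\1\end{pmatrix} = \begin{pmatrix}0\\0\end{pmatrix}
```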

A few remarks are in order. 1. Note how our old friends, the solutions $\mathbf{x}$ of $A\mathbf{x} = \mathbf{0}$, have become weights on the vectors. 2. By abuse of notation, when the context is clear we will say that the vectors are linearly dependent/independent rather than the set is linearly dependent/independent. 3. Let $A$ be the matrix with column vectors $\mathbf{v}_1, \ldots, \mathbf{v}_p$.

Then the two statements A. The set $\{\mathbf{v}_1, \ldots, \mathbf{v}_p\}$ is linearly independent. B. The equation $A\mathbf{x} = \mathbf{0}$ has only the trivial solution. are equivalent. (See the statement on p. 67.) This means that, as far as verifying the linear dependence/independence of a set of vectors goes, we are back to our old row-reduction algorithm. Let's do an example. Are the three vectors

linearly independent? We row reduce the augmented matrix $[\mathbf{v}_1\ \mathbf{v}_2\ \mathbf{v}_3 \mid \mathbf{0}]$ and read off the answer from the pivots: the vectors are linearly independent exactly when every column of the coefficient matrix has a pivot, since then no variable is free and the only solution is the trivial one.
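The particular vectors from the slide did not survive the transcript, so here is a minimal sketch of the same pivot test in SymPy; the vectors v1, v2, v3 below are stand-ins chosen for illustration, not the slide's.

```python
# Minimal sketch: decide linear independence by row reduction.
# The three vectors below are stand-ins, NOT the ones from the slide.
import sympy as sp

v1 = sp.Matrix([1, 2, 3])
v2 = sp.Matrix([0, 1, 4])
v3 = sp.Matrix([5, 6, 0])

# Build the matrix whose columns are the vectors; Ax = 0 has only the
# trivial solution exactly when every column contains a pivot.
A = sp.Matrix.hstack(v1, v2, v3)
rref, pivot_cols = A.rref()

independent = len(pivot_cols) == A.cols
print(rref)
print("linearly independent" if independent else "linearly dependent")
```

Reducing $A$ alone suffices here, because row operations never change the zero column of the augmented matrix $[A \mid \mathbf{0}]$.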

Here are some interesting facts about linear dependence. 1. If the zero vector is part of the set of vectors, then the set is linearly dependent. Proof: the set looks like $\{\mathbf{v}_1, \ldots, \mathbf{0}, \ldots, \mathbf{v}_p\}$. Can you fill the blanks of $\underline{\ \ }\,\mathbf{v}_1 + \cdots + \underline{\ \ }\,\mathbf{0} + \cdots + \underline{\ \ }\,\mathbf{v}_p = \mathbf{0}$ with at least one non-zero weight? How about all zeroes, but a $1$ in front of the zero vector? In a formal proof I would write $0\,\mathbf{v}_1 + \cdots + 1\cdot\mathbf{0} + \cdots + 0\,\mathbf{v}_p = \mathbf{0}$.

2. Two non-zero vectors are linearly dependent if and only if one is a multiple of the other. This is so trivial I am leaving out the proof. 3. If the set contains more vectors than each vector has entries (that is, $p > n$ for vectors in $\mathbb{R}^n$), then the set is linearly dependent. Proof. Think of the matrix $A = [\mathbf{v}_1\ \cdots\ \mathbf{v}_p]$. It has more columns than rows, therefore row reduction leaves at least one free variable, and so the system $A\mathbf{x} = \mathbf{0}$ has a non-trivial solution.
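For instance (our example), any three vectors in $\mathbb{R}^2$ are dependent, since $p = 3 > n = 2$; here the relation is easy to spot:

```latex
% Three vectors in R^2 must be dependent; here the third is the sum
% of the first two, giving the non-trivial relation below.
\begin{pmatrix}1\\0\end{pmatrix} + \begin{pmatrix}0\\1\end{pmatrix} - \begin{pmatrix}1\\1\end{pmatrix} = \begin{pmatrix}0\\0\end{pmatrix}
```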

4. (Characterization of linear dependence) The set $\{\mathbf{v}_1, \ldots, \mathbf{v}_p\}$ of two or more vectors is linearly dependent if and only if at least one of the vectors is a linear combination of the others. This is an important theorem (Theorem 7, p. 58). We will show the proof on the board (and add it to the presentation later).
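Until the board proof makes it into the presentation, here is a sketch of the standard argument:

```latex
% (=>) If c_1 v_1 + ... + c_p v_p = 0 with some c_j != 0, divide by
% c_j and solve for v_j as a combination of the others:
\mathbf{v}_j = -\tfrac{c_1}{c_j}\mathbf{v}_1 - \cdots - \tfrac{c_{j-1}}{c_j}\mathbf{v}_{j-1} - \tfrac{c_{j+1}}{c_j}\mathbf{v}_{j+1} - \cdots - \tfrac{c_p}{c_j}\mathbf{v}_p
% (<=) If v_j = d_1 v_1 + ... + d_p v_p (sum omitting v_j), move v_j
% across; the weight -1 on v_j makes the relation non-trivial:
d_1\mathbf{v}_1 + \cdots + (-1)\,\mathbf{v}_j + \cdots + d_p\mathbf{v}_p = \mathbf{0}
```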