5 - 1 3.8 Inner Product Spaces
Euclidean n-space: R^n was defined to be the set of all ordered n-tuples of real numbers. When R^n is combined with the standard operations of vector addition, scalar multiplication, vector length, and the dot product, the resulting vector space is called Euclidean n-space. The dot product of two vectors u = (u_1, ..., u_n) and v = (v_1, ..., v_n) is defined to be u · v = u_1 v_1 + u_2 v_2 + ... + u_n v_n. The definitions of vector length and the dot product are needed to provide the metric concepts for the vector space.
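A minimal NumPy sketch of the dot product and vector length in R^n (the sample vectors are chosen for illustration):

```python
import numpy as np

# Dot product and length in Euclidean n-space R^n:
# u . v = u1*v1 + ... + un*vn,  ||u|| = sqrt(u . u)
u = np.array([1.0, 2.0, 2.0])
v = np.array([3.0, 0.0, 4.0])

dot_uv = float(np.dot(u, v))             # 1*3 + 2*0 + 2*4 = 11
length_u = float(np.sqrt(np.dot(u, u)))  # sqrt(1 + 4 + 4) = 3
```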
5 - 2 Axioms of inner product: Let u, v, and w be vectors in a vector space V, and let c be any scalar. An inner product on V is a function that associates a real number <u, v> with each pair of vectors u and v and satisfies the following axioms:
(1) <u, v> = <v, u>
(2) <u, v + w> = <u, v> + <u, w>
(3) c<u, v> = <cu, v>
(4) <v, v> >= 0, and <v, v> = 0 if and only if v = 0
5 - 3 Note: A vector space V together with an inner product is called an inner product space.
Vector space: (V, vector addition, scalar multiplication)
Inner product space: (V, vector addition, scalar multiplication, inner product)
5 - 4 Ex: (A different inner product for R^n) Show that the given function <u, v> defines an inner product on R^2, where u = (u_1, u_2) and v = (v_1, v_2).
Sol: Verify that axioms (1)-(4) hold for the function.
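The slide's formula did not survive extraction; as an assumed stand-in, the sketch below uses the common textbook choice <u, v> = u_1 v_1 + 2 u_2 v_2 on R^2 and spot-checks the four axioms numerically:

```python
import numpy as np

# Assumed weighted inner product on R^2 (the slide's formula was lost;
# <u, v> = u1*v1 + 2*u2*v2 is a standard textbook example).
def inner(u, v):
    return u[0] * v[0] + 2.0 * u[1] * v[1]

u, v, w = np.array([1.0, 2.0]), np.array([3.0, -1.0]), np.array([0.5, 4.0])
c = 2.5

sym = np.isclose(inner(u, v), inner(v, u))                    # axiom 1
add = np.isclose(inner(u, v + w), inner(u, v) + inner(u, w))  # axiom 2
hom = np.isclose(c * inner(u, v), inner(c * u, v))            # axiom 3
pos = inner(v, v) > 0                                          # axiom 4 (v != 0)
```

A spot-check on sample vectors is not a proof, but it catches a function that fails an axiom immediately.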
5 - 5 Ex: (A function that is not an inner product) Show that the given function is not an inner product on R^3.
Sol: Let v be a suitable nonzero vector; then axiom 4 is not satisfied. Thus this function is not an inner product on R^3.
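The slide's function was also lost to extraction; as an assumed stand-in, the sketch below uses a classic non-example on R^3, <u, v> = u_1 v_1 - 2 u_2 v_2 + u_3 v_3, which fails axiom 4:

```python
import numpy as np

# Assumed non-example (the slide's original function was lost):
# <u, v> = u1*v1 - 2*u2*v2 + u3*v3 on R^3.
def f(u, v):
    return u[0] * v[0] - 2.0 * u[1] * v[1] + u[2] * v[2]

v = np.array([1.0, 1.0, 1.0])  # a nonzero vector
val = f(v, v)                  # 1 - 2 + 1 = 0, yet v != 0: axiom 4 fails
```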
5 - 7 Norm (length) of u: ||u|| = sqrt(<u, u>), so ||u||^2 = <u, u>.
If A and B are two matrices, an inner product can be defined as <A, B> = Tr(A†B), where † is the transpose complex conjugate of the matrix and Tr means the trace. For a norm, there are many possibilities.
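The trace inner product on matrices can be sketched directly (sample matrices chosen for illustration; for real matrices the conjugate transpose is just the transpose):

```python
import numpy as np

# Matrix inner product <A, B> = Tr(A† B) and the induced (Frobenius) norm.
A = np.array([[1.0, 2.0], [3.0, 4.0]])
B = np.array([[0.0, 1.0], [1.0, 0.0]])

inner_AB = np.trace(A.conj().T @ B)         # 2 + 3 = 5
norm_A = np.sqrt(np.trace(A.conj().T @ A))  # sqrt(1 + 4 + 9 + 16) = sqrt(30)
```

The induced norm agrees with NumPy's default matrix norm, `np.linalg.norm(A)`, which is the Frobenius norm.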
5 - 8 There is an example in criminal law in which the distinctions between some of these norms have very practical consequences. If you are caught selling drugs in New York, there is a longer sentence if your sale is within 1000 feet of a school. If you are an attorney defending someone accused of this crime, which of the norms would you argue for? The legislators did not know linear algebra, so they did not specify which norm they intended. The prosecuting attorney argued for norm #1, "as the crow flies" (the Euclidean norm). The defense argued that "crows don't sell drugs" and humans move along city streets, so norm #2 (the pedestrian, or taxicab, norm) is more appropriate. The New York Court of Appeals decided that the Pythagorean norm (#1) is the appropriate one and rejected the pedestrian norm (#2) that the defendant advocated.
5 - 9 Distance between u and v: d(u, v) = ||u - v||.
Angle between two nonzero vectors u and v: cos(theta) = <u, v> / (||u|| ||v||), 0 <= theta <= pi.
Orthogonal: u and v are orthogonal if <u, v> = 0.
Note: If ||v|| = 1, then v is called a unit vector; for v != 0, v/||v|| is the unit vector in the direction of v.
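These metric definitions can be sketched in a few lines (the orthogonal pair u, v is chosen for illustration):

```python
import numpy as np

# Distance d(u, v) = ||u - v||, angle cos(theta) = <u, v>/(||u|| ||v||),
# orthogonality <u, v> = 0, under the Euclidean inner product.
u = np.array([1.0, 0.0])
v = np.array([0.0, 1.0])

dist = np.linalg.norm(u - v)                                    # sqrt(2)
cos_theta = np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v))
theta = np.arccos(cos_theta)                                    # pi/2: orthogonal
unit_v = v / np.linalg.norm(v)                                  # unit vector along v
```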
5 - 10 Properties of norm:
(1) ||u|| >= 0
(2) ||u|| = 0 if and only if u = 0
(3) ||cu|| = |c| ||u||
Properties of distance:
(1) d(u, v) >= 0
(2) d(u, v) = 0 if and only if u = v
(3) d(u, v) = d(v, u)
5 - 11 Ex: (Finding an inner product) Show that the given function is an inner product.
Sol:
5 - 12 Thm 3.21: Let u and v be vectors in an inner product space V.
(1) Cauchy-Schwarz inequality: |<u, v>| <= ||u|| ||v||
(2) Triangle inequality: ||u + v|| <= ||u|| + ||v||
(3) Pythagorean theorem: u and v are orthogonal if and only if ||u + v||^2 = ||u||^2 + ||v||^2
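A numerical spot-check of Thm 3.21 on sample vectors (not a proof, just a sanity check of the three statements):

```python
import numpy as np

# Check Cauchy-Schwarz and the triangle inequality for one sample pair.
u = np.array([1.0, 2.0, -1.0])
v = np.array([3.0, 0.0, 2.0])

cs = abs(np.dot(u, v)) <= np.linalg.norm(u) * np.linalg.norm(v)       # (1)
tri = np.linalg.norm(u + v) <= np.linalg.norm(u) + np.linalg.norm(v)  # (2)

# Pythagorean theorem for an orthogonal pair:
a, b = np.array([1.0, 0.0]), np.array([0.0, 3.0])
pyth = np.isclose(np.linalg.norm(a + b) ** 2,
                  np.linalg.norm(a) ** 2 + np.linalg.norm(b) ** 2)    # (3)
```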
5 - 13 Orthogonal projections in inner product spaces: Let u and v be two vectors in an inner product space V, such that v != 0. Then the orthogonal projection of u onto v is given by
proj_v u = (<u, v> / <v, v>) v.
Note: We can solve for this coefficient by writing proj_v u = cv and noting that u - cv must be orthogonal to v. The inner product <u - cv, v> is therefore zero, giving c = <u, v> / <v, v>.
5 - 14 Ex: (Finding an orthogonal projection in R^3) Use the Euclidean inner product in R^3 to find the orthogonal projection of u = (6, 2, 4) onto v = (1, 2, 0).
Sol: <u, v> = 6 + 4 + 0 = 10 and <v, v> = 1 + 4 + 0 = 5, so
proj_v u = (10/5)(1, 2, 0) = (2, 4, 0).
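The same computation in NumPy, using the example's vectors, also confirms that the residual u - proj_v u is orthogonal to v:

```python
import numpy as np

# Orthogonal projection proj_v u = (<u, v>/<v, v>) v for the slide's example.
u = np.array([6.0, 2.0, 4.0])
v = np.array([1.0, 2.0, 0.0])

proj = (np.dot(u, v) / np.dot(v, v)) * v  # (10/5) * (1, 2, 0) = (2, 4, 0)
residual_dot = np.dot(u - proj, v)        # 0: the residual is orthogonal to v
```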
5 - 15 Thm 3.22: (Orthogonal projection and distance) Let v and s be two vectors in an inner product space V, such that s != 0. Then for every scalar c with c != <v, s> / <s, s>,
d(v, proj_s v) < d(v, cs).
5 - 16 3.9 Orthonormal Bases: Gram-Schmidt Process
Orthogonal set: A set S of vectors in an inner product space V is called an orthogonal set if every pair of distinct vectors in the set is orthogonal.
Orthonormal set: An orthogonal set in which each vector is a unit vector is called orthonormal.
5 - 17 The standard basis is orthonormal. Ex: (An orthonormal basis) Show that the given set B, with the given inner product, is an orthonormal basis.
Sol: Each vector has norm 1 and each pair of distinct vectors is orthogonal. Thus, B is an orthonormal basis.
5 - 18 Thm 3.23: (Orthogonal sets are linearly independent) If S = {v_1, v_2, ..., v_n} is an orthogonal set of nonzero vectors in an inner product space V, then S is linearly independent.
Pf: Suppose c_1 v_1 + c_2 v_2 + ... + c_n v_n = 0. Taking the inner product of both sides with v_i and using orthogonality gives c_i <v_i, v_i> = 0; since v_i != 0 we have <v_i, v_i> > 0, so every c_i = 0.
5 - 19 Ex: (Using orthogonality to test for a basis) Show that the given set is a basis.
Sol: The vectors are nonzero and mutually orthogonal, so by Thm 3.23 the set is linearly independent; a linearly independent set whose size equals the dimension of the space is a basis.
5 - 20 Thm 3.24: (Coordinates relative to an orthonormal basis) If B = {v_1, v_2, ..., v_n} is an orthonormal basis for an inner product space V, then the coordinate representation of a vector w with respect to B is
w = <w, v_1> v_1 + <w, v_2> v_2 + ... + <w, v_n> v_n.
Pf: B is a basis for V, so w has a unique representation w = c_1 v_1 + ... + c_n v_n. Since B is orthonormal, <w, v_i> = c_i <v_i, v_i> = c_i.
5 - 22 Ex: (Representing vectors relative to an orthonormal basis) Find the coordinates of w = (5, -5, 2) relative to the given orthonormal basis for R^3.
Sol: By Thm 3.24, each coordinate is <w, v_i> for the corresponding basis vector v_i.
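The slide's basis was lost; the sketch below uses w = (5, -5, 2) with an assumed orthonormal basis B of R^3 (chosen for illustration) and recovers w from its coordinates:

```python
import numpy as np

# Coordinates relative to an orthonormal basis: c_i = <w, v_i>.
# B is an assumed orthonormal basis (the slide's basis did not survive).
B = [np.array([3/5, 4/5, 0.0]),
     np.array([-4/5, 3/5, 0.0]),
     np.array([0.0, 0.0, 1.0])]
w = np.array([5.0, -5.0, 2.0])

coords = np.array([np.dot(w, v) for v in B])           # <w, v_i> for each v_i
reconstructed = sum(c * v for c, v in zip(coords, B))  # equals w
```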
5 - 23 Gram-Schmidt orthonormalization process: Let B = {u_1, u_2, ..., u_n} be a basis for an inner product space V. Define
v_1 = u_1
v_2 = u_2 - (<u_2, v_1>/<v_1, v_1>) v_1
v_3 = u_3 - (<u_3, v_1>/<v_1, v_1>) v_1 - (<u_3, v_2>/<v_2, v_2>) v_2
...
Then B' = {v_1, ..., v_n} is an orthogonal basis, and {v_1/||v_1||, ..., v_n/||v_n||} is an orthonormal basis.
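The recurrence above transcribes directly into code: subtract from each u_i its projections onto the previously accepted v_j, then normalize at the end (the sample basis is chosen for illustration):

```python
import numpy as np

# Gram-Schmidt process under the Euclidean inner product.
def gram_schmidt(vectors):
    ortho = []
    for u in vectors:
        v = u.astype(float).copy()
        for w in ortho:
            v -= (np.dot(u, w) / np.dot(w, w)) * w  # remove component along w
        ortho.append(v)
    return [v / np.linalg.norm(v) for v in ortho]   # normalize

basis = [np.array([1.0, 1.0, 0.0]),
         np.array([1.0, 2.0, 0.0]),
         np.array([0.0, 1.0, 2.0])]
Q = gram_schmidt(basis)  # orthonormal basis for the same span
```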
5 - 24 Ex: (Applying the Gram-Schmidt orthonormalization process) Apply the Gram-Schmidt process to the given basis.
Sol:
5 - 25 Orthogonal basis; normalizing each vector gives the orthonormal basis.
5 - 26 Ex: Find an orthonormal basis for the solution space of the given homogeneous system of linear equations.
Sol:
5 - 27 Thus one basis for the solution space is found; applying the Gram-Schmidt process to it gives an orthogonal basis, and normalizing gives an orthonormal basis.
5 - 28 3.10 Mathematical Models and Least-Squares Analysis
Let W be a subspace of an inner product space V.
(a) A vector u in V is said to be orthogonal to W if u is orthogonal to every vector in W.
(b) The set of all vectors in V that are orthogonal to W is called the orthogonal complement of W, written W⊥ (read "W perp").
5 - 29 Direct sum: Let W_1 and W_2 be two subspaces of R^n. If each vector x in R^n can be uniquely written as a sum of a vector w_1 from W_1 and a vector w_2 from W_2, x = w_1 + w_2, then R^n is the direct sum of W_1 and W_2, and you can write R^n = W_1 ⊕ W_2.
Thm 3.25: (Properties of orthogonal subspaces) Let W be a subspace of R^n. Then the following properties are true.
(1) dim(W) + dim(W⊥) = n
(2) R^n = W ⊕ W⊥
(3) (W⊥)⊥ = W
5 - 30 Thm 3.26: (Projection onto a subspace) If {u_1, u_2, ..., u_t} is an orthonormal basis for the subspace W of V, and v is a vector in V, then
proj_W v = <v, u_1> u_1 + <v, u_2> u_2 + ... + <v, u_t> u_t.
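Thm 3.26 in code, with W taken (for illustration) to be the xy-plane in R^3, whose standard orthonormal basis makes the projection easy to check by eye:

```python
import numpy as np

# proj_W v = sum of <v, u_i> u_i over an orthonormal basis {u_i} of W.
U = [np.array([1.0, 0.0, 0.0]),
     np.array([0.0, 1.0, 0.0])]   # orthonormal basis of the xy-plane
v = np.array([3.0, -2.0, 7.0])

proj = sum(np.dot(v, u) * u for u in U)  # (3, -2, 0)
```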
5 - 31 Ex: (Projection onto a subspace) Find the projection of the vector v onto the subspace W.
Sol: First find an orthogonal basis for W, normalize it to obtain an orthonormal basis for W, then apply Thm 3.26.
5 - 32 Fitting by least-squares: Scientists are often presented with a system that has no solution, and they must find an answer that is as close as possible to being an answer. Suppose that we have a coin to use in flipping and this coin has some proportion m of heads to total flips. Because of randomness, we do not find the exact proportion with this sample: the vector of experimental data {16, 34, 51} is not in the subspace of solutions.
5 - 33 However, we want to find the m that most nearly works. An orthogonal projection of the data vector onto the line subspace gives our best guess. The estimate (m = 7110/12600 ≈ 0.56) is a bit higher than 1/2 but not by much, so the penny is probably fair enough. The line with slope m = 0.56 is called the line of best fit for this data. Minimizing the distance between the given vector and the vector used as the left-hand side minimizes the total of the vertical distances to the line. We say that the line has been obtained through fitting by least-squares.
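The slope is the projection coefficient from earlier: m = <heads, flips> / <flips, flips>. The heads counts (16, 34, 51) are given on the slide; the flip totals (30, 60, 90) below are inferred from the stated estimate m = 7110/12600 and should be read as an assumption:

```python
import numpy as np

# Least-squares slope through the origin for heads ~ m * flips.
flips = np.array([30.0, 60.0, 90.0])   # inferred from m = 7110/12600
heads = np.array([16.0, 34.0, 51.0])   # data from the slide

# Projecting the data vector onto the line spanned by `flips`:
m = np.dot(heads, flips) / np.dot(flips, flips)
```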
5 - 34 The different denominations of U.S. money have different average times in circulation. The resulting linear system of equations has no solution, but we can use orthogonal projection to find a best approximation.
5 - 35 The method of projection onto a subspace says that the coefficients b and m that make the linear combination of the columns of A as close as possible to the data vector are the entries of the least-squares solution. Some calculation gives an intercept of b = 1.05 and a slope of m = 0.18.
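A sketch of this fit via the normal equations. The slide's data table was lost; the denominations and circulation times below are assumed values consistent with the stated results b = 1.05 and m = 0.18:

```python
import numpy as np

# Least-squares line: time = b + m * denomination.
denom = np.array([1.0, 5.0, 10.0, 20.0, 50.0, 100.0])  # assumed data
years = np.array([1.5, 2.0, 3.0, 5.0, 9.0, 20.0])      # assumed data

A = np.column_stack([np.ones_like(denom), denom])  # columns: intercept, slope
x, *_ = np.linalg.lstsq(A, years, rcond=None)      # minimizes ||Ax - years||
b, m = x
```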
5 - 36 Thm 3.27: (Orthogonal projection and distance) Let S be a subspace of an inner product space V, and let v be a vector in V. Then for all u in S with u != proj_S v,
||v - proj_S v|| < ||v - u||
(proj_S v is the best approximation to v from S).
5 - 37 Pf: Write v - u = (v - proj_S v) + (proj_S v - u). The first term is orthogonal to every vector in S, and the second term lies in S, so by the Pythagorean theorem
||v - u||^2 = ||v - proj_S v||^2 + ||proj_S v - u||^2 > ||v - proj_S v||^2.
5 - 38 Fundamental subspaces of a matrix
For a given m×n matrix A: the spaces NS(A) and RS(A) are orthogonal complements of each other within R^n. This means that any vector from NS(A) is orthogonal to any vector from RS(A) = CS(A^T), and the vectors in these two spaces span R^n, i.e., R^n = NS(A) ⊕ RS(A).
5 - 39 Thm 3.28: If A is an m×n matrix, then
(1) (RS(A))⊥ = NS(A)
(2) (CS(A))⊥ = NS(A^T)
(3) RS(A) ⊕ NS(A) = R^n
(4) CS(A) ⊕ NS(A^T) = R^m
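These orthogonality relations can be checked numerically with an SVD-based null-space routine (the sample rank-1 matrix is chosen for illustration):

```python
import numpy as np

# Check RS(A) ⟂ NS(A) and CS(A) ⟂ NS(A^T) for a sample matrix.
def null_space(M, tol=1e-10):
    _, s, vt = np.linalg.svd(M)
    rank = int(np.sum(s > tol))
    return vt[rank:].T  # columns span NS(M)

A = np.array([[1.0, 2.0, 3.0],
              [2.0, 4.0, 6.0]])  # rank 1

N = null_space(A)     # NS(A): a 2-dimensional subspace of R^3
row_perp = A @ N      # rows of A (spanning RS(A)) dotted with NS(A) -> 0
Nt = null_space(A.T)  # NS(A^T) in R^2
col_perp = A.T @ Nt   # columns of A (spanning CS(A)) dotted with NS(A^T) -> 0
```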
5 - 40 Ex: (Fundamental subspaces) Find the four fundamental subspaces of the given matrix.
Sol: Reduce the matrix to reduced row-echelon form.
5 - 41 Check:
5 - 42 Ex: Let W be a subspace of R^4.
(a) Find a basis for W.
(b) Find a basis for the orthogonal complement of W.
Sol: (reduce to reduced row-echelon form)
5 - 43 The resulting set is a basis for W. Notes:
5 - 44 Least-squares problem: Ax = b (a system of linear equations)
(1) When the system is consistent, we can use Gaussian elimination with back-substitution to solve for x.
(2) When the system is inconsistent, we look for the "best possible" solution of the system, that is, the value of x for which the difference between Ax and b is smallest.
5 - 45 Least-squares solution: Given a system Ax = b of m linear equations in n unknowns, the least-squares problem is to find a vector x in R^n that minimizes ||Ax - b|| with respect to the Euclidean inner product on R^n. Such a vector is called a least-squares solution of Ax = b.
5 - 46 A^T A x = A^T b (this is the normal system associated with Ax = b; a least-squares solution of Ax = b is a solution of this normal system).
5 - 47 Note: The problem of finding the least-squares solution of Ax = b is equal to the problem of finding an exact solution of the associated normal system A^T A x = A^T b.
Thm: If A is an m×n matrix with linearly independent column vectors, then for every m×1 matrix b, the linear system Ax = b has a unique least-squares solution. This solution is given by
x = (A^T A)^(-1) A^T b.
Moreover, if W is the column space of A, then the orthogonal projection of b on W is
proj_W b = Ax = A (A^T A)^(-1) A^T b.
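A sketch of the theorem on a small overdetermined system (chosen for illustration): solve the normal system, then form the projection Ax and confirm the residual is orthogonal to the columns of A:

```python
import numpy as np

# Unique least-squares solution when A has linearly independent columns.
A = np.array([[1.0, 0.0],
              [1.0, 1.0],
              [1.0, 2.0]])
b = np.array([6.0, 0.0, 0.0])

x = np.linalg.solve(A.T @ A, A.T @ b)  # solve the normal system A^T A x = A^T b
proj_b = A @ x                         # orthogonal projection of b on CS(A)
residual = b - proj_b                  # orthogonal to every column of A
```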
5 - 48 Ex: (Solving the normal equations) Find the least-squares solution of the given system and find the orthogonal projection of b on the column space of A.
5 - 49 Sol: the associated normal system
5 - 50 This gives the least-squares solution of Ax = b and the orthogonal projection of b on the column space of A.