MAT 4725 Numerical Analysis Section 8.2 Orthogonal Polynomials and Least Squares Approximations (Part II)

Preview Inner Product Spaces Gram-Schmidt Process

A Different Technique for Least Squares Approximation Computationally efficient: once P_n(x) is known, it is easy to determine P_{n+1}(x).

Recall (Linear Algebra) General Inner Product Spaces

Inner Product

Example 0 Let f, g ∈ C[a,b]. Show that ⟨f, g⟩ = ∫_a^b f(x)g(x) dx is an inner product on C[a,b].
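As a numerical sanity check of this inner product, the integral can be approximated by quadrature. A minimal sketch (the function name `inner` is illustrative, not from the slides), using the composite trapezoid rule:

```python
import numpy as np

def inner(f, g, a, b, n=2000):
    # Approximate <f, g> = integral of f(x) g(x) over [a, b]
    # with the composite trapezoid rule on n subintervals.
    x = np.linspace(a, b, n + 1)
    w = f(x) * g(x)
    h = (b - a) / n
    return h * (w.sum() - 0.5 * (w[0] + w[-1]))

# sin and cos are orthogonal on [-pi, pi], while <sin, sin> = pi there.
print(inner(np.sin, np.cos, -np.pi, np.pi))  # ~0
print(inner(np.sin, np.sin, -np.pi, np.pi))  # ~pi
```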

Norm, Distance,…

Orthonormal Bases A basis S for an inner product space V is orthonormal if 1. For distinct u, v ∈ S, ⟨u, v⟩ = 0. 2. For each u ∈ S, u is a unit vector (⟨u, u⟩ = 1).

Gram-Schmidt Process

The component of v_2 that is “parallel” to w_1 is removed to obtain w_2, so w_1 is “perpendicular” to w_2.
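The same subtract-the-parallel-component idea, written out for vectors in R^n (a minimal sketch; the helper name `gram_schmidt` is illustrative):

```python
import numpy as np

def gram_schmidt(vectors):
    # Build an orthonormal basis one vector at a time: from each v,
    # subtract its components parallel to the basis found so far,
    # then normalize what remains.
    basis = []
    for v in vectors:
        w = np.array(v, dtype=float)
        for q in basis:
            w -= np.dot(q, v) * q  # remove the part of v parallel to q
        basis.append(w / np.linalg.norm(w))
    return basis

q1, q2 = gram_schmidt([[1.0, 1.0], [1.0, 0.0]])
```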

Simple Example

Specific Inner Product Space

Definition 8.1

Theorem 8.2 Idea

Definition

Theorem 8.3

Example 1

Definition (Skip it for the rest)

Weight Functions Assign varying degrees of importance to certain portions of the interval.
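The slides do not spell out a particular weight; a standard example is the Chebyshev weight w(x) = 1/sqrt(1 - x^2), which emphasizes the endpoints of [-1, 1]. A sketch of the corresponding weighted inner product (substituting x = cos t to avoid the endpoint singularity; the name `cheb_inner` is illustrative):

```python
import numpy as np

def cheb_inner(f, g, n=4000):
    # <f, g>_w = integral over [-1, 1] of f(x) g(x) / sqrt(1 - x^2) dx.
    # Substituting x = cos(t) turns this into the integral over [0, pi]
    # of f(cos t) g(cos t) dt, which has no endpoint singularity.
    t = np.linspace(0.0, np.pi, n + 1)
    w = f(np.cos(t)) * g(np.cos(t))
    h = np.pi / n
    return h * (w.sum() - 0.5 * (w[0] + w[-1]))

# x and 2x^2 - 1 (the Chebyshev polynomials T_1 and T_2) are
# orthogonal under this weight.
val = cheb_inner(lambda x: x, lambda x: 2 * x**2 - 1)
```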

Modification of the Least Squares Approximation Recall from Part I

Least Squares Approximation of Functions

Normal Equations
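For a concrete picture of the normal equations (before orthogonality improves on them), here is a sketch in the monomial basis 1, x, ..., x^n; the function name `ls_poly` is illustrative:

```python
import numpy as np

def ls_poly(f, a, b, n, m=4000):
    # Normal equations for the degree-n continuous least squares polynomial:
    #   sum_j c_j * integral(x^(j+k)) = integral(x^k f(x)),  k = 0..n.
    # The Gram matrix entries integrate exactly; the right-hand side
    # uses the composite trapezoid rule.
    G = np.array([[(b**(j + k + 1) - a**(j + k + 1)) / (j + k + 1)
                   for j in range(n + 1)] for k in range(n + 1)])
    x = np.linspace(a, b, m + 1)
    h = (b - a) / m
    rhs = np.empty(n + 1)
    for k in range(n + 1):
        w = x**k * f(x)
        rhs[k] = h * (w.sum() - 0.5 * (w[0] + w[-1]))
    return np.linalg.solve(G, rhs)  # coefficients of 1, x, ..., x^n

c = ls_poly(np.exp, 0.0, 1.0, 2)  # quadratic fit to e^x on [0, 1]
```

When [a, b] = [0, 1], G is the (n+1)×(n+1) Hilbert matrix, which is notoriously ill-conditioned as n grows; this is one motivation for the orthogonal polynomial approach.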

Modification of the Least Squares Approximation

Where are the Improvements?

Definition 8.5

Theorem 8.6 The a_k are easier to compute, and the a_k are “reusable”.

Where to Find Orthogonal Polynomials? The Gram-Schmidt Process

Gram-Schmidt Process

Legendre Polynomials
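The Legendre polynomials produced by Gram-Schmidt (up to scaling) can also be generated by Bonnet's three-term recurrence, a standard fact not derived on the slides; the recurrence is what makes obtaining P_{n+1} from P_n cheap:

```python
import numpy as np

def legendre(n, x):
    # Bonnet's recurrence: (k+1) P_{k+1}(x) = (2k+1) x P_k(x) - k P_{k-1}(x),
    # starting from P_0(x) = 1 and P_1(x) = x.
    x = np.asarray(x, dtype=float)
    p_prev, p = np.ones_like(x), x
    if n == 0:
        return p_prev
    for k in range(1, n):
        p_prev, p = p, ((2 * k + 1) * x * p - k * p_prev) / (k + 1)
    return p
```

These are the standard (non-monic) Legendre polynomials, normalized so that P_k(1) = 1; the Gram-Schmidt process on the slides yields the same polynomials up to constant multiples.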

Example 2 Find the least squares approximation of f(x) = sin(πx) on [-1,1] using the Legendre polynomials.
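Example 2 can be checked numerically. Orthogonality gives each coefficient independently: a_k = (2k+1)/2 ∫_{-1}^{1} f(x) P_k(x) dx, since ∫_{-1}^{1} P_k(x)^2 dx = 2/(2k+1). A sketch using numpy's Legendre helpers (`legval` evaluates a series in the P_k; the name `legendre_ls` is illustrative):

```python
import numpy as np
from numpy.polynomial import legendre as leg

def legendre_ls(f, n, m=4000):
    # a_k = (2k+1)/2 * integral of f(x) P_k(x) over [-1, 1],
    # computed with the composite trapezoid rule.
    x = np.linspace(-1.0, 1.0, m + 1)
    h = 2.0 / m
    coef = np.zeros(n + 1)
    for k in range(n + 1):
        e = np.zeros(n + 1)
        e[k] = 1.0                    # coefficient vector selecting P_k
        w = f(x) * leg.legval(x, e)   # f(x) P_k(x) on the grid
        coef[k] = (2 * k + 1) / 2 * h * (w.sum() - 0.5 * (w[0] + w[-1]))
    return coef

f = lambda x: np.sin(np.pi * x)
coef = legendre_ls(f, 5)          # a_0, ..., a_5
approx = leg.legval(0.5, coef)    # the approximation evaluated at x = 0.5
```

Since sin(πx) is odd, the even coefficients vanish and a_1 = 3/π; raising the degree only appends new coefficients without changing the old ones, illustrating the “reusable” a_k of Theorem 8.6.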

Example 2

Homework Download Homework