Discrete Least Squares Approximation


1 Discrete Least Squares Approximation
Sec:8.1 Discrete Least Squares Approximation

2 Sec:8.1 Discrete Least Squares Approximation
Find a relationship between x and y. From this graph, it appears that the actual relationship between x and y is linear. The likely reason that no line precisely fits the data is error in the data. Two candidate models are the line p1(x) = a0 + a1*x and the quadratic p2(x) = a0 + a1*x + a2*x^2.

3 Sec:8.1 Discrete Least Squares Approximation
๐’€=๐‘ฟ โˆ—๐’‚ Fit the data in Table with the discrete least squares polynomial of degree at most 2. ๐‘ฟ ๐‘ป โˆ—๐’€= ๐‘ฟ ๐‘ป โˆ— ๐‘ฟ โˆ—๐’‚ This is a linear system of 3 equations in 3 unknowns ๐’ = ๐‘ฉ โˆ—๐’‚ ๐Ÿ‘ร—๐Ÿ‘ 3ร—๐Ÿ ๐Ÿ‘ร—๐Ÿ ๐’‘ ๐Ÿ (๐’™) = ๐’‚ ๐ŸŽ + ๐’‚ ๐Ÿ ๐’™+ ๐’‚ ๐Ÿ ๐’™ ๐Ÿ ๐’‚ ๐ŸŽ ๐’‚ ๐Ÿ ๐’‚ ๐Ÿ ๐Ÿ.๐ŸŽ = ๐’‚ ๐ŸŽ + ๐ŸŽโˆ™๐’‚ ๐Ÿ + ๐ŸŽ ๐Ÿ โˆ™๐’‚ ๐Ÿ ๐‘‹ ๐‘‡ 1.284 = ๐’‚ ๐ŸŽ + ๐ŸŽ.๐Ÿ๐Ÿ“๐’‚ ๐Ÿ + ๐ŸŽ.๐Ÿ๐Ÿ“ ๐Ÿ ๐’‚ ๐Ÿ = = ๐’‚ ๐ŸŽ + ๐ŸŽ.๐Ÿ“๐’‚ ๐Ÿ + ๐ŸŽ.๐Ÿ“ ๐Ÿ ๐’‚ ๐Ÿ 2.117 = ๐’‚ ๐ŸŽ + ๐ŸŽ.๐Ÿ•๐Ÿ“๐’‚ ๐Ÿ + ๐ŸŽ.๐Ÿ•๐Ÿ“ ๐Ÿ ๐’‚ ๐Ÿ 8.7680 ๐’‚ ๐ŸŽ ๐’‚ ๐Ÿ ๐’‚ ๐Ÿ = ๐’‚ ๐ŸŽ + ๐Ÿ๐’‚ ๐Ÿ + ๐Ÿ ๐Ÿ ๐’‚ ๐Ÿ normal equations = Matrix Form ๐’ = ๐‘ฉ โˆ— ๐’‚ = ๐’‚ ๐ŸŽ ๐’‚ ๐Ÿ ๐’‚ ๐Ÿ ๐’‚ ๐ŸŽ ๐’‚ ๐Ÿ ๐’‚ ๐Ÿ 0.8418 0.1106 0.0527 = ๐’‚ ๐’€ ๐‘ฟ ๐’‘ ๐Ÿ (๐’™) = ๐’™ ๐’™ ๐Ÿ ๐Ÿ“ร—๐Ÿ ๐Ÿ“ร—๐Ÿ‘ ๐Ÿ‘ร—๐Ÿ

4 Sec:8.1 Discrete Least Squares Approximation
Fit the data in the table with the discrete least squares polynomial of degree at most 2, p2(x) = a0 + a1*x + a2*x^2. [Table: columns x, y, p2, and Error, comparing each data value y_i with the fitted value p2(x_i).]

5 Sec:8.1 Discrete Least Squares Approximation
EXAMPLE: Use least-squares regression to fit a straight line to
x [ ]   y [ ]
Plot the data and the regression line. The fitted line is p1(x) = a0 + a1*x, with residuals e_i = y_i - (a0 + a1*x_i).

MATLAB:
clear
x = [ ]';              % data values (not shown on the slide)
Y = [ ]';
n = length(x);
X = [ones(n,1) x];     % design matrix, so Y = X*a
B = X'*X;  Z = X'*Y;   % normal equations: B*a = Z
a = B\Z                % solve the normal equations
a = X\Y                % equivalently, MATLAB's least-squares backslash
xx = [0:0.1:19]';
yy = a(1) + a(2)*xx;
plot(x,Y,'xr',xx,yy,'-b'); grid on
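A NumPy sketch of the two solution routes in the MATLAB snippet, using hypothetical data since the slide's brackets are empty; B\Z becomes np.linalg.solve, and X\Y becomes np.linalg.lstsq:

```python
import numpy as np

# Hypothetical data (the slide's brackets are empty); any x, Y of equal length work.
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
Y = np.array([0.5, 1.7, 2.4, 3.9, 4.6])

n = len(x)
X = np.column_stack([np.ones(n), x])   # MATLAB: [ones(n,1) x]

# Route 1: solve the normal equations, as in "a = B\Z"
B = X.T @ X
Z = X.T @ Y
a_normal = np.linalg.solve(B, Z)

# Route 2: direct least-squares solve, as in "a = X\Y"
a_lstsq, *_ = np.linalg.lstsq(X, Y, rcond=None)

print(a_normal, a_lstsq)   # the two routes agree
```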

6 Sec:8.1 Discrete Least Squares Approximation
Derivation for the formula: EXAMPLE: Use least-squares regression to fit a straight line p1(x) = a0 + a1*x to
x [ ]   y [ ]
The residual at each of the 10 data points is e_i = y_i - (a0 + a1*x_i):
e_1 = y_1 - a0 - a1*x_1
e_2 = y_2 - a0 - a1*x_2
...
e_10 = y_10 - a0 - a1*x_10
One strategy for fitting a "best" line through the data would be to minimize the sum of the absolute values of the residual errors over all the available data: find values of a0 and a1 that achieve
min_{a0,a1} Σ_{i=1}^{10} | y_i - a0 - a1*x_i |.
To minimize a function of two variables, we need to set its partial derivatives to zero and simultaneously solve the resulting equations. The problem is that the absolute-value function is not differentiable at zero, and we might not be able to find solutions to this pair of equations.

7 Sec:8.1 Discrete Least Squares Approximation
EXAMPLE (Least Squares): Use least-squares regression to fit a straight line to
x [ ]   y [ ]
The least squares approach to this problem determines the best approximating line when the error involved is the sum of the squares of the residuals e_i = y_i - (a0 + a1*x_i): find values of a0 and a1 that achieve
min_{a0,a1} Σ_{i=1}^{10} ( y_i - a0 - a1*x_i )^2.
The least squares method is the most convenient procedure for determining best linear approximations. Writing
E(a0, a1) = Σ_{i=1}^{10} e_i^2 = Σ_{i=1}^{10} ( y_i - a0 - a1*x_i )^2,
for a minimum to occur we need both
∂E/∂a0 = 0   and   ∂E/∂a1 = 0.
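Unlike the absolute-value criterion, E(a0, a1) is differentiable everywhere, so the minimum can be found by setting both partial derivatives to zero. A quick numerical check, on hypothetical data, that the gradient of E vanishes at the least-squares solution:

```python
import numpy as np

# Hypothetical data for a straight-line fit.
x = np.array([0.0, 1.0, 2.0, 3.0, 4.0])
y = np.array([0.2, 1.1, 1.8, 3.1, 3.9])

# Least-squares coefficients for p1(x) = a0 + a1*x.
X = np.column_stack([np.ones_like(x), x])
a0, a1 = np.linalg.lstsq(X, y, rcond=None)[0]

r = y - (a0 + a1 * x)          # residuals e_i
dE_da0 = -2.0 * r.sum()        # dE/da0 = -2 * sum e_i
dE_da1 = -2.0 * (x * r).sum()  # dE/da1 = -2 * sum x_i * e_i
assert abs(dE_da0) < 1e-9 and abs(dE_da1) < 1e-9
```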

8 Sec:8.1 Discrete Least Squares Approximation
๐’Š=๐Ÿ ๐’Ž โˆ’๐Ÿ ๐’™ ๐’Š ๐’š ๐’Š โˆ’ ๐’‚ ๐ŸŽ โˆ’ ๐’‚ ๐Ÿ ๐’™ ๐’Š =๐ŸŽ ๐๐‘ฌ ๐ ๐’‚ ๐Ÿ =๐ŸŽ = ๐’Š=๐Ÿ ๐’Ž ๐’š ๐’Š โˆ’ ๐’‚ ๐ŸŽ โˆ’ ๐’‚ ๐Ÿ ๐’™ ๐’Š ๐Ÿ ๐‘ฌ ๐’‚ ๐ŸŽ , ๐’‚ ๐Ÿ โˆ’ ๐’Š=๐Ÿ ๐’Ž ๐’™ ๐’Š ๐’š ๐’Š โˆ’ ๐’‚ ๐ŸŽ โˆ’ ๐’‚ ๐Ÿ ๐’™ ๐’Š =๐ŸŽ For a minimum to occur, we need both ๐๐‘ฌ ๐ ๐’‚ ๐ŸŽ =๐ŸŽ ๐๐‘ฌ ๐ ๐’‚ ๐Ÿ =๐ŸŽ โˆ’ ๐’Š=๐Ÿ ๐’Ž ๐’™ ๐’Š ๐’š ๐’Š + ๐’Š=๐Ÿ ๐’Ž ๐’‚ ๐ŸŽ ๐’™ ๐’Š + ๐’Š=๐Ÿ ๐’Ž ๐’‚ ๐Ÿ ๐’™ ๐’Š ๐Ÿ =๐ŸŽ ๐’Š=๐Ÿ ๐’Ž โˆ’๐Ÿ ๐’š ๐’Š โˆ’ ๐’‚ ๐ŸŽ โˆ’ ๐’‚ ๐Ÿ ๐’™ ๐’Š =๐ŸŽ ๐๐‘ฌ ๐ ๐’‚ ๐ŸŽ =๐ŸŽ ๐’Š=๐Ÿ ๐’Ž ๐’™ ๐’Š ๐’‚ ๐ŸŽ + ๐’Š=๐Ÿ ๐’Ž ๐’™ ๐’Š ๐Ÿ ๐’‚ ๐Ÿ = ๐’Š=๐Ÿ ๐’Ž ๐’™ ๐’Š ๐’š ๐’Š โˆ’ ๐’Š=๐Ÿ ๐’Ž ๐’š ๐’Š โˆ’ ๐’‚ ๐ŸŽ โˆ’ ๐’‚ ๐Ÿ ๐’™ ๐’Š =๐ŸŽ + ๐’Š=๐Ÿ ๐’Ž ๐’™ ๐’Š ๐’‚ ๐Ÿ = ๐’Š=๐Ÿ ๐’Ž ๐’š ๐’Š ๐’Ž ๐’‚ ๐ŸŽ โˆ’ ๐’Š=๐Ÿ ๐’Ž ๐’š ๐’Š + ๐’Š=๐Ÿ ๐’Ž ๐’‚ ๐ŸŽ + ๐’Š=๐Ÿ ๐’Ž ๐’‚ ๐Ÿ ๐’™ ๐’Š =๐ŸŽ ๐’Š=๐Ÿ ๐’Ž ๐’™ ๐’Š ๐’‚ ๐ŸŽ + ๐’Š=๐Ÿ ๐’Ž ๐’™ ๐’Š ๐Ÿ ๐’‚ ๐Ÿ = ๐’Š=๐Ÿ ๐’Ž ๐’™ ๐’Š ๐’š ๐’Š + ๐’Š=๐Ÿ ๐’Ž ๐’™ ๐’Š ๐’‚ ๐Ÿ = ๐’Š=๐Ÿ ๐’Ž ๐’š ๐’Š ๐’Ž ๐’‚ ๐ŸŽ These are called the normal equations.

9 Sec:8.1 Discrete Least Squares Approximation
+ ๐’Š=๐Ÿ ๐’Ž ๐’™ ๐’Š ๐’‚ ๐Ÿ = ๐’Š=๐Ÿ ๐’Ž ๐’š ๐’Š ๐‘ฆ 1 โ‹ฎ ๐‘ฆ ๐‘š = 1 ๐‘ฅ 1 โ‹ฎ โ‹ฎ 1 ๐‘ฅ ๐‘š ๐’‚ ๐ŸŽ ๐’‚ ๐Ÿ ๐’Ž ๐’‚ ๐ŸŽ Least Squares Use least-squares regression to fit a straight line to ๐’Š=๐Ÿ ๐’Ž ๐’™ ๐’Š ๐’‚ ๐ŸŽ + ๐’Š=๐Ÿ ๐’Ž ๐’™ ๐’Š ๐Ÿ ๐’‚ ๐Ÿ = ๐’Š=๐Ÿ ๐’Ž ๐’™ ๐’Š ๐’š ๐’Š ๐’€ ๐‘ฟ ๐’‚ ๐‘ฅ 1 ๐‘ฆ 1 ๐‘ฅ 2 ๐‘ฆ 2 โ‹ฎ โ‹ฎ ๐‘ฅ ๐‘š ๐‘ฆ ๐‘š ๐‘ฅ ๐‘ฆ ๐’Žร—๐Ÿ ๐’Žร—๐Ÿ ๐Ÿร—๐Ÿ These are called the normal equations. ๐’€=๐‘ฟ โˆ—๐’‚ ๐‘ฟ ๐‘ป โˆ—๐’€= ๐‘ฟ ๐‘ป โˆ— ๐‘ฟ โˆ—๐’‚ Matrix Form ๐’‘ ๐Ÿ (๐’™)= ๐’‚ ๐ŸŽ + ๐’‚ ๐Ÿ ๐’™ ๐‘š ๐‘ฅ ๐‘– ๐‘ฅ ๐‘– ๐‘ฅ ๐‘– ๐’‚ ๐ŸŽ ๐’‚ ๐Ÿ = ๐‘ฆ ๐‘– ๐‘ฅ ๐‘– ๐‘ฆ ๐‘– 1 โ‹ฏ 1 ๐‘ฅ 1 โ‹ฏ ๐‘ฅ ๐‘š ๐‘ฆ 1 โ‹ฎ ๐‘ฆ ๐‘š = 1 โ‹ฏ 1 ๐‘ฅ 1 โ‹ฏ ๐‘ฅ ๐‘š ๐‘ฅ 1 โ‹ฎ โ‹ฎ 1 ๐‘ฅ ๐‘š ๐’‚ ๐ŸŽ ๐’‚ ๐Ÿ ๐‘ฆ 1 +โ‹ฏ+ ๐‘ฆ ๐‘š ๐‘ฅ 1 ๐‘ฆ 1 +โ‹ฏ+ ๐‘ฅ ๐‘š ๐‘ฆ ๐‘š = 1+โ‹ฏ+1 ๐‘ฅ 1 +โ‹ฏ+ ๐‘ฅ ๐‘š ๐‘ฅ 1 +โ‹ฏ+ ๐‘ฅ ๐‘š ๐‘ฅ 1 2 +โ‹ฏ+ ๐‘ฅ ๐‘š ๐’‚ ๐ŸŽ ๐’‚ ๐Ÿ

10 Derivation for the formula: (polynomial of degree 2)
Sec:8.1 Discrete Least Squares Approximation Derivation for the formula: (polynomial of degree 2)

11 Derivation for the formula: (polynomial of degree 2)
Sec:8.1 Discrete Least Squares Approximation
Least Squares: use least-squares regression to fit a polynomial of degree 2, p2(x) = a0 + a1*x + a2*x^2, to the data (x_1, y_1), (x_2, y_2), ..., (x_m, y_m).

E(a0, a1, a2) = Σ_{i=1}^{m} ( y_i - a0 - a1*x_i - a2*x_i^2 )^2

For a minimum to occur, we need all of ∂E/∂a0 = 0, ∂E/∂a1 = 0, and ∂E/∂a2 = 0.

∂E/∂a0 = 0:
Σ_{i=1}^{m} -2 ( y_i - a0 - a1*x_i - a2*x_i^2 ) = 0
-Σ y_i + Σ a0 + Σ a1*x_i + Σ a2*x_i^2 = 0
m*a0 + (Σ x_i)*a1 + (Σ x_i^2)*a2 = Σ y_i

12 Sec:8.1 Discrete Least Squares Approximation
= ๐’Š=๐Ÿ ๐’Ž ๐’š ๐’Š โˆ’ ๐’‚ ๐ŸŽ โˆ’ ๐’‚ ๐Ÿ ๐’™ ๐’Š โˆ’ ๐’‚ ๐Ÿ ๐’™ ๐’Š ๐Ÿ ๐Ÿ ๐‘ฅ 1 ๐‘ฆ 1 ๐‘ฅ 2 ๐‘ฆ 2 โ‹ฎ โ‹ฎ ๐‘ฅ ๐‘š ๐‘ฆ ๐‘š ๐‘ฅ ๐‘ฆ ๐‘ฌ ๐’‚ ๐ŸŽ , ๐’‚ ๐Ÿ Use least-squares regression to fit a polynomial of degree 2 to For a minimum to occur, we need both ๐’‘ ๐Ÿ ๐’™ = ๐’‚ ๐ŸŽ + ๐’‚ ๐Ÿ ๐’™+ ๐’‚ ๐Ÿ ๐’™ ๐Ÿ ๐๐‘ฌ ๐ ๐’‚ ๐ŸŽ =๐ŸŽ ๐๐‘ฌ ๐ ๐’‚ ๐Ÿ =๐ŸŽ ๐๐‘ฌ ๐ ๐’‚ ๐Ÿ =๐ŸŽ ๐’Š=๐Ÿ ๐’Ž โˆ’๐Ÿ ๐’™ ๐’Š ๐’š ๐’Š โˆ’ ๐’‚ ๐ŸŽ โˆ’ ๐’‚ ๐Ÿ ๐’™ ๐’Š โˆ’ ๐’‚ ๐Ÿ ๐’™ ๐’Š ๐Ÿ =๐ŸŽ ๐๐‘ฌ ๐ ๐’‚ ๐Ÿ =๐ŸŽ โˆ’ ๐’Š=๐Ÿ ๐’Ž ๐’™ ๐’Š ๐’š ๐’Š โˆ’ ๐’‚ ๐ŸŽ โˆ’ ๐’‚ ๐Ÿ ๐’™ ๐’Š โˆ’ ๐’‚ ๐Ÿ ๐’™ ๐’Š ๐Ÿ =๐ŸŽ โˆ’ ๐’Š=๐Ÿ ๐’Ž ๐’™ ๐’Š ๐’š ๐’Š + ๐’Š=๐Ÿ ๐’Ž ๐’™ ๐’Š ๐’‚ ๐ŸŽ + ๐’Š=๐Ÿ ๐’Ž ๐’‚ ๐Ÿ ๐’™ ๐’Š ๐Ÿ + ๐’Š=๐Ÿ ๐’Ž ๐’‚ ๐Ÿ ๐’™ ๐’Š ๐Ÿ‘ =๐ŸŽ ๐’Š=๐Ÿ ๐’Ž ๐’™ ๐’Š ๐’‚ ๐ŸŽ + ๐’Š=๐Ÿ ๐’Ž ๐’™ ๐’Š ๐Ÿ ๐’‚ ๐Ÿ + ๐’Š=๐Ÿ ๐’Ž ๐’™ ๐’Š ๐Ÿ‘ ๐’‚ ๐Ÿ = ๐’Š=๐Ÿ ๐’Ž ๐’™ ๐’Š ๐’š ๐’Š

13 Sec:8.1 Discrete Least Squares Approximation
= ๐’Š=๐Ÿ ๐’Ž ๐’š ๐’Š โˆ’ ๐’‚ ๐ŸŽ โˆ’ ๐’‚ ๐Ÿ ๐’™ ๐’Š โˆ’ ๐’‚ ๐Ÿ ๐’™ ๐’Š ๐Ÿ ๐Ÿ ๐‘ฅ 1 ๐‘ฆ 1 ๐‘ฅ 2 ๐‘ฆ 2 โ‹ฎ โ‹ฎ ๐‘ฅ ๐‘š ๐‘ฆ ๐‘š ๐‘ฅ ๐‘ฆ ๐‘ฌ ๐’‚ ๐ŸŽ , ๐’‚ ๐Ÿ Use least-squares regression to fit a polynomial of degree 2 to For a minimum to occur, we need both ๐’‘ ๐Ÿ ๐’™ = ๐’‚ ๐ŸŽ + ๐’‚ ๐Ÿ ๐’™+ ๐’‚ ๐Ÿ ๐’™ ๐Ÿ ๐๐‘ฌ ๐ ๐’‚ ๐ŸŽ =๐ŸŽ ๐๐‘ฌ ๐ ๐’‚ ๐Ÿ =๐ŸŽ ๐๐‘ฌ ๐ ๐’‚ ๐Ÿ =๐ŸŽ ๐’Š=๐Ÿ ๐’Ž โˆ’๐Ÿ ๐’™ ๐’Š ๐Ÿ ๐’š ๐’Š โˆ’ ๐’‚ ๐ŸŽ โˆ’ ๐’‚ ๐Ÿ ๐’™ ๐’Š โˆ’ ๐’‚ ๐Ÿ ๐’™ ๐’Š ๐Ÿ =๐ŸŽ ๐๐‘ฌ ๐ ๐’‚ ๐Ÿ =๐ŸŽ โˆ’ ๐’Š=๐Ÿ ๐’Ž ๐’™ ๐’Š ๐Ÿ ๐’š ๐’Š โˆ’ ๐’‚ ๐ŸŽ โˆ’ ๐’‚ ๐Ÿ ๐’™ ๐’Š โˆ’ ๐’‚ ๐Ÿ ๐’™ ๐’Š ๐Ÿ =๐ŸŽ โˆ’ ๐’Š=๐Ÿ ๐’Ž ๐’™ ๐’Š ๐Ÿ ๐’š ๐’Š + ๐’Š=๐Ÿ ๐’Ž ๐’™ ๐’Š ๐Ÿ ๐’‚ ๐ŸŽ + ๐’Š=๐Ÿ ๐’Ž ๐’‚ ๐Ÿ ๐’™ ๐’Š ๐Ÿ‘ + ๐’Š=๐Ÿ ๐’Ž ๐’‚ ๐Ÿ ๐’™ ๐’Š ๐Ÿ’ =๐ŸŽ ๐’Š=๐Ÿ ๐’Ž ๐’™ ๐’Š ๐Ÿ ๐’‚ ๐ŸŽ + ๐’Š=๐Ÿ ๐’Ž ๐’™ ๐’Š ๐Ÿ‘ ๐’‚ ๐Ÿ + ๐’Š=๐Ÿ ๐’Ž ๐’™ ๐’Š ๐Ÿ’ ๐’‚ ๐Ÿ = ๐’Š=๐Ÿ ๐’Ž ๐’™ ๐’Š ๐Ÿ ๐’š ๐’Š

14 Sec:8.1 Discrete Least Squares Approximation
= ๐’Š=๐Ÿ ๐’Ž ๐’š ๐’Š โˆ’ ๐’‚ ๐ŸŽ โˆ’ ๐’‚ ๐Ÿ ๐’™ ๐’Š โˆ’ ๐’‚ ๐Ÿ ๐’™ ๐’Š ๐Ÿ ๐Ÿ ๐‘ฅ 1 ๐‘ฆ 1 ๐‘ฅ 2 ๐‘ฆ 2 โ‹ฎ โ‹ฎ ๐‘ฅ ๐‘š ๐‘ฆ ๐‘š ๐‘ฅ ๐‘ฆ ๐‘ฌ ๐’‚ ๐ŸŽ , ๐’‚ ๐Ÿ Use least-squares regression to fit a polynomial of degree 2 to For a minimum to occur, we need both ๐’‘ ๐Ÿ ๐’™ = ๐’‚ ๐ŸŽ + ๐’‚ ๐Ÿ ๐’™+ ๐’‚ ๐Ÿ ๐’™ ๐Ÿ ๐๐‘ฌ ๐ ๐’‚ ๐ŸŽ =๐ŸŽ ๐๐‘ฌ ๐ ๐’‚ ๐Ÿ =๐ŸŽ ๐๐‘ฌ ๐ ๐’‚ ๐Ÿ =๐ŸŽ normal equations normal equations in Matrix Form + ๐’Š=๐Ÿ ๐’Ž ๐’™ ๐’Š ๐’‚ ๐Ÿ + ๐’Š=๐Ÿ ๐’Ž ๐’™ ๐’Š ๐Ÿ ๐’‚ ๐Ÿ = ๐’Š=๐Ÿ ๐’Ž ๐’š ๐’Š ๐’Ž ๐’‚ ๐ŸŽ ๐’Š=๐Ÿ ๐’Ž ๐’™ ๐’Š ๐’‚ ๐ŸŽ + ๐’Š=๐Ÿ ๐’Ž ๐’™ ๐’Š ๐Ÿ ๐’‚ ๐Ÿ + ๐’Š=๐Ÿ ๐’Ž ๐’™ ๐’Š ๐Ÿ‘ ๐’‚ ๐Ÿ = ๐’Š=๐Ÿ ๐’Ž ๐’™ ๐’Š ๐’š ๐’Š ๐‘› ๐’™ ๐’Š ๐‘ฅ ๐‘– ๐’™ ๐’Š ๐‘ฅ ๐‘– ๐‘ฅ ๐‘– ๐‘ฅ ๐‘– ๐‘ฅ ๐‘– ๐‘ฅ ๐‘– ๐’‚ ๐ŸŽ ๐’‚ ๐Ÿ ๐’‚ ๐Ÿ = ๐’š ๐’Š ๐’™ ๐’Š ๐’š ๐’Š ๐‘ฅ ๐‘– 2 ๐’š ๐’Š ๐’Š=๐Ÿ ๐’Ž ๐’™ ๐’Š ๐Ÿ ๐’‚ ๐ŸŽ + ๐’Š=๐Ÿ ๐’Ž ๐’™ ๐’Š ๐Ÿ‘ ๐’‚ ๐Ÿ + ๐’Š=๐Ÿ ๐’Ž ๐’™ ๐’Š ๐Ÿ’ ๐’‚ ๐Ÿ = ๐’Š=๐Ÿ ๐’Ž ๐’™ ๐’Š ๐Ÿ ๐’š ๐’Š

15 Sec:8.1 Discrete Least Squares Approximation
๐’Š=๐Ÿ ๐’Ž ๐’™ ๐’Š ๐Ÿ ๐’‚ ๐ŸŽ + ๐’Š=๐Ÿ ๐’Ž ๐’™ ๐’Š ๐Ÿ‘ ๐’‚ ๐Ÿ + ๐’Š=๐Ÿ ๐’Ž ๐’™ ๐’Š ๐Ÿ’ ๐’‚ ๐Ÿ = ๐’Š=๐Ÿ ๐’Ž ๐’™ ๐’Š ๐Ÿ ๐’š ๐’Š ๐’Š=๐Ÿ ๐’Ž ๐’™ ๐’Š ๐’‚ ๐ŸŽ + ๐’Š=๐Ÿ ๐’Ž ๐’™ ๐’Š ๐Ÿ ๐’‚ ๐Ÿ + ๐’Š=๐Ÿ ๐’Ž ๐’™ ๐’Š ๐Ÿ‘ ๐’‚ ๐Ÿ = ๐’Š=๐Ÿ ๐’Ž ๐’™ ๐’Š ๐’š ๐’Š ๐’Ž ๐’‚ ๐ŸŽ + ๐’Š=๐Ÿ ๐’Ž ๐’™ ๐’Š ๐’‚ ๐Ÿ + ๐’Š=๐Ÿ ๐’Ž ๐’™ ๐’Š ๐Ÿ ๐’‚ ๐Ÿ = ๐’Š=๐Ÿ ๐’Ž ๐’š ๐’Š ๐‘ฆ 1 โ‹ฎ ๐‘ฆ ๐‘š = 1 ๐‘ฅ 1 ๐‘ฅ 1 2 โ‹ฎ โ‹ฎ โ‹ฎ 1 ๐‘ฅ ๐‘š ๐‘ฅ ๐‘š ๐’‚ ๐ŸŽ ๐’‚ ๐Ÿ ๐’‚ ๐Ÿ Least Squares Use least-squares regression to fit a poly of deg 2 to ๐’€ ๐‘ฅ 1 ๐‘ฆ 1 ๐‘ฅ 2 ๐‘ฆ 2 โ‹ฎ โ‹ฎ ๐‘ฅ ๐‘š ๐‘ฆ ๐‘š ๐‘ฅ ๐‘ฆ ๐‘ฟ ๐’‚ ๐’Žร—๐Ÿ ๐’Žร—๐Ÿ‘ ๐Ÿ‘ร—๐Ÿ ๐’€=๐‘ฟ โˆ—๐’‚ normal equations. ๐‘ฟ ๐‘ป โˆ—๐’€= ๐‘ฟ ๐‘ป โˆ— ๐‘ฟ โˆ—๐’‚ Matrix Form ๐’‘ ๐Ÿ ๐’™ = ๐’‚ ๐ŸŽ + ๐’‚ ๐Ÿ ๐’™+ ๐’‚ ๐Ÿ ๐’™ ๐Ÿ ๐‘š ๐’™ ๐’Š ๐‘ฅ ๐‘– ๐’™ ๐’Š ๐‘ฅ ๐‘– ๐‘ฅ ๐‘– ๐‘ฅ ๐‘– ๐‘ฅ ๐‘– ๐‘ฅ ๐‘– ๐’‚ ๐ŸŽ ๐’‚ ๐Ÿ ๐’‚ ๐Ÿ = ๐’š ๐’Š ๐’™ ๐’Š ๐’š ๐’Š ๐‘ฅ ๐‘– 2 ๐’š ๐’Š 1 โ‹ฏ 1 ๐‘ฅ 1 โ‹ฏ ๐‘ฅ ๐‘š ๐‘ฅ 1 2 โ‹ฏ ๐‘ฅ ๐‘š ๐‘ฆ 1 โ‹ฎ ๐‘ฆ ๐‘š = 1 โ‹ฏ 1 ๐‘ฅ 1 โ‹ฏ ๐‘ฅ ๐‘š ๐‘ฅ 1 2 โ‹ฏ ๐‘ฅ ๐‘š ๐‘ฅ 1 ๐‘ฅ 1 2 โ‹ฎ โ‹ฎ โ‹ฎ 1 ๐‘ฅ ๐‘š ๐‘ฅ ๐‘š ๐’‚ ๐ŸŽ ๐’‚ ๐Ÿ ๐’‚ ๐Ÿ ๐‘ฆ 1 +โ‹ฏ+ ๐‘ฆ ๐‘š ๐‘ฅ 1 ๐‘ฆ 1 +โ‹ฏ+ ๐‘ฅ ๐‘š ๐‘ฆ ๐‘š ๐‘ฅ 1 2 ๐‘ฆ 1 +โ‹ฏ+ ๐‘ฅ ๐‘š 2 ๐‘ฆ ๐‘š = 1+โ‹ฏ+1 ๐‘ฅ 1 +โ‹ฏ+ ๐‘ฅ ๐‘š ๐‘ฅ 1 2 +โ‹ฏ+ ๐‘ฅ ๐‘š 2 ๐‘ฅ 1 +โ‹ฏ+ ๐‘ฅ ๐‘š ๐‘ฅ 1 2 +โ‹ฏ+ ๐‘ฅ ๐‘š 2 ๐‘ฅ 1 3 +โ‹ฏ+ ๐‘ฅ ๐‘š 3 ๐‘ฅ 1 2 +โ‹ฏ+ ๐‘ฅ ๐‘š 2 ๐‘ฅ 1 3 +โ‹ฏ+ ๐‘ฅ ๐‘š 3 ๐‘ฅ 1 4 +โ‹ฏ+ ๐‘ฅ ๐‘š ๐’‚ ๐ŸŽ ๐’‚ ๐Ÿ ๐’‚ ๐Ÿ

16 ๐’Ž ๐’ Sec:8.1 Discrete Least Squares Approximation
= ๐’Š=๐Ÿ ๐’Ž ๐’š ๐’Š โˆ’ ๐’‚ ๐ŸŽ โˆ’ ๐’‚ ๐Ÿ ๐’™ ๐’Š โˆ’โ‹ฏโˆ’ ๐’‚ ๐’ ๐’™ ๐’Š ๐’ ๐Ÿ ๐‘ฅ 1 ๐‘ฆ 1 ๐‘ฅ 2 ๐‘ฆ 2 โ‹ฎ โ‹ฎ ๐‘ฅ ๐‘š ๐‘ฆ ๐‘š ๐‘ฅ ๐‘ฆ ๐‘ฌ ๐’‚ ๐ŸŽ , ๐’‚ ๐Ÿ Use least-squares regression to fit a polynomial of degree n to For a minimum to occur, we need both ๐๐‘ฌ ๐ ๐’‚ ๐ŸŽ =๐ŸŽ ๐๐‘ฌ ๐ ๐’‚ ๐’ =๐ŸŽ ๐๐‘ฌ ๐ ๐’‚ ๐Ÿ =๐ŸŽ โ‹ฏโ‹ฏ ๐’‘ ๐Ÿ ๐’™ = ๐’‚ ๐ŸŽ + ๐’‚ ๐Ÿ ๐’™+โ‹ฏ+ ๐’‚ ๐’ ๐’™ ๐’ normal equations in Matrix Form ๐‘š ๐’™ ๐’Š โ‹ฏ ๐‘ฅ ๐‘– ๐‘› ๐’™ ๐’Š ๐‘ฅ ๐‘– โ‹ฏ ๐‘ฅ ๐‘– ๐‘› โ‹ฎ โ‹ฎ โ‹ฑ โ‹ฎ ๐‘ฅ ๐‘– ๐‘› ๐‘ฅ ๐‘– ๐‘› โ‹ฏ ๐‘ฅ ๐‘– 2๐‘› ๐’‚ ๐ŸŽ ๐’‚ ๐Ÿ โ‹ฎ ๐’‚ ๐’ = ๐’š ๐’Š ๐’™ ๐’Š ๐’š ๐’Š โ‹ฎ ๐‘ฅ ๐‘– ๐‘› ๐’š ๐’Š ๐‘ฅ ๐‘– 0 ๐’Ž Number of data points ๐’ Degree of polynomial (๐‘›+1)ร—(๐‘›+1) (๐‘›+1)ร—1

17 Sec:8.1 Discrete Least Squares Approximation

