CSE 541 - Differentiation Roger Crawfis.

Numerical Differentiation The mathematical definition: f'(x) = lim (h → 0) [f(x + h) - f(x)] / h. This can also be thought of as the slope of the tangent line to the curve at x.

Numerical Differentiation We cannot compute the limit as h goes to zero, so we need to approximate it. Applying the formula directly with a non-zero h gives the slope of the secant line through the points at x and x + h.

Numerical Differentiation This is called the Forward Difference formula, f'(x) ≈ [f(x + h) - f(x)] / h, and it can be derived using Taylor's series: f(x + h) = f(x) + h f'(x) + (h^2 / 2) f''(ξ), so, theoretically speaking, the truncation error is O(h).
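
As a concrete illustration (not part of the original slides), here is a minimal sketch of the forward-difference formula in Python; the test function f(x) = sin(x) and the step size are my own choices.

```python
import math

def forward_difference(f, x, h=1e-5):
    """Approximate f'(x) with the forward-difference formula (f(x+h) - f(x)) / h."""
    return (f(x + h) - f(x)) / h

# Example: f(x) = sin(x), so f'(1.0) should be cos(1.0) ~= 0.5403.
approx = forward_difference(math.sin, 1.0)
print(approx, math.cos(1.0), abs(approx - math.cos(1.0)))  # error is O(h)
```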

Truncation Errors Let f(x) = a + e and f(x + h) = a + d, where e << a and d << a as h approaches zero. With the limited precision of floating-point arithmetic, our stored representation of f(x) ≈ a ≈ f(x + h). We can easily get a random round-off bit as the most significant digit of the subtraction f(x + h) - f(x); dividing that by a tiny h then gives a badly wrong answer for f'(x).

Error Tradeoff Using a smaller step size reduces the truncation error, but it increases the round-off error. A point of diminishing returns occurs where the total error is minimized: always think and test! [Figure: log error versus log step size, with the truncation error falling, the round-off error rising, and the total error reaching its minimum at the point of diminishing returns.]
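
A quick numerical experiment (my own sketch; the test function f(x) = exp(x) at x = 1 is an arbitrary choice) makes the tradeoff visible: for the forward-difference formula the total error falls with h until round-off takes over, roughly near h ≈ sqrt(machine epsilon).

```python
import math

def forward_difference(f, x, h):
    return (f(x + h) - f(x)) / h

x = 1.0
exact = math.exp(x)  # d/dx exp(x) = exp(x)
for k in range(1, 17):
    h = 10.0 ** (-k)
    err = abs(forward_difference(math.exp, x, h) - exact)
    print(f"h = 1e-{k:02d}   error = {err:.3e}")
# The error shrinks until about h ~ 1e-8 (near sqrt of machine epsilon),
# then grows again as round-off in f(x+h) - f(x) dominates.
```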

Numerical Differentiation The forward-difference formula favors (or is biased toward) the right-hand side of the curve, using only x and x + h. Why not use the left side, x - h?

Numerical Differentiation Using the left side leads to the Backward Difference formula: f'(x) ≈ [f(x) - f(x - h)] / h, which is also O(h).

Numerical Differentiation Can we do better? Let's average the two. Averaging the forward difference [f(x + h) - f(x)] / h and the backward difference [f(x) - f(x - h)] / h gives f'(x) ≈ [f(x + h) - f(x - h)] / (2h). This is called the Central Difference formula.

Central Differences At first glance this formula does not seem very good: it does not follow the calculus definition, it takes the slope of a secant of width 2h, and the point we are actually interested in, x, is not even evaluated.

Numerical Differentiation Is this any better? Let's use Taylor's series to examine the error: f(x + h) = f(x) + h f'(x) + (h^2 / 2) f''(x) + (h^3 / 6) f'''(x) + ... and f(x - h) = f(x) - h f'(x) + (h^2 / 2) f''(x) - (h^3 / 6) f'''(x) + ... Subtracting and dividing by 2h gives [f(x + h) - f(x - h)] / (2h) = f'(x) + (h^2 / 6) f'''(ξ) + ...

Central Differences The central-difference formula has much better convergence: its error goes to zero like h^2, so it approaches the derivative much faster as h shrinks!
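
To see the improved convergence, here is a small sketch (my own choice of test function, f(x) = sin(x)) comparing the two formulas: halving h roughly halves the forward-difference error but quarters the central-difference error.

```python
import math

def forward(f, x, h):
    return (f(x + h) - f(x)) / h

def central(f, x, h):
    return (f(x + h) - f(x - h)) / (2.0 * h)

x, exact = 1.0, math.cos(1.0)
for h in (0.1, 0.05, 0.025, 0.0125):
    e_fwd = abs(forward(math.sin, x, h) - exact)   # shrinks like O(h)
    e_cen = abs(central(math.sin, x, h) - exact)   # shrinks like O(h^2)
    print(f"h = {h:<7} forward err = {e_fwd:.2e}   central err = {e_cen:.2e}")
```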

Warning We still have the truncation and round-off error problem. Consider a particular test function: build a table with smaller and smaller values of h and watch what happens. What about large values of h for this function?
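
A short sketch of such a table (the original slide's function is not shown; f(x) = exp(x) at x = 1 is my own arbitrary stand-in): both very large and very small h give poor central-difference estimates.

```python
import math

def central(f, x, h):
    return (f(x + h) - f(x - h)) / (2.0 * h)

x, exact = 1.0, math.exp(1.0)
for h in (10.0, 1.0, 1e-2, 1e-4, 1e-6, 1e-8, 1e-10, 1e-12):
    err = abs(central(math.exp, x, h) - exact)
    print(f"h = {h:<8g} error = {err:.3e}")
# Large h: truncation error dominates.  Tiny h: round-off error dominates.
```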

Richardson Extrapolation Can we do better? Is my choice of h a good one? Let's subtract the two Taylor series expansions again: f(x + h) - f(x - h) = 2h f'(x) + (h^3 / 3) f'''(x) + (h^5 / 60) f^(5)(x) + ...

Richardson Extrapolation Assuming the higher derivatives exist, we can hold x fixed (which also fixes the values of f and its derivatives at x) and divide by 2h to obtain the following formula: φ(h) = [f(x + h) - f(x - h)] / (2h) = f'(x) + a_2 h^2 + a_4 h^4 + a_6 h^6 + ..., where the coefficients a_2k depend only on the derivatives of f at x. Richardson extrapolation examines this operator φ as a function of h.

Richardson Extrapolation This function approximates f'(x) to O(h^2), as we saw earlier. Let's look at the operator as h is halved: φ(h/2) = f'(x) + a_2 (h/2)^2 + a_4 (h/2)^4 + ... The expansion has the same leading constants a_2k, just with h replaced by h/2.

Richardson Extrapolation Using these two formulas, we can come up with another estimate for the derivative that cancels out the h^2 terms: f'(x) ≈ φ(h/2) + [φ(h/2) - φ(h)] / 3, i.e., the new estimate plus one third of the difference between the old and new estimates. This extrapolates by assuming the new estimate undershot.
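
A one-step sketch of this idea (again my own example, with f(x) = sin(x) and h = 0.1): combining φ(h) and φ(h/2) cancels the h^2 terms and leaves an O(h^4) estimate.

```python
import math

def central(f, x, h):
    return (f(x + h) - f(x - h)) / (2.0 * h)

f, x, h = math.sin, 1.0, 0.1
phi_h  = central(f, x, h)        # old estimate, error ~ a2 * h^2
phi_h2 = central(f, x, h / 2)    # new estimate, error ~ a2 * h^2 / 4
better = phi_h2 + (phi_h2 - phi_h) / 3.0   # h^2 terms cancel; error is O(h^4)
print(abs(phi_h - math.cos(x)), abs(phi_h2 - math.cos(x)), abs(better - math.cos(x)))
```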

Richardson Extrapolation If h is small (h << 1), then h^4 goes to zero much faster than h^2. Cool!!! Can we also cancel the h^4 term and push the error down to h^6? Yes, by also using h/4 to estimate the derivative and extrapolating again.

Richardson Extrapolation Consider the following property of the central-difference operator: φ(h) = L + a_2 h^2 + a_4 h^4 + a_6 h^6 + ..., where L = f'(x) is unknown, as are the coefficients a_2k.

Richardson Extrapolation Do not forget that the formal definition is simply the central-differences formula from the previous slides: φ(h) = [f(x + h) - f(x - h)] / (2h). New symbology (is this a word?): D(n,0) = φ(h / 2^n), the central-difference estimate computed with step size h / 2^n.

Richardson Extrapolation D(n,0) is just the central-differences operator evaluated at successively halved values of h. Okay, so we proceed by computing D(n,0) for several values of n, recalling our earlier cancellation of the h^2 term.

Richardson Extrapolation If we let h → h/2 at each step, then in general we can write: φ(h / 2^n) = L + a_2 (h / 2^n)^2 + a_4 (h / 2^n)^4 + ... We denote this operator by D(n,0) = φ(h / 2^n).

Richardson Extrapolation Now, we can formally define Richardson's extrapolation operator as D(n,m) = D(n,m-1) + [D(n,m-1) - D(n-1,m-1)] / (4^m - 1), or equivalently D(n,m) = [4^m D(n,m-1) - D(n-1,m-1)] / (4^m - 1): the new estimate plus a correction based on the difference between the old and new estimates.

Richardson Extrapolation Now, we can formally define Richardson's extrapolation operator as D(n,m) = [4^m D(n,m-1) - D(n-1,m-1)] / (4^m - 1). Memorize me!!!!

Richardson Extrapolation Theorem The entries satisfy D(n,m) = f'(x) + O((h / 2^n)^(2m+2)), so these terms approach f'(x) very quickly: each additional column starts at a much higher order!

Richardson Extrapolation Since m n, this leads to a two-dimensional triangular array of values as follows: We must pick an initial value of h and a max iteration value N. November 19, 2018 OSU/CIS 541

Example The entries of the table converge to eight decimal places. But is the result accurate?

Example We can look at the (theoretical) error term on this example. Taking the derivative of the total-error expression with respect to h shows where round-off error begins to dominate.

Second Derivatives What if we need the second derivative? Any guesses?

Second Derivatives Let’s cancel out the odd derivatives and double up the even ones: Implies adding the terms together. November 19, 2018 OSU/CIS 541

Second Derivatives Isolating the second-derivative term yields f''(x) ≈ [f(x + h) - 2 f(x) + f(x - h)] / h^2, with a truncation error on the order of (h^2 / 12) f''''(ξ), i.e., O(h^2).
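
As a sketch (the test function and step size are my own choices), the three-point second-derivative formula in code:

```python
import math

def second_derivative(f, x, h=1e-4):
    """Approximate f''(x) with (f(x+h) - 2 f(x) + f(x-h)) / h**2; error is O(h^2)."""
    return (f(x + h) - 2.0 * f(x) + f(x - h)) / (h * h)

# Example: f(x) = sin(x), so f''(1.0) should be -sin(1.0).
print(second_derivative(math.sin, 1.0), -math.sin(1.0))
```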

Partial Derivatives Remember: there is nothing special about partial derivatives. For example, ∂f/∂x at (x, y) ≈ [f(x + h, y) - f(x - h, y)] / (2h), holding the other variables fixed.

Calculating the Gradient For lab 2, you need to calculate the gradient. Just use central differences for each partial derivative. Remember to normalize it (divide by its length).
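
For instance (a minimal sketch, not the actual lab 2 code; the scalar field and step size below are made up for illustration), the gradient via central differences of each partial, followed by normalization:

```python
import math

def gradient(f, p, h=1e-5):
    """Central-difference estimate of grad f at point p (a sequence of coordinates)."""
    g = []
    for i in range(len(p)):
        plus  = list(p); plus[i]  += h
        minus = list(p); minus[i] -= h
        g.append((f(plus) - f(minus)) / (2.0 * h))
    return g

def normalize(v):
    length = math.sqrt(sum(c * c for c in v))
    return [c / length for c in v] if length > 0.0 else v

# Hypothetical scalar field f(x, y, z) = x^2 + y^2 + z^2; grad at (1, 2, 2) is (2, 4, 4).
field = lambda p: p[0] ** 2 + p[1] ** 2 + p[2] ** 2
print(normalize(gradient(field, [1.0, 2.0, 2.0])))  # ~ (1/3, 2/3, 2/3)
```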