Slide 1: Recursion, Algorithm Analysis, Standard Algorithms (Chapter 7)

Slide 2: Recursion
Consider things that reference themselves:
– The Cat in the Hat in the hat in the hat …
– A picture of a picture
– Having a dream in your dream!
Recursion has its base in mathematical induction. Recursion always has:
– an anchor (or base, or trivial) case
– an inductive case

Slide 3: Recursion
A recursive function will call or reference itself. Consider:

int R(int x) { return 1 + R(x); }

What is wrong with this picture?
– Nothing will stop the repeated recursion.
– It behaves like an endless loop, but each call consumes stack space, so the program will eventually run out of memory.
The problem is that this function has no anchor.

Slide 4: Recursion
A proper recursive function will have:
An anchor or base case
– the function's value is defined for one or more values of the parameters
An inductive or recursive step
– the function's value (or action) for the current parameter values is defined in terms of previously defined function values (or actions) and/or parameter values

Slide 5: Recursive Example

int Factorial(int n) {
    if (n == 0)
        return 1;                      // anchor (base) case
    else
        return n * Factorial(n - 1);   // inductive (recursive) case
}

Which is the anchor? Which is the inductive or recursive part? How does the anchor keep it from going forever?
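
As an illustration (not part of the original slides), tracing Factorial(4) shows how the anchor stops the recursion:
Factorial(4) = 4 * Factorial(3)
             = 4 * (3 * Factorial(2))
             = 4 * (3 * (2 * Factorial(1)))
             = 4 * (3 * (2 * (1 * Factorial(0))))
             = 4 * (3 * (2 * (1 * 1)))          // Factorial(0) hits the anchor and returns 1
             = 24
Each call reduces n by 1, so (for n ≥ 0) the anchor n == 0 is always reached and the chain of calls unwinds.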

Slide 6: A Bad Use of Recursion
Fibonacci numbers: 1, 1, 2, 3, 5, 8, 13, 21, 34, …
f_1 = 1, f_2 = 1, f_n = f_(n-2) + f_(n-1)
A recursive function:

double Fib(unsigned n) {
    if (n <= 2)
        return 1;
    else
        return Fib(n - 1) + Fib(n - 2);
}

Why is this inefficient?
– Note the recursion tree on pg. 327: the same values (Fib(n-2), Fib(n-3), …) are recomputed over and over, so the number of calls grows exponentially.
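
For contrast, a minimal iterative sketch (not from the slides; the name FibIterative is just illustrative) computes the same values in O(n) time by keeping only the two most recent terms:

// Iterative Fibonacci: O(n) time, O(1) extra space.
// Each value is computed exactly once, unlike the recursive version,
// which recomputes the same subproblems exponentially many times.
double FibIterative(unsigned n) {
    if (n <= 2) return 1;            // f_1 = f_2 = 1
    double prev = 1, curr = 1;       // f_(k-2) and f_(k-1)
    for (unsigned k = 3; k <= n; ++k) {
        double next = prev + curr;   // f_k = f_(k-2) + f_(k-1)
        prev = curr;
        curr = next;
    }
    return curr;                     // f_n
}
// Example: FibIterative(9) returns 34.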

Slide 7: Uses of Recursion
Easily understood recursive functions are not always the most efficient algorithms.
"Tail recursive" functions
– The last statement in the recursive function is the recursive invocation.
– These are much more efficiently written with a loop (see the sketch after this slide).
Elegant recursive algorithms
– Binary search (see pg. 328)
– Palindrome checker (pg. 330)
– Towers of Hanoi solution (pg. 336)
– Parsing expressions (pg. 338)
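
As a sketch of the tail-recursion point (not from the slides; the names SumTo and SumToLoop are illustrative), here is a tail-recursive sum of 1..n next to the equivalent loop:

// Tail-recursive: the recursive call is the very last action performed.
int SumTo(int n, int acc) {
    if (n == 0) return acc;          // anchor
    return SumTo(n - 1, acc + n);    // tail call: nothing left to do afterward
}

// The same computation as a loop: no growth of the run-time stack.
int SumToLoop(int n) {
    int acc = 0;
    while (n > 0) {
        acc += n;
        --n;
    }
    return acc;
}
// SumTo(5, 0) and SumToLoop(5) both return 15.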

Slide 8: Comments on Recursion
Many iterative tasks can be written recursively, but often end up inefficient.
However, there are many problems with good recursive solutions whose iterative solutions are
– not obvious
– difficult to develop

Slide 9: Algorithm Efficiency
How do we measure efficiency?
– Space utilization: the amount of memory required
– Time required to accomplish the task
Time efficiency depends on:
– size of the input
– speed of the machine
– quality of the source code
– quality of the compiler
These vary from one platform to another.

Slide 10: Algorithm Efficiency
We can count the number of times instructions are executed; this gives us a measure of the efficiency of an algorithm.
So we measure computing time as:
T(n) = computing time of an algorithm for input of size n
     = number of times the instructions are executed

Slide 11: Example: Calculating the Mean

Task                                 # times executed
1. Initialize the sum to 0           1
2. Initialize index i to 0           1
3. While i < n, do the following:    n + 1
4.   a) Add x[i] to sum              n
5.   b) Increment i by 1             n
6. Return mean = sum / n             1
Total:                               3n + 4

Slide 12: Computing Time Order of Magnitude
As the number of inputs increases, T(n) = 3n + 4 grows at a rate proportional to n. Thus T(n) has the "order of magnitude" n.
The computing time of an algorithm on input of size n, T(n), is said to have order of magnitude f(n), written T(n) is O(f(n)), if there is some constant C such that T(n) ≤ C * f(n) for all sufficiently large values of n.

Slide 13: Big Oh Notation
Another way of saying this: the complexity of the algorithm is O(f(n)).
Example: for the mean-calculation algorithm, T(n) is O(n).
Note that constants and multiplicative factors are ignored.

Slide 14: Big Oh Notation
f(n) is usually simple: n, n^2, n^3, …, 2^n, 1, log2 n, n log2 n, log2 log2 n

Slide 15: Big-O Notation
Cost function
– A numeric function that gives the performance of an algorithm in terms of one or more variables
– Typically the variable(s) capture the number of data items
Actual cost functions are hard to develop, so generally we use approximating functions.

Slide 16: Function Dominance
Asymptotic dominance
– g dominates f if there is a positive constant c such that c * g(n) ≥ f(n) for all sufficiently large values of n
Example: suppose the actual cost function is a particular polynomial (the slide's formula did not survive this transcript). Both of the candidate estimating functions shown on the slide will dominate T(n) for sufficiently large values of n.

Slide 17: Estimating Functions
Characteristics of a good estimating function:
– It asymptotically dominates the actual time function
– It is simple to express and understand
– It is as close an estimate as possible
Here n^2 qualifies, because any constant c > 1 will make c * n^2 larger than the actual cost for sufficiently large n.

Slide 18: Estimating Functions
Note how c * n^2 dominates. Thus we use n^2 as an estimate of the time required.
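
A purely illustrative check (the slide's actual cost function is not in the transcript, so assume a hypothetical T(n) = n^2 + 3n + 5):
With c = 2: c * n^2 = 2n^2 ≥ n^2 + 3n + 5 whenever n^2 ≥ 3n + 5, i.e. for all n ≥ 5.
So 2n^2 asymptotically dominates this T(n), and the simple function n^2 serves as the estimate.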

Slide 19: Order of a Function
To express time estimates concisely we use the concept of the "order of a function."
Definition: given two nonnegative functions f and g, the order of f is g, iff g asymptotically dominates f.
Stated:
– "f is of order g"
– "f = O(g)" (big-O notation; the O stands for "order")

Slide 20: Order of a Function
Note the possible confusion:
– The notation does NOT say "the order of g is f," nor does it say "f equals the order of g."
– It does say "f is of order g."

Slide 21: Big-O Arithmetic
Given functions f and g and a constant k, the usual rules apply (the slide's own list did not survive this transcript; these are the standard ones):
– O(k * f) = O(f) for any constant k > 0
– O(f) + O(g) = O(f + g) = O(max(f, g))
– O(f) * O(g) = O(f * g)
– If g asymptotically dominates f, then O(f + g) = O(g)

Slide 22: Example: Calculating the Mean

Task                                 # times executed
1. Initialize the sum to 0           1
2. Initialize index i to 0           1
3. While i < n, do the following:    n + 1
4.   a) Add x[i] to sum              n
5.   b) Increment i by 1             n
6. Return mean = sum / n             1
Total:                               3n + 4

Based on Big-O arithmetic this algorithm is O(n): T(n) = 3n + 4 = O(n) + O(1) = O(n).

Slide 23: Worst-Case Analysis
The arrangement of the input items may affect the computing time. How then do we measure performance?
– Best case: not very informative
– Average case: too difficult to calculate
– Worst case: the usual measure
Consider linear search of the list a[0], …, a[n-1].

Slide 24: Worst-Case Analysis
Linear search of a[0] … a[n-1]. Algorithm:
1. found = false
2. loc = 0
3. While (loc < n && !found)
4.   If item == a[loc], then found = true    // item found
5.   Else increment loc by 1                 // keep searching
Worst case (item not in the list): T_L(n) is O(n)
Average case (assuming an equal distribution of values): also O(n)
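
A C++ version of this pseudocode (not from the slides; the names are illustrative):

// Linear search: returns the index of item in a[0..n-1], or -1 if absent.
// Worst case (item not present): all n elements are examined, so O(n).
int LinearSearch(const int a[], int n, int item) {
    int loc = 0;
    bool found = false;
    while (loc < n && !found) {
        if (item == a[loc])
            found = true;    // item found
        else
            loc++;           // keep searching
    }
    return found ? loc : -1;
}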

Slide 25: Binary Search
Binary search of a[0] … a[n-1] (the list must already be sorted):
1. found = false
2. first = 0
3. last = n - 1
4. While (first <= last && !found)
5.   Calculate loc = (first + last) / 2
6.   If item < a[loc] then
7.     last = loc - 1       // search first half
8.   Else if item > a[loc] then
9.     first = loc + 1      // search last half
10.  Else found = true      // item found
Each pass cuts the list in half.
Worst case (item not in the list): T_B(n) = O(log2 n)
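
A C++ sketch of the same algorithm (not from the slides; the names are illustrative, and the array must already be sorted):

// Binary search of the sorted array a[0..n-1].
// Returns the index of item, or -1 if it is not present.
// Each pass halves the remaining range, so the worst case is O(log2 n).
int BinarySearch(const int a[], int n, int item) {
    int first = 0;
    int last = n - 1;
    while (first <= last) {
        int loc = first + (last - first) / 2;   // midpoint; avoids overflow of first + last
        if (item < a[loc])
            last = loc - 1;      // search the first half
        else if (item > a[loc])
            first = loc + 1;     // search the last half
        else
            return loc;          // item found
    }
    return -1;                   // item not in the array
}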

Slide 26: Common Computing Time Functions

log2 log2 n | log2 n | n       | n log2 n | n^2     | n^3      | 2^n
---         | 0      | 1       | 0        | 1       | 1        | 2
0.00        | 1      | 2       | 2        | 4       | 8        | 4
1.00        | 2      | 4       | 8        | 16      | 64       | 16
1.58        | 3      | 8       | 24       | 64      | 512      | 256
2.00        | 4      | 16      | 64       | 256     | 4096     | 65536
2.32        | 5      | 32      | 160      | 1024    | 32768    | 4294967296
2.58        | 6      | 64      | 384      | 4096    | 262144   | 1.84467E+19
3.00        | 8      | 256     | 2048     | 65536   | 16777216 | 1.15792E+77
3.32        | 10     | 1024    | 10240    | 1048576 | 1.07E+09 | 1.8E+308
4.32        | 20     | 1048576 | 20971520 | 1.1E+12 | 1.15E+18 | 6.7E+315652

For our binary search, the relevant column is log2 n.

Slide 27: Computing in Real Time
Suppose each instruction can be done in 1 microsecond. For n = 256 inputs, how long do the various f(n) take?

Function     | Time
log2 log2 n  | 3 microseconds
log2 n       | 8 microseconds
n            | 0.25 milliseconds
n log2 n     | 2 milliseconds
n^2          | 65 milliseconds
n^3          | 17 seconds
2^n          | 3.7E+64 centuries!!

Slide 28: Conclusion
Algorithms with exponential complexity are practical only for situations where the number of inputs is small.
Bubble sort is O(n^2):
– OK for n < 100
– Totally impractical for large n
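
For reference, a minimal bubble sort sketch (not part of the slides); the nested loops make the O(n^2) comparison count visible:

// Bubble sort: repeatedly swaps adjacent out-of-order elements.
// The nested loops perform roughly n * (n - 1) / 2 comparisons, hence O(n^2).
void BubbleSort(int a[], int n) {
    for (int pass = 0; pass < n - 1; ++pass) {
        for (int i = 0; i < n - 1 - pass; ++i) {
            if (a[i] > a[i + 1]) {
                int temp = a[i];         // swap a[i] and a[i+1]
                a[i] = a[i + 1];
                a[i + 1] = temp;
            }
        }
    }
}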

Slide 29: Computing Times of Recursive Functions

// Towers of Hanoi
void Move(int n, char source, char destination, char spare) {
    if (n <= 1)                           // anchor (base) case
        cout << "Move the top disk from " << source
             << " to " << destination << endl;
    else {                                // inductive case
        Move(n - 1, source, spare, destination);
        Move(1, source, destination, spare);
        Move(n - 1, spare, destination, source);
    }
}

T(n) = O(2^n)
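
To see where the O(2^n) comes from (a standard derivation, not spelled out on the slide), let T(n) count the calls that print a move:
T(1) = 1
T(n) = T(n-1) + 1 + T(n-1) = 2 * T(n-1) + 1   for n > 1
Expanding gives T(n) = 2^(n-1) + 2^(n-2) + … + 2 + 1 = 2^n - 1, which is O(2^n).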

