
1 UMass Lowell Computer Science 91.404 Analysis of Algorithms Prof. Karen Daniels Fall, 2000 Final Review Wed. 12/13

2 Overview of Today's Lecture
- Course Grade
- Final Exam:
  - Logistics, Coverage, Format
  - Handout for basis of 40% of test
- Review of some key course material
- Course Evaluations

3 Course Grading
- Homework 40%
- Exam 1 10% (closed book)
- Exam 2 15% (open book)
- Exam 3 15% (open book)
- Final Exam 20% (open book)
Results are scaled if necessary.

4 Final Exam: Logistics
- Saturday, 12/16
- Olsen 410, 8:00-10:00 a.m. (Moved to OS 311)
- Open book, open notes
- Closed computers, closed neighbors
- Cumulative
- Worth 20% of grade

5 Text/Chapter Coverage
- Discrete Math Review: Chapters 1-6
  - Growth of Functions, Summations, Recurrences, Sets, Counting, Probability
- Sorting: Chapters 7-10
  - Heapsort, Quicksort, Linear-Time Sorting, Medians
- Data Structures: Chapters 11-14
  - Stacks, Queues, Linked Lists, Trees, Hash Tables, Binary Search Trees, Balanced Trees
- Advanced Techniques: Chapters 16-17
  - Dynamic Programming, Greedy Algorithms
- Graph Algorithms: Chapters 23-25
  - Traversals, Minimum Spanning Trees, Shortest Paths

6 Format
This exam will have a mixture of questions of the following types:
1) Multiple Choice
2) Short Answer / Pseudo-code Analysis
3) Design an Algorithm (based on PatrioticTree handout)
   - Write pseudo-code
   - Justify correctness
   - Analyze asymptotic complexity

7 FINAL EXAM HANDOUT: Patriotic Trees
40% of the Final Exam is based on the representation defined here. (This material is already on our Web site.)

8 PatrioticTree
We define a PatrioticTree to be a binary tree of n nodes in which:
- Every internal node has both a left and a right child.
- Each node is labeled with a single color (R, W, or B).
- Each node has a weight. The weight is a pair of integer values of the form <Red Weight, Blue Weight>.
  - The first integer value in the pair is the Red Weight of the node = Red Weight of the node's left child + Red Weight of the node's right child.
  - The second integer value in the pair is the Blue Weight of the node = Blue Weight of the node's left child + Blue Weight of the node's right child.
- The color of a node is determined as follows:
  - R if its Red Weight exceeds its Blue Weight
  - B if its Blue Weight exceeds its Red Weight
  - W if its Blue Weight equals its Red Weight
- The weight of each W leaf is <0, 0>.
- The weight of each R leaf is of the form <r, 0> with r > 0.
- The weight of each B leaf is of the form <0, b> with b > 0.

9 PatrioticTree: Example
[figure: an example PatrioticTree with nodes colored R, W, and B]

10 PatrioticTree: Representation
For algorithmic purposes, we represent a PatrioticTree as follows. A TreeNode represents a node of a PatrioticTree. It contains the attributes:
- color: a character representing its color: "R", "B" or "W"
- r: an integer representing its Red Weight
- b: an integer representing its Blue Weight
- parent: a pointer to the parent of this node
- left: a pointer to the left subtree of this node
- right: a pointer to the right subtree of this node
You can access a node's attributes using the "." notation (e.g. for a TreeNode t, t.color gives its color and t.left accesses its left subtree).
[figure: a TreeNode box with fields color, r, b, parent, left, right]
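A minimal Python sketch of this representation (the class and helper names here are mine, not from the handout; the handout fixes only the attribute names color, r, b, parent, left, and right):

```python
class TreeNode:
    """A node of a PatrioticTree: color is 'R', 'W', or 'B'; r and b are
    the Red and Blue Weights; parent/left/right are TreeNode links."""
    def __init__(self, r=0, b=0):
        self.r = r
        self.b = b
        # Color follows directly from the weight pair, per the definition.
        self.color = "R" if r > b else ("B" if b > r else "W")
        self.parent = None
        self.left = None
        self.right = None

def internal(left, right):
    """Build an internal node over two subtrees: its Red (Blue) Weight is
    the sum of its children's Red (Blue) Weights, and its color is derived
    from that weight pair."""
    node = TreeNode(left.r + right.r, left.b + right.b)
    node.left, node.right = left, right
    left.parent = right.parent = node
    return node
```

For example, joining an R leaf <2, 0> and a B leaf <0, 5> yields an internal node with weight <2, 5>, which is colored B because its Blue Weight exceeds its Red Weight.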

11 PatrioticTree: Representation Example
[figure: the example tree annotated with each node's color and <Red Weight, Blue Weight> pair]

12 What's It All About?
- Algorithm: steps for the computer to follow to solve a problem
- Some of our goals:
  - recognize the structure of some common problems
  - understand important characteristics of algorithms for solving common problems
  - select an appropriate algorithm to solve a problem
  - tailor existing algorithms
  - create new algorithms

13 Some Algorithm Application Areas
Computer Graphics, Geographic Information Systems, Robotics, Bioinformatics, Astrophysics, Medical Imaging, Telecommunications
Design, Apply, Analyze

14 Tools of the Trade
- Algorithm Design Patterns such as:
  - binary search
  - divide-and-conquer
- Data Structures such as:
  - trees, linked lists, hash tables, graphs
- Theoretical Computer Science principles such as:
  - NP-completeness, hardness
- Math: Growth of Functions, Summations, Recurrences, Sets, Probability, Proofs
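The first design pattern named above, binary search, as a short sketch:

```python
def binary_search(a, key):
    """Return an index of key in sorted list a, or -1 if absent.
    Each step halves the remaining range, giving O(lg n) comparisons."""
    lo, hi = 0, len(a) - 1
    while lo <= hi:
        mid = (lo + hi) // 2
        if a[mid] == key:
            return mid
        elif a[mid] < key:
            lo = mid + 1    # key can only be in the upper half
        else:
            hi = mid - 1    # key can only be in the lower half
    return -1
```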

15 Discrete Math Review
Chapters 1-6: Growth of Functions, Summations, Recurrences, Sets, Counting, Probability

16 What are we measuring?
Some Analysis Criteria:
- Scope: The problem itself? A particular algorithm that solves the problem?
- "Dimension": Time Complexity? Space Complexity?
- Type of Bound: Upper? Lower? Both?
- Type of Input: Best-Case? Average-Case? Worst-Case?
- Type of Implementation: Choice of Data Structure

17 Function Order of Growth
- O( ): upper bound
- Ω( ): lower bound
- Θ( ): upper & lower bound
Know how to use asymptotic complexity notation to describe time or space complexity.
Know how to order functions asymptotically (behavior as n becomes large), e.g.:
1, lg lg(n), lg(n), n, n lg(n), n lg²(n), n², n⁵, 2ⁿ

18 Types of Algorithmic Input
Best-Case Input: of all possible algorithm inputs of size n, it generates the "best" result.
- for Time Complexity: "best" is smallest running time
- for Space Complexity: "best" is smallest storage
Best-Case Input produces Best-Case Running Time, which provides a lower bound on the algorithm's asymptotic running time (subject to any implementation assumptions).
Average-Case Input and Worst-Case Input are defined similarly.
Best-Case Time <= Average-Case Time <= Worst-Case Time
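Best- vs. worst-case input can be made concrete by counting the key comparisons insertion sort performs (the comparison counter is my instrumentation for illustration, not part of the course material): sorted input is its best case, reverse-sorted input its worst.

```python
def insertion_sort_comparisons(a):
    """Sort a copy of a with insertion sort; return the number of key
    comparisons, which depends heavily on the input order:
    n-1 for sorted input, n(n-1)/2 for reverse-sorted input."""
    a = list(a)
    comparisons = 0
    for i in range(1, len(a)):
        key = a[i]
        j = i - 1
        while j >= 0:
            comparisons += 1        # compare key against a[j]
            if a[j] > key:
                a[j + 1] = a[j]     # shift larger element right
                j -= 1
            else:
                break
        a[j + 1] = key
    return comparisons
```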

19 Bounding Algorithmic Time (using cases)
T(n) = Ω(1) and T(n) = O(2ⁿ): very loose bounds are not very useful!
A worst-case time of T(n) = O(2ⁿ) tells us that worst-case inputs cause the algorithm to take at most exponential time (i.e. exponential time is sufficient). But can the algorithm ever really take exponential time (i.e. is exponential time necessary)?
If, for arbitrary n, we find a worst-case input that forces the algorithm to use exponential time, then this tightens the lower bound on the worst-case running time. If we can force the lower and upper bounds on the worst-case time to match, then we can say that, for the worst-case running time, T(n) = Θ(2ⁿ) (i.e. we've found the minimum upper bound, so the bound is tight).
Using "cases" we can discuss lower and/or upper bounds on: best-case running time, average-case running time, or worst-case running time.

20 Bounding Algorithmic Time (tightening bounds)
For example, denote best-case time by T_B(n) and worst-case time by T_W(n):
- 1st attempt: T_B(n) = Ω(1), T_W(n) = O(2ⁿ)
- 2nd attempt (tighter algorithm bounds): T_B(n) = Ω(n), T_W(n) = O(n²)

21 Master Theorem
Master Theorem: Let T(n) = a T(n/b) + f(n) with a >= 1 and b > 1. Then:
- Case 1: If f(n) = O(n^(log_b a - ε)) for some ε > 0, then T(n) = Θ(n^(log_b a)).
- Case 2: If f(n) = Θ(n^(log_b a)), then T(n) = Θ(n^(log_b a) · lg n).
- Case 3: If f(n) = Ω(n^(log_b a + ε)) for some ε > 0, and if a·f(n/b) <= c·f(n) for some c < 1 and all n >= n₀, then T(n) = Θ(f(n)).
Use the ratio test to distinguish between cases: examine f(n) / n^(log_b a).
Look for "polynomially larger" dominance.
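For the common special case f(n) = Θ(n^k), the ratio test reduces to comparing k with the critical exponent log_b a, and Case 3's regularity condition holds automatically (a·f(n/b) = (a/bᵏ)·f(n) with a/bᵏ < 1). A small sketch of that shortcut (my own helper, for checking answers, and only valid for polynomial driving functions):

```python
import math

def master_theorem_poly(a, b, k):
    """Solve T(n) = a*T(n/b) + Θ(n^k) via the Master Theorem,
    returning the Θ bound as a string. Assumes a >= 1, b > 1."""
    critical = math.log(a, b)                 # log_b a, the critical exponent
    if math.isclose(k, critical):
        return f"Θ(n^{critical:g} lg n)"      # Case 2: f matches n^(log_b a)
    elif k < critical:
        return f"Θ(n^{critical:g})"           # Case 1: n^(log_b a) dominates
    else:
        return f"Θ(n^{k:g})"                  # Case 3: f(n) dominates
```

For example, merge sort's recurrence T(n) = 2T(n/2) + Θ(n) has a = 2, b = 2, k = 1, so k equals log₂ 2 = 1 and Case 2 gives Θ(n lg n).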

22 Sorting
Chapters 7-10: Heapsort, Quicksort, Linear-Time Sorting, Medians

23 Comparison-Based Sorting
In the algebraic decision tree model, comparison-based sorting of n items requires Ω(n lg n) time.
To break the lower bound and obtain linear time, forego direct value comparisons and/or make stronger assumptions about the input.

Algorithm      BestCase   AverageCase  WorstCase
InsertionSort  Θ(n)       Θ(n²)        Θ(n²)
MergeSort      Θ(n lg n)  Θ(n lg n)    Θ(n lg n)
HeapSort       Θ(n lg n)  Θ(n lg n)    Θ(n lg n)
QuickSort      Θ(n lg n)  Θ(n lg n)    Θ(n²)
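Counting sort illustrates how the lower bound is sidestepped: it never compares keys to each other, but it assumes keys are integers in a known small range 0..k (a minimal sketch; the course's linear-time sorts also include radix and bucket sort):

```python
def counting_sort(a, k):
    """Sort a list of integers in the range 0..k in Θ(n + k) time,
    avoiding key-to-key comparisons entirely."""
    count = [0] * (k + 1)
    for x in a:                    # histogram of key values
        count[x] += 1
    out = []
    for value, c in enumerate(count):
        out.extend([value] * c)    # emit each key as many times as it occurred
    return out
```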

24 Data Structures
Chapters 11-14: Stacks, Queues, Linked Lists, Trees, Hash Tables, Binary Search Trees, Balanced Trees

25

26 Advanced Techniques
Chapters 16-17: Dynamic Programming, Greedy Algorithms

27 Problem Characteristics
- Divide-and-Conquer: modular; independent pieces
- Dynamic Programming: modular; optimization; optimal substructure (an optimal solution contains optimal solutions to subproblems); overlapping subproblems
- Greedy Algorithms: modular; optimization; optimal substructure; greedy choice property (locally optimal choices lead to a global optimum)
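Optimal substructure and overlapping subproblems can be seen in a classic dynamic-programming example, rod cutting (my choice of illustration; the textbook chapters use other examples such as matrix-chain multiplication):

```python
def cut_rod(prices, n):
    """Maximum revenue from a rod of length n, where prices[i] is the
    price of a piece of length i (prices[0] unused).
    Bottom-up DP: each subproblem r[j] is solved once and reused
    (overlapping subproblems), and an optimal cut of length j contains
    an optimal cut of the remainder j-i (optimal substructure)."""
    r = [0] * (n + 1)
    for j in range(1, n + 1):
        # Try every first-piece length i; reuse the stored optimum r[j - i].
        r[j] = max(prices[i] + r[j - i] for i in range(1, j + 1))
    return r[n]
```

The table r[] is what makes this Θ(n²) instead of the exponential time a naive recursion would take on the same overlapping subproblems.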

28 Graph Algorithms
Chapters 23-25: DFS/BFS Traversals, Topological Sort, Minimum Spanning Trees, Shortest Paths

29 Traversals: DFS, BFS
- DFS backtracks: visit the most recently discovered vertex; LIFO structure; stack data structure
- BFS: vertices close to v are visited before those further away; FIFO structure; queue data structure
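The stack/queue duality in a sketch, with the graph as an adjacency-list dict and both traversals written iteratively so the underlying data structure is explicit:

```python
from collections import deque

def bfs_order(adj, s):
    """Visit vertices in breadth-first order from s using a FIFO queue."""
    order, seen, q = [], {s}, deque([s])
    while q:
        u = q.popleft()            # FIFO: oldest discovered vertex first
        order.append(u)
        for v in adj[u]:
            if v not in seen:
                seen.add(v)
                q.append(v)
    return order

def dfs_order(adj, s):
    """Visit vertices depth-first from s using an explicit LIFO stack."""
    order, seen, stack = [], set(), [s]
    while stack:
        u = stack.pop()            # LIFO: most recently discovered vertex first
        if u in seen:
            continue
        seen.add(u)
        order.append(u)
        for v in reversed(adj[u]): # reversed so neighbors pop in listed order
            if v not in seen:
                stack.append(v)
    return order
```

Swapping the queue for a stack is the only structural difference between the two traversals.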

