UMass Lowell Computer Science 91.404 Analysis of Algorithms Prof. Karen Daniels Fall, 2003 Final Review Wed. 12/10 – Fri. 12/12.

Overview of Next 2 Lectures
- Review of some key course material
- Review material:
  - 43-page handout on the web from the midterm time frame
  - problems & solutions from the review part of the midterm exam, fall 2001 (see web site)
  - problems & solutions from the review parts of the midterm exams, fall 2002, spring 2003, fall 2003
- Final Exam:
  - Course grade
  - Logistics, coverage, format
- Course evaluations (on-line)

Review of Key Course Material

What's It All About?
- Algorithm: steps for the computer to follow to solve a problem
- Problem-solving goals:
  - recognize the structure of some common problems
  - understand important characteristics of algorithms that solve common problems
  - select appropriate algorithms & data structures to solve a problem
  - tailor existing algorithms
  - create new algorithms

Some Algorithm Application Areas
- Computer Graphics
- Geographic Information Systems
- Robotics
- Bioinformatics
- Astrophysics
- Medical Imaging
- Telecommunications
(Design, Apply, Analyze)

Tools of the Trade
- Algorithm design patterns such as: binary search, divide-and-conquer, randomized
- Data structures such as: trees, linked lists, stacks, queues, hash tables, graphs, heaps, arrays
- Math: growth of functions, summations, recurrences, sets, probability, proofs

Discrete Math Review Growth of Functions, Summations, Recurrences, Sets, Counting, Probability

Topics
- Discrete Math Review:
  - Sets, basic tree & graph concepts
  - Counting: permutations/combinations
  - Probability: basics, including expectation of a random variable
  - Proof techniques: induction
- Basic Algorithm Analysis Techniques:
  - Asymptotic growth of functions
  - Types of input: best/average/worst
  - Bounds on an algorithm vs. bounds on a problem
- Algorithmic Paradigms/Design Patterns:
  - Divide-and-conquer, randomized
  - Analyze pseudocode running time to form summations and/or recurrences

What are we measuring?
Some analysis criteria:
- Scope: the problem itself, or a particular algorithm that solves the problem?
- "Dimension": time complexity or space complexity?
- Type of bound: upper, lower, or both?
- Type of input: best-case, average-case, or worst-case?
- Type of implementation: choice of data structure

Function Order of Growth
- O( ) upper bound, Ω( ) lower bound, Θ( ) upper & lower bound: shorthand for inequalities
- Know how to use asymptotic complexity notation to describe time or space complexity
- Know how to order functions asymptotically (behavior as n becomes large), e.g.:
  1, lg lg(n), lg(n), n, n lg(n), n lg²(n), n², n⁵, 2ⁿ

Types of Algorithmic Input
- Best-Case Input: of all possible algorithm inputs of size n, it generates the "best" result
  - for time complexity, "best" is the smallest running time; for space complexity, "best" is the smallest storage
  - a best-case input produces the best-case running time, which provides a lower bound on the algorithm's asymptotic running time (subject to any implementation assumptions)
- Average-Case Input and Worst-Case Input are defined similarly
- Best-Case Time <= Average-Case Time <= Worst-Case Time

Bounding Algorithmic Time (using cases)
- Very loose bounds, such as T(n) = Ω(1) and T(n) = O(2ⁿ), are not very useful!
- A worst-case time of T(n) = O(2ⁿ) tells us that worst-case inputs cause the algorithm to take at most exponential time (i.e., exponential time is sufficient). But can the algorithm ever really take exponential time (i.e., is exponential time necessary)?
- If, for arbitrary n, we find a worst-case input that forces the algorithm to use exponential time, then this tightens the lower bound on the worst-case running time. If we can force the lower and upper bounds on the worst-case time to match, then we can say that, for the worst-case running time, T(n) = Θ(2ⁿ) (i.e., we've found the minimum upper bound, so the bound is tight).
- Using "cases" we can discuss lower and/or upper bounds on the best-case, average-case, or worst-case running time.

Bounding Algorithmic Time (tightening bounds)
For example, denote the best-case time by T_B(n) and the worst-case time by T_W(n):
- 1st attempt: T_B(n) = Ω(1) and T_W(n) = O(2ⁿ)
- 2nd attempt (tightened): T_B(n) = Ω(n) and T_W(n) = O(n²)

Approach
- Explore the problem to gain intuition:
  - Describe it: what are the assumptions? (model of computation, etc.)
  - Has it already been solved? Have similar problems been solved? (more on this later)
  - What does best-case input look like? What does worst-case input look like?
- Establish a worst-case upper bound on the problem using an algorithm:
  - Design a (simple) algorithm and find an upper bound on its worst-case asymptotic running time; this tells us the problem can be solved in a certain amount of time. Algorithms taking more than this amount of time may exist, but won't help us.
- Establish a worst-case lower bound on the problem.
- Tighten each bound to form a worst-case "sandwich", moving along the scale of increasing worst-case asymptotic running time as a function of n (1, n, n², n³, n⁴, n⁵, ..., 2ⁿ).

Know the Difference! (worst-case bounds on a problem)
- A worst-case upper bound on the problem is a weak bound: it comes from considering just one algorithm. An inefficient algorithm for the problem might exist that takes this much time, but it would not help us. Other, less efficient algorithms that solve this problem might exist, but we don't care about them!
- A worst-case lower bound on the problem is a strong bound: it holds for every algorithm that solves the problem and abides by our problem's assumptions. If the lower bound is linear, no algorithm for the problem exists that can solve it for worst-case inputs in less than linear time.
- Both the upper and lower bounds are probably loose (i.e., they can probably be tightened later on).

Master Theorem
Let T(n) = a T(n/b) + f(n) with a >= 1 and b > 1. Then:
- Case 1: If f(n) = O(n^(log_b a - ε)) for some ε > 0, then T(n) = Θ(n^(log_b a)).
- Case 2: If f(n) = Θ(n^(log_b a)), then T(n) = Θ(n^(log_b a) lg n).
- Case 3: If f(n) = Ω(n^(log_b a + ε)) for some ε > 0, and if a·f(n/b) <= c·f(n) for some constant c < 1 and all sufficiently large n, then T(n) = Θ(f(n)).
Use the ratio f(n) / n^(log_b a) to distinguish between the cases: look for "polynomially larger" dominance.
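For intuition, here are two standard recurrences worked through the theorem (illustrative examples, not taken from the slides):

```latex
% MergeSort: a = 2, b = 2, so n^{\log_b a} = n and f(n) = \Theta(n) falls in Case 2.
T(n) = 2\,T(n/2) + \Theta(n) \;\Rightarrow\; T(n) = \Theta(n \lg n)

% a = 9, b = 3, so n^{\log_3 9} = n^2 and f(n) = n = O(n^{2-\varepsilon}) falls in Case 1.
T(n) = 9\,T(n/3) + n \;\Rightarrow\; T(n) = \Theta(n^2)
```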

CS Theory Math Review Sheet: The Most Relevant Parts
- p. 1: O, Ω, Θ definitions; series; combinations
- p. 2: recurrences & Master Method
- p. 3: probability; factorial; logs; Stirling's approximation
- p. 4: matrices
- p. 5: graph theory
- p. 6: calculus (product & quotient rules, integration, differentiation, logs)
- p. 8: finite calculus
- p. 9: series
The math fact sheet (courtesy of Prof. Costello) is on our web site.

Sorting Chapters 6-9 Heapsort, Quicksort, LinearTime-Sorting

Topics
- Sorting: Chapters 6-8
  - Sorting algorithms: [Insertion & MergeSort], HeapSort, QuickSort, linear-time sorting
  - Comparison-based sorting and its lower bound
  - Breaking the lower bound using special assumptions
  - Tradeoffs: selecting an appropriate sort for a given situation
    - Time vs. space requirements
    - Comparison-based vs. non-comparison-based

Heaps & HeapSort
- Structure:
  - Nearly complete binary tree
  - Convenient array representation
- HEAP property (for a MAX-HEAP): a parent's label is not less than that of each child
- Operations (strategy; worst-case run-time, with h = height):
  - HEAPIFY: swap down; O(h)
  - INSERT: swap up; O(h)
  - EXTRACT-MAX: swap, then HEAPIFY; O(h)
  - MAX: view root; O(1)
  - BUILD-HEAP: HEAPIFY bottom-up; O(n)
  - HEAP-SORT: BUILD-HEAP, then repeated HEAPIFY; Θ(n lg n)
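Below is a minimal max-heap / HeapSort sketch in Python. It is not the course pseudocode: it assumes 0-based array indexing and the function names are illustrative.

```python
def max_heapify(a, i, heap_size):
    """Sift a[i] down until the max-heap property holds below i: O(h) = O(lg n)."""
    left, right = 2 * i + 1, 2 * i + 2
    largest = i
    if left < heap_size and a[left] > a[largest]:
        largest = left
    if right < heap_size and a[right] > a[largest]:
        largest = right
    if largest != i:
        a[i], a[largest] = a[largest], a[i]
        max_heapify(a, largest, heap_size)

def build_max_heap(a):
    """Heapify from the last internal node up to the root: O(n) total."""
    for i in range(len(a) // 2 - 1, -1, -1):
        max_heapify(a, i, len(a))

def heapsort(a):
    """BUILD-HEAP, then repeatedly move the max to its final slot: Theta(n lg n)."""
    build_max_heap(a)
    for end in range(len(a) - 1, 0, -1):
        a[0], a[end] = a[end], a[0]   # extract max into position 'end'
        max_heapify(a, 0, end)        # restore the heap on the remaining prefix

data = [9, 4, 3, 7, 1]
heapsort(data)
print(data)  # [1, 3, 4, 7, 9]
```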

QuickSort
- Divide-and-conquer strategy:
  - Divide: PARTITION the array
  - Conquer: recursively sort the left partition and the right partition
  - Combine: no work needed
- Does most of its work on the way down (unlike MergeSort, which does most of its work on the way back up, in MERGE)
- Asymptotic running time:
  - Worst case: Θ(n²) (partitions of size 1 and n-1)
  - Best case: Θ(n lg n) (balanced partitions of size n/2)
  - Average case: Θ(n lg n) (balanced partitions of size n/2)
- Randomized PARTITION selects the partition element randomly, imposing a uniform distribution
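A compact randomized QuickSort sketch in Python, using a Lomuto-style partition; the course pseudocode may differ in details such as pivot handling and index conventions.

```python
import random

def randomized_partition(a, lo, hi):
    """Partition a[lo..hi] around a randomly chosen pivot; return its final index."""
    pivot_index = random.randint(lo, hi)
    a[pivot_index], a[hi] = a[hi], a[pivot_index]
    pivot, i = a[hi], lo
    for j in range(lo, hi):
        if a[j] <= pivot:
            a[i], a[j] = a[j], a[i]
            i += 1
    a[i], a[hi] = a[hi], a[i]
    return i

def quicksort(a, lo=0, hi=None):
    """Expected Theta(n lg n); worst case Theta(n^2) on consistently bad pivots."""
    if hi is None:
        hi = len(a) - 1
    if lo < hi:
        p = randomized_partition(a, lo, hi)
        quicksort(a, lo, p - 1)   # recursively sort left partition
        quicksort(a, p + 1, hi)   # recursively sort right partition

data = [5, 2, 9, 1, 5, 6]
quicksort(data)
print(data)  # [1, 2, 5, 5, 6, 9]
```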

Comparison-Based Sorting
- In the algebraic decision tree model, comparison-based sorting of n items requires Ω(n lg n) worst-case time; HeapSort is a comparison-based sort that achieves this bound.
- To break the lower bound and obtain linear time, forego direct value comparisons and/or make stronger assumptions about the input.

  Algorithm       Best Case     Average Case   Worst Case
  InsertionSort   Θ(n)          Θ(n²)          Θ(n²)
  MergeSort       Θ(n lg n)     Θ(n lg n)      Θ(n lg n)
  QuickSort       Θ(n lg n)*    Θ(n lg n)      Θ(n²)

  (*when all elements are distinct)
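As an example of breaking the lower bound under stronger assumptions, here is a counting sort sketch in Python. It assumes every key is an integer in the range 0..k; the function name and example values are illustrative.

```python
def counting_sort(a, k):
    """Stable sort for integer keys in 0..k: Theta(n + k) time and extra space."""
    count = [0] * (k + 1)
    for key in a:                  # histogram of key frequencies
        count[key] += 1
    for i in range(1, k + 1):      # prefix sums: count[i] = number of keys <= i
        count[i] += count[i - 1]
    output = [0] * len(a)
    for key in reversed(a):        # place keys from the right to keep stability
        count[key] -= 1
        output[count[key]] = key
    return output

print(counting_sort([2, 5, 3, 0, 2, 3, 0, 3], k=5))  # [0, 0, 2, 2, 3, 3, 3, 5]
```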

Data Structures Chapters: Stacks, Queues, LinkedLists, Trees, HashTables, Binary Search Trees, Balanced Trees

Topics
- Data Structures chapters:
  - Abstract data types: their properties/invariants
  - Stacks, Queues, LinkedLists, (Heaps, from Chapter 6), Trees, HashTables, Binary Search Trees, Balanced (Red-Black) Trees
  - Implementation/representation choices -> data structure
- Dynamic-set operations:
  - Query (does not change the data structure): Search, Minimum, Maximum, Predecessor, Successor
  - Manipulate (can change the data structure): Insert, Delete
- Running time & space requirements of the dynamic-set operations for each data structure
- Tradeoffs: selecting an appropriate data structure for a situation
  - Time vs. space requirements
  - Representation choices
  - Which operations are crucial?

Hash Table
- Structure:
  - n << N (the number of keys stored in the table is much smaller than the size of the key universe)
  - Table with m slots, m typically prime
- Hash function:
  - Not necessarily a 1-1 mapping
  - Uses mod m to keep the index within the table
- Collision resolution:
  - Chaining: a linked list for each table slot
  - Open addressing: all elements stored in the table itself
    - Linear probing: h(k, i) = (h'(k) + i) mod m
    - Quadratic probing: h(k, i) = (h'(k) + c1·i + c2·i²) mod m
- Load factor: α = n/m
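A small chaining-based hash table sketch in Python; the class and method names are illustrative, not from the course materials.

```python
class ChainedHashTable:
    """Hash table with collision resolution by chaining (a list per slot)."""

    def __init__(self, m=11):              # m is typically prime
        self.m = m
        self.slots = [[] for _ in range(m)]

    def _index(self, key):
        return hash(key) % self.m          # mod m keeps the index in 0..m-1

    def insert(self, key, value):
        chain = self.slots[self._index(key)]
        for i, (k, _) in enumerate(chain):
            if k == key:
                chain[i] = (key, value)    # overwrite an existing key
                return
        chain.append((key, value))

    def search(self, key):
        for k, v in self.slots[self._index(key)]:
            if k == key:
                return v
        return None                        # expected O(1 + alpha), alpha = n/m

table = ChainedHashTable()
table.insert("heap", "Chapter 6")
print(table.search("heap"))  # Chapter 6
```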

Linked Lists
- Types:
  - Singly vs. doubly linked
  - Pointer to head and/or tail
  - Non-circular vs. circular
- The type influences the running time of operations
(Slide figures: singly and doubly linked lists holding the elements 9, 4, 3, with head and tail pointers.)

Binary Tree Traversal
- "Visit" each node once; running time is Θ(n) for an n-node binary tree
- Preorder (ABDCEF): visit node, then left subtree, then right subtree
- Inorder (DBAEFC): visit left subtree, then node, then right subtree
- Postorder (DBFECA): visit left subtree, then right subtree, then node
(Example tree: root A with left child B and right child C; B has left child D; C has left child E, whose right child is F.)
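The three traversals as short Python functions; the Node class and function names are illustrative, and the example tree reproduces the one implied by the slide's traversal orders.

```python
class Node:
    def __init__(self, key, left=None, right=None):
        self.key, self.left, self.right = key, left, right

def preorder(t, out):
    if t:
        out.append(t.key)        # visit node
        preorder(t.left, out)    # then left subtree
        preorder(t.right, out)   # then right subtree

def inorder(t, out):
    if t:
        inorder(t.left, out)
        out.append(t.key)
        inorder(t.right, out)

def postorder(t, out):
    if t:
        postorder(t.left, out)
        postorder(t.right, out)
        out.append(t.key)

# Example tree: A(B(D, -), C(E(-, F), -))
root = Node("A", Node("B", Node("D")), Node("C", Node("E", None, Node("F"))))
for walk in (preorder, inorder, postorder):
    out = []
    walk(root, out)
    print(walk.__name__, "".join(out))   # ABDCEF, DBAEFC, DBFECA
```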

Binary Search Tree
- Structure: binary tree
- BINARY SEARCH TREE property, for each pair of nodes u, v:
  - If u is in the left subtree of v, then key[u] <= key[v]
  - If u is in the right subtree of v, then key[u] >= key[v]
- Operations (strategy; worst-case run-time, with h = tree height):
  - TRAVERSAL (INORDER, PREORDER, POSTORDER): visit every node; O(n)
  - SEARCH: follow one branch using the BST property; O(h)
  - INSERT: search, then attach; O(h)
  - DELETE: splice out (cases depend on the number of children); O(h)
  - MIN: go left; O(h)
  - MAX: go right; O(h)
  - SUCCESSOR: MIN of the right subtree if it exists, else go up; O(h)
  - PREDECESSOR: analogous to SUCCESSOR; O(h)
- Navigation rules; left/right rotations that preserve the BST property
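A minimal BST insert/search/min sketch in Python; the names are illustrative, and deletion and rotations are omitted.

```python
class BSTNode:
    def __init__(self, key):
        self.key, self.left, self.right = key, None, None

def bst_insert(root, key):
    """Search for the key's position, then attach a new node: O(h)."""
    if root is None:
        return BSTNode(key)
    if key <= root.key:
        root.left = bst_insert(root.left, key)
    else:
        root.right = bst_insert(root.right, key)
    return root

def bst_search(root, key):
    """Follow one branch using the BST property: O(h)."""
    while root is not None and root.key != key:
        root = root.left if key < root.key else root.right
    return root

def bst_min(root):
    """Keep going left: O(h)."""
    while root.left is not None:
        root = root.left
    return root

root = None
for k in [5, 2, 8, 1, 3]:
    root = bst_insert(root, k)
print(bst_search(root, 3) is not None, bst_min(root).key)  # True 1
```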

Red-Black Tree Properties
- Every node in a red-black tree is either black or red
- Every null leaf is black
- No path from a leaf to the root can have two consecutive red nodes, i.e., the children of a red node must be black
- Every path from a node x to a descendant leaf contains the same number of black nodes: the "black height" of node x
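These properties are what keep the tree balanced; a standard derivation of the resulting height bound (not spelled out on the slide) goes as follows.

```latex
% A subtree rooted at x contains at least 2^{bh(x)} - 1 internal nodes (by induction),
% and since red nodes cannot be consecutive, h(x) <= 2 * bh(x). Therefore:
n \;\ge\; 2^{bh(\mathrm{root})} - 1 \;\ge\; 2^{h/2} - 1
\quad\Longrightarrow\quad
h \;\le\; 2\lg(n+1) \;=\; O(\lg n)
```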

Graph Algorithms Chapters: DFS/BFS Traversals, Topological Sort, Minimum Spanning Trees, Shortest Paths

Topics
- Graph Algorithms chapters:
  - Undirected and directed graphs
  - Connected components of an undirected graph
  - Representations: adjacency matrix, adjacency list
  - Traversals: DFS and BFS
    - Differences in approach: DFS is LIFO/stack-based, BFS is FIFO/queue-based
    - Forest of spanning trees
    - Vertex coloring; edge classification: tree, back, forward, cross
    - Shortest paths (BFS)
    - Topological sort
  - Weighted graphs:
    - Minimum spanning trees: 2 different approaches
    - Shortest paths, single source: Dijkstra's algorithm
- Tradeoffs:
  - Representation choice: adjacency matrix vs. adjacency list
  - Traversal choice: DFS or BFS

Introductory Graph Concepts: Representations
- Undirected graph: can be represented by an adjacency matrix or an adjacency list
- Directed graph (digraph): likewise represented by an adjacency matrix or an adjacency list
(Slide figures: a six-vertex example graph on A-F, shown in both representations for the undirected and the directed case.)
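A sketch of both representations in Python for a small digraph; the example vertices and edges are illustrative, not the slide's exact graph.

```python
# Vertices are indexed 0..n-1; edges of a small illustrative digraph.
n = 4
edges = [(0, 1), (0, 2), (1, 2), (2, 3)]

# Adjacency matrix: Theta(V^2) space, O(1) edge lookup.
matrix = [[0] * n for _ in range(n)]
for u, v in edges:
    matrix[u][v] = 1

# Adjacency list: Theta(V + E) space, preferred for sparse graphs.
adj = [[] for _ in range(n)]
for u, v in edges:
    adj[u].append(v)

print(matrix[0][2], adj[2])  # 1 [3]
```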

Elementary Graph Algorithms: Searching (DFS, BFS)
For an unweighted directed or undirected graph G = (V, E). Time: O(|V| + |E|) with an adjacency list, O(|V|²) with an adjacency matrix. The predecessor subgraph forms a forest of spanning trees. Vertex color shows status: not yet encountered; encountered but not yet finished; finished. See the DFS/BFS handout for pseudocode.
- Breadth-First Search (BFS):
  - Vertices close to v are visited before those further away: FIFO structure, queue data structure
  - Shortest-path distance from the source to each reachable vertex can be recorded during the traversal
  - Foundation of many "shortest path" algorithms
- Depth-First Search (DFS):
  - Backtracks: visit the most recently discovered vertex; LIFO structure, stack data structure
  - Encountering and finishing times have a "well-formed" nested (( )( )) structure
  - DFS of an undirected graph produces only back edges and tree edges
  - A directed graph is acyclic if and only if DFS yields no back edges
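Minimal BFS and DFS sketches in Python over an adjacency-list dictionary; the graph and function names are illustrative.

```python
from collections import deque

def bfs(adj, source):
    """BFS with a FIFO queue; returns shortest edge-count distances from source."""
    dist = {source: 0}
    queue = deque([source])
    while queue:
        u = queue.popleft()
        for v in adj[u]:
            if v not in dist:            # v not yet encountered
                dist[v] = dist[u] + 1
                queue.append(v)
    return dist

def dfs(adj):
    """Iterative DFS with an explicit LIFO stack; returns vertices in finish order."""
    visited, finish_order = set(), []
    for start in adj:
        if start in visited:
            continue
        visited.add(start)
        stack = [(start, iter(adj[start]))]
        while stack:
            u, neighbors = stack[-1]
            for v in neighbors:
                if v not in visited:
                    visited.add(v)
                    stack.append((v, iter(adj[v])))
                    break
            else:                        # all neighbors explored: u is finished
                finish_order.append(u)
                stack.pop()
    return finish_order

adj = {"A": ["B", "C"], "B": ["D"], "C": [], "D": []}
print(bfs(adj, "A"))   # {'A': 0, 'B': 1, 'C': 1, 'D': 2}
print(dfs(adj))        # ['D', 'B', 'C', 'A']
```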

Elementary Graph Algorithms: DFS, BFS
- Review problem (TRUE or FALSE?): The tree shown on the right of the slide can be a DFS tree for some adjacency list representation of the graph shown on the left. (The slide's figure labels tree, cross, and back edges.)

Elementary Graph Algorithms: Topological Sort
For a directed acyclic graph (DAG) G = (V, E); produces a linear ordering of the vertices such that for each edge (u, v), u is ordered before v. (Source: textbook, Cormen et al.; see also the DFS/BFS slide show.)

TOPOLOGICAL-SORT(G)
- call DFS(G) to compute the "finishing time" of each vertex
- as each vertex is finished, insert it onto the front of a linked list
- return the linked list
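A DFS-based topological sort sketch in Python, matching the prepend-on-finish idea above; the example DAG is illustrative.

```python
def topological_sort(adj):
    """DFS-based topological sort: prepend each vertex as it finishes."""
    visited, order = set(), []

    def visit(u):
        visited.add(u)
        for v in adj[u]:
            if v not in visited:
                visit(v)
        order.insert(0, u)   # insert onto the front of the list when u finishes

    for u in adj:
        if u not in visited:
            visit(u)
    return order

# Illustrative DAG of dependencies.
dag = {"shirt": ["tie", "belt"], "tie": ["jacket"],
       "belt": ["jacket"], "jacket": []}
print(topological_sort(dag))  # ['shirt', 'belt', 'tie', 'jacket']
```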

Minimum Spanning Tree: Greedy Algorithms
For an undirected, connected, weighted graph G = (V, E); produces a minimum-weight tree of edges that includes every vertex. (Source: textbook, Cormen et al.; the slide's example graph has vertices A-G.) Two greedy approaches:
- Kruskal's approach: the invariant is a minimum-weight spanning forest, which becomes a single tree at the end. Time: O(|E| lg |E|), given fast FIND-SET and UNION operations.
- Prim's approach: the invariant is a minimum-weight tree, which spans all vertices at the end. Time: O(|E| lg |V|) = O(|E| lg |E|), slightly faster with a fast priority queue.
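A compact Kruskal-style sketch in Python with a simple union-find; the vertex numbering and example edge list are illustrative.

```python
def kruskal(n, edges):
    """Kruskal's MST: scan edges by weight, grow a spanning forest with union-find.
    edges is a list of (weight, u, v) with vertices 0..n-1; O(E lg E) overall."""
    parent = list(range(n))

    def find(x):                       # FIND-SET with path compression
        while parent[x] != x:
            parent[x] = parent[parent[x]]
            x = parent[x]
        return x

    mst, total = [], 0
    for w, u, v in sorted(edges):      # nondecreasing weight order
        ru, rv = find(u), find(v)
        if ru != rv:                   # different trees: edge is safe, UNION them
            parent[ru] = rv
            mst.append((u, v, w))
            total += w
    return mst, total

edges = [(4, 0, 1), (8, 0, 2), (11, 1, 2), (2, 2, 3), (7, 1, 3)]
print(kruskal(4, edges))  # ([(2, 3, 2), (0, 1, 4), (1, 3, 7)], 13)
```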

Minimum Spanning Trees
- Review problem: For the undirected, weighted graph on the slide (vertices A-G), show 2 different minimum spanning trees. Draw each using one of the 2 graph copies on the slide; thicken an edge to make it part of a spanning tree. What is the sum of the edge weights for each of your minimum spanning trees?

Single-Source Shortest Paths: Dijkstra's Algorithm
- For a (nonnegative) weighted, directed graph G = (V, E). (Source: textbook, Cormen et al.; the slide's example graph has vertices A-G.)
- See the separate ShortestPath slide show.
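A Dijkstra sketch in Python using a binary heap as the priority queue; the adjacency-list dictionary and example weights are illustrative.

```python
import heapq

def dijkstra(adj, source):
    """Single-source shortest paths for nonnegative edge weights.
    adj maps each vertex to a list of (neighbor, weight) pairs."""
    dist = {source: 0}
    pq = [(0, source)]                       # min-priority queue of (distance, vertex)
    while pq:
        d, u = heapq.heappop(pq)
        if d > dist.get(u, float("inf")):
            continue                         # stale queue entry, skip it
        for v, w in adj[u]:
            if d + w < dist.get(v, float("inf")):
                dist[v] = d + w              # relax edge (u, v)
                heapq.heappush(pq, (dist[v], v))
    return dist

adj = {"A": [("B", 2), ("C", 5)], "B": [("C", 1), ("D", 4)],
       "C": [("D", 1)], "D": []}
print(dijkstra(adj, "A"))  # {'A': 0, 'B': 2, 'C': 3, 'D': 4}
```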

Single-Source Shortest Paths: Dijkstra's Algorithm
- Review problem: For the directed, weighted graph on the slide (vertices A-G), find the shortest path that begins at vertex A and ends at vertex F. List the vertices in the order that they appear on that path. What is the sum of the edge weights of that path?
- Why can't Dijkstra's algorithm handle negative-weight edges?

FINAL EXAM Logistics, Coverage, Format

Course Grading
- Homework: 35%
- Midterm: 30%
- Final Exam (open book): 35%
Results are scaled if necessary. Consider checking your homework score status with us before the final.

Final Exam: Logistics
- Wednesday, 12/17, 11:30 a.m., Southwick 202
- Open book, open notes; closed computers, closed neighbors
- Cumulative
- Worth 35% of the course grade

Text/Chapter/Topic Coverage (no *-sections)
- Discrete Math Review & Basic Algorithm Analysis Techniques: Chapters 1-5
  - Summations, recurrences, sets, trees, graphs, counting, probability, growth of functions, divide-and-conquer, randomized algorithms
- Sorting: Chapters 6-8
  - HeapSort, QuickSort, linear-time sorting
- Data Structures chapters:
  - Stacks, Queues, LinkedLists, Trees, HashTables, Binary Search Trees, Balanced (Red-Black) Trees
- Graph Algorithms chapters:
  - Traversal, Minimum Spanning Trees, Shortest Paths

Format
- Mixture of questions of the following types (weighted roughly 65% / 35% across these groups):
  1) Multiple choice
  2) True/False
  3) Short answer
  4) Analyze pseudocode and/or a data structure
  5) Solve a problem by designing an algorithm:
     - select an appropriate paradigm/design pattern
     - select appropriate data structures
     - write pseudocode
     - justify correctness
     - analyze asymptotic complexity