# Chapter 5: Decrease and Conquer



Homework 7 (due 3/17)
–page 127 question 5
–page 132 questions 5 and 6
–page 137 questions 5 and 6
–page 168 questions 1 and 4

Decrease and Conquer
Also referred to as the inductive or incremental approach:
1. Reduce the problem instance to a smaller instance of the same problem
2. Solve the smaller instance
3. Extend the solution of the smaller instance to obtain the solution to the original problem
Note: We are not dividing the problem into two smaller problems.

Examples of Decrease and Conquer
Decrease by one:
–Insertion sort
–Graph search algorithms: DFS, BFS, topological sorting
–Algorithms for generating permutations, subsets
Decrease by a constant factor:
–Binary search
–Fake-coin problems
–Multiplication à la russe
–Josephus problem
Variable-size decrease:
–Euclid’s algorithm
–Selection by partition

What’s the difference? Consider the problem of exponentiation: compute a^n.
–Brute force
–Divide and conquer
–Decrease by one
–Decrease by constant factor

a^n by brute force: a*a*a*a*a*...*a*a*a
Requires n − 1 multiplications
Programmed as a loop
Obviously O(n)

a^n by divide and conquer: a^(n/2) * a^(n/2)
Isn’t this clever?
a^8 = a^4 * a^4 = a^2*a^2 * a^2*a^2 = a*a*a*a * a*a*a*a
Why is this wasteful? If you compute a^4, why do you have to compute it again?
Sometimes divide and conquer doesn’t yield an advantage!

a^n by decrease by one: a^n = a^(n−1) * a
Q: Is this really any different from the brute-force method?
A: No, except that it can be programmed recursively.
We still haven’t done better than O(n).
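The decrease-by-one recursion above can be sketched in a few lines of Python (the function name `power_dec1` is my own):

```python
def power_dec1(a, n):
    """Decrease by one: a^n = a^(n-1) * a. Still O(n) multiplications."""
    if n == 0:
        return 1                      # base case: a^0 = 1
    return power_dec1(a, n - 1) * a   # extend the smaller solution by one multiply
```

The recursion depth is n, so this is brute force in recursive clothing.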

a^n by decrease by constant factor: a^n = (a^(n/2))^2
You’re probably thinking: Dr. B. is kidding, right?
Q: Isn’t this exactly the same as the divide-and-conquer approach?
A: No, check this out:
a^8 = (a^4)^2 = ((a^2)^2)^2 = ((a*a)^2)^2
We actually do only 3 multiplications:
a*a = v1
v1 * v1 = v2
v2 * v2 = a^8

a^n by decrease by constant factor: a^n = (a^(n/2))^2
a^16 = (((a*a)^2)^2)^2  →  4 multiplications
a^32 = ((((a*a)^2)^2)^2)^2  →  5 multiplications
…
a^1024  →  10 multiplications
Obviously this is an O(log n) algorithm.
What about a^47?

a^n by decrease by constant factor: what about a^47? When the exponent is odd, multiply in one extra factor of a:
a^47 = a*(a^23)^2  →  2 multiplications
a^23 = a*(a^11)^2  →  2 multiplications
a^11 = a*(a^5)^2  →  2 multiplications
a^5 = a*(a^2)^2  →  2 multiplications
a^2 = a*a  →  1 multiplication
It might actually take 2 log n multiplications in the worst case, which is still O(log n).
Isn’t Big-O nice?
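The even/odd recursion worked out above can be sketched in Python (the function name `power` is my own; this is exponentiation by squaring):

```python
def power(a, n):
    """Decrease by constant factor: a^n = (a^(n//2))^2, times a if n is odd.
    O(log n) multiplications."""
    if n == 0:
        return 1
    half = power(a, n // 2)   # solve one smaller instance: a^(n//2)
    result = half * half      # one squaring per level of recursion
    if n % 2 == 1:            # odd exponent: one extra multiplication by a
        result *= a
    return result
```

Tracing `power(a, 47)` reproduces the chain on the slide: 47 → 23 → 11 → 5 → 2 → 1 → 0.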

Graph Traversal
Many problems require processing all graph vertices in a systematic fashion.
Graph traversal algorithms:
–Depth-first search
–Breadth-first search

Depth-first search
Given a graph G = (V, E), explore the graph, always moving away from the last visited vertex.
G is a graph which consists of two sets:
–V is a set of vertices: V = {A, B, C, D, E, F}
–E is a set of edges: E = {(A,B), (A,C), (C,D), (D,E), (E,C), (B,F)}

Depth-first search

    DFS(G)
        count := 0
        mark each vertex with 0 (unvisited)
        for each vertex v in V do
            if v is marked with 0
                dfs(v)

    dfs(v)
        count := count + 1
        mark v with count
        for each vertex w adjacent to v do
            if w is marked with 0
                dfs(w)
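The pseudocode above translates almost line for line into Python (function names and the adjacency-list representation are my own choices):

```python
def dfs_all(adj):
    """DFS over every component of the graph.
    adj: dict mapping each vertex to a list of adjacent vertices.
    Returns a dict mapping each vertex to its visit count (1-based)."""
    mark = {v: 0 for v in adj}   # 0 means unvisited
    count = 0

    def dfs(v):
        nonlocal count
        count += 1
        mark[v] = count          # mark v with its visit number
        for w in adj[v]:
            if mark[w] == 0:
                dfs(w)           # recurse on unvisited neighbors

    for v in adj:                # outer loop handles disconnected graphs
        if mark[v] == 0:
            dfs(v)
    return mark
```

On the example graph from the previous slide, starting at A, the visit order is A, B, F, C, D, E.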

Types of edges
–Tree edges: edges comprising the DFS forest
–Back edges: edges to ancestor nodes
–Forward edges: edges to descendants (digraphs only)
–Cross edges: none of the above

Depth-first search: Notes
DFS can be implemented with graphs represented as:
–Adjacency matrices: Θ(V^2)
–Adjacency linked lists: Θ(V+E)
Yields two distinct orderings of vertices:
–preorder: as vertices are first encountered (pushed onto the stack)
–postorder: as vertices become dead ends (popped off the stack)

Depth-first search: Notes
Applications:
–checking connectivity, finding connected components
–checking acyclicity
–searching the state space of problems for a solution (AI)

Breadth-first search
Explore the graph, moving across to all the neighbors of the last visited vertex.
Similar to level-by-level tree traversal.
Instead of a stack, breadth-first search uses a queue.
Applications: same as DFS, but BFS can also find paths from a vertex to all other vertices with the smallest number of edges.

Breadth-first search algorithm

    BFS(G)
        count := 0
        mark each vertex with 0
        for each vertex v in V do
            if v is marked with 0
                bfs(v)

    bfs(v)
        count := count + 1
        mark v with count
        initialize queue with v
        while queue is not empty do
            a := front of queue
            for each vertex w adjacent to a do
                if w is marked with 0
                    count := count + 1
                    mark w with count
                    add w to the end of the queue
            remove a from the front of the queue
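A Python sketch of the same algorithm, using `collections.deque` as the queue (function names and the adjacency-list representation are my own choices):

```python
from collections import deque

def bfs_all(adj):
    """BFS over every component of the graph.
    adj: dict mapping each vertex to a list of adjacent vertices.
    Returns a dict mapping each vertex to its visit count (1-based)."""
    mark = {v: 0 for v in adj}   # 0 means unvisited
    count = 0

    def bfs(v):
        nonlocal count
        count += 1
        mark[v] = count
        queue = deque([v])       # queue instead of a stack
        while queue:
            a = queue[0]         # front of queue
            for w in adj[a]:
                if mark[w] == 0:
                    count += 1
                    mark[w] = count
                    queue.append(w)
            queue.popleft()      # remove a from the front
    
    for v in adj:                # outer loop handles disconnected graphs
        if mark[v] == 0:
            bfs(v)
    return mark
```

On the example graph, starting at A, BFS visits A, B, C, F, D, E: level by level rather than branch by branch.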

Breadth-first search: Notes
BFS has the same efficiency as DFS and can be implemented with graphs represented as:
–Adjacency matrices: Θ(V^2)
–Adjacency linked lists: Θ(V+E)
Yields a single ordering of vertices (the order added to and deleted from the queue is the same).

Directed acyclic graph (dag)
A directed graph with no cycles.
Dags arise in modeling many problems, e.g.:
–prerequisite structure
–food chains
They imply a partial ordering on the domain.

Topological sorting
Problem: find a total order consistent with a partial order.
Example: a food-chain dag on the vertices {fish, human, shrimp, sheep, wheat, plankton, tiger}.
Order them so that none has to wait for any of its food (i.e., from lower to higher, consistent with the food chain).
The problem is solvable iff the graph is a dag.

Topological sorting algorithms
1. DFS-based algorithm:
–DFS traversal, noting the order in which vertices are popped off the stack
–The reverse of that order solves the topological sorting problem
–Back edges encountered? → NOT a dag!
2. Source-removal algorithm:
–Repeatedly identify and remove a source vertex, i.e., a vertex that has no incoming edges
Both are Θ(V+E) using adjacency linked lists.
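The source-removal algorithm can be sketched in Python. Computing all in-degrees up front and tracking them as edges are "removed" is an implementation detail I've added to find sources efficiently; the food-chain edges in the usage below are a hypothetical reading of the example, not taken from the slide:

```python
from collections import deque

def topo_sort(adj):
    """Source-removal topological sort.
    adj: dict mapping each vertex to its outgoing neighbors.
    Returns vertices in topological order; raises ValueError if not a dag."""
    indegree = {v: 0 for v in adj}
    for v in adj:
        for w in adj[v]:
            indegree[w] += 1          # count incoming edges for each vertex
    sources = deque(v for v in adj if indegree[v] == 0)
    order = []
    while sources:
        v = sources.popleft()         # remove a source vertex
        order.append(v)
        for w in adj[v]:
            indegree[w] -= 1          # "delete" v's outgoing edges
            if indegree[w] == 0:
                sources.append(w)     # w has become a source
    if len(order) != len(adj):
        raise ValueError("graph has a cycle; not a dag")
    return order
```

If no source exists while vertices remain, the graph has a cycle, matching the "solvable iff dag" condition.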

Variable-size decrease: Binary search trees
Arrange keys in a binary tree with the binary search tree property: for every node with key k, all keys in its left subtree are less than k and all keys in its right subtree are greater than k.
What about repeated keys?
Example 1: 5, 10, 3, 1, 7, 12, 9
Example 2: 4, 5, 7, 2, 1, 3, 6

Searching and insertion in binary search trees
Searching: straightforward.
Insertion: search for the key; insert at the leaf where the search terminated.
All operations: worst-case number of key comparisons = h + 1, where h is the tree height.
–lg n ≤ h ≤ n − 1, with average height (random files) ≈ 1.41 lg n
–Thus all operations have:
    worst case: Θ(n)
    average case: Θ(lg n)
Bonus: inorder traversal produces a sorted list (treesort).
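A minimal Python sketch of BST search, insertion, and inorder traversal (class and function names are my own; ignoring repeated keys is one possible answer to the "repeated keys" question above, not the slide's):

```python
class Node:
    def __init__(self, key):
        self.key = key
        self.left = None
        self.right = None

def insert(root, key):
    """Search for key; insert a new node at the leaf where the search falls off."""
    if root is None:
        return Node(key)                        # search terminated: insert here
    if key < root.key:
        root.left = insert(root.left, key)
    elif key > root.key:
        root.right = insert(root.right, key)
    # key == root.key: repeated keys are ignored in this sketch
    return root

def search(root, key):
    """Walk down the tree; at most h + 1 key comparisons."""
    while root is not None and root.key != key:
        root = root.left if key < root.key else root.right
    return root is not None

def inorder(root):
    """Inorder traversal yields the keys in sorted order (treesort)."""
    if root is None:
        return []
    return inorder(root.left) + [root.key] + inorder(root.right)
```

Building a tree from Example 1 (5, 10, 3, 1, 7, 12, 9) and traversing it inorder returns the keys sorted.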


