CS 146: Data Structures and Algorithms July 28 Class Meeting Department of Computer Science San Jose State University Summer 2015 Instructor: Ron Mak www.cs.sjsu.edu/~mak.


Computer Science Dept. Summer 2015: July 28 CS 146: Data Structures and Algorithms © R. Mak

Dijkstra's Algorithm Revisited  Think of a graph as a kind of race track. Runners are waiting to be tagged at each vertex. Edge weights are running times.  At time 0, runners take off from vertex s.

Dijkstra's Algorithm Revisited, cont'd  At time 4, the runner reaches vertex y and tags the runners waiting there. Runners take off from vertex y.  One of the runners from vertex y reaches vertex t before the runner from vertex s does. The runner from vertex s loses and leaves the race.

Dijkstra's Algorithm Revisited, cont'd  The runner from vertex y reaches vertex z.  The runner from vertex t reaches vertex x first.  Now we have the shortest (fastest) path from vertex s to each of the other vertices.
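The race analogy is exactly Dijkstra's algorithm with a priority queue: the first runner to arrive fixes a vertex's distance, and later (slower) arrivals are discarded. A minimal sketch follows; the edge weights are invented to match the story (s reaches y at time 4, y's runners then win the races to t and z), since the slides' graph figure is not in the transcript.

```java
import java.util.Arrays;
import java.util.PriorityQueue;

public class Dijkstra {
    // dist[u] = earliest time a runner can reach vertex u from the source s.
    static int[] shortestPaths(int n, int[][][] adj, int s) {
        int[] dist = new int[n];
        Arrays.fill(dist, Integer.MAX_VALUE);
        dist[s] = 0;
        PriorityQueue<int[]> pq = new PriorityQueue<>((a, b) -> a[1] - b[1]);
        pq.add(new int[]{s, 0});
        while (!pq.isEmpty()) {
            int[] cur = pq.poll();
            int u = cur[0];
            if (cur[1] > dist[u]) continue;        // stale entry: this runner already lost
            for (int[] e : adj[u]) {               // e = {neighbor, edge weight}
                int alt = dist[u] + e[1];
                if (alt < dist[e[0]]) {            // a faster runner reaches e[0]
                    dist[e[0]] = alt;
                    pq.add(new int[]{e[0], alt});
                }
            }
        }
        return dist;
    }

    public static void main(String[] args) {
        // Vertices: 0 = s, 1 = t, 2 = x, 3 = y, 4 = z (illustrative weights).
        int[][][] adj = new int[5][][];
        adj[0] = new int[][]{{3, 4}, {1, 8}};          // s -> y (4), s -> t (8)
        adj[1] = new int[][]{{2, 4}};                  // t -> x (4)
        adj[2] = new int[][]{};
        adj[3] = new int[][]{{1, 3}, {2, 9}, {4, 8}};  // y -> t, y -> x, y -> z
        adj[4] = new int[][]{};
        System.out.println(Arrays.toString(shortestPaths(5, adj, 0)));
        // [0, 7, 11, 4, 12]
    }
}
```

The `continue` on a stale queue entry is the "loses and leaves the race" step: a runner whose recorded start time is already beaten simply drops out.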

Greedy Algorithms  Proceed in stages.  At each stage, choose a local optimum. Attempt to do what is best based on current information: "Take what you can get now."  Hope this process leads to the global optimum. It doesn't always work.

Example Greedy Algorithms  Dijkstra's algorithm for the shortest weighted path.  Prim's algorithm for the minimum spanning tree.  Kruskal's algorithm for the minimum spanning tree.

Job Scheduling Algorithms  An operating system's job scheduler picks jobs to run from those waiting in the job queue. Each job has an expected run time. Once a job starts to run, it runs to completion (non-preemptive scheduling).  Goal: Minimize the average turnaround time over all the jobs. Turnaround time = time spent in queue + time spent running.

FCFS: A Non-Greedy Scheduling Algorithm  First-Come First-Served (FCFS) algorithm: run the jobs in the order they arrive. Job run times: j1 = 15, j2 = 8, j3 = 3, j4 = 10. Turnaround times:
 j1: 15
 j2: 15 + 8 = 23
 j3: 23 + 3 = 26
 j4: 26 + 10 = 36
Average turnaround time: (15 + 23 + 26 + 36)/4 = 25
(Data Structures and Algorithms in Java, 3rd ed., by Mark Allen Weiss, Pearson Education, Inc., 2012)

SJF: A Greedy Scheduling Algorithm  Shortest Job First (SJF) algorithm: always pick the waiting job with the shortest expected run time. Turnaround times:
 j3: 3
 j2: 3 + 8 = 11
 j4: 11 + 10 = 21
 j1: 21 + 15 = 36
Average turnaround time: (3 + 11 + 21 + 36)/4 = 17.75
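The two schedules can be compared with a short simulation. A sketch, assuming the run times implied by the example's turnaround sums (j1 = 15, j2 = 8, j3 = 3, j4 = 10):

```java
import java.util.Arrays;

public class Scheduling {
    // Average turnaround time when jobs run non-preemptively in the given order.
    static double averageTurnaround(int[] runTimes) {
        int clock = 0, total = 0;
        for (int t : runTimes) {
            clock += t;      // this job finishes at time `clock`
            total += clock;  // its turnaround = queue wait + run time
        }
        return (double) total / runTimes.length;
    }

    public static void main(String[] args) {
        int[] fcfs = {15, 8, 3, 10};   // arrival order
        int[] sjf = fcfs.clone();
        Arrays.sort(sjf);              // greedy: shortest job first
        System.out.println(averageTurnaround(fcfs));  // 25.0
        System.out.println(averageTurnaround(sjf));   // 17.75
    }
}
```

Sorting by run time is the entire greedy choice here; for non-preemptive scheduling it provably minimizes the average turnaround time.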

Huffman's Greedy Algorithm  Goal: Reduce the number of bits needed to transmit a message. Invented by David A. Huffman in 1952.  Idea: Use fewer bits to encode message characters that appear more frequently.  Normally, each message character is encoded with a fixed number of bits. Examples: 7-bit ASCII code, 8-bit UTF-8 code.

Huffman's Greedy Algorithm, cont'd  Use a full binary tree to determine the character encodings. Each node is either a leaf or has two children.  Each character is a leaf node.  In the path from the root to a character: each left link is a 0 bit, and each right link is a 1 bit.  No character's code is a prefix of another character's code.

Huffman's Greedy Algorithm, cont'd  Always merge the two trees with the lowest weights. Use character frequency as the weight.


Huffman's Greedy Algorithm, cont'd  Always merge the two trees with the lowest weights. The finished tree decodes the example message: s a t i a t e ☐ i t \n
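The merging procedure is straightforward to sketch with a priority queue ordered by weight: each poll-poll-add step is one greedy merge of the two lightest trees. The character frequencies below are illustrative, not the ones from the slides' figure:

```java
import java.util.HashMap;
import java.util.Map;
import java.util.PriorityQueue;

public class Huffman {
    static class Node implements Comparable<Node> {
        int weight; char ch; Node left, right;
        Node(int w, char c) { weight = w; ch = c; }                       // leaf
        Node(Node l, Node r) { weight = l.weight + r.weight; left = l; right = r; }
        public int compareTo(Node o) { return Integer.compare(weight, o.weight); }
    }

    // Greedy step: repeatedly merge the two lowest-weight trees.
    static Map<Character, String> codes(Map<Character, Integer> freq) {
        PriorityQueue<Node> pq = new PriorityQueue<>();
        for (Map.Entry<Character, Integer> e : freq.entrySet())
            pq.add(new Node(e.getValue(), e.getKey()));
        while (pq.size() > 1)
            pq.add(new Node(pq.poll(), pq.poll()));
        Map<Character, String> result = new HashMap<>();
        assign(pq.poll(), "", result);
        return result;
    }

    static void assign(Node n, String prefix, Map<Character, String> out) {
        if (n.left == null) {                       // leaf: emit its code
            out.put(n.ch, prefix.isEmpty() ? "0" : prefix);
            return;
        }
        assign(n.left, prefix + "0", out);          // left link = 0 bit
        assign(n.right, prefix + "1", out);         // right link = 1 bit
    }

    public static void main(String[] args) {
        Map<Character, Integer> freq = new HashMap<>();
        freq.put('a', 5); freq.put('b', 2); freq.put('c', 1);
        System.out.println(codes(freq));  // the frequent 'a' gets the shortest code
    }
}
```

Because every character is a leaf, the resulting codes are automatically prefix-free, so a bit stream can be decoded by walking from the root until a leaf is hit.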

Divide and Conquer Algorithms  Divide a problem into at least two smaller subproblems. The subproblems can (but need not) be solved recursively.  Conquer by forming the solution to the original problem from the solutions to the smaller problems.

Example Divide and Conquer Algorithms  Solution to the Towers of Hanoi puzzle.  Mergesort.  Quicksort.

Multiplying Two Large Integers  We want to multiply two large N-digit integers X and Y.  Done the usual way, it's Θ(N^2).

Multiplying Two Large Integers, cont'd  Let X = 61,438,521 and Y = 94,736,407.  Divide and conquer: break each number into two halves. X_L = 6,143 and X_R = 8,521. Y_L = 9,473 and Y_R = 6,407.  Therefore: X = X_L·10^4 + X_R and Y = Y_L·10^4 + Y_R.

Multiplying Two Large Integers, cont'd  Multiplying out the halves gives XY = X_L·Y_L·10^8 + (X_L·Y_R + X_R·Y_L)·10^4 + X_R·Y_R, which still requires four half-size multiplications.  Replace the multiplier of 10^4 with: (X_L + X_R)(Y_L + Y_R) – X_L·Y_L – X_R·Y_R  Since we're already computing X_L·Y_L and X_R·Y_R, we've reduced the number of multiplications by one (from four to three).  With these algebraic manipulations, computing XY is O(N^(log2 3)) = O(N^1.59).
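One level of this manipulation can be checked directly on the example numbers. A sketch (hard-coding the 10^4 split for 8-digit operands; a full implementation would recurse on the three sub-products):

```java
public class Karatsuba {
    // One level of the divide-and-conquer multiplication:
    // three half-size products instead of four.
    static long multiply(long x, long y) {
        long xl = x / 10000, xr = x % 10000;   // split each 8-digit number in half
        long yl = y / 10000, yr = y % 10000;
        long p1 = xl * yl;
        long p2 = xr * yr;
        long p3 = (xl + xr) * (yl + yr);       // replaces xl*yr + xr*yl
        return p1 * 100000000L + (p3 - p1 - p2) * 10000L + p2;
    }

    public static void main(String[] args) {
        long x = 61438521L, y = 94736407L;     // the slide's X and Y
        System.out.println(multiply(x, y) == x * y);  // true
    }
}
```

The identity holds because p3 − p1 − p2 = X_L·Y_R + X_R·Y_L, exactly the middle coefficient of the naive expansion.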


Break

Dynamic Programming Algorithms  Break a problem into smaller subproblems. But we don't know exactly which subproblems to solve.  So we solve them all and store the results in a table. Use a table instead of recursion.  Use the stored results to solve larger problems. Richard Bellman coined the term dynamic programming in the 1950s, when "programming" meant planning: dynamic programming involved optimally planning multistage processes.

Example Dynamic Programming Algorithm  Compute the Fibonacci series.  The table consists of just the last two computed numbers.  It's extremely inefficient to use naive recursion to compute the series.
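A sketch of the two-entry "table" approach; the loop runs in O(n) time, versus the exponentially many repeated calls of the naive recursion:

```java
public class Fib {
    // Dynamic programming: the whole table is just the last two values.
    static long fib(int n) {
        if (n == 0) return 0;
        long prev = 0, curr = 1;       // F(0) and F(1)
        for (int i = 2; i <= n; i++) {
            long next = prev + curr;   // F(i) = F(i-1) + F(i-2)
            prev = curr;
            curr = next;
        }
        return curr;
    }

    public static void main(String[] args) {
        System.out.println(fib(10));   // 55
    }
}
```

Each Fibonacci number is computed exactly once and immediately reused, which is the essence of replacing recursion with a table.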

The Knapsack Problem  A thief burglarizing a safe finds that it contains N types of items of various weights and values.  The thief has a knapsack that can hold only W pounds. The thief can take multiple items of each type.

Item i   Weight w   Value v
1        6          $30
2        3          $14
3        4          $16
4        2          $9

The Knapsack Problem, cont'd  What is the optimum (most valuable) haul that the thief can carry away in the knapsack?  The solution will use dynamic programming; in other words, it will use a table.  What are the subproblems? Consider smaller knapsacks that have smaller capacities, and solve for each knapsack size.

The Knapsack Problem, cont'd  Let K(w) = the maximum value for a knapsack with capacity w.  Suppose the optimum solution to K(w) includes item i. Then removing this item from the knapsack leaves an optimal solution for a smaller knapsack, K(w – w_i). In other words, K(w) = K(w – w_i) + v_i for some i. We don't know which i, so let's try them all: K(w) = max over all items i with w_i ≤ w of { K(w – w_i) + v_i }.

The Knapsack Problem, cont'd

int[] K = new int[W+1];
K[0] = 0;
for (int ww = 1; ww <= W; ww++) {
    K[ww] = K[ww-1];
    int max = K[ww];
    int item = 0;
    for (int i = 1; i <= N; i++) {
        if (w[i] <= ww) {
            int value = K[ww - w[i]] + v[i];
            if (max < value) {
                max = value;
                item = i;
            }
        }
    }
    K[ww] = max;
}

Compute K[w] for w = 1, 2, ..., W, recording for each capacity w the item that achieved the maximum.

The Knapsack Problem, cont'd  Which items are in the knapsack? Work backwards through the table from K[10] = 48, subtracting each recorded item's value:

w    item   value
10   1      48 – 30 = 18
4    4      18 – 9 = 9
2    4      9 – 9 = 0

The optimal haul for W = 10 is one of item 1 and two of item 4 (weight 6 + 2 + 2 = 10, value $48).
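Putting the table computation and the recovery step together, a self-contained sketch with the four item types above (weights 6, 3, 4, 2; values $30, $14, $16, $9) and an assumed capacity of W = 10:

```java
public class Knapsack {
    // Unbounded knapsack: K[w] = best value for capacity w,
    // item[w] = an item achieving that value (0 = none), for recovering the haul.
    static int[] K, item;

    static int solve(int W, int[] w, int[] v) {
        int n = w.length - 1;                  // items are numbered 1..n
        K = new int[W + 1];
        item = new int[W + 1];
        for (int ww = 1; ww <= W; ww++) {
            K[ww] = K[ww - 1];                 // at least as good as a smaller sack
            for (int i = 1; i <= n; i++) {
                if (w[i] <= ww && K[ww - w[i]] + v[i] > K[ww]) {
                    K[ww] = K[ww - w[i]] + v[i];
                    item[ww] = i;
                }
            }
        }
        return K[W];
    }

    public static void main(String[] args) {
        int[] w = {0, 6, 3, 4, 2};             // index 0 unused
        int[] v = {0, 30, 14, 16, 9};
        System.out.println(solve(10, w, v));   // 48
        int ww = 10;
        while (ww > 0) {                       // recover the haul from item[]
            if (item[ww] == 0) { ww--; continue; }  // capacity left unused
            System.out.println("item " + item[ww]);
            ww -= w[item[ww]];
        }
    }
}
```

The recovery loop is the backtracking table above in code: remove the recorded item, shrink the capacity by its weight, and repeat until nothing remains.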

Ordering Matrix Multiplications  Suppose we want to multiply together four matrices A x B x C x D:

Matrix   Size
A        50 x 20
B        20 x 1
C        1 x 10
D        10 x 100

 Matrix multiplication is associative. In what order should we multiply them to minimize the cost? Multiplying an m x n matrix by an n x p matrix takes approximately mnp multiplications.


Ordering Matrix Multiplications, cont'd  Use a dynamic programming algorithm to find the optimal (least-cost) multiplication order.  Represent each parenthesization as a binary expression tree:

Ordering Matrix Multiplications, cont'd  For a tree to be optimal, its subtrees must also be optimal.  What are the subproblems? Consider the multiplications represented by the subtrees, for example: ((A x B) x C) x D, A x ((B x C) x D), and (A x (B x C)) x D.

Ordering Matrix Multiplications, cont'd  Let C(i, j) = the minimum cost of multiplying A_i x A_{i+1} x ... x A_j, where matrix A_i has size m_{i-1} x m_i.  The size of each subproblem is |j – i|. The smallest problem is i = j, so C(i, i) = 0. For j > i, consider the optimal subtree for C(i, j). The uppermost branch splits the product into two parts, A_i x ... x A_k and A_{k+1} x ... x A_j, for some k between i and j. The cost of the subtree is the cost of the two partial products, plus the cost of combining them: C(i, j) = min over i ≤ k < j of { C(i, k) + C(k+1, j) + m_{i-1}·m_k·m_j } Find the optimal value of k by trying them all.

Ordering Matrix Multiplications, cont'd

for i = 1 to n {
    C[i,i] = 0;
}

// s is the subproblem size
for s = 1 to n-1 {
    for i = 1 to n-s {
        j = i+s;
        C[i,j] = infinity;
        for k = i to j-1 {
            cost = C[i,k] + C[k+1,j] + m[i-1]*m[k]*m[j];
            if (cost < C[i,j]) {
                C[i,j] = cost;
                lastChange[i,j] = k;
            }
        }
    }
}

Record the costs in the two-dimensional array C[][] and the optimal split points k in the array lastChange[][].

Ordering Matrix Multiplications, cont'd  Recover the optimal parenthesization from the lastChange[][] array:

void order(i, j) {
    if (i == j) {
        System.out.print(name[i]);
    }
    else {
        k = lastChange[i,j];
        System.out.print("(");
        order(i, k);
        System.out.print("*");
        order(k+1, j);
        System.out.print(")");
    }
}
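The table computation and the recovery step can be combined into one runnable sketch. With the sizes from the example (A = 50x20, B = 20x1, C = 1x10, D = 10x100), the names A1..A4 stand in for A..D:

```java
public class MatrixChain {
    // C[i][j] = minimum multiplications for A_i x ... x A_j,
    // where A_i has size m[i-1] x m[i]; lastChange[i][j] = best split point k.
    static long[][] C;
    static int[][] lastChange;

    static long solve(int[] m) {
        int n = m.length - 1;                      // number of matrices
        C = new long[n + 1][n + 1];                // C[i][i] = 0 by default
        lastChange = new int[n + 1][n + 1];
        for (int s = 1; s <= n - 1; s++) {         // s is the subproblem size
            for (int i = 1; i <= n - s; i++) {
                int j = i + s;
                C[i][j] = Long.MAX_VALUE;
                for (int k = i; k < j; k++) {
                    long cost = C[i][k] + C[k + 1][j] + (long) m[i - 1] * m[k] * m[j];
                    if (cost < C[i][j]) {
                        C[i][j] = cost;
                        lastChange[i][j] = k;
                    }
                }
            }
        }
        return C[1][n];
    }

    // Recover the optimal parenthesization as a string.
    static String order(int i, int j) {
        if (i == j) return "A" + i;
        int k = lastChange[i][j];
        return "(" + order(i, k) + "*" + order(k + 1, j) + ")";
    }

    public static void main(String[] args) {
        int[] m = {50, 20, 1, 10, 100};   // A=50x20, B=20x1, C=1x10, D=10x100
        System.out.println(solve(m));     // 7000
        System.out.println(order(1, 4));  // ((A1*A2)*(A3*A4))
    }
}
```

For these sizes the optimal order is (A x B) x (C x D): the tiny 50x1 and 1x100 intermediate products keep the total at 7000 multiplications, far below the 103,000 of the worst ordering.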

Common Subproblems of Dynamic Programming