Richard Anderson Autumn 2019 Lecture 7


CSE 421 Algorithms
Richard Anderson, Autumn 2019, Lecture 7

Announcements
- Reading: for today, sections 4.1, 4.2; for Friday, sections 4.4, 4.5, 4.7, 4.8
- Homework 3 is available (Random Interval Graphs)
- I'm back from Kyrgyzstan

Stable Matching Results (each entry: m-rank, w-rank; last column is the average of the 5 runs)

   n    run 1          run 2          run 3          run 4          run 5          average
  500   5.10  98.05    7.52  66.95    8.57  58.18    6.32  75.87    5.25  90.73    6.55  77.95
 1000   6.80 146.93    6.50 154.71    7.14 133.53    7.44 128.96    7.36 137.85    7.04 140.40
 2000   7.83 257.79    7.50 263.78   11.42 175.17    7.16 274.76    7.54 261.60    8.29 246.62

Much better for M than W. Why is it better for M? What is the growth of m-rank and w-rank as a function of n?
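Numbers like these can be reproduced in spirit with a small simulation. The sketch below (not the original experiment code; function names are mine) assumes uniformly random preference lists and a proposer-optimal Gale-Shapley run, with rank 1 meaning a partner's top choice:

```python
import random

def gale_shapley(n, m_pref, w_pref):
    """Proposer-optimal stable matching. m_pref[m] lists women in m's
    preference order; w_pref[w] lists men in w's preference order."""
    # w_rank[w][m] = position of man m on woman w's list (0 = best)
    w_rank = [[0] * n for _ in range(n)]
    for w in range(n):
        for r, m in enumerate(w_pref[w]):
            w_rank[w][m] = r
    next_prop = [0] * n        # index of the next woman each man proposes to
    wife = [None] * n
    husband = [None] * n
    free = list(range(n))      # currently free men
    while free:
        m = free.pop()
        w = m_pref[m][next_prop[m]]
        next_prop[m] += 1
        if husband[w] is None:                        # w is free: engage
            husband[w], wife[m] = m, w
        elif w_rank[w][m] < w_rank[w][husband[w]]:    # w prefers m: trade up
            old = husband[w]
            wife[old] = None
            free.append(old)
            husband[w], wife[m] = m, w
        else:                                         # w rejects m
            free.append(m)
    return wife

def average_ranks(n, seed=0):
    """Average rank (1 = top choice) each side receives from its partner."""
    rng = random.Random(seed)
    m_pref = [rng.sample(range(n), n) for _ in range(n)]
    w_pref = [rng.sample(range(n), n) for _ in range(n)]
    wife = gale_shapley(n, m_pref, w_pref)
    m_avg = sum(m_pref[m].index(wife[m]) + 1 for m in range(n)) / n
    w_avg = sum(w_pref[wife[m]].index(m) + 1 for m in range(n)) / n
    return m_avg, w_avg
```

A run of `average_ranks` shows the same asymmetry as the table: the proposing side averages a far better rank than the receiving side, consistent with the known ln n versus n / ln n behavior for random preferences.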

Greedy Algorithms

Greedy Algorithms
- Solve problems with the simplest possible algorithm
- The hard part: showing that something simple actually works
- Pseudo-definition: an algorithm is greedy if it builds its solution by adding elements one at a time using a simple rule

Scheduling Theory
- Tasks: processing requirements, release times, deadlines
- Processors
- Precedence constraints
- Objective function: jobs scheduled, lateness, total execution time

Interval Scheduling
- Tasks occur at fixed times: tasks {1, 2, ..., N} with start and finish times s(i), f(i)
- Single processor
- Maximize the number of tasks completed

What is the largest solution?

Greedy Algorithm for Scheduling
Let T be the set of tasks; construct a set of independent tasks I; A is the rule determining the greedy algorithm.

I = { }
While (T is not empty)
    Select a task t from T by rule A
    Add t to I
    Remove t and all tasks incompatible with t from T

Simulate the greedy algorithm for each of these heuristics:
- Schedule the earliest starting task
- Schedule the shortest available task
- Schedule the task with the fewest conflicting tasks
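The generic loop and these selection rules can be sketched as follows. This is a sketch, not code from the lecture: intervals are represented as (start, finish) pairs, and intervals that merely touch are treated as compatible (an assumption):

```python
def greedy_schedule(tasks, rule):
    """Generic greedy: repeatedly apply selection rule A, keep the chosen
    task, and discard everything incompatible with it."""
    def compatible(a, b):
        return a[1] <= b[0] or b[1] <= a[0]    # non-overlapping intervals
    T, I = list(tasks), []
    while T:
        t = rule(T)                            # select a task by rule A
        I.append(t)
        T = [u for u in T if u != t and compatible(t, u)]
    return I

# The three heuristics as selection rules
def earliest_start(T):
    return min(T, key=lambda t: t[0])

def shortest(T):
    return min(T, key=lambda t: t[1] - t[0])

def fewest_conflicts(T):
    def n_conflicts(t):
        return sum(1 for u in T
                   if u != t and not (t[1] <= u[0] or u[1] <= t[0]))
    return min(T, key=n_conflicts)
```

Simulating the rules on small instances exposes their failures; for example, on [(0, 10), (9, 11), (10, 20)] the `shortest` rule takes the middle interval and schedules one task where two are possible.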

Greedy solution based on earliest finishing time Example 1 Example 2 Example 3
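The earliest-finishing-time rule also admits a direct O(n log n) implementation that avoids re-scanning the task set; a sketch under the same (start, finish) representation:

```python
def earliest_finish_schedule(tasks):
    """Sort by finish time; greedily take every task that starts no earlier
    than the finish of the last task taken."""
    selected, last_finish = [], float("-inf")
    for s, f in sorted(tasks, key=lambda t: t[1]):
        if s >= last_finish:
            selected.append((s, f))
            last_finish = f
    return selected
```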

Theorem: the Earliest Finish Algorithm is optimal.
Key idea: the Earliest Finish Algorithm "stays ahead".
- Let A = {i1, ..., ik} be the set of tasks found by EFA, in increasing order of finish times
- Let B = {j1, ..., jm} be the set of tasks found by a different algorithm, in increasing order of finish times
- Show that for r <= min(k, m), f(ir) <= f(jr)

Stay-ahead lemma: A always stays ahead of B, i.e., f(ir) <= f(jr).
Induction argument:
- Base case: f(i1) <= f(j1), since EFA picks a task with the earliest finish time
- Inductive step: if f(ir-1) <= f(jr-1), then f(ir) <= f(jr). Since f(ir-1) <= f(jr-1) <= s(jr), task jr is still available when the Earliest Finish Algorithm chooses ir, so the task it chooses finishes no later than jr

Completing the proof
- Let A = {i1, ..., ik} be the set of tasks found by EFA, in increasing order of finish times
- Let O = {j1, ..., jm} be the set of tasks found by an optimal algorithm, in increasing order of finish times
- Suppose k < m. By the stay-ahead lemma, f(ik) <= f(jk) <= s(jk+1), so task jk+1 was still available; the Earliest Finish Algorithm stopped before it ran out of tasks, a contradiction. Hence k = m.

Scheduling All Intervals
Minimize the number of processors needed to schedule all intervals.

How many processors are needed for this example?

Prove that you cannot schedule this set of intervals with two processors

Depth: the maximum number of intervals active at any single point in time.

Algorithm
- Sort intervals by start time
- If the maximum depth is d, create d slots
- Process items in increasing order of start time, assigning each item to an open slot
Correctness proof: when we reach an item, there is always an open slot available.
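A standard way to realize this in code keeps a min-heap of slot finish times; new slots are opened only when no existing slot is free, so the number of slots opened equals the depth. This is a sketch (the heap and slot numbering are my implementation choices, not from the slides):

```python
import heapq

def partition_intervals(intervals):
    """Assign every (start, finish) interval to a slot; the number of slots
    opened equals the depth of the interval set."""
    free_at = []                 # min-heap of (finish_time, slot)
    n_slots = 0
    assignment = []              # ((start, finish), slot) pairs
    for s, f in sorted(intervals):             # process by start time
        if free_at and free_at[0][0] <= s:
            _, slot = heapq.heappop(free_at)   # reuse the slot freed earliest
        else:
            slot = n_slots                     # no slot is free: open a new one
            n_slots += 1
        assignment.append(((s, f), slot))
        heapq.heappush(free_at, (f, slot))
    return n_slots, assignment
```

For example, the three pairwise-overlapping intervals (0, 3), (1, 4), (2, 5) force three slots, matching the depth lower bound.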

What happens on "random" sets of intervals?
Given n random intervals:
- What is the expected number of independent intervals?
- What is the expected depth?

What is a random set of intervals?
Method 1:
- Each interval is assigned a random start position in [0.0, 1.0]
- Each interval is assigned a random length in [0.0, 1.0]
Method 2:
- Start with the array [1, 1, 2, 2, 3, 3, 4, 4, 5, 5]
- Randomly permute it, e.g. [2, 1, 4, 2, 3, 4, 5, 1, 3, 5]
- The index of the first j is the start of interval j, and the index of the second j is the end of interval j
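Method 2 and the two quantities asked about above can be estimated empirically. A sketch (function names are mine; a maximum independent set is computed with the earliest-finish greedy, which the lecture shows is optimal):

```python
import random

def random_intervals(n, rng):
    """Method 2: shuffle [1, 1, 2, 2, ..., n, n]; interval j spans the two
    positions where j lands."""
    a = list(range(1, n + 1)) * 2
    rng.shuffle(a)
    seen, intervals = {}, []
    for pos, j in enumerate(a):
        if j in seen:
            intervals.append((seen[j], pos))   # (first index, second index)
        else:
            seen[j] = pos
    return intervals

def depth(intervals):
    """Maximum number of intervals active at one point (event sweep)."""
    events = sorted([(s, +1) for s, _ in intervals] +
                    [(f, -1) for _, f in intervals])
    cur = best = 0
    for _, delta in events:
        cur += delta
        best = max(best, cur)
    return best

def independent_count(intervals):
    """Size of a maximum independent set via the earliest-finish greedy."""
    count, last = 0, -1
    for s, f in sorted(intervals, key=lambda iv: iv[1]):
        if s > last:
            count += 1
            last = f
    return count
```

Averaging `depth` and `independent_count` over many samples gives empirical estimates of the expected depth and expected number of independent intervals as n grows.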

Scheduling Tasks
- Each task has a length ti and a deadline di
- All tasks are available at the start
- One task may be worked on at a time
- All tasks must be completed
- Goal: minimize the maximum lateness, where lateness = fi - di if fi >= di, and 0 otherwise
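The classic greedy for this objective orders jobs by earliest deadline first. A sketch, with jobs represented as (length, deadline) pairs (my representation, not the slides'):

```python
def min_max_lateness(jobs):
    """Earliest-deadline-first schedule. Returns the maximum lateness and
    the schedule as (job, start, finish) triples."""
    t, max_late, schedule = 0, 0, []
    for length, deadline in sorted(jobs, key=lambda j: j[1]):
        start, t = t, t + length
        max_late = max(max_late, t - deadline)   # lateness is 0 if on time
        schedule.append(((length, deadline), start, t))
    return max_late, schedule
```

For instance, on three jobs with (length, deadline) = (2, 2), (3, 4), (2, 3), the EDF order finishes them at times 2, 4, 7 with latenesses 0, 1, 3, so the maximum lateness is 3.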

Example
Tasks given as (time, deadline) pairs: (2, 2), (3, 4), (2, 3).
[Figure: schedules shown with per-task lateness values 1, 3, 2; maximum lateness 3.]

Determine the minimum lateness
Show the schedule 2, 3, 4, 5 first and compute the lateness.
Tasks given as (time, deadline) pairs: (6, 2), (3, 4), (4, 5), (5, 12).