Richard Anderson Autumn 2006 Lecture 1


CSE 421 Algorithms
Richard Anderson
Autumn 2006, Lecture 1

All of Computer Science is the Study of Algorithms

How to study algorithms
Zoology: "mine is faster than yours"
Algorithmic ideas
Where algorithms apply
What makes an algorithm work
Algorithmic thinking

Introductory Problem: Stable Matching
Setting: assign TAs to instructors
Avoid having TAs and instructors wanting changes
E.g., Prof. A would rather have student X than her current TA, and student X would rather work for Prof. A than for his current instructor.

Formal notions
Perfect matching
Ranked preference lists
Stability
(Diagram: two m's, m1 and m2, and two w's, w1 and w2)

Example (1 of 3)
m1: w1 w2
m2: w2 w1
w1: m1 m2
w2: m2 m1

Example (2 of 3)
m1: w1 w2
m2: w1 w2
w1: m1 m2
w2: m1 m2
Find a stable matching

Example (3 of 3)
m1: w1 w2
m2: w2 w1
w1: m2 m1
w2: m1 m2
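
To make the stability condition concrete, here is a minimal Python sketch of a checker; the function name is_stable and the dictionary-based data layout are my own choices, not from the lecture. A matching is stable exactly when no pair (m, w) would both prefer each other to their assigned partners.

    # Minimal sketch of the stability check, assuming preferences are
    # dicts mapping each person to their list, best first.
    # The name is_stable and the layout are illustrative.
    def is_stable(matching, m_prefs, w_prefs):
        """matching: dict m -> w describing a perfect matching."""
        partner_of_w = {w: m for m, w in matching.items()}
        for m, w in matching.items():
            # every w that m ranks strictly above his current partner
            for better_w in m_prefs[m][:m_prefs[m].index(w)]:
                m2 = partner_of_w[better_w]
                # (m, better_w) is an instability if better_w also prefers m to m2
                if w_prefs[better_w].index(m) < w_prefs[better_w].index(m2):
                    return False
        return True

    # Example (2 of 3): everyone ranks the same way
    m_prefs = {"m1": ["w1", "w2"], "m2": ["w1", "w2"]}
    w_prefs = {"w1": ["m1", "m2"], "w2": ["m1", "m2"]}
    print(is_stable({"m1": "w1", "m2": "w2"}, m_prefs, w_prefs))  # True
    print(is_stable({"m1": "w2", "m2": "w1"}, m_prefs, w_prefs))  # False: (m1, w1) blocks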

Intuitive Idea for an Algorithm
m proposes to w
If w is unmatched, w accepts
If w is matched to m2:
    If w prefers m to m2, w accepts
    If w prefers m2 to m, w rejects
An unmatched m proposes to the highest w on his preference list that he has not already proposed to

Algorithm
Initially all m in M and w in W are free
While there is a free m
    w = highest on m's list that m has not proposed to
    if w is free, then match (m, w)
    else suppose (m2, w) is matched
        if w prefers m to m2
            unmatch (m2, w)
            match (m, w)
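
For reference, a short Python sketch of this proposal algorithm follows; the name gale_shapley, the dictionaries of preference lists, and the use of a plain list as the pool of free m's are illustrative choices, not part of the slides.

    # Sketch of the proposal algorithm. Names and data layout are illustrative.
    def gale_shapley(m_prefs, w_prefs):
        free_ms = list(m_prefs)                     # all m start free
        next_choice = {m: 0 for m in m_prefs}       # index of next w to propose to
        w_rank = {w: {m: i for i, m in enumerate(prefs)}
                  for w, prefs in w_prefs.items()}  # w_rank[w][m]: lower is better
        current = {}                                # w -> m currently matched

        while free_ms:
            m = free_ms.pop()
            w = m_prefs[m][next_choice[m]]          # highest w not yet proposed to
            next_choice[m] += 1
            if w not in current:                    # w is free: match (m, w)
                current[w] = m
            elif w_rank[w][m] < w_rank[w][current[w]]:
                free_ms.append(current[w])          # unmatch (m2, w); m2 becomes free
                current[w] = m                      # match (m, w)
            else:
                free_ms.append(m)                   # w rejects m; m stays free
        return {m: w for w, m in current.items()}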

Example
m1: w1 w2 w3
m2: w1 w3 w2
m3: w1 w2 w3
w1: m2 m3 m1
w2: m3 m1 m2
w3: m3 m1 m2
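
Feeding this instance to the gale_shapley sketch from the previous slide (if my bookkeeping is right) ends with m2 keeping w1 while m1 and m3 settle for w3 and w2:

    m_prefs = {"m1": ["w1", "w2", "w3"],
               "m2": ["w1", "w3", "w2"],
               "m3": ["w1", "w2", "w3"]}
    w_prefs = {"w1": ["m2", "m3", "m1"],
               "w2": ["m3", "m1", "m2"],
               "w3": ["m3", "m1", "m2"]}
    print(gale_shapley(m_prefs, w_prefs))
    # expected result: m1-w3, m2-w1, m3-w2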

Does this work?
Does it terminate? Is the result a stable matching?
Begin by identifying invariants and measures of progress:
    m's proposals get worse
    Once w is matched, w stays matched
    w's partners get better (have lower w-rank)

Claim: The algorithm stops in at most n^2 steps
Why? Each m asks each w at most once, and each iteration of the loop issues one new proposal, so there can be at most n * n = n^2 iterations

When the algorithm halts, every w is matched
Why? Once a w receives a proposal she stays matched. If some m ended up unmatched, he would have proposed to (and been rejected by) every w, so all n w's would be matched; but they could only be matched to the at most n - 1 other m's, a contradiction
Hence, the algorithm finds a perfect matching

The resulting matching is stable
Suppose m1 is matched to w1, m2 is matched to w2, and m1 prefers w2 to w1. How could this happen?
    m1 proposed to w2 before w1
    w2 rejected m1 for some m3, so w2 prefers m3 to m1
    w2's partners only get better, so w2 prefers m2 to m3, and hence to m1
Therefore (m1, w2) is not an instability

Result
Simple O(n^2) algorithm to compute a stable matching
Corollary: a stable matching always exists

Algorithm
Initially all m in M and w in W are free
While there is a free m
    w = highest on m's list that m has not proposed to
    if w is free, then match (m, w)
    else suppose (m2, w) is matched
        if w prefers m to m2
            unmatch (m2, w)
            match (m, w)

A closer look
Stable matchings are not necessarily fair
m1: w1 w2 w3
w1: m2 m3 m1
w2: m3 m1 m2
w3: m1 m2 m3
How many stable matchings can you find?

Algorithm is underspecified
There are many different ways of picking which free m proposes next
Surprising result: all orderings of picking free m's give the same result
Proving this type of result:
    Reordering argument
    Prove the algorithm is computing something more specific
    Show a property of the solution, so that it computes one specific stable matching

The proposal algorithm finds the best possible solution for M
Formalize the notion of best possible solution:
    (m, w) is valid if (m, w) is in some stable matching
    best(m): the highest-ranked w for m such that (m, w) is valid
    S* = { (m, best(m)) : m in M }
Every execution of the proposal algorithm computes S*
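
On small instances this claim can be sanity-checked by brute force: enumerate all perfect matchings, keep the stable ones, and read off best(m) for each m. The sketch below uses my own helper names, reuses the is_stable checker sketched earlier, and is only practical for tiny n.

    # Brute-force sketch (tiny n only): enumerate perfect matchings, keep the
    # stable ones, and compute best(m). Relies on is_stable from the earlier sketch.
    from itertools import permutations

    def best_partners(m_prefs, w_prefs):
        ms, ws = list(m_prefs), list(w_prefs)
        stable = [dict(zip(ms, perm))
                  for perm in permutations(ws)
                  if is_stable(dict(zip(ms, perm)), m_prefs, w_prefs)]
        # best(m): the w that m ranks highest among all stable matchings
        return {m: min((S[m] for S in stable), key=m_prefs[m].index)
                for m in ms}

    # The claim to check: gale_shapley(m_prefs, w_prefs) == best_partners(m_prefs, w_prefs)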

Proof: see the textbook, pages 9-12
Related result: the proposal algorithm is the worst case for W
The algorithm is the M-optimal algorithm; the proposal algorithm in which the w's propose is W-optimal

Best choices for one side are bad for the other
Design a configuration for a problem of size 4:
    M proposal algorithm: all m's get their first choice, all w's get their last choice
    W proposal algorithm: all w's get their first choice, all m's get their last choice
m1:
m2:
m3:
m4:
w1:
w2:
w3:
w4:

But there is a stable second choice
Design a configuration for a problem of size 4:
    M proposal algorithm: all m's get their first choice, all w's get their last choice
    W proposal algorithm: all w's get their first choice, all m's get their last choice
    There is a stable matching where everyone gets their second choice
m1:
m2:
m3:
m4:
w1:
w2:
w3:
w4:

Key ideas
Formalizing a real-world problem
    Model: graph and preference lists
    Mechanism: stability condition
Specification of the algorithm with a natural operation: the proposal
Establishing termination of the process through invariants and a progress measure
Underspecification of the algorithm
Establishing uniqueness of the solution

M-rank and W-rank of a matching
m-rank: position of m's matched w in m's preference list
M-rank: sum of the m-ranks
w-rank: position of w's matched m in w's preference list
W-rank: sum of the w-ranks
m1: w1 w2 w3
m2: w1 w3 w2
m3: w1 w2 w3
w1: m2 m3 m1
w2: m3 m1 m2
w3: m3 m1 m2
What is the M-rank? What is the W-rank?
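
Following the definitions above, computing the two totals for a given matching is a couple of lines; the function names below are my own.

    # Sketch of the rank definitions above; names are illustrative.
    def m_rank_total(matching, m_prefs):
        # positions are 1-based, so the best possible total is n
        return sum(m_prefs[m].index(w) + 1 for m, w in matching.items())

    def w_rank_total(matching, w_prefs):
        return sum(w_prefs[w].index(m) + 1 for m, w in matching.items())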

Suppose there are n m's and n w's
What is the minimum possible M-rank?
What is the maximum possible M-rank?
Suppose each m is matched with a random w; what is the expected M-rank?
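
As a calibration (my own arithmetic, not from the slides): the minimum possible M-rank is n (every m gets his first choice) and the maximum is n * n = n^2 (every m gets his last choice). If each m is matched to a uniformly random w, his expected m-rank is (1 + 2 + ... + n)/n = (n + 1)/2, so the expected M-rank of a random matching is n(n + 1)/2.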

Random Preferences
Suppose that the preferences are completely random
m1: w8 w3 w1 w5 w9 w2 w4 w6 w7 w10
m2: w7 w10 w1 w9 w3 w4 w8 w2 w5 w6
…
w1: m1 m4 m9 m5 m10 m3 m2 m6 m8 m7
w2: m5 m8 m1 m3 m2 m7 m9 m10 m4 m6
If there are n m's and n w's, what is the expected value of the M-rank and the W-rank when the proposal algorithm computes a stable matching?

Expected Ranks
Expected M-rank
Expected W-rank
Guess, as a function of n

Expected M-rank
The expected M-rank is the expected number of steps until all m's are matched (this is also the expected run time of the algorithm)
Each step "selects a w at random"; everyone is matched once every w has received at least one proposal, which is a coupon-collector process
O(n log n) total steps
Average m-rank: O(log n)

Expected W-rank
If a w receives k random proposals, the expected rank for w is n/(k+1)
On average, a w receives O(log n) proposals
The average w-rank is O(n/log n)
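
These two estimates can be checked empirically by running the proposal algorithm on random preference lists. The sketch below reuses the gale_shapley and rank helpers from the earlier sketches; random_instance is my own helper.

    # Empirical check of the expected ranks; helper names are mine.
    import random

    def random_instance(n):
        ms = [f"m{i}" for i in range(n)]
        ws = [f"w{i}" for i in range(n)]
        m_prefs = {m: random.sample(ws, n) for m in ms}   # random permutation per m
        w_prefs = {w: random.sample(ms, n) for w in ws}
        return m_prefs, w_prefs

    n, trials = 50, 200
    total_m = total_w = 0
    for _ in range(trials):
        m_prefs, w_prefs = random_instance(n)
        matching = gale_shapley(m_prefs, w_prefs)
        total_m += m_rank_total(matching, m_prefs)
        total_w += w_rank_total(matching, w_prefs)
    print(total_m / trials / n)   # average m-rank, roughly ln n
    print(total_w / trials / n)   # average w-rank, roughly n / ln n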

Probabilistic analysis
Select items with replacement from a set of size n. What is the expected number of selections until every item has been selected at least once?
Choose k values at random from the interval [0, 1). What is the expected value of the smallest one?
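
Both questions can also be explored by simulation; this small Monte Carlo sketch (function names are mine) estimates the coupon-collector count and the minimum of k uniform draws, which should land near n * H_n and 1/(k + 1) respectively.

    # Monte Carlo sketch for the two questions above; names are illustrative.
    import random

    def coupon_collector(n):
        # draw with replacement until every one of n items has appeared
        seen, draws = set(), 0
        while len(seen) < n:
            seen.add(random.randrange(n))
            draws += 1
        return draws                                       # expectation is n * H_n

    def min_of_uniforms(k):
        return min(random.random() for _ in range(k))      # expectation is 1/(k+1)

    trials = 10000
    print(sum(coupon_collector(20) for _ in range(trials)) / trials)   # about 72
    print(sum(min_of_uniforms(9) for _ in range(trials)) / trials)     # about 0.1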

What is the run time of the Stable Matching Algorithm?
Initially all m in M and w in W are free
While there is a free m
    w = highest on m's list that m has not proposed to
    if w is free, then match (m, w)
    else suppose (m2, w) is matched
        if w prefers m to m2
            unmatch (m2, w)
            match (m, w)
The loop body is executed at most n^2 times

O(1) time per iteration:
    Find a free m
    Find the next available w
    If w is matched, determine m2
    Test whether w prefers m to m2
    Update the matching
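
In the earlier Python sketch these costs come from two ingredients: a stack (or queue) of free m's with a next-proposal index, and an inverse preference table so that "does w prefer m to m2" is a single comparison instead of a scan of w's list. The fragment below (names match that illustrative sketch) highlights just those pieces.

    # The two ingredients, as in the earlier illustrative sketch:
    free_ms = list(m_prefs)                          # find a free m: pop() is O(1)
    next_choice = {m: 0 for m in m_prefs}            # find next w: one index lookup
    w_rank = {w: {m: i for i, m in enumerate(p)}     # "w prefers m to m2" becomes
              for w, p in w_prefs.items()}           #   w_rank[w][m] < w_rank[w][m2]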

What does it mean for an algorithm to be efficient?