
Published by Kaiya Frothingham. Modified over 3 years ago.

2
Blackbox Reductions from Mechanisms to Algorithms

3
Algorithm Design
Feasibility constraints on the outcome space.
Input v = (v_1, …, v_5) → Algorithm → Output x
GOAL: maximize (or minimize) some function f(x, v)

4
Mechanism Design
Feasibility constraints on the outcome space.
Input v = (v_1, …, v_5); input b = (b_1, …, b_5) → Mechanism → Allocation x, Payment p
GOAL: maximize (or minimize) some function f(x, v)
b_i is chosen to maximize utility = v_i x_i(b) − p_i(b)

5
behind every great mechanism is a great algorithm

7
Black-Box Transformations
Input v → Transformation → (Input b) Algorithm → Allocation x, Payment p
GOAL: for every algorithm, the transformation preserves the quality of the solution in equilibrium, and is incentive compatible.

8
Black-Box Transformations
Input v → Transformation → Algorithm → Allocation x
… and is incentive compatible (IC), i.e., monotone:
Ex-post IC (truthful in expectation): the allocation to agent i is increasing in i's bid, for all bid profiles of the others.
Bayesian IC: the allocation to agent i is increasing in i's bid in expectation w.r.t. the prior over bid profiles of the others.

9
VCG Transformation
Input v → VCG Transformation → Optimal Algorithm → Allocation x
EXAMPLE: the Vickrey-Clarke-Groves auction transforms any optimal algorithm into an optimal ex-post IC mechanism for any monotone objective function.

10
Single Item Auction: one item, agent i has value v_i for the item (single-parameter).
Input v → VCG Transformation → Selection Algorithm → Allocation x
Find the agent with the maximum value.
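In the single-item case the VCG mechanism reduces to the familiar second-price (Vickrey) auction: the selection algorithm finds the highest-value agent, and the VCG payment is that agent's externality on the others. A minimal sketch (the function name is illustrative, not from the slides):

```python
def vickrey_auction(bids):
    """Second-price auction: return (winner index, payment)."""
    winner = max(range(len(bids)), key=lambda i: bids[i])
    # VCG payment = externality imposed on the others
    #             = the highest bid among the losing agents.
    payment = max(b for i, b in enumerate(bids) if i != winner)
    return winner, payment

winner, price = vickrey_auction([3.0, 7.0, 5.0])
# winner is agent 1 (bid 7.0); the price is the second-highest bid, 5.0
```

Truthfulness here is exactly the monotonicity from the previous slide: raising your bid can only keep or gain you the item, and the price does not depend on your own bid.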

11
Combinatorial Auction: many items, agent i has value v_ij for subset S_j (multi-parameter).
(figure: agents i_1, i_2, i_3 with their desired subsets of items)

12
Combinatorial Auction: many items, agent i has value v_ij for subset S_j (multi-parameter).
Input v → VCG Transformation → ??? → Allocation x
Find the max-value non-overlapping collection of sets.

13
^ approximation (amending the motto: behind every great mechanism is a great approximation algorithm)

15
BIC Transformation
Positive Result: transform approximation algorithms into Bayesian IC mechanisms with small loss in social welfare.
Single-parameter (single private value for allocation):
1. Monotonization. For distribution F and algorithm A, there is a Bayesian IC transformation T_{A,F} satisfying E[T_{A,F}(v)] ≥ E[A(v)].
2. Blackbox computation. T_{A,F} can be computed in polytime with queries to A.
3. Payment computation. Payments can be computed with two queries to A.

16
Monotonization
x_i(v_i) = E[allocation to i | v_i]
Fact. There are payments that make an algorithm Bayesian IC if and only if, for all i, the expected allocation x_i(v_i) is monotone non-decreasing in the value v_i.
(figure: a non-monotone x_i(v_i) is not BIC; a monotone one is BIC)

17
Monotonization
Goal: construct y_i from x_i s.t.
1. Monotonicity. y_i(·) is monotone non-decreasing.
2. Surplus-preservation. E_{v_i}[v_i y_i(v_i)] ≥ E_{v_i}[v_i x_i(v_i)].
3. Distribution-preservation. y_i(v_i) is distributed identically to x_i(v_i).
(The construction can be applied independently to each agent.)

18
Monotonization Idea 1: remap values.

19
Monotonization Idea 2: resample values.

20
Monotonization
Idea 3: resample values in the region where the cumulative allocation curve is not monotone.
(figure: allocation curve and its cumulative curve)
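A toy sketch of what resampling achieves, on a discrete grid of equally likely values: there, a distribution-preserving monotone allocation y is simply the sorted rearrangement of x, and by the rearrangement inequality sorting also weakly increases the expected surplus E[v·y(v)]. (The slides' construction achieves this with only oracle access to the algorithm A; here we assume the interim allocation x is given explicitly, which is an assumption for illustration.)

```python
def monotonize(values, x):
    """values: increasing grid of equally likely values.
    x: interim allocation x_i(v) on that grid (may be non-monotone)."""
    y = sorted(x)  # monotone, and has the same distribution as x
    # Rearrangement inequality: pairing larger values with larger
    # allocations weakly increases expected surplus E[v * y(v)].
    surplus_x = sum(v * a for v, a in zip(values, x)) / len(values)
    surplus_y = sum(v * a for v, a in zip(values, y)) / len(values)
    assert surplus_y >= surplus_x  # surplus-preservation
    return y

y = monotonize([1, 2, 3, 4], [0.2, 0.6, 0.3, 0.9])
# y == [0.2, 0.3, 0.6, 0.9]: monotone non-decreasing
```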

21
Monotonization
The construction of y_i(v_i) from x_i(v_i) preserves:
1. Distribution-preservation.
2. Monotonicity. y_i is monotone non-decreasing.
(figure: x_i(v_i) and the monotonized y_i(v_i))

22
Monotonization
The construction of y_i(v_i) from x_i(v_i) also preserves:
3. Surplus-preservation. E_{v_i}[v_i (y_i − x_i)] ≥ 0.
Proof. On support [a, b], let Y(v) = ∫_a^v y(z) f(z) dz and X(v) = ∫_a^v x(z) f(z) dz be the cumulative allocation curves. Then
E[v (y − x)] = ∫_a^b v (y(v) − x(v)) f(v) dv
= [v (Y(v) − X(v))]_a^b − ∫_a^b (Y(v) − X(v)) dv   (integration by parts)
= 0 − ∫_a^b (Y(v) − X(v)) dv   (boundary term vanishes: distribution-preservation gives Y(b) = X(b), and Y(a) = X(a) = 0)
≥ 0   (X dominates Y, so Y(v) − X(v) ≤ 0 everywhere and the remaining term is non-negative)
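The proof step above can be checked numerically on a small uniform grid: with y the sorted rearrangement of x, the cumulative curves satisfy Y(v) ≤ X(v) everywhere and Y(b) = X(b), which is exactly what makes the boundary term vanish and the integral term non-negative. The data here is made up for illustration.

```python
from itertools import accumulate

x = [0.2, 0.6, 0.3, 0.9]   # non-monotone interim allocation
y = sorted(x)              # monotonized (sorted) version
X = list(accumulate(x))    # cumulative allocation curve of x
Y = list(accumulate(y))    # cumulative allocation curve of y

# X dominates Y pointwise: the sum of the k smallest entries of x
# is at most the sum of its first k entries.
assert all(Yv <= Xv + 1e-9 for Yv, Xv in zip(Y, X))
# Same total mass (distribution-preservation): Y(b) = X(b).
assert abs(Y[-1] - X[-1]) < 1e-9
```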

23
BIC Transformation for Welfare
Positive Result: transform approximation algorithms into Bayesian IC mechanisms with small loss in social welfare.
Single-parameter (single private value for allocation):
1. Monotonization. For distribution F and algorithm A, there is a Bayesian IC transformation T_{A,F} satisfying E[T_{A,F}(v)] ≥ E[A(v)].
2. Blackbox computation. T_{A,F} can be computed in polytime with queries to A.
3. Payment computation. Payments can be computed with two queries to A.

24
Blackbox Computation

25
BIC Transformation for Welfare
Positive Result: transform approximation algorithms into Bayesian IC mechanisms with small loss in social welfare.
Single-parameter (single private value for allocation):
1. Monotonization. For distribution F and algorithm A, there is a Bayesian IC transformation T_{A,F} satisfying E[T_{A,F}(v)] ≥ E[A(v)].
2. Blackbox computation. T_{A,F} can be computed in polytime with queries to A.
3. Payment computation. Payments can be computed with two queries to A.

26
Payment Computation
Payment identity: p(v) = v y(v) − ∫_0^v y(z) dz
Idea: compute a random variable P with E[P] = p(v).

27
Payment Computation
Payment identity: p(v) = v y(v) − ∫_0^v y(z) dz
Idea: compute a random variable P with E[P] = p(v).
1. Y: indicator random variable for whether the agent wins in A, with probability y(v) — 1st call to A.
2. z: drawn uniformly from [0, v].
3. Y_z: indicator random variable for whether the agent wins in A at value z, with probability y(z) — 2nd call to A.
4. P = v (Y − Y_z).
Constant number of calls to A per agent.
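A Monte Carlo sketch of why P = v(Y − Y_z) is unbiased: E[P] = v·y(v) − v·E_z[y(z)] = v·y(v) − ∫_0^v y(z) dz, the payment identity. The allocation rule y(v) = v on [0, 1] is a toy assumption for illustration.

```python
import random

def y(v):
    return v  # toy monotone allocation probability, assumed for illustration

def payment_sample(v, rng):
    Y = 1 if rng.random() < y(v) else 0     # 1st call to A: win at value v?
    z = rng.uniform(0, v)                   # z ~ U[0, v]
    Yz = 1 if rng.random() < y(z) else 0    # 2nd call to A: win at value z?
    return v * (Y - Yz)

rng = random.Random(0)
v = 0.8
est = sum(payment_sample(v, rng) for _ in range(200_000)) / 200_000
exact = v * y(v) - v**2 / 2   # v*y(v) - ∫_0^v z dz = 0.64 - 0.32 = 0.32
assert abs(est - exact) < 0.01
```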

28
Payment Computation
Goal: given A, find an algorithm A′ that computes the allocation and payments with just one call to A.
1. Pick agent k uniformly at random and draw w_k from F_k.
2. Calculate the outcome y′ of A(w_k, v_{−k}) — the only call to A.
3. For each agent i ≠ k, set p′_i = v_i y′_i.
4. For agent k, set p′_k = 0 if w_k > v_k, and p′_k = −(n − 1) y′_k / f_k(w_k) otherwise.
5. Output (y′, p′).
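The five steps above can be sketched directly. Everything concrete here is an assumption for illustration: A is a toy highest-value-wins rule, and F_k is taken uniform on [0, 1] so the density f_k is identically 1.

```python
import random

def A(values):
    """Toy monotone algorithm: allocate to the highest-value agent."""
    winner = max(range(len(values)), key=lambda i: values[i])
    return [1.0 if i == winner else 0.0 for i in range(len(values))]

def A_prime(v, rng):
    n = len(v)
    k = rng.randrange(n)                  # 1. pick agent k at random
    w_k = rng.random()                    #    draw w_k ~ F_k = U[0, 1]
    resampled = list(v)
    resampled[k] = w_k
    y = A(resampled)                      # 2. the only call to A
    p = [v[i] * y[i] for i in range(n)]   # 3. p'_i = v_i * y'_i for i != k
    # 4. agent k: zero if resampled above v_k, else the rebate term
    p[k] = 0.0 if w_k > v[k] else -(n - 1) * y[k] / 1.0   # f_k(w_k) = 1
    return y, p                           # 5. output (y', p')

y, p = A_prime([0.3, 0.9, 0.5], random.Random(1))
# exactly one agent is allocated, and one payment per agent is returned
```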

29
Payment Computation
Thm. Algorithm A′ is Bayesian IC.
Proof.
1. Monotone: y′ is a linear transformation of y, namely y′(v) = (1 − 1/n) y(v) + (1/n) E[y(w)].
2. Payment identity: p′(v) = v y′(v) − ∫_0^v y′(z) dz, i.e., p′(v) = (1 − 1/n) v y(v) − (1/n)(n − 1) ∫_0^v y(z) dz
(combining the payments for i ≠ k and for i = k; see the ugly formula).

30
Payment Computation
Thm. Welfare: E[A′(v)] ≥ E[A(v) − max(v)].
Proof.
1. Each buyer has welfare ≥ (1 − 1/n) v y(v).
2. Since y(v) is a probability, v y(v) ≤ max(v).
3. So at most max(v) is lost in total buyer welfare.
4. Expected payments are the same, so nothing is lost in seller welfare.
A′ finds (allocation, payments) with one call to a monotone algorithm. [Babaioff, Kleinberg, Slivkins '10]

31
Input v (values drawn from a known distribution) → Transformation → Approx. Algorithm → Allocation x, Payment p
POSSIBILITY: any approximation algorithm can be transformed into a Bayesian IC mechanism with small loss for f(x, v) = Σ_i x_i v_i. [Hartline-Lucier '10]

32
Multi-parameter Transformation
Goal: construct an allocation from the algorithm s.t.
1. "Monotonicity."
2. Surplus-preservation.
3. Distribution-preservation.
By mapping the types of each agent to surrogates in a way that preserves the above properties.

33
Replicas and Surrogates
Replicas (drawn from F) are matched to surrogates (also drawn from F) by a max-weight matching, where the weight between an original type t and a surrogate type t′ is v(t, x(t′)) — the replica's value for the surrogate's allocation x(t′).
Set the allocation equal to the algorithm's output on the surrogate type profile.
Set the payment equal to the VCG payment for type t.
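The matching step above can be sketched on a toy instance: a weight matrix w[r][s] giving replica r's value for the allocation computed on surrogate s, and a brute-force max-weight matching over permutations (fine at this size; the weights are made up for illustration).

```python
from itertools import permutations

def max_weight_matching(w):
    """w: n x n weight matrix w[replica][surrogate].
    Return the permutation assigning a surrogate to each replica
    that maximizes the total matched weight (brute force)."""
    n = len(w)
    return max(permutations(range(n)),
               key=lambda perm: sum(w[r][perm[r]] for r in range(n)))

w = [[3, 1, 0],
     [2, 4, 1],
     [0, 2, 5]]
match = max_weight_matching(w)
# match == (0, 1, 2): the diagonal assignment is optimal here
```

In the actual construction the VCG payments of this matching market are what make the overall mechanism Bayesian IC.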

34
Replicas and Surrogates
Thm. The transformation is distribution-preserving.
Thm. The transformation is Bayesian IC.
Thm. The transformation doesn't lose much welfare.
Proof idea: replicas are "close" to their matched surrogates in their values for outcomes.
[Hartline, Kleinberg, Malekian '11] [Bei, Huang '11]

35
Strengthening the Result
Solution concept: are there black-box transformations for social welfare that preserve approximation and are truthful in expectation?
Social objective: are there black-box transformations that preserve approximation, are Bayesian IC, and work for other social objectives?

37
Multi-parameter Transformations
Thm. There is no truthful-in-expectation mechanism for combinatorial auctions with submodular valuations that guarantees a sub-linear approximation.
Note: there is a (1 − 1/e)-approximation algorithm. [Dughmi, Vondrak '11]

38
Single-parameter Transformations
Truthful in expectation: for every algorithm A, T_A is truthful in expectation, i.e., the expected allocation is monotone for every agent i.
Worst-case approximation preserving: for all value vectors v and algorithms A, the expected welfare of the transformation is close to the expected welfare of the algorithm.

40
Proof Outline
1. Define a welfare instance (feasible allocations, agents' values).
2. Find an algorithm with high welfare.
3. Use monotonicity to show that any ex-post transformation has low worst-case welfare.

41
Intuition
(figure: grid of allocation pairs (x_1, x_2) indexed by values (v_1, v_2), with center entry (.5, .5))
Bayesian IC: the column averages of x_2 are increasing and the row averages of x_1 are increasing.

42
Intuition
(figure: grid of allocation pairs (x_1, x_2) indexed by values (v_1, v_2), e.g. (.7, .9), (.6, .2), (.3, .4), (.5, .5), …)
Ex-post IC: monotone along every row and every column.

43
Intuition
The transformation must fix non-monotonicities in every row and column.
(figure: a queried input vector against the grid of allocation pairs (x_1, x_2) over (v_1, v_2))

44
Intuition
Idea: hide the non-monotonicity on a high-dimensional diagonal. Make all allocations constant on these agents.
(figure: allocation pairs (.6, .2), (.2, .6), (.5, .5), (.3, .3) on the axes of agents 1 and 2)

45
Truthful in Expectation Thm. Any truthful-in-expectation transformation loses a polynomial factor in welfare approximation. [Chawla, Immorlica, Lucier’12]

47
Thank You
