Slide 1: Learning Voting Trees
Ariel D. Procaccia, Aviv Zohar, Yoni Peleg, Jeffrey S. Rosenschein

Slide 2: Lecture Outline
– Voting and Voting Trees
– PAC Learning
– Results
– Limitations

Slide 3: Voting
Election: a set of voters N = {1,...,n} and candidates C = {a, b, c,...}.
Voters have linear preferences (strict rankings over the candidates).
The winner of the election is determined by a voting rule F. Two examples:
– Plurality: each voter gives 1 point to their first-place candidate.
– Copeland: x1 beats x2 in a pairwise election if most voters prefer x1 to x2; a candidate's score is the number of other candidates it beats in pairwise elections.

Slide 4: Voting: example
Example: N = {1,2,3,4,5}, C = {a,b,c,d}
– Voter 1: a > b > c > d
– Voter 2: a > b > c > d
– Voter 3: b > c > d > a
– Voter 4: c > b > d > a
– Voter 5: d > b > c > a
Plurality: a wins; Copeland: b wins.
Note that Copeland only cares about pairwise elections.

Slide 5: Tournaments
A tournament over C is a complete, irreflexive binary relation over C. It summarizes the results of the pairwise elections.
Example: N = {1,2,3}, C = {a,b,c}
– Voter 1: c > b > a
– Voter 2: b > a > c
– Voter 3: a > c > b
– Overall: a < b, b < c, c < a (the Condorcet paradox).
A (pairwise) voting rule is a function from tournaments to candidates.
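The majority relation on this slide can be built mechanically. A sketch that constructs the tournament from the Condorcet-paradox profile (names are my own):

```python
from itertools import combinations

# Condorcet-paradox profile from the slide, ranked best-first.
profile = [
    ["c", "b", "a"],  # voter 1
    ["b", "a", "c"],  # voter 2
    ["a", "c", "b"],  # voter 3
]
candidates = ["a", "b", "c"]

def majority_tournament(profile, candidates):
    """Map each ordered pair of distinct candidates to its pairwise
    majority winner. With an odd number of voters and strict rankings
    the relation is complete, so this always yields a tournament."""
    T = {}
    for x, y in combinations(candidates, 2):
        wins_x = sum(1 for r in profile if r.index(x) < r.index(y))
        winner = x if wins_x > len(profile) - wins_x else y
        T[(x, y)] = winner
        T[(y, x)] = winner
    return T

T = majority_tournament(profile, candidates)
```

The resulting cycle (b beats a, c beats b, a beats c) is exactly the slide's Condorcet paradox: the pairwise relation is complete but not transitive.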

Slide 6: Voting Trees
[Figure: a voting tree with leaves labeled by candidates (a, c, b, ...), evaluated bottom-up on the tournament a < b, b < c, c < a; each internal node is won by the pairwise winner of its two children.]
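The extracted slide loses the tree's exact shape, but evaluating any voting tree is a simple bottom-up recursion. A generic sketch over the Condorcet-paradox tournament from the previous slide (the tree representation and names are my own):

```python
# Tournament from the Condorcet-paradox example: a < b, b < c, c < a.
# T maps an ordered pair of distinct candidates to its pairwise winner.
T = {
    ("a", "b"): "b", ("b", "a"): "b",
    ("b", "c"): "c", ("c", "b"): "c",
    ("a", "c"): "a", ("c", "a"): "a",
}

def evaluate(tree, T):
    """Evaluate a voting tree on tournament T.
    A tree is either a candidate name (leaf) or a pair (left, right);
    an internal node is won by the pairwise winner of its subtrees."""
    if isinstance(tree, str):
        return tree
    left, right = evaluate(tree[0], T), evaluate(tree[1], T)
    return left if left == right else T[(left, right)]

winner = evaluate((("a", "b"), "c"), T)  # b wins the left subtree, then c beats b
```

Note that on a cyclic tournament the tree's shape matters: the same three candidates arranged differently can produce a different winner.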

Slide 7: Voting Trees
Voting trees are everywhere! They are a concise representation of (pairwise) voting rules:
– In general there are doubly exponentially many pairwise voting rules, so an explicit representation takes exponential space.
– Voting trees capture many rules, such as Copeland.
Given some (pairwise) voting rule, we want to find a concise voting-tree representation that is as accurate as possible.
Idea: use learning. The designer labels tournaments with winners, and the learning algorithm outputs a "good" voting tree.

Slide 8: PAC Learning
We want to learn a voting rule f (not necessarily a tree).
The training set consists of example pairs (Tj, f(Tj)), where the tournaments Tj are drawn from a fixed distribution D.
err(h) = PrD[h(T) ≠ f(T)].
f* minimizes err over all voting trees.
Goal: given ε, find a voting tree g such that err(g) ≤ err(f*) + ε.
Q: How many examples are needed to guarantee that the goal is achieved with probability at least 1 − δ?
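The question on this slide is the classical agnostic-PAC sample-complexity question. For a finite hypothesis class H, a standard uniform-convergence bound (textbook constants, not necessarily the paper's exact statement) says that empirical risk minimization achieves the goal once the number of examples t satisfies:

```latex
t \;\ge\; \frac{2}{\epsilon^{2}}\left(\ln|\mathcal{H}| + \ln\frac{2}{\delta}\right)
\quad\Longrightarrow\quad
\Pr\bigl[\mathrm{err}(g) \le \mathrm{err}(f^{*}) + \epsilon\bigr] \ge 1 - \delta .
```

So a class whose log-size is polynomial admits a polynomial training set, which is exactly why the counting theorems on the following slides matter.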

Slide 9: Formulation of Theorems
Theorem: An exponential training set is needed to learn arbitrary voting trees.
We therefore restrict attention to the class of voting trees of polynomial size (at most k leaves).
Lemma: If the size of this class is (only) exponential, the following algorithm achieves the goal with a polynomial training set: return the tree that minimizes mistakes on the training set.
Theorem: |{voting trees with k leaves}| is at most exponential in m and k.
Proof sketch:
– size ≤ (# possible structures) × (# assignments to leaves) = (# possible structures) × m^k

Slide 10: Number of tree structures
[Figure omitted.]

Slide 11: Proof continued
Theorem: |{trees with k leaves}| ≤ m^k · k!
Proof:
– size ≤ (# assignments to leaves) × (# possible structures)
– Lemma: any tree structure with k leaves can be obtained by k − 1 successive splits of a leaf.
– Hence # structures ≤ (k − 1)!
– size ≤ (k − 1)! · m^k ≤ k! · m^k
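As a sanity check on the slide's bound: the exact number of binary tree shapes with k leaves is the Catalan number C(k−1), which indeed sits below the (k − 1)! bound obtained from the splitting argument. A quick verification (function names are my own):

```python
from math import comb, factorial

def catalan(n):
    """n-th Catalan number: the number of binary tree shapes with n + 1 leaves."""
    return comb(2 * n, n) // (n + 1)

# The slide's splitting argument bounds # structures with k leaves by (k - 1)!;
# the exact count C(k - 1) never exceeds it.
for k in range(1, 12):
    assert catalan(k - 1) <= factorial(k - 1)
```

Either count gives log-size O(k log k + k log m), i.e. (only) exponential class size, so by the lemma on Slide 9 a polynomial training set suffices.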

Slide 12: Approximation by Voting Trees
A voting rule g is a c-approximation of f iff f and g agree on a c-fraction of the tournaments.
Theorem: Most voting rules cannot be approximated by small voting trees to a factor better than ½.
This result is not as negative as it sounds.

Slide 13: Closing Remarks
Computational learning theory serves as a method to concisely represent voting rules.
Another concisely representable family: scoring rules.
– Defined by a vector (α1, ..., αm).
– Efficiently PAC learnable.
Open questions: Which voting rules can be approximated? Under which underlying distributions?

Slide 14: Encore: Computational Complexity
So far we were interested in sample complexity.
Recall: if the size of the class is (only) exponential, the following algorithm achieves the goal with a polynomial training set: return the tree that minimizes mistakes on the training set.
Theorem: Finding such a tree is NP-hard.
In practice, the complexity depends on the structure of the tree.

Slide 15: A Graph!!
[Figure omitted.]
