
Combinatorial Betting
Rick Goldstein and John Lai

Outline
- Prediction markets vs. combinatorial markets
- How does a combinatorial market maker work?
- Bayesian networks + price updating
- Applications
- Discussion
- Complexity (if time permits)

Simple Markets
- Small outcome space: binary or a small finite number of outcomes
  - Sports game (binary); horse race (constant number of horses)
- Easy to match orders and price trades
- Larger outcome space, e.g., state-by-state winners in an election
  - One approach: a separate market for each state
  - Weakness: cannot express certain information, e.g., "the candidate wins both Florida and Ohio or neither"
  - Arbitrage is needed to keep the separate markets consistent

Combinatorial Betting
- A different approach for large outcome spaces: a single market over a large underlying outcome space
- Elections (n binary events): 50 states, two possible winners per state, 2^50 outcomes
- Horse race (permutation betting): n horses, all possible orders of finish, n! outcomes

Two Types of Markets
- Order matching: risklessly match buy and sell orders
- Market maker: price and accept any trade
- Order matching suffers from the thin-markets problem

Computational Difficulties
- Order matching
  - Which orders to accept? Is there a non-null subset of orders we can accept?
  - A hard combinatorial optimization problem
  - Why is this easy in simple markets?
- Market maker
  - How to price trades? How to keep track of the current state?
  - Can be computationally intractable for certain trades
  - Why is this easy in simple markets?

Order Matching
- A contract costs $q and pays $1 if the event occurs
- Sell orders: buy the negation of the event
- Horse race with three horses A, B, C
  - Alice: (A wins, 0.6, 1 share)
  - Bob: (B wins, 0.3 each, 2 shares)
  - Charlie: (C wins, 0.2 each, 3 shares)
- The auctioneer does not want to assume any risk; should it accept the orders?
  - Indivisible: no. Accepting all orders yields revenue 1.8, but the auctioneer might have to pay out 2 or 3 if B or C wins, respectively
  - Divisible: yes. Accepting one share of each order yields revenue 1.1 and pays out 1 in every state of the world
- The divisible case can be formulated as a small linear program, sketched below
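The divisible check amounts to: pick a fraction of each order to accept so that the worst-case profit over all outcomes is maximized, and accept if that optimum is nonnegative. A minimal sketch in Python with scipy, using the numbers from the example above (the formulation and variable names are mine, not the paper's):

```python
# A minimal sketch of the auctioneer's divisible order-matching problem as a
# linear program, using the slide's horse-race numbers.
import numpy as np
from scipy.optimize import linprog

prices = np.array([0.6, 0.3, 0.2])   # limit price per share: Alice, Bob, Charlie
shares = np.array([1, 2, 3])         # shares offered by each trader
payout = np.eye(3)                   # payout[i][w] = $1 if order i's horse w wins

# Variables: x_0..x_2 = fraction of each order accepted, z = worst-case profit.
# Maximize z subject to: for every winner w,
#   sum_i x_i * shares_i * (prices_i - payout[i][w]) >= z,  and  0 <= x_i <= 1.
c = np.array([0.0, 0.0, 0.0, -1.0])  # linprog minimizes, so minimize -z
A_ub, b_ub = [], np.zeros(3)
for w in range(3):
    row = [-shares[i] * (prices[i] - payout[i][w]) for i in range(3)]
    row.append(1.0)                  # encodes z - profit(w) <= 0
    A_ub.append(row)
bounds = [(0, 1)] * 3 + [(None, None)]

res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=bounds)
x, z = res.x[:3], res.x[3]
print("fractions accepted:", np.round(x, 3), "worst-case profit:", round(z, 3))
# A nonnegative optimum means some fractional subset of orders is riskless;
# here accepting one share of each order guarantees a profit of 0.1.
```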

Order Matching: Details

Order Matching: Permutations
- Bet on orderings of n horses; Chen et al. (2007)
- Pair betting: bet that A beats B
  - NP-hard for both divisible and indivisible orders
- Subset betting: bet that one of A, B, C finishes in position k, or that A finishes in position j, k, or l
  - Tractable for divisible orders
  - Solve the separation problem efficiently by reduction to maximum-weight bipartite matching (sketched below)
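For intuition about the subroutine, here is a minimal sketch of maximum-weight bipartite matching between horses and finishing positions using scipy. The weight matrix is an illustrative stand-in; this is the matching primitive the separation oracle relies on, not the full LP machinery from Chen et al.:

```python
# Maximum-weight perfect matching between horses (rows) and finishing
# positions (columns). The weights below are hypothetical.
import numpy as np
from scipy.optimize import linear_sum_assignment

weight = np.array([
    [0.7, 0.2, 0.1],
    [0.3, 0.5, 0.2],
    [0.1, 0.3, 0.6],
])

# linear_sum_assignment minimizes cost, so negate to maximize total weight.
rows, cols = linear_sum_assignment(-weight)
print("assignment:", list(zip(rows, cols)),
      "total weight:", weight[rows, cols].sum())
```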

Order Matching: Binary Events
- n events, 2^n outcomes; Fortnow et al. (2004)
- Divisible orders
  - Polynomial time with O(log m) events
  - co-NP-complete with O(m) events
- Indivisible orders
  - NP-complete even with O(log m) events

Market Maker
- Goal: price securities efficiently
- Tool: Hanson's logarithmic market scoring rule (LMSR), sketched below
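A minimal sketch of LMSR, assuming the standard cost-function form C(q) = b * log(sum_i exp(q_i / b)); the liquidity parameter b and all quantities below are illustrative:

```python
# Hanson's logarithmic market scoring rule (LMSR).
# q[i] is the number of outstanding shares paying $1 if outcome i occurs.
import numpy as np

def lmsr_cost(q, b=100.0):
    """Cost function C(q) = b * log(sum_i exp(q_i / b))."""
    return b * np.log(np.sum(np.exp(np.asarray(q) / b)))

def lmsr_prices(q, b=100.0):
    """Instantaneous prices p_i = exp(q_i / b) / sum_j exp(q_j / b)."""
    z = np.exp(np.asarray(q) / b)
    return z / z.sum()

def lmsr_trade_cost(q, i, shares, b=100.0):
    """Cost of buying `shares` of outcome i: C(q + delta) - C(q)."""
    q_new = np.asarray(q, dtype=float).copy()
    q_new[i] += shares
    return lmsr_cost(q_new, b) - lmsr_cost(q, b)

q = [0.0, 0.0, 0.0]               # three-outcome market, no trades yet
print(lmsr_prices(q))             # uniform prices: [1/3, 1/3, 1/3]
print(lmsr_trade_cost(q, 0, 10))  # cost of buying 10 shares on outcome 0
```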

Market Maker
- Pricing trades under an unrestricted betting language is intractable
- Idea: reduction. If we could price these securities efficiently, we could also count the satisfying assignments of a Boolean formula, which is known to be hard (#P-hard)

Market Maker
- Search for bets that admit tractable pricing
- Aside: Bayesian networks
  - A graphical way to capture the conditional independences in a probability distribution
  - If a distribution satisfies the structure given by a Bayesian network, far fewer parameters are needed to specify it

Bayesian Networks
- Example: ALCS winner (A), NLCS winner (N), World Series winner (W)
- Any distribution: P(A, N, W) = P(A) P(N | A) P(W | A, N)
- Bayes net distribution: P(A, N, W) = P(A) P(N) P(W | A, N)

Bayesian Networks
- A directed acyclic graph over the variables in a joint distribution
- Decomposition of the joint distribution: P(X_1, ..., X_n) = prod_i P(X_i | Parents(X_i))
- Independences and conditional independences can be read off the graph
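To make the factorization concrete, a small sketch of the baseball example: the ALCS and NLCS winners are independent, and the World Series winner depends on both. All probability numbers are made up:

```python
# Bayes net factorization for the slide's baseball example.
P_A = {"Yankees": 0.55, "RedSox": 0.45}   # P(ALCS winner), hypothetical
P_N = {"Dodgers": 0.6, "Mets": 0.4}       # P(NLCS winner), hypothetical
# P(AL team wins the World Series | A, N), hypothetical
P_W_AL = {("Yankees", "Dodgers"): 0.50, ("Yankees", "Mets"): 0.60,
          ("RedSox", "Dodgers"): 0.45, ("RedSox", "Mets"): 0.55}

def joint(a, n, al_wins):
    """P(A=a, N=n, W) = P(a) * P(n) * P(W | a, n): the Bayes net factorization."""
    p_w = P_W_AL[(a, n)] if al_wins else 1 - P_W_AL[(a, n)]
    return P_A[a] * P_N[n] * p_w

# The full joint over 2*2*2 = 8 outcomes needs only 1 + 1 + 4 = 6 free
# parameters instead of 8 - 1 = 7; the savings grow exponentially with size.
total = sum(joint(a, n, w) for a in P_A for n in P_N for w in (True, False))
print(total)  # sums to 1 (up to floating point)
```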

Bayesian Networks

Market Maker
- Idea: find trades whose implied probability distributions are simple Bayesian networks
- Exploit properties of Bayesian networks to price and update efficiently

Paper Roadmap
1. Basic lemmas for updating probabilities when shares are purchased on an arbitrary event A
2. The uniform distribution is represented by a Bayesian network (BN)
3. For certain classes of trades, the implied distribution after trading is still reflected by the BN (i.e., the conditional independences still hold)
4. Because the BN structure persists even after trades are made, the distribution can be characterized with a small number of parameters, and prices and probability updates can be computed efficiently

Basic Lemmas
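In the LMSR setting, the standard update rule (presumably what these lemmas formalize) is that buying s shares on an event A multiplies the probability of every outcome in A by e^(s/b) and then renormalizes. A minimal sketch:

```python
# Standard LMSR update: outcomes in the purchased event A are rescaled
# multiplicatively by e^(s/b), then the distribution is renormalized.
import math

def update(probs, in_A, s, b=100.0):
    """probs: dict outcome -> prior prob; in_A: membership predicate for A;
    s: shares purchased on A; b: LMSR liquidity parameter."""
    scaled = {w: p * (math.exp(s / b) if in_A(w) else 1.0)
              for w, p in probs.items()}
    Z = sum(scaled.values())
    return {w: p / Z for w, p in scaled.items()}

probs = {"A": 1/3, "B": 1/3, "C": 1/3}
print(update(probs, lambda w: w == "A", s=50))  # probability of "A" rises
```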

Network Structure I

Network Structure I
- The implied joint distribution has some strange properties: winners of first-round games are not independent
- We would expect independence in the true distribution; the restricted betting language does not capture the true distribution

Network Structure II

Tractable Pricing and Updates
- Only the conditional probability tables of ancestor nodes need to be updated
- The number of parameters needed to specify the network is small (polynomial in n)
- Counting exercise: how many parameters are needed to specify the network given by the tree structure? (A back-of-the-envelope count follows below)
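As a rough answer to the counting exercise (my own back-of-the-envelope count, not necessarily the paper's exact formula): in a single-elimination bracket with n = 2^k teams, a round-r game conditions on the winners of its two feeder games, each of which can be any of 2^(r-1) teams; given the pair, the outcome is binary, so each such game needs 4^(r-1) free parameters.

```python
# Back-of-the-envelope parameter count for a single-elimination bracket
# with n = 2^k teams (illustrative, not the paper's exact formula).
def bn_params(n):
    k = n.bit_length() - 1        # n = 2^k
    total = 0
    for r in range(1, k + 1):
        games = n >> r            # n / 2^r games in round r
        total += games * 4 ** (r - 1)
    return total

for n in (4, 8, 64):
    print(n, "teams:", bn_params(n), "BN parameters vs",
          2 ** (n - 1) - 1, "for the full joint")
# 64 teams: 2016 parameters (roughly n^2 / 2), versus 2^63 - 1 for the
# unrestricted joint distribution over game results.
```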

Sampling Based Methods

Applications
- Predictalot (Yahoo!): a combinatorial market for NCAA basketball March Madness
  - 64 teams, 63 single-elimination games, 1 winner
- Predictalot allowed combinatorial bets
  - Probability Duke beats UNC given that they play
  - Probability Duke wins more games than UNC
  - Duke wins the entire tournament
  - Duke wins their first game against Belmont
- Play money only (status points)

Predictalot!
- Predictalot allows bets over 2^63 outcomes: about 9.2 quintillion possible states of the world, and even more possible bets
- Too much space to store all the data
- Instead, Predictalot computes probabilities on the fly given past bets
  - Randomly sample the outcome space (sketched below)
  - Emulate Hanson's market maker
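A minimal sketch of the sampling idea: estimate the probability of a combinatorial event, such as "Duke wins more games than UNC", by simulating random brackets. The uniform game model and the team placement are my own simplifications:

```python
# Estimate a combinatorial event's probability by sampling random brackets.
import random

def random_bracket(teams):
    """Play a single-elimination bracket with uniformly random game results.
    Returns {team: number of games won}."""
    wins = {t: 0 for t in teams}
    rnd = list(teams)
    while len(rnd) > 1:
        nxt = []
        for i in range(0, len(rnd), 2):
            w = random.choice((rnd[i], rnd[i + 1]))
            wins[w] += 1
            nxt.append(w)
        rnd = nxt
    return wins

teams = [f"team{i}" for i in range(64)]
teams[0], teams[32] = "Duke", "UNC"      # opposite halves of the bracket
n, hits = 20000, 0
for _ in range(n):
    w = random_bracket(teams)
    hits += w["Duke"] > w["UNC"]         # "Duke wins more games than UNC"
print("estimated probability:", hits / n)
```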

Discussion Do you think these combinatorial markets are practical?

Strengths
- Natural betting language
- Prediction markets fully elicit participants' beliefs
- Can bet on match-ups that might never be played, to learn about the relative strength of teams
- Conditional betting: if you believe in hot streaks / non-independence, you can bet at better rates than in simple prediction markets
- Correlations: good for insurance and risk calculations
- No thin-market problem: trade bundles in one transaction

Criticism
- Do we really need such an expressive betting language, with 2^63 different bets?
- What's wrong with using binary markets?
- Instead, why don't we bet only on known games that are actually taking place?
  - UCLA beats Miss. Valley State in round 1
  - Duke beats Belmont in round 1
- After round 1 is over, close the old markets and open new ones
  - Duke beats Arizona in round 2

More Criticism

Even More Criticism
- 64 more markets for the tournament winner
  - Duke wins the entire tourney
  - UNC wins the entire tourney
  - Arizona State wins the entire tourney
- Only about 2n markets are needed to allow all the bets people actually make
- Perhaps add 20 or so interesting pairwise bets for rivalries?
  - Duke outlasts UNC: 50%?
  - USC outlasts UCLA: 5%?
- No need for 2^63 bets as in Predictalot

Expressiveness vs. Tractability
- Tradeoff between expressiveness and tractability
- Allow any trade on the 2^50 outcomes:
  - (Good) Can theoretically express any information
  - (Bad) Traders may not exploit the expressiveness
  - (Bad) Impossible to keep track of all 2^50 states
- Restrict the possible trades:
  - (Good) May be computationally tractable
  - (Good) More natural betting languages
  - (Bad) Cannot express some information
  - (Bad) The inferred probability distribution may not be intuitive

Tractable Pricing and Updates (optional)

Complexity Result (optional)

How Does Predictalot Make Prices? (optional)
- Markov chain Monte Carlo: construct a Markov chain with the probabilities implied by past bets
- Correlated Monte Carlo method
- Importance sampling
  - Estimate properties of a distribution using only samples from a different distribution
  - Monte Carlo that favors "important" values, then corrects for the bias (sketched below)
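A minimal sketch of the importance-sampling idea in this setting (my own illustration, not Predictalot's actual code): under LMSR, an outcome w carries unnormalized weight exp(q(w)/b), where q(w) is the total payout owed in w from past bets. Sampling outcomes from a uniform proposal and reweighting estimates any event's price:

```python
# Importance sampling for LMSR prices with a uniform proposal distribution.
# All bets, sizes, and the 10-game outcome space are illustrative.
import math, random

b = 100.0
n_games = 10                                   # 2^10 outcomes, one bit per game
past_bets = [(lambda w: w[0] == 1, 30.0),      # 30 shares on "game 0 upset"
             (lambda w: w[0] == 1 and w[1] == 1, 20.0)]

def q(w):
    """Total payout owed in outcome w from past bets."""
    return sum(s for event, s in past_bets if event(w))

def price(event, n_samples=50000):
    """Estimate P(event) = sum_{w in event} e^(q(w)/b) / sum_w e^(q(w)/b)."""
    num = den = 0.0
    for _ in range(n_samples):
        w = tuple(random.randint(0, 1) for _ in range(n_games))  # uniform proposal
        weight = math.exp(q(w) / b)            # importance weight
        den += weight
        num += weight * event(w)
    return num / den

print(price(lambda w: w[0] == 1))  # > 0.5: the past bets pushed the price up
```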