Multi Armed Bandits chalpert@meetup.com.

Presentation on theme: "Multi Armed Bandits chalpert@meetup.com."— Presentation transcript:

1 Multi Armed Bandits

2 Survey

3 Click Here

4 Click Here
Click-through Rate (Clicks / Impressions): 20%

5 Click Here Click Here

6 Click Here Click Here Click-through Rate 20% ?

7 AB Test: Randomized Controlled Experiment
Show each button to 50% of users
Click-through Rate: 20% vs ?

8 AB Test Timeline
Before Test -> AB Test (Exploration Phase: Testing) -> After Test (Exploitation Phase: Show Winner)

9 Click Here Click Here Click-through Rate 20% ?

10 Click Here Click Here Click-through Rate 20% 30%

11 10,000 impressions/month
Need 4,000 clicks by EOM (a 40% CTR)
30% CTR won't be enough

12 Need to keep testing (Exploration)

13

14 Test variants A, B, C, D, E, F, G, ...
Each variant would be assigned with probability 1/N (N = # of variants)

15 Not everyone is a winner

16 Each variant would be assigned with probability 1/N (N = # of variants)

17 Need to keep testing (Exploration)
Need to minimize regret (Exploitation)

18 Multi Armed Bandit: Balance of Exploitation & Exploration

19 Bandit Algorithm Balances Exploitation & Exploration
AB Test: discrete exploitation & exploration phases (Before Test -> AB Test -> After Test)
Multi Armed Bandit: continuous exploitation & exploration; the bandit favors the winning arm

20 Bandit Algorithm Reduces Risk of Testing
AB Test: best arm exploited with probability 1/N; more arms means less exploitation
Bandit: best arm exploited with determined probability; reduced exposure to suboptimal arms

21 Demo
Borrowed from Probabilistic Programming & Bayesian Methods for Hackers

22

23 Split Test: still sending losers
Bandit: winner breaks away!
AB test would have cost 4.3 percentage points

24 How it works: Epsilon Greedy Algorithm
ε = Probability of Exploration
Start of round: with probability ε, explore - pick an arm uniformly at random, so each arm is shown with probability ε / N; with probability 1 - ε, exploit - show the best arm
Epsilon Greedy with ε = 1 is equivalent to an AB Test
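The round logic above can be sketched in Python (a minimal illustration; the function and variable names are my own, with per-arm click and impression counts standing in for the reward history):

```python
import random

def epsilon_greedy(epsilon, clicks, impressions):
    """Choose an arm index: explore with probability epsilon, else exploit."""
    n = len(clicks)
    if random.random() < epsilon:
        # Exploration: pick uniformly at random, so each arm
        # ends up shown with probability epsilon / n
        return random.randrange(n)
    # Exploitation: show the arm with the best observed click-through rate
    rates = [c / i if i else 0.0 for c, i in zip(clicks, impressions)]
    return max(range(n), key=lambda a: rates[a])
```

With epsilon = 1 every round explores, which is exactly the uniform 1/N assignment of an AB test; with epsilon = 0 the best observed arm is always shown.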

25 Epsilon Greedy Issues
Constant epsilon: initially under-exploring, later over-exploring
Better if the probability of exploration decreases with sample size (annealing)
No prior knowledge

26 Some Alternatives: Epsilon-First, Epsilon-Decreasing, Softmax, UCB (UCB1, UCB2), Bayesian-UCB, Thompson Sampling (Bayesian Bandits)

27 Bandit Algorithm Comparison
Regret: the cumulative reward lost by not always playing the best arm (lower is better)
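The comparison metric can be made concrete with a small sketch (the rates and plays below are made-up numbers, not the talk's data):

```python
def cumulative_regret(true_rates, arms_played):
    """Regret after T rounds: T * best_rate minus the expected
    reward of the arms actually played, summed per round."""
    best = max(true_rates)
    return sum(best - true_rates[arm] for arm in arms_played)

# Playing the best arm (index 1, rate 0.3) every round incurs zero regret;
# each round spent on the 0.2 arm adds 0.1 to the regret.
always_best = cumulative_regret([0.2, 0.3], [1, 1, 1])
two_mistakes = cumulative_regret([0.2, 0.3], [0, 1, 0])
```

A good bandit algorithm keeps this sum growing slowly (sub-linearly) as rounds accumulate.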

28 Thompson Sampling Setup: Assign each arm its own Beta distribution with parameters (α, β) tracking (# Successes, # Failures)

29 Thompson Sampling Setup: Initialize priors with the ignorant state Beta(1,1) (the Uniform distribution), or initialize with an informed prior to aid convergence

30 Thompson Sampling - For each round:
1: Sample a random variable X from each arm's Beta distribution
2: Select the arm with the largest X
3: Observe the result of the selected arm
4: Update the prior Beta distribution for the selected arm
Example round: X = 0.7, 0.2, 0.4 drawn from Beta(1,1), Beta(1,1), Beta(1,1) - arm 1 selected - Success!

31 The success updates arm 1's prior to Beta(2,1); the other arms remain at Beta(1,1)

32 Next round: X = 0.4, 0.8, 0.2 drawn from Beta(2,1), Beta(1,1), Beta(1,1) - arm 2 selected - Failure!

33 The failure updates arm 2's prior to Beta(1,2); the arms now stand at Beta(2,1), Beta(1,2), Beta(1,1)
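The whole loop fits in a few lines of Python. This is a sketch under my own naming, with made-up true click-through rates for the simulation:

```python
import random

def select_arm(alphas, betas):
    """Steps 1-2: sample X from each arm's Beta and take the argmax."""
    samples = [random.betavariate(a, b) for a, b in zip(alphas, betas)]
    return samples.index(max(samples))

def update(alphas, betas, arm, success):
    """Step 4: a success increments alpha, a failure increments beta."""
    if success:
        alphas[arm] += 1
    else:
        betas[arm] += 1

# Simulate three arms with assumed true CTRs (not the talk's data)
random.seed(7)
true_ctr = [0.2, 0.3, 0.5]
alphas, betas = [1, 1, 1], [1, 1, 1]   # ignorant Beta(1,1) priors
pulls = [0, 0, 0]
for _ in range(2000):
    arm = select_arm(alphas, betas)
    pulls[arm] += 1
    update(alphas, betas, arm, random.random() < true_ctr[arm])
```

After a couple of thousand rounds the pull counts concentrate on the highest-CTR arm: exploration happens automatically because uncertain arms still occasionally draw the largest sample.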

34

35 Posterior after 100k pulls (30 arms)

36 Bandits at Meetup

37 Meetup’s First Bandit

38 76 Arms
Control: Welcome To Meetup! - 60% Open Rate
Winner: Hi - 75% Open Rate (+25%)

41 Avoid Linkbaity Subject Lines

42 Coupon: 16 Arms
Control: Save 50%, start your Meetup Group - 42% Open Rate
Winner: Here is a coupon - 53% Open Rate (+26%)

43 398 Arms

44

45 210% Click-through Difference
Best: Looking to start the perfect Meetup for you? We'll help you find just the right people. Start the perfect Meetup for you!
Worst: Launch your own Meetup in January and save 50%. Start the perfect Meetup for you. 50% off promotion ends February 1st.

46 Choose the Right Metric of Success
Success was tied to clicks in the last experiment
Sale-end & discount messaging had bad results
Perhaps people don't know that hosting a Meetup costs $$$?
Better to tie success to group creation

47 More Issues
Email open & click delay
New subject line effect
Problem when testing notifications
Monitor success trends to detect weirdness

48 Seasonality
Thompson Sampling should naturally adapt to seasonal changes
A learning rate can be added for faster adaptation
[Figure: two Click Here buttons, one labeled "Winner all other times"]
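One common way to add such a learning rate, sketched under my own assumptions (the talk only says a learning rate "can be added"): decay every arm's pseudo-counts toward the Beta(1,1) prior each round, so old evidence fades and the posterior can track seasonal drift.

```python
def discounted_update(alphas, betas, arm, success, gamma=0.99):
    """Decay all pseudo-counts toward Beta(1,1), then add the new observation.

    gamma=1.0 recovers standard Thompson Sampling; smaller gamma
    forgets old evidence faster. (Assumed scheme and names.)"""
    for i in range(len(alphas)):
        alphas[i] = 1 + gamma * (alphas[i] - 1)
        betas[i] = 1 + gamma * (betas[i] - 1)
    if success:
        alphas[arm] += 1
    else:
        betas[arm] += 1
```

The decay keeps each posterior's effective sample size bounded, so an arm that stops performing loses its lead within a bounded number of rounds instead of coasting on stale counts.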

49 Bandit or Split Test?
AB Test good for: Biased Tests, Complicated Tests
Bandit good for: Unbiased Tests, Many Variants, Time Restraints, Set It And Forget It

50 Thanks!

