
1
Search Advertising These slides are modified from those by Anand Rajaraman & Jeff Ullman

2
History of web advertising Banner ads (1995-2001): the initial form of web advertising. Popular websites charged X$ for every 1000 "impressions" of an ad, called the "CPM" rate. Modeled on TV and magazine ads. Ranged from untargeted to demographically targeted. Low clickthrough rates meant low ROI for advertisers.

3
Performance-based advertising Introduced by Overture around 2000. Advertisers "bid" on search keywords; when someone searches for that keyword, the highest bidder's ad is shown. The advertiser is charged only if the ad is clicked on. A similar model, with some changes, was adopted by Google around 2002, called "AdWords".

4
Ads vs. search results

5
Web 2.0 Performance-based advertising works! Multi-billion-dollar industry. Interesting problems: Search engine: what ads to show for a search? Advertiser: which search terms should I bid on, and how much should I bid? Will I be charged the full amount I bid? User: am I getting relevant ads? Should I click on them?

6
From http://www.stanford.edu/class/msande239/

7
From http://www.stanford.edu/class/msande239/

8
From http://www.stanford.edu/class/msande239/

9
From http://www.stanford.edu/class/msande239/

10
Simple Adwords problem A stream of queries arrives at the search engine: q1, q2, … Several advertisers bid on each query. When query q_i arrives, the search engine must pick an ad to show. Goal: maximize the search engine's revenue. Clearly we need an online algorithm!

11
Greedy algorithm The simplest algorithm is greedy: select the ad with the highest bid for each query. It's easy to see that, in this simple setting, the greedy algorithm is actually optimal!

12
Complication 1: Ads with different CTRs Each ad has a different likelihood of being clicked: advertiser 1 bids $2 with click probability 0.1; advertiser 2 bids $1 with click probability 0.5. Clickthrough rates are measured historically. Simple solution: instead of ranking by raw bids, rank by the expected revenue per impression, bid × CTR.

13
The Adwords Innovation

Advertiser   Bid     CTR     Bid × CTR
A            $1.00   1.0%    1.0 cent
B            $0.75   2.0%    1.5 cents
C            $0.50   2.5%    1.25 cents

CTR: the click-through rate for that specific ad, i.e., what fraction of the times the ad is shown do people click on it? The SE has to track these statistics over time.
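As a sanity check, the ranking implied by the table can be reproduced in a few lines (a sketch; the dictionary layout and helper name are our own):

```python
# Rank advertisers by expected revenue per impression (bid * CTR),
# using the bid/CTR values from the table above.
ads = {"A": (1.00, 0.01), "B": (0.75, 0.02), "C": (0.50, 0.025)}

def expected_cents(bid, ctr):
    """Expected revenue per impression, in cents."""
    return bid * ctr * 100

ranking = sorted(ads, key=lambda a: expected_cents(*ads[a]), reverse=True)
# B ranks first despite having neither the highest bid nor the highest CTR
```

Sorting by raw bid would instead put A first, which is exactly the behavior the bid × CTR innovation fixes.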


15
Complication 2: Advertisers bid on keywords, not queries There is a many-to-many correspondence between queries and the keywords purchased. Retrieving similar ads: exact matching; IR-style similarity; or query rewriting (say, using scalar/association clustering) followed by exact matching. All need inverted indexes. Then select the top-k similar ads and pick from among them the one with the highest bid × CTR.
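The exact-matching step is usually served from an inverted index; a minimal sketch (the ads, keywords, and function names are invented for illustration):

```python
from collections import defaultdict

# Map each keyword to the set of ads that purchased it.
ads = {
    "ad1": {"cheap", "flights"},
    "ad2": {"cheap", "hotels"},
    "ad3": {"flights", "europe"},
}
index = defaultdict(set)
for ad, keywords in ads.items():
    for kw in keywords:
        index[kw].add(ad)

def candidates(query_terms):
    """Ads sharing at least one keyword with the query (exact matching)."""
    hits = (index[t] for t in query_terms if t in index)
    return set().union(*hits)

# candidates({"cheap", "flights"}) returns ad1, ad2 and ad3;
# these would then be ranked by bid * CTR.
```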

16
Complication 3: SE may want to show multiple ads per query The first ad shown is the one that is most similar and has the highest bid × CTR value. The second ad shown isn't necessarily the one with the next-highest bid × CTR: we need to worry about inter-ad correlation (diversity), and about the user's browsing pattern (the user may stop looking after the first ad).

17
Optimal Ranking given Abandonment Rank ads in descending order of the ranking function RF. The physical meaning: RF is the profit generated per unit of consumed view probability of the ads. Higher ads have more view probability, so placing ads that produce more profit per unit of consumed view probability higher up is intuitive. Optimal ranking considering inter-ad correlation is NP-hard. [Raju Balakrishnan]

18
Complication 4: Advertisers have limited budgets Each advertiser has a limited budget, and the search engine guarantees that the advertiser will not be charged more than their daily budget. The SE needs to be smarter in selecting among the relevant advertisers, so that selection is sensitive to the remaining budgets.

19
Simplified model (for now) Assume all bids are 0 or 1, each advertiser has the same budget B, and one advertiser is picked per query. Let's try the greedy algorithm: arbitrarily pick an eligible advertiser for each keyword.

20
Bad scenario for greedy Two advertisers A and B. A bids on query x; B bids on x and y. Both have budgets of $4. Query stream: x x x x y y y y. Worst-case greedy choice: B B B B _ _ _ _. Optimal: A A A A B B B B. Competitive ratio = 1/2, and a simple analysis shows this is the worst case. (Competitive ratio: the worst-case ratio between online and offline algorithm performance.)
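The bad scenario is easy to simulate (a sketch; all bids are $1, and the per-query advertiser order is chosen adversarially so that greedy prefers B on x):

```python
def greedy(stream, bidders, budget):
    """Assign each query to the first listed advertiser with budget left."""
    spent = {a: 0 for lst in bidders.values() for a in lst}
    revenue = 0
    for q in stream:
        for a in bidders[q]:
            if spent[a] < budget:
                spent[a] += 1
                revenue += 1
                break
    return revenue

# A bids on x only; B bids on x and y; budgets of $4 each.
revenue = greedy("xxxxyyyy", {"x": ["B", "A"], "y": ["B"]}, 4)
# Greedy spends B's budget on the x's (BBBB____) and earns 4;
# the optimal allocation AAAABBBB earns 8, so the ratio is 1/2.
```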

21
BALANCE algorithm [MSVV: Mehta, Saberi, Vazirani, and Vazirani] For each query, pick the advertiser with the largest unspent budget; break ties arbitrarily.

22
Example: BALANCE Two advertisers A and B; A bids on query x, B bids on x and y; both have budgets of $4. Query stream: x x x x y y y y. BALANCE choice: A B A B B B _ _. Optimal: A A A A B B B B. Competitive ratio = 3/4. In the general case, the worst competitive ratio of BALANCE is 1 − 1/e ≈ 0.63.
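BALANCE on the same instance can be sketched as follows (ties broken alphabetically, which happens to reproduce the choice shown above):

```python
def balance(stream, bidders, budget):
    """For each query, pick the eligible advertiser with the most budget left."""
    spent = {a: 0 for lst in bidders.values() for a in lst}
    allocation = []
    for q in stream:
        eligible = [a for a in bidders[q] if spent[a] < budget]
        if not eligible:
            allocation.append("_")
            continue
        winner = min(eligible, key=lambda a: (spent[a], a))  # max unspent budget
        spent[winner] += 1
        allocation.append(winner)
    return "".join(allocation)

print(balance("xxxxyyyy", {"x": ["A", "B"], "y": ["B"]}, 4))
# ABABBB__ : revenue 6 of an optimal 8, ratio 3/4
```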

23
Analyzing BALANCE Consider the simple case of two advertisers, A1 and A2, each with budget B (assume B ≫ 1). Assume the optimal solution exhausts both advertisers' budgets. BALANCE must exhaust at least one advertiser's budget; if not, we could allocate more queries. Assume BALANCE exhausts A2's budget.

24
Analyzing Balance (Figure: the query stream split into queries allocated to A1 and to A2 in the optimal solution; x is the unspent part of A1's budget, y = B − x the part A1 does spend.) Opt revenue = 2B. Balance revenue = 2B − x = B + y. We have y ≥ x, so Balance revenue is minimized for x = y = B/2. Minimum Balance revenue = 3B/2, hence competitive ratio = 3/4.

25
General Result In the general case, the worst competitive ratio of BALANCE is 1 − 1/e ≈ 0.63. Interestingly, no online algorithm has a better competitive ratio. We won't go through the details here, but let's see the worst case that gives this ratio.

26
Worst case for BALANCE N advertisers, each with budget B (B ≫ N ≫ 1). NB queries appear in N rounds of B queries each. Round 1 queries: bidders A1, A2, …, AN. Round 2 queries: bidders A2, A3, …, AN. Round i queries: bidders Ai, …, AN. Optimal allocation: allocate round-i queries to Ai. Optimal revenue: NB.

27
BALANCE allocation In round i, BALANCE spreads the B queries evenly over the N − i + 1 bidders Ai, …, AN that still bid, so each receives B/(N − i + 1). After k rounds, the sum of allocations to each of bins Ak, …, AN is S_k = S_{k+1} = … = S_N = Σ_{1 ≤ i ≤ k} B/(N − i + 1). If we find the smallest k such that S_k ≥ B, then after k rounds we cannot allocate any queries to any advertiser.

28
BALANCE analysis S_k = B · (1/N + 1/(N−1) + … + 1/(N−k+1)). Dividing through by B, the condition S_k = B becomes 1/N + 1/(N−1) + … + 1/(N−k+1) = 1.

29
BALANCE analysis Fact: H_n = Σ_{1 ≤ i ≤ n} 1/i ≈ ln(n) for large n (result due to Euler). The condition above says H_N − H_{N−k} = 1, so S_k = 1 implies H_{N−k} = ln(N) − 1 = ln(N/e). Hence N − k = N/e, i.e. k = N(1 − 1/e).

30
BALANCE analysis So after the first N(1-1/e) rounds, we cannot allocate a query to any advertiser Revenue = BN(1-1/e) Competitive ratio = 1-1/e
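The k = N(1 − 1/e) estimate is easy to sanity-check numerically (a sketch; N = 1000 is an arbitrary choice):

```python
import math

def rounds_until_stuck(N, B):
    """Smallest k with S_k = sum over i=1..k of B/(N-i+1) >= B."""
    s, k = 0.0, 0
    while s < B:
        k += 1
        s += B / (N - k + 1)
    return k

N = 1000
k = rounds_until_stuck(N, 1.0)   # B cancels out of the condition
# k lands within a round or two of N * (1 - 1/e), about 632
```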

31
General version of problem Arbitrary bids and budgets. Consider query q and advertiser i, with bid x_i and budget b_i. BALANCE can be terrible: consider two advertisers A1 and A2 with A1: x_1 = 1, b_1 = 110 and A2: x_2 = 10, b_2 = 100. On queries both bid on, BALANCE keeps picking A1 (the larger unspent budget), earning $1 per query where it could have earned $10.

32
Generalized BALANCE Arbitrary bids; consider query q and bidder i. Bid = x_i (one can also use x_i × CTR_i). Budget = b_i; amount spent so far = m_i; fraction of budget left over f_i = 1 − m_i/b_i. Define ψ_i(q) = x_i(1 − e^(−f_i)). Allocate query q to the bidder i with the largest value of ψ_i(q). Same competitive ratio (1 − 1/e).
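A sketch of the allocation rule (the eligibility check, requiring that a bidder can still afford its full bid, is our assumption rather than part of the slide):

```python
import math

def pick_bidder(query_bidders, bids, budgets, spent):
    """Generalized BALANCE: maximize psi_i = x_i * (1 - e^(-f_i)),
    where f_i is the bidder's fraction of budget left."""
    def psi(i):
        f = 1 - spent[i] / budgets[i]
        return bids[i] * (1 - math.exp(-f))
    eligible = [i for i in query_bidders if spent[i] + bids[i] <= budgets[i]]
    return max(eligible, key=psi) if eligible else None

# The pathological pair from the previous slide:
bids    = {"A1": 1.0,   "A2": 10.0}
budgets = {"A1": 110.0, "A2": 100.0}
winner = pick_bidder(["A1", "A2"], bids, budgets, {"A1": 0.0, "A2": 0.0})
# psi weighs the bid by budget freshness, so A2's $10 bid wins here
```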

33
Complication 5: How do we encourage truthful bidding? How should keyword bids be set? I am willing to pay $2 per click on the phrase "best asu course", but should I be charged the full $2 even if no one else cares for that phrase? If I know that I will be charged my full bid price, I am likely to under-bid; I lose the auction, and the search engine loses revenue. Solution: the second-price auction (Vickrey auction). The advertiser with the highest bid wins, but pays only the second-highest bid. This has the property that truthful bidding is a dominant strategy (no other strategy does better). Insight: physical auctions are second-price auctions!
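The second-price rule itself is tiny (a sketch; the bidder names and amounts are made up):

```python
def vickrey(bids):
    """Highest bidder wins but pays the second-highest bid."""
    ranked = sorted(bids.items(), key=lambda kv: kv[1], reverse=True)
    winner = ranked[0][0]
    price = ranked[1][1] if len(ranked) > 1 else 0.0
    return winner, price

winner, price = vickrey({"me": 2.00, "rival": 0.40})
# I win the phrase but pay only $0.40, so bidding my true value costs me nothing
```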

34
From http://www.stanford.edu/class/msande239/

35
Generalizing to multiple items (Each advertiser may bid on more than one query.) Two issues: 1. Allocation: who gets which item? Decided by maximal matching. 2. Pricing: what price do they pay? Decided by the opportunity cost introduced by the winner (which involves redoing the matching without the winning bid). The opportunity cost for a bidder is the total bids of all the other bidders who would have won had the first bidder not bid, minus the total bids of the other actual winning bidders.

36
Generalizing to multiple items Single item: Vickrey auction (truthful bidding is a dominant strategy). Multi-item: VCG auction (truthful bidding is a dominant strategy); the price paid by each buyer is his/her externality, i.e., how much others would have benefited if he/she weren't around. Multi-item: Generalized Second Price (GSP) auction (truthful bidding is NOT a dominant strategy); the price paid by the i-th bidder is the bid of the (i+1)-th bidder. Google (and most SEs) use GSP.
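The pricing difference can be sketched for a small position auction (two slots with different CTRs; the per-click VCG payment used here is the standard position-auction formula, and the bids and CTRs are invented):

```python
def gsp_prices(bids, ctrs):
    """GSP: the winner of slot i pays the (i+1)-th bid per click.
    bids must be sorted in descending order; one slot per bidder."""
    return [bids[i + 1] if i + 1 < len(bids) else 0.0
            for i in range(len(ctrs))]

def vcg_prices(bids, ctrs):
    """VCG per-click payments: each winner pays for the harm (externality)
    it imposes on the bidders placed below it."""
    k = len(ctrs)
    prices = []
    for i in range(k):
        harm = 0.0
        for j in range(i, k):
            c_next = ctrs[j + 1] if j + 1 < k else 0.0
            b_next = bids[j + 1] if j + 1 < len(bids) else 0.0
            harm += (ctrs[j] - c_next) * b_next
        prices.append(harm / ctrs[i])
    return prices

bids = [3.0, 2.0, 1.0]   # per-click bids, already sorted
ctrs = [0.2, 0.1]        # slot clickthrough rates
# GSP charges the top slot 2.0 per click; VCG charges only 1.5
```

Note how the GSP price for the top slot exceeds its VCG externality, which is one way to see why truthful bidding is not dominant under GSP.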

37
What we swept under the rug.. We handled user interests (CTR), search-engine interests (ranking, budgets), and advertiser interests (keyword bidding, auctions) as if they were more or less independent. But they are inter-dependent. The full solution needs to bring them all together, and it won't have too many neat properties that you can prove.

38
Ad Ranking: State of the Art Current schemes sort either by bid amount alone, or by bid amount × relevance. In both, ads are considered in isolation, ignoring mutual influences. Instead, we consider ads as a set, and rank based on the user's browsing model.

39
User's Browsing Model The user browses down starting at the first ad. At every ad he may: click the ad with its (residual) relevance probability; abandon browsing with some abandonment probability; or go down to the next ad. The process repeats for the ads below with a reduced view probability. If an ad is similar to the ads above it, its residual relevance decreases and its abandonment probability increases.

40
Mutual Influences Three manifestations of mutual influences on an ad: 1. Similar ads placed above it reduce the user's residual relevance of the ad. 2. Relevance of other ads placed above: the user may click on an ad above and never view this one. 3. Abandonment probability of other ads placed above: the user may abandon the search and never view this one.

41
Expected Profit Considering Ad Similarities Considering bids, residual relevances, abandonment probabilities, and similarities, one can write the expected profit from a set of n results. THEOREM: Ranking to maximize expected profit considering similarities between the results is NP-hard. The proof is a reduction of the independent-set problem to choosing the top-k ads considering similarities. Even worse, constant-ratio approximation algorithms are hard (unless NP = ZPP) for the diversity-ranking problem.

42
Expected Profit Considering the Other Two Mutual Influences (2 and 3) Dropping similarity, and hence replacing residual relevance by absolute relevance, ranking to maximize this expected profit becomes a sorting problem.

43
Comparison to current Ad Rankings Bid amount × relevance: assumes the abandonment probability is zero, i.e., that the user has infinite patience to go down the results until he finds the ad he wants. Sort by bid amount: assumes the abandonment probability is negatively proportional to relevance (so that their combination is a constant for all ads).

44
Generality of the Proposed Ranking The generalized ranking is based on utilities. For ads, utility = bid amount. For documents, utility = relevance, which recovers the popular relevance ranking.

45
Quantifying Expected Profit (Plot: expected profit vs. number of clicks; bid amounts uniform random, relevance uniform random, abandonment probability Zipf random with exponent 1.5.) The proposed strategy gives the maximum profit over the entire range; the difference in profit between RF and the competing strategies is significant (45.7% and 35.9% in the plotted settings). The bid-amount-only strategy becomes optimal only at the end of the range.

46
Publication and Recognition Optimal Ad-Ranking for Profit Maximization. Raju Balakrishnan, Subbarao Kambhampati. WebDB 2008. Yahoo! Research Key Scientific Challenges award for computational advertising, 2009-10.

47
Online algorithms Classic model of algorithms: you get to see the entire input, then compute some function of it; in this context, an "offline algorithm". Online algorithm: you get to see the input one piece at a time, and need to make irrevocable decisions along the way. Similar to data-stream models.

48
Example: Bipartite matching Girls: 1, 2, 3, 4. Boys: a, b, c, d.

49
Example: Bipartite matching M = {(1,a), (2,b), (3,d)} is a matching. Cardinality of the matching = |M| = 3.

50
Example: Bipartite matching M = {(1,c), (2,b), (3,d), (4,a)} is a perfect matching.

51
Matching Algorithm Problem: find a maximum-cardinality matching for a given bipartite graph (a perfect one if it exists). There is a polynomial-time offline algorithm (Hopcroft and Karp, 1973). But what if we don't have the entire graph upfront?

52
Online problem Initially, we are given the set of boys. In each round, one girl's choices are revealed. At that time, we have to decide either to pair the girl with a boy, or not to pair the girl with any boy. Example application: assigning tasks to servers.

53
Online problem (Example run: matching so far (1,a), (2,b), (3,d).)

54
Greedy algorithm Pair the new girl with any eligible boy; if there is none, don't pair the girl. How good is this algorithm?

55
Competitive Ratio For input I, suppose greedy produces matching M_greedy while an optimal matching is M_opt. Competitive ratio = min over all possible inputs I of |M_greedy| / |M_opt|.

56
Analyzing the greedy algorithm Consider the set G of girls matched in M_opt but not in M_greedy. Every boy adjacent to a girl in G must already be matched in M_greedy (otherwise greedy would have paired her). In M_opt the girls of G are matched to |G| distinct boys, so there are at least |G| such matched boys; otherwise the optimal algorithm could not have matched all the girls in G. Therefore |M_greedy| ≥ |G| ≥ |M_opt| − |M_greedy|, and so |M_greedy| / |M_opt| ≥ 1/2.
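The 1/2 bound is tight; a small adversarial instance shows greedy getting exactly half (a sketch; the preference lists are invented to force the bad outcome):

```python
def greedy_matching(arrivals):
    """Online greedy: pair each arriving girl with the first free listed boy."""
    taken, matching = set(), {}
    for girl, boys in arrivals:
        for boy in boys:
            if boy not in taken:
                taken.add(boy)
                matching[girl] = boy
                break
    return matching

# Girls 1 and 2 could have taken c and d, but greedy grabs a and b first,
# leaving girls 3 and 4 (who like only a and b) unmatched.
arrivals = [(1, ["a", "c"]), (2, ["b", "d"]), (3, ["a", "b"]), (4, ["a", "b"])]
m = greedy_matching(arrivals)
# len(m) == 2, versus the perfect matching {1: "c", 2: "d", 3: "a", 4: "b"}
```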

57
Worst-case scenario (Figure: greedy has matched (1,a) and (2,b); the remaining girls' only choices are already taken, so they go unmatched while other boys stay free, giving half the optimal cardinality.)
