Information Complexity Lower Bounds


1 Information Complexity Lower Bounds
Rotem Oshman, Princeton CCI. Based on: Bar-Yossef, Jayram, Kumar, Srinivasan '04; Barak, Braverman, Chen, Rao '10.

2 Communication Complexity
𝑓 𝑋,π‘Œ = ? 𝑋 π‘Œ Yao β€˜79, β€œSome complexity questions related to distributive computing”

3 Communication Complexity
Applications: circuit complexity, streaming algorithms, data structures, distributed computing, property testing, …

4 Deterministic Protocols
A protocol $\Pi$ specifies, at each point: which player speaks next, what that player says, and when to halt and what to output. Formally, $\Pi: \{0,1\}^* \to \{A, B, \bot\} \times \{0,1\}$: from what we've said so far, the protocol determines who speaks next ($A$ = Alice, $B$ = Bob, $\bot$ = halt) and what to say or output.

5 Randomized Protocols
Players can use randomness to decide what to say. Private randomness: each player has a separate source of random bits. Public randomness: both players can use the same random bits. Goal: for any $(X,Y)$, compute $f(X,Y)$ correctly with probability $\geq 1-\epsilon$. Communication complexity: the worst-case length of the transcript in any execution.

6 Randomness Can Help a Lot
Example: Equality$(X,Y)$. Input: $X, Y \in \{0,1\}^n$. Output: is $X = Y$? Trivial protocol: Alice sends $X$ to Bob ($n$ bits). For deterministic protocols, this is optimal!

7 Equality Lower Bound
[Figure: the communication matrix of Equality, with rows and columns $0^n, \dots, 1^n$; each of the $2^n$ diagonal entries needs its own monochromatic rectangle.] A deterministic protocol induces #rectangles $\leq 2^{|\text{transcript}|}$, so the transcript must have length $\geq n$.

8 Randomized Protocol
Protocol with public randomness: select a random $Z \in \{0,1\}^n$; Alice sends $\langle X, Z \rangle = \sum_{i=1}^n X_i Z_i \bmod 2$; Bob accepts iff $\langle Y, Z \rangle = \langle X, Z \rangle$. If $X = Y$: always accept. If $X \neq Y$: $\langle X, Z \rangle + \langle Y, Z \rangle \bmod 2 = \langle X + Y, Z \rangle \bmod 2$, and $X + Y$ is a non-zero vector, so Bob rejects with probability $1/2$.
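A minimal runnable sketch of this protocol, with the standard repetition trick added to push the error below $2^{-k}$ (function names and the loop are mine):

```python
import random

def equality_protocol(x, y, repetitions=40):
    """Public-coin Equality: each round, both players see the same random z;
    Alice sends <x,z> mod 2 (one bit) and Bob compares it with <y,z> mod 2."""
    assert len(x) == len(y)
    for _ in range(repetitions):
        z = [random.randint(0, 1) for _ in range(len(x))]  # public randomness
        alice_bit = sum(xi & zi for xi, zi in zip(x, z)) % 2
        bob_bit = sum(yi & zi for yi, zi in zip(y, z)) % 2
        if alice_bit != bob_bit:
            return False  # a differing parity proves x != y
    # If x != y, each round rejects with probability 1/2, so accepting here
    # is wrong with probability at most 2**-repetitions.
    return True
```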

9 Set Disjointness
Input: $X, Y \subseteq \{1, \dots, n\}$. Output: is $X \cap Y = \emptyset$?
Theorem [Kalyanasundaram, Schnitger '92; Razborov '92]: the randomized communication complexity is $\Omega(n)$. This is easy to see for deterministic protocols. Today we'll see a proof by Bar-Yossef, Jayram, Kumar, Srinivasan '04.

10 Application: Streaming Lower Bounds
Streaming algorithm: the data flows past the algorithm once; how much space is required to approximate $f(\text{data})$? Example: how many distinct items are in the data? Lower bound by reduction from Disjointness [Alon, Matias, Szegedy '99].

11 Reduction from Disjointness:
Fix a streaming algorithm for Distinct Elements with space $S$ and universe size $n$. Construct a protocol for Disjointness on $n$ elements: Alice holds $X = \{x_1, \dots, x_k\}$, Bob holds $Y = \{y_1, \dots, y_\ell\}$. Alice streams $X$ through the algorithm, then sends its state together with $|X|$ (a message of $S + \log n$ bits); Bob continues the stream with $Y$. Then $X \cap Y = \emptyset \iff$ the number of distinct elements in $X \cup Y$ is $|X| + |Y|$.
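A sketch of the reduction in code, assuming a hypothetical streaming-algorithm object with `update` and `distinct_count` methods (this interface is illustrative, not a real library API):

```python
def disjointness_from_streaming(make_sketch, X, Y):
    """Build a Disjointness protocol from a Distinct Elements algorithm.
    Alice feeds X into the algorithm, then ships its state plus |X| to Bob
    (S + log n bits); Bob resumes on Y and checks the distinct count."""
    alg = make_sketch()
    for x in X:
        alg.update(x)            # Alice's half of the stream
    # --- the one message: the algorithm's S-bit state and |X| ---
    for y in Y:
        alg.update(y)            # Bob's half of the stream
    # X and Y are disjoint iff no element was streamed twice.
    return alg.distinct_count() == len(X) + len(Y)
```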

12 Application 2: KW Games
Circuit depth lower bounds: given a Boolean circuit over $\wedge$/$\vee$ gates with inputs $x_1, \dots, x_n$ computing $f(x_1, \dots, x_n)$, how deep does the circuit need to be?

13 Application 2: KW Games
Karchmer-Wigderson '93, Karchmer-Raz-Wigderson '94: Alice gets $X$ with $f(X) = 0$, Bob gets $Y$ with $f(Y) = 1$, and their goal is to find an index $i$ such that $X_i \neq Y_i$.

14 Application 2: KW Games
Claim: if $KW_f$ has deterministic CC $\geq d$, then $f$ requires circuit depth $\geq d$. Equivalently, a circuit of depth $d$ gives a protocol of length $d$: the players walk down the circuit from the root, maintaining the invariant that the current gate evaluates to $0$ on $X$ and $1$ on $Y$. At an $\wedge$ gate Alice names a child that evaluates to $0$ on $X$; at an $\vee$ gate Bob names a child that evaluates to $1$ on $Y$; the leaf reached is a variable $i$ with $X_i \neq Y_i$ (see the sketch below).
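A toy sketch of that walk, using a hypothetical tuple encoding of circuits (all names and the encoding are mine):

```python
# Gates are nested tuples: ('VAR', i), ('AND', g1, g2), or ('OR', g1, g2).
def evaluate(gate, inp):
    if gate[0] == 'VAR':
        return inp[gate[1]]
    left, right = evaluate(gate[1], inp), evaluate(gate[2], inp)
    return left & right if gate[0] == 'AND' else left | right

def kw_protocol(gate, X, Y):
    """Walk from the root, keeping the invariant that the current gate
    evaluates to 0 on X and 1 on Y. Each step communicates one bit
    (which child), so depth d yields a length-d protocol."""
    while gate[0] != 'VAR':
        if gate[0] == 'AND':
            # Alice: the gate is 0 on X, so some child is 0 on X.
            gate = gate[1] if evaluate(gate[1], X) == 0 else gate[2]
        else:
            # Bob: the gate is 1 on Y, so some child is 1 on Y.
            gate = gate[1] if evaluate(gate[1], Y) == 1 else gate[2]
    return gate[1]  # the invariant forces X_i = 0 and Y_i = 1 here

# Example: f = (x0 AND x1) OR x2, X = (1,0,0) gives f=0, Y = (1,1,0) gives f=1.
f = ('OR', ('AND', ('VAR', 0), ('VAR', 1)), ('VAR', 2))
print(kw_protocol(f, (1, 0, 0), (1, 1, 0)))  # 1: X_1 != Y_1
```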

15 Information-Theoretic Lower Bound on Set Disjointness

16 Some Basic Concepts from Info Theory
Entropy of a random variable: $H(X) = -\sum_x \Pr[X=x] \log \Pr[X=x]$. Important properties: $H(X) \geq 0$; $H(X) = 0 \Rightarrow X$ is deterministic; $H(X)$ is the expected number of bits needed to encode $X$.
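As a small code sketch, for a distribution given by its probability vector:

```python
from math import log2

def entropy(probs):
    """H(X) = -sum_x Pr[X=x] * log2 Pr[X=x]; zero-probability
    outcomes contribute nothing."""
    return -sum(p * log2(p) for p in probs if p > 0)

print(entropy([0.5, 0.5]))  # 1.0: a fair coin takes one bit to encode
print(entropy([1.0]))       # zero (printed as -0.0): deterministic X
```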

17 Some Basic Concepts from Info Theory
Conditional entropy: $H(X \mid Y) = E_y[H(X \mid Y = y)]$. Important properties: $H(X|Y) \leq H(X)$; $H(X|Y) = H(X) \Rightarrow X, Y$ are independent. Example: let $X, Z \sim \text{Bernoulli}(1/2)$ be independent, so $H(X) = -\frac{1}{2}\log\frac{1}{2} - \frac{1}{2}\log\frac{1}{2} = 1$. Let $Y = X \oplus Z$ (if $Z=0$ then $Y=X$; if $Z=1$ then $Y=1-X$). Then $H(X \mid Y, Z) = \frac{1}{2}H(X|X) + \frac{1}{2}H(X|1-X) = 0$, yet $H(X \mid Y) = 1$.
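A brute-force check of that example (helper names are mine):

```python
from collections import defaultdict
from math import log2

# The slide's example: X, Z independent uniform bits, Y = X xor Z.
outcomes = [(x, z, x ^ z) for x in (0, 1) for z in (0, 1)]  # all equally likely

def cond_entropy(pairs):
    """H(X | C) for a list of equally likely (x, c) outcome pairs."""
    groups = defaultdict(list)
    for x, c in pairs:
        groups[c].append(x)
    h = 0.0
    for xs in groups.values():
        for v in set(xs):
            p = xs.count(v) / len(xs)
            h -= (len(xs) / len(pairs)) * p * log2(p)
    return h

print(cond_entropy([(x, y) for x, z, y in outcomes]))       # H(X|Y)   = 1.0
print(cond_entropy([(x, (y, z)) for x, z, y in outcomes]))  # H(X|Y,Z) = 0.0
```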

18 Some Basic Concepts from Info Theory
Mutual information: $I(X;Y) = H(X) - H(X|Y) = H(Y) - H(Y|X)$. Conditional mutual information: $I(X;Y \mid Z) = H(X|Z) - H(X \mid Y,Z) = H(Y|Z) - H(Y \mid X,Z)$. Important properties: $I(X;Y) \geq 0$; $I(X;Y) = 0 \Rightarrow X, Y$ are independent.
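A small sketch computing it from a joint distribution, via the equivalent identity $I(X;Y) = H(X) + H(Y) - H(X,Y)$:

```python
from collections import defaultdict
from math import log2

def H(probs):
    return -sum(p * log2(p) for p in probs if p > 0)

def mutual_information(joint):
    """I(X;Y) = H(X) + H(Y) - H(X,Y), for a joint law {(x, y): prob};
    this equals the slide's H(X) - H(X|Y)."""
    px, py = defaultdict(float), defaultdict(float)
    for (x, y), p in joint.items():
        px[x] += p
        py[y] += p
    return H(px.values()) + H(py.values()) - H(joint.values())

print(mutual_information({(x, y): 0.25 for x in (0, 1) for y in (0, 1)}))  # 0.0
print(mutual_information({(0, 0): 0.5, (1, 1): 0.5}))                      # 1.0
```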

19 Some Basic Concepts from Info Theory
Chain rule for mutual information: $I(X_1, X_2; Y) = I(X_1; Y) + I(X_2; Y \mid X_1)$. More generally, $I(X_1, \dots, X_k; Y) = \sum_{i=1}^k I(X_i; Y \mid X_1, \dots, X_{i-1})$.

20 Information Cost of Protocols
Fix an input distribution $\mu$ on $(X, Y)$. Given a protocol $\Pi$, let $\Pi$ also denote the distribution of $\Pi$'s transcript. Information cost of $\Pi$: $IC(\Pi) = I(\Pi; Y \mid X) + I(\Pi; X \mid Y)$. Information cost of a function $f$: $IC_\epsilon(f) = \inf_{\Pi \text{ solves } f \text{ with error } \leq \epsilon} IC(\Pi)$.

21 Information Cost of Protocols
Important property: $IC(\Pi) \leq |\Pi|$. Proof: by induction. Let $\Pi = \Pi_1 \dots \Pi_t$; we show that for all $r \leq t$, $I(\Pi_{\leq r}; Y \mid X) + I(\Pi_{\leq r}; X \mid Y) \leq r$. By the chain rule, $I(\Pi_{\leq r}; Y \mid X) + I(\Pi_{\leq r}; X \mid Y)$ (what we know after $r$ rounds) equals $I(\Pi_{<r}; Y \mid X) + I(\Pi_{<r}; X \mid Y)$ (what we knew after $r-1$ rounds) plus $I(\Pi_r; Y \mid X, \Pi_{<r}) + I(\Pi_r; X \mid Y, \Pi_{<r})$ (what we learn in round $r$, given what we already know).

22 Information vs. Communication
Want: $I(\Pi_r; Y \mid X, \Pi_{<r}) + I(\Pi_r; X \mid Y, \Pi_{<r}) \leq 1$. Suppose $\Pi_r$ is sent by Alice. What does Alice learn? $\Pi_r$ is a function of $\Pi_{<r}$, $X$, and Alice's private randomness, so $I(\Pi_r; Y \mid X, \Pi_{<r}) = 0$. What does Bob learn? $I(\Pi_r; X \mid Y, \Pi_{<r}) \leq |\Pi_r| = 1$.

23 Information vs. Communication
Important property: $IC(\Pi) \leq |\Pi|$. So a lower bound on information cost gives a lower bound on communication complexity. In fact, IC lower bounds are the most powerful lower-bound technique we know.

24 Information Complexity of Disj.
Disjointness: is $X \cap Y = \emptyset$? $\text{Disj}(X,Y) = \bigvee_{i=1}^n (X_i \wedge Y_i)$. Strategy: for some "hard distribution" $\mu$, (1) direct sum: $IC_{\mu^n}(\text{Disj}) \geq n \cdot IC_\mu(\text{And})$; (2) prove that $IC_\mu(\text{And}) \geq \Omega(1)$.

25 Hard Distribution for Disjointness
For each coordinate $i \in [n]$, the pair $(X_i, Y_i)$ is uniform over $\{(0,0), (0,1), (1,0)\}$, each with probability $1/3$; the pair $(1,1)$ has probability $0$.
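A sampler for the hard distribution $\mu^n$ (a sketch; note that a sample never has $X_i = Y_i = 1$, so $X$ and $Y$ are always disjoint under $\mu^n$):

```python
import random

def sample_mu_n(n, rng=random):
    """Sample (X, Y) from mu^n: each coordinate pair is uniform over
    {(0,0), (0,1), (1,0)}; the pair (1,1) never occurs."""
    pairs = [rng.choice([(0, 0), (0, 1), (1, 0)]) for _ in range(n)]
    X = [u for u, _ in pairs]
    Y = [v for _, v in pairs]
    return X, Y
```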

26 𝐼 𝐢 πœ‡ 𝑛 Disj β‰₯𝑛⋅𝐼 𝐢 πœ‡ (And) Let Ξ  be a protocol for Disj on 𝑋,π‘Œβˆˆ 0,1 𝑛
Construct Ξ  β€² for And as follows: Alice and Bob get inputs π‘ˆ,π‘‰βˆˆ 0,1 Choose a random coordinate π‘–βˆˆ 𝑛 , set 𝑋 𝑖 =π‘ˆ, π‘Œ 𝑖 =𝑉 Sample 𝑋 βˆ’π‘– , π‘Œ βˆ’π‘– and run Ξ  For each 𝑗≠𝑖, 𝑋 𝑗 ∧ π‘Œ 𝑗 =0 β‡’ Disj 𝑋,π‘Œ =And 𝑋 𝑖 , π‘Œ 𝑖 𝑋 π‘Œ π‘ˆ 𝑉

27 𝐼 𝐢 πœ‡ 𝑛 Disj β‰₯𝑛⋅𝐼 𝐢 πœ‡ (And) Let Ξ  be a protocol for Disj on 𝑋,π‘Œβˆˆ 0,1 𝑛
Construct Ξ  β€² for And as follows: Alice and Bob get inputs π‘ˆ,π‘‰βˆˆ 0,1 Choose a random coordinate π‘–βˆˆ 𝑛 , set 𝑋 𝑖 =π‘ˆ, π‘Œ 𝑖 =𝑉 Bad idea: publicly sample 𝑋 βˆ’π‘– , π‘Œ βˆ’π‘– 𝑋 Suppose in Ξ , Alice sends 𝑋 1 βŠ•β€¦βŠ• 𝑋 𝑛 . In Ξ , Bob learns one bit β‡’ in Ξ  β€² he should learn 1/𝑛 bit But if 𝑋 βˆ’π‘– is public Bob learns 1 bit about π‘ˆ! π‘ˆ π‘Œ 𝑉

28 𝐼 𝐢 πœ‡ 𝑛 Disj β‰₯𝑛⋅𝐼 𝐢 πœ‡ (And) Let Ξ  be a protocol for Disj on 𝑋,π‘Œβˆˆ 0,1 𝑛
Construct Ξ  β€² for And as follows: Alice and Bob get inputs π‘ˆ,π‘‰βˆˆ 0,1 Choose a random coordinate π‘–βˆˆ 𝑛 , set 𝑋 𝑖 =π‘ˆ, π‘Œ 𝑖 =𝑉 Another bad idea: publicly sample 𝑋 βˆ’π‘– , Bob privately samples π‘Œ βˆ’π‘– given 𝑋 βˆ’π‘– But the players can’t sample 𝑋 βˆ’π‘– , π‘Œ βˆ’π‘– independently…

29 𝐼 𝐢 πœ‡ 𝑛 Disj β‰₯𝑛⋅𝐼 𝐢 πœ‡ (And) Let Ξ  be a protocol for Disj on 𝑋,π‘Œβˆˆ 0,1 𝑛
Construct Ξ  β€² for And as follows: Alice and Bob get inputs π‘ˆ,π‘‰βˆˆ 0,1 Choose a random coordinate π‘–βˆˆ 𝑛 , set 𝑋 𝑖 =π‘ˆ, π‘Œ 𝑖 =𝑉 Publicly sample 𝑋 1 ,…, 𝑋 π‘–βˆ’1 Privately sample 𝑋 (𝑖+1) ,…, 𝑋 𝑛 𝑋 π‘ˆ Privately sample π‘Œ 1 ,…, π‘Œ π‘–βˆ’1 Publicly sample π‘Œ 𝑖+1 ,…, π‘Œ 𝑛 π‘Œ 𝑉
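A sketch of this sampling scheme in code; the shared seed stands in for the public coins, and the interface is mine:

```python
import random

def marginal_bit(rng):
    """Marginal of one coordinate under mu: Pr[1] = 1/3."""
    return 1 if rng.random() < 1/3 else 0

def partner_bit(b, rng):
    """Conditional law of the partner coordinate under mu: (1,1) has
    probability 0, so a 1 forces 0; next to a 0 the partner is uniform."""
    return 0 if b == 1 else rng.randint(0, 1)

def embed_inputs(i, U, V, n, shared_seed):
    """Both players run this with the same shared_seed (public coins) and
    their own private coins; Alice keeps X, Bob keeps Y. For j < i, X_j is
    public and Y_j is Bob's private sample given X_j; for j > i the roles
    are flipped."""
    public = random.Random(shared_seed)
    alice_priv, bob_priv = random.Random(), random.Random()
    X, Y = [None] * n, [None] * n
    X[i], Y[i] = U, V
    for j in range(n):
        if j < i:
            X[j] = marginal_bit(public)           # both players know X_j
            Y[j] = partner_bit(X[j], bob_priv)    # only Bob knows Y_j
        elif j > i:
            Y[j] = marginal_bit(public)           # both players know Y_j
            X[j] = partner_bit(Y[j], alice_priv)  # only Alice knows X_j
    return X, Y
```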

30 Direct Sum Theorem
The transcript of $\Pi'$ is $(i, X_{<i}, Y_{>i}, \Pi)$. Need to show: $I_\mu(\Pi'; V \mid U) + I_\mu(\Pi'; U \mid V) \leq \big(I_{\mu^n}(\Pi; Y \mid X) + I_{\mu^n}(\Pi; X \mid Y)\big)/n$. For the first term:
$I_\mu(\Pi'; V \mid U) = I_{\mu^n}(i, X_{<i}, Y_{>i}, \Pi;\, Y_i \mid X_i) = I_{\mu^n}(\Pi;\, Y_i \mid X_{\leq i}, Y_{>i}, i) + I_{\mu^n}(i, X_{<i}, Y_{>i};\, Y_i \mid X_i)$, where the second summand is $0$,
$\leq I(\Pi, X_{>i};\, Y_i \mid X_{\leq i}, Y_{>i}, i) = I(X_{>i};\, Y_i \mid X_{\leq i}, Y_{>i}, i) + I(\Pi;\, Y_i \mid X, Y_{>i}, i)$, where the first summand is $0$,
$= (1/n) \sum_{i=1}^n I(\Pi;\, Y_i \mid X, Y_{>i}) = I(\Pi; Y \mid X)/n$ by the chain rule. The second term is symmetric.

31 Information Complexity of Disj.
Disjointness: is $X \cap Y = \emptyset$? $\text{Disj}(X,Y) = \bigvee_{i=1}^n (X_i \wedge Y_i)$. Strategy: for some "hard distribution" $\mu$, (1) direct sum: $IC_{\mu^n}(\text{Disj}) \geq n \cdot IC_\mu(\text{And})$ ✓; (2) prove that $IC_\mu(\text{And}) \geq \Omega(1)$.

32 Hardness of And
$IC(\text{And}) = I(\Pi; Y \mid X) + I(\Pi; X \mid Y) \geq \Omega(1)$. Here $I(\Pi; Y \mid X) = \frac{2}{3} I(\Pi; Y \mid X=0) + \frac{1}{3} I(\Pi; Y \mid X=1)$, and the $X=1$ term is $0$ (given $X_i=1$, $Y_i=0$ under $\mu$); similarly $I(\Pi; X \mid Y) = \frac{2}{3} I(\Pi; X \mid Y=0)$. [Figure: the support points $00, 01, 10$, each with probability $1/3$; the transcript on $11$ should be "very different".]

33 Hellinger Distance
$h^2(P, Q) = 1 - \sum_\omega \sqrt{P(\omega)\, Q(\omega)}$. Examples: $h^2(P, P) = 1 - \sum_\omega \sqrt{P(\omega)}^2 = 1 - \sum_\omega P(\omega) = 0$; if $P, Q$ have disjoint supports, $h(P, Q) = 1$.
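A direct transcription of the definition (distributions as dicts; names are mine):

```python
from math import sqrt

def hellinger_sq(p, q):
    """h^2(P, Q) = 1 - sum_w sqrt(P(w) * Q(w)) over the joint support."""
    support = set(p) | set(q)
    return 1 - sum(sqrt(p.get(w, 0) * q.get(w, 0)) for w in support)

print(hellinger_sq({'a': 1.0}, {'a': 1.0}))  # 0.0: identical distributions
print(hellinger_sq({'a': 1.0}, {'b': 1.0}))  # 1.0: disjoint supports
```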

34 Hellinger Distance
Hellinger distance is a metric: $h(P,Q) \geq 0$, with equality iff $P = Q$; $h(P,Q) = h(Q,P)$; and the triangle inequality $h(P,Q) \leq h(P,R) + h(R,Q)$ holds.

35 Hellinger Distance
If for some $\omega$ we have $|P(\omega) - Q(\omega)| = \delta$, then $h^2(P, Q) \geq \delta^2/2$. [Figure: applied to the And square, correctness on inputs $00$ and $11$ makes the acceptance probabilities of $\Pi_{00}$ and $\Pi_{11}$ differ by $\geq 1 - 2\epsilon$, so $h(\Pi_{00}, \Pi_{11}) = \Omega(1)$.]

36 Hellinger Distance vs. Mutual Info
Let $P_0, P_1$ be two distributions. Select $Z$ by choosing $J \sim \text{Bernoulli}(1/2)$ and then drawing $Z \sim P_J$. Then $I(Z; J) \geq h^2(P_0, P_1)$. Applied to And: $I(\Pi; Y \mid X=0) \geq h^2(\Pi_{00}, \Pi_{01})$ and $I(\Pi; X \mid Y=0) \geq h^2(\Pi_{00}, \Pi_{10})$.
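A numeric sanity check of this bound: for $J$ a fair coin, $I(Z;J) = H\big(\frac{P_0+P_1}{2}\big) - \frac{H(P_0)+H(P_1)}{2}$, which the sketch below compares against $h^2(P_0, P_1)$ over a common finite support:

```python
from math import log2, sqrt

def H(probs):
    return -sum(p * log2(p) for p in probs if p > 0)

def mi_vs_hellinger(p0, p1):
    """J ~ Bernoulli(1/2), Z ~ P_J, with P0, P1 probability vectors over
    a common support. Returns (I(Z;J), h^2(P0, P1)); the slide's bound
    says the first is at least the second."""
    mix = [(a + b) / 2 for a, b in zip(p0, p1)]
    mi = H(mix) - (H(p0) + H(p1)) / 2
    h2 = 1 - sum(sqrt(a * b) for a, b in zip(p0, p1))
    return mi, h2

print(mi_vs_hellinger([0.5, 0.5], [0.9, 0.1]))  # I(Z;J) >= h^2 holds
print(mi_vs_hellinger([1.0, 0.0], [0.0, 1.0]))  # (1.0, 1.0): equality here
```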

37 Hardness of And
[Figure: the input square $00, 01, 10, 11$, with the support points $00, 01, 10$ each having probability $1/3$. On the pair $\{00, 01\}$ the transcript looks the same for Alice until Bob acts differently; on $\{00, 10\}$ it looks the same for Bob until Alice acts differently; and correctness forces $h(\Pi_{00}, \Pi_{11})$ to be large.]

38 "Cut-n-Paste Lemma"
$h(\Pi_{00}, \Pi_{11}) = h(\Pi_{01}, \Pi_{10})$. Recall: $h^2(\Pi_{XY}, \Pi_{X'Y'}) = 1 - \sum_t \sqrt{\Pi_{XY}(t)\, \Pi_{X'Y'}(t)}$. It is enough to show that we can write $\Pi_{XY}(t) = q_A(t, X) \cdot q_B(t, Y)$: then
$h^2(\Pi_{00}, \Pi_{11}) = 1 - \sum_t \sqrt{q_A(t,0)\, q_B(t,0)\, q_A(t,1)\, q_B(t,1)} = 1 - \sum_t \sqrt{q_A(t,0)\, q_B(t,1)}\, \sqrt{q_A(t,1)\, q_B(t,0)} = h^2(\Pi_{01}, \Pi_{10})$.

39 "Cut-n-Paste Lemma"
Claim: we can write $\Pi_{XY}(t) = q_A(t, X) \cdot q_B(t, Y)$. Proof: $\Pi$ induces a distribution on "partial transcripts" of each length $k$: let $\Pi^k_{XY}(t)$ be the probability that the first $k$ bits are $t$. By induction we show $\Pi^k_{XY}(t) = q^k_A(t, X) \cdot q^k_B(t, Y)$. Base case ($k = 0$): $\Pi^0_{XY}(\epsilon) = 1$; set $q^0_A(\epsilon, X) = q^0_B(\epsilon, Y) = 1$.

40 "Cut-n-Paste Lemma"
Step: $\Pi^{k+1}_{XY}(t) = \Pi^k_{XY}(t_{\leq k}) \cdot \Pr[\text{next bit} = t_{k+1}]$. Suppose that after $t_{\leq k}$ it is Alice's turn to speak. What Alice says depends only on her input, her private randomness, and the transcript so far $t_{\leq k}$. So $\Pr[\text{next bit} = t_{k+1}] = f(t_{\leq k}, X, t_{k+1}) = f(t, X)$. Set $q^{k+1}_A(t, X) = q^k_A(t_{\leq k}, X) \cdot f(t, X)$ and $q^{k+1}_B(t, Y) = q^k_B(t_{\leq k}, Y)$.

41 Hardness of And
$IC(\text{And}) = I(\Pi; Y \mid X) + I(\Pi; X \mid Y) = \frac{2}{3} I(\Pi; Y \mid X=0) + \frac{2}{3} I(\Pi; X \mid Y=0)$
$\geq \text{const} \cdot \big(h^2(\Pi_{00}, \Pi_{01}) + h^2(\Pi_{00}, \Pi_{10})\big)$
$\geq \text{const}' \cdot \big(h(\Pi_{00}, \Pi_{01}) + h(\Pi_{00}, \Pi_{10})\big)^2$
$\geq \text{const}' \cdot h^2(\Pi_{01}, \Pi_{10})$ (triangle inequality)
$= \text{const}' \cdot h^2(\Pi_{00}, \Pi_{11})$ (cut-n-paste)
$\geq \Omega(1)$ (correctness: $\Pi_{00}$ and $\Pi_{11}$ must produce different outputs).

42 Multi-Player Communication Complexity

43 The Coordinator Model
$f(X_1, \dots, X_k) = ?$ [Figure: $k$ sites, each holding an $n$-bit input $X_i$, all communicating with a central coordinator.]

44 Multi-Party Set Disjointness
Input: 𝑋 1 ,…, 𝑋 π‘˜ βŠ† 𝑛 Output: is β‹‚ 𝑋 𝑖 =βˆ…? Braverman,Ellen,O.,Pitassi,Vaikuntanathan’13: lower bound of Ξ© π‘›π‘˜ bits

45 Reduction from Disj to graph connectivity
Given 𝑋 1 ,…, 𝑋 π‘˜ , we want to Choose vertices 𝑉 Design inputs 𝐸 1 ,…, 𝐸 π‘˜ such that 𝐺 𝑉, 𝐸 1 βˆͺ…βˆͺ 𝐸 π‘˜ 𝑉, 𝐸 1 βˆͺ…βˆͺ 𝐸 π‘˜ is connected iff β‹‚ 𝑋 𝑖 =βˆ…

46 Reduction from Disj to graph connectivity
𝑝 1 𝑝 2 𝑝 π‘˜ (Players) (Elements) 1 2 3 4 5 6 𝑋 𝑖 𝑛 βˆ– ⋃ 𝑋 𝑖 input graph connected ⇔ ⋃ 𝑋 𝑖 β‰  𝑛 β‹‚ 𝑋 𝑖 β‰ βˆ…

47 Other Stuff: distributed computing.

48 Other Stuff: compressing protocols down to their information cost, number-on-forehead lower bounds, open questions in communication complexity.

