Randomized Algorithms Tutorial 1: Hints for Homework 1


Randomized Algorithms Tutorial 1: Hints for Homework 1

Outline:
- Lookup table problem (Exercise 1.18)
- Convex function (Exercise 2.10)
- Hints for Homework 1

Lookup table problem

F: {0, …, n-1} → {0, …, m-1}. For any x and y with 0 ≤ x, y ≤ n-1: F((x + y) mod n) = (F(x) + F(y)) mod m. [Figure: a lookup table of x and F(x) for the example n = 10, m = 7.]

Lookup table problem. Now, someone has changed exactly 1/5 of the lookup table entries. [Figure: the same table of x and F(x), with 1/5 of the entries corrupted.]

Lookup table problem (a). Give an algorithm that, on any input z, outputs the correct value F(z) with probability at least 1/2 (it needs to work for every z).
1. Pick x uniformly at random from {0, …, n-1}.
2. Compute y = (z - x) mod n.
3. Output (F(x) + F(y)) mod m as our computed value for F(z).
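A minimal Python sketch of this one-query strategy, assuming the (possibly corrupted) table is stored in a list T of length n; the names T, n, m, and lookup_once are illustrative, not part of the exercise.

    import random

    def lookup_once(T, z, n, m):
        # Pick x uniformly at random from {0, ..., n-1}.
        x = random.randrange(n)
        # Choose y so that (x + y) mod n = z.
        y = (z - x) % n
        # If neither table entry was corrupted, this equals F(z).
        return (T[x] + T[y]) % m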

Lookup table problem. Pr(the output for F(z) is correct) ≥ Pr(neither F(x) nor F(y) was modified) = 1 - Pr(F(x) or F(y) was modified) ≥ 1 - (Pr(F(x) was modified) + Pr(F(y) was modified)) = 1 - (1/5 + 1/5) = 3/5, since x and y = (z - x) mod n are each uniformly distributed over the table, so each is modified with probability exactly 1/5.

Lookup table problem (b). Can you improve the probability of returning the correct value by repeating your algorithm three times?
1. Run the algorithm from part (a) three times independently, each time picking a fresh random x.
2. Compute the three resulting candidate values for F(z).
3. If two or more of the candidate values are equal, output that value. Otherwise, output any one of them.

Lookup table problem. Pr(the output for F(z) is correct) ≥ Pr(at least two of the three candidate values are correct) = Pr(exactly two are correct) + Pr(all three are correct) ≥ 3·(3/5)·(3/5)·(2/5) + (3/5)·(3/5)·(3/5) = 54/125 + 27/125 = 81/125 (each repetition is independently correct with probability at least 3/5, and 3p²(1-p) + p³ is increasing in p).
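A sketch of the three-repetition majority vote, reusing the hypothetical lookup_once above; Counter simply finds a value that appears at least twice, if any.

    from collections import Counter

    def lookup_majority(T, z, n, m, trials=3):
        # Run the one-query algorithm independently three times.
        answers = [lookup_once(T, z, n, m) for _ in range(trials)]
        value, count = Counter(answers).most_common(1)[0]
        # Output a value appearing at least twice; otherwise any answer.
        return value if count >= 2 else answers[0]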

Convex function

Show by induction that if f: R → R is convex then, for any x_1, x_2, …, x_n and y_1, y_2, …, y_n with y_i ≥ 0 for all i and y_1 + y_2 + … + y_n = 1, f(y_1 x_1 + y_2 x_2 + … + y_n x_n) ≤ y_1 f(x_1) + y_2 f(x_2) + … + y_n f(x_n).
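A sketch of the induction step in LaTeX, under the reading of the exercise stated above; write q = y_1 + … + y_{n-1} = 1 - y_n and assume q > 0 (otherwise the claim is trivial):

    \begin{align*}
    f\Bigl(\sum_{i=1}^{n} y_i x_i\Bigr)
      &= f\Bigl(q \sum_{i=1}^{n-1} \tfrac{y_i}{q}\, x_i + y_n x_n\Bigr)\\
      &\le q\, f\Bigl(\sum_{i=1}^{n-1} \tfrac{y_i}{q}\, x_i\Bigr) + y_n f(x_n)
         && \text{(convexity for two points)}\\
      &\le q \sum_{i=1}^{n-1} \tfrac{y_i}{q}\, f(x_i) + y_n f(x_n)
         && \text{(induction hypothesis)}\\
      &= \sum_{i=1}^{n} y_i f(x_i).
    \end{align*}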

Convex function. [Figure: graph of a convex f; for x_1 < x_2, the point (y_1 x_1 + y_2 x_2, y_1 f(x_1) + y_2 f(x_2)) lies on the chord between (x_1, f(x_1)) and (x_2, f(x_2)), on or above the graph.]

Convex function

Use the former inequality to prove that if f: R → R is convex then E[f(X)] ≥ f(E[X]) (Jensen's inequality).
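A short sketch of how the finite inequality gives this, assuming X is discrete and takes finitely many values x_1, …, x_n with probabilities p_1, …, p_n:

    \[
    f(\mathrm{E}[X])
      = f\Bigl(\sum_{i=1}^{n} p_i x_i\Bigr)
      \le \sum_{i=1}^{n} p_i f(x_i)
      = \mathrm{E}[f(X)]
    \]

where the middle step applies the former inequality with y_i = p_i.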

Hints for Homework 1

Homework 1-1. [Figure: Box 1 and Box 2 containing balls in quantities a, b, c, d; step 1 and step 2.] What is the probability?

Homework 1-1. Hint: there are two cases:
- The ball moved from one box to the other is black.
- The ball moved from one box to the other is white.
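The two-case split is an instance of the law of total probability; writing B for the event that the transferred ball is black and E for the event asked about (a sketch, since the exact box contents are only in the figure):

    \[
    \Pr(E) = \Pr(E \mid B)\,\Pr(B) + \Pr(E \mid \overline{B})\,\Pr(\overline{B}).
    \]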

Homework 1-2. [Figure: a box with balls in quantities a and b; step 1, step 2; Pr(…) = 1/3.] The probability that we get a white ball the first time is a/(a+b).

Homework 1-3. Give an example of three random events X, Y, Z for which any pair are independent but all three are not mutually independent. Hint: two dice, with X = "the sum is even". You may use … to check your answer.
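A small Python check of one standard construction matching the hint (two fair dice); the particular events chosen here, X = "first die is even", Y = "second die is even", Z = "the sum is even", are an assumption for illustration, not necessarily the intended answer.

    from itertools import product

    # Sample space: the 36 equally likely outcomes of two fair dice.
    omega = list(product(range(1, 7), repeat=2))

    def prob(event):
        return sum(1 for w in omega if event(w)) / len(omega)

    X = lambda w: w[0] % 2 == 0            # first die is even
    Y = lambda w: w[1] % 2 == 0            # second die is even
    Z = lambda w: (w[0] + w[1]) % 2 == 0   # the sum is even

    # Each pair is independent: P(E and F) = 1/4 = P(E) * P(F).
    print(prob(lambda w: X(w) and Y(w)), prob(X) * prob(Y))
    print(prob(lambda w: X(w) and Z(w)), prob(X) * prob(Z))
    print(prob(lambda w: Y(w) and Z(w)), prob(Y) * prob(Z))
    # Not mutually independent: P(X and Y and Z) = 1/4, not 1/8.
    print(prob(lambda w: X(w) and Y(w) and Z(w)), prob(X) * prob(Y) * prob(Z))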

Homework 1-4. Show that there can be at most n(n-1)/2 distinct min-cut sets. Hint: [Theorem 1.8] The algorithm outputs a min-cut set with probability at least 2/(n(n-1)).
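Theorem 1.8 refers to the randomized edge-contraction (min-cut) algorithm; a rough Python sketch of one run, where the function name and edge-list representation are illustrative assumptions:

    import random

    def contract_once(edges, n):
        # One run of the contraction algorithm on a connected multigraph
        # with vertices 0..n-1, given as a list of edges (u, v), u != v.
        label = list(range(n))            # super-vertex containing each vertex
        remaining = n
        edges = list(edges)
        while remaining > 2:
            u, v = random.choice(edges)   # a uniformly random remaining edge
            lu, lv = label[u], label[v]
            label = [lu if l == lv else l for l in label]   # contract the edge
            remaining -= 1
            # drop edges that became self-loops inside a super-vertex
            edges = [(a, b) for (a, b) in edges if label[a] != label[b]]
        return len(edges)                 # size of the produced cut

    # Example: a 4-cycle has min cut 2; repeating runs and taking the minimum
    # drives the failure probability down.
    runs = [contract_once([(0, 1), (1, 2), (2, 3), (3, 0)], 4) for _ in range(20)]
    print(min(runs))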

Homework 1-5. Hint: indicator variables / linearity of expectation.
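The exact statement of Homework 1-5 is not reproduced on the slide; as a generic illustration of the hinted technique, here is a Python sketch for the expected number of fixed points of a random permutation, which linearity of expectation shows is exactly 1 (each indicator has expectation 1/n):

    import random

    def avg_fixed_points(n, trials=100000):
        # X_i = 1 if position i is a fixed point; E[sum X_i] = n * (1/n) = 1.
        total = 0
        for _ in range(trials):
            p = list(range(n))
            random.shuffle(p)
            total += sum(1 for i, x in enumerate(p) if i == x)
        return total / trials

    print(avg_fixed_points(10))   # should be close to 1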

Homework 1-6. (a) What is the probability that X = Y? (b) What is E[max(X, Y)]? Hints:
- min(X, Y) is the random variable denoting the number of steps until you see the first head; X + Y - min(X, Y) = max(X, Y).
- Use the memoryless property of the geometric random variable.
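A hedged sketch of the suggested route, assuming X and Y are independent geometric random variables with the same parameter p (the exercise's exact parameters may differ): in each step both coins must come up tails for neither sequence to have seen a head yet, so min(X, Y) is geometric with parameter 1 - (1-p)², and

    \[
    \mathrm{E}[\max(X,Y)]
      = \mathrm{E}[X] + \mathrm{E}[Y] - \mathrm{E}[\min(X,Y)]
      = \frac{2}{p} - \frac{1}{1-(1-p)^{2}}.
    \]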

Homework 1-7. First, we give everyone a number card; the number card gives the order of the interviews. Let A be the best grade among candidates 1 to m. If some candidate after position m is better than A, accept that candidate. Otherwise, we take the last one. [Figure: interview positions 1, 2, …, m, m+1, m+2, …, n.]

Homework 1-7. Sometimes we may not get the best candidate:
- The best candidate's number card is at most m (so nobody after position m beats A, and we end up with the last candidate).
- A, the best grade we chose from 1 to m, is not good enough (someone better than A, but not the best, is accepted first).
What is a "nice" m? Hint: consider the event that the second-best candidate is among positions 1 to m (while the best one is not).
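A small Python simulation of this interview strategy, assuming n candidates with distinct grades arriving in uniformly random order; the function names and the success criterion (hiring the overall best candidate) are illustrative assumptions.

    import random

    def hire(n, m):
        # Candidates arrive in random order; a larger value means a better grade.
        order = list(range(n))
        random.shuffle(order)
        best_of_first_m = max(order[:m])          # this is A
        for grade in order[m:]:
            if grade > best_of_first_m:
                return grade                      # accept the first one beating A
        return order[-1]                          # otherwise take the last one

    def success_rate(n, m, trials=2000):
        # Fraction of runs in which the best candidate (grade n - 1) is hired.
        return sum(hire(n, m) == n - 1 for _ in range(trials)) / trials

    # For n = 100 the best m comes out near n/e (about 37), with success
    # probability around 0.37.
    print(max(range(1, 100), key=lambda m: success_rate(100, m)))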