MS 101: Algorithms
Instructor: Neelima Gupta
ngupta@cs.du.ac.in

Table of Contents
Mathematical Induction: Review
Growth Functions

Review: Mathematical Induction
Suppose S(k) is true for a fixed constant k (often k = 0), and S(n) => S(n+1) for all n ≥ k. Then S(n) is true for all n ≥ k.

Proof By Mathematical Induction
Claim: S(n) is true for all n ≥ k.
Basis: show the formula is true when n = k.
Inductive hypothesis: assume the formula is true for an arbitrary n ≥ k.
Step: show that the formula is then true for n+1.

Strong Induction
Strong induction also holds:
Basis: show S(0).
Hypothesis: assume S(k) holds for all k ≤ n.
Step: show S(n+1) follows.
Another variation:
Basis: show S(0) and S(1).
Hypothesis: assume S(n) and S(n+1) are true.
Step: show S(n+2) follows.

Let's do it
Prove 1 + 2 + 3 + … + n = n(n+1)/2.
Prove a^0 + a^1 + … + a^n = (a^{n+1} - 1)/(a - 1) for all a ≠ 1.
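As a worked example of the pattern above, here is the induction proof of the first identity (a standard argument):
Basis (n = 1): 1 = 1(1+1)/2.
Hypothesis: assume 1 + 2 + … + n = n(n+1)/2 for an arbitrary n ≥ 1.
Step: 1 + 2 + … + n + (n+1) = n(n+1)/2 + (n+1) = (n+1)(n/2 + 1) = (n+1)(n+2)/2, which is the formula with n+1 in place of n.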

Growth Functions: Big O Notation
A function f(n) is O(g(n)) if there exist positive constants c and n0 such that f(n) ≤ c g(n) for all n ≥ n0.
Formally, O(g(n)) = { f(n) : ∃ positive constants c and n0 such that f(n) ≤ c g(n) ∀ n ≥ n0 }.
Intuitively, it means f(n) grows no faster than g(n).
Examples: n^2 and n^2 - n; n^3 and n^3 - n^2 - n.

f(n) = n^2, g(n) = n^2 - n. Is f(n) = O(g(n))?
Sol: g(n) = n^2 - n = n^2/2 + (n^2/2 - n) ≥ n^2/2 for n ≥ 2, and n^2/2 = ½ f(n).
So f(n) ≤ 2 g(n) for n ≥ 2. Hence, f(n) = O(g(n)).
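The witnesses from such a proof can be spot-checked numerically. Below is a minimal Python sketch (the helper name holds and the sampling scheme are my own choices, not from the slides); sampling values is evidence, not a proof, but it catches a bad (c, n0) pair quickly:

    # Empirically check the Big-O witness: f(n) <= c*g(n) for sampled n >= n0.
    def holds(f, g, c, n0, limit=10**6):
        return all(f(n) <= c * g(n) for n in range(n0, limit, max(1, limit // 1000)))

    f = lambda n: n * n          # f(n) = n^2
    g = lambda n: n * n - n      # g(n) = n^2 - n

    print(holds(f, g, c=2, n0=2))   # True: matches f(n) <= 2 g(n) for n >= 2
    print(holds(f, g, c=1, n0=2))   # False: c = 1 is too small, since f(n) > g(n)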

f(n) = n^3, g(n) = n^3 - n^2 - n. Is f(n) = O(g(n))?
Sol: g(n) = n^3 - n^2 - n = n^3/2 + (n^3/2 - n^2 - n) ≥ n^3/2 + n^3/2 - n^3/4 - n^3/4 for n ≥ 4 (since n^2 ≤ n^3/4 and n ≤ n^3/4 there), and n^3/2 + n^3/2 - n^3/4 - n^3/4 = n^3/2 = ½ f(n).
So f(n) ≤ 2 g(n) for n ≥ 4. Hence, f(n) = O(g(n)).

f(n) = n^3, g(n) = n^3 - n^2 - n. Is g(n) = O(f(n))?
Sol: Clearly, g(n) = n^3 - n^2 - n ≤ n^3 ≤ 3 n^3 = 3 f(n) for all n ≥ 1.
So g(n) ≤ 3 f(n) ∀ n ≥ 1. Hence, g(n) = O(f(n)).

Omega Notation
A function f(n) is Ω(g(n)) if ∃ positive constants c and n0 such that 0 ≤ c g(n) ≤ f(n) ∀ n ≥ n0.
Intuitively, it means f(n) grows at least as fast as g(n).
Examples: n^2 and n^2 + n; n^3 and n^3 + n^2 - n.

Ques: f(n) = n^2, g(n) = n^2 + n. Is f(n) = Ω(g(n))?
Sol: n^2 = ½ (n^2 + n^2) ≥ ½ (n^2 + n) for n ≥ 1, since n^2 ≥ n there.
So f(n) ≥ c g(n) with c = ½ and threshold m = 1. Hence, f(n) = Ω(g(n)).

f(n) = n^3, g(n) = n^3 + 4n^2 - 5n. Is g(n) = Ω(f(n))?
Sol: g(n) = n^3 + 4n^2 - 5n ≥ n^3/2 + n^3/2 - 4n^2 - 5n^2 for all n ≥ 1 (since 4n^2 ≥ -4n^2 and -5n ≥ -5n^2 there)
= n^3/2 + n^2 (n/2 - 9) ≥ n^3/2 = ½ f(n) for n ≥ 18.
Hence, g(n) = Ω(f(n)).
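The Ω witnesses (c = ½, threshold 18) can be spot-checked the same way; a small Python illustration I am adding here (again evidence, not proof):

    # Spot-check the Omega witness: g(n) >= (1/2)*f(n) for all n from 18 up.
    f = lambda n: n**3
    g = lambda n: n**3 + 4 * n**2 - 5 * n

    print(all(2 * g(n) >= f(n) for n in range(18, 10**5)))  # True
    # In fact g(n) >= f(n) already holds for n >= 2, so these constants are loose.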

Theta Notation
A function f(n) is Θ(g(n)) if ∃ positive constants c1, c2, and n0 such that c1 g(n) ≤ f(n) ≤ c2 g(n) ∀ n ≥ n0.

Relations Between Θ, Ω, O
Theorem: For any two functions g(n) and f(n), f(n) = Θ(g(n)) iff f(n) = O(g(n)) and f(n) = Ω(g(n)).
Proof: Simple. Do it.
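The slide leaves the proof as an exercise; here is a short sketch (the witness names c1, c2, n0 are mine, matching the definitions above).
(=>) If f(n) = Θ(g(n)), pick c1, c2, n0 with c1 g(n) ≤ f(n) ≤ c2 g(n) ∀ n ≥ n0. The right inequality alone says f(n) = O(g(n)) (witnesses c2, n0); the left alone says f(n) = Ω(g(n)) (witnesses c1, n0).
(<=) If f(n) ≤ c2 g(n) ∀ n ≥ n0' and c1 g(n) ≤ f(n) ∀ n ≥ n0'', then both inequalities hold for n ≥ max(n0', n0''), which is exactly the Θ definition.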

Assignment No 1
Q1: We have already proved a^0 + a^1 + … + a^n = (a^{n+1} - 1)/(a - 1) for all a ≠ 1.
What is the sum for a = 2/3 as n → ∞? Is it O(1)? Is it big or small?
For a = 2, is the sum O(2^n)? Is it big or small?
Q2: Show that a polynomial of degree k is Θ(n^k).

Other Asymptotic Notations
A function f(n) is o(g(n)) if for every positive constant c, there exists a constant n0 > 0 such that f(n) < c g(n) ∀ n ≥ n0.
A function f(n) is ω(g(n)) if for every positive constant c, there exists a constant n0 > 0 such that c g(n) < f(n) ∀ n ≥ n0.
Intuitively:
ω() is like >
Ω() is like ≥
Θ() is like =
o() is like <
O() is like ≤
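One way to internalize o() is that the ratio f(n)/g(n) shrinks toward 0 (a criterion made precise in Assignment 3 below). A small Python illustration, added here as a sketch, for f(n) = log n and g(n) = sqrt(n):

    import math

    # Watch log(n)/sqrt(n) drain toward 0, suggesting log n = o(sqrt(n)).
    # (Printed ratios only illustrate the limit; they do not prove it.)
    for k in range(1, 7):
        n = 10**k
        print(n, math.log(n) / math.sqrt(n))   # 0.728, 0.461, 0.218, ... decreasing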

Arrange some functions
Does f(n) = O(g(n)) => f(n) = o(g(n))? Is the converse true?
Let us arrange the following functions in ascending order of growth (assume log n = o(n) is known):
n, n^2, n^3, sqrt(n), n^epsilon, log n, log^2 n, n log n, n/log n, 2^n, 3^n

Relation between n and n^2
Solution: let c > 0 be any constant. Then n < c n^2 iff 1 < c n iff n > 1/c.
Hence n < c n^2 for n > 1/c, and since c was arbitrary, n = o(n^2). We can also write this as n < n^2.

Relation between n^2 and n^3
Solution: let c > 0 be any constant. Then n^2 < c n^3 iff 1 < c n iff n > 1/c.
Hence n^2 < c n^3 ∀ n > 1/c, i.e., n^2 = o(n^3). We can also write this as n^2 < n^3.
Combining with the previous result: n < n^2 < n^3.

Relation between n and sqrt(n)
Solution: let c > 0 be any constant. Then sqrt(n) < c n iff n < c^2 n^2 iff 1 < c^2 n iff n > 1/c^2.
Hence sqrt(n) < c n for n > 1/c^2, and since c was arbitrary, sqrt(n) = o(n). We can also write this as sqrt(n) < n.
Combining with the previous results: sqrt(n) < n < n^2 < n^3.

Relation between n and log n
For the time being we assume the result log n = o(n), i.e., log n < n. We will prove it later.

Relation between sqrt(n) and log n
Assume log n = o(n), and let c > 0 be any constant. For c/2 > 0 there exists m > 0 such that log n < (c/2) n for n > m.
Changing variables from n to sqrt(n), we get log(sqrt(n)) < (c/2) sqrt(n) for sqrt(n) > m, i.e., ½ log n < (c/2) sqrt(n) for n > m^2.

Contd.
Let m^2 = k. Then log n < c sqrt(n) for n > k. Since c > 0 was chosen arbitrarily, log n = o(sqrt(n)), or log n < sqrt(n).
Combining the results: log n < sqrt(n) < n < n^2 < n^3.

Relation between n^2 and n log n
Since log n = o(n), for every c > 0 there exists n0 > 0 such that ∀ n ≥ n0 we have log n < c n.
Multiplying both sides by n gives n log n < c n^2 ∀ n ≥ n0. Hence n log n = o(n^2), i.e., n log n < n^2.

Relation between n and n log n
Solution: let c > 0 be any constant. Then n < c n log n iff 1 < c log n iff log n > 1/c iff n > e^{1/c}.
Hence n < c n log n ∀ n > e^{1/c}, and since c was chosen arbitrarily, n = o(n log n), or n < n log n.
Combining the results: log n < sqrt(n) < n < n log n < n^2 < n^3.

Relation between n and n/log n
We know that n = o(n log n): for every c > 0 there exists n0 > 0 such that ∀ n ≥ n0 we have n < c n log n.
Dividing both sides by log n gives n/log n < c n ∀ n ≥ n0. Hence n/log n = o(n), i.e., n/log n < n.

Assignment No 2
Show that log^M n = o(n^epsilon) for all constants M > 0 and epsilon > 0. Assume that log n = o(n).
Also prove the following corollary: log n = o(n/log n).
Show that n^epsilon = o(n/log n) for every 0 < epsilon < 1.

Hence we have: log n < sqrt(n) < n/log n < n < n log n < n^2 < n^3.
(That sqrt(n) < n/log n follows from Assignment 2 with epsilon = 1/2.)
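The chain can be eyeballed numerically. This Python sketch (my addition) evaluates each function at n = 2^20 and prints them in ascending order of value; one sample point can mislead for small n, so it complements rather than replaces the proofs above:

    import math

    n = 2**20
    funcs = {
        "log n": math.log(n),
        "sqrt(n)": math.sqrt(n),
        "n/log n": n / math.log(n),
        "n": n,
        "n log n": n * math.log(n),
        "n^2": n**2,
        "n^3": n**3,
    }
    # Prints the functions smallest-first; the order matches the chain above.
    for name, value in sorted(funcs.items(), key=lambda kv: kv[1]):
        print(f"{name:10s} {value:.3e}")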

Assignment No 3
Show that lim_{n → ∞} f(n)/g(n) = 0 => f(n) = o(g(n)).
Show that lim_{n → ∞} f(n)/g(n) = c => f(n) = Θ(g(n)), where c is a positive constant.
Show that log n = o(n).
Show that n^k = o(2^n) for every positive constant k.
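Before writing the definition-style proofs, such limits can be sanity-checked symbolically with SymPy's limit() function (a sketch assuming SymPy is installed; it checks the answer, it is not the asked-for proof):

    from sympy import symbols, limit, log, oo

    n = symbols('n', positive=True)

    # Both limits evaluate to 0, consistent with log n = o(n) and n^3 = o(2^n).
    print(limit(log(n) / n, n, oo))    # 0
    print(limit(n**3 / 2**n, n, oo))   # 0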

Show by the definition of 'small o' that a^n = o(b^n) whenever a < b, where a and b are positive constants.
Hence we have: log n < sqrt(n) < n/log n < n < n log n < n^2 < n^3 < 2^n < 3^n.

Why the constants 'c' and 'm'?
Suppose we have two algorithms to solve the same problem, say sorting: Insertion Sort and Merge Sort, for example.
Why should we have more than one algorithm to solve the same problem? Ans: efficiency.
What is the measure of efficiency? Ans: system resources, for example 'time'.
How do we measure time?

Contd.
IS(n) = O(n^2) and MS(n) = O(n log n), so Merge Sort is asymptotically faster than Insertion Sort.
But suppose we run IS on a fast machine and MS on a slow machine (say, because they were developed by two different people living in different parts of the globe) and simply measure the time. We may get less time for IS and more for MS: a wrong analysis.
Solution: count the number of steps on a generic computational model.

Computational Model: Analysis of Algorithms
Analysis is performed with respect to a computational model. We will usually use a generic uniprocessor random-access machine (RAM):
All memory is equally expensive to access.
No concurrent operations.
All reasonable instructions take unit time (except, of course, function calls).
Constant word size (unless we are explicitly manipulating bits).

Running Time
Running time is the number of primitive steps that are executed. Except for the time of executing a function call, in this model most statements roughly require the same amount of time:
y = m * x + b
c = 5 / 9 * (t - 32)
z = f(x) + g(y)
We can be more exact if need be.

But why 'c' and 'm'?
Because we compare two algorithms on the basis of their number of steps, and the actual time taken by an algorithm is 'c' times its number of steps, where 'c' depends on the machine.

Why 'm'?
We need efficient algorithms and computational tools to solve problems on big data. For example, it is not very difficult to sort a pack of 52 cards manually. However, sorting all the books in a library by their accession numbers might be tedious if done manually. So we want to compare algorithms for large inputs, i.e., beyond some threshold 'm'.

An Example: Insertion Sort
InsertionSort(A, n) {
    for i = 2 to n {
        key = A[i]              // next element to insert
        j = i - 1
        while (j > 0) and (A[j] > key) {
            A[j+1] = A[j]       // shift elements larger than key right
            j = j - 1
        }
        A[j+1] = key            // place key; A[1..i] is now sorted
    }
}
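A runnable counterpart in Python, with a comparison counter added as instrumentation (the 0-based translation and the counter are my additions, meant for experimenting with the step-counting idea from the Running Time slide):

    def insertion_sort(A):
        """Sort list A in place; return the number of key comparisons made."""
        comparisons = 0
        for i in range(1, len(A)):          # pseudocode's "for i = 2 to n", 0-based
            key = A[i]
            j = i - 1
            while j >= 0 and A[j] > key:    # shift larger elements one slot right
                comparisons += 1
                A[j + 1] = A[j]
                j -= 1
            if j >= 0:
                comparisons += 1            # the final comparison, which failed
            A[j + 1] = key
        return comparisons

    data = list(range(100, 0, -1))          # reversed input: the worst case
    print(insertion_sort(data))             # 4950 = 100*99/2, suggesting ~n^2/2 growth
    print(data == sorted(data))             # True

On already-sorted input the same counter reports n - 1 comparisons, consistent with the Ω(n) lower bound discussed below.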

Assignment 4 Show that Insertion Sort takes O(n^2) steps by counting each and every step. Is it O(n)? Is it O(n^3)?

Lower Bound Notation
We say InsertionSort's run time is Ω(n).
Proof: Suppose the run time is an + b. Assume a and b are positive (what if b is negative?). Then an ≤ an + b, which shows the run time is Ω(n) with witness c = a.

Up Next
Solving recurrences
Substitution method
Master theorem