Growth of functions CSC317

Growth of functions
We've already been talking about "grows as" for the sort examples, but what does this really mean? We already know that:
• We ignore constants and low-order terms; why?
• Asymptotic analysis: we focus on large input sizes; the growth of a function for large input; why?
Complexity petting zoo (see the notes of Burt Rosenberg): http://blog.cs.miami.edu/burt/2014/09/01/a-complexity-petting-zoo/

Why are we going to the zoo? Complexity classes (we are just scratching the surface)
Constant time: T(1). Example: access the first number in an array (also the second number ...)
T(log n). Example: binary search. Given a sorted array A, find value v between indices low and high.
A = [1 3 4 10 15 23 35 40 45], find v = 4.
Solution: look at the middle of the array: either the value is found, or we recurse on the left half, or we recurse on the right half (sound familiar?)
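A minimal sketch of this binary search in Python (our own illustration, not from the slides; the function name and index arguments are assumptions):

    def binary_search(A, v, low, high):
        """Search sorted list A for v between indices low and high: T(log n)."""
        if low > high:
            return -1                                  # v is not in A
        mid = (low + high) // 2
        if A[mid] == v:
            return mid                                 # value found
        elif v < A[mid]:
            return binary_search(A, v, low, mid - 1)   # recurse on left half
        else:
            return binary_search(A, v, mid + 1, high)  # recurse on right half

    A = [1, 3, 4, 10, 15, 23, 35, 40, 45]
    print(binary_search(A, 4, 0, len(A) - 1))          # prints 2

Each call halves the remaining range, so only about log₂ n comparisons are needed.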

T(n). Example: the largest number in a sequence, or the sum of a fixed sequence; whenever you step through the entire sequence or array, even if you have to do this 20 times.
T(n log n). Example (you should know one by now)?
T(n²). Example (you should know one by now)?
T(n³). Example: naïve matrix multiplication (for an n by n matrix)
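As one concrete instance, a sketch of the naïve matrix multiplication mentioned above (plain Python lists; the helper name matmul is ours):

    def matmul(X, Y):
        """Multiply two n-by-n matrices with three nested loops: T(n^3)."""
        n = len(X)
        Z = [[0] * n for _ in range(n)]
        for i in range(n):                 # n choices of row
            for j in range(n):             # n choices of column
                for k in range(n):         # n terms in each dot product
                    Z[i][j] += X[i][k] * Y[k][j]
        return Z

    print(matmul([[1, 2], [3, 4]], [[5, 6], [7, 8]]))  # [[19, 22], [43, 50]]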

Complexity classes
Polynomial time (class P): T(n), T(n log n), T(n²), T(n³), ... T(nᵏ), k ≥ 0
More than polynomial: exponential, T(2ⁿ)

What about this: the subset sum problem? How long to find a solution?
Input: a set of n integers, A = [1, 4, -3, 2, 9, 7]
Output: is there a subset that sums to 0?
Might take exponential time if we have to go through every possible subset (brute force).
But what if I handed you a subset, say [1, -3, 2]? How long would it take to verify that this sums to 0? Polynomial, in fact linear, time.
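Verification really is a single pass over the candidate; a one-line sketch (the helper name is our own):

    def sums_to_zero(subset):
        """Check a candidate certificate: one addition per element, linear time."""
        return sum(subset) == 0

    print(sums_to_zero([1, -3, 2]))  # True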

Complexity classes
Problems whose solutions are verifiable in polynomial time (good) form the class NP. Solving them may take going through an exponential number of candidates (possibly bad).
Example: the subset sum problem. A = [1, 4, -3, 2, 9, 7]. Is there a subset that sums to 0? [1, -3, 2] is quickly verifiable to sum to 0.

Example: the subset sum problem. A = [1, 4, -3, 2, 9, 7]. Is there a subset that sums to 0? [1, -3, 2] is quickly verifiable to sum to 0.
An algorithm that solves this problem: form, one by one, each and every one of the 2ⁿ subsets of A, and see if the subset sums to zero. How long do we need to run through those? T(2ⁿ). Checking any one subset, though, is polynomial, T(nᵏ). So if we guess-and-check by nondeterministically creating the subsets, each check is polynomial; hence NP: nondeterministic polynomial.
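A sketch of that brute-force enumeration, using itertools to form the subsets (our own illustration; each candidate check is linear, but there are exponentially many candidates):

    from itertools import combinations

    def subset_sum_zero(A):
        """Try every non-empty subset of A: up to 2^n - 1 candidates."""
        for r in range(1, len(A) + 1):
            for subset in combinations(A, r):
                if sum(subset) == 0:       # each single check is linear
                    return subset
        return None

    print(subset_sum_zero([1, 4, -3, 2, 9, 7]))  # (1, -3, 2)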

NP class: nondeterministic = random = as if I were magically handed the solution. Originally from the nondeterministic Turing machine.
P = NP? Can a problem that is quickly verifiable (i.e., in polynomial time) also be quickly solved (i.e., in polynomial time)? Unknown; a Millennium Prize problem.

Lower asymptotic run time = faster
a: T(n) = n log(n) + 1000; b: T(n) = 0.2n²
For small n, b is faster (at n = 10, b is 20 steps while a is about 1033, taking log base 2), but asymptotically a wins (at n = 10⁶, a is about 2×10⁷ while b is 2×10¹¹).

Big Oh notation
Definition: Big O notation describes the limiting behavior of a function as its argument tends towards a particular value or infinity, usually in terms of simpler functions. For us, Big Oh allows us to classify algorithms by their response (e.g., in their processing time or working-space requirements) to changes in input size.
Asymptotic upper bound: bounded from above by g(n) for large enough n.
More formal definition: O(g(n)) = { f(n) : there exist positive constants c and n₀ such that 0 ≤ f(n) ≤ c·g(n) for all n ≥ n₀ }

Asymptotic upper bound: bounded from above by g(n) for large enough n.
More formal definition: O(g(n)) = { f(n) : there exist positive constants c and n₀ such that 0 ≤ f(n) ≤ c·g(n) for all n ≥ n₀ }
To show that our relation holds, we just need to find one such pair c and n₀ …

Asymptotic upper bound: bounded from above by g(n) for large enough n.
More formal definition: O(g(n)) = { f(n) : there exist positive constants c and n₀ such that 0 ≤ f(n) ≤ c·g(n) for all n ≥ n₀ }
f(n) = n² + 10n is O(n²)
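One concrete witness (our choice; many others work): n² + 10n ≤ n² + 10n² = 11n² for all n ≥ 1, so c = 11 and n₀ = 1 do the job. Alternatively, 10n ≤ n² once n ≥ 10, giving the pair c = 2, n₀ = 10.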

Definition: O(g(n)) = { f(n) : there exist positive constants c and n₀ such that 0 ≤ f(n) ≤ c·g(n) for all n ≥ n₀ }
Examples of functions in O(n²): f(n) = n²; f(n) = n² + n; f(n) = n² + 1000n
All bounded asymptotically by n². Intuitively, constants and lower-order terms don't matter ...

Question: what about f(n) = n?
Definition: O(g(n)) = { f(n) : there exist positive constants c and n₀ such that 0 ≤ f(n) ≤ c·g(n) for all n ≥ n₀ }
g(n) = n² is not a tight upper bound, but it is an upper bound, since n ≤ 1·n² for all n ≥ 1 (i.e., c = 1, n₀ = 1). So f(n) = n is O(n²).

Definition: O(g(n)) = { f(n) : there exist positive constants c and n₀ such that 0 ≤ f(n) ≤ c·g(n) for all n ≥ n₀ }
What about this? f(n) = aₖnᵏ + ... + a₁n + a₀. Is f(n) = O(nᵏ)?
Intuition: yes (we can ignore lower-order terms and constants), but we need proof.
Proof: we want to find n₀ and c such that f(n) ≤ c·nᵏ.

f(n) = aₖnᵏ + ... + a₁n + a₀, with aₖ > 0
Proof: we want to find n₀ and c such that 0 ≤ f(n) ≤ c·nᵏ.
f(n) = aₖnᵏ + ... + a₁n + a₀
     ≤ |aₖ|nᵏ + ... + |a₁|n + |a₀|
     ≤ |aₖ|nᵏ + ... + |a₁|nᵏ + |a₀|nᵏ   (for n ≥ 1)
     = (|aₖ| + ... + |a₁| + |a₀|)·nᵏ
What are n₀ and c?
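Reading the answer off the last line (a standard choice, not the only one): since nʲ ≤ nᵏ for every j ≤ k whenever n ≥ 1, take c = |aₖ| + ... + |a₁| + |a₀| and n₀ = 1; then f(n) ≤ c·nᵏ for all n ≥ n₀, so f(n) = O(nᵏ).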

Recap: why are we talking about Big Oh?
Most commonly used! Asymptotic upper bound: bounded from above by g(n) for large enough n.
However, there are other bounds too!

Big Omega
Asymptotic lower bound: bounded from below by g(n) for large enough n.
Definition: Ω(g(n)) = { f(n) : there exist positive constants c and n₀ such that 0 ≤ c·g(n) ≤ f(n) for all n ≥ n₀ }
Why is this less often used?

Big Theta
Asymptotic tight bound: bounded from below and above by g(n) for large enough n.
Definition: Θ(g(n)) = { f(n) : there exist positive constants c₁, c₂, and n₀ such that 0 ≤ c₁·g(n) ≤ f(n) ≤ c₂·g(n) for all n ≥ n₀ }
A stronger statement. (Note that the literature is sometimes sloppy and says Oh when it actually means Theta.)

Examples of Oh, Omega, Theta
Functions f(n) in O(n²): f(n) = n²; f(n) = n² + n; f(n) = n
Functions f(n) in Ω(n²): f(n) = n²; f(n) = n² + n; f(n) = n⁵
Functions f(n) in Θ(n²): f(n) = n²; f(n) = n² – n
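As a worked witness for the Θ column (constants of our choosing): for n ≥ 2, n²/2 ≤ n² – n ≤ n², so c₁ = 1/2, c₂ = 1, and n₀ = 2 certify that n² – n = Θ(n²).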

Recap: Oh, Omega, Theta
Oh (like ≤): O(g(n)) is an asymptotic upper bound: 0 ≤ f(n) ≤ c·g(n)
Omega (like ≥): Ω(g(n)) is an asymptotic lower bound: 0 ≤ c·g(n) ≤ f(n)
Theta (like =): Θ(g(n)) is an asymptotic tight bound: 0 ≤ c₁·g(n) ≤ f(n) ≤ c₂·g(n)

More on Oh, Omega, Theta
Theorem: f(n) = Θ(g(n)) if and only if (iff) f(n) = O(g(n)) and f(n) = Ω(g(n))
Question: is f(n) = n² + 5 in Ω(n³)? Answer: NO (why?)!

Question: is f(n) = n² + 5 in Ω(n³)? Answer: NO!
Definition: Ω(g(n)) = { f(n) : there exist positive constants c and n₀ such that 0 ≤ c·g(n) ≤ f(n) for all n ≥ n₀ }
Therefore: if f(n) = Ω(n³), then there exist n₀ and c such that for all n ≥ n₀: n² + 5 ≥ c·n³.
Remember from before: n² + 5n² ≥ n² + 5 (for n ≥ 1), so 6n² ≥ c·n³, i.e., 6 ≥ c·n.
This cannot be true for all n ≥ n₀ (it fails as soon as n > 6/c), so f(n) is not Ω(n³).

Some properties of Oh, Omega, Theta
Transitivity: f(n) = Θ(g(n)) and g(n) = Θ(h(n)) imply f(n) = Θ(h(n)) (same for O and Ω)
Reflexivity: f(n) = Θ(f(n)) (same for O and Ω)
Symmetry: f(n) = Θ(g(n)) iff g(n) = Θ(f(n)) (Θ only; for O and Ω we instead have transpose symmetry: f(n) = O(g(n)) iff g(n) = Ω(f(n)))