COP 3530 Spring 2012 Data Structures & Algorithms Discussion Session Week 5

Outline Growth of functions – Big Oh – Omega – Theta – Little Oh

Growth of Functions The growth rate of a function gives a simple view of an algorithm's efficiency and allows us to compare the relative performance of alternative algorithms, e.g., an algorithm whose running time f1(n) is O(n) versus one whose running time f2(n) is O(n^2).

Growth of Functions The exact running time of an algorithm is usually hard to compute, and computing it is unnecessary. For large enough inputs, the lower-order terms of an exact running time are dominated by the highest-order term. For example, if f(n) = n^2 + 5n + 234, then n^2 >> 5n + 234 when n is large enough.
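A quick numerical check makes the dominance concrete. This is a minimal sketch in Python (the sample sizes are arbitrary illustrative choices, not from the slides):

```python
# Sketch: show how the n^2 term of f(n) = n^2 + 5n + 234 dominates as n grows.
def f(n):
    return n**2 + 5*n + 234

for n in (10, 100, 1_000, 10_000):
    high = n**2             # highest-order term
    low = 5*n + 234         # lower-order terms
    share = high / f(n)     # fraction of f(n) contributed by n^2
    print(f"n={n:>6}  n^2={high:>12}  5n+234={low:>8}  n^2/f(n)={share:.4f}")
```

Already at n = 1,000 the n^2 term accounts for over 99% of f(n), which is why only the dominant term is kept.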

Asymptotic Notation: Big Oh (O) f(n) = O(g(n)) iff there exist positive constants c and n0 such that f(n) ≤ cg(n) for all n ≥ n0. We use O-notation to give an asymptotic upper bound on a function.

Asymptotic Notation: Big Oh (O) Example 1 [linear function]: f(n) = 3n+2. For n >= 2, 3n+2 <= 3n+n = 4n, so f(n) = O(n). We can arrive at the same conclusion in other ways: for example, 3n+2 <= 5n for all n >= 1. The specific values of c and n0 used to satisfy the definition of big oh are not important; we only say f(n) is big oh of g(n).
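The point that the specific c and n0 do not matter can be spot-checked numerically. The sketch below is only illustrative (the helper name is_big_oh_witness and the tested range are my own choices, and checking finitely many n is evidence rather than a proof):

```python
# Sketch: check that f(n) <= c*g(n) for all n in [n0, n_max].
# A finite check cannot prove the bound in general, but it can expose a bad witness.
def is_big_oh_witness(f, g, c, n0, n_max=10_000):
    return all(f(n) <= c * g(n) for n in range(n0, n_max + 1))

f = lambda n: 3*n + 2
g = lambda n: n

# Two different witness pairs both work for f(n) = O(n); the exact values of
# c and n0 are unimportant, only the existence of some pair matters.
print(is_big_oh_witness(f, g, c=4, n0=2))   # True
print(is_big_oh_witness(f, g, c=5, n0=1))   # True
```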

Asymptotic Notation: Big Oh (O) Example 2 [quadratic function]: f(n) = 10n^2+4n+2. For n >= 2, f(n) <= 10n^2+5n. For n >= 5, 5n <= n^2. Hence for n >= n0 = 5, f(n) <= 10n^2+n^2 = 11n^2. Therefore, f(n) = O(n^2). Only the highest-order term matters!
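An alternative derivation, not from the slides but in the same spirit, bounds every lower-order term by a multiple of n^2 in one step; it yields a larger constant but the same conclusion:

```latex
\[
  10n^2 + 4n + 2 \;\le\; 10n^2 + 4n^2 + 2n^2 \;=\; 16n^2
  \quad \text{for all } n \ge 1,
\]
\[
  \text{so } f(n) = O(n^2) \text{ with } c = 16 \text{ and } n_0 = 1 .
\]
```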

Asymptotic Notation: Big Oh (O) Example 3 [exponential function]: f(n) = 6*2^n + n^2. For n >= 4, n^2 <= 2^n, so f(n) <= 6*2^n + 2^n = 7*2^n for n >= 4. Therefore, 6*2^n + n^2 = O(2^n). Example 4 [constant function]: f(n) = 3. For any n, f(n) <= 4*1. Therefore, f(n) = O(1).

Asymptotic Notation: Big Oh (O) Example 5 [loose bounds]: f(n) = 3n+3. For n >= 10, 3n+3 <= 3n^2. Therefore, f(n) = O(n^2). Usually, we mean a tight upper bound when using big oh notation. Example 6 [incorrect bounds]: 3n+2 != O(1), since we cannot find constants c and n0 such that 3n+2 <= c for all n >= n0 (3n+2 grows without bound as n goes to infinity). Similarly, 10n^2 + 6n + 2 != O(n).

Asymptotic Notation: Big Oh (O) The specific values of c and n0 used to satisfy the definition of big oh are not important; we only say f(n) is big oh of g(n). Why do c and n0 NOT matter? The constant c only rescales the bound: f(n) = n and 10n are close to each other when compared with n^2 or n^3. The cutoff n0 only excludes finitely many inputs: f(n) is relatively small when n < n0, so those values do not affect the asymptotic behavior.

Asymptotic Notation: Omega Notation Big oh provides an asymptotic upper bound on a function. Omega provides an asymptotic lower bound on a function.
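The slide does not restate the formal definition; for reference, the standard one mirrors the Big Oh definition above with the inequality reversed:

```latex
\[
  f(n) = \Omega(g(n)) \iff
  \exists\, c > 0,\ n_0 > 0 \ \text{such that}\
  f(n) \ge c \cdot g(n) \ \text{for all } n \ge n_0 .
\]
```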

Asymptotic Notation: Omega Notation Example 7: f(n) = 3n+3 > 3n for all n, so f(n) = Omega(n). Example 8 [loose bound]: f(n) = 3n+3 > 1 for all n, so f(n) = Omega(1).

Asymptotic Notation: Theta Notation Theta notation is used when the function f can be bounded both from above and below by constant multiples of the same function g.
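For reference, the standard formal definition (not spelled out on this slide) uses one constant for each side of the bound:

```latex
\[
  f(n) = \Theta(g(n)) \iff
  \exists\, c_1, c_2 > 0,\ n_0 > 0 \ \text{such that}\
  c_1 \cdot g(n) \le f(n) \le c_2 \cdot g(n) \ \text{for all } n \ge n_0 .
\]
```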

Asymptotic Notation: Theta Notation Example 9: f(n) = 3n+3 is Theta(n), since 3n <= 3n+3 <= 4n for all n >= 3. Similarly, f(n) = 3n+2 is Theta(n), and f(n) = 5n^2 - 10n + 9 is Theta(n^2). In general, f(n) is Theta(g(n)) iff f(n) is Omega(g(n)) and O(g(n)).
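A worked check of the quadratic case, my own derivation rather than the slides' (other constants work just as well):

```latex
% Upper bound: for n >= 1, -10n <= 0 and 9 <= 9n^2.
\[
  5n^2 - 10n + 9 \;\le\; 5n^2 + 9n^2 \;=\; 14n^2 \quad \text{for all } n \ge 1 .
\]
% Lower bound: for n >= 10, 10n <= n^2.
\[
  5n^2 - 10n + 9 \;\ge\; 5n^2 - n^2 \;=\; 4n^2 \quad \text{for all } n \ge 10 .
\]
% Hence 4n^2 <= 5n^2 - 10n + 9 <= 14n^2 for n >= 10, i.e. Theta(n^2)
% with c_1 = 4, c_2 = 14, n_0 = 10.
```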

Asymptotic Notation: Little oh (o) The asymptotic upper bound provided by O-notation may or may not be asymptotically tight. 2n = O(n) is tight, 2n = O(n^2) is not tight. We use o-notation to denote an upper bound that is NOT asymptotically tight.

Asymptotic Notation: Little oh (o) f(n) = o(g(n)) iff f(n) = O(g(n)) and f(n) != Omega(g(n)). Example 10: 3n+2 = o(n^2), as 3n+2 = O(n^2) and 3n+2 != Omega(n^2). Example 11: 3n+2 != o(n), as 3n+2 = Omega(n).
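One intuitive way to see these two examples is to watch the ratio f(n)/g(n): for a bound that is not tight the ratio shrinks toward 0, while for a tight bound it stays above a positive constant. A minimal numerical sketch (the sample sizes are arbitrary):

```python
# Sketch: (3n+2)/n^2 shrinks toward 0 (so n^2 is not a tight bound),
# while (3n+2)/n stays near 3 (so n is a tight bound).
f = lambda n: 3*n + 2

for n in (10, 100, 1_000, 10_000):
    print(f"n={n:>6}  f(n)/n^2={f(n) / n**2:.5f}  f(n)/n={f(n) / n:.5f}")
```

This matches Examples 10 and 11: 3n+2 = o(n^2) but 3n+2 != o(n).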

Review
- Big oh: upper bound on a function.
- Omega: lower bound.
- Theta: lower and upper bound; f(n) is Theta(g(n)) iff f(n) is O(g(n)) and Omega(g(n)).
- Little oh: loose upper bound; f(n) = o(g(n)) iff f(n) = O(g(n)) and f(n) != Omega(g(n)).

Office Hour This Week: Thursday, 9th period, at E309