Big-O and Friends

Formal definition of Big-O

A function f(n) is O(g(n)) if there exist positive numbers c and N such that:

    f(n) <= c·g(n) for all n >= N

Example: Let f(n) = 2n^2 + 3n + 1

    n       1    2    3    4    5
    f(n)    6   15   28   45   66

Let g(n) = n^2 and let c = 3

    cg(n)   3   12   27   48   75

So f(n) <= cg(n) for all n >= N, where c = 3, g(n) = n^2, and N = 4
Hence, f(n) = 2n^2 + 3n + 1 is O(n^2)
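
A minimal, self-contained Java sketch (ours, not from the slides; the class and method names are our own) that spot-checks these witness constants c = 3 and N = 4 numerically:

    public class BigOCheck {
        static long f(long n) { return 2 * n * n + 3 * n + 1; }  // f(n) = 2n^2 + 3n + 1
        static long g(long n) { return n * n; }                  // g(n) = n^2

        public static void main(String[] args) {
            final long c = 3, N = 4;
            for (long n = 1; n <= 1000; n++) {
                // The inequality may fail for n < N (it does at n = 1..3);
                // the definition only requires it for all n >= N.
                if (n >= N && f(n) > c * g(n)) {
                    System.out.println("Counterexample at n = " + n);
                    return;
                }
            }
            System.out.println("f(n) <= 3*g(n) held for every sampled n >= 4");
        }
    }

A spot check like this cannot prove the bound for all n, but it is a quick sanity test of a proposed c and N.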

Informal explanation

A function f(n) is O(g(n)) if there exist positive numbers c and N such that:

    f(n) <= c·g(n) for all n >= N

The part about “for all n >= N” means we can ignore small values of n
The c lets us adjust for functions that vary only by a constant factor
[Figure: a graph of f(n), g(n), f(n) = g(n) + k, and 2g(n), with the region to the left of N marked “ok to ignore”]

Big-O is an upper bound

Big-O is an upper bound, not a lower bound
Here is the same example as before:

    n       1    2    3    4    5
    f(n)    6   15   28   45   66

Let g(n) = n^3 and let c = 1

    cg(n)   1    8   27   64  125

So f(n) <= cg(n) for all n >= N, where c = 1, g(n) = n^3, and N = 4
Hence, f(n) = 2n^2 + 3n + 1 is O(n^3)

One more time

Again, a function f(n) is O(g(n)) if there exist positive numbers c and N such that:

    f(n) <= c·g(n) for all n >= N

    n       1    2    3    4    5    6    7    8
    f(n)    6   15   28   45   66   91  120  153

Let g(n) = 2^n and let c = 1

    cg(n)   2    4    8   16   32   64  128  256

So f(n) <= cg(n) for all n >= N, where c = 1, g(n) = 2^n, and N = 7
That is, 2n^2 + 3n + 1 <= 2^n for all n >= 7
Hence, f(n) = 2n^2 + 3n + 1 is O(2^n)

Big-O is a sloppy bound

Formally, Big-O is just an upper bound
For example, all the sorting algorithms we have studied can be said to be O(n^2), or O(n^3), or O(n^3 log n), or O(2^n), etc.
However, these sorting algorithms are not O(1), or O(log n), or O(n)
If you are asked on a test a question such as “Is insertion sort an O(n^3) algorithm?” the correct answer is “Yes”
Informally, however, we use Big-O to mean “the least upper bound we can find”

Big-O is formally defined

A function f(n) is O(g(n)) if there exist positive numbers c and N such that:

    f(n) <= c·g(n) for all n >= N

Note that this does not put any restrictions on the form of the function g(n)
Hence, it is reasonable and legal to say that, for example, an algorithm is O(2n^2 + 3n + 1)
We usually want to simplify the function g(n)
Since Big-O notation is formally defined, we can prove certain properties of Big-O that allow us to simplify the g(n) function

Big-O notation is transitive

If f(n) is O(g(n)), and g(n) is O(h(n)), then f(n) is O(h(n))
– By definition, there exist c1 and N1 such that f(n) <= c1·g(n) for all n >= N1
– By definition, there exist c2 and N2 such that g(n) <= c2·h(n) for all n >= N2
– Hence, f(n) <= c1·g(n) <= c1·c2·h(n) for all n >= max(N1, N2)
– So if we take c = c1·c2 and N = max(N1, N2), then f(n) <= c·h(n) for all n >= N

So what?
– This allows us to simplify O(O(g(n))) to O(g(n))
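
A concrete instance (our own, reusing results from the earlier slides): we showed that 2n^2 + 3n + 1 is O(n^2), and n^2 is O(n^3) (take c = 1 and N = 1); transitivity then gives that 2n^2 + 3n + 1 is O(n^3), matching the earlier slide without redoing the table.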

More theorems about Big-O

1. If f(n) is O(h(n)) and g(n) is also O(h(n)), then f(n) + g(n) is O(h(n))
2. The function a·n^k is O(n^k)
3. The function n^k is O(n^(k+j)) for any positive j

From the above, we conclude that the Big-O of any polynomial is its highest power
Example: 3n^4 + 5n^3 + 8n
– By 2: O(3n^4) is O(n^4), O(5n^3) is O(n^3), and O(8n) is O(n)
– By 3: O(n^4) is O(n^4), O(n^3) is O(n^4), and O(n) is O(n^4)
– By 1: O(n^4) + O(n^4) + O(n^4) is O(n^4)
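
One concrete witness for the example (ours, not on the slide): for all n >= 1,

    3n^4 + 5n^3 + 8n <= 3n^4 + 5n^4 + 8n^4 = 16n^4

so c = 16 and N = 1 confirm directly that 3n^4 + 5n^3 + 8n is O(n^4).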

“Is” is not “equals”

We have been making statements such as, “O(n) is O(n^4)”
It would be wrong to say O(n) = O(n^4)
We are using “X is Y” to mean “Any X is also a Y”
– We might say “Fido is a dog”
– This is an asymmetric relation: it is not true that any dog is Fido
When we say “O(n) is O(n^4)”, we mean that anything which is O(n) is also O(n^4), but not necessarily the reverse

Still more theorems about Big-O

If f(n) = c·g(n), then f(n) is O(g(n))
O(log_A n) is O(log_B n) for any positive numbers A ≠ 1 and B ≠ 1
– Changing the base of a logarithm just changes the result by a constant multiplier
– log_A x = (log_A B)·log_B x
O(log_A n) is O(log_2 n) for any positive A ≠ 1
– Hence, we can always use logarithms base 2
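
For example (our own numbers): log_10 n = log_2 n / log_2 10 ≈ 0.301·log_2 n, so O(log_10 n) and O(log_2 n) differ only by the constant factor 1/log_2 10, and either may be used inside Big-O.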

Big-Ω

Big-O is an upper bound
Big-Ω is a lower bound
A function f(n) is O(g(n)) if there exist positive numbers c and N such that:

    f(n) <= c·g(n) for all n >= N

A function f(n) is Ω(g(n)) if there exist positive numbers c and N such that:

    f(n) >= c·g(n) for all n >= N

If a function is O(n^2), it is at least as fast as n^2
If a function is Ω(n^2), it is at least as slow as n^2
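
A concrete witness (ours, not on the slide): f(n) = 2n^2 + 3n + 1 is Ω(n^2), since 2n^2 + 3n + 1 >= 2n^2 for all n >= 1; take c = 2 and N = 1.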

Big-Θ

A function f(n) is Θ(g(n)) if there exist positive numbers c1, c2 and N such that:

    c1·g(n) <= f(n) <= c2·g(n) for all n >= N

Big-Θ sets both a lower and an upper bound
A function f(n) is Θ(g(n)) if it is both O(g(n)) and Ω(g(n))
This does not make Θ(g(n)) unique; f(n) = 2n^2 + 3n + 1 is Θ(n^2), but it is also Θ(2n^2), Θ(3n^2), Θ(86n^2), etc.
Why not always use Big-Θ instead of Big-O?
– Not every algorithm has a Big-Θ value
– Quicksort, for example, can be O(n log n) or O(n^2), depending on the input
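
Combining the witnesses above (our own note): 2n^2 <= 2n^2 + 3n + 1 <= 3n^2 for all n >= 4, so f(n) = 2n^2 + 3n + 1 is Θ(n^2) with c1 = 2, c2 = 3, and N = 4.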

What you need to know

Memorize and understand the definitions of Big-O, Big-Ω, and Big-Θ, particularly Big-O
Expect to see these on tests
More importantly, you need to be able to examine an algorithm and determine its running time
– Usually this is not difficult
– Sometimes it can range from quite difficult to nearly impossible
– We will come back to this in a later lecture

Simple analysis

For a series of loops, just add:

    for (int i = 0; i < n; i++) { ...some O(1) stuff... }
    for (int j = 0; j < m; j++) { ...some more O(1) stuff... }

– This is O(n) + O(m) time
– If n is proportional to m, this simplifies to O(n) time

For nested loops, just multiply:

    for (int i = 0; i < n; i++) {
        ...some O(1) stuff...
        for (int j = 0; j < m; j++) {
            ...some more O(1) stuff...
        }
    }

– This is O(nm) time
– If m is proportional to n, this is the same as O(n^2) time
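
As a quick empirical check, here is a minimal, self-contained Java sketch (ours, not from the slides; the class name and sample sizes are our own) that counts how many times the body of the nested loop runs. The count comes out to exactly n·m, which is why the running time is O(nm):

    public class LoopCount {
        public static void main(String[] args) {
            int n = 300, m = 200;   // arbitrary sample sizes
            long steps = 0;
            for (int i = 0; i < n; i++) {
                for (int j = 0; j < m; j++) {
                    steps++;        // stands in for the O(1) loop body
                }
            }
            // Prints "60000 steps; n*m = 60000"
            System.out.println(steps + " steps; n*m = " + (n * m));
        }
    }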

More complex analysis

Recursion can add some complications:

    static int depth(BinaryTree tree) {
        if (tree == null) return 0;
        int leftDepth = depth(tree.getLeftChild());
        int rightDepth = depth(tree.getRightChild());
        return Math.max(leftDepth, rightDepth) + 1;
    }

– Formal analysis is a bit tricky, but...
– This algorithm visits every node in the binary tree exactly once
– Therefore, the algorithm is O(n), where n is the number of nodes in the binary tree
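
One informal way to see the O(n) bound (our own summary, not on the slide): each call does a constant amount of work besides its two recursive calls, and there is exactly one call per node (plus one O(1) call per null child). Written as a recurrence, T(tree) = T(left subtree) + T(right subtree) + O(1), and summing the O(1) terms over all n nodes gives O(n) total.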

The End