308-203A Introduction to Computing II Lecture 5: Complexity of Algorithms Fall Session 2000.


How long does a program take to run? It depends on the input. Often it depends only on some parameter(s) of the input, such as its size. Example: copying a String depends on its length: “a”.clone() is less work than “abc…z”.clone().

To Quantify That... Assume that simple operations take fixed time, e.g. a[i] := 0 => 1 time unit. The cost of a complex operation depends on how many simple operations it performs, e.g. for i := 1 to n do a[i] := 0 => n time units.
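The unit-cost model above can be made concrete in Java. Here is a minimal illustrative sketch (the class and method names are ours, not from the lecture): the method zeroes an array and returns how many one-unit assignments it performed, which is exactly n for an array of length n.

```java
// Counting primitive operations under the unit-cost model:
// zeroing an array of length n performs n assignments,
// so its cost grows linearly with n.
public class OpCount {
    // Returns the number of "time units" (assignments) used.
    static long zeroArray(int[] a) {
        long units = 0;
        for (int i = 0; i < a.length; i++) {
            a[i] = 0;   // one time unit per element
            units++;
        }
        return units;
    }
}
```

Running it on arrays of different lengths shows the count growing in lockstep with n, which is the observation the asymptotic notation formalizes.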

Do constants matter? Q. What if a[i] := 0 and x := 0 don’t take the same time? A. As it happens, this is less important than loops, recursion, etc. Therefore we will use an asymptotic notation, which ignores the exact values of these constants...

The Big O( ) Definition: If there is a function f(n) and constants N and c such that for any input of length n > N the running time of a program P is bounded as Time(P) ≤ c·f(n), we say that P is O(f(n)).

The Big O( ) What does it really mean? [Figure: running time plotted against input size n; beyond the threshold N (n > N) the curve stays below c·f(n), while for n < N the bound need not hold.]

The Big O( ) WARNING: ABUSE OF NOTATION. We write g(n) = O(f(n)) even though g(n) and O(f(n)) are not equivalent: O(f(n)) denotes a whole class of functions, and the “=” here really means membership.

Examples

  Example                                               Growth      A.k.a.
  x := 1;                                               O(1)        constant
  for j := 1 to n do A[j] := 0;                         O(n)        linear
  for j := 1 to n do for k := 1 to n do A[j][k] := 0;   O(n^2)      quadratic
  for (int j = n; j != 0; j /= 2) A[j] := 0;            O(log n)    logarithmic
  f(n): prints all strings of {a, b}* of length n       O(2^n)      exponential
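The pseudocode fragments in the table translate directly into Java. The sketch below (the class and method names are ours) makes each growth rate executable by counting loop iterations instead of timing them:

```java
public class Growth {
    // O(n): the loop body runs exactly n times.
    static int linear(int n) {
        int count = 0;
        for (int j = 0; j < n; j++) count++;
        return count;
    }

    // O(n^2): the nested loops run n * n times.
    static int quadratic(int n) {
        int count = 0;
        for (int j = 0; j < n; j++)
            for (int k = 0; k < n; k++) count++;
        return count;
    }

    // O(log n): j is halved each iteration,
    // so the loop runs about log2(n) + 1 times.
    static int logarithmic(int n) {
        int count = 0;
        for (int j = n; j != 0; j /= 2) count++;
        return count;
    }
}
```

Doubling n doubles the count for linear, quadruples it for quadratic, and adds only one iteration for logarithmic, which is the whole point of the classification.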

Worst Case Analysis Big O( ) is a worst-case bound: the real running time may be much less, since f(n) is only an upper bound. Example:

  String s = …;
  for (int j = 0; j < s.length(); j++)
      if (s.charAt(j) == 'a') break;

Time = O(n), even though the loop may exit on its very first iteration.
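As a small Java sketch of the worst-case idea (the helper name indexOfChar is ours, not from the slides): the scan below exits early on a match, so its best case is constant time, but O(n) covers the worst case in which the character never occurs.

```java
public class Search {
    // Linear scan with early exit:
    // best case O(1) (match at index 0),
    // worst case O(n) (no match, all n characters examined).
    static int indexOfChar(String s, char c) {
        for (int j = 0; j < s.length(); j++)
            if (s.charAt(j) == c) return j;
        return -1;  // not found: the whole string was scanned
    }
}
```

Big O( ) quotes the O(n) figure because it must hold for every input, including the unlucky ones.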

Best-case Analysis We may also choose to analyze the least time the program could take to run. This is called big-Ω notation. If P is O(f(n)) and Ω(f(n)), we say: P is Θ(f(n)).

Intuitively... O, Θ and Ω do for functions what ≤, = and ≥ do for numbers:

  f(x) = O(g(x))   ~   (i ≤ j)
  f(x) = Θ(g(x))   ~   (i = j)
  f(x) = Ω(g(x))   ~   (i ≥ j)

A little more notation: f(x) = o(g(x)) (“little-oh”) and f(x) = ω(g(x)) (“little-omega”). The lower-case letters act like the corresponding strict inequalities (<, >), i.e. it is additionally known that f(x) ≠ Θ(g(x)).

Some Things to Note 1. O( ) is a bound, so: if P = O(1), it is also true that P = O(n); if P = O(n^k), it is also true that P = O(n^j) for j > k. 2. If P = O(f(n) + g(n)) and f(n) = O(g(n)), then P = O(g(n)). Example: P = O(x^2 + x) => P = O(x^2).

More examples: What about adding two numbers? 1) In what parameter do we do the analysis? 2) O, Θ or Ω?

More examples: What about adding two numbers? Let d be the number of digits in the numbers (assume both have the same length):

        a_d  a_(d-1)  a_(d-2)  ...  a_i  ...  a_2  a_1
    +   b_d  b_(d-1)  b_(d-2)  ...  b_i  ...  b_2  b_1
    ---------------------------------------------------
  c_(d+1)  c_d  c_(d-1)  c_(d-2)  ...  c_i  ...  c_2  c_1

We do exactly one (primitive) addition for each of the d digits => Θ(d).

The parameter is important! Suppose we did the analysis in terms of the number itself rather than how many digits it contains… is it still linear? NO! If the number is n, then d ≈ log n, so O(d) = O(log n).
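A small Java sketch (our own helper, not from the slides) makes the relationship explicit: the number of decimal digits d grows like the base-10 logarithm of the value, so an algorithm linear in d is logarithmic in n.

```java
public class Digits {
    // Number of decimal digits d of a positive n.
    // d = floor(log10 n) + 1, so d grows logarithmically in n:
    // multiplying n by 10 adds only one digit.
    static int digitCount(long n) {
        int d = 0;
        while (n > 0) {
            n /= 10;  // strip the last digit
            d++;
        }
        return d;
    }
}
```

This is why digit-by-digit addition is linear in d but logarithmic in the value being added.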

So what is O(1) in Java? Primitive math operations (i.e. +, -, *, / on ints, doubles, etc.), accessing simple variables (and data members), and accessing an array element, A[i].

So what is not O(1) in Java? Method calls usually aren’t: the cost depends on the body of the method. This includes Java library calls, like those in java.lang.Math. Loops of any kind are also not O(1).

Another example: Exponentiation What is the order of growth of the following, and can we do better?

  Function exp(m, n) ::= {
      result := 1
      while (n > 0)
          result := result * m
          n := n - 1
  }

Another example: Exponentiation Answer #1: O(n). Answer #2: yes…

Better Exponentiation Observe: we can rewrite exponentiations as products of repeated squares, e.g. 5^13 = 5 · (5^2)^2 · ((5^2)^2)^2. This takes only five multiplications (three squarings plus two products) instead of thirteen.

Better Exponentiation

  Function exp(m, n) ::= {
      result := 1
      while (n > 0)
          if (n is even)
              m := m * m
              n := n / 2
          else
              result := result * m
              n := n - 1
  }
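The pseudocode above can be sketched in Java as follows (class and method names are ours). It performs O(log n) multiplications, because n is at least halved every two iterations:

```java
public class FastPow {
    // Exponentiation by repeated squaring: O(log n) multiplications.
    static long pow(long m, long n) {
        long result = 1;
        while (n > 0) {
            if (n % 2 == 0) {
                m = m * m;   // n even: square the base...
                n /= 2;      // ...and halve the exponent
            } else {
                result = result * m;  // n odd: peel off one factor of m
                n = n - 1;
            }
        }
        return result;
    }
}
```

For n = 13 the loop alternates between the odd and even branches, touching each binary digit of the exponent once, which is exactly the log n behavior analyzed on the next slide.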

Order of Growth?? Best case: we halve n every iteration until n = 1 => Θ(log n) iterations. Worst case: whenever we take the odd branch, n becomes even, so the very next iteration halves it; hence at most 2·log n iterations = O(log n). Conclusion: Θ(log n).

Any questions?