Copyright © Zeph Grunschlag, 2001-2002. Algorithms and Complexity Zeph Grunschlag.


Agenda
Section 2.1: Algorithms, Pseudocode, Recursive Algorithms (Section 3.4)
Section 2.2: Complexity of Algorithms
Section 1.8: Growth of Functions: Big-O, Big-Ω (Omega), Big-Θ (Theta)

Section 2.1 Algorithms and Pseudocode
DEF: An algorithm is a finite set of precise instructions for performing a computation or solving a problem. Synonyms for an algorithm include: program, recipe, procedure, and many others.

Pseudo-Java
A possible alternative to the text's pseudo-Java: start with "real" Java and simplify. This function returns the smallest element of the array:

int f(int[] a){
  int x = a[0];
  for(int i = 1; i < a.length; i++){
    if(x > a[i]) x = a[i];
  }
  return x;
}

Pseudo-Java Version 1

integer f(integer_array (a_1, a_2, …, a_n)){
  x = a_1
  for(i = 2 to n){
    if(x > a_i) x = a_i
  }
  return x
}

Pseudo-Java Version 2

INPUT: integer_array V = (a_1, a_2, …, a_n)
begin
  x = a_1
  for(y ∈ V)
    if(x > y) x = y
end
OUTPUT: x

Algorithm for Surjectivity

boolean isOnto( function f: {1, 2, …, n} → {1, 2, …, m} ){
  if( m > n ) return false  // can't be onto
  soFarIsOnto = true
  for( j = 1 to m ){
    soFarIsOnto = false
    for( i = 1 to n ){
      if( f(i) == j ) soFarIsOnto = true
    }
    if( !soFarIsOnto ) return false
  }
  return true
}

Improved Algorithm for Surjectivity

boolean isOntoB( function f: {1, 2, …, n} → {1, 2, …, m} ){
  if( m > n ) return false  // can't be onto
  for( j = 1 to m )
    beenHit[ j ] = false    // does f ever output j ?
  for( i = 1 to n )
    beenHit[ f(i) ] = true
  for( j = 1 to m )
    if( !beenHit[ j ] ) return false
  return true
}
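The improved algorithm translates directly into real Java. As a minimal sketch (the int[] encoding of f is an assumption, not from the slides), f is passed as an array with f[i-1] holding f(i):

```java
// Runnable sketch of isOntoB: f maps {1,…,n} to {1,…,m}, encoded as an
// int[] with f[i-1] = f(i). Assumed encoding for illustration only.
class Surjectivity {
    static boolean isOntoB(int[] f, int m) {
        int n = f.length;
        if (m > n) return false;                // can't be onto
        boolean[] beenHit = new boolean[m + 1]; // beenHit[j]: does f ever output j?
        for (int i = 1; i <= n; i++)
            beenHit[f[i - 1]] = true;
        for (int j = 1; j <= m; j++)
            if (!beenHit[j]) return false;      // some j was never hit
        return true;
    }

    public static void main(String[] args) {
        System.out.println(isOntoB(new int[]{2, 1, 2}, 2)); // true: both 1 and 2 are hit
        System.out.println(isOntoB(new int[]{1, 1, 1}, 2)); // false: 2 is never hit
    }
}
```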

Recursive Algorithms (Section 3.4)
"Real" Java:

long factorial(int n){
  if (n <= 0) return 1;
  return n * factorial(n - 1);
}

Recursive Algorithms
Compute 5!. The calls expand until the base case f(0) = 1 is reached, then the results collapse back up:

f(5) = 5·f(4)
f(4) = 4·f(3)
f(3) = 3·f(2)
f(2) = 2·f(1)
f(1) = 1·f(0)
f(0) = 1
f(1) = 1·1 = 1
f(2) = 2·1 = 2
f(3) = 3·2 = 6
f(4) = 4·6 = 24
f(5) = 5·24 = 120

Return 5! = 120
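Wrapped in a class, the traced code runs as-is; the iterative version below is our addition (not from the slides) to cross-check the traced result:

```java
class Factorial {
    // Recursive factorial from the slides
    static long factorial(int n) {
        if (n <= 0) return 1;         // base case f(0) = 1
        return n * factorial(n - 1);  // f(n) = n · f(n-1)
    }

    // Iterative cross-check: multiplies 2·3·…·n
    static long iterative(int n) {
        long x = 1;
        for (int i = 2; i <= n; i++) x *= i;
        return x;
    }

    public static void main(String[] args) {
        System.out.println(factorial(5)); // 120, matching the trace above
    }
}
```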

Section 2.2 Algorithmic Complexity
Compare the running times of the two previous algorithms for testing surjectivity. Measure running time by counting the number of "basic operations".

Running Time
Basic steps:
Assignment, Increment, Comparison, Negation, Return, Random array access, Function output access, etc.
A particular problem may tell you to consider other operations (e.g. multiplication) and ignore all others.

Running Time of 1st Algorithm

boolean isOnto( function f: {1, 2, …, n} → {1, 2, …, m} ){
  if( m > n ) return false           // 1 step, OR:
  soFarIsOnto = true                 // 1 step (assignment)
  for( j = 1 to m ){                 // m loops: 1 increment plus
    soFarIsOnto = false              //   1 step (assignment)
    for( i = 1 to n ){               //   n loops: 1 increment plus
      if( f(i) == j )                //     1 step, possibly leading to:
        soFarIsOnto = true           //     1 step (assignment)
    }
    if( !soFarIsOnto ) return false  //   1 step, possibly leading to: 1 step (return)
  }
  return true                        // possibly 1 step
}

Running Time of 1st Algorithm
WORST-CASE running time:
Number of steps = 1 (if m > n), OR at most
2 + m·(3 + n·5) = 5mn + 3m + 2
(counting generously: at most 5 basic steps per inner iteration, at most 3 more per outer iteration, plus the initial comparison and assignment)

Running Time of 2nd Algorithm

boolean isOntoB( function f: {1, 2, …, n} → {1, 2, …, m} ){
  if( m > n ) return false   // 1 step, OR:
  for( j = 1 to m )          // m loops: 1 increment plus
    beenHit[ j ] = false     //   1 step (assignment)
  for( i = 1 to n )          // n loops: 1 increment plus
    beenHit[ f(i) ] = true   //   1 step (assignment)
  for( j = 1 to m )          // m loops: 1 increment plus
    if( !beenHit[ j ] )      //   1 step, possibly leading to:
      return false           //   1 step (return)
  return true                // possibly 1 step
}

Running Time of 2nd Algorithm
WORST-CASE running time:
Number of steps = 1 (if m > n), OR
1 + m·(1 + 1) + n·(1 + 1) + m·(1 + 1 + 1) + 1 = 5m + 2n + 2

Comparing Running Times
1. At most 5mn + 3m + 2 for the first algorithm
2. At most 5m + 2n + 2 for the second algorithm
The worst case occurs when m ≤ n, so replace m by n:
5n² + 3n + 2 vs. at most 8n + 2
To tell which is better, look at the dominant terms: 5n² vs. 8n. So the second algorithm is better.
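To watch the dominant term take over, one can simply tabulate the two step-count bounds (with m replaced by n). This sketch just evaluates the polynomials from the comparison above:

```java
// Worst-case step bounds with m = n: 5n^2 + 3n + 2 for the first algorithm,
// 8n + 2 for the second. The gap widens quadratically as n grows.
class StepCounts {
    static long first(long n)  { return 5 * n * n + 3 * n + 2; }
    static long second(long n) { return 8 * n + 2; }

    public static void main(String[] args) {
        for (long n : new long[]{1, 10, 100, 1000})
            System.out.println("n=" + n + ": " + first(n) + " vs " + second(n));
    }
}
```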

Comparing Running Times: Issues
1. 5n² + 3n + 2 and 8n + 2 are more than just their biggest terms; for small n the other terms matter.
2. The number of "basic steps" doesn't give the actual running time.
3. The actual running time depends on the platform.
4. We overestimated the number of steps: under some conditions, portions of the code will never be reached.

Running Time Issues: Big-O Response
Asymptotic notation (Big-O, Big-Ω, Big-Θ) gives a partial resolution to these problems:
1. For large n the largest term dominates, so 5n² + 3n + 2 is modeled by just n².

Running Time Issues: Big-O Response
2. Different lengths of basic steps just change 5n² to Cn² for some constant C, which doesn't change the largest term.

Running Time Issues: Big-O Response
3. Basic operations on different (but well-designed) platforms differ by a constant factor, which again changes 5n² to Cn² for some constant C.

Running Time Issues: Big-O Response
4. Even if we overestimated by assuming iterations of loops that never occurred, we may still be able to show that the overestimate is just a different constant multiple of the largest term.

Worst Case vs. Average Case
Worst-case complexity provides absolute guarantees on how long a program will run. The worst-case complexity as a function of n is the longest possible time for any input of size n.
Average-case complexity is suitable if a small function is repeated often, or if it is acceptable for the program to take a long time very rarely. The average-case complexity as a function of n is the average complexity over all possible inputs of that length. Average-case analysis usually requires probability theory (delayed until later).

Section 1.8 Big-O, Big-Ω, Big-Θ
Useful for computing algorithmic complexity, i.e. the amount of time that it takes for a computer program to run.

Notational Issues
Big-O notation is a way of comparing functions. The notation is unconventional. EG: 3x³ + 5x² − 9 = O(x³). This doesn't mean "3x³ + 5x² − 9 equals the function O(x³)". It actually means "3x³ + 5x² − 9 is dominated by x³". Read as: "3x³ + 5x² − 9 is big-Oh of x³".

Intuitive Notion of Big-O
Asymptotic notation captures the behavior of functions for large values of x. EG: the dominant term of 3x³ + 5x² − 9 is x³. As x becomes larger and larger, the other terms become insignificant and only x³ remains in the picture.

Intuitive Notion of Big-O
[Graphs of y = 3x³ + 5x² − 9, y = x³, y = x², and y = x over the domains [0,2], [0,5], [0,10], and [0,100]: as the domain grows, the lower-order curves flatten out and only the two cubics remain in the picture.]

Intuitive Notion of Big-O
In fact, 3x³ + 5x² − 9 is smaller than 5x³ for large enough values of x:
[Graph of y = 3x³ + 5x² − 9 below y = 5x³, with y = x² and y = x for comparison.]

Big-O. Formal Definition
f(x) is asymptotically dominated by g(x) if some constant multiple of g(x) is bigger than f(x) as x goes to infinity:
DEF: Let f, g be functions with domain ℝ≥0 or ℕ and codomain ℝ. If there are constants C and k such that ∀x > k, |f(x)| ≤ C·|g(x)|, then we write: f(x) = O(g(x)).

Common Misunderstanding
It's true that 3x³ + 5x² − 9 = O(x³), as we'll prove shortly. However, the following are also true:
3x³ + 5x² − 9 = O(x⁴)
x³ = O(3x³ + 5x² − 9)
sin(x) = O(x⁴)
NOTE: C.S. usage of big-O typically mentions only the most dominant term: "The running time is O(x^2.5)". Mathematically, big-O is more subtle.

Big-O. Example
EG: Show that 3x³ + 5x² − 9 = O(x³). The previous graphs suggest C = 5 is a good guess. Find k so that 3x³ + 5x² − 9 ≤ 5x³ for x > k:
1. Collect terms: 5x² ≤ 2x³ + 9.
2. What k will make 5x² ≤ x³ for x > k?
3. k = 5!
4. So for x > 5, 5x² ≤ x³ ≤ 2x³ + 9.
5. Solution: C = 5, k = 5 (not unique!)
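The witnesses C = 5, k = 5 can be spot-checked numerically. This is a sanity check over a finite range, not a proof:

```java
// Check |3x^3 + 5x^2 - 9| <= C·|x^3| for every integer x in (k, 10000].
class BigOWitness {
    static boolean holds(double C, double k) {
        for (double x = Math.floor(k) + 1; x <= 10000; x++) {
            double f = 3 * x * x * x + 5 * x * x - 9;
            double g = x * x * x;
            if (Math.abs(f) > C * Math.abs(g)) return false;
        }
        return true;
    }

    public static void main(String[] args) {
        System.out.println(holds(5, 5)); // true: the witnesses work
        System.out.println(holds(3, 0)); // false: C = 3 is too small (fails at x = 2)
    }
}
```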

Big-O. Negative Example
x⁴ ≠ O(3x³ + 5x² − 9): no pair C, k exists for which x > k implies C·(3x³ + 5x² − 9) ≥ x⁴. Argue using limits: x⁴ always catches up regardless of C.

Big-O and Limits
LEMMA: If the limit as x → ∞ of the quotient |f(x)/g(x)| exists, then f(x) = O(g(x)).
EG: 3x³ + 5x² − 9 = O(x³). Compute:
lim_{x→∞} (3x³ + 5x² − 9)/x³ = lim_{x→∞} (3 + 5/x − 9/x³) = 3
…so the big-O relationship is proved.

Little-o and Limits
DEF: If the limit as x → ∞ of the quotient |f(x)/g(x)| is 0, then f(x) = o(g(x)).
EG: 3x³ + 5x² − 9 = o(x^3.1). Compute:
lim_{x→∞} (3x³ + 5x² − 9)/x^3.1 = lim_{x→∞} (3/x^0.1 + 5/x^1.1 − 9/x^3.1) = 0

Big-Ω and Big-Θ
Big-Ω: the reverse of big-O. I.e. f(x) = Ω(g(x)) ⟺ g(x) = O(f(x)), so f(x) asymptotically dominates g(x).
Big-Θ: domination in both directions. I.e. f(x) = Θ(g(x)) ⟺ f(x) = O(g(x)) and f(x) = Ω(g(x)).
Synonym for f = Θ(g): "f is of order g".

Useful Facts
Any polynomial is big-Θ of its largest term. EG: x⁴/3 + x³ + 5x² − 9 = Θ(x⁴)
The sum of two functions is big-O of the biggest. EG: x⁴·ln(x) + x⁵ = O(x⁵)
Non-zero constant multiples are irrelevant. EG: 17x⁴·ln(x) = O(x⁴·ln(x))
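The sum fact can likewise be spot-checked: since ln(x) ≤ x for x ≥ 1, we have x⁴·ln(x) + x⁵ ≤ 2x⁵, so C = 2 and k = 1 are witnesses (our choice of constants, checked numerically here, not proved):

```java
// Check x^4·ln(x) + x^5 <= 2·x^5 for every integer x in [2, 5000],
// supporting x^4·ln(x) + x^5 = O(x^5) with witnesses C = 2, k = 1.
class SumIsBigO {
    static boolean holds() {
        for (double x = 2; x <= 5000; x++) {
            double f = Math.pow(x, 4) * Math.log(x) + Math.pow(x, 5);
            if (f > 2 * Math.pow(x, 5)) return false;
        }
        return true;
    }

    public static void main(String[] args) {
        System.out.println(holds()); // true
    }
}
```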

Big-O, Big-Ω, Big-Θ. Examples
Q: Order the following from smallest to largest asymptotically. Group together all functions which are big-Θ of each other:

Big-O, Big-Ω, Big-Θ. Examples
A: Logarithms of different bases are big-Θ of each other, by the change-of-base formula log_b(x) = log_a(x)/log_a(b).

Incomparable Functions
Given two functions f(x) and g(x), it is not always the case that one dominates the other; then f and g are asymptotically incomparable. EG: f(x) = |x²·sin(x)| vs. g(x) = 5x^1.5

Incomparable Functions
[Graphs of y = |x²·sin(x)| and y = 5x^1.5, with y = x² for reference: each function overtakes the other infinitely often, so neither is big-O of the other.]

Big-O: A Grain of Salt
Big-O notation gives a good first guess for deciding which algorithm is faster. In practice, the guess isn't always correct. Consider the time functions n⁶ vs. 1000·n^5.9. Asymptotically, the second is better. Purported advances of exactly this kind often appear in theoretical computer science publications. The following graph shows the relative performance of the two algorithms:

Big-O: A Grain of Salt
[Graph: running time in days vs. input size n for T(n) = n⁶ and T(n) = 1000·n^5.9, assuming each operation takes a nanosecond, i.e. the computer runs at 1 GHz.]

Big-O: A Grain of Salt
In fact, 1000·n^5.9 only catches up to n⁶ when 1000·n^5.9 = n⁶, i.e.:
1000 = n^0.1, i.e.:
n = 1000¹⁰ = 10³⁰
Even 10³⁰ operations take 10³⁰/10⁹ = 10²¹ seconds
≈ 10²¹/(3×10⁷) ≈ 3×10¹³ years
≈ 3×10¹³/(2×10¹⁰) ≈ 1500 universe lifetimes!
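The crossover arithmetic above can be reproduced in a few lines (the 3×10⁷ seconds/year and 2×10¹⁰-year universe lifetime are the slide's round figures):

```java
// Crossover of 1000·n^5.9 and n^6: equality when n^0.1 = 1000, i.e. n = 10^30.
class Crossover {
    static double crossoverN() { return Math.pow(1000, 10); } // 1000^10 = 10^30

    // Universe lifetimes needed just to perform 10^30 operations at 10^9 ops/sec,
    // using 3e7 sec/year and 2e10 years per universe lifetime (round figures).
    static double lifetimes() { return crossoverN() / 1e9 / 3e7 / 2e10; }

    public static void main(String[] args) {
        System.out.println(crossoverN()); // ~1.0e30
        System.out.println(lifetimes());  // on the order of 1500 universe lifetimes
    }
}
```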

Example for Section 1.8
Link to an example proving big-Ω of a sum.