
CHAPTER 2 ALGORITHM ANALYSIS

【 Definition 】 An algorithm is a finite set of instructions that, if followed, accomplishes a particular task. In addition, all algorithms must satisfy the following criteria:
(1) Input: There are zero or more quantities that are externally supplied.
(2) Output: At least one quantity is produced.
(3) Definiteness: Each instruction is clear and unambiguous.
(4) Finiteness: If we trace out the instructions of an algorithm, then for all cases, the algorithm terminates after a finite number of steps.
(5) Effectiveness: Every instruction must be basic enough to be carried out, in principle, by a person using only pencil and paper. It is not enough that each operation be definite as in (3); it also must be feasible.

Note: A program is written in some programming language and does not have to be finite (e.g. an operating system). An algorithm can be described by human languages, flow charts, some programming languages, or pseudo-code.

〖 Example 〗 Selection Sort: Sort a set of n ≥ 1 integers in increasing order. From those integers that are currently unsorted, find the smallest and place it next in the sorted list. Where and how are they stored?

Algorithm in pseudo-code:
for ( i = 0; i < n; i++ ) {
    Examine list[i] to list[n-1] and suppose that the smallest integer is at list[min];
    Interchange list[i] and list[min];
}

Sort = Find the smallest integer + Interchange it with list[i].
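A concrete C version of this pseudo-code might look as follows; it is a minimal sketch added for illustration, and the function name sel_sort and the int element type are assumptions rather than part of the original slide.

/* Selection sort: a minimal C sketch of the pseudo-code above. */
void sel_sort( int list[ ], int n )
{
    int i, j, min, temp;
    for ( i = 0; i < n; i++ ) {
        /* find the smallest element in list[i..n-1] */
        min = i;
        for ( j = i + 1; j < n; j++ )
            if ( list[ j ] < list[ min ] )
                min = j;
        /* interchange list[i] and list[min] */
        temp = list[ i ];
        list[ i ] = list[ min ];
        list[ min ] = temp;
    }
}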

§1 What to Analyze
• Machine & compiler-dependent run times.
• Time & space complexities: machine & compiler-independent.

Assumptions:
• instructions are executed sequentially
• each instruction is simple, and takes exactly one time unit
• integer size is fixed and we have infinite memory

Typically the following two functions are analyzed: T_avg(N) and T_worst(N) -- the average and worst-case time complexities, respectively, as functions of the input size N. If there is more than one input, these functions may have more than one argument.
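As a quick illustration of why the two functions can differ, consider sequential search; this sketch is added here for illustration (the name seq_search and the assumption that the key is equally likely to be in any position are not from the original slides).

/* Sequential search: returns the index of key in list[0..n-1], or -1.
   Worst case: key is absent or in the last position, so the loop body
   runs n times, giving T_worst(n) proportional to n.
   Average case: if the key is equally likely to be in any position,
   about n/2 comparisons are made, giving T_avg(n) proportional to n/2. */
int seq_search( int list[ ], int n, int key )
{
    int i;
    for ( i = 0; i < n; i++ )
        if ( list[ i ] == key )
            return i;
    return -1;
}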

§1 What to Analyze

〖 Example 〗 Matrix addition
void add ( int a[ ][ MAX_SIZE ], int b[ ][ MAX_SIZE ],
           int c[ ][ MAX_SIZE ], int rows, int cols )
{
    int i, j;
    for ( i = 0; i < rows; i++ )                         /* rows + 1 */
        for ( j = 0; j < cols; j++ )                     /* rows · (cols + 1) */
            c[ i ][ j ] = a[ i ][ j ] + b[ i ][ j ];     /* rows · cols */
}

T( rows, cols ) = 2 · rows · cols + 2 · rows + 1

Q: What shall we do if rows >> cols? A: Exchange rows and cols.
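The point of the Q/A above is that the loop-control overhead of the outer loop is paid once per outer iteration, so the smaller dimension should drive the outer loop. A sketch of the swapped version, added here for illustration (the name add2 is an assumption):

/* With cols in the outer loop the step count becomes
   (cols + 1) + cols·(rows + 1) + rows·cols = 2·rows·cols + 2·cols + 1,
   which is smaller than 2·rows·cols + 2·rows + 1 when rows >> cols. */
void add2 ( int a[ ][ MAX_SIZE ], int b[ ][ MAX_SIZE ],
            int c[ ][ MAX_SIZE ], int rows, int cols )
{
    int i, j;
    for ( j = 0; j < cols; j++ )                         /* cols + 1 */
        for ( i = 0; i < rows; i++ )                     /* cols · (rows + 1) */
            c[ i ][ j ] = a[ i ][ j ] + b[ i ][ j ];     /* rows · cols */
}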

§1 What to Analyze

〖 Example 〗 Iterative function for summing a list of numbers
float sum ( float list[ ], int n )
{   /* add a list of numbers */
    float tempsum = 0;                    /* count = 1 */
    int i;
    for ( i = 0; i < n; i++ )             /* count++ ; count++ for last execution of for */
        tempsum += list[ i ];             /* count++ */
    return tempsum;                       /* count++ */
}
T_sum( n ) = 2n + 3

〖 Example 〗 Recursive function for summing a list of numbers
float rsum ( float list[ ], int n )
{   /* add a list of numbers */
    if ( n )                              /* count++ */
        return rsum( list, n - 1 ) + list[ n - 1 ];   /* count++ */
    return 0;                             /* count++ */
}
T_rsum( n ) = 2n + 2
But each of its steps (a recursive call) takes more time to execute.

§1 What to Analyze

Is it really necessary to count the exact number of steps?
Uhhh... I don't think so. Why? Because it drives me crazy!
So it's too complicated sometimes. But is it worth the effort? Take the iterative and recursive programs for summing a list for example --- if you think 2n+2 is less than 2n+3, try a large n and you'll be surprised!
I see... Then what's the point of this T_p stuff?
Good question! Let's ask the students...

§2 Asymptotic Notation ( O, Ω, Θ, o )

The point of counting the steps is to predict the growth in run time as N changes, and thereby compare the time complexities of two programs. So what we really want to know is the asymptotic behavior of T_p.

Suppose T_p1( N ) = c₁N² + c₂N and T_p2( N ) = c₃N. Which one is faster? No matter what c₁, c₂, and c₃ are, there will be an n₀ such that T_p1( N ) > T_p2( N ) for all N > n₀.

I see! So as long as I know that T_p1 is about N² and T_p2 is about N, then for sufficiently large N, P2 will be faster!
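For instance, with the purely illustrative constants c₁ = 1, c₂ = 2, and c₃ = 100, T_p1( N ) = N² + 2N exceeds T_p2( N ) = 100N exactly when N² + 2N > 100N, i.e. when N > 98; so n₀ = 98 works, and for every N beyond it P2 is the faster program.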

§2 Asymptotic Notation

【 Definition 】 T(N) = O( f(N) ) if there are positive constants c and n₀ such that T(N) ≤ c·f(N) for all N ≥ n₀.

【 Definition 】 T(N) = Ω( g(N) ) if there are positive constants c and n₀ such that T(N) ≥ c·g(N) for all N ≥ n₀.

【 Definition 】 T(N) = Θ( h(N) ) if and only if T(N) = O( h(N) ) and T(N) = Ω( h(N) ).

Note:
• 2N + 3 = O( N ) = O( N^k ) for any k ≥ 1 = O( 2^N ) = ...  We shall always take the smallest f(N).
• 2^N + N² = Ω( 2^N ) = Ω( N² ) = Ω( N ) = Ω( 1 ) = ...  We shall always take the largest g(N).

【 Definition 】 T(N) = o( p(N) ) if T(N) = O( p(N) ) and T(N) ≠ Θ( p(N) ).
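As a worked check of the first definition (the constants chosen here are just one of many possibilities): for T(N) = 2N + 3 and f(N) = N, take c = 5 and n₀ = 1; then for every N ≥ 1 we have 2N + 3 ≤ 2N + 3N = 5N = c·f(N), so 2N + 3 = O( N ).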

§2 Asymptotic Notation

Rules of Asymptotic Notation
• If T₁(N) = O( f(N) ) and T₂(N) = O( g(N) ), then
  (a) T₁(N) + T₂(N) = max( O( f(N) ), O( g(N) ) ),
  (b) T₁(N) * T₂(N) = O( f(N) * g(N) ).
• If T(N) is a polynomial of degree k, then T(N) = Θ( N^k ).
• log^k N = O( N ) for any constant k. This tells us that logarithms grow very slowly.

Note: When comparing the complexities of two programs asymptotically, make sure that N is sufficiently large. For example, suppose that T_p1( N ) = 10⁶·N and T_p2( N ) = N². Although Θ( N² ) grows faster than Θ( N ), if N < 10⁶, P2 is still faster than P1.
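As a quick application of the polynomial rule (an illustration added here): T(N) = 3N³ + 5N² + 7 is a polynomial of degree 3, so T(N) = Θ( N³ ); the lower-order terms and constant coefficients do not affect the asymptotic statement.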

§2 Asymptotic Notation

[Figure: growth rates of typical functions f(n) -- log n, n, n log n, n², 2ⁿ -- plotted against n]

[Table: running times of these functions for various input sizes n; units: μs = microsecond, ms = millisecond, sec = second, min = minute, hr = hour, d = day, yr = year]

§2 Asymptotic Notation

〖 Example 〗 Matrix addition
void add ( int a[ ][ MAX_SIZE ], int b[ ][ MAX_SIZE ],
           int c[ ][ MAX_SIZE ], int rows, int cols )
{
    int i, j;
    for ( i = 0; i < rows; i++ )                         /* Θ( rows ) */
        for ( j = 0; j < cols; j++ )                     /* Θ( rows · cols ) */
            c[ i ][ j ] = a[ i ][ j ] + b[ i ][ j ];
}

T( rows, cols ) = Θ( rows · cols )

§2 Asymptotic Notation

General Rules (illustrated in the sketch after this list)
• FOR LOOPS: The running time of a for loop is at most the running time of the statements inside the for loop (including tests) times the number of iterations.
• NESTED FOR LOOPS: The total running time of a statement inside a group of nested loops is the running time of the statement multiplied by the product of the sizes of all the for loops.
• CONSECUTIVE STATEMENTS: These just add (which means that the maximum is the one that counts).
• IF / ELSE: For the fragment
      if ( Condition ) S1;
      else S2;
  the running time is never more than the running time of the test plus the larger of the running times of S1 and S2.
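A small, self-contained sketch applying these rules; it is added for illustration and the name rule_demo is an assumption, not part of the original slides.

/* Applying the general rules: the first loop is O(N), the nested loops
   are O(N)·O(N) = O(N²), the if/else is O(1), and consecutive fragments
   add, so the whole function is O(N²). */
long rule_demo( int N )
{
    long sum = 0;
    int i, j;

    for ( i = 0; i < N; i++ )        /* single loop: O(N) */
        sum += i;

    for ( i = 0; i < N; i++ )        /* nested loops: O(N) * O(N) = O(N²) */
        for ( j = 0; j < N; j++ )
            sum += (long) i * j;

    if ( sum > 0 )                   /* if/else: test plus larger branch, O(1) */
        sum = sum / 2;
    else
        sum = -sum;

    return sum;                      /* total: O(N) + O(N²) + O(1) = O(N²) */
}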

§2 Asymptotic Notation

• RECURSIONS:
〖 Example 〗 Fibonacci numbers: Fib(0) = Fib(1) = 1, Fib(n) = Fib(n-1) + Fib(n-2)
long int Fib ( int N )                        /* T( N ) */
{
    if ( N <= 1 )                             /* O( 1 ) */
        return 1;
    else
        return Fib( N - 1 ) + Fib( N - 2 );   /* T( N-1 ) + T( N-2 ) */
}

T(N) = T(N-1) + T(N-2) + 2 ≥ Fib(N), so by induction T(N) grows exponentially.

Q: Why is it so bad?
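It is so bad because the two recursive calls keep recomputing the same subproblems: Fib(N-2) is computed both by Fib(N) and by Fib(N-1), and so on down the recursion. For contrast, here is a sketch of a linear-time alternative, added for illustration (the name iter_fib is an assumption):

/* Iterative Fibonacci: each value is computed exactly once, so T(N) = O(N). */
long int iter_fib ( int N )
{
    long int prev = 1, curr = 1, next;
    int i;
    if ( N <= 1 )
        return 1;
    for ( i = 2; i <= N; i++ ) {   /* executes N - 1 times */
        next = prev + curr;
        prev = curr;
        curr = next;
    }
    return curr;
}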