CHAPTER 1: Algorithm Analysis
Compiled by: Dr. Mohammad Omar Alhawarat


What is an Algorithm?
- An algorithm is a set of clear instructions for solving a problem.
- Algorithms are usually written in an English-like notation called pseudocode.
- Algorithms are not programs!

Why Does Performance Matter?
- Suppose N = 10^6. A PC can read/process N records in 1 sec.
- But if some algorithm does N*N computations, that takes 10^6 seconds ≈ 11.6 days!
- 100-city Traveling Salesman Problem: even a supercomputer checking 100 billion tours/sec would still need years!
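The arithmetic on this slide can be checked directly. A quick sketch in Python, assuming the slide's stated rate of 10^6 records per second:

```python
N = 10**6
rate = 10**6                 # records processed per second (the slide's assumption)
seconds = (N * N) / rate     # an N*N algorithm performs 10**12 operations
days = seconds / 86_400      # 86,400 seconds in a day
print(round(days, 1))        # about 11.6 days
```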

Factors Affecting the Choice of an Algorithm
- Running time (speed).
- Storage usage.
- Graphical user interface (GUI) / ease of use.
- Security.
- Maintenance.
- Portability.
- Open source.

Algorithm Performance
- Performance = efficiency = complexity.
- The performance of an algorithm is usually evaluated by either:
  - the empirical method, or
  - the analytical method.

Empirical Method
- Suppose we have two algorithms, A and B, that solve problem X.
- To compare them empirically, we implement each as a program in some programming language.
- We then run both programs and see which one is faster.

Why Is the Empirical Method Not Feasible?
Because when a program runs on a computer, several factors affect its speed:
- CPU speed.
- RAM size.
- Hard disk drive (HDD).
- Graphics card.
- Operating system (OS).
- Compiler.
- Environment (backup, antivirus, etc.).

Analytical Method
- Each algorithm is mapped to a function.
- The functions are then compared in terms of their growth rates to decide which algorithm is better.
- First, a short mathematical review covers exponents, logarithms, and the concept of growth rate.

Mathematical Review: Exponents
- X^0 = 1 (by definition)
- X^a · X^b = X^(a+b)
- X^a / X^b = X^(a−b)   (Exercise: show that X^(−n) = 1 / X^n)
- (X^a)^b = X^(ab)
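These identities can be spot-checked numerically. A minimal sketch (the values chosen are powers of two, so the comparisons are exact in floating point):

```python
X, a, b = 2.0, 5, 3

assert X**0 == 1                    # X^0 = 1 by definition
assert X**a * X**b == X**(a + b)    # product rule
assert X**a / X**b == X**(a - b)    # quotient rule
assert (X**a)**b == X**(a * b)      # power rule
assert X**(-3) == 1 / X**3          # the "show that" exercise, for one value
print("exponent identities hold for X=2, a=5, b=3")
```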

Mathematical Review: Logarithms
- log_a X = Y  ⇔  a^Y = X, for a > 0, X > 0. E.g., log_2 8 = 3 since 2^3 = 8.
- log_a 1 = 0, because a^0 = 1.
Notation used in these slides:
- log X means log_2 X
- lg X means log_10 X
- ln X means log_e X, where e is the natural base

Mathematical Review: Logarithms (cont.)
- log_a(XY) = log_a X + log_a Y
- log_a(X/Y) = log_a X − log_a Y
- log_a(X^n) = n · log_a X
- log_a b = (log_2 b) / (log_2 a)
- a^(log_a x) = x
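These identities can also be spot-checked numerically; a minimal sketch using the standard library:

```python
import math

a, X, Y, n = 2.0, 8.0, 32.0, 3

assert math.isclose(math.log(X * Y, a), math.log(X, a) + math.log(Y, a))
assert math.isclose(math.log(X / Y, a), math.log(X, a) - math.log(Y, a))
assert math.isclose(math.log(X**n, a), n * math.log(X, a))

# change of base: log_a(b) = log_2(b) / log_2(a), shown here with a = 3
b = 10.0
assert math.isclose(math.log(b, 3.0), math.log2(b) / math.log2(3.0))

# a^(log_a x) = x
x = 5.0
assert math.isclose(a ** math.log(x, a), x)
print("logarithm identities hold for the sample values")
```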

Typical Function Families (figure)

Typical Functions: Examples (figure)

Typical Functions: Examples (plot of n^2, n log n, n^3, and 2^n)

Complexity & Computability (table omitted). Note: assume the computer does 1 billion operations per second.

Growth Rate: Cross-over Point (figure)

Typical Growth Rates
- c       : constant, written O(1)
- log N   : logarithmic
- N       : linear
- N log N : linearithmic
- N^2     : quadratic
- N^3     : cubic
- 2^N     : exponential
- N!      : factorial
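To see how far apart these families are, a small sketch evaluating each one at n = 20:

```python
import math

n = 20
families = [
    ("1 (constant)", 1),
    ("log N", math.log2(n)),
    ("N", n),
    ("N log N", n * math.log2(n)),
    ("N^2", n**2),
    ("N^3", n**3),
    ("2^N", 2**n),
    ("N!", math.factorial(n)),
]
for name, ops in families:
    print(f"{name:14s} {ops:>24,.0f}")
```

Even at n = 20, 2^N is already over a million and N! is about 2.4 × 10^18.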

Function Growth
- lim n = ∞ as n → ∞
- lim n^a = ∞ as n → ∞, for a > 0
- lim 1/n = 0 as n → ∞
- lim 1/n^a = 0 as n → ∞, for a > 0
- lim log n = ∞ as n → ∞
- lim a^n = ∞ as n → ∞, for a > 1

Function Growth: Limit Laws
- lim (f(x) + g(x)) = lim f(x) + lim g(x)
- lim (f(x) · g(x)) = lim f(x) · lim g(x)
- lim (f(x) / g(x)) = lim f(x) / lim g(x), when lim g(x) ≠ 0
- lim (f(x) / g(x)) = lim (f′(x) / g′(x))   (L'Hôpital's rule, for 0/0 or ∞/∞ forms)

Examples
- lim (n / n^2) = 0 as n → ∞
- lim (n^2 / n) = ∞ as n → ∞
- lim (n^2 / n^3) = 0 as n → ∞
- lim (n^3 / n^2) = ∞ as n → ∞
- lim (n / ((n+1)/2)) = 2 as n → ∞

Classifying Functions by Their Asymptotic Growth
Asymptotic growth: the rate of growth of a function. Given a particular differentiable function f(n), every other differentiable function falls into one of three classes:
- growing at the same rate
- growing faster
- growing slower

Asymptotic Notations in Algorithm Analysis
Classifying functions by their asymptotic growth:
- Little oh, little omega
- Theta
- Big Oh, Big Omega

Theta
f(n) and g(n) have the same rate of growth if
lim (f(n) / g(n)) = c as n → ∞, where 0 < c < ∞.
Notation: f(n) = Θ(g(n)), pronounced "theta".

Little oh
f(n) grows slower than g(n) (equivalently, g(n) grows faster than f(n)) if
lim (f(n) / g(n)) = 0 as n → ∞.
Notation: f(n) = o(g(n)), pronounced "little oh".

Little omega
f(n) grows faster than g(n) (equivalently, g(n) grows slower than f(n)) if
lim (f(n) / g(n)) = ∞ as n → ∞.
Notation: f(n) = ω(g(n)), pronounced "little omega".

Little omega and Little oh
If g(n) = o(f(n)), then f(n) = ω(g(n)).
Example: compare n and n^2.
- lim (n / n^2) = 0 as n → ∞, so n = o(n^2)
- lim (n^2 / n) = ∞ as n → ∞, so n^2 = ω(n)
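These limit tests can be approximated numerically: evaluating f(n)/g(n) at one large n stands in for the limit. A sketch (the function name, thresholds, and sample n are illustrative choices, not part of the slides):

```python
def growth_class(f, g, n=10**6):
    """Crudely classify f vs. g by the ratio f(n)/g(n) at a single large n."""
    r = f(n) / g(n)
    if r < 1e-3:
        return "f = o(g)"       # ratio heads to 0: f grows slower
    if r > 1e3:
        return "f = omega(g)"   # ratio heads to infinity: f grows faster
    return "f = Theta(g)"       # ratio near a constant: same rate

assert growth_class(lambda n: n, lambda n: n * n) == "f = o(g)"
assert growth_class(lambda n: n * n, lambda n: n) == "f = omega(g)"
assert growth_class(lambda n: (n + 1) / 2, lambda n: n) == "f = Theta(g)"
```

A single sample point is not a proof, of course; it only illustrates which way the ratio is heading.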

Algorithms with the Same Complexity
Two algorithms have the same complexity if the functions representing their operation counts have the same rate of growth. Among all functions with the same rate of growth, we choose the simplest one to represent the complexity.

Examples
Compare n and (n+1)/2:
lim (n / ((n+1)/2)) = 2, so they have the same rate of growth.
(n+1)/2 = Θ(n): the rate of growth of a linear function.

Examples
Compare n^2 and n^2 + 6n:
lim (n^2 / (n^2 + 6n)) = 1, so they have the same rate of growth.
n^2 + 6n = Θ(n^2): the rate of growth of a quadratic function.

Examples
Compare log n and log n^2:
lim (log n / log n^2) = 1/2, so they have the same rate of growth (log n^2 = 2 log n).
log n^2 = Θ(log n): a logarithmic rate of growth.
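The key step is that log(n^2) = 2 log n, so the ratio is exactly 1/2 for every n > 1, not just in the limit; a one-line check:

```python
import math

# log(n^2) = 2 log n, so log(n) / log(n^2) is exactly 1/2 for every n > 1
for n in (10.0, 1e6, 1e12):
    assert math.isclose(math.log(n) / math.log(n**2), 0.5)
print("ratio is 1/2 at every sampled n")
```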

Examples
- Θ(n^3): n^3,  5n^3 + 4n,  105n^3 + 4n^2 + 6n
- Θ(n^2): n^2,  5n^2 + 4n + 6
- Θ(log n): log n,  log n^2,  log(n + n^3)

Comparing Functions
- Same rate of growth: g(n) = Θ(f(n)).
- Different rates of growth: either
  - g(n) = o(f(n)): g(n) grows slower than f(n), and hence f(n) = ω(g(n)); or
  - g(n) = ω(f(n)): g(n) grows faster than f(n), and hence f(n) = o(g(n)).

Big-Oh Notation
f(n) = O(g(n)) if f(n) grows at the same rate as, or slower than, g(n):
f(n) = Θ(g(n)) or f(n) = o(g(n)).

Example
n + 5 = Θ(n) = O(n) = O(n^2) = O(n^3) = O(n^5)
The tightest estimate is n + 5 = Θ(n); the general practice is to use the Big-Oh notation: n + 5 = O(n).

Big-Omega Notation
Big-Omega (Ω) is the inverse of Big-Oh: if g(n) = O(f(n)), then f(n) = Ω(g(n)).
f(n) = Ω(g(n)) means f(n) grows faster than, or at the same rate as, g(n).


Big-Oh Rules  We disregard any lower-order term and choose the highest-order term n 2 + n = O(n 2 ) n 2 + nlog(n) = O(n 2 )  We disregard any coefficient of 5n 2 + 3n = O(n 2 ) 37

Problems
- N^2 = O(N^2)   true
- 2N  = O(N^2)   true
- N   = O(N^2)   true
- N^2 = O(N)     false
- 2N  = O(N)     true
- N   = O(N)     true

Problems
- N^2 = Θ(N^2)   true
- 2N  = Θ(N^2)   false
- N   = Θ(N^2)   false
- N^2 = Θ(N)     false
- 2N  = Θ(N)     true
- N   = Θ(N)     true

Examples
- T(N) = 6N + 4: with c = 7 and n0 = 4 we have T(N) ≤ 7N for all N ≥ 4, so T(N) = O(N) with f(N) = N.
- 7N + 4 = O(N)?  15N + 20 = O(N)?
- N^2 = O(N)?  N log N = O(N)?  N log N = O(N^2)?  N^2 = O(N log N)?
- N^10 = O(2^N)?
- 6N + 4 = Ω(N)?  Ω(7N)?  Ω(N + 4)?  Ω(N^2)?  Ω(N log N)?
- N log N = Ω(N^2)?
- 3 = O(1)?  Sum of i for i = 1 to N: is it O(N)?

Worst Case, Best Case, and Average Case
When sorting a list of numbers in ascending order, we might have one of the following cases:
- Worst case: the list is sorted in descending order, so the maximum number of swaps is performed.
- Average case: the list is in arbitrary order, so an average number of swaps is performed.
- Best case: the list is already sorted in ascending order, so the minimum number of swaps is performed.
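The slide does not name a specific sorting algorithm; purely as an illustration, here is a swap count for a simple bubble sort, showing how the three cases differ for n = 5 elements:

```python
def bubble_sort_swaps(a):
    """Sort a copy of the list with plain bubble sort and count the swaps."""
    a = list(a)
    swaps = 0
    for i in range(len(a) - 1):
        for j in range(len(a) - 1 - i):
            if a[j] > a[j + 1]:
                a[j], a[j + 1] = a[j + 1], a[j]
                swaps += 1
    return swaps

assert bubble_sort_swaps([1, 2, 3, 4, 5]) == 0       # best case: already ascending
assert bubble_sort_swaps([5, 4, 3, 2, 1]) == 10      # worst case: n(n-1)/2 swaps
assert 0 < bubble_sort_swaps([3, 1, 4, 5, 2]) < 10   # an arbitrary order falls in between
```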

Big-Oh Definition
T(N) = O(f(N)) if there are positive constants c and n0 such that T(N) ≤ c·f(N) for all N ≥ n0.


Examples for Big-Oh: Loop
sum = 0;
for( i = 0; i < n; i++ )
    sum = sum + i;
The running time is O(n).

Examples for Big-Oh: Nested Loop
sum = 0;
for( i = 0; i < n; i++ )
    for( j = 0; j < n; j++ )
        sum++;
The running time is O(n^2).

Examples for Big-Oh: Sequential Statements
sum = 0;                       // this loop is O(n)
for( i = 0; i < n; i++ )
    sum = sum + i;
sum = 0;                       // this loop is O(n^2)
for( i = 0; i < n; i++ )
    for( j = 0; j < 2*n; j++ )
        sum++;
The total running time is the maximum of the two: O(n^2).

Examples for Big-Oh: If Statement
if ( C )
    S1;
else
    S2;
The running time is the time to evaluate C plus the maximum of the running times of S1 and S2.

More Examples
O(n^3):
sum = 0;
for( i = 0; i < n; i++ )
    for( j = 0; j < n*n; j++ )
        sum++;

More Examples
O(n^2):
sum = 0;
for( i = 0; i < n; i++ )
    for( j = 0; j < i; j++ )
        sum++;
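Counting the inner-loop executions directly confirms this bound: the body runs 0 + 1 + ... + (n−1) = n(n−1)/2 times, which is Θ(n^2). A sketch of the same triangular loop:

```python
def triangular_count(n):
    """Run the slide's triangular nested loop and return how many times the body executes."""
    sum_ = 0
    for i in range(n):
        for j in range(i):   # inner loop runs i times on iteration i
            sum_ += 1
    return sum_

n = 100
assert triangular_count(n) == n * (n - 1) // 2   # ~ n^2 / 2, i.e. O(n^2)
```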

More Examples
O(n^3 log n):
for( j = 0; j < n*n; j++ )
    compute_val(j);
The complexity of compute_val(x) is given to be O(n log n); the loop runs n^2 times, so the total is n^2 · O(n log n) = O(n^3 log n).

More Examples 51 for (i = 0; i < n; i++) for (j = 0; j < m; j++) if (a[ i ][ j ] == x) return 1 ; return -1; O(n*m)