Algorithm Complexity L. Grewe

Algorithm Efficiency There are often many approaches (algorithms) to solve a problem. How do we choose between them? At the heart of computer program design are two (sometimes conflicting) goals:
1. To design an algorithm that is easy to understand, code, and debug.
2. To design an algorithm that makes efficient use of the computer's resources.

How to Measure Efficiency?
1. Empirical comparison (run the programs).
2. Asymptotic algorithm analysis.
Critical resources: time (and space). Factors affecting running time: for most algorithms, running time depends on the “size” of the input. Running time is expressed as T(n) for some function T on input size n.
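As a rough sketch of option 1 (empirical comparison), one might time a candidate implementation on a chosen input. The function, input size, and units below are illustrative, not part of the original slides:

    #include <chrono>
    #include <iostream>
    #include <vector>

    // Hypothetical candidate algorithm: sum the elements of a vector.
    long long sumLinear(const std::vector<int>& v) {
        long long s = 0;
        for (int x : v) s += x;
        return s;
    }

    int main() {
        std::vector<int> data(1000000, 1);            // input of "size" n = 1,000,000
        auto start = std::chrono::steady_clock::now();
        long long s = sumLinear(data);                // run the program on this input
        auto stop = std::chrono::steady_clock::now();
        std::chrono::duration<double, std::milli> ms = stop - start;
        std::cout << "sum = " << s << ", time = " << ms.count() << " ms\n";
        return 0;
    }

Empirical timings like this depend on the machine, compiler, and test inputs, which is one motivation for the machine-independent asymptotic analysis developed in the rest of these slides.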

Examples of Growth Rate
Example 1: find the position of the largest value in an array.

    // Return the position of the largest value in array[0..n-1]
    int largest(int array[], int n) {
      int currlarge = 0;                 // Position of largest value seen so far
      for (int i = 1; i < n; i++)        // For each element
        if (array[currlarge] < array[i])
          currlarge = i;                 // Remember its position
      return currlarge;                  // Return position of largest value
    }

As n grows, how does T(n) grow? Cost: T(n) = c₁n + c₂ steps.

Examples (cont.)
Example 2: a single assignment statement. Its cost does not depend on n: constant growth.
Example 3: nested loops.

    sum = 0;
    for (i = 1; i <= n; i++)
      for (j = 1; j < n; j++)
        sum++;

T(n) = c₁n² + c₂.

Growth Rate Graph

Best, Worst, Average Cases
Not all inputs of a given size take the same time to run. Consider sequential search for a key K in an array of n integers: begin at the first element and examine each element in turn until K is found.
Best case: cost = 1 comparison.
Worst case: cost = n comparisons.
Average case: (n+1)/2 comparisons, if K is equally likely to be in any position.
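A minimal sketch of the sequential search described above (the function name is illustrative):

    // Sequential search: return the position of K in array[0..n-1], or -1 if absent.
    int seqSearch(const int array[], int n, int K) {
        for (int i = 0; i < n; i++)    // examine each element in turn
            if (array[i] == K)
                return i;              // best case: K is first, 1 comparison
        return -1;                     // worst case: all n elements compared
    }

If K is equally likely to occupy any of the n positions, the loop performs (n+1)/2 comparisons on average.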

Which Analysis to Use? While average time appears to be the fairest measure, how do we measure it? When is worst-case time important? Average-time analysis requires knowledge of the input distribution; for example, the average case in the last example assumed that K is equally likely to be in any position.

Faster Computer or Algorithm? What happens when we buy a computer 10 times faster?

    T(n)        n       n'      Change              n'/n
    10n         1,000   10,000  n' = 10n            10
    20n         500     5,000   n' = 10n            10
    5n log n    250     1,842   √10 n < n' < 10n    7.37
    2n²         70      223     n' = √10 n          3.16
    2ⁿ          13      16      n' = n + 3          --

When we move to a computer 10 times faster: n is the size of input that can be processed in one hour on the old machine (10,000 steps); n' is the size of input that can be processed in one hour on the new machine (100,000 steps). Note: for 2ⁿ, if n = 1000, then n' would be only about 1003.
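One way to see where the last two rows of the table come from, assuming the faster machine performs 10 times as many steps in the same hour:

For T(n) = 2n²: 2(n')² = 10 · 2n², so n' = √10 n ≈ 3.16n.
For T(n) = 2ⁿ: 2^(n') = 10 · 2ⁿ, so n' = n + log₂10 ≈ n + 3.3, which is why n = 1000 improves only to about 1003.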

Asymptotic Analysis: Big-oh O()
Definition: For T(n) a non-negatively valued function, T(n) is in the set O(f(n)) if there exist two positive constants c and n₀ such that T(n) ≤ cf(n) for all n > n₀.
Usage: the algorithm is in O(n²) in the [best, average, worst] case (pick one of best, average, worst).
Meaning: for all data sets big enough (i.e., n > n₀), the algorithm always executes in fewer than cf(n) steps in the [best, average, worst] case. Big-oh notation gives an upper bound.

Big-oh Notation
O() is an upper bound. Example: if T(n) = 3n², then T(n) is in O(n²).
Use the tightest bound: while T(n) = 3n² is also in O(n³), we prefer O(n²). Saying O(n²) provides more information in this example than saying O(n³).
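To check this against the definition, one workable choice of constants is c = 3 and n₀ = 1: for all n > 1, T(n) = 3n² ≤ 3n² = c·n², so T(n) is in O(n²). Similarly, 3n² ≤ 3n³ for all n > 1, which is why T(n) is also (less usefully) in O(n³).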

Big-Oh Examples
Example 1: finding value X in an array (average cost). T(n) = cₛn/2. For all values of n > 1, cₛn/2 ≤ cₛn. Therefore, by the definition, T(n) is in O(n) for n₀ = 1 and c = cₛ.

A Common Misunderstanding
“The best case for my algorithm is n = 1 because that is the fastest.” WRONG! Big-oh refers to a growth rate as n grows to ∞. Best case is defined by which input of size n is cheapest among all inputs of size n.

Big-Omega Ω(): Lower Bound
Definition: For T(n) a non-negatively valued function, T(n) is in the set Ω(g(n)) if there exist two positive constants c and n₀ such that T(n) ≥ cg(n) for all n > n₀.
Meaning: for all data sets big enough (i.e., n > n₀), the algorithm always executes in more than cg(n) steps. Ω() is a lower bound.

Big-Omega Example
T(n) = c₁n² + c₂n. Since c₁n² + c₂n ≥ c₁n² for all n > 1, we have T(n) ≥ cn² for c = c₁ and n₀ = 1. Therefore, T(n) is in Ω(n²) by the definition. Always use the greatest (tightest) lower bound.

Theta Notation Θ()
When big-Oh and Ω meet, we indicate this by using Θ (big-Theta) notation. Definition: an algorithm is said to be Θ(h(n)) if it is in O(h(n)) and it is in Ω(h(n)). NOTE: for polynomial expressions of T(n), we always have Θ.
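A brief illustration combining the two bounds, continuing the Big-Omega example (constants chosen for convenience): T(n) = c₁n² + c₂n is in O(n²), taking c = c₁ + c₂ and n₀ = 1, since c₁n² + c₂n ≤ (c₁ + c₂)n² for all n > 1; and it is in Ω(n²), as shown on the previous slide. Therefore T(n) is in Θ(n²).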

A Common Misunderstanding
Do not confuse the worst case with an upper bound. An upper bound refers to a growth rate; the worst case refers to the worst input among all possible inputs of a given size.

Space/Time Tradeoff Principle
One can often reduce time if one is willing to sacrifice space, or vice versa.
- Encoding or packing information (e.g., Boolean flags)
- Table lookup (e.g., precomputed factorials; see the sketch below)
Disk-based Space/Time Tradeoff Principle: the smaller you make the disk storage requirements, the faster your program will run.
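As a hedged illustration of the table lookup idea, a program that needs many factorial values could compute them once and answer later requests from a small array. The bound of 20 and the names below are illustrative (20! is the largest factorial that fits in a signed 64-bit integer):

    #include <iostream>

    // Space/time tradeoff: precompute 0!..20! once, then answer each
    // later request with a single array lookup instead of a loop.
    const int MAXN = 20;
    long long factTable[MAXN + 1];

    void buildFactorialTable() {
        factTable[0] = 1;
        for (int i = 1; i <= MAXN; i++)
            factTable[i] = factTable[i - 1] * i;   // fill the table once, O(MAXN)
    }

    long long factorial(int n) {
        return factTable[n];                       // each query is now O(1)
    }

    int main() {
        buildFactorialTable();
        std::cout << "10! = " << factorial(10) << "\n";   // prints 3628800
        return 0;
    }

The table costs a little extra space (21 integers) but removes the repeated multiplication loop from every later factorial request, which is exactly the trade the slide describes.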