Lecture: Algorithmic complexity

CS340 Data Structures: Algorithmic complexity

Lecture objectives
- Define algorithms and understand how they are used to solve problems
- Understand algorithmic complexity
- Use analytical thinking to estimate the complexity of your code
- Learn the complexity terminology

Algorithms (DS 2.2)

What is an algorithm?
A finite sequence of precise instructions that solves a problem or performs an action.
Everyday examples:
- Directions to a building
- How to install something
- Troubleshooting a machine
- A fire-extinguishing procedure
- The manual of your watch, iPhone, etc.

Algorithm vs. Program
Computer program:
- Written in a programming language
- Implements an algorithm
Algorithm:
- Written in natural language
- Solves a problem
Can algorithms solve every problem?

Can algorithms solve every problem?
No. Algorithms are countable, but problems are uncountable: there are too many problems and too few algorithms.
Example: the Turing halting problem takes a Turing machine (i.e., a computer program) as input and asks whether that machine will ever halt. Unfortunately, there is no algorithm that can solve this problem for every input.

    while (true) { }   // runs forever; does not halt!

Algorithm Efficiency: Big-O

Algorithm Complexity
How do we measure complexity?
- Time
- Space (memory)
- Communication
Evaluate the:
- Average case
- Worst case
- Best case

Algorithm Complexity
- Time: how many seconds does it take to execute our code?
- Space: how much memory do we use?
- Communication: how many exchanges with other programs, external input, etc.?
These measurements vary:
- From computer to computer
- With different inputs to our program

Algorithm complexity and Java
What happens when you start running a Java program?
- The JVM (Java Virtual Machine) is loaded
- The .class file of your program is loaded
- The .class files referenced by your program are loaded
- Finally, your program executes

Algorithm Efficiency and Big-O
- A precise measure of the performance of an algorithm is difficult to obtain (Java profilers have their pros and cons)
- Big-O describes the performance of an algorithm as a function of the number of items to be processed
- It lets us compare algorithms and reason about very large problems

Linear Growth Rate
Processing time increases in proportion to the number of inputs n:

    public static int search(int[] x, int target) {
        for (int i = 0; i < x.length; i++) {
            if (x[i] == target) return i;   // found: return the index
        }
        return -1;   // target not found
    }

Linear Growth Rate (cont.)
For the search method above:
- In the worst case, the for loop executes x.length times (why? the target may be at the end or not in the array at all)
- On average, the for loop executes about (x.length + 1)/2 times
- Total execution time is directly proportional to x.length
- The growth rate is of order n, i.e., O(n)
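A minimal driver (my own sketch, with a hypothetical class name and made-up data) showing best-case and worst-case calls to search:

    public class SearchDemo {
        public static int search(int[] x, int target) {
            for (int i = 0; i < x.length; i++) {
                if (x[i] == target) return i;
            }
            return -1;
        }

        public static void main(String[] args) {
            int[] x = {4, 8, 15, 16, 23, 42};
            System.out.println(search(x, 4));    // best case: found at index 0, 1 comparison
            System.out.println(search(x, 42));   // found at the last index, x.length comparisons
            System.out.println(search(x, 99));   // worst case: not found, x.length comparisons, returns -1
        }
    }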

n x m Growth Rate
Processing time can depend on two different inputs:

    public static boolean areDifferent(int[] x, int[] y) {
        for (int i = 0; i < x.length; i++) {
            if (search(y, x[i]) != -1) return false;   // a common element was found
        }
        return true;   // no element of x appears in y
    }

n x m Growth Rate (cont.)
For the areDifferent method above:
- The for loop executes x.length times
- Each call to search executes up to y.length times
- Total execution time is proportional to x.length * y.length
- The growth rate has an order of n x m, i.e., O(n x m)

Quadratic Growth Rate
Processing time is proportional to the square of the number of inputs n:

    public static boolean areUnique(int[] x) {
        for (int i = 0; i < x.length; i++) {
            for (int j = 0; j < x.length; j++) {
                if (i != j && x[i] == x[j]) return false;   // duplicate found
            }
        }
        return true;   // no duplicates
    }
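As a side note (my own addition, not on the original slide, with a hypothetical method name), the same check can skip symmetric comparisons by starting j at i + 1; the i != j test becomes unnecessary, and the growth rate is still O(n^2):

    public static boolean areUniqueTriangular(int[] x) {
        for (int i = 0; i < x.length; i++) {
            for (int j = i + 1; j < x.length; j++) {   // each pair is compared only once
                if (x[i] == x[j]) return false;        // duplicate found
            }
        }
        return true;   // no duplicates
    }

This triangular loop pattern is exactly the shape analyzed later in Big-O Example 2.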

Big-O Notation O()
"Order of magnitude", i.e., an upper bound.
Look at the loops in the code. Assuming a loop body consists only of simple statements:
- a single loop is O(n)
- a pair of nested loops is O(n^2)
- a nested pair of loops inside another is O(n^3)
- and so on ...

Big-O Notation (cont.)
Examine the number of times a loop is executed:

    for (int i = 1; i < x.length; i *= 2) {
        // Do something with x[i]
    }

The loop body executes about log2(x.length) times: i takes the values 1, 2, 4, 8, 16, ..., doubling on each iteration, until it reaches or exceeds x.length. If 2^(k-1) <= x.length < 2^k, then, since log2(2^k) = k, we have k-1 <= log2(x.length) < k, so the body executes at most k times.
The growth rate is O(log n).
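A minimal sketch (not from the original slides) that counts the iterations of this doubling loop and compares the count to log2(n):

    public class LogLoopCount {
        public static void main(String[] args) {
            for (int n : new int[]{10, 100, 1_000, 1_000_000}) {
                int count = 0;
                for (int i = 1; i < n; i *= 2) {
                    count++;   // one loop-body execution per doubling of i
                }
                // count should be close to log2(n) = ln(n) / ln(2)
                System.out.printf("n = %-9d iterations = %-3d log2(n) = %.2f%n",
                                  n, count, Math.log(n) / Math.log(2));
            }
        }
    }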

Formal Definition of Big-O
Consider the following code skeleton, where "Simple Statement" stands for any constant-time statement:

    for (int i = 0; i < n; i++) {
        for (int j = 0; j < n; j++) {
            Simple Statement
        }
    }
    for (int i = 0; i < n; i++) {
        Simple Statement 1
        Simple Statement 2
        Simple Statement 3
        Simple Statement 4
        Simple Statement 5
    }
    Simple Statement 6
    Simple Statement 7
    ...
    Simple Statement 30

Formal Definition of Big-O (cont.)
In the code above, the nested pair of loops executes a Simple Statement n^2 times.

Formal Definition of Big-O (cont.)
The second loop executes 5 Simple Statements n times, for 5n statement executions.

Formal Definition of Big-O (cont.)
Finally, the remaining 25 Simple Statements (6 through 30) are each executed once.

Formal Definition of Big-O (cont.)
Putting it together, the total work for this code is T(n) = n^2 + 5n + 25.
For small n (5, 6, etc.) the n^2 term is not so dominant, but what if n = 1000?

Formal Definition of Big-O (cont.)
We write T(n) = O(f(n)) if there exist two constants, n0 and c, both greater than zero, and a function f(n), such that for all n > n0, cf(n) ≥ T(n).
- T(n) = O(f(n)) means T(n) grows no faster than f(n) asymptotically
- In other words, once n gets sufficiently large (larger than n0), there is some constant c for which the processing time is always less than or equal to cf(n)
- cf(n) is an upper bound on performance: a worst-case guarantee
- "Asymptotic" rather than exact: like an asymptote, where the distance approaches zero but the lines never meet

Formal Definition of Big-O (cont.)
The growth rate of f(n) is determined by its fastest-growing term.
In the example, an algorithm that is O(n^2 + 5n + 25) is more simply expressed as O(n^2).
Rules of thumb:
- Ignore all constant factors
- Drop the lower-order terms

Big-O Example 1
Given T(n) = n^2 + 5n + 25, show that this is O(n^2).
- Find constants n0 and c so that, for all n > n0, c·n^2 > n^2 + 5n + 25
- Find the point where c·n^2 = n^2 + 5n + 25: let n = n0 and solve for c, giving c = 1 + 5/n0 + 25/n0^2
- When n0 is 5, c = 1 + 5/5 + 25/25 = 3
- So 3n^2 > n^2 + 5n + 25 for all n > 5
- Other values of n0 and c also work
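A quick numeric check (a throwaway sketch of my own, not part of the slides) that 3n^2 ≥ n^2 + 5n + 25 holds for every n above n0 = 5, at least over a test range:

    public class BigOExample1Check {
        public static void main(String[] args) {
            int n0 = 5;
            double c = 3.0;
            for (int n = n0 + 1; n <= 1_000; n++) {
                double t = (double) n * n + 5.0 * n + 25.0;   // T(n) = n^2 + 5n + 25
                double bound = c * n * n;                      // c * f(n) with f(n) = n^2
                if (bound < t) {
                    System.out.println("Bound fails at n = " + n);
                    return;
                }
            }
            System.out.println("3n^2 >= n^2 + 5n + 25 for all n in (5, 1000]");
        }
    }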

Big-O Example 1 (cont.)
If the coefficients are all positive, find a value of n at which the leading term is at least as large as the rest of the terms combined; beyond that point the leading term dominates.

Big-O Example 2
Consider the following loop:

    for (int i = 0; i < n; i++) {
        for (int j = i + 1; j < n; j++) {
            // 3 simple statements
        }
    }

T(n) = 3(n - 1) + 3(n - 2) + ... + 3
Factoring out the 3: 3((n - 1) + (n - 2) + ... + 1)
and 1 + 2 + ... + (n - 1) = (n x (n - 1)) / 2

Big-O Example 2 (cont.)
Therefore T(n) = 3 x (n(n - 1)/2) = 1.5n^2 - 1.5n
- When n = 0, the polynomial has the value 0
- For values of n > 1, 1.5n^2 > 1.5n^2 - 1.5n
- Therefore T(n) is O(n^2), with n0 = 1 and c = 1.5
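A small sketch (my own, not from the slides) that counts the statement executions of this triangular loop directly and compares them with the formula 1.5n^2 - 1.5n:

    public class BigOExample2Check {
        public static void main(String[] args) {
            for (int n : new int[]{5, 10, 100, 1_000}) {
                long statements = 0;
                for (int i = 0; i < n; i++) {
                    for (int j = i + 1; j < n; j++) {
                        statements += 3;   // the "3 simple statements" in the loop body
                    }
                }
                double formula = 1.5 * n * n - 1.5 * n;   // T(n) = 1.5n^2 - 1.5n
                System.out.printf("n = %-5d counted = %-9d formula = %.0f%n",
                                  n, statements, formula);
            }
        }
    }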

Symbols Used in Quantifying Performance
Symbol     Meaning
T(n)       Time the method or program takes as a function of the number of inputs n
f(n)       Any function of n
O(f(n))    Order of magnitude: the set of functions that grow no faster than f(n)

Common Growth Rates (slowest to fastest)
... < log(n) < sqrt(n) < n < n^2 < ... < a^n < n! < n^n < ...
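A short sketch (my own addition, not from the slides) that prints a few of these growth rates side by side, so the ordering is visible for concrete values of n:

    public class GrowthRates {
        public static void main(String[] args) {
            System.out.printf("%8s %12s %12s %12s %16s %22s%n",
                              "n", "log2(n)", "sqrt(n)", "n", "n^2", "2^n");
            for (int n : new int[]{10, 20, 50, 100}) {
                System.out.printf("%8d %12.2f %12.2f %12d %16d %22.3e%n",
                                  n,
                                  Math.log(n) / Math.log(2),   // log2(n)
                                  Math.sqrt(n),
                                  n,
                                  (long) n * n,
                                  Math.pow(2, n));
            }
        }
    }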

Different Growth Rates
(Referring to a chart of several growth functions, omitted here:)
- The exponential curve is smaller than all the others for small n
- The linear curve is larger than the quadratic one for n < 20
- For small values of n, a less efficient algorithm may actually be better
- It is the growth rate that matters as n gets large!

Effects of Different Growth Rates
(The accompanying table is omitted.) The ratio f(100)/f(50) shows how many times longer it takes to process 100 inputs than to process 50 inputs, for each growth function f.

Algorithms with Exponential and Factorial Growth Rates
Algorithms with exponential and factorial growth rates have an effective practical limit on the size of the problem they can be used to solve.
With an O(2^n) algorithm, if 100 inputs take an hour, then:
- 101 inputs will take 2 hours
- 105 inputs will take 32 hours
- 114 inputs will take 16,384 hours (almost 2 years!)
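A tiny sketch (my own, for illustration) reproducing these numbers: with an O(2^n) algorithm, going from 100 inputs to n inputs multiplies the running time by 2^(n-100):

    public class ExponentialBlowup {
        public static void main(String[] args) {
            double hoursFor100 = 1.0;   // assumption from the slide: 100 inputs take one hour
            for (int n : new int[]{101, 105, 114}) {
                double hours = hoursFor100 * Math.pow(2, n - 100);
                System.out.printf("%d inputs: %.0f hours (about %.1f years)%n",
                                  n, hours, hours / (24 * 365));
            }
        }
    }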

Algorithms with Exponential and Factorial Growth Rates (cont.)
- Encryption algorithms take advantage of this characteristic
- Some cryptographic keys can be broken by brute force in O(2^n) time, where n is the number of bits in the key
- A key length of 40 bits is considered breakable by a modern computer, but a 100-bit key would take about a billion billion (2^60, roughly 10^18) times longer to break than a 40-bit key

Practice
- What is the big-O of f(n) = n^2 + n^2·log(n)?
- What is the big-O of f(n) = n^3 + n^2·log(n)?
- What is the big-O of f(n) = (n·log(n) + n^2)(n + log(n))?

Algorithm Efficiency: Other complexities

More Greek letters: Big-Omega
Big-Omega: T(n) = Ω(f(n))
There exist two constants, n0 and c, both greater than zero, and a function f(n), such that for all n > n0, cf(n) ≤ T(n).
- A lower bound ("best case!")
- Means T(n) grows no slower than f(n) asymptotically

Big-Omega (cont.)
We say T(n) = Ω(f(n)) if f(n) = O(T(n)).
Example: since log n = O(n), we have n = Ω(log n).
Example: n^(1/2) = Ω(lg n). Use the definition with c = 1 and n0 = 16; it checks out. For n > 16: n^(1/2) > (1)·lg n if and only if n > (lg n)^2, by squaring both sides.
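A quick numeric spot-check (my own sketch) of the second example, verifying that sqrt(n) ≥ lg(n) once n reaches 16, over a test range:

    public class OmegaCheck {
        public static void main(String[] args) {
            boolean holds = true;
            for (int n = 16; n <= 1_000_000; n++) {
                double sqrt = Math.sqrt(n);
                double lg = Math.log(n) / Math.log(2);   // log base 2
                if (sqrt < lg) {                          // would contradict sqrt(n) = Omega(lg n) with c = 1
                    System.out.println("Fails at n = " + n);
                    holds = false;
                    break;
                }
            }
            if (holds) System.out.println("sqrt(n) >= lg(n) for all n in [16, 1000000]");
        }
    }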

Big-Theta
Big-Theta: T(n) = Θ(f(n)). The best fit!
- T(n) is bounded above and below by f(n)
- T(n) = Θ(f(n)) if and only if f(n) = Θ(T(n))
- T(n) = Θ(f(n)) if and only if f(n) = Ω(T(n)) and T(n) = Ω(f(n))
- Big-O is an upper bound; Big-Theta is a tight bound, i.e., both an upper and a lower bound
- When people only worry about the worst that can happen, big-O is sufficient: it says "it can't get much worse than this". The tighter the bound the better, of course, but a tight bound isn't always easy to compute.

Big-Theta (cont.)
Example: T(n) = n^2 - 5n + 13.
- The constant 13 doesn't change as n grows, so it is not crucial
- The low-order term, -5n, has little effect compared to the quadratic term n^2
- So T(n) = Θ(n^2)
Q: What does it mean to say T(n) = Θ(f(n))?
A: Intuitively, it means that the function T is the same order of magnitude as f.

Big-Theta (cont.)
For T(n) = 73n^3 + 22n^2 + 58:
- T(n) = O(n^100), which is identical to T(n) ∈ O(n^100)
- T(n) = O(n^3), which is identical to T(n) ∈ O(n^3)
- T(n) = Θ(n^3), which is identical to T(n) ∈ Θ(n^3)
The equivalent English statements are, respectively:
- T(n) grows asymptotically no faster than n^100
- T(n) grows asymptotically no faster than n^3
- T(n) grows asymptotically as fast as n^3
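As a supplement (my own addition, not on the slide), here is one choice of explicit constants witnessing the Θ(n^3) claim: for all n ≥ 1, 22n^2 ≤ 22n^3 and 58 ≤ 58n^3, so 73n^3 ≤ 73n^3 + 22n^2 + 58 ≤ (73 + 22 + 58)n^3 = 153n^3. This satisfies the Θ definition with lower constant 73, upper constant 153, and n0 = 1.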

Big-O vs Big-Theta
- If an algorithm is Θ(f(n)), its running time, as the input size n gets larger, is proportional to f(n).
- If an algorithm is O(f(n)), its running time, as n gets larger, is at most proportional to f(n).

P vs NP problems
Back to the question: can algorithms solve every problem?
- P: deterministic polynomial. Good news: these problems can be solved in polynomial time on a deterministic machine.
- NP: nondeterministic polynomial. No fast (polynomial-time) solution is known; the running times of known algorithms grow very fast!
Why should we care? There are many NP-complete problems out there!
