CSC 332 Data Structures: Generics and Analysis of Algorithms. Dr. Curry Guinn


For Next Class, Thursday
Homework 1 due tonight
–Quiz 2 – today, 01/21, before class! Up to 3 submissions
–Quiz 3 – Thursday, by class time
Homework 2 due Monday, 01/27, 11:59pm
For Thursday: Chapter 2, Sections

Quiz Answers Let’s go to Blackboard Learn and see

Generics
http://people.uncw.edu/guinnc/courses/spring14/332/notes/day2_unix/Generics.ppt

Is This Algorithm Fast? Problem: given a problem, how fast does this code solve it? "My program finds all the primes between 2 and 1,000,000,000 in 1.37 seconds." How good is this solution? We could try to measure the time it takes, but that measurement is subject to many confounds:
–the multitasking operating system
–the speed of the computer
–the language the solution is written in

Math background: exponents
Exponents: X^Y, or "X to the Yth power": X multiplied by itself Y times.
Some useful identities:
–X^A · X^B = X^(A+B)
–X^A / X^B = X^(A−B)
–(X^A)^B = X^(AB)
–X^N + X^N = 2X^N
–2^N + 2^N = 2^(N+1)

Math background: logarithms
–Definition: X^A = B if and only if log_X B = A.
–Intuition: log_X B means "the power X must be raised to, to get B".
–In this course, a logarithm with no base implies base 2: log B means log_2 B.
Examples:
–log_2 16 = 4 (because 2^4 = 16)
–log_10 1000 = 3 (because 10^3 = 1000)
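These identities can be spot-checked numerically; here is a minimal sketch (the class name LogDemo and the log2 helper are our choices, computing base-2 logs via Java's built-in Math.log and the change-of-base identity):

```java
public class LogDemo {
    // Change of base: log2(b) = ln(b) / ln(2)
    static double log2(double b) {
        return Math.log(b) / Math.log(2);
    }

    public static void main(String[] args) {
        System.out.println(log2(16));         // approximately 4.0, since 2^4 = 16
        System.out.println(Math.log10(1000)); // approximately 3.0, since 10^3 = 1000
        // Product rule: log(AB) = log A + log B
        System.out.println(log2(8 * 4));      // approximately log2(8) + log2(4) = 5.0
    }
}
```

Since these are floating-point computations, results should be compared with a small tolerance rather than exactly.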

Logarithm identities
Identities for logs with addition, multiplication, and powers:
–log (AB) = log A + log B
–log (A/B) = log A − log B
–log (A^B) = B log A
–log_A B = log_C B / log_C A

Some helpful mathematics
–N + N + N + … + N (a total of N times) = N·N = N^2, which is O(N^2)
–1 + 2 + 3 + … + N = N(N+1)/2 = N^2/2 + N/2, which is O(N^2)
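The second sum can be checked against its closed form directly; a small sketch (the class and method names are illustrative):

```java
public class SumDemo {
    // Sum 1 + 2 + ... + n by looping: O(n) additions.
    static long sumByLoop(int n) {
        long total = 0;
        for (int i = 1; i <= n; i++) {
            total += i;
        }
        return total;
    }

    // Closed form n(n+1)/2: O(1) to evaluate.
    static long closedForm(int n) {
        return (long) n * (n + 1) / 2;
    }

    public static void main(String[] args) {
        System.out.println(sumByLoop(1000));  // 500500
        System.out.println(closedForm(1000)); // 500500
    }
}
```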

Analysis of Algorithms What do we mean by an “efficient” algorithm? –We mean an algorithm that uses few resources. –By far the most important resource is time. –Thus, when we say an algorithm is efficient (assuming we do not qualify this further), we mean that it can be executed quickly.

Is there some way to measure efficiency that does not depend on the state of current technology? –Yes! The Idea –Determine the number of “steps” an algorithm requires when given some input. We need to define “step” in some reasonable way. –Write this as a formula, based on the input.

Generally, when we determine the efficiency of an algorithm, we are interested in: –Time Used by the Algorithm Expressed in terms of number of steps. People also talk about “space efficiency”, etc. –How the Size of the Input Affects Running Time Think about giving an algorithm a list of items to operate on. The size of the problem is the length of the list. –Worst-Case Behavior What is the slowest the algorithm ever runs for a given input size? Occasionally we also analyze average-case behavior.

RAM model
Typically we use a simple model for basic operation costs: the RAM (Random Access Machine) model.
–The RAM model has all the basic operations: +, -, *, /, =, comparisons
–fixed-size integers (e.g., 32-bit)
–infinite memory
–All basic operations take exactly one time unit (one CPU instruction) to execute

Critique of the model Strengths: –simple –easier to prove things about the model than the real machine –can estimate algorithm behavior on any hardware/software Weaknesses: –not all operations take the same amount of time in a real machine –does not account for page faults, disk accesses, limited memory, floating point math, etc

Relative rates of growth Most algorithms' runtime can be expressed as a function of the input size N Rate of growth: measure of how quickly the graph of a function rises Goal: distinguish between fast- and slow- growing functions –We only care about very large input sizes (for small sizes, most any algorithm is fast enough)

Growth rate example
Consider graphs of two functions; each might represent an algorithm's running time:
–n^3 + 2n^2
–100n^2
Which grows faster?

Growth rate example How about now?

“The fundamental law of computer science: As machines become more powerful, the efficiency of algorithms grows more important, not less.” — Nick Trefethen An algorithm (or function or technique …) that works well when used with large problems & large systems is said to be scalable. –Or “it scales well”.

Big O
Definition: T(N) = O(f(N)) if there exist positive constants c, n0 such that T(N) ≤ c · f(N) for all N ≥ n0.
Idea: we are concerned with how the function grows when N is large. We are not picky about constant factors: coarse distinctions among functions.
Lingo: "T(N) grows no faster than f(N)."
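To make the definition concrete, here is a sketch (the class name and the particular witnesses are our choices) checking that T(N) = 5N + 3 is O(N), witnessed by c = 6 and n0 = 3, since 5N + 3 ≤ 6N exactly when N ≥ 3:

```java
public class BigODemo {
    static long t(long n) { return 5 * n + 3; }

    // Check T(N) <= c * N for every N from n0 up to limit.
    static boolean boundHolds(long c, long n0, long limit) {
        for (long n = n0; n <= limit; n++) {
            if (t(n) > c * n) return false;
        }
        return true;
    }

    public static void main(String[] args) {
        System.out.println(boundHolds(6, 3, 1_000_000)); // true
        System.out.println(t(2) <= 6 * 2);               // false: below n0 the bound may fail
    }
}
```

Of course a finite scan is only evidence, not a proof; the algebra (3 ≤ N implies 5N + 3 ≤ 6N) is what actually establishes the bound for all N ≥ n0.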

Big O
∃ c, n0 > 0 such that f(N) ≤ c·g(N) when N ≥ n0: f(N) grows no faster than g(N) for "large" N.

Preferred big-O usage
Pick the tightest bound. If f(N) = 5N, then:
–f(N) = O(N^5)
–f(N) = O(N^3)
–f(N) = O(N log N)
–f(N) = O(N)  ← preferred
Ignore constant factors and low-order terms:
–T(N) = O(N), not T(N) = O(5N)
–T(N) = O(N^3), not T(N) = O(N^3 + N^2 + N log N)
Remove non-base-2 logarithms:
–f(N) = O(N log_6 N)
–f(N) = O(N log N)  ← preferred

Big-O of selected functions

Big Omega, Theta
Defn: T(N) = Ω(g(N)) if there are positive constants c and n0 such that T(N) ≥ c·g(N) for all N ≥ n0.
–Lingo: "T(N) grows no slower than g(N)."
Defn: T(N) = Θ(g(N)) if and only if T(N) = O(g(N)) and T(N) = Ω(g(N)).
–Big-O, Omega, and Theta establish a relative ordering among all functions of N.

Intuition about the notations
notation        intuition
O (Big-O)       ≤
Ω (Big-Omega)   ≥
Θ (Theta)       =
o (little-o)    <

Big-Omega
∃ c, n0 > 0 such that f(N) ≥ c·g(N) when N ≥ n0: f(N) grows no slower than g(N) for "large" N.

Big Theta: f(N) = Θ(g(N)) means the growth rate of f(N) is the same as the growth rate of g(N).

An O(1) algorithm is constant time.
–The running time of such an algorithm is essentially independent of the input.
–Such algorithms are rare, since they cannot even read all of their input.
An O(log_b n) algorithm (for some b) is logarithmic time.
–We do not care what b is.
An O(n) algorithm is linear time.
–Such algorithms are not rare.
–This is as fast as an algorithm can be and still read all of its input.
An O(n log_b n) algorithm (for some b) is log-linear time.
–This is about as slow as an algorithm can be and still be truly useful (scalable).
An O(n^2) algorithm is quadratic time.
–These are usually too slow.
An O(b^n) algorithm (for some b) is exponential time.
–These algorithms are much too slow to be useful.
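The gap between linear and logarithmic time is easy to see by counting steps; a sketch (names are illustrative; logSteps counts how many halvings reduce n to 1, i.e. roughly log_2 n, the worst-case probe count of a binary search):

```java
public class GrowthDemo {
    // Worst-case steps for a linear scan over n items: n.
    static int linearSteps(int n) { return n; }

    // Number of halvings to get from n down to 1: floor(log2 n).
    static int logSteps(int n) {
        int steps = 0;
        while (n > 1) {
            n /= 2;
            steps++;
        }
        return steps;
    }

    public static void main(String[] args) {
        System.out.println(linearSteps(1_000_000)); // 1000000
        System.out.println(logSteps(1_000_000));    // 19, since 2^19 <= 10^6 < 2^20
    }
}
```

A million items cost a million steps linearly but only about twenty logarithmically, which is why the hierarchy above matters so much at scale.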

Hammerin' the terminology
T(N) = O(f(N))
–f(N) is an upper bound on T(N)
–T(N) grows no faster than f(N)
T(N) = Ω(g(N))
–g(N) is a lower bound on T(N)
–T(N) grows at least as fast as g(N)
T(N) = o(h(N)) (little-o)
–T(N) grows strictly slower than h(N)

Notations
O (Big-O): asymptotically less than or equal to
Ω (Big-Omega): asymptotically greater than or equal to
Θ (Big-Theta): asymptotically equal to
o (little-o): asymptotically strictly less than

Facts about big-O
If T(N) is a polynomial of degree k, then T(N) = Θ(N^k).
–Example: 17n^3 + 2n^2 + 4n + 1 = Θ(n^3)

Hierarchy of Big-O
Functions, ranked in increasing order of growth:
–1
–log log n
–log n
–n
–n log n
–n^2
–n^2 log n
–n^3
–…
–2^n
–n!
–n^n

Various growth rates

Techniques for Determining Which Grows Faster
Evaluate the limit of f(N)/g(N) as N → ∞:
–limit is 0: f(N) = o(g(N))
–limit is c ≠ 0: f(N) = Θ(g(N))
–limit is ∞: g(N) = o(f(N))
–no limit: no relation

Techniques, cont'd
L'Hôpital's rule: if f(N) → ∞ and g(N) → ∞ as N → ∞, then lim f(N)/g(N) = lim f'(N)/g'(N).
Example: f(N) = N, g(N) = log N. Using L'Hôpital's rule: f'(N) = 1, g'(N) = 1/N, so lim f(N)/g(N) = lim N = ∞, hence g(N) = o(f(N)).

Program loop runtimes
for (int i = 0; i < n; i += c)   // O(n)
    statement(s);
Adding a constant to the loop counter means the loop's runtime grows linearly with its bound n.
–The loop executes its body n / c times (rounding up).

for (int i = 1; i < n; i *= c)   // O(log n)
    statement(s);
(Note: the counter must start at a nonzero value; starting at 0 would loop forever, since 0 * c = 0.)
Multiplying the loop counter means the bound n must grow exponentially to linearly increase the number of iterations; the runtime is therefore logarithmic.
–The loop executes its body about log_c n times.

for (int i = 0; i < n * n; i += c)   // O(n^2)
    statement(s);
The loop bound is n^2, so the runtime is quadratic.
–The loop executes its body n^2 / c times.

More loop runtimes
Nesting loops multiplies their runtimes.
for (int i = 0; i < n; i += c) {   // O(n^2)
    for (int j = 0; j < n; j += c) {
        statement;
    }
}

Loops in sequence add together their runtimes, so the loop with the larger runtime dominates.
for (int i = 0; i < n; i += c) {   // O(n)
    statement;
}
for (int i = 0; i < n; i += c) {   // O(n log n)
    for (int j = 1; j < n; j *= c) {
        statement;
    }
}
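The iteration counts claimed in the loop slides can be confirmed empirically; a sketch (the class name and the sample values n = 1000, c = 2 are our choices):

```java
public class LoopCountDemo {
    // Additive update: executes n / c times (for n divisible by c).
    static int linearCount(int n, int c) {
        int count = 0;
        for (int i = 0; i < n; i += c) count++;
        return count;
    }

    // Multiplicative update: executes about log_c(n) times.
    // The counter starts at 1; starting at 0 would never terminate.
    static int logCount(int n, int c) {
        int count = 0;
        for (int i = 1; i < n; i *= c) count++;
        return count;
    }

    // Bound of n * n: executes n * n / c times.
    static int quadraticCount(int n, int c) {
        int count = 0;
        for (int i = 0; i < n * n; i += c) count++;
        return count;
    }

    public static void main(String[] args) {
        System.out.println(linearCount(1000, 2));    // 500
        System.out.println(logCount(1000, 2));       // 10, since 2^10 = 1024 >= 1000
        System.out.println(quadraticCount(1000, 2)); // 500000
    }
}
```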

Types of runtime analysis
Express the running time as f(N), where N is the size of the input.
–Worst case: your enemy gets to pick the input.
–Average case: you must assume a probability distribution on the inputs.

Some rules
When considering the growth rate of a function using Big-O:
–Ignore lower-order terms and the coefficient of the highest-order term.
–No need to specify the base of the logarithm: changing the base from one constant to another changes the value of the logarithm by only a constant factor.
–If T1(N) = O(f(N)) and T2(N) = O(g(N)), then:
  T1(N) + T2(N) = max(O(f(N)), O(g(N)))
  T1(N) * T2(N) = O(f(N) * g(N))
