CS420 lecture two Fundamentals


CS420 lecture two Fundamentals wim bohm, cs, CSU

Asymptotics Asymptotics show the relative growth of functions by comparing them to other functions. There are different notations, where f and g are positive functions: f(x) ~ g(x) means lim x→∞ f(x)/g(x) = 1; f(x) = o(g(x)) means lim x→∞ f(x)/g(x) = 0.

Big O, big omega, big theta
f(x) = O(g(x)) iff there are positive constants c and n0 such that f(x) ≤ c·g(x) for all x > n0
f(x) = Ω(g(x)) iff there are positive constants c and n0 such that f(x) ≥ c·g(x) for all x > n0
f(x) = Θ(g(x)) iff f(x) = O(g(x)) and f(x) = Ω(g(x))
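A worked instance of the definition may help (my example, not from the slides): showing 3x² + 5x = O(x²) by exhibiting witnesses c and n0.

```latex
3x^2 + 5x \;\le\; 3x^2 + 5x^2 \;=\; 8x^2 \quad (x \ge 1)
\;\Rightarrow\; 3x^2 + 5x = O(x^2) \ \text{with}\ c = 8,\ n_0 = 1.
```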

Big O etc. Big O is used in upper bounds, i.e. the worst- or average-case complexity of an algorithm. Big theta is a tight bound, i.e. stronger than an upper bound. Big omega is used in lower bounds, i.e. the complexity of a problem. (So we were sloppy in the last lecture.)

Closed / open problems A closed problem has a lower bound Ω(f(x)) and an algorithm with upper bound O(f(x)), e.g. searching, sorting. What about matrix multiply? An open problem has lower bound < upper bound.

lower bounds An easy lower bound on a problem is the size of the output it needs to produce, or the number of inputs it has to access. Generate all permutations of size n: lower bound? Towers of Hanoi: lower bound? Sum n input integers: lower bound? but... sum integers 1 to n?
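Hedged answers, worked out here rather than stated on the slide:

```latex
\text{All permutations of size } n:\ \Omega(n \cdot n!)\ \text{(the output has } n \cdot n! \text{ symbols)}\\
\text{Towers of Hanoi: } \Omega(2^n)\ \text{(} 2^n - 1 \text{ moves are required)}\\
\text{Sum } n \text{ input integers: } \Omega(n)\ \text{(every input must be read)}\\
\text{but } \sum_{i=1}^{n} i = \frac{n(n+1)}{2},\ \text{a closed form: } O(1)\ \text{arithmetic once } n \text{ is known.}
```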

growth rates f(n) = O(1): constant. Scalar operations (+, -, *, /) when input size is not measured in #bits. Straight-line code of simple assignments (x = simple expression) and conditionals with simple sub-expressions. Function calls: discuss.

f(n) = log(n) Definition: b^x = a ⇔ x = log_b a, e.g. 2^3 = 8, log_2 8 = 3. log(x·y) = log x + log y because b^x · b^y = b^(x+y). log(x/y) = log x − log y. log x^a = a·log x. log x is a 1-to-1 monotonically growing function: log x = log y ⇔ x = y.

more log stuff log_a x = log_b x / log_b a, because x = a^(log_a x); taking log_b of both sides gives log_b x = log_a x · log_b a.

and more log stuff

log n and algorithms In algorithm analysis we often use log n when we should use floor(log(n)). Is that OK? Yes: floor(log(n)) = O(log(n)). When each step of an algorithm halves the size of the problem (or divides it by k), it takes log n steps to get to the base case. Notice that log_b1 n = O(log_b2 n) for any bases b1 and b2, so the base does not matter in O analysis. Does that work for exponents too? Is 2^n = O(3^n)? Is 3^n = O(2^n)?
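For the exponent question the answer is asymmetric (my working, not on the slide):

```latex
2^n \le 3^n \;\Rightarrow\; 2^n = O(3^n) \quad (c = 1,\ n_0 = 1),\\
\frac{3^n}{2^n} = \left(\tfrac{3}{2}\right)^n \to \infty \;\Rightarrow\; 3^n \ne O(2^n).
```

So unlike logarithm bases, the base of an exponential does matter in O analysis.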

log n and algorithms Algorithms with O(log n) complexity are often of a simple divide and conquer variety (base, solve-direct, and split are O(1)):

DivCo(P) {
    if base(P) solve-direct(P)
    else split(P) → P1 ... Pn
         DivCo(Pi)
}

e.g. Binary Search. Solve f(n) = f(n/2) + 1, f(1) = 1 by repeated substitution.

General Divide and Conquer

DivCo(P) {
    if base(P) solve-direct(P)
    else split(P) → P1 ... Pn
         combine(DivCo(P1) ... DivCo(Pn))
}

Depending on the costs of base, solve-direct, split, and combine we get different complexities (later).

f(n) = O(n) Linear complexity, e.g. Linear Search in an unsorted list. Also: polynomial evaluation. A(x) = a_n x^n + a_{n-1} x^{n-1} + ... + a_1 x + a_0 (a_n ≠ 0). Evaluate A(x0). How not to do it: a_n * exp(x,n) + a_{n-1} * exp(x,n-1) + ... + a_1 * x + a_0. Why not?
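Why not: a multiplication count, assuming exp(x,i) costs i−1 multiplications (my tally, not the slide's):

```latex
\sum_{i=1}^{n} (i-1) = \frac{n(n-1)}{2} = \Theta(n^2)\ \text{multiplications},
\quad \text{versus } n \text{ multiplications and } n \text{ additions for Horner's rule.}
```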

How to do it: Horner's rule

y = a[n]
for (i = n-1; i >= 0; i--)
    y = y*x + a[i]

Horner complexity Lower bound: Ω(n) because we need to access each a[i] at least once. Upper bound: O(n). Closed problem. But what if A(x) = x^n? Horner is not optimal for x^n.

A(x) = x^n Recurrence: x^(2n) = x^n · x^n, x^(2n+1) = x · x^(2n)

y = 1;
while (n != 0) {
    if (odd(n)) y = y*x;
    x = x*x;
    n = n/2;
}

Complexity?

O(n log(n)) Often resulting from divide and conquer algorithms where split & combine are O(n) and we divide in nearly equal halves.

mergesort(A) {
    if size(A) <= 1 return A
    else return merge(mergesort(left half(A)),
                      mergesort(right half(A)))
}

Merge Sort - Split
{7,3,2,9,1,6,4,5}
{7,3,2,9} {1,6,4,5}
{7,3} {2,9} {1,6} {4,5}
{7} {3} {2} {9} {1} {6} {4} {5}
(Speaker note: give out cards, order suits hearts, diamonds, spades, clubs; ask them to sort this way; point out physical ease.)

Merge Sort - Merge
{1,2,3,4,5,6,7,9}
{2,3,7,9} {1,4,5,6}
{3,7} {2,9} {1,6} {4,5}
{7} {3} {2} {9} {1} {6} {4} {5}

Merge sort complexity Time: total cost of all splits? cost of each merge level in the tree? how many merge levels in the tree? Space: is extra space needed (outside the array A)? If so, how much?
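The questions resolve into the standard recurrence (my working): each merge level costs O(n), there are about log2 n levels, and merging needs a scratch array.

```latex
T(n) = 2\,T(n/2) + cn,\quad T(1) = c
\;\Rightarrow\; T(n) = cn\log_2 n + cn = O(n \log n).\\
\text{Space: an auxiliary array of } n \text{ elements suffices, so } O(n) \text{ extra space.}
```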

Series

Geometric Series Σ_{i=0}^{n} x^i = (x^{n+1} − 1)/(x − 1) for x ≠ 1; for 0 < x < 1 the infinite series converges: Σ_{i=0}^{∞} x^i = 1/(1 − x).

Harmonic series H_n = Σ_{i=1}^{n} 1/i = Θ(log n). why?

Products To bound a product, take logarithms: log(Π_{i=1}^{n} a_i) = Σ_{i=1}^{n} log a_i, e.g. log(n!) = Σ_{i=1}^{n} log i. why?

Using integrals to bound sums If f is a monotonically growing function, then: ∫_{m−1}^{n} f(x) dx ≤ Σ_{i=m}^{n} f(i) ≤ ∫_{m}^{n+1} f(x) dx

[Figure: the graph of a growing f over points x1 < x2 < x3 < x4 < x5 illustrates f(x1)+f(x2)+f(x3)+f(x4) ≤ f(x2)+f(x3)+f(x4)+f(x5): each left-endpoint sample is at most the next right-endpoint sample.]

some formula manipulation....

Example of use: bounding log(n!) = Σ_{i=1}^{n} log i.

using the integral bounds ∫_{1}^{n} ln x dx ≤ Σ_{i=1}^{n} ln i ≤ ∫_{1}^{n+1} ln x dx, and ∫ ln x dx = x ln x − x, so n ln n − n + 1 ≤ ln(n!) ≤ (n+1) ln(n+1) − n.

concluding... log(n!) = Θ(n log n)

Sorting lower bound We will use log(n!) = Θ(n log n) in proving a lower bound on sorting.