
1
**CSCE 2100: Computing Foundations 1 Running Time of Programs**

Tamara Schneider Summer 2013

2
**What is Efficiency?**

Time it takes to run a program? Other resources also matter:
- Storage space taken by variables
- Traffic generated on a computer network
- Amount of data moved to and from disk

3
**Summarizing Running Time**

- Benchmarking: use a small collection of typical inputs (benchmarks)
- Analysis: group inputs based on size
- Running time is also influenced by external factors, e.g. the computer and the compiler

4
**Running Time**

- worst-case running time: maximum running time over all inputs of size 𝑛
- average running time: average running time over all inputs of size 𝑛
- best-case running time: minimum running time over all inputs of size 𝑛
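As a concrete illustration of these three cases (a minimal sketch, not from the slides): linear search over 𝑛 elements makes 1 comparison in the best case (key at the front) and 𝑛 comparisons in the worst case (key absent).

```c
#include <stddef.h>

/* Linear search that also reports how many comparisons it made,
   so best-case and worst-case behaviour can be observed directly. */
int linear_search(const int *a, size_t n, int key, size_t *comparisons) {
    *comparisons = 0;
    for (size_t i = 0; i < n; i++) {
        (*comparisons)++;        /* one comparison per inspected element */
        if (a[i] == key)
            return (int)i;       /* found: return its index */
    }
    return -1;                   /* not found */
}
```

Searching for the first element costs 1 comparison; searching for a missing key costs n comparisons.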

5
**Worst, Best, and Average Case**

6
**Running Time of a Program**

𝑇(𝑛) is the running time of a program as a function of the input size 𝑛. 𝑇(𝑛) = 𝑐𝑛 indicates that the running time is linearly proportional to the size of the input, that is, linear time.

7
**Running Time of Simple Statements**

We assume that “primitive operations” take a single instruction:
- Arithmetic operations (+, %, *, -, ...)
- Logical operations (&&, ||, ...)
- Accessing operations (A[i], x->y, ...)
- Simple assignment
- Calls to library functions (scanf, printf, ...)

8-13
**Code Segment 1**

```c
sum = 0;                 // 1
for(i=0; i<n; i++)       // 1 + (n+1) + n = 2n+2
    sum++;               // 1, executed n times
```

Total: 1 + (2n+2) + n*1 = 3n + 3. Complexity?
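The count for Code Segment 1 can be checked mechanically. A small sketch (my own instrumentation, not from the slides) that tallies one unit per primitive operation:

```c
/* Count primitive operations for: sum = 0; for(i=0; i<n; i++) sum++; */
long count_segment1(long n) {
    long ops = 0, sum, i;
    sum = 0; ops++;                      /* assignment: 1          */
    i = 0;   ops++;                      /* loop initializer: 1    */
    while (1) {
        ops++;                           /* loop test: n+1 total   */
        if (!(i < n)) break;
        sum++; ops++;                    /* loop body: n total     */
        i++;   ops++;                    /* loop increment: n total */
    }
    return ops;                          /* 1 + (2n+2) + n = 3n+3  */
}
```

For n = 10 this returns 33, i.e. 3n + 3, matching the slide's count.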

14-19
**Code Segment 2**

```c
sum = 0;                 // 1
for(i=0; i<n; i++)       // 2n+2
    for(j=0; j<n; j++)   // 2n+2 per outer iteration
        sum++;           // 1, executed n*n times
```

Total: 1 + (2n+2) + (2n+2)*n + n*n*1. Complexity?
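The same instrumentation idea works for the nested loops (again my own sketch, not from the slides): one unit per assignment, test, and increment.

```c
/* Count primitive operations for the doubly nested Code Segment 2. */
long count_segment2(long n) {
    long ops = 0, sum = 0, i = 0, j;
    ops += 2;                    /* sum = 0 and i = 0                    */
    while (ops++, i < n) {       /* outer test: n+1 total                */
        j = 0; ops++;            /* inner initializer: n total           */
        while (ops++, j < n) {   /* inner test: n+1 per outer iteration  */
            sum++; ops++;        /* body: n*n total                      */
            j++;   ops++;        /* inner increment: n*n total           */
        }
        i++; ops++;              /* outer increment: n total             */
    }
    return ops;                  /* 1 + (2n+2) + (2n+2)*n + n*n = 3n^2 + 4n + 3 */
}
```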

20
**Code Segment 3**

```c
sum = 0;                 // 1
for(i=0; i<n; i++)       // 2n+2
    for(j=0; j<n*n; j++) // ?
        sum++;           // 1
```

Complexity?

21
**Code Segment 4**

```c
sum = 0;                 // 1
for(i=0; i<=n; i++)      // 2n+4
    for(j=0; j<i; j++)   // ?
        sum++;           // 1
```

Complexity? Trace of the inner loop:
i=0: (none)   i=1: j=0   i=2: j=0, j=1   i=3: j=0, j=1, j=2   …   i=n: j=0, j=1, …, j=n-1
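The trace suggests the body runs 0 + 1 + … + n = n(n+1)/2 times, which is what makes the hand count painful. A quick sketch (my own check, not from the slides) confirming the body count:

```c
/* Count how often sum++ executes in Code Segment 4. */
long body_count_segment4(long n) {
    long sum = 0, i, j;
    for (i = 0; i <= n; i++)
        for (j = 0; j < i; j++)
            sum++;               /* runs i times for each value of i */
    return sum;                  /* 0 + 1 + ... + n = n*(n+1)/2      */
}
```

Since the dominant term is n²/2, the segment is quadratic despite the triangular loop bounds.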

22
**How Do Running Times Compare?**

23
**Towards “Big Oh”**

[Figure: time t versus input size n, with the graphs of T(n) and c·f(n) crossing at n0]

T(n) describes the runtime of some program, e.g. T(n) = 2n² - 4n + 3. Compare it to c·f(n), e.g. 5n² with c = 5 and f(n) = n². We can observe that for input sizes n ≥ n0, the graph of c·f(n) has a higher time value than the graph of T(n). For n ≥ n0, c·f(n) is an upper bound on T(n), i.e. c·f(n) ≥ T(n).

24
**Big-Oh [1]**

It is too much work to use the exact number of machine instructions. Instead, hide the details:
- average number of compiler-generated machine instructions
- average number of instructions executed by a machine per second
Simplification: instead of 4m-1, write O(m). O(m)?!

25
**Big-Oh [2]**

Restrict the argument to integers 𝑛 ≥ 0, and assume 𝑇(𝑛) is nonnegative for all 𝑛.
Definition: 𝑇(𝑛) is 𝑂(𝑓(𝑛)) if ∃ an integer 𝑛0 and a constant 𝑐 > 0 such that ∀ 𝑛 ≥ 𝑛0: 𝑇(𝑛) ≤ 𝑐·𝑓(𝑛)
(∃ “there exists”, ∀ “for all”)

26
**Big-Oh - Example [1]**

Definition: T(n) is O(f(n)) if ∃ an integer n0 and a constant c > 0: ∀ n ≥ n0, T(n) ≤ c·f(n)
Example 1: T(0) = 1, T(1) = 4, T(2) = 9; in general, T(n) = (n+1)²
Is T(n) also O(n²)?

27
**Big-Oh - Example [2]**

Definition: T(n) is O(f(n)) if ∃ an integer n0 and a constant c > 0: ∀ n ≥ n0, T(n) ≤ c·f(n)
T(n) = (n+1)². We want to show that T(n) is O(n²); in other words, f(n) = n². If this is true, there exist an integer n0 and a constant c > 0 such that for all n ≥ n0: T(n) ≤ c·n²

28
**Big-Oh - Example [3]**

Definition: T(n) is O(f(n)) if ∃ an integer n0 and a constant c > 0: ∀ n ≥ n0, T(n) ≤ c·f(n)
T(n) ≤ c·n² ⇔ (n+1)² ≤ c·n²
Choose c = 4, n0 = 1. Show that (n+1)² ≤ 4n² for n ≥ 1:
(n+1)² = n² + 2n + 1 ≤ n² + 2n² + 1 = 3n² + 1 ≤ 3n² + n² = 4n² = c·n²
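The chosen witnesses can also be checked numerically on a finite range (a sanity check only, not a proof; the function name and the range cap are my own):

```c
/* Check whether T(n) = (n+1)^2 <= c*n^2 holds for all n0 <= n <= n_max. */
int witness_holds(long c, long n0, long n_max) {
    for (long n = n0; n <= n_max; n++) {
        long T = (n + 1) * (n + 1);
        if (T > c * n * n)
            return 0;            /* bound violated at this n */
    }
    return 1;                    /* bound held on the whole range */
}
```

With c = 4, n0 = 1 the bound holds, as does the tighter choice c = 2, n0 = 3 from the next slide; c = 1 fails, since (n+1)² > n² for every n.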

29
**Big-Oh - Example [Alt 3]**

Definition: T(n) is O(f(n)) if ∃ an integer n0 and a constant c > 0: ∀ n ≥ n0, T(n) ≤ c·f(n)
T(n) ≤ c·n² ⇔ (n+1)² ≤ c·n²
Choose c = 2, n0 = 3. Show that (n+1)² ≤ 2n² for n ≥ 3:
(n+1)² = n² + 2n + 1 ≤ n² + n² = 2n² = c·n², since for all n ≥ 3: 2n + 1 ≤ n²

30
**Simplification Rules for Big-Oh**

- Constant factors can be omitted: O(54n²) = O(n²)
- Lower-order terms can be omitted: O(n⁴ + n²) = O(n⁴), and O(n²) + O(1) = O(n²)
Note that the highest-order term should never be negative. Lower-order terms can be negative; negative terms can be omitted since they do not increase the runtime.

31
**Transitivity [1]**

What is transitivity? A relation ☺ is transitive if A☺B and B☺C imply A☺C.
Example: “<” is transitive: a < b and b < c imply a < c, e.g. 2 < 4 and 4 < 7, then 2 < 7.
Is Big-Oh transitive?

32
**Transitivity [2]**

If f(n) is O(g(n)) and g(n) is O(h(n)), then f(n) is O(h(n)).
f(n) is O(g(n)): ∃ n1, c1 such that f(n) ≤ c1·g(n) ∀ n ≥ n1
g(n) is O(h(n)): ∃ n2, c2 such that g(n) ≤ c2·h(n) ∀ n ≥ n2
Choose n0 = max{n1, n2} and c = c1·c2. Then for all n ≥ n0:
f(n) ≤ c1·g(n) ≤ c1·c2·h(n) ⇒ f(n) is O(h(n))

33
**Tightness**

Use constant factor “1”. Use the tightest upper bound that we can prove.
3n is O(n2) and O(n) and O(2n). Which one should we use?

34
**Summation Rule [1]**

Consider a program that contains 2 parts:
- Part 1 takes T1(n) time and is O(f1(n))
- Part 2 takes T2(n) time and is O(f2(n))
We also know that f2 grows no faster than f1, i.e. f2(n) is O(f1(n)).
What is the running time of the entire program? T1(n) + T2(n) is O(f1(n) + f2(n)). But can we simplify this?

35
**Summation Rule [2]**

T1(n) + T2(n) is O(f1(n)), since f2 grows no faster than f1.
Proof:
T1(n) ≤ c1·f1(n) for n ≥ n1
T2(n) ≤ c2·f2(n) for n ≥ n2
f2(n) ≤ c3·f1(n) for n ≥ n3
Choose n0 = max{n1, n2, n3}. Then for n ≥ n0:
T1(n) + T2(n) ≤ c1·f1(n) + c2·f2(n) ≤ c1·f1(n) + c2·c3·f1(n) = (c1 + c2·c3)·f1(n) = c·f1(n) with c = c1 + c2·c3
⇒ T1(n) + T2(n) is O(f1(n))

36
**Summation Rule - Example**

```c
// make A an identity matrix
scanf("%d", &d);             // O(1)
for(i=0; i<n; i++) {         // O(n) iterations
    for(j=0; j<n; j++)       //   O(n) per iteration
        A[i][j] = 0;         //     O(1): O(n^2) total for the nested loops
    A[i][i] = 1;             //   O(1): O(n) total over the outer loop
}
```

O(1) + O(n²) + O(n) = O(n²)
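A self-contained, runnable version of the identity-matrix example above (replacing the scanf input with a fixed size N = 4, my own addition, so it can be tested directly):

```c
#define N 4

/* Build an N x N identity matrix.
   Zeroing is O(n^2), the diagonal pass is O(n) total: O(n^2) overall. */
void make_identity(int A[N][N]) {
    for (int i = 0; i < N; i++) {
        for (int j = 0; j < N; j++)
            A[i][j] = 0;         /* clear row i */
        A[i][i] = 1;             /* set the diagonal entry */
    }
}
```

By the summation rule, the O(n) diagonal work is absorbed into the O(n²) zeroing.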

37
**Summary of Rules & Concepts [1]**

- Worst-case, average-case, and best-case running times are compared for a fixed input size n, not for varying n!
- Counting instructions: assume 1 instruction for assignments, simple calculations, comparisons, etc.
- Definition of Big-Oh: T(n) is O(f(n)) if ∃ an integer n0 and a constant c > 0: ∀ n ≥ n0, T(n) ≤ c·f(n)

38
**Summary of Rules & Concepts [2]**

- Rule 1: Constant factors can be omitted. Example: O(3n⁵) = O(n⁵)
- Rule 2: Low-order terms can be omitted. Example: O(3n⁵ + 10n⁴ - 4n³ + n + 1) = O(3n⁵)
- We can combine Rule 1 and Rule 2. Example: O(3n⁵ + 10n⁴ - 4n³ + n + 1) = O(n⁵)

39
**Summary of Rules & Concepts [3]**

- For O(f(n) + g(n)), we can neglect the function with the slower growth rate. Example: O(n + n log n) = O(n log n)
- Transitivity: if f(n) is O(g(n)) and g(n) is O(h(n)), then f(n) is O(h(n)). Example: f(n) = 3n, g(n) = n², h(n) = n⁶; 3n is O(n²) and n² is O(n⁶), so 3n is O(n⁶)
- Tightness: we try to find an upper bound Big-Oh that is as small as possible. Example: n² is O(n⁶), but O(n²) is a much tighter (and better) bound.

40
**Solutions to Instruction Counts on Code Segments**

|                | Instructions | Big-Oh |
|----------------|--------------|--------|
| Code Segment 1 | 3n + 3       | O(n)   |
| Code Segment 2 | 3n² + 4n + 3 | O(n²)  |
| Code Segment 3 | 3n³ + 4n + 3 | O(n³)  |
| Code Segment 4 | Argh!        |        |
