CSE 326: Data Structures Lecture #2 Analysis of Algorithms I (And A Little More About Linked Lists) Henry Kautz Winter 2002.


1 CSE 326: Data Structures Lecture #2 Analysis of Algorithms I (And A Little More About Linked Lists)
Henry Kautz Winter 2002

2 Assignment #1 Goals of this assignment:
Introduce the ADTs (abstract data types) for lists and sparse vectors, motivated by an application to information retrieval.
Show the connection between the empirical runtime scaling of an algorithm and its formal asymptotic complexity.
Gain experience with the Unix tools g++, make, gnuplot, csh, and awk.
Learn how to use templates in C++. We will use the new g++ version 3.0 compiler -- it does templates right!

3 Implementing Linked Lists in C
(optional header) (a b c)

struct node {
    Object element;
    struct node * next;
};

Everything else is a pointer to a node!

typedef struct node * List;
typedef struct node * Position;
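The declarations above can be exercised directly. Here is a minimal sketch that builds the list (a b c) and walks it; the element type `Object` is assumed to be `char` for illustration, and `makeABC`/`toString` are helper names introduced here, not from the slides.

```cpp
#include <cstddef>
#include <string>

// Hypothetical stand-in for the slide's generic Object type.
typedef char Object;

struct node {
    Object element;
    node * next;
};
typedef node * List;      // a List is just a pointer to the first node
typedef node * Position;  // so is a position within a list

// Build the list (a b c) by inserting at the front in reverse order.
List makeABC() {
    List L = NULL;
    const char elems[] = {'c', 'b', 'a'};
    for (int i = 0; i < 3; ++i) {
        node * n = new node;
        n->element = elems[i];
        n->next = L;
        L = n;
    }
    return L;
}

// Collect the elements into a string for easy checking.
std::string toString(List L) {
    std::string s;
    for (Position p = L; p != NULL; p = p->next)
        s += p->element;
    return s;
}
```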

4 Implementing in C++ (a b c) (optional header)
Create separate classes for:
Node
List (contains a pointer to the first node)
ListIterator (specifies a position in a list; basically, just a pointer to a node)
Pro: syntactically distinguishes uses of node pointers
Con: a lot of verbiage! Also, is a position in a list really distinct from a list?
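A minimal sketch of this three-class design, with Node hidden inside List; the class and method names here are illustrative assumptions, not the assignment's required interface.

```cpp
template <class T>
class List {
    struct Node {                 // Node is private to List
        T element;
        Node * next;
        Node(const T & e, Node * n) : element(e), next(n) {}
    };
    Node * first;
public:
    // The iterator is what distinguishes "a position in a list"
    // from the list itself.
    class Iterator {
        Node * current;
    public:
        explicit Iterator(Node * p) : current(p) {}
        bool valid() const { return current != 0; }
        void advance() { current = current->next; }
        const T & item() const { return current->element; }
    };

    List() : first(0) {}
    void pushFront(const T & e) { first = new Node(e, first); }
    Iterator begin() const { return Iterator(first); }
};
```

The pro and con show up immediately: client code can never confuse a `List<T>` with a `List<T>::Iterator`, but even this toy version takes a screenful of declarations.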

5 Structure Sharing
L = (a b c)
M = (b c)
Important technique for conserving memory usage in large lists with repeated structure. Used in many recursive algorithms on lists.
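The sharing is literal pointer sharing: M is not a copy of L's tail, it is L's tail. A minimal sketch (element type `char` assumed for illustration):

```cpp
#include <cstddef>

struct node {
    char element;
    node * next;
};

// Build c <- b <- a once; both lists point into the same cells.
node c = {'c', NULL};
node b = {'b', &c};
node a = {'a', &b};
node * L = &a;   // L = (a b c)
node * M = &b;   // M = (b c) -- no new cells allocated
```

Because the cells are shared, any destructive change to M's cells is also a change to L; that is the price of the memory savings.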

6 Implementing Linked Lists Using Arrays
Cells 1-10, with parallel arrays:
Data: F O A R N R T
Next: 3 8 6 4 -1 10 5
First = 2
"Cursor implementation" (Ch 3.2.8). Can use the same array to manage a second list of unused cells.
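A sketch of the cursor idea, where "pointers" are array indices, index 0 plays the role of NULL, and the same Next array threads a free list of unused cells. The capacity and helper names here are illustrative assumptions, not the book's exact interface.

```cpp
#include <string>

const int CAPACITY = 11;   // cells 1..10; cell index 0 means "null"
char Data[CAPACITY];
int  Next[CAPACITY];
int  freeList;             // head of the chain of unused cells

void initCells() {
    // Initially every cell 1..10 is on the free list.
    for (int i = 1; i < CAPACITY - 1; ++i) Next[i] = i + 1;
    Next[CAPACITY - 1] = 0;
    freeList = 1;
}

int allocCell() {                     // take a cell off the free list
    int c = freeList;
    freeList = Next[c];
    return c;
}

int pushFront(int first, char e) {    // returns the new head index
    int c = allocCell();
    Data[c] = e;
    Next[c] = first;
    return c;
}

std::string toString(int first) {
    std::string s;
    for (int p = first; p != 0; p = Next[p]) s += Data[p];
    return s;
}
```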

7 Problem? List ADT as Polynomial ADT
A_i is the coefficient of the x^(i-1) term:
5 + 2x + 3x^2 is ( 5 2 3 )
7 + 8x is ( 7 8 )
3 + 2x^2 is ( 3 0 2 )
Here's an application of the list abstract data type as a _data structure_ for another abstract data type. Is there a problem here? Why?

8 What about 4 + 3x^2001? What is it about lists that makes this a problem here and not in stacks and queues? (Answer: kth(int)!) Is there a solution? Won't zeroes overwhelm this data structure?

9 Sparse List Data Structure: 4 + 3x^2001
(<4 0> <2001 3>)
This slide is made possible in part by the sparse list data structure. Now, two questions:
1) Is a sparse list really a data structure or an abstract data type? (Answer: it depends, but I lean toward data structure. YOUR ANSWER MUST HAVE JUSTIFICATION!)
2) Which list data structure should we use to implement it: linked lists or arrays?
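The key idea is one node per nonzero term, each holding a coefficient and an exponent. A sketch, using a vector of pairs purely for brevity (the slides would use a linked list, and the (coefficient, exponent) field order here is an assumption):

```cpp
#include <vector>
#include <utility>

typedef std::pair<int,int> Term;       // (exponent, coefficient)
typedef std::vector<Term> SparsePoly;  // kept sorted by exponent

// 4 + 3x^2001 needs only two terms instead of 2002 array slots.
SparsePoly example() {
    SparsePoly p;
    p.push_back(Term(0, 4));       // 4 = 4x^0
    p.push_back(Term(2001, 3));    // 3x^2001
    return p;
}
```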

10 Addition of Two Polynomials?
p = 15 + 10x^50 + 3x^1200
q = 5 + 30x^50 + 4x^100

11 Addition of Two Polynomials
Similar to merging two sorted lists -- one pass!
p = 15 + 10x^50 + 3x^1200
q = 5 + 30x^50 + 4x^100
r = 20 + 40x^50 + 4x^100 + 3x^1200

12 To ADT or NOT to ADT? Issue: when to bypass / expand the List ADT?
Using general list operations:

reverse(x) {
    y = new list;
    while (! x.empty()) {
        /* remove 1st element from x, insert in y */
        y.insert_after_kth(x.kth(1), 0);
        x.delete_kth(1);
    }
    return y;
}

Disadvantages?

13 Destructive Linked-List Version
(a b c) becomes (c b a) in place.

reverse(node * x) {
    last = NULL;
    while (x != NULL) {
        tmp = x->next;
        x->next = last;
        last = x;
        x = tmp;
    }
    return last;
}

Faster in practice? Asymptotically faster?
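A runnable version of the destructive reversal. Note the loop must run until x itself is NULL and the function must return last (the old tail), otherwise the final cell is never re-linked; the element type `char` is assumed for illustration.

```cpp
#include <cstddef>

struct node {
    char element;
    node * next;
};

// Destructive, in-place reversal: one pass, O(1) extra space.
// Re-links the existing cells instead of allocating new ones.
node * reverse(node * x) {
    node * last = NULL;
    while (x != NULL) {
        node * tmp = x->next;
        x->next = last;      // flip this cell's pointer backwards
        last = x;
        x = tmp;
    }
    return last;             // the old tail is the new head
}
```

Both this and the ADT version are one pass over the list, so neither is asymptotically faster; this one avoids allocation, so it is faster in practice.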

14 Analysis of Algorithms
Analysis of an algorithm gives insight into how long the program runs and how much memory it uses:
time complexity
space complexity
Why useful?
Input size is indicated by a number n (sometimes there are multiple inputs, e.g. m and n).
Running time is a function of n, e.g. n, n^2, n log n, n log(n^2) + 5n^3.

15 Simplifying the Analysis
Eliminate low-order terms:
4n + 5 -> 4n
0.5 n log n - 2n -> n log n
2^n + n^3 + 3n -> 2^n
Eliminate constant coefficients:
4n -> n
0.5 n log n -> n log n
log(n^2) = 2 log n -> log n
log_3 n = (log_3 2) log n -> log n
We didn't get very precise in our analysis of the UWID info finder; why? We didn't know the machine we'd use. Is this always true? Do you buy that coefficients and low-order terms don't matter? When might they matter? (Linked list memory usage.)

16 Order Notation
BIG-O: T(n) = O(f(n))
Upper bound: there exist constants c and n0 such that T(n) <= c f(n) for all n >= n0
OMEGA: T(n) = Omega(f(n))
Lower bound: T(n) >= c f(n) for all n >= n0
THETA: T(n) = Theta(f(n))
Tight bound: T(n) = Theta(f(n)) when T(n) = O(f(n)) and T(n) = Omega(f(n))
We'll use some specific terminology to describe asymptotic behavior. There are some analogies here that you might find useful.

17 Examples
n^2 + 100n = O(n^2) because n^2 + 100n <= 2n^2 for n >= 100
n^2 + 100n = Omega(n^2) because n^2 + 100n >= 1 * n^2 for n >= 0
Therefore: n^2 + 100n = Theta(n^2)
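The witnesses c = 2, n0 = 100 for the upper bound (and c = 1, n0 = 0 for the lower) can be checked numerically; the two helper names below are introduced just for this check.

```cpp
// n^2 + 100n <= 2n^2 exactly when 100n <= n^2, i.e. when n >= 100.
bool upperBoundHolds(long n) { return n*n + 100*n <= 2*n*n; }

// n^2 + 100n >= 1 * n^2 holds for every n >= 0.
bool lowerBoundHolds(long n) { return n*n + 100*n >= n*n; }
```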

18 Notation Gotcha
Order notation is not symmetric; write
2n^2 + 4n = O(n^2)
but never
O(n^2) = 2n^2 + 4n
The right-hand side is a crudification of the left.
Likewise, never write O(n^2) = O(n^3) or Omega(n^3) = Omega(n^2).

19 Mini-Quiz
5n log n = O(n^2)?
5n log n = Omega(n^2)?
5n log n = O(n)?
5n log n = Theta(n log n)?

20 Silicon Downs
Post #1: n^3 + 2n^2, n^0.1, n + 100n^0.1, 5n^5, (n^-15)(2^n)/100, 8^(2 log n)
Post #2: 100n^2 + 1000, log n, 2n + 10 log n, n!, 1000n^15, 3n^7 + 7n
Welcome, everyone, to the Silicon Downs. I'm getting race results as we stand here. Let's start with the first race. I'll have the first row bet on race #1. Raise your hand if you bet on function #1 (the jockey is n^0.1). And so on. Show the race slides after each race.

21 Race I n^3 + 2n^2 vs. 100n^2 + 1000

22 Race II n^0.1 vs. log n
Well, log n looked good out of the starting gate and indeed kept looking good until about n^17, at which point n^0.1 passed it up forever. Moral of the story? n^epsilon beats log n for any epsilon > 0. BUT, which one of these is really better?

23 Race III n + 100n^0.1 vs. 2n + 10 log n
Notice that these just look like n and 2n once we get way out. That's because the larger terms dominate. So the left is less, but not asymptotically less. It's a TIE!

24 Race IV 5n^5 vs. n!
n! is BIG!!!

25 Race V (n^-15)(2^n)/100 vs. 1000n^15
No matter how you put it, any exponential beats any polynomial. It doesn't even take that long here (input size ~250).

26 Race VI 8^(2 log n) vs. 3n^7 + 7n
We can reduce the left-hand term to n^6, so they're both polynomial and it's an open-and-shut case.

27 The Losers Win
Better algorithm!
Post #1: n^3 + 2n^2, n^0.1, n + 100n^0.1, 5n^5, (n^-15)(2^n)/100, 8^(2 log n)
Post #2: 100n^2 + 1000, log n, 2n + 10 log n, n!, 1000n^15, 3n^7 + 7n
Winners: O(n^2), O(log n), TIE O(n), O(n^5), O(n^15), O(n^6)

28 Common Names
constant: O(1)
logarithmic: O(log n)
linear: O(n)
log-linear: O(n log n)
superlinear: O(n^(1+c)) (c is a constant > 0)
quadratic: O(n^2)
polynomial: O(n^k) (k is a constant)
exponential: O(c^n) (c is a constant > 1)
Well, it turns out that the old Silicon Downs is fixed. They dope up the horses to make the first few laps interesting, but we can always find out who wins. Here's a chart comparing some of the functions. Notice that any exponential beats any polynomial. Any superlinear beats any poly-log-linear. Also keep in mind (though I won't show it) that sometimes the input has more than one parameter, like when you take in two strings. In that case you need to be very careful about what is constant and what can be ignored. O(log m + 2^n) is not necessarily O(2^n).

29 Analyzing Code
C++ operations: constant time
consecutive stmts: sum of times
conditionals: condition plus sum of branches
loops: sum of iterations
function calls: cost of function body
recursive functions: solve a recurrence equation
Alright, I want to do some code examples, but first: how will we examine code? These are just rules of thumb, not the way you always do it. Sometimes, for example, it's important to keep track of which branch of a conditional is followed when (like when analyzing recursive functions). Above all, use your head!

30 Conditionals
if C then S1 else S2
time <= time(C) + MAX(time(S1), time(S2))
OK, so this isn't exactly an example; it just reiterates the rule: time <= time of C plus the max of S1 and S2, which is <= time of C plus S1 plus S2.
Loops: time <= sum of the times of the iterations, often #iterations * time of S (or the worst-case time of S).

31 Nested Loops

for i = 1 to n do
    for j = 1 to n do
        sum = sum + 1

This example is pretty straightforward: each loop goes n times, with a constant amount of work on the inside. n * n * 1 = O(n^2).
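One way to convince yourself of the count is to tally executions of the inner statement directly; `countNested` is a helper name introduced for this check.

```cpp
// The body runs exactly n * n times, so the loop nest is O(n^2).
long countNested(int n) {
    long sum = 0;
    for (int i = 1; i <= n; ++i)
        for (int j = 1; j <= n; ++j)
            sum = sum + 1;
    return sum;
}
```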

32 Nested Loops for i = 1 to n do for j = 1 to n do sum = sum + 1
I spent 5 minutes looking at my notes trying to figure out why this 1 was an i when it wasn’t. So, I’m putting variables in blue italics. This example is pretty straightforward. Each loop goes N times, constant amount of work on the inside. N*N*1 = O(N^2)

33 Nested Dependent Loops

for i = 1 to n do
    for j = i to n do
        sum = sum + 1

There's a little twist here: j goes from i to n, not from 1 to n. So let's do the sums. The inside is constant. The inner loop is the sum from j = i to n of 1, which equals n - i + 1. The outer loop is then the sum from i = 1 to n of (n - i + 1), which is the same as the sum from 1 to n of i, i.e. n(n+1)/2, or O(n^2).
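The n(n+1)/2 count can be checked the same way as the independent nest; `countDependent` is a helper name introduced for this check.

```cpp
// j starts at i, so the body runs n + (n-1) + ... + 1 = n(n+1)/2
// times -- still O(n^2), just with a smaller constant.
long countDependent(int n) {
    long sum = 0;
    for (int i = 1; i <= n; ++i)
        for (int j = i; j <= n; ++j)
            sum = sum + 1;
    return sum;
}
```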


36 To Do Finish reading Ch 1 and 2 Start reading Ch 3
Get started on assignment #1!

