L1.1 Introduction to Algorithms 6.046J/18.401J
LECTURE 1: Analysis of Algorithms
Insertion sort
Merge sort
Prof. Charles E. Leiserson

L1.2 Course information
1. Staff
2. Prerequisites
3. Lectures
4. Recitations
5. Handouts
6. Textbook (CLRS)
7. Extra help
8. Registration
9. Problem sets
10. Describing algorithms
11. Grading policy
12. Collaboration policy
Course information handout
September 8, 2004 · Introduction to Algorithms · © 2001–4 by Charles E. Leiserson

L1.3 Analysis of algorithms
The theoretical study of computer-program performance and resource usage.
What's more important than performance? Modularity, correctness, maintainability, functionality, robustness, user-friendliness, programmer time, simplicity, extensibility, reliability.

L1.4 Why study algorithms and performance?
• Algorithms help us to understand scalability.
• Performance often draws the line between what is feasible and what is impossible.
• Algorithmic mathematics provides a language for talking about program behavior.
• Performance is the currency of computing.
• The lessons of program performance generalize to other computing resources.
• Speed is fun!

L1.5 The problem of sorting
Input: a sequence ⟨a₁, a₂, …, aₙ⟩ of numbers.
Output: a permutation ⟨a′₁, a′₂, …, a′ₙ⟩ such that a′₁ ≤ a′₂ ≤ ⋯ ≤ a′ₙ.
Example: Input: 8 2 4 9 3 6 → Output: 2 3 4 6 8 9.

L1.6 Insertion sort
Pseudocode (CLRS style, 1-indexed):

INSERTION-SORT(A, n)        ⊳ sorts A[1 .. n]
  for j ← 2 to n
    do key ← A[j]
       ⊳ Insert A[j] into the sorted list A[1 .. j−1].
       i ← j − 1
       while i > 0 and A[i] > key
         do A[i+1] ← A[i]
            i ← i − 1
       A[i+1] ← key
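The pseudocode above can be rendered directly in Python (0-indexed and sorting in place; a sketch, not the course's reference implementation):

```python
def insertion_sort(a):
    """Sort list a in place by inserting each element into the sorted prefix."""
    for j in range(1, len(a)):
        key = a[j]
        # Shift elements of the sorted prefix a[0..j-1] that exceed key one slot right.
        i = j - 1
        while i >= 0 and a[i] > key:
            a[i + 1] = a[i]
            i -= 1
        a[i + 1] = key
    return a
```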

L1.8–1.18 Example of insertion sort (step-by-step animation across eleven frames; the final frame is marked "done").

L1.19 Running time
• The running time depends on the input: an already sorted sequence is easier to sort.
• Parameterize the running time by the size of the input, since short sequences are easier to sort than long ones.
• Generally, we seek upper bounds on the running time, because everybody likes a guarantee.

L1.20 Kinds of analyses
Worst-case: (usually) T(n) = maximum time of the algorithm on any input of size n.
Average-case: (sometimes) T(n) = expected time of the algorithm over all inputs of size n. Needs an assumption about the statistical distribution of inputs.
Best-case: (bogus) Cheat with a slow algorithm that works fast on some input.

L1.21 Machine-independent time
What is insertion sort's worst-case time? It depends on the speed of our computer: relative speed (on the same machine), absolute speed (on different machines).
BIG IDEA: Asymptotic analysis. Ignore machine-dependent constants; look at the growth of T(n) as n → ∞.

L1.22 Θ-notation
Math: Θ(g(n)) = { f(n) : there exist positive constants c₁, c₂, and n₀ such that 0 ≤ c₁g(n) ≤ f(n) ≤ c₂g(n) for all n ≥ n₀ }.
Engineering: drop low-order terms; ignore leading constants.
Example: 3n³ + 90n² − 5n + 6046 = Θ(n³).
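As a worked instance of the definition, applied to the slide's polynomial example 3n³ + 90n² − 5n + 6046 = Θ(n³) (the constants below are one valid choice, not the only one):

```latex
% Take c_1 = 3, \; c_2 = 6139, \; n_0 = 1. Then for all n \ge 1:
\begin{aligned}
3n^3 &\le 3n^3 + \underbrace{90n^2 - 5n + 6046}_{\ge\, 0 \text{ for } n \ge 1}, \\
3n^3 + 90n^2 - 5n + 6046 &\le 3n^3 + 90n^3 + 6046n^3 = 6139\,n^3,
\end{aligned}
```

using 90n² ≤ 90n³, −5n ≤ 0, and 6046 ≤ 6046n³ whenever n ≥ 1.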

L1.23 Asymptotic performance
When n gets large enough, a Θ(n²) algorithm always beats a Θ(n³) algorithm. We shouldn't ignore asymptotically slower algorithms, however: real-world design situations often call for a careful balancing of engineering objectives. Asymptotic analysis is a useful tool to help structure our thinking.

L1.24 Insertion sort analysis
Worst case: input reverse sorted. T(n) = Σ_{j=2..n} Θ(j) = Θ(n²) [arithmetic series].
Average case: all permutations equally likely. T(n) = Σ_{j=2..n} Θ(j/2) = Θ(n²).
Is insertion sort a fast sorting algorithm? Moderately so, for small n. Not at all, for large n.
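The bracketed arithmetic-series step, written out:

```latex
T(n) \;=\; \sum_{j=2}^{n} \Theta(j)
     \;=\; \Theta\!\left( \sum_{j=2}^{n} j \right)
     \;=\; \Theta\!\left( \frac{n(n+1)}{2} - 1 \right)
     \;=\; \Theta(n^2).
```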

L1.25 Merge sort
MERGE-SORT A[1 .. n]
1. If n = 1, done.
2. Recursively sort A[1 .. ⌈n/2⌉] and A[⌈n/2⌉+1 .. n].
3. "Merge" the two sorted lists.
Key subroutine: MERGE
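The three steps above can be sketched in Python (an illustrative rendering, not the course's reference code; it returns a new sorted list rather than sorting in place):

```python
def merge_sort(a):
    """Return a sorted copy of a by divide-and-conquer."""
    if len(a) <= 1:                       # step 1: n = 1 is already sorted
        return a[:]
    mid = (len(a) + 1) // 2               # ceil(n/2), matching the slide's split
    left = merge_sort(a[:mid])            # step 2: sort the two halves recursively
    right = merge_sort(a[mid:])
    # Step 3: merge the two sorted lists with the two-finger method.
    out, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            out.append(left[i]); i += 1
        else:
            out.append(right[j]); j += 1
    out.extend(left[i:])                  # at most one tail is non-empty
    out.extend(right[j:])
    return out
```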

L1.26–1.38 Merging two sorted arrays (step-by-step animation across thirteen frames). Time = Θ(n) to merge a total of n elements (linear time).
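The two-finger merge behind these frames can be sketched with an explicit comparison counter, which makes the linear bound concrete: each comparison permanently retires one element into the output, so merging n total elements costs at most n − 1 comparisons. Function and variable names here are mine, not the lecture's:

```python
def merge(xs, ys):
    """Merge two sorted lists; return (merged list, number of comparisons)."""
    out, i, j, comparisons = [], 0, 0, 0
    while i < len(xs) and j < len(ys):
        comparisons += 1                  # one comparison retires one element
        if xs[i] <= ys[j]:
            out.append(xs[i]); i += 1
        else:
            out.append(ys[j]); j += 1
    # Once one list is exhausted, the other's tail needs no comparisons.
    out.extend(xs[i:])
    out.extend(ys[j:])
    return out, comparisons
```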

L1.39 Analyzing merge sort
MERGE-SORT A[1 .. n]                                T(n)
1. If n = 1, done.                                  Θ(1)
2. Recursively sort A[1 .. ⌈n/2⌉] and A[⌈n/2⌉+1 .. n].   2T(n/2)
3. "Merge" the two sorted lists.                    Θ(n)
Sloppiness: should be T(⌈n/2⌉) + T(⌊n/2⌋), but it turns out not to matter asymptotically.

L1.40 Recurrence for merge sort
T(n) = Θ(1) if n = 1; T(n) = 2T(n/2) + Θ(n) if n > 1.
We shall usually omit stating the base case when T(n) = Θ(1) for sufficiently small n, but only when it has no effect on the asymptotic solution to the recurrence.
CLRS and Lecture 2 provide several ways to find a good upper bound on T(n).

L1.41–1.51 Recursion tree: solve T(n) = 2T(n/2) + cn, where c > 0 is constant (the tree is expanded level by level across these frames, starting from the root T(n)).
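Totaling the tree these frames build, level by level (standard bookkeeping, consistent with the conclusion on the next slide):

```latex
\begin{aligned}
\text{height} &= \lg n
  && \text{(the subproblem size halves at each level)}, \\
\text{cost per level} &= 2^k \cdot c\,\frac{n}{2^k} = cn
  && \text{(at level } k \text{: } 2^k \text{ nodes of cost } cn/2^k\text{)}, \\
T(n) &= cn \lg n + \Theta(n) = \Theta(n \lg n)
  && \text{(the } \Theta(n) \text{ term covers the leaves)}.
\end{aligned}
```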

L1.52 Conclusions
• Θ(n lg n) grows more slowly than Θ(n²).
• Therefore, merge sort asymptotically beats insertion sort in the worst case.
• In practice, merge sort beats insertion sort for n > 30 or so.
• Go test it out for yourself!
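One machine-independent way to take up that invitation is to count comparisons rather than wall-clock seconds, in the spirit of the asymptotic analysis above. The functions below are illustrative sketches (names are mine, not the course's):

```python
def insertion_sort_ops(a):
    """Return the number of key comparisons insertion sort makes on a copy of a."""
    a, ops = a[:], 0
    for j in range(1, len(a)):
        key, i = a[j], j - 1
        while i >= 0:
            ops += 1                      # compare a[i] against key
            if a[i] > key:
                a[i + 1] = a[i]
                i -= 1
            else:
                break
        a[i + 1] = key
    return ops

def merge_sort_ops(a):
    """Return (sorted copy of a, number of comparisons merge sort makes)."""
    if len(a) <= 1:
        return a[:], 0
    mid = len(a) // 2
    left, lops = merge_sort_ops(a[:mid])
    right, rops = merge_sort_ops(a[mid:])
    out, i, j, ops = [], 0, 0, lops + rops
    while i < len(left) and j < len(right):
        ops += 1
        if left[i] <= right[j]:
            out.append(left[i]); i += 1
        else:
            out.append(right[j]); j += 1
    out.extend(left[i:]); out.extend(right[j:])
    return out, ops
```

On reverse-sorted input of size n, insertion sort makes exactly n(n−1)/2 comparisons (the worst case), while merge sort stays near n lg n.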
