
Introduction to Algorithms 6.046J/18.401J
LECTURE 1: Analysis of Algorithms — Insertion sort, Asymptotic analysis, Merge sort, Recurrences
Copyright © Erik D. Demaine and Charles E. Leiserson
Prof. Charles E. Leiserson

Course information
1. Staff  2. Distance learning  3. Prerequisites  4. Lectures  5. Recitations  6. Handouts  7. Textbook  8. Course website  9. Extra help  10. Registration  11. Problem sets  12. Describing algorithms  13. Grading policy  14. Collaboration policy
September 7, 2005

Analysis of algorithms
The theoretical study of computer-program performance and resource usage.
What's more important than performance? Modularity, correctness, maintainability, functionality, robustness, user-friendliness, programmer time, simplicity, extensibility, reliability.

Why study algorithms and performance?
Algorithms help us to understand scalability. Performance often draws the line between what is feasible and what is impossible. Algorithmic mathematics provides a language for talking about program behavior. Performance is the currency of computing. The lessons of program performance generalize to other computing resources. Speed is fun!

The problem of sorting
Input: sequence ⟨a_1, a_2, …, a_n⟩ of numbers.
Output: permutation ⟨a′_1, a′_2, …, a′_n⟩ such that a′_1 ≤ a′_2 ≤ … ≤ a′_n.
Example: [the slide's sample input and output arrays are omitted from this capture.]

INSERTION-SORT (pseudocode)

INSERTION-SORT(A, n)    ⊳ A[1..n]
  for j ← 2 to n
      do key ← A[j]
         i ← j − 1
         while i > 0 and A[i] > key
             do A[i+1] ← A[i]
                i ← i − 1
         A[i+1] ← key
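The pseudocode above translates almost line for line into Python. This is an illustrative sketch; the function name and the 0-based indexing (instead of the slides' 1-based A[1..n]) are ours:

```python
def insertion_sort(a):
    """In-place insertion sort, mirroring the INSERTION-SORT pseudocode."""
    for j in range(1, len(a)):
        key = a[j]
        i = j - 1
        # Shift sorted elements larger than key one slot to the right.
        while i >= 0 and a[i] > key:
            a[i + 1] = a[i]
            i -= 1
        a[i + 1] = key
    return a
```

For example, insertion_sort([8, 2, 4, 9, 3, 6]) returns [2, 3, 4, 6, 8, 9].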


Example of insertion sort
[A sequence of slides animates insertion sort on a sample array: each step takes the next key, shifts the larger sorted elements to the right, and inserts the key, until the array is sorted. Figures omitted.]

Running time
The running time depends on the input: an already sorted sequence is easier to sort. Parameterize the running time by the size of the input, since short sequences are easier to sort than long ones. Generally, we seek upper bounds on the running time, because everybody likes a guarantee.

Kinds of analyses
Worst-case (usually): T(n) = maximum time of algorithm on any input of size n.
Average-case (sometimes): T(n) = expected time of algorithm over all inputs of size n. Need assumption of statistical distribution of inputs.
Best-case (bogus): cheat with a slow algorithm that works fast on some input.

Asymptotic analysis: machine-independent time
What is insertion sort's worst-case time? It depends on the speed of our computer: relative speed (on the same machine), absolute speed (on different machines).
BIG IDEA: Ignore machine-dependent constants and look at the growth of T(n) as n → ∞. This is asymptotic analysis.

Θ-notation
Math: Θ(g(n)) = { f(n) : there exist positive constants c_1, c_2, and n_0 such that 0 ≤ c_1·g(n) ≤ f(n) ≤ c_2·g(n) for all n ≥ n_0 }.
Engineering: drop low-order terms; ignore leading constants.
Example: 3n³ + 90n² − 5n + 6046 = Θ(n³).
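The definition can be exercised numerically. A minimal sketch, using a cubic like the slide's example; the witness constants c1 = 3, c2 = 100, n0 = 5 are hand-picked for this illustration, not from the slides:

```python
def f(n):
    # Example cubic: the low-order terms and constants vanish asymptotically.
    return 3 * n**3 + 90 * n**2 - 5 * n + 6046

def g(n):
    return n**3

# Hand-picked witness constants for the claim f(n) = Θ(n^3).
c1, c2, n0 = 3, 100, 5

# Check the sandwich 0 <= c1*g(n) <= f(n) <= c2*g(n) over a range of n >= n0.
ok = all(0 <= c1 * g(n) <= f(n) <= c2 * g(n) for n in range(n0, 5000))
```

A finite check like this is evidence, not a proof; the actual proof exhibits constants that work for every n ≥ n0.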

Asymptotic performance
When n gets large enough, a Θ(n²) algorithm always beats a Θ(n³) algorithm. We shouldn't ignore asymptotically slower algorithms, however. Real-world design situations often call for a careful balancing of engineering objectives. Asymptotic analysis is a useful tool to help to structure our thinking.
[Graph of T(n) versus n, showing the crossover at n_0, omitted.]

Insertion sort analysis
Worst case (input reverse sorted): T(n) = Σ_{j=2..n} Θ(j) = Θ(n²)  [arithmetic series].
Average case (all permutations equally likely): T(n) = Σ_{j=2..n} Θ(j/2) = Θ(n²).
Is insertion sort a fast sorting algorithm? Moderately so, for small n. Not at all, for large n.
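The worst-case bound follows from the arithmetic series hinted at on the slide: in the reverse-sorted case, iteration j shifts j − 1 elements. Writing out the sum of per-iteration costs:

```latex
T(n) = \sum_{j=2}^{n} \Theta(j)
     = \Theta\!\Bigl(\sum_{j=2}^{n} j\Bigr)
     = \Theta\!\Bigl(\frac{n(n+1)}{2} - 1\Bigr)
     = \Theta(n^2).
```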

Merge sort
MERGE-SORT A[1..n]
1. If n = 1, done.
2. Recursively sort A[1..⌈n/2⌉] and A[⌈n/2⌉+1..n].
3. Merge the two sorted lists.
Key subroutine: MERGE
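A runnable sketch of the two pieces, the MERGE subroutine and the recursive sort; the Python names and the list-slicing style are ours:

```python
def merge(left, right):
    """Merge two sorted lists into one sorted list in Theta(n) time."""
    out = []
    i = j = 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            out.append(left[i]); i += 1
        else:
            out.append(right[j]); j += 1
    out.extend(left[i:])   # at most one of these two
    out.extend(right[j:])  # tails is non-empty
    return out

def merge_sort(a):
    """MERGE-SORT: split at ceil(n/2), sort each half recursively, merge."""
    n = len(a)
    if n <= 1:
        return list(a)
    mid = (n + 1) // 2  # ceil(n/2), as in the pseudocode
    return merge(merge_sort(a[:mid]), merge_sort(a[mid:]))
```

merge does Θ(n) work because each comparison consumes one element, so merge_sort satisfies the recurrence T(n) = 2T(n/2) + Θ(n) analyzed below.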

Merging two sorted arrays
[A sequence of slides steps through the merge: repeatedly compare the smallest remaining element of each array, output the smaller, and advance. Figures omitted.]
Time = Θ(n) to merge a total of n elements (linear time).

Analyzing merge sort
MERGE-SORT A[1..n]                                   T(n)
1. If n = 1, done.                                   Θ(1)
2. Recursively sort A[1..⌈n/2⌉] and A[⌈n/2⌉+1..n].   2T(n/2)
3. Merge the 2 sorted lists.                         Θ(n)
Sloppiness: should be T(⌈n/2⌉) + T(⌊n/2⌋), but it turns out not to matter asymptotically.

Recurrence for merge sort
T(n) = Θ(1) if n = 1;  2T(n/2) + Θ(n) if n > 1.
We shall usually omit stating the base case when T(n) = Θ(1) for sufficiently small n, but only when it has no effect on the asymptotic solution to the recurrence. CLRS and Lecture 2 provide several ways to find a good upper bound on T(n).

Recursion tree
Solve T(n) = 2T(n/2) + cn, where c > 0 is constant.
[A sequence of slides expands the recurrence into a tree: the root costs cn, its two children cost cn/2 each, the four grandchildren cost cn/4 each, and so on down to Θ(1) leaves.]
Height: h = lg n. Each level sums to cn. #leaves = n, contributing Θ(n).
Total = Θ(n lg n).

Conclusions
Θ(n lg n) grows more slowly than Θ(n²). Therefore, merge sort asymptotically beats insertion sort in the worst case. In practice, merge sort beats insertion sort for n > 30 or so. Go test it out for yourself!
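The invitation to test it out can be taken literally. A rough benchmarking sketch; the exact crossover depends heavily on machine, language, and implementation constants, so in CPython it may differ from the slide's n ≈ 30 figure:

```python
import random
import time

def insertion_sort(a):
    # In-place insertion sort, as in the INSERTION-SORT pseudocode.
    for j in range(1, len(a)):
        key, i = a[j], j - 1
        while i >= 0 and a[i] > key:
            a[i + 1] = a[i]
            i -= 1
        a[i + 1] = key
    return a

def merge_sort(a):
    # Top-down merge sort with an inlined linear-time merge.
    if len(a) <= 1:
        return list(a)
    mid = len(a) // 2
    left, right = merge_sort(a[:mid]), merge_sort(a[mid:])
    out, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            out.append(left[i]); i += 1
        else:
            out.append(right[j]); j += 1
    return out + left[i:] + right[j:]

def clock(sort, n, trials=5):
    # Total wall time to sort `trials` random lists of length n.
    inputs = [random.sample(range(10 * n), n) for _ in range(trials)]
    start = time.perf_counter()
    for data in inputs:
        sort(list(data))
    return time.perf_counter() - start

if __name__ == "__main__":
    for n in (10, 30, 100, 1000):
        print(n, clock(insertion_sort, n), clock(merge_sort, n))
```

At small n, insertion sort's tiny constant factors win; as n grows, the Θ(n²) versus Θ(n lg n) gap dominates, whatever the exact crossover on your machine.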
