Algorithm Efficiency and Sorting
Bina Ramamurthy, CSE116A,B
Introduction
- The basic mathematical techniques for analyzing algorithms are central to more advanced topics in computer science, and they give you a way to formalize the notion that one algorithm is significantly more efficient than another.
- We will also study sorting algorithms, which provide varied and relatively easy examples for the analysis of efficiency.
Topics to be Covered
- Comparing programs
- Time as a function of problem size
- Big-O notation
- Growth-rate functions
- Worst-case and average-case analyses
- Summary
Comparing Programs
- Comparing the execution times of programs, rather than the algorithms themselves, has the following difficulties:
  1. Measured efficiency may depend on the implementation of an algorithm rather than on the algorithm itself.
  2. It depends on the computer or hardware on which the program is run.
  3. It depends on the particular instance of data that is used.
- Algorithm analysis should be independent of specific implementations, computers, and data.
Example 1
    Node cur = head;                    // 1 assignment
    while (cur != null) {               // N+1 comparisons (the final test fails)
        System.out.println(cur.data);   // N writes
        cur = cur.next;                 // N assignments
    }
Time = (N+1)a + (N+1)c + Nw, where a is the time for an assignment, c the time for a comparison, and w the time for a write. Put simply, the time is proportional to N.
Time as a function of problem size
- Absolute time expressions have the same difficulties as comparing execution times:
  - Alg. A requires N^2/5 time units.
  - Alg. B requires 5 * N time units.
- Moreover, the attribute of interest is how quickly an algorithm's time requirement grows as a function of the problem size.
- Instead of the expressions above:
  - Alg. A requires time proportional to N^2.
  - Alg. B requires time proportional to N.
- These statements characterize the inherent efficiency of the algorithms, independent of factors such as particular computers and implementations.
- The analyses are done for large values of N.
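As a small added illustration (not from the original slides), the following Java sketch prints the two hypothetical time requirements for increasing N; the quadratic expression N^2/5 overtakes the linear 5 * N once N exceeds 25, which is why the growth rate matters more than the constants for large problems.

    public class GrowthComparison {
        public static void main(String[] args) {
            // Compare the two hypothetical time requirements from the slide:
            // Alg. A needs N^2 / 5 time units, Alg. B needs 5 * N time units.
            for (int n = 5; n <= 100; n += 5) {
                double algA = (double) n * n / 5;   // quadratic growth
                double algB = 5.0 * n;              // linear growth
                System.out.printf("N=%3d  A=%8.1f  B=%6.1f%n", n, algA, algB);
            }
            // For every N > 25, algA exceeds algB, and the gap keeps widening.
        }
    }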
Order-of-Magnitude Analysis
- If Alg. A requires time proportional to f(N), Alg. A is said to be order f(N), denoted O(f(N)).
- f(N) is called the algorithm's growth-rate function.
- Because the notation uses the upper-case O to denote order, it is called Big O notation.
- If a problem of size N requires time directly proportional to N, the problem is O(N); if the time is proportional to N^2, it is O(N^2), and so on.
Key concepts
- Formal definition of the order of an algorithm: Algorithm A is order f(N), denoted O(f(N)), if constants c and N0 exist such that A requires no more than c * f(N) time units to solve a problem of size N >= N0.
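As an added worked example of the definition (not from the slides): suppose algorithm A requires 3N + 10 time units for a problem of size N. Then A is O(N), because we can choose c = 4 and N0 = 10; for every N >= 10 we have 3N + 10 <= 3N + N = 4N = c * f(N).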
Interpretation of growth-rate functions
- 1: A growth-rate function of 1 implies a problem whose time requirement is constant and therefore independent of the problem size.
- log2 N: The time requirement of a logarithmic algorithm increases slowly as the problem size increases; for example, if you square the problem size, you only double the time requirement.
- N: Linear. The time requirement increases directly with the size of the problem.
- N^2: Quadratic. Algorithms that use two nested loops are typical examples.
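A minimal added Java sketch (for illustration only) of why repeatedly halving the problem gives logarithmic behavior: the loop body below runs roughly log2 N times.

    public class HalvingDemo {
        public static void main(String[] args) {
            int n = 1_000_000;
            int steps = 0;
            // Repeatedly halve the problem size, as a binary search does.
            for (int size = n; size >= 1; size /= 2) {
                steps++;
            }
            // steps is about log2(n) + 1, i.e. 20 for n = 1,000,000.
            System.out.println("Problem of size " + n + " is halved away in " + steps + " steps");
        }
    }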
Interpretation of growth-rate functions (continued)
- N^3: Cubic. The time requirement for a cubic algorithm increases more rapidly; three nested loops are a typical example.
- 2^N: Exponential. The time requirement grows too rapidly for such algorithms to be of any practical use.
- N * log2 N: Typical of algorithms that divide the problem into subproblems and solve them.
Properties of growth-rate functions
- You can ignore low-order terms in an algorithm's growth-rate function. Example: O(N^3 + 4N^2 + 3N) is O(N^3).
- You can ignore a multiplicative constant in the high-order term of a growth-rate function. Example: O(5 * N^3) is O(N^3).
- You can combine growth-rate functions. Example: O(f(N)) + O(g(N)) = O(f(N) + g(N)).
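A short added Java sketch (not from the slides) of combining growth rates: a linear phase followed by a quadratic phase costs O(N) + O(N^2), which simplifies to O(N^2).

    public class CombinedPhases {
        // Counts the pairs of elements that sum to the array's maximum value.
        // Assumes a non-empty array; the method name is illustrative only.
        static int pairsSummingToMax(int[] a) {
            int n = a.length;

            int max = a[0];
            for (int i = 1; i < n; i++) {          // first phase: one pass, O(N)
                if (a[i] > max) max = a[i];
            }

            int count = 0;
            for (int i = 0; i < n; i++) {          // second phase: two nested loops, O(N^2)
                for (int j = i + 1; j < n; j++) {
                    if (a[i] + a[j] == max) count++;
                }
            }
            // Combined cost: O(N) + O(N^2) = O(N + N^2) = O(N^2).
            return count;
        }
    }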
Worst-case and average-case analyses
- A particular algorithm may require different times to solve different problems of the same size.
- For example: searching for an element in a sorted list.
- Worst-case analysis gives a pessimistic time estimate.
- Average-case analysis attempts to determine the average amount of time that an algorithm requires to solve problems of size N.
How to use order-of-magnitude analysis?
- For example, an array-based listRetrieve is O(1): whether you retrieve the nth element or the 1st element, it takes the same time to access it.
- A linked-list-based listRetrieve is O(N): the retrieval time depends on the position of the element in the list.
- When using an ADT's implementation, consider how frequently particular ADT operations occur in a given application.
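An added Java sketch (with an illustrative Node class, not the course's own code) contrasting the two retrievals just described: array indexing takes constant time, while a linked list must follow i links.

    class Node {
        int data;
        Node next;
        Node(int data) { this.data = data; }
    }

    public class RetrieveDemo {
        // Array-based retrieval: a single index computation, O(1).
        static int arrayRetrieve(int[] items, int i) {
            return items[i];
        }

        // Linked-list-based retrieval: follow i links from the head, O(N) in the worst case.
        static int listRetrieve(Node head, int i) {
            Node cur = head;
            for (int k = 0; k < i; k++) {
                cur = cur.next;
            }
            return cur.data;
        }
    }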
How to …?
- If the problem size is small, you can ignore an algorithm's efficiency.
- Compare algorithms for both style and efficiency.
- Order-of-magnitude analysis focuses on large problems.
- Sometimes you may have to weigh the trade-off between an algorithm's time requirements and its memory requirements.
Efficiency of search algorithms
- Linear search (sequential search):
  - Best case: the first element is the required element: O(1).
  - Worst case: the last element is the required element, or the element is not present: O(N).
  - Average case: about N/2 comparisons; after dropping the multiplicative constant 1/2, this is O(N).
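A minimal added Java version of linear search matching the cases above; it returns the index of the target, or -1 if the target is absent.

    public class LinearSearch {
        // Scans the array from the front: best case 1 comparison, worst case N.
        static int linearSearch(int[] a, int target) {
            for (int i = 0; i < a.length; i++) {
                if (a[i] == target) {
                    return i;    // found after i + 1 comparisons
                }
            }
            return -1;           // not found after N comparisons
        }
    }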
Binary search algorithm
- The search requires the following steps:
  1. Inspect the middle item of an array of size N.
  2. Inspect the middle item of an array of size N/2.
  3. Inspect the middle item of an array of size N/2^2, and so on, until N/2^k = 1.
- This implies k = log2 N.
- k is the number of partitions (halvings).
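An added Java sketch of binary search on a sorted array (an illustration rather than the course's own code); each iteration halves the remaining portion, which gives the O(log2 N) behavior analyzed on the next slide.

    public class BinarySearch {
        // Assumes 'a' is sorted in ascending order; returns the index of target or -1.
        static int binarySearch(int[] a, int target) {
            int low = 0;
            int high = a.length - 1;
            while (low <= high) {
                int mid = low + (high - low) / 2;   // middle of the remaining portion
                if (a[mid] == target) {
                    return mid;
                } else if (a[mid] < target) {
                    low = mid + 1;                  // discard the left half
                } else {
                    high = mid - 1;                 // discard the right half
                }
            }
            return -1;                              // not found after about log2(N) halvings
        }
    }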
Binary search algorithm
- Best case: O(1).
- Worst case: O(log2 N).
- Average case: (log2 N)/2, which is still O(log2 N) once the multiplicative constant is dropped.
Efficiency of sort algorithms
- We will consider internal sorts (not external sorts).
- Selection sort, bubble sort (exchange sort), insertion sort, merge sort, quicksort, radix sort.
- For each sort, study:
  1. The algorithm
  2. The analysis and the order-of-magnitude expression
  3. Applications
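As a preview, here is an added sketch (not the course's own code) of selection sort, the first algorithm in the list above; its two nested loops make it O(N^2).

    public class SelectionSort {
        // Repeatedly selects the smallest remaining element and swaps it into place.
        static void selectionSort(int[] a) {
            int n = a.length;
            for (int i = 0; i < n - 1; i++) {
                int minIndex = i;
                for (int j = i + 1; j < n; j++) {   // find the smallest element in a[i..n-1]
                    if (a[j] < a[minIndex]) {
                        minIndex = j;
                    }
                }
                int tmp = a[i];                      // swap it into position i
                a[i] = a[minIndex];
                a[minIndex] = tmp;
            }
            // (N-1) + (N-2) + ... + 1 = N(N-1)/2 comparisons, which is O(N^2).
        }
    }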