Chapter 1 – Basic Concepts


1 Chapter 1 – Basic Concepts
Data Structures

2 1.5 Algorithm Specification
Definition: An algorithm is a finite set of instructions that, if followed, accomplishes a particular task and satisfies the following criteria:
Input: zero or more quantities are externally supplied.
Output: at least one quantity is produced.
Definiteness: each instruction must be clear and unambiguous.
Finiteness: the algorithm terminates after a finite number of steps.
Effectiveness: every instruction must be basic enough to be carried out.

3 Example 1.2 Selection Sort
Goal: to sort a collection of n≧1 integers into non-decreasing order. Idea: from the integers that are currently unsorted, find the smallest and place it next in the sorted list.

4 Selection Sort Algorithm
An algorithm is often written partially in C++ and partially in English. Program 1.7: Assume the integers are initially stored in an array a of length n.
    SelectionSort(int a[], int n)
    {
        for (int i = 0; i < n; i++)
        {
            examine a[i] to a[n-1] and suppose the smallest integer is at a[j];
            interchange a[i] and a[j];
        }
    }

5 C++ Implementation of Selection Sort
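The C++ code on this slide is not included in the transcript. The following is a minimal sketch that follows the outline of Program 1.7; the exact loop structure and the use of std::swap are assumptions rather than the slide's original code.

    #include <utility>   // std::swap

    // Sort a[0..n-1] into non-decreasing order by repeatedly selecting the
    // smallest remaining element and placing it next in the sorted prefix.
    void SelectionSort(int a[], int n)
    {
        for (int i = 0; i < n; i++)
        {
            int j = i;                      // index of the smallest element in a[i..n-1]
            for (int k = i + 1; k < n; k++)
                if (a[k] < a[j]) j = k;
            std::swap(a[i], a[j]);          // interchange a[i] and a[j]
        }
    }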

6 1.7.1 Performance Analysis
A priori performance evaluation.
Space complexity: the amount of memory required to run a program.
Time complexity: the amount of computer time required to run a program.
Instance characteristic: the number/size/scale of the data a problem deals with.

7 Space Complexity
Let S(P) be the space requirement of any program P:
S(P) = c + S_P(n)
where c is a constant (the fixed part) and n denotes the instance characteristic. When analyzing S(P), we concentrate solely on estimating the variable part S_P(n).

8 float Abc(float a, float b, float c)
Program 1.16:
    float Abc(float a, float b, float c)
    {
        return a + b + b*c + (a + b - c) / (a + b) + 4.0;
    }
The space requirement has only a fixed part: the space to store a, b, c, and the return value. Therefore the variable part S_Abc(n) = 0.

9 float Sum(float *a, const int n)
Program 1.17:
    float Sum(float *a, const int n)
    {
        float s = 0;
        for (int i = 0; i < n; i++)
            s += a[i];
        return s;
    }
The instance characteristic is n. Since a is the address of the first element of a[] and n is passed by value, the space needed by Sum() is constant: S_Sum(n) = 0.

10 Program 1.18
    float RSum(float *a, const int n)
    {
        if (n <= 0) return 0;
        else return (RSum(a, n-1) + a[n-1]);
    }
Each call requires at least 4 words: space for the values of n and a, the return value, and the return address. The depth of the recursion is n+1, so the stack space needed is 4(n+1). (The slide illustrates the stack for a call with n = 1000: frames for n = 1000, 999, 998, 997, … are pushed.)

11 Time Complexity
Let T(P) be the time requirement of any program P (compile time + run time). We shall concern ourselves with the run time t_P(n), where n denotes the instance characteristic. It can be estimated in several ways:
Count a particular operation.
Count program steps (see the sketch below).
Asymptotic complexity: complexity in terms of O, Ω, and Θ.
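As a minimal sketch of the step-counting approach (the global count variable and the placement of the increments are illustrative assumptions in the style of Program 1.17, not code from the slides):

    int count = 0;   // global step counter, for illustration only

    float Sum(float *a, const int n)
    {
        float s = 0;
        count++;                 // step for the assignment to s
        for (int i = 0; i < n; i++)
        {
            count++;             // step for the loop test/increment
            s += a[i];
            count++;             // step for the assignment to s
        }
        count++;                 // step for the final loop test
        count++;                 // step for the return
        return s;
    }

Running this instrumented version leaves count = 2n + 3, so the step count of Sum() grows linearly with n.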

12 Asymptotic Notation
When comparing the time complexity of two programs, it is difficult to determine an exact step count, since a single step may comprise several smaller operations.
Consider 2n+2 and 1000n+3: when n is very large, coefficients and constant terms become less important.
Consider 1000n+3 and 2n²+4: as n increases, 2n²+4 grows much faster than 1000n+3.
We shall therefore concern ourselves with how a program's running time grows with the instance characteristic n (see the comparison sketch below).
(The slide's figure plots the growth of 2n+2, 1000n+3, and 2n²+4.)
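As a small sketch of this comparison (the chosen values of n and the output format are assumptions), the three expressions can be tabulated to show how 2n²+4 eventually dominates:

    #include <cstdio>

    int main()
    {
        // Tabulate 2n+2, 1000n+3, and 2n^2+4 for increasing n.
        for (long long n = 1; n <= 100000; n *= 10)
            std::printf("n=%7lld   2n+2=%10lld   1000n+3=%12lld   2n^2+4=%14lld\n",
                        n, 2*n + 2, 1000*n + 3, 2*n*n + 4);
        return 0;
    }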

13 Definition [Big “oh”]
f(n) = O(g(n)) iff there exist positive constants c and n₀ such that f(n) ≦ cg(n) for all n ≧ n₀.
Example 1: prove 3n+2 = O(n).
Suppose f(n) = 3n+2 and g(n) = n.
∵ Let c = 4 and n₀ = 2; then 3n+2 ≦ 4n for all n ≧ 2.
∴ 3n+2 = O(n)
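A tiny spot-check of the inequality used in Example 1 (the range of n tested and the use of assert are arbitrary choices, not part of the slide):

    #include <cassert>

    int main()
    {
        // Check that 3n+2 <= 4n holds for all n >= n0 = 2, up to an arbitrary bound.
        for (int n = 2; n <= 1000000; n++)
            assert(3*n + 2 <= 4*n);
        return 0;
    }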

14 Definition [Big “oh”] Example 2:
Prove 1000n²+10n-6 = O(n²).
Suppose f(n) = 1000n²+10n-6 and g(n) = n².
∵ Let c = 2000 and n₀ = 1; then 1000n²+10n-6 ≦ 2000n² for all n ≧ n₀ = 1.
∴ 1000n²+10n-6 = O(n²)
In other words, O(·) denotes an upper bound for f(n).

15 Note
The coefficient of g(n) is usually 1.
g(n) should be the smallest function such that f(n) = O(g(n)).
It is usually written as “3n+2 = O(n)” rather than “O(n) = 3n+2”.

16 Theorem 1.2
If f(n) = aₘnᵐ + … + a₁n + a₀, then f(n) = O(nᵐ).

17 Definition [Omega]
f(n) = Ω(g(n)) iff there exist positive constants c and n₀ such that f(n) ≧ cg(n) for all n ≧ n₀.
Example 1: prove 3n+2 = Ω(n).
Suppose f(n) = 3n+2 and g(n) = n.
∵ Let c = 2 and n₀ = 1; then 3n+2 ≧ 2n for all n ≧ n₀ = 1.
∴ 3n+2 = Ω(n).

18 Definition [Omega]
Ω(·) denotes a lower bound for f(n).
As with O(·), g(n) usually has coefficient 1; here g(n) should be the largest function such that f(n) = Ω(g(n)).
Theorem 1.3: If f(n) = aₘnᵐ + … + a₁n + a₀ and aₘ > 0, then f(n) = Ω(nᵐ).

19 Comparison
Constant time: O(1)
Polynomial time:
  Linear: O(n)
  Quadratic: O(n²)
  Cubic: O(n³)
Exponential time: O(2ⁿ)
O(1) < O(log n) < O(n) < O(n log n) < O(n²) < O(n³) < O(2ⁿ) < O(n!)

20 Definition [Theta]
f(n) = Θ(g(n)) iff there exist positive constants c₁, c₂, and n₀ such that c₁g(n) ≦ f(n) ≦ c₂g(n) for all n ≧ n₀.
Example 1: prove 3n+2 = Θ(n).
∵ Let c₁ = 3, c₂ = 4, and n₀ = 2; then 3n ≦ 3n+2 ≦ 4n for all n ≧ n₀ = 2.
∴ 3n+2 = Θ(n).

21 Definition [Theta]
f(n) = Θ(g(n)) means f(n) is bounded both above and below by g(n) (up to constant factors); g(n) is both an upper and a lower bound on f(n).
Theorem 1.4: If f(n) = aₘnᵐ + … + a₁n + a₀ and aₘ > 0, then f(n) = Θ(nᵐ).

22 Asymptotic Analysis
Asymptotic notations are used to evaluate the computation time of an algorithm.
O and Ω are used for the worst-case and best-case computation time of an algorithm, respectively.
Usually we consider the worst case (O(·)) rather than the average case: O(·) provides an upper bound for the entire execution, and average time is hard to define.

23 A Simple Example
Program 1.17:
    float Sum(float *a, const int n)
    {
1       float s = 0;
2       for (int i = 0; i < n; i++)
3           s += a[i];
4       return s;
    }
Line 1: O(1).
Lines 2-3: line 3 executes in O(1) time and repeats n times, for O(n) time in total.
Line 4: O(1).
Overall, the computation time of Program 1.17 is O(1) + O(n) + O(1) = O(n).

24 A Recursive Example
    float RSum(float *a, const int n)
    {
1       if (n <= 0) return 0;
2       else return (RSum(a, n-1) + a[n-1]);
    }
Let T_RSum(n) denote the computation time to execute RSum() on an input of size n.
Line 1: O(1).
Line 2: T_RSum(n-1) + O(1).
Therefore,
T_RSum(n) = O(1) + T_RSum(n-1)
          = O(1) + O(1) + T_RSum(n-2)
          = O(1) + O(1) + … + T_RSum(0)    (n terms of O(1), plus T_RSum(0) = O(1))
          = (n+1)·O(1)
          = O(n)

25 Example 1.3: Binary Search
Assume there are n≧1 distinct integers, sorted and stored in the array a[0], a[1], …, a[n-1]. Objective: determine whether the integer x is in the array. If so, return the index j such that x = a[j]; otherwise, return -1.

26 Idea
Let left and right denote the left and right ends of the list to be searched; initially left = 0 and right = n-1.
Let middle = (left + right) / 2. There are three cases:
x < a[middle]: search for x in the left half; set right = middle - 1.
x == a[middle]: x is found; return middle.
x > a[middle]: search for x in the right half; set left = middle + 1.
If left > right, x is not found.

27 Iterative Binary Search
    int BinarySearch(int *a, const int x, const int n)
    {
1       int left = 0, right = n - 1;              // initialize left and right
2       while (left <= right)
3       {
4           int middle = (left + right) / 2;
5           if (x < a[middle]) right = middle - 1;
6           else if (x > a[middle]) left = middle + 1;
7           else return middle;
8       }
9       return -1;                                // x is not in a[0..n-1]
    }
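A short usage sketch, assuming the BinarySearch() above is in the same translation unit (the sample array and the main() driver are assumptions, not part of the slide):

    #include <cstdio>

    int main()
    {
        int a[] = {2, 5, 7, 11, 13, 17, 23};            // n = 7 sorted, distinct integers
        std::printf("%d\n", BinarySearch(a, 13, 7));    // prints 4, since a[4] == 13
        std::printf("%d\n", BinarySearch(a, 6, 7));     // prints -1: 6 is not in the array
        return 0;
    }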

28 Performance Analysis of Iterative Binary Search
Choose n as the instance characteristic. Space complexity: memory is needed only for the values of x, n, the return value, the start address of a[], and the local variables left, right, and middle, all of which is independent of n. Therefore, the space complexity is O(1).

29 Performance Analysis of Iterative Binary Search
Time complexity:
Lines 1 and 9: O(1).
Lines 2 to 8: a while-loop whose body (lines 4 to 7) executes in O(1) time per iteration. The loop ends when the length of the list to be searched reaches 0, and the length is roughly halved in each iteration, so there are O(log₂n) iterations. In total: O(1) × O(log₂n) = O(log₂n).
Overall, the time complexity is O(log₂n).

30 Recursive Binary Search
    int BinarySearch(int *a, const int x, const int left, const int right)
    {
        if (left <= right)
        {
            int middle = (left + right) / 2;
            if (x < a[middle]) return BinarySearch(a, x, left, middle - 1);
            if (x > a[middle]) return BinarySearch(a, x, middle + 1, right);
            return middle;
        }
        return -1;
    }
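A usage sketch of the recursive version (the sample array and driver are assumptions); note that the initial call passes the whole index range 0 to n-1:

    #include <cstdio>

    int main()
    {
        int a[] = {2, 5, 7, 11, 13, 17, 23};                // n = 7 sorted, distinct integers
        std::printf("%d\n", BinarySearch(a, 13, 0, 6));     // prints 4, since a[4] == 13
        std::printf("%d\n", BinarySearch(a, 6, 0, 6));      // prints -1: 6 is not in the array
        return 0;
    }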

31 Performance Analysis of Recursive Binary Search
Choose n, the length of the list (right - left + 1), as the instance characteristic. Space complexity: O(1) memory is used by each invocation of the function. In the worst case (x is not found), the recursion depth is O(log₂n), so the stack space used is O(log₂n). Therefore, the space complexity is O(log₂n).

32 Performance Analysis of Recursive Binary Search
Time complexity: let T(n) denote the computation time to execute BinarySearch() on a list of length n. Then
T(n) = O(1) + T(n/2)
     = O(1) + O(1) + T(n/4)
     = O(1) + O(1) + … + T(0)    (O(log₂n) terms of O(1))
     = O(log₂n) · O(1)
     = O(log₂n)

