1 Algorithm & Application. Algorithm: a step-by-step procedure for solving a problem. Prof. Hyunchul Shin, Hanyang University

2 Textbook: Foundations of Algorithms, Richard Neapolitan and Kumarss Naimipour, 3rd Edition, Jones and Bartlett Computer Science, 2004. Time: CPU cycles. Storage: memory. Instance: each specific assignment of values to the parameters.

3 Problem: Is the number x in the list S of n numbers? The answer is yes if x is in S and no if it is not. (ex) S = {10, 7, 11, 5, 13, 8}, n = 6, and x = 5. Solution: yes. Algorithm: search(S, n, x) { for (i = 1; i <= n; i++) if (S[i] == x) return yes; return no; } /* cf. text p. 5 */
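A minimal runnable C++ sketch of the sequential search above; the function name, 0-based indexing, and use of std::vector are my own choices.

#include <iostream>
#include <vector>

// Sequential search: returns true if x occurs anywhere in S, false otherwise.
bool sequential_search(const std::vector<int>& S, int x) {
    for (int value : S)              // examine each item in turn
        if (value == x) return true;
    return false;
}

int main() {
    std::vector<int> S = {10, 7, 11, 5, 13, 8};
    std::cout << (sequential_search(S, 5) ? "yes" : "no") << "\n";   // prints "yes"
}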

4 Exchange Sort. Problem: sort n keys in nondecreasing order. Inputs: n, S[1], …, S[n]. Outputs: the keys in array S, sorted. Algorithm: Exchange Sort { for (i = 1; i <= n; i++) for (j = i + 1; j <= n; j++) if (S[j] < S[i]) exchange S[i] and S[j] }
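A compact C++ version of the same double loop (0-based indexing; the function name is an assumption of mine, not from the text).

#include <utility>   // std::swap
#include <vector>

// Exchange sort: after iteration i, S[i] holds the smallest remaining key.
void exchange_sort(std::vector<int>& S) {
    int n = static_cast<int>(S.size());
    for (int i = 0; i < n - 1; ++i)
        for (int j = i + 1; j < n; ++j)
            if (S[j] < S[i]) std::swap(S[i], S[j]);
}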

5 Algorithm: Exchange Sort. (ex) n = 4, S = [ ]. { for (i = 1; i <= n; i++) for (j = i + 1; j <= n; j++) if (S[j] < S[i]) exchange S[i] and S[j] } Homework: show the trace columns i, j, S for exchange sort of S = [ ]. Due in 1 week.

6 Matrix Multiplication. C(n×n) = A(n×n) · B(n×n), where C[i][j] = Σ(k = 1..n) A[i][k] · B[k][j], for i <= n, j <= n. Algorithm { /* Matrix multiplication */ for (i = 1; i <= n; i++) for (j = 1; j <= n; j++) { C[i][j] = 0; for (k = 1; k <= n; k++) C[i][j] = C[i][j] + A[i][k] × B[k][j]; } }

7 Fibonacci Sequence. f(0) = 0, f(1) = 1, f(n) = f(n-1) + f(n-2) for n >= 2. (ex) f(2) = f(1) + f(0) = 1 + 0 = 1; f(3) = f(2) + f(1) = 1 + 1 = 2; f(4) = f(3) + f(2) = 2 + 1 = 3; f(5) = f(4) + f(3) = 3 + 2 = 5; …

8 Fibonacci (Recursive) int fib (int n) { /*divide-and-conquer : chap2 */ if(n<=1) return n; else return( fib(n-1) + fib(n-2) ); } (ex) fib(5) computation

9 Fibonacci (Iterative). int fib_iter(int n) { /* dynamic programming: chap 3 */ index i; int f[0..n]; f[0] = 0; if (n > 0) { f[1] = 1; for (i = 2; i <= n; i++) f[i] = f[i-1] + f[i-2]; } return f[n]; } Complexity (cf. text p. 16): the recursive fib(100) takes about 13 days, while fib_iter(100) takes 101 ns.
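A runnable C++ rendering of the iterative routine above; the use of std::vector in place of the pseudocode array f[0..n] and the long long return type are my own choices.

#include <vector>

// Iterative (dynamic programming) Fibonacci: fill the table bottom-up.
long long fib_iter(int n) {
    if (n <= 1) return n;
    std::vector<long long> f(n + 1);
    f[0] = 0;
    f[1] = 1;
    for (int i = 2; i <= n; ++i)
        f[i] = f[i - 1] + f[i - 2];    // each term is computed exactly once
    return f[n];                       // note: overflows 64-bit integers for n > 92
}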

10 Complexity: Exchange Sort. Algorithm: Exchange Sort { for (i = 1; i <= n; i++) for (j = i + 1; j <= n; j++) if (S[j] < S[i]) exchange S[i] and S[j] } Basic operation: comparison of S[j] with S[i]. Input size: n, the number of items to be sorted. Complexity: the number of basic operations, T(n) = (n-1) + (n-2) + (n-3) + … + 1 = n(n-1)/2 ∈ O(n²)

11 Complexity: Matrix Multiplication. Algorithm { /* Matrix multiplication */ for (i = 1; i <= n; i++) for (j = 1; j <= n; j++) { C[i][j] = 0; for (k = 1; k <= n; k++) C[i][j] = C[i][j] + A[i][k] × B[k][j]; } } Basic operation: multiplication (innermost for loop). Input size: n, the number of rows and columns. Complexity: T(n) = n × n × n = n³ ∈ O(n³)

12 Memory Complexity Analysis of algorithm efficiency in terms of memory. Time complexity is usually used. Memory complexity is occasionally useful.

13 Order : Big O

14 Divide and Conquer Top-Down Approach (p47) Divide the problem into subproblems Conquer subproblems Obtain the solution from the solutions of subproblems Binary search Problem: Is x in the sorted array S of size n ? Inputs: Sorted array S, a key x. Outputs: Location of x in S (0 if x is not in S)

15 Binary Search. locationout = location(1, n); index location(index low, index high) { index mid; if (low > high) return 0; else { mid = (low + high) / 2; if (x == S[mid]) return mid; else if (x < S[mid]) return location(low, mid - 1); else return location(mid + 1, high); } }

16 Worst-Case Complexity: Binary Search. locationout = location(1, n); index location(index low, index high) { index mid; if (low > high) return 0; else { mid = (low + high) / 2; if (x == S[mid]) return mid; else if (x < S[mid]) return location(low, mid - 1); else return location(mid + 1, high); } }
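A self-contained C++ sketch of the same recursive search; the 0-based indexing, the -1 "not found" return value, and passing S and x as parameters are my adaptations.

#include <vector>

// Recursive binary search on a sorted vector.
// Returns the index of x, or -1 if x is not present.
int location(const std::vector<int>& S, int x, int low, int high) {
    if (low > high) return -1;                     // empty range: not found
    int mid = (low + high) / 2;
    if (x == S[mid]) return mid;
    else if (x < S[mid]) return location(S, x, low, mid - 1);
    else return location(S, x, mid + 1, high);
}
// Usage: int pos = location(S, x, 0, static_cast<int>(S.size()) - 1);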

17 Complexity: Binary Search. W(n) = W(n/2) + 1 for n > 1, n a power of 2; W(1) = 1. It appears that W(n) = log n + 1. (Induction base) For n = 1, W(1) = 1 = log 1 + 1. (Induction hypothesis) Assume that W(n) = log n + 1. (Induction step) We must show W(2n) = log(2n) + 1; indeed, W(2n) = W(n) + 1 = (log n + 1) + 1 = log n + log 2 + 1 = log(2n) + 1.

18 Merge Sort (O(n log n))
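The slide shows only the figure; as a hedged illustration of the standard idea (split the range in half, sort each half recursively, merge the sorted halves), a C++ sketch with names of my own choosing:

#include <vector>

// Merge two adjacent sorted ranges S[low..mid] and S[mid+1..high] into one.
static void merge(std::vector<int>& S, int low, int mid, int high) {
    std::vector<int> tmp;
    int i = low, j = mid + 1;
    while (i <= mid && j <= high)
        tmp.push_back(S[i] <= S[j] ? S[i++] : S[j++]);
    while (i <= mid)  tmp.push_back(S[i++]);
    while (j <= high) tmp.push_back(S[j++]);
    for (int k = 0; k < static_cast<int>(tmp.size()); ++k) S[low + k] = tmp[k];
}

void mergesort(std::vector<int>& S, int low, int high) {
    if (low >= high) return;          // 0 or 1 element: already sorted
    int mid = (low + high) / 2;
    mergesort(S, low, mid);
    mergesort(S, mid + 1, high);
    merge(S, low, mid, high);
}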

19 Quick Sort. Sort by dividing the array into two partitions around a pivot item (ex: the first item is used as the pivot). quicksort(index low, index high) { index pivot; /* index of the pivot */ if (high > low) { partition(low, high, pivot); quicksort(low, pivot - 1); quicksort(pivot + 1, high); } }
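The partition routine itself is not shown on the slide; this C++ sketch uses the textbook-style scheme with the first item as the pivot, but the concrete variable names are mine.

#include <utility>
#include <vector>

// Partition around S[low]: afterwards the pivot is in its final position, with
// smaller items to its left and the remaining items to its right.
int partition(std::vector<int>& S, int low, int high) {
    int pivotitem = S[low];
    int j = low;                                   // last position holding an item < pivot
    for (int i = low + 1; i <= high; ++i)
        if (S[i] < pivotitem) std::swap(S[++j], S[i]);
    std::swap(S[low], S[j]);                       // place the pivot between the partitions
    return j;                                      // index of the pivot
}

void quicksort(std::vector<int>& S, int low, int high) {
    if (high > low) {
        int pivot = partition(S, low, high);
        quicksort(S, low, pivot - 1);
        quicksort(S, pivot + 1, high);
    }
}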

20 Homework. Given (n = 8): (1) Mergesort as in Fig. 2.2, p. 54; (2) Quicksort as in Fig. 2.3, p. 61; (3) Partition as in Table 2.2, p. 62. Due in 1 week

21 Worst-case complexity: Quick sort

22 Dynamic Programming (Bottom-up) Dynamic programming 1.Establish a recursive property 2.Solve in bottom-up fashion by solving smaller instances first (ex): Fibonacci (Iterative) Divide-and-conquer – Divide a problem into smaller instances – Solve these smaller instances (blindly) – Examples: Fibonacci (Recursive): Instances are related Merge sort: Instances are unrelated

23 Binomial Coefficient. C(n, k) = n! / (k! (n-k)!). Frequently, n! is too large to compute directly. Recursive property: C(n, k) = C(n-1, k-1) + C(n-1, k) for 0 < k < n, and C(n, k) = 1 when k = 0 or k = n. Proof: a size-k subset of n items either contains item n, which can happen in C(n-1, k-1) ways, or does not, which can happen in C(n-1, k) ways.

24 Binomial Coefficients: Divide-and-Conquer. Algorithm /* Inefficient */ int bin(int n, int k) { if (k == 0 || n == k) return 1; else return bin(n - 1, k - 1) + bin(n - 1, k); }

25 Binomial Coefficients. Figure 3.1: the array B used to compute the binomial coefficient. Complexity: O(nk)

26 Example 3.1: Compute C(4, 2) = B[4][2]. Compute row 0: {This is done only to mimic the algorithm exactly; the value B[0][0] is not needed in a later computation.} B[0][0] = 1. Compute row 1: B[1][0] = 1, B[1][1] = 1. Compute row 2: B[2][0] = 1, B[2][1] = B[1][0] + B[1][1] = 1 + 1 = 2, B[2][2] = 1. Compute row 3: B[3][0] = 1, B[3][1] = B[2][0] + B[2][1] = 1 + 2 = 3, B[3][2] = B[2][1] + B[2][2] = 2 + 1 = 3. Compute row 4: B[4][0] = 1, B[4][1] = B[3][0] + B[3][1] = 1 + 3 = 4, B[4][2] = B[3][1] + B[3][2] = 3 + 3 = 6.

27 Binomial Coefficient: Dynamic Programming. 1. Establish a recursive property. 2. Solve in bottom-up fashion. Algorithm: int bin2(int n, int k) { index i, j; int B[0..n][0..k]; for (i = 0; i <= n; i++) for (j = 0; j <= minimum(i, k); j++) if (j == 0 || j == i) B[i][j] = 1; else B[i][j] = B[i - 1][j - 1] + B[i - 1][j]; return B[n][k]; }

28 HOMEWORK Use dynamic programming approach to compute B[5][3]. Draw diagram like figure 3.1 (Page 94) Due in 1 week

29 Binary Search Tree. Definition: for a given node n, 1. each node contains one key; 2. key(node in the left subtree of n) <= key(n); 3. key(n) <= key(node in the right subtree of n). Optimality depends on the search probabilities of the keys.

30 Binary Search Tree Depth(n): # edges in the unique path from the root to n. (Depth=level) Search time = depth(key) + 1 The root has a depth of 0.

31 Binary Search Algorithm. struct nodetype { keytype key; nodetype* left; nodetype* right; }; typedef nodetype* node_pointer; void search(node_pointer tree, keytype keyin, node_pointer& p) { bool found = false; p = tree; while (!found) if (p->key == keyin) found = true; else if (keyin < p->key) p = p->left; else p = p->right; }

32 Example

33 Greedy Approach Start with an empty set and add items to the set until the set represents a solution. Each iteration consists of the following components: A selection procedure A feasibility check A solution check

34 Spanning Tree A connected subgraph that contains all the vertices and is a tree.

35 Graph. G = (V, E), where V is a finite set of vertices and E is a set of edges (pairs of vertices in V). (ex) V = {v1, v2, v3, v4, v5}, E = {(v1, v2), (v1, v3), (v2, v3), (v2, v4), (v3, v4), (v3, v5), (v4, v5)}

36 Weight of a Graph

37 Prim's Algorithm. Figure 4.4: A weighted graph (in the upper-left corner) and the steps in Prim's algorithm for that graph. The vertices in Y and the edges in F are shaded at each step.

38 Prim's Algorithm. F = ∅; for (i = 2; i <= n; i++) { // Initialize nearest[i] = 1; // v1 is the nearest tree vertex distance[i] = W[1][i]; // distance is the weight of that edge } repeat (n - 1 times) { // Add all n - 1 vertices min = ∞; for (i = 2; i <= n; i++) if (0 <= distance[i] && distance[i] < min) { min = distance[i]; vnear = i; } e = edge connecting vnear and nearest[vnear]; F = F ∪ {e}; // add e to F distance[vnear] = -1; for (i = 2; i <= n; i++) // update distances if (W[i][vnear] < distance[i]) { distance[i] = W[i][vnear]; nearest[i] = vnear; } }
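A runnable C++ sketch of the same O(n²) scheme on an adjacency matrix; using INT_MAX for "no edge", assuming a connected graph, and starting from vertex 0 are my own choices.

#include <limits>
#include <utility>
#include <vector>

// Prim's O(n^2) minimum spanning tree on an n x n adjacency matrix W,
// where W[i][j] = INT_MAX means "no edge". Assumes the graph is connected.
// Returns the n-1 tree edges as (vertex, nearest) pairs; vertex 0 is the start.
std::vector<std::pair<int,int>> prim(const std::vector<std::vector<int>>& W) {
    const int INF = std::numeric_limits<int>::max();
    int n = static_cast<int>(W.size());
    std::vector<int> nearest(n, 0);        // nearest[i]: tree vertex closest to i
    std::vector<int> distance(n);          // distance[i]: weight of that edge; -1 = in tree
    for (int i = 1; i < n; ++i) distance[i] = W[0][i];
    std::vector<std::pair<int,int>> F;
    for (int step = 0; step < n - 1; ++step) {
        int min = INF, vnear = 1;
        for (int i = 1; i < n; ++i)
            if (distance[i] >= 0 && distance[i] < min) { min = distance[i]; vnear = i; }
        F.push_back({vnear, nearest[vnear]});     // add the cheapest crossing edge
        distance[vnear] = -1;                     // vnear joins the tree
        for (int i = 1; i < n; ++i)
            if (W[i][vnear] < distance[i]) {      // a cheaper edge into the tree?
                distance[i] = W[i][vnear];
                nearest[i] = vnear;
            }
    }
    return F;
}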

39-43 Prim's Algorithm trace (figure): successive values of distance[i] and nearest[i] for vertices 2, 3, 4, 5 at each step; nearest starts as [1, 1, 1, 1] and ends as [1, 1, 3, 3].

44 Prim's Spanning Tree. Complexity: O(n²). There are (n - 1) iterations of the repeat loop and (n - 1) iterations in each of the two inner for loops, so T(n) = 2(n - 1)(n - 1) ∈ O(n²). Theorem: Prim's algorithm always produces a minimum spanning tree.

45 Dijkstra's Shortest Paths. Figure 4.8: A weighted, directed graph (in the upper-left corner) and the steps in Dijkstra's algorithm for that graph. The vertices in Y and the edges in F are shaded in color at each step.

46 Dijkstra's Algorithm. F = ∅; for (i = 2; i <= n; i++) { // Initialize touch[i] = 1; // paths start from v1 length[i] = W[1][i]; } repeat (n - 1 times) { min = ∞; for (i = 2; i <= n; i++) if (0 <= length[i] && length[i] < min) { min = length[i]; vnear = i; } e = edge from touch[vnear] to vnear; F = F ∪ {e}; // add e to F for (i = 2; i <= n; i++) if (length[vnear] + W[vnear][i] < length[i]) { length[i] = length[vnear] + W[vnear][i]; touch[i] = vnear; } length[vnear] = -1; }
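A runnable C++ sketch of the same O(n²) scheme; using a done[] array and a negative matrix entry for "no edge" instead of the slide's length[vnear] = -1 marker is my adaptation.

#include <limits>
#include <vector>

// Dijkstra's shortest paths from vertex 0 on an n x n adjacency matrix W,
// where a negative entry means "no edge" and edge weights are >= 0.
// Returns length[i] = shortest distance from 0 to i (INF if unreachable).
std::vector<long long> dijkstra(const std::vector<std::vector<int>>& W) {
    const long long INF = std::numeric_limits<long long>::max();
    int n = static_cast<int>(W.size());
    std::vector<long long> length(n, INF);
    std::vector<bool> done(n, false);
    length[0] = 0;
    for (int step = 0; step < n; ++step) {
        long long min = INF;
        int vnear = -1;
        for (int i = 0; i < n; ++i)                  // pick the closest unfinished vertex
            if (!done[i] && length[i] < min) { min = length[i]; vnear = i; }
        if (vnear == -1) break;                      // the rest are unreachable
        done[vnear] = true;
        for (int i = 0; i < n; ++i)                  // relax edges leaving vnear
            if (!done[i] && W[vnear][i] >= 0 &&
                length[vnear] + W[vnear][i] < length[i])
                length[i] = length[vnear] + W[vnear][i];
    }
    return length;
}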

47 Complexity. Prim's and Dijkstra's: O(n²). Heap implementation: O(m log n). Fibonacci heap implementation: O(m + n log n). 1. Find a minimum spanning tree for the following graph. 2. Find the shortest paths from v4 to all the other vertices.

48 Scheduling. Minimizing the total time (waiting + service). (ex) Three jobs: t1 = 5, t2 = 10, t3 = 4. Schedule / total time in the system: [1, 2, 3]: 5 + (5+10) + (5+10+4) = 39; [1, 3, 2]: 5 + (5+4) + (5+4+10) = 33; [2, 1, 3]: 10 + (10+5) + (10+5+4) = 44; [3, 1, 2]: 4 + (4+5) + (4+5+10) = 32; … (3! cases in all)

49 Optimal Scheduling for Total Time. Smallest service time first: sort the jobs in nondecreasing order of service time and schedule them in sorted order. Complexity (sorting): W(n) ∈ O(n log n)
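A small C++ sketch of this greedy rule, computing the total time in the system for the smallest-service-time-first schedule (the function name is mine).

#include <algorithm>
#include <vector>

// Serve jobs in nondecreasing order of service time and return the total time
// in the system, i.e. the sum of all completion times.
long long schedule_total_time(std::vector<int> t) {
    std::sort(t.begin(), t.end());        // smallest service time first
    long long total = 0, elapsed = 0;
    for (int service : t) {
        elapsed += service;               // completion time of this job
        total += elapsed;                 // each job contributes its completion time
    }
    return total;
}
// schedule_total_time({5, 10, 4}) == 32, matching schedule [3, 1, 2] on the previous slide.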

50 Schedule with Deadlines. Schedule to maximize the total profit; each job takes one unit of time to finish. (ex) Jobs with given deadlines and profits (Job / Deadline / Profit). [1, 3]: TP = 30 + 25 = 55; [2, 1]: TP = 35 + 30 = 65; …; [4, 1]: TP = 40 + 30 = 70 (optimal); … Is highest profit first optimal?

51 Schedule with Deadlines. Both profit and deadline should be considered. Problem: maximize the total profit. Input: n jobs with deadline[1..n], sorted by profit in nonincreasing order. Output: an optimal sequence J for the jobs. Algorithm: Schedule (O(n²)) { J = [1]; for (i = 2; i <= n; i++) { K = J with i added according to nondecreasing values of deadline[i]; if (K is feasible) J = K; } }
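A C++ sketch of this greedy loop together with a simple feasibility check (a sequence is feasible when every job's position does not exceed its deadline). Jobs are assumed to be numbered 0..n-1 in nonincreasing order of profit; all names are mine.

#include <vector>

// A sequence is feasible if the k-th scheduled job (1-based position) meets its deadline.
static bool feasible(const std::vector<int>& seq, const std::vector<int>& deadline) {
    for (int pos = 0; pos < static_cast<int>(seq.size()); ++pos)
        if (pos + 1 > deadline[seq[pos]]) return false;
    return true;
}

// Greedy scheduling with deadlines; returns the chosen job sequence J.
std::vector<int> schedule(const std::vector<int>& deadline) {
    std::vector<int> J;
    for (int i = 0; i < static_cast<int>(deadline.size()); ++i) {
        std::vector<int> K = J;          // tentatively add job i, keeping K ordered
        int pos = 0;                     // by nondecreasing deadline
        while (pos < static_cast<int>(K.size()) && deadline[K[pos]] <= deadline[i]) ++pos;
        K.insert(K.begin() + pos, i);
        if (feasible(K, deadline)) J = K;
    }
    return J;
}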

52 Homework: scheduling. Suppose we have the jobs in Example 4.4; recall that they had the deadlines, profits, and service times shown in the tables (Job / Deadline / Profit and Job / Service time). 1. Schedule to minimize the total time. 2. Schedule with deadlines for maximum profit. Algorithm 4.4 does the following: 1. J is set to [1]. 2. K is set to [2, 1] and is determined to be feasible; J is set to [2, 1] because K is feasible. 3. K is set to [2, 3, 1] and is rejected because it is not feasible. 4. K is set to [2, 1, 4] and is determined to be feasible; J is set to [2, 1, 4] because K is feasible. 5. K is set to [2, 5, 1, 4] and is rejected because it is not feasible. 6. K is set to [2, 1, 6, 4] and is rejected because it is not feasible. 7. K is set to [2, 7, 1, 4] and is rejected because it is not feasible. The final value of J is [2, 1, 4].

53 Huffman Code. A variable-length binary code for data compression. Prefix code: no codeword constitutes the beginning of another codeword. (ex) If 01 is the code for a, then no other codeword (say, the one for b) can begin with 01.

54 Prefix Code. Figure 4.10: The binary character code for Code C2 in Example 4.7 appears in (a), while the one for Code C3 (Huffman) appears in (b).

55 Variable Length (Prefix) Code Bits(C1)=16(3)+5(3)+12(3)+17(3)+10(3)+25(3)=255 Bits(C2)=16(2)+5(5)+12(4)+17(3)+10(5)+25(1)=231 Bits(C3)=16(2)+5(4)+12(3)+17(2)+10(4)+25(2)=212

56 Huffman Code (Optimal). Figure 4.11: Given the file whose frequencies are shown in Table 4.1, this shows the state of the subtrees, constructed by Huffman's algorithm, after each pass through the for-i loop. The first tree is the state before the loop is entered.

57 Huffman Algorithm. Priority queue: the highest-priority (lowest-frequency) element is removed first. Homework. for (i = 1; i <= n - 1; i++) { remove(PQ, p); remove(PQ, q); r = new nodetype; r->left = p; r->right = q; r->frequency = p->frequency + q->frequency; insert(PQ, r); } remove(PQ, r); return r; Priority queue (heap): initialization O(n), each heap operation O(log n), Huffman algorithm complexity O(n log n).
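A runnable C++ sketch of the same loop using std::priority_queue as the heap; the Node struct, the comparator, and the assumption of at least one symbol are mine.

#include <queue>
#include <vector>

// One node of the Huffman tree (leaves and merged subtrees alike).
struct Node {
    long long frequency;
    Node* left;
    Node* right;
};

// Build a Huffman tree from the leaf frequencies and return its root.
// The comparator turns std::priority_queue into a min-heap on frequency.
Node* huffman(const std::vector<long long>& freqs) {
    auto cmp = [](const Node* a, const Node* b) { return a->frequency > b->frequency; };
    std::priority_queue<Node*, std::vector<Node*>, decltype(cmp)> pq(cmp);
    for (long long f : freqs) pq.push(new Node{f, nullptr, nullptr});
    while (pq.size() > 1) {
        Node* p = pq.top(); pq.pop();      // the two lowest-frequency subtrees
        Node* q = pq.top(); pq.pop();
        pq.push(new Node{p->frequency + q->frequency, p, q});   // merge them
    }
    return pq.top();                       // root of the Huffman tree
}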

58 Knapsack Problem. Let S = {item1, item2, …, itemn}, wi = weight of item i, pi = profit of item i, W = maximum weight the knapsack can hold. Determine a subset A of S that maximizes the total profit Σ(i in A) pi subject to the weight constraint Σ(i in A) wi <= W. (ex) item1: $50, 5 kg ($50/5 = 10); item2: $60, 10 kg ($60/10 = 6); item3: $140, 20 kg ($140/20 = 7); W = 30 kg

59 Example :0-1 Knapsack Figure 4.13: A greedy solution and an optimal solution to the 0-1 Knapsack problem.

60 Dynamic Programming: 0-1 Knapsack. For i > 0 and w > 0, let P[i][w] be the optimal profit obtained when choosing items only from the first i items under the restriction that the total weight cannot exceed w. Max profit = P[n][W]. P[n][W] can be computed from the 2D array P with rows 0 to n and columns 0 to W, using P[0][w] = 0, P[i][0] = 0, and the recursive property P[i][w] = max(P[i-1][w], pi + P[i-1][w - wi]) if wi <= w, and P[i][w] = P[i-1][w] otherwise.
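A minimal C++ sketch of the full-table version of this recurrence; it fills all (n+1) × (W+1) entries, so it runs in O(nW) time. The function signature and variable names are assumptions of mine.

#include <algorithm>
#include <vector>

// 0-1 knapsack by dynamic programming: P[i][w] is the best profit obtainable
// from the first i items with weight limit w. Returns P[n][W].
int knapsack(const std::vector<int>& weight, const std::vector<int>& profit, int W) {
    int n = static_cast<int>(weight.size());
    std::vector<std::vector<int>> P(n + 1, std::vector<int>(W + 1, 0));
    for (int i = 1; i <= n; ++i)
        for (int w = 1; w <= W; ++w) {
            P[i][w] = P[i - 1][w];                       // skip item i
            if (weight[i - 1] <= w)                      // or take item i if it fits
                P[i][w] = std::max(P[i][w],
                                   profit[i - 1] + P[i - 1][w - weight[i - 1]]);
        }
    return P[n][W];
}
// knapsack({5, 10, 20}, {50, 60, 140}, 30) == 200, matching the worked example below.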

61 Example: Dynamic Programming (Knapsack). (ex) item1: $50, 5 kg ($50/5 = 10); item2: $60, 10 kg ($60/10 = 6); item3: $140, 20 kg ($140/20 = 7); W = 30 kg. Only the entries needed for P[3][30] are computed: P[3][30] depends on P[2][30] and P[2][10] (since w3 = 20); P[2][30] depends on P[1][30] and P[1][20], and P[2][10] depends on P[1][10] and P[1][0] (since w2 = 10); the bottom row is P[1][30] = $50, P[1][20] = $50, P[1][10] = $50, P[1][0] = $0.

62 Example: Dynamic Programming (values). With the same items, P[1][30] = P[1][20] = P[1][10] = $50 and P[1][0] = $0; P[2][30] = $110 and P[2][10] = $60 (p2 = $60); finally P[3][30] = max(P[2][30], p3 + P[2][10]) = max($110, $140 + $60) = $200.

63 Complexity: Dynamic Programming (Knapsack). The (n-i)th row: 2^i entries are computed. Total number of entries = 1 + 2 + … + 2^(n-1) = 2^n - 1. Complexity: O(2^n)

64 Backtracking Path finding in a maze If dead end, pursue another path If a sign were positioned near the beginning of the path, the time saving could be enormous Backtracking After determining that a node can lead to nothing but dead ends, we go back (backtrack) to the parent node and proceed with the search on the next child. Pruning the nonpromising subtree

65 4 Queens Problem Figure 5.5: The actual chessboard positions that are tried when backtracking is used to solve the instance of the n-Queens problem in which n = 4. Each nonpromising position is marked with a cross.

66 n-Queens Problem. void queens(index i) { index j; if (promising(i)) if (i == n) cout << col[1] through col[n]; else for (j = 1; j <= n; j++) { // See if the queen in the (i+1)st row col[i + 1] = j; // can be positioned in each of queens(i + 1); // the n columns. } } bool promising(index i) { index k; bool switch; k = 1; switch = true; // Check if any queen threatens the while (k < i && switch) { // queen in the ith row. if (col[i] == col[k] || abs(col[i] - col[k]) == i - k) switch = false; k++; } return switch; }
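A self-contained C++ version of this backtracking search (1-based rows and columns as on the slide; the main routine and the vector for col[] are my additions).

#include <cstdlib>    // std::abs
#include <iostream>
#include <vector>

// col[r] = column of the queen in row r (rows 1..n; col[0] is unused).
static bool promising(const std::vector<int>& col, int i) {
    for (int k = 1; k < i; ++k)
        if (col[i] == col[k] || std::abs(col[i] - col[k]) == i - k)
            return false;              // same column or same diagonal as an earlier queen
    return true;
}

static void queens(std::vector<int>& col, int i, int n) {
    if (!promising(col, i)) return;    // prune this subtree
    if (i == n) {                      // all n queens placed: print one solution
        for (int r = 1; r <= n; ++r) std::cout << col[r] << ' ';
        std::cout << '\n';
        return;
    }
    for (int j = 1; j <= n; ++j) {     // try every column for the queen in row i+1
        col[i + 1] = j;
        queens(col, i + 1, n);
    }
}

int main() {
    int n = 4;
    std::vector<int> col(n + 1, 0);
    queens(col, 0, n);                 // prints the two solutions: 2 4 1 3 and 3 1 4 2
}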

67 Branch-and-Bound. Dynamic programming, backtracking, and branch-and-bound all have exponential-time complexity in the worst case. The branch-and-bound algorithm is an improvement on the backtracking algorithm: there is no limit on the way of traversing the tree (best-first or breadth-first), it is used only for optimization problems, and a bound determines whether a node is promising.

68 Branch-and-Bound. A node is nonpromising if its (upper) bound is less than or equal to maxprofit (the value of the best solution found up to that point). (ex) 0-1 Knapsack: each node records the weight and profit sums of the items taken up to that node. Promising? The bound must be computed to decide. Items are sorted by pi/wi.

69 Branch-and-Bound: 0-1 Knapsack. Promising? If a node is at level i, and the node at level k is the one whose weight would bring the total weight above W, then totweight = weight + Σ(j = i+1 .. k-1) wj and bound = (profit + Σ(j = i+1 .. k-1) pj) + (W - totweight) · pk/wk. A node is nonpromising if bound <= maxprofit or weight >= W.
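A small C++ sketch of just this bound computation; items are assumed sorted by nonincreasing p/w, level is the index of the last item already decided, and weight/profit are the sums so far (all names are mine).

#include <vector>

// Upper bound on the profit reachable from a node of the 0-1 knapsack state-space tree.
double bound(int level, int weight, int profit,
             const std::vector<int>& w, const std::vector<int>& p, int W) {
    if (weight >= W) return 0.0;                 // overweight nodes are nonpromising
    double b = profit;
    int totweight = weight;
    int k = level + 1;
    int n = static_cast<int>(w.size());
    while (k < n && totweight + w[k] <= W) {     // greedily take whole items...
        totweight += w[k];
        b += p[k];
        ++k;
    }
    if (k < n)                                   // ...then a fraction of the next item
        b += static_cast<double>(W - totweight) * p[k] / w[k];
    return b;
}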

70 B&B Example: 0-1 Knapsack problem (W = 16), items ordered according to pi/wi: i = 1: pi = $40, wi = 2, pi/wi = $20; i = 2: pi = $30, wi = 5, pi/wi = $6; i = 3: pi = $50, wi = 10, pi/wi = $5; i = 4: pi = $10, wi = 5, pi/wi = $2

71 B&B Knapsack: Breadth-First (W = 16). Figure 6.2: The pruned state space tree produced using breadth-first search with branch-and-bound pruning in Example 6.1. Stored at each node from top to bottom are the total profit of the items stolen up to that node, their total weight, and the bound on the total profit that could be obtained by expanding beyond the node. The node shaded in color is the one at which an optimal solution is found.

72 B&B Knapsack: Best-First. Figure 6.3: The pruned state space tree produced using best-first search with branch-and-bound pruning in Example 6.2. Stored at each node from top to bottom are the total profit of the items stolen up to the node, their total weight, and the bound on the total profit that could be obtained by expanding beyond the node. The node shaded in color is the one at which an optimal solution is found.

73 Problem Solving Approaches. Behavioral approach: the relationship between a stimulus (input) and a response (output), without speculating about the intervening process. Information processing approach: based on the process that intervenes between input and output and leads to a desired goal from an initial state; thinking to achieve a desired goal. Rubinstein & Firstenberg, Patterns of Problem Solving, Prentice Hall, 1995

74 Model of Memory. Sensory register → short-term memory → long-term memory, with forgetting along the way. Sensory register: important information passes to higher-order systems; the rest quickly fades. Short-term memory (working memory): limited capacity (a bottleneck) of about n±2 unrelated items (an n-digit phone number). Long-term memory: a network of interconnecting ideas, concepts, and facts.

75 Short-Term Memory (STM). Limited working memory: 3×4 is easy, but 5+3×144 is hard, since STM cannot retain all the subcalculations. You cannot remember a long sentence. Information in STM is replaced with competing information; it is either transferred to LTM or lost.

76 Long-Term Memory (LTM). A network of interconnecting ideas. Learning new information means integrating that information within the structure: the richer the cognitive structure already set up in LTM, the easier it is to learn new information, so a familiar topic is easy. Multiple relationships among the stored pieces of information support creative thinking, and this richness and complexity leads to the easiest types of retrieval from memory. LTM cannot fill up.

77 Forgetting. Two theories of forgetting: changes during storage cause the information to decay, or retrieval of the information fails. Effective forgetting is needed to update memories: we need to know where we parked the car today (not yesterday). Forgetting on demand is difficult or impossible, as when a friend tells you a secret and adds "forget I ever said anything."

78 Heap & Priority Queue. Priority queue: the highest-priority element is always removed first. A PQ can be implemented as a linked list, but more efficiently as a heap. Heap: an essentially complete binary tree such that the values come from an ordered set and the heap property is satisfied: value(parent node) >= value(child node)

79 A Heap Essentially complete binary tree (of depth d) Complete binary tree down to a depth of d-1 Nodes with depth d are as far to the left as possible Heap property: Value(parent) >= Value(child) A heap

80 siftdown. Input: a tree with the heap property at every node except (possibly) the root. Output: a heap. void siftdown(heap& H) { // H starts out having the heap property for all nodes except the root; H ends up a heap. node parent, largerchild; parent = root of H; largerchild = parent's child containing the larger key; while (key at parent is smaller than key at largerchild) { exchange the keys at parent and largerchild; parent = largerchild; largerchild = parent's child containing the larger key; } }

81 siftdown. The siftdown procedure sifts 6 down until the heap property is restored.

82 Remove Root. Remove the key at the root and restore the heap property. keytype root(heap& H) { keytype keyout; keyout = key at the root; move the key at the bottom node (the far-right leaf) to the root; delete the bottom node; siftdown(H); // restore the heap property return keyout; } Given a heap of n keys, place the keys in a sorted array S: O(n log n). void removekeys(int n, heap H, keytype S[]) { index i; for (i = n; i >= 1; i--) S[i] = root(H); }

83 Make Heap Transform all subtrees whose roots have depth d-i into heaps for i=1,2,…,d. Complexity O(n) void makeheap (int n, heap& H) // H ends-up a heap. { index i; heap Hsub; // Hsub ends up a heap. for (i = d - 1; i >= 0; i--) // Tree has depth d. for (all subtrees Hsub whose roots have depth i) siftdown (Hsub); }

84 Make Heap Using siftdown to make a heap from an essentially complete binary tree. After the steps shown, the right subtree, whose root has depth d-2, must be made into a heap, and finally the entire tree must be made into a heap.

85 Make Heap (continued). Next the subtrees whose roots have depth d-2 are made into heaps, then those with depth d-3, and so on.

86 Heapsort void heapsort (int n, heap H, // H ends up a heap. keytype S[]) { makeheap (n, H); removekeys (n, H, S); } A heap The array representation of the heap.
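Putting the pieces together, a runnable C++ heapsort over an array-based max-heap (0-based indexing, with the children of node i at 2i+1 and 2i+2; the array representation is the textbook's, but the concrete code is my sketch).

#include <utility>
#include <vector>

// Sift the key at 'parent' down until the heap property holds in H[0..size-1].
static void siftdown(std::vector<int>& H, int parent, int size) {
    while (true) {
        int left = 2 * parent + 1, right = 2 * parent + 2, larger = parent;
        if (left  < size && H[left]  > H[larger]) larger = left;
        if (right < size && H[right] > H[larger]) larger = right;
        if (larger == parent) break;          // heap property restored
        std::swap(H[parent], H[larger]);
        parent = larger;
    }
}

// Heapsort: build the heap bottom-up (makeheap), then repeatedly swap the root
// (the maximum) to the end of the unsorted region and sift down (removekeys).
void heapsort(std::vector<int>& S) {
    int n = static_cast<int>(S.size());
    for (int i = n / 2 - 1; i >= 0; --i)      // makeheap: O(n)
        siftdown(S, i, n);
    for (int size = n; size > 1; --size) {    // n - 1 root removals: O(n log n)
        std::swap(S[0], S[size - 1]);
        siftdown(S, 0, size - 1);
    }
}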

