Slide 1: ICS 252 Introduction to Computer Design, Winter 2005
Eli Bozorgzadeh
Computer Science Department, UCI

Slide 2: References and Copyright
Textbooks referred:
– [Mic94] G. De Micheli, "Synthesis and Optimization of Digital Circuits," McGraw-Hill, 1994.
– [CLR90] T. H. Cormen, C. E. Leiserson, R. L. Rivest, "Introduction to Algorithms," MIT Press, 1990.
– [Sar96] M. Sarrafzadeh, C. K. Wong, "An Introduction to VLSI Physical Design," McGraw-Hill, 1996.
– [She99] N. Sherwani, "Algorithms For VLSI Physical Design Automation," Kluwer Academic Publishers, 3rd edition, 1999.

Slide 3: References and Copyright (cont.)
Slides all from: http://mountains.ece.umn.edu/~kia/Courses/EE5301/02-Alg/EE5301-Alg.ppt
Slide references:
– [©Sherwani] © Naveed A. Sherwani, 1992 (companion slides to [She99])
– [©Keutzer] © Kurt Keutzer, Dept. of EECS, UC-Berkeley, http://www-cad.eecs.berkeley.edu/~niraj/ee244/index.htm
– [©Gupta] © Rajesh Gupta, UC-San Diego, http://www.ics.uci.edu/~rgupta/ics280.html

Slide 4: Reading Assignment
Textbook:
– Chapter 2, Sections 2.1 through 2.5
Lecture note:
– Lecture note 2

Slide 5: Combinatorial Optimization
Problems with discrete variables
– Examples:
  SORTING: given N integer numbers, write them in increasing order.
  KNAPSACK: given a bag of size S, and items {(s1, w1), (s2, w2), ..., (sn, wn)}, where si and wi are the size and value of item i respectively, find a subset of items with maximum overall value that fits in the bag.
  More examples: http://www.research.compaq.com/SRC/JCAT/
– A problem vs. a problem instance
Problem complexity:
– Measures of complexity
– Complexity cases: average case, worst case
– Tractability: solvable in polynomial time
[©Gupta]

Slide 6: Algorithm
An algorithm defines a procedure for solving a computational problem.
– Examples:
  Quick sort, bubble sort, insertion sort, heap sort
  Dynamic programming method for the knapsack problem (see the sketch below)
Definition of complexity:
– Run time on deterministic, sequential machines
– Based on resources needed to implement the algorithm
  Needs a cost model: memory, hardware/gates, communication bandwidth, etc.
  Example: RAM model with a single processor, where running time is proportional to the number of operations
[©Gupta]
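As a concrete illustration of the dynamic programming method mentioned above, here is a minimal 0/1 knapsack sketch in C. The item sizes, values, and bag capacity are made-up example data, not taken from the slides.

    #include <stdio.h>

    #define N 4    /* number of items (example data) */
    #define S 10   /* bag capacity (example data)    */

    static int max(int a, int b) { return a > b ? a : b; }

    int main(void) {
        int size[N]  = {5, 4, 6, 3};     /* s_i: hypothetical item sizes  */
        int value[N] = {10, 40, 30, 50}; /* w_i: hypothetical item values */

        /* best[c] = max value achievable with capacity c using the items seen so far */
        int best[S + 1] = {0};

        for (int i = 0; i < N; i++)
            for (int c = S; c >= size[i]; c--)  /* scan capacity downward so each item is used at most once */
                best[c] = max(best[c], best[c - size[i]] + value[i]);

        printf("max value that fits in the bag: %d\n", best[S]);
        return 0;
    }

The table has S + 1 entries and is filled in O(N * S) time, the usual pseudo-polynomial bound for this problem.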

Slide 7: Algorithm (cont.)
Definition of complexity (cont.)
– Example: Bubble Sort

    for (j = 1; j < N; j++) {
        for (i = 0; i < N - j; i++) {
            if (a[i] > a[i+1]) {      /* swap adjacent out-of-order elements */
                hold   = a[i];
                a[i]   = a[i+1];
                a[i+1] = hold;
            }
        }
    }

– Scalability with respect to input size is important
  How does the running time of an algorithm change when the input size doubles?
  It is a function of the input size (n). Examples: n^2 + 3n, 2^n, n log n, ...
  Generally, large input sizes are of interest (n > 1,000 or even n > 1,000,000)
  What if I use a better compiler? What if I run the algorithm on a machine that is 10x faster?
[©Gupta]

Slide 8: Function Growth Examples

Slide 9: Asymptotic Notions
Idea:
– A notion that ignores the "constants" and describes the "trend" of a function for large values of the input
Definition:
– Big-Oh notation: f(n) = O(g(n)) if constants K and n0 can be found such that ∀ n ≥ n0, f(n) ≤ K·g(n)
  g is called an "upper bound" for f (f is "of order" g: f will not grow larger than g by more than a constant factor)
  Examples: (1/3) n^2 = O(n^2) (also O(n^3))
            0.02 n^2 + 127 n + 1923 = O(n^2)
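A short worked check of the second example above (my own arithmetic, not from the slides), picking explicit constants that witness the bound:

    \[
    0.02\,n^2 + 127\,n + 1923 \;\le\; 0.02\,n^2 + 127\,n^2 + 1923\,n^2 \;=\; 2050.02\,n^2
    \quad \text{for all } n \ge 1,
    \]

so the definition is satisfied with K = 2051 and n0 = 1, and hence 0.02 n^2 + 127 n + 1923 = O(n^2).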

Slide 10: Asymptotic Notions (cont.)
Definition (cont.):
– Big-Omega notation: f(n) = Ω(g(n)) if constants K and n0 can be found such that ∀ n ≥ n0, f(n) ≥ K·g(n)
  g is called a "lower bound" for f
– Big-Theta notation: f(n) = Θ(g(n)) if g is both an upper and a lower bound for f
  Describes the growth of a function more accurately than O or Ω
  Examples: n^3 + 4n ≠ Θ(n^2)
            4n^2 + 1024 = Θ(n^2)

Slide 11: Asymptotic Notions (cont.)
How to find the order of a function?
– Not always easy, especially if you start from an algorithm
– Focus on the "dominant" term
  4n^3 + 100n^2 + log n ∈ O(n^3)
  n + n log(n) → n log(n)
– Growth ordering (K a constant): n! > K^n > n^K > log n > log log n > K; e.g., K^n > log n, n log n > n, n! > n^10
What do asymptotic notations mean in practice?
– If algorithm A has "time complexity" O(n^2) and algorithm B has time complexity O(n log n), then algorithm B is better
– If problem P has a lower bound of Ω(n log n), then there is NO WAY you can find an algorithm that solves the problem in O(n) time.

Slide 12: Problem Tractability
Problems are classified into "easier" and "harder" categories
– Class P: a polynomial-time algorithm is known for the problem (hence, it is a tractable problem)
– Class NP (non-deterministic polynomial time): no polynomial-time solution has been found yet (and one probably does not exist); an exact (optimal) solution can still be found using an algorithm with exponential time complexity
Unfortunately, most CAD problems are NP
– Be happy with a "reasonably good" solution
[©Gupta]

Slide 13: Algorithm Types
Based on quality of solution and computational effort:
– Deterministic
– Probabilistic or randomized
– Approximation
– Heuristics: local search
Problem vs. algorithm complexity
[©Gupta]

Slide 14: Deterministic Algorithm Types
Algorithms usually used for P problems:
– Exhaustive search! (aka exponential)
– Dynamic programming
– Divide & conquer (aka hierarchical)
– Greedy
– Mathematical programming
– Branch and bound
Algorithms usually used for NP problems (not seeking the "optimal solution", but a "good" one):
– Greedy (aka heuristic)
– Genetic algorithms
– Simulated annealing
– Restrict the problem to a special case that is in P

Slide 15: Graph Definition
Graph: a set of "objects" and their "connections"
Formal definition:
– G = (V, E), V = {v1, v2, ..., vn}, E = {e1, e2, ..., em}
– V: set of vertices (nodes), E: set of edges (links, arcs)
– Directed graph: ek = (vi, vj)
– Undirected graph: ek = {vi, vj}
– Weighted graph: w: E → R, w(ek) is the "weight" of ek
(Figure: a directed and an undirected example graph on vertices a, b, c, d, e)

Slide 16: Graph Representation: Adjacency List
(Figure: for each vertex a through e, a linked list of its neighbors; one set of lists for the undirected example graph and one for the directed version)
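A minimal adjacency-list sketch in C for illustration. The five-vertex graph hard-coded below is a made-up example, not necessarily the one drawn on the slide.

    #include <stdio.h>
    #include <stdlib.h>

    #define NUM_V 5                   /* vertices 0..4 stand in for a..e */

    struct adj_node {                 /* one linked-list node per neighbor */
        int v;                        /* neighbor vertex id                */
        struct adj_node *next;        /* next neighbor in the list         */
    };

    struct adj_node *adj[NUM_V];      /* adj[u] = head of u's neighbor list */

    /* prepend a directed edge u -> v; call twice for an undirected edge */
    void add_edge(int u, int v) {
        struct adj_node *n = malloc(sizeof *n);
        n->v = v;
        n->next = adj[u];
        adj[u] = n;
    }

    int main(void) {
        add_edge(0, 1); add_edge(1, 0);   /* a -- b (hypothetical edges) */
        add_edge(0, 2); add_edge(2, 0);   /* a -- c */
        add_edge(2, 3); add_edge(3, 2);   /* c -- d */
        add_edge(3, 4); add_edge(4, 3);   /* d -- e */

        for (int u = 0; u < NUM_V; u++) {
            printf("%c:", 'a' + u);
            for (struct adj_node *n = adj[u]; n; n = n->next)
                printf(" %c", 'a' + n->v);
            printf("\n");
        }
        return 0;
    }

Storage is O(|V| + |E|), versus O(|V|^2) for the adjacency matrix on the next slide; the matrix wins when the graph is dense or when constant-time edge lookup matters.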

Slide 17: Graph Representation: Adjacency Matrix
(Figure: the 5x5 0/1 adjacency matrices of the undirected and directed example graphs, with rows and columns indexed by the vertices a through e; entry (i, j) is 1 iff there is an edge from vertex i to vertex j)

Slide 18: Edge / Vertex Weights in Graphs
Edge weights:
– Usually represent the "cost" of an edge
– Examples:
  Distance between two cities
  Width of a data bus
– Representation:
  Adjacency matrix: instead of 0/1, keep the weight
  Adjacency list: keep the weight in the linked-list item
Node weights:
– Usually used to enforce some "capacity" constraint
– Examples:
  The size of gates in a circuit
  The delay of operations in a "data dependency graph"
(Figures: an edge-weighted example graph and a node-weighted data dependency graph)

Slide 19: Hypergraphs
Hypergraph definition:
– Similar to graphs, but an edge connects a set of vertices rather than a pair of vertices
– Directed / undirected versions possible
– Just like graphs, a node can be the source of (or be connected to) multiple hyperedges
Data structure? (one possibility is sketched below)
(Figure: a five-vertex hypergraph and its hyperedges)
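One possible answer to the "Data structure?" question, offered as a sketch rather than the course's intended solution: store each hyperedge as the list of vertices it connects (an incidence list), and derive the reverse vertex-to-hyperedge map when needed. The sizes and memberships below are made-up example data.

    #include <stdio.h>

    #define NUM_V    5   /* vertices 1..5, as in the slide's example */
    #define NUM_E    3   /* number of hyperedges (example value)     */
    #define MAX_PINS 4

    struct hyperedge {
        int num_pins;
        int pins[MAX_PINS];           /* member vertices of this hyperedge */
    };

    int main(void) {
        /* hypothetical hypergraph: each hyperedge lists its vertices */
        struct hyperedge e[NUM_E] = {
            { 3, {1, 2, 3} },
            { 2, {3, 4}    },
            { 3, {2, 4, 5} },
        };

        /* print, for each vertex, the hyperedges it belongs to
           (this reverse map is what a vertex-to-edge incidence list would store) */
        for (int v = 1; v <= NUM_V; v++) {
            printf("vertex %d is in hyperedges:", v);
            for (int j = 0; j < NUM_E; j++)
                for (int k = 0; k < e[j].num_pins; k++)
                    if (e[j].pins[k] == v)
                        printf(" %d", j);
            printf("\n");
        }
        return 0;
    }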

Slide 20: Graph Search Algorithms
Purpose: to visit all the nodes
Algorithms:
– Depth-first search
– Breadth-first search
– Topological
Examples [©Sherwani]
(Figure: an example graph on vertices A through G with its depth-first and breadth-first visit orders)

Slide 21: Depth-First Search Algorithm

struct vertex {
    ...
    int mark;
};

dfs(v)
    v.mark ← 1              // mark as visited
    print v
    for each (v, u) ∈ E
        if (u.mark != 1)    // not visited yet?
            dfs(u)

Algorithm DEPTH_FIRST_SEARCH(V, E)
    for each v ∈ V
        v.mark ← 0          // not visited yet
    for each v ∈ V
        if (v.mark == 0)
            dfs(v)
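For reference, a compact runnable C version of the same procedure, using a global adjacency matrix in place of the slide's abstract edge set; the graph below is a made-up example.

    #include <stdio.h>

    #define NUM_V 6

    /* hypothetical example graph as an adjacency matrix */
    int adj[NUM_V][NUM_V] = {
        { 0, 1, 1, 0, 0, 0 },
        { 1, 0, 0, 1, 0, 0 },
        { 1, 0, 0, 0, 1, 0 },
        { 0, 1, 0, 0, 0, 0 },
        { 0, 0, 1, 0, 0, 0 },
        { 0, 0, 0, 0, 0, 0 },   /* isolated vertex: reached only by the outer loop */
    };

    int mark[NUM_V];            /* 0 = not visited yet */

    void dfs(int v) {
        mark[v] = 1;
        printf("%d ", v);                 /* "print v" */
        for (int u = 0; u < NUM_V; u++)   /* for each edge (v, u) */
            if (adj[v][u] && mark[u] != 1)
                dfs(u);
    }

    int main(void) {                      /* DEPTH_FIRST_SEARCH(V, E) */
        for (int v = 0; v < NUM_V; v++)
            mark[v] = 0;
        for (int v = 0; v < NUM_V; v++)
            if (mark[v] == 0)
                dfs(v);
        printf("\n");
        return 0;
    }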

Slide 22: Breadth-First Search Algorithm

bfs(v, Q)
    v.marked ← 1
    for each (v, u) ∈ E
        if (u.mark != 1)    // not visited yet?
            Q ← Q + u

Algorithm BREADTH_FIRST_SEARCH(V, E)
    Q ← {v0}                // an arbitrary node
    while Q != {}
        v ← Q.pop()
        if (v.marked != 1)
            print v
            bfs(v)          // explore successors

There is something wrong with this code. What is it?

Slide 23: Distance in (Non-Weighted) Graphs
Distance dG(u,v):
– Length of a shortest u--v path in G
(Figure: an example pair with d(u,v) = 2, and a disconnected pair x, y with d(x,y) = ∞)

Slide 24: Moore's Breadth-First Search Algorithm
Objective:
– Find d(u,v) for a given pair (u,v) and a shortest path u--v
How does it work?
– Do BFS, and assign λ(w) the first time you visit a node; λ(w) = depth in BFS
Data structure:
– Q: a queue containing vertices to be visited
– λ(w): length of the shortest path u--w (initially ∞)
– π(w): parent node of w on u--w

Slide 25: Moore's Breadth-First Search Algorithm
Algorithm:
1. Initialize:
   λ(w) ← ∞ for w ≠ u
   λ(u) ← 0
   Q ← Q + u
2. If Q ≠ ∅, x ← pop(Q); else stop: "no path u--v"
3. ∀ (x,y) ∈ E, if λ(y) = ∞:
   π(y) ← x
   λ(y) ← λ(x) + 1
   Q ← Q + y
4. If λ(v) = ∞, return to step 2
5. Follow the π(w) pointers from v to u.
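A runnable C sketch of the procedure above (shortest path in an unweighted graph by BFS). The queue is a fixed-size array, and the graph is a made-up example, not the one traced on the following slides.

    #include <stdio.h>

    #define NUM_V 6
    #define INF   1000000

    /* hypothetical unweighted, undirected example graph */
    int adj[NUM_V][NUM_V] = {
        { 0, 1, 1, 0, 0, 0 },
        { 1, 0, 0, 1, 0, 0 },
        { 1, 0, 0, 1, 1, 0 },
        { 0, 1, 1, 0, 0, 1 },
        { 0, 0, 1, 0, 0, 1 },
        { 0, 0, 0, 1, 1, 0 },
    };

    int main(void) {
        int u = 0, v = 5;                 /* find d(u,v) and a shortest u--v path */
        int lambda[NUM_V], pi[NUM_V];
        int Q[NUM_V], head = 0, tail = 0;

        for (int w = 0; w < NUM_V; w++) { /* step 1: initialize */
            lambda[w] = INF;
            pi[w] = -1;
        }
        lambda[u] = 0;
        Q[tail++] = u;

        while (head < tail && lambda[v] == INF) {  /* steps 2 to 4 */
            int x = Q[head++];                     /* x <- pop(Q) */
            for (int y = 0; y < NUM_V; y++)
                if (adj[x][y] && lambda[y] == INF) {
                    pi[y] = x;
                    lambda[y] = lambda[x] + 1;
                    Q[tail++] = y;
                }
        }

        if (lambda[v] == INF) {
            printf("no path u--v\n");
        } else {
            printf("d(u,v) = %d, path from v back to u:", lambda[v]);
            for (int w = v; w != -1; w = pi[w])    /* step 5: follow pi pointers */
                printf(" %d", w);
            printf("\n");
        }
        return 0;
    }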

Slide 26: Moore's Breadth-First Search Algorithm (example)
(Figure: the example graph on vertices u, v1, ..., v10)
Q = u
Node:  u  v1 v2 v3 v4 v5 v6 v7 v8 v9 v10
λ:     0  ∞  ∞  ∞  ∞  ∞  ∞  ∞  ∞  ∞  ∞
π:     -  -  -  -  -  -  -  -  -  -  -

Slide 27: Moore's Breadth-First Search Algorithm (example, cont.)
Q = v1, v2, v5
Node:  u  v1 v2 v3 v4 v5 v6 v7 v8 v9 v10
λ:     0  1  1  ∞  ∞  1  ∞  ∞  ∞  ∞  ∞
π:     -  u  u  -  -  u  -  -  -  -  -

Slide 28: Moore's Breadth-First Search Algorithm (example, cont.)
Q = v4, v3, v6, v10
Node:  u  v1 v2 v3 v4 v5 v6 v7 v8 v9 v10
λ:     0  1  1  2  2  1  2  ∞  ∞  ∞  2
π:     -  u  u  v2 v1 u  v5 -  -  -  v5

Slide 29: Moore's Breadth-First Search Algorithm (example, cont.)
Q = v7
Node:  u  v1 v2 v3 v4 v5 v6 v7 v8 v9 v10
λ:     0  1  1  2  2  1  2  3  ∞  ∞  2
π:     -  u  u  v2 v1 u  v5 v4 -  -  v5

Slide 30: Notes on Moore's Algorithm
Why does the problem of the BREADTH_FIRST_SEARCH algorithm not exist here?
Time complexity?
Space complexity?

Slide 31: Distance in Weighted Graphs
Why doesn't a simple BFS work any more?
– Your locally best solution is not your globally best one
– Example: u connects to v1 with weight 3 and to v2 with weight 11, and v1 connects to v2 with weight 2
  First, v1 and v2 will be visited: λ(v1) = 3, π(v1) = u and λ(v2) = 11, π(v2) = u
  v2 should be revisited, since the path through v1 is shorter: λ(v2) = 5, π(v2) = v1

Slide 32: Dijkstra's Algorithm
Objective:
– Find d(u,v) for all pairs (u,v) (fixed u) and the corresponding shortest paths u--v
How does it work?
– Start from the source and augment the set of nodes whose shortest path has been found
– Decrease λ(w) from ∞ toward d(u,w) as you find shorter distances to w; π(w) is changed accordingly
Data structure:
– S: the set of nodes whose d(u,v) is found
– λ(w): current length of the shortest path u--w
– π(w): current parent node of w on u--w

Slide 33: Dijkstra's Algorithm
Algorithm:
1. Initialize:
   λ(v) ← ∞ for v ≠ u
   λ(u) ← 0
   S ← {u}
2. For each v ∈ S' (S' = V \ S) such that (ui, v) ∈ E, where ui is the node most recently added to S:
   if λ(v) > λ(ui) + w(ui, v), then
     λ(v) ← λ(ui) + w(ui, v)
     π(v) ← ui
3. Find m = min{ λ(v) | v ∈ S' } and the vertex vj with λ(vj) = m
   S ← S ∪ {vj}
4. If |S| < |V|, go to step 2
(Step 2 costs O(e) over the whole run; the overall algorithm is O(v^2).)
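A minimal runnable C sketch of the algorithm above, using a plain O(v^2) scan to find the minimum instead of a priority queue; the weighted graph below is made-up example data.

    #include <stdio.h>

    #define NUM_V 5
    #define INF   1000000

    /* hypothetical weighted undirected example graph; 0 means "no edge" */
    int w[NUM_V][NUM_V] = {
        {  0, 3, 11, 0, 0 },
        {  3, 0,  2, 5, 0 },
        { 11, 2,  0, 1, 7 },
        {  0, 5,  1, 0, 2 },
        {  0, 0,  7, 2, 0 },
    };

    int main(void) {
        int u = 0;
        int lambda[NUM_V], pi[NUM_V], in_S[NUM_V];

        for (int v = 0; v < NUM_V; v++) {       /* step 1 */
            lambda[v] = INF;
            pi[v] = -1;
            in_S[v] = 0;
        }
        lambda[u] = 0;

        for (int iter = 0; iter < NUM_V; iter++) {
            /* step 3 (done first here): pick the unfinished vertex with minimum lambda */
            int ui = -1;
            for (int v = 0; v < NUM_V; v++)
                if (!in_S[v] && (ui == -1 || lambda[v] < lambda[ui]))
                    ui = v;
            if (lambda[ui] == INF) break;       /* remaining vertices are unreachable */
            in_S[ui] = 1;                       /* S <- S + {ui} */

            /* step 2: relax edges (ui, v) for v not yet in S */
            for (int v = 0; v < NUM_V; v++)
                if (!in_S[v] && w[ui][v] && lambda[v] > lambda[ui] + w[ui][v]) {
                    lambda[v] = lambda[ui] + w[ui][v];
                    pi[v] = ui;
                }
        }

        for (int v = 0; v < NUM_V; v++)
            printf("d(u,%d) = %d, parent = %d\n", v, lambda[v], pi[v]);
        return 0;
    }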

Slide 34: Dijkstra's Algorithm - Why Does It Work?
Proof by contradiction:
– Suppose v is the first node being added to S such that λ(v) > d(u0, v) (d(u0, v) is the length of the "real" shortest u0--v path)
– The assumption that λ(v) and d(u0, v) are different means there are different paths with lengths λ(v) and d(u0, v)
– Consider the path that has length d(u0, v). Let x be the first node in S' on this path
– Then d(u0, v) ≥ λ(x) ≥ λ(v), a contradiction

Slide 35: Minimum Spanning Tree (MST)
Tree (usually undirected):
– Connected graph with no cycles
– |E| = |V| - 1
Spanning tree:
– Connected subgraph that covers all vertices
– If the original graph is not a tree, the graph has several spanning trees
Minimum spanning tree:
– Spanning tree with minimum sum of edge weights (among all spanning trees)
– Example: build a railway system to connect N cities, with the smallest total length of railroad

Slide 36: Difference Between MST and Shortest Path
Why can't Dijkstra solve the MST problem?
– Shortest path: minimum sum of edge weights to individual nodes
– MST: minimum sum of the TOTAL edge weights that connect all vertices
Proposal:
– Pick any vertex, run Dijkstra and note the paths to all nodes (prove no cycles are created)
Debunk: show a counterexample (a triangle on vertices a, b, c with edge weights 5, 4, 2; worked through below)
Write on your slides!
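A worked version of the counterexample, under one assumed assignment of the three weights (the slide's figure is not preserved, so the edge-to-weight mapping here is a guess): take w(a,b) = 5, w(a,c) = 4, w(b,c) = 2.

    \[
    \text{shortest-path tree from } a:\ \{(a,b),\,(a,c)\},\qquad \text{total weight } 5 + 4 = 9
    \]
    \[
    \text{MST}:\ \{(a,c),\,(b,c)\},\qquad \text{total weight } 4 + 2 = 6
    \]

The direct edge (a,b) of weight 5 is shorter than the path a-c-b of weight 4 + 2 = 6, so Dijkstra from a keeps (a,b) in its shortest-path tree, while the MST drops it in favor of (b,c); the two trees differ, so the proposal fails.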

Slide 37: Minimum Spanning Tree Algorithms
Basic idea:
– Start from a vertex (or edge), and expand the tree, avoiding loops (i.e., add a "safe" edge)
– Pick the minimum-weight edge at each step
Known algorithms:
– Prim: start from a vertex, expand the connected tree
– Kruskal: start with the minimum-weight edge, add minimum-weight edges while avoiding cycles (build a forest of small trees, and merge them)

Slide 38: Prim's Algorithm for MST
Data structure:
– S: set of nodes added to the tree so far
– S': set of nodes not added to the tree yet
– T: the edges of the MST built so far
– λ(w): current length of the shortest edge (v, w) that connects w to the current tree
– π(w): potential parent node of w in the final MST (the current parent that connects w to the current tree)

Slide 39: Prim's Algorithm
– Initialize S, S' and T:
  S ← {u0}, S' ← V \ {u0}   // u0 is any vertex
  T ← { }
  ∀ v ∈ S', λ(v) ← ∞
– Initialize λ and π for the vertices adjacent to u0:
  for each v ∈ S' s.t. (u0, v) ∈ E,
    λ(v) ← w((u0, v))
    π(v) ← u0
– While (S' ≠ ∅):
  find u ∈ S' s.t. ∀ v ∈ S', λ(u) ≤ λ(v)
  S ← S ∪ {u}, S' ← S' \ {u}, T ← T ∪ { (π(u), u) }
  for each v s.t. (u, v) ∈ E:
    if λ(v) > w((u, v)) then
      λ(v) ← w((u, v))
      π(v) ← u
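A minimal runnable C sketch of Prim's algorithm as described above, kept at O(v^2) with no heap; the weighted graph is made-up example data (the same one used for the Dijkstra sketch) and is assumed to be connected.

    #include <stdio.h>

    #define NUM_V 5
    #define INF   1000000

    /* hypothetical weighted undirected example graph; 0 means "no edge" */
    int w[NUM_V][NUM_V] = {
        {  0, 3, 11, 0, 0 },
        {  3, 0,  2, 5, 0 },
        { 11, 2,  0, 1, 7 },
        {  0, 5,  1, 0, 2 },
        {  0, 0,  7, 2, 0 },
    };

    int main(void) {
        int lambda[NUM_V], pi[NUM_V], in_S[NUM_V];
        int total = 0;

        for (int v = 0; v < NUM_V; v++) {   /* initially every vertex is outside the tree */
            lambda[v] = INF;
            pi[v] = -1;
            in_S[v] = 0;
        }
        lambda[0] = 0;                      /* u0 = vertex 0 */

        for (int iter = 0; iter < NUM_V; iter++) {
            /* pick u in S' with the cheapest edge into the current tree */
            int u = -1;
            for (int v = 0; v < NUM_V; v++)
                if (!in_S[v] && (u == -1 || lambda[v] < lambda[u]))
                    u = v;
            in_S[u] = 1;                    /* S <- S + {u} */
            if (pi[u] != -1) {              /* T <- T + {(pi(u), u)} */
                printf("tree edge (%d, %d), weight %d\n", pi[u], u, w[pi[u]][u]);
                total += w[pi[u]][u];
            }

            /* update lambda/pi for vertices still outside the tree */
            for (int v = 0; v < NUM_V; v++)
                if (!in_S[v] && w[u][v] && lambda[v] > w[u][v]) {
                    lambda[v] = w[u][v];
                    pi[v] = u;
                }
        }

        printf("MST total weight: %d\n", total);
        return 0;
    }

Note the single difference from the Dijkstra sketch: here lambda[v] holds the weight of one connecting edge, not an accumulated path length.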

Slide 40: Other Graph Algorithms of Interest...
– Min-cut partitioning
– Graph coloring
– Maximum clique, independent set
– Min-cut algorithms
– Steiner tree
– Matching
– ...

