Computational Complexity Jang, HaYoung BioIntelligence Lab.




2 Computational Complexity Jang, HaYoung (hyjang@bi.snu.ac.kr) BioIntelligence Lab.

3 Algorithm Analysis Why analysis? To predict the resources that the algorithm requires, such as computation time, memory, communication bandwidth, or logic gates. The running time of an algorithm is the number of primitive operations or "steps" (machine-independent) executed.

4 Complexity Space/memory complexity and time complexity. Time complexity can be measured by counting a particular operation, by counting the number of steps, or by asymptotic complexity.

5 Time Complexity Worst-case: an upper bound on the running time for any input. Average-case: we assume that all inputs of a given size are equally likely. Best-case: gives a lower bound on the running time.

6 Time Complexity Sequential search in a list of size n: worst case: n comparisons; best case: 1 comparison; average case: (n + 1)/2 comparisons (the target present and all positions equally likely).
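A minimal Python sketch (the helper name and test data are illustrative, not from the slides) that counts the comparisons and confirms all three cases for n = 10:

```python
def sequential_search(items, target):
    """Return (index, comparisons); index is -1 if the target is absent."""
    comparisons = 0
    for i, item in enumerate(items):
        comparisons += 1                  # one primitive "step" per element looked at
        if item == target:
            return i, comparisons
    return -1, comparisons

data = list(range(1, 11))                 # n = 10
print(sequential_search(data, 1))         # best case: (0, 1) -- found immediately
print(sequential_search(data, 10))        # worst case: (9, 10) -- scanned everything
# Average over all equally likely target positions: (1 + 2 + ... + n)/n = (n + 1)/2
avg = sum(sequential_search(data, t)[1] for t in data) / len(data)
print(avg)                                # 5.5 = (10 + 1)/2
```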

7 Asymptotic Notation For an asymptotic upper bound, we use O-notation. For a given function g(n), we denote by O(g(n)) the set of functions O(g(n)) = {f(n) : there exist positive constants c and n0 such that 0 <= f(n) <= c*g(n) for all n >= n0}.

8 Asymptotic Notation Omega-notation provides an asymptotic lower bound. For a given function g(n), we denote by Omega(g(n)) the set of functions Omega(g(n)) = {f(n) : there exist positive constants c and n0 such that f(n) >= c*g(n) for all n >= n0}.

9 Asymptotic Notation Theta-notation bounds a function from above and below: Theta(g(n)) = {f(n) : there exist positive constants c1, c2, and n0 such that c1*g(n) <= f(n) <= c2*g(n) for all n >= n0}.
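A numeric illustration (the function and witness constants below are my own example, not from the slides): f(n) = 3n^2 + 4n is in Theta(n^2), witnessed by c1 = 3, c2 = 4, n0 = 4, since 4n <= n^2 for all n >= 4:

```python
def f(n):
    return 3 * n * n + 4 * n   # example function, chosen for illustration

def g(n):
    return n * n

# Witnesses for f(n) in Theta(n^2): c1*g(n) <= f(n) <= c2*g(n) for all n >= n0.
c1, c2, n0 = 3, 4, 4
print(all(c1 * g(n) <= f(n) <= c2 * g(n) for n in range(n0, 100000)))  # True
print(f(3) <= c2 * g(3))   # False -- the upper bound fails below n0, so n0 matters
```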

10 Asymptotic Notation

11 The sets O(n^2), Omega(n^2), and Theta(n^2)

12 Practical Complexities (running times on a 10^9 instructions/second computer)

13 Impractical Complexities (running times on a 10^9 instructions/second computer)

14 Faster Computer vs. Better Algorithm An algorithmic improvement is often more useful than a hardware improvement, e.g. improving a 2^n algorithm to an n^3 one.
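A back-of-the-envelope check (the instruction budgets are chosen for illustration): a 1000x faster machine buys only about 10 more units of problem size for a 2^n algorithm, but a 10x larger n for an n^3 algorithm:

```python
import math

budget_slow = 10**9          # instructions available on the slow machine
budget_fast = 10**12         # 1000x faster machine, same amount of time

# For a 2^n algorithm, the largest solvable n grows only additively:
n_exp_slow = int(math.log2(budget_slow))      # 29
n_exp_fast = int(math.log2(budget_fast))      # 39 -> only +10 from 1000x hardware

# For an n^3 algorithm, the largest solvable n grows multiplicatively:
n_cube_slow = round(budget_slow ** (1 / 3))   # 1000
n_cube_fast = round(budget_fast ** (1 / 3))   # 10000 -> a full 10x
```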

15 Intractability A polynomial-time algorithm is one whose worst-case time complexity is bounded above by a polynomial function of its input size. Examples (worst-case time complexity): polynomial-time: 2n, 3n^3 + 4n, 5n + n^10, n log n; non-polynomial-time: e.g. 2^n, n!. An intractable problem is one that no polynomial-time algorithm can solve, i.e. W(n) is not in O(p(n)) for any polynomial p(n).

16 Three Categories of Problems (1) 1. Problems for which polynomial-time algorithms have been found. 2. Problems that have been proven to be intractable. The first type of intractability is problems that require a non-polynomial amount of output, e.g. determining all Hamiltonian circuits. The second type occurs when our requests are reasonable and we can prove that the problem cannot be solved in polynomial time, e.g. the Halting Problem and the Presburger Arithmetic problem.

17 Three Categories of Problems (2) Presburger arithmetic is the theory of the integers with addition (Z, +, =, <, 0, 1) and is known to require doubly exponential nondeterministic time. 3. Problems that have not been proven to be intractable but for which polynomial-time algorithms have never been found, e.g. 3-SAT, 0-1 Knapsack, TSP, Sum-of-Subsets, Partition, Graph Coloring, Independent Set, Vertex Cover, Clique, 3D-Matching, Set Cover, etc.

18 The Sets P and NP (1) Definition: P is the set of all decision problems that can be solved in polynomial time. Definition: NP is the set of all decision problems that can be solved in nondeterministic polynomial time. An NP algorithm has two stages: 1. the guessing stage (in nondeterministic polynomial time); 2. the verification stage (in deterministic polynomial time).
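A sketch of the verification stage for a 3-SAT certificate in Python (the clause encoding is an assumption for illustration). Checking a guessed assignment takes time linear in the formula size, i.e. deterministic polynomial time:

```python
# A 3-CNF formula as a list of clauses; each literal is (variable, negated?).
# Verification stage: given a guessed assignment, check that every clause
# contains at least one satisfied literal -- O(formula size).
def verify(clauses, assignment):
    return all(
        any(assignment[var] != negated for var, negated in clause)
        for clause in clauses
    )

# (x1 or not x2 or x3) and (not x1 or x2 or not x3)
clauses = [[(1, False), (2, True), (3, False)],
           [(1, True), (2, False), (3, True)]]
print(verify(clauses, {1: True, 2: True, 3: False}))   # True  -- a valid certificate
print(verify(clauses, {1: True, 2: False, 3: True}))   # False -- second clause fails
```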

19 The Sets P and NP (2)

20 Genetic Algorithms

21 An Abstract View of GA
generate initial population G(0);
evaluate G(0);
t := 0;
repeat
    t := t + 1;
    generate G(t) using G(t-1);
    evaluate G(t);
until termination condition is reached;
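A runnable sketch of this loop in Python. Everything concrete here is an assumption for illustration: the OneMax fitness (count of 1-bits), tournament selection, one-point crossover, and all parameter values.

```python
import random

random.seed(0)

def fitness(bits):
    return sum(bits)                       # OneMax: number of 1-bits (toy problem)

def select(pop, scores):
    # Tournament selection: the better of two random individuals, pop-size times.
    paired = list(zip(pop, scores))
    return [max(random.sample(paired, 2), key=lambda p: p[1])[0] for _ in pop]

def crossover(a, b):
    cut = random.randint(1, len(a) - 1)    # one-point crossover
    return a[:cut] + b[cut:], b[:cut] + a[cut:]

def mutate(bits, pm=0.01):
    return [1 - b if random.random() < pm else b for b in bits]

def ga(n_bits=20, pop_size=30, max_gens=50):
    population = [[random.randint(0, 1) for _ in range(n_bits)]
                  for _ in range(pop_size)]                 # generate G(0)
    for t in range(1, max_gens + 1):                        # repeat ...
        scores = [fitness(ind) for ind in population]       # evaluate G(t-1)
        pool = select(population, scores)
        population = []
        for i in range(0, pop_size, 2):                     # generate G(t) using G(t-1)
            a, b = crossover(pool[i], pool[i + 1])
            population += [mutate(a), mutate(b)]
    return max(population, key=fitness)                     # best of the final generation

best = ga()
```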

22 Search Techniques Classes of search techniques:
- Calculus-based techniques
    - Direct methods: Fibonacci, Newton
    - Indirect methods
- Guided random search techniques
    - Simulated annealing
    - Evolutionary algorithms
        - Evolutionary strategies
        - Genetic algorithms: parallel GAs, sequential GAs
- Enumerative techniques
    - Dynamic programming

23 Simple Genetic Algorithm's Components
1. A mechanism to encode the solutions as binary strings
2. A population of binary strings
3. A fitness function
4. Genetic operators
5. A selection mechanism
6. Control parameters

24 The GA Cycle Population (chromosomes) -> Evaluation (fitness of the decoded strings) -> Selection (mating pool of parents; reproduction = evaluation + selection) -> Genetic operators (crossover & mutation applied to the mates) -> Offspring -> New generation.

25 Fitness Function (Objective Function) The mechanism for evaluating each string. To maintain uniformity over various problem domains, normalize the objective function to the range 0 to 1; the normalized value of the objective function is the fitness of the string.

26 Selection Models nature's "survival of the fittest" mechanism: a fitter string receives a higher number of offspring. Proportionate selection scheme: the roulette-wheel selection scheme.
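A sketch of roulette-wheel selection in Python (the function name and the toy fitness values are assumptions for illustration): each individual is drawn with probability fitness_i / sum(fitnesses):

```python
import random

def roulette_select(population, fitnesses):
    """Proportionate (roulette-wheel) selection: spin a wheel whose slot
    sizes are proportional to fitness and return the individual it lands on."""
    total = sum(fitnesses)
    pick = random.uniform(0, total)
    running = 0.0
    for individual, fit in zip(population, fitnesses):
        running += fit
        if running >= pick:
            return individual
    return population[-1]       # guard against floating-point round-off

pop = ["A", "B", "C"]
fits = [1.0, 2.0, 7.0]          # "C" should win roughly 70% of the spins
random.seed(42)
counts = {p: 0 for p in pop}
for _ in range(10000):
    counts[roulette_select(pop, fits)] += 1
print(counts)                   # roughly proportional to 1 : 2 : 7
```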

27 Crossover After selection, pairs of strings are picked at random. If the string length is n, randomly choose a number from 1 to n - 1 and use it as the crossover point. The GA invokes crossover only if a randomly generated number is less than p_c (the crossover rate).
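A sketch of one-point crossover in Python (the function signature and the probability convention are assumptions); the deterministic part reproduces the worked example on slide 36:

```python
import random

def one_point_crossover(p1, p2, pc=0.25):
    """With probability pc (the crossover rate), cut both strings at a random
    point in 1..n-1 and swap the tails; otherwise return the parents as-is."""
    if random.random() >= pc:
        return p1, p2
    point = random.randint(1, len(p1) - 1)
    return p1[:point] + p2[point:], p2[:point] + p1[point:]

# Deterministic re-creation of the slide-36 example (crossover point after gene 5):
v2 = "0000001110000000010000"
v3 = "1110000000111111000101"
point = 5
child1 = v2[:point] + v3[point:]
child2 = v3[:point] + v2[point:]
print(child1)   # 0000000000111111000101  (= v2' on slide 36)
print(child2)   # 1110001110000000010000  (= v3'' on slide 36)
```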

28 Mutation After crossover, strings are subjected to mutation. Flipping bits: 0 -> 1, 1 -> 0. Mutation rate p_m = the probability that a bit will be flipped; the bits in a string are mutated independently. Role: restoring lost genetic material.
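A sketch of bit-flip mutation in Python (function name assumed; the string is v3 from slide 36):

```python
import random

def mutate(bitstring, pm=0.01):
    """Flip each bit independently with probability pm (the mutation rate)."""
    out = []
    for bit in bitstring:
        if random.random() < pm:
            out.append("1" if bit == "0" else "0")   # flip: 0 -> 1, 1 -> 0
        else:
            out.append(bit)                          # leave the bit unchanged
    return "".join(out)

v3 = "1110000000111111000101"
print(mutate(v3, pm=0.0))   # 1110000000111111000101 -- pm = 0 never flips
print(mutate(v3, pm=1.0))   # 0001111111000000111010 -- every bit flipped
```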

29 Function Definition Find x in the range [-1, 2] which maximizes the function f(x) = x * sin(10*pi*x) + 1.0.

30 Analysis of function f

31 Representation (1) Required precision: six places after the decimal point. The range [-1, 2] has length 3, so it must be divided into at least 3,000,000 equal ranges; since 2^21 < 3,000,000 <= 2^22, a binary representation of 22 bits suffices. Mapping from a binary string into a real number x: (1) convert the binary string (b21 ... b0) from base 2 to base 10, giving x'; (2) find the corresponding real number x = -1.0 + x' * 3 / (2^22 - 1).
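A sketch of this decoding in Python, assuming the standard mapping x = -1.0 + x' * 3 / (2^22 - 1); the third example string is the one that decodes to x1 = 0.637197 on slide 34:

```python
def decode(bitstring, lo=-1.0, hi=2.0):
    """Map a bit string to a real x in [lo, hi]: base 2 to base 10, then scale."""
    n = len(bitstring)                               # 22 bits here
    x_prime = int(bitstring, 2)                      # step 1: convert to base 10
    return lo + x_prime * (hi - lo) / (2**n - 1)     # step 2: scale into [lo, hi]

print(decode("0" * 22))                       # -1.0  (all-zeros string)
print(decode("1" * 22))                       #  2.0  (all-ones string)
print(decode("1000101110110101000111"))       #  0.63719... (slide 34's v1)
```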

32 Representation (2) Example strings and the x values they decode to:
String                     x
0000000000000000000000    -1.0
1000101110110101000111     0.637197
1111111111111111111111     2.0

33 Initial Population Create a population of strings, where each chromosome is a binary vector of 22 bits. All 22 bits for each string are initialized randomly.

34 Evaluation Function eval(v) = f(x). For example, for the strings
v1 = (1000101110110101000111)
v2 = (0000001110000000010000)
v3 = (1110000000111111000101)
the decoded values and evaluations are
x1 =  0.637197   f(x1) = 1.586345
x2 = -0.958973   f(x2) = 0.078878
x3 =  1.627888   f(x3) = 2.250650
The string v3 is the best of the three strings, since its evaluation returns the highest value.
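The transcript does not spell the objective out, but these evaluations are consistent with Michalewicz's classic test function f(x) = x * sin(10*pi*x) + 1.0, assumed here; the sketch reproduces the slide's values to within the precision of the printed x's:

```python
import math

def f(x):
    # Assumed objective: f(x) = x * sin(10*pi*x) + 1.0 on [-1, 2]
    return x * math.sin(10 * math.pi * x) + 1.0

for x in (0.637197, -0.958973, 1.627888):
    print(round(f(x), 6))   # close to the slide's 1.586345, 0.078878, 2.250650
```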

35 Genetic Operators: Mutation Mutation alters one or more genes with a probability equal to the mutation rate. Example:
v3   = (1110000000111111000101)
v3'  = (1110100000111111000101)   (fifth gene flipped)
v3'' = (1110000000011111000101)   (10th gene flipped)

36 Genetic Operators: Crossover Example: the crossover of v2 and v3. Assume the crossover point was randomly selected after the 5th gene.
Before crossover:
v2 = (00000 | 01110000000010000)    x2 = -0.958973    f(x2) = 0.078878
v3 = (11100 | 00000111111000101)    x3 =  1.627888    f(x3) = 2.250650
After crossover:
v2'  = (00000 | 00000111111000101)   x2'  = -0.998113   f(x2')  = 0.940865
v3'' = (11100 | 01110000000010000)   x3'' =  1.666028   f(x3'') = 2.459245

37 Parameters Population size = 50 Probability of crossover = 0.25 Probability of mutation = 0.01

38 Experimental Results The best chromosome after 150 generations: vmax = (1111001101000100000101), xmax = 1.850773. As expected, xmax is close to 1.85, and f(xmax) is slightly larger than 2.85.

