
1 CSC-305 Design and Analysis of Algorithms, BS(CS)-6, Fall 2014
Design and Analysis of Algorithms
Khawaja Mohiuddin
Assistant Professor
Department of Computer Sciences
Bahria University, Karachi Campus
Contact: khawaja.mohiuddin@bimcs.edu.pk
Lecture # 14, 15 – Natural and Randomized Algorithms

2 Topics To Cover
 Natural Algorithms
  Genetic Algorithm
  Simulated Annealing
  Artificial Neural Networks
 Randomized Algorithms
  Monte Carlo Algorithm
  Las Vegas Algorithm
  Reasons for Using Randomized Algorithms

3 Natural Algorithms
 Algorithms that take inspiration from nature for the development of novel problem-solving techniques are called natural algorithms.
 The computational paradigms studied by natural computing are abstracted from natural phenomena as diverse as biological evolution, the annealing process used in metallurgy, and the central nervous system (CNS) of living beings.
 These have led to the following optimization methods:
  Genetic Algorithm (GA)
  Simulated Annealing (SA)
  Artificial Neural Networks (ANNs)
 An optimization problem is the search for the most suitable or optimum solution under the various constraints of the problem, one of which can be limited computational resources.

4 Natural Algorithms
 Genetic Algorithms
  Genetic algorithms are a particular class of evolutionary algorithms that use techniques inspired by evolutionary biology such as inheritance, mutation, selection, and crossover (also called recombination).
  Genetic algorithms are implemented as a computer simulation in which a population of abstract representations (called chromosomes, or the genotype or genome) of candidate solutions (called individuals, creatures, or phenotypes) to an optimization problem evolves toward better solutions.
  Traditionally, solutions are represented in binary as strings of 0s and 1s, but other encodings are also possible.

5 Natural Algorithms
 Genetic Algorithms (contd.)
  The evolution usually starts from a population of randomly generated individuals and proceeds in generations.
  In each generation, the fitness of every individual in the population is evaluated; multiple individuals are selected from the current population (based on their fitness) and modified (recombined and possibly mutated) to form a new population.
  The new population is then used in the next iteration of the algorithm.
  Commonly, the algorithm terminates when either a maximum number of generations has been produced or a satisfactory fitness level has been reached for the population.
  If the algorithm has terminated due to the maximum number of generations, a satisfactory solution may or may not have been reached.

6 Natural Algorithms
 Genetic Algorithms (contd.)
 Chromosomes could be:
  Bit strings (0101... 1100)
  Real numbers (43.2 -33.1... 0.0 89.2)
  Permutations of elements (E11 E3 E7... E1 E15)
  Lists of rules (R1 R2 R3... R22 R23)
  Program elements (genetic programming)
  ... any data structure ...

7 Natural Algorithms
 Genetic Algorithms – Requirements
 A typical genetic algorithm requires two things to be defined:
 1. A genetic representation of the solution domain
 2. A fitness function to evaluate the solution domain
  A standard representation of a solution is an array of bits. Arrays of other types and structures can be used in essentially the same way.
  The main property that makes these genetic representations convenient is that their parts are easily aligned due to their fixed size, which facilitates simple crossover operations.
  Variable-length representations may also be used, but crossover implementation is more complex in this case.
  Tree-like representations are explored in genetic programming.

8 Natural Algorithms
 Genetic Algorithms – Requirements (contd.)
  The fitness function is defined over the genetic representation and measures the quality of the represented solution.
  The fitness function is always problem dependent.
  For instance, suppose we want to maximize the total value of the objects we can put in a knapsack of some fixed capacity.
  A representation of a solution might be an array of bits, where each bit represents a different object, and the value of the bit (0 or 1) represents whether or not the object is in the knapsack.
  Not every such representation is valid, as the total size of the objects may exceed the capacity of the knapsack.
  The fitness of the solution is the sum of the values of all objects in the knapsack if the representation is valid, or 0 otherwise.
  In some problems, it is hard or even impossible to define the fitness expression; in these cases, interactive genetic algorithms are used.
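The knapsack fitness just described can be sketched in a few lines of Python; the function name and argument layout are illustrative, not from the slides.

```python
# Hypothetical 0/1-knapsack fitness for a bit-string chromosome:
# bit i = 1 means object i is packed.
def knapsack_fitness(chromosome, values, weights, capacity):
    total_value = sum(v for bit, v in zip(chromosome, values) if bit)
    total_weight = sum(w for bit, w in zip(chromosome, weights) if bit)
    # Invalid (overweight) representations score 0, as the slide describes.
    return total_value if total_weight <= capacity else 0
```

For example, knapsack_fitness([1, 0, 1], [60, 100, 120], [10, 20, 30], 50) packs objects 0 and 2 (total weight 40, within capacity) and scores 180.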

9 Natural Algorithms
 Genetic Algorithms – Fitness Function
 (Figure: fitness-function illustration; the slide image is not included in this transcript.)

10 Natural Algorithms
 Genetic Algorithm – How it Works
  A population is created with a group of randomly generated individuals.
  The individuals in the population are then evaluated.
  The evaluation function is provided by the programmer and gives the individuals a score based on how well they perform at the given task.
  Two individuals are then selected based on their fitness: the higher the fitness, the higher the chance of being selected.
  These individuals then "reproduce" to create one or more offspring, after which the offspring are mutated randomly.
  This continues until a suitable solution has been found or a certain number of generations have passed, depending on the needs of the programmer.

11 Natural Algorithms
 Genetic Algorithms (contd.)

    Set time t = 0
    Initialize population P(t)
    While termination condition not met:
        Evaluate fitness of each member of P(t)
        Select members from P(t) based on fitness
        Produce offspring from the selected pairs
        Replace members of P(t) with better offspring
        Set time t = t + 1
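The loop above can be sketched as a runnable Python program. The toy "OneMax" fitness (count of 1-bits), tournament selection, and the specific parameter values are illustrative assumptions, not details from the slides.

```python
import random

def one_max(bits):
    """Toy fitness: the number of 1-bits in the chromosome."""
    return sum(bits)

def genetic_algorithm(n_bits=20, pop_size=30, generations=100, p_mut=0.05):
    # Initialize population P(0) with random bit strings
    pop = [[random.randint(0, 1) for _ in range(n_bits)] for _ in range(pop_size)]
    for _ in range(generations):          # termination: fixed generation budget
        def select():
            # Tournament selection: the fitter of two random members wins
            a, b = random.sample(pop, 2)
            return a if one_max(a) >= one_max(b) else b
        new_pop = []
        while len(new_pop) < pop_size:
            p1, p2 = select(), select()
            cut = random.randint(1, n_bits - 1)            # one-point crossover
            child = p1[:cut] + p2[cut:]
            child = [bit ^ (random.random() < p_mut) for bit in child]  # mutation
            new_pop.append(child)
        pop = new_pop                     # replace P(t) with the offspring
    return max(pop, key=one_max)
```

Running genetic_algorithm() typically returns a chromosome that is all or nearly all 1s, since OneMax rewards every 1-bit.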

12 Natural Algorithms
 Genetic Algorithms – Why Use Them?
  They can solve hard problems.
  It is easy to interface genetic algorithms to existing simulations and models.
  GAs are extensible.
  GAs are easy to hybridize (crossbreed).
  GAs work by sampling, so populations can be sized to detect differences with specified error rates.
  They use little problem-specific code.

13 Natural Algorithms
 Simulated Annealing
  The name and inspiration come from annealing in metallurgy.
  Annealing is a technique involving heating and controlled cooling of a material to increase the size of its crystals and reduce their defects.
  Both crystal size and defect count are attributes of the material that depend on its thermodynamic free energy.
  Heating and cooling the material affects both the temperature and the thermodynamic free energy.
  While the same amount of cooling brings the same amount of decrease in temperature, it will bring a bigger or smaller decrease in the thermodynamic free energy depending on the rate at which it occurs, with a slower rate producing a bigger decrease.

14 Natural Algorithms
 Simulated Annealing (contd.)
  This notion of slow cooling is implemented in the simulated annealing algorithm as a slow decrease in the probability of accepting worse solutions as it explores the solution space.
  Accepting worse solutions is a fundamental property of metaheuristics because it allows a more extensive search for the optimal solution.

15 Natural Algorithms
 Simulated Annealing – How it Works
  Each candidate solution corresponds to a state s of some physical system, and the function E(s) to be minimized is analogous to the internal energy of the system in that state.
  The goal is to bring the system, from an arbitrary initial state, to a state with the minimum possible energy.
  At each step, the SA heuristic considers some neighboring state s' of the current state s and probabilistically decides between moving the system to state s' or staying in state s.
  These probabilities ultimately lead the system to move to states of lower energy.
  Typically this step is repeated until the system reaches a state that is good enough for the application, or until a given computation budget has been exhausted.
  The neighbors of a state are new states of the problem that are produced by altering the given state in some well-defined way.
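The step described above can be sketched as a small Python routine. The geometric cooling schedule and the exp(-delta/t) acceptance rule (the usual Metropolis criterion) are common choices, and the parameter values are illustrative assumptions.

```python
import math
import random

def simulated_annealing(energy, neighbor, state,
                        t_start=10.0, t_min=1e-3, cooling=0.95):
    """Minimize energy(s); neighbor(s) proposes a nearby state s'."""
    t = t_start
    best = state
    while t > t_min:
        candidate = neighbor(state)
        delta = energy(candidate) - energy(state)
        # Always accept improvements; accept worse states with
        # probability exp(-delta / t), which shrinks as t cools.
        if delta <= 0 or random.random() < math.exp(-delta / t):
            state = candidate
        if energy(state) < energy(best):
            best = state
        t *= cooling          # slow (geometric) cooling schedule
    return best
```

For instance, minimizing E(x) = (x - 3)^2 with neighbor(x) = x + random.uniform(-1, 1), starting from 0.0, typically ends near x = 3.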

16 Natural Algorithms
 Simulated Annealing – Example: TSP
  For example, in the traveling salesman problem each state is typically defined as a permutation of the cities to be visited.
  The neighbors of a state are the set of permutations produced, for example, by reversing the order of any two successive cities.
  The well-defined way in which the states are altered to find neighboring states is called a "move", and different moves give different sets of neighboring states.
  These moves usually result in minimal alterations of the last state, to help the algorithm keep the better parts of the solution and change only the worse parts.
  In the traveling salesman problem, the parts of the solution are the city connections.
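The "reverse two successive cities" move mentioned above can be sketched as follows; the function name and the tour-as-list representation are illustrative.

```python
import random

def tsp_neighbor(tour):
    """Return a neighboring tour produced by swapping two successive cities."""
    i = random.randrange(len(tour) - 1)   # pick any position except the last
    neighbor = tour[:]                    # copy, so the current state is kept
    neighbor[i], neighbor[i + 1] = neighbor[i + 1], neighbor[i]
    return neighbor
```

Swapping adjacent cities changes only the connections around the two swapped positions, matching the slide's point that moves alter the state minimally.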

17 Natural Algorithms
 Simulated Annealing – Example: TSP (contd.)
  Searching for neighbors of a state is fundamental to optimization because the final solution will come after a tour of successive neighbors.
  Simple heuristics move by finding best neighbor after best neighbor and stop when they have reached a solution that has no better neighbors.
  The problem with this approach is that the neighbors of a state are not guaranteed to contain any of the existing better solutions, which means that failure to find a better solution among them does not guarantee that no better solution exists.
  This is why the best solution found by such algorithms is called a local optimum, in contrast with the actual best solution, which is called a global optimum.

18 Natural Algorithms
 Simulated Annealing – Example: TSP (contd.)
  Metaheuristics use the neighbors of a solution as a way to explore the solution space, and although they prefer better neighbors, they also accept worse neighbors in order to avoid getting stuck in local optima. As a result, if the algorithm is run for an infinite amount of time, the global optimum will be found.
  A metaheuristic is a higher-level procedure or heuristic designed to find, generate, or select a lower-level procedure or heuristic (partial search algorithm) that may provide a sufficiently good solution to an optimization problem, especially with incomplete or imperfect information or limited computation capacity.

19 Natural Algorithms
 Artificial Neural Networks
  An Artificial Neural Network (ANN) is a network of many simple processors ("units"), each possibly having a small amount of local memory. The units are connected by communication channels.
  The idea is inspired by the biological neural networks of the central nervous system, particularly the brain.
  The units operate only on their local data and on the inputs they receive via the connections.
  ANNs have some sort of "training" rule whereby the weights of connections are adjusted on the basis of data.
  In other words, ANNs "learn" from examples and exhibit some capability for generalization beyond the training data.

20 Natural Algorithms
 Artificial Neural Networks – Example
  For example, a neural network for handwriting recognition is defined by a set of input neurons which may be activated by the pixels of an input image.
  After being weighted and transformed by a function (determined by the network's designer), the activations of these neurons are passed on to other neurons.
  This process is repeated until, finally, an output neuron is activated. This determines which character was read.
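The repeated weight-and-transform step can be sketched as a minimal forward pass. The sigmoid transfer function and the layer layout are common illustrative choices, not details given in the slides.

```python
import math

def sigmoid(x):
    """A common transfer function chosen by the network's designer."""
    return 1.0 / (1.0 + math.exp(-x))

def forward(inputs, layers):
    """layers is a list of (weights, biases) pairs, where weights[j] is the
    incoming weight vector of unit j in that layer."""
    activations = inputs
    for weights, biases in layers:
        # Each unit weights its inputs, adds a bias, and applies the
        # transfer function; the result feeds the next layer.
        activations = [
            sigmoid(sum(w * a for w, a in zip(unit_w, activations)) + b)
            for unit_w, b in zip(weights, biases)
        ]
    return activations  # for recognition, the most active output names the character
```

With one layer of a single unit, forward([1.0, 0.0], [([[1.0, 1.0]], [0.0])]) simply computes sigmoid(1.0).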

21 Natural Algorithms
 Artificial Neural Networks – Applications
  Like other machine learning methods (systems that learn from data), neural networks have been used to solve a wide variety of tasks that are hard to solve using ordinary rule-based programming, including computer vision and speech recognition.
  In practice, ANNs are especially useful for classification and function approximation/mapping problems: problems which have lots of training data available, but to which hard-and-fast rules cannot easily be applied.

22 Randomized Algorithms
 Introduction
  A randomized algorithm employs a degree of randomness during its execution to determine what to do next (for example, by flipping a coin).
  When considering a randomized algorithm, we usually care about its expected worst-case performance: the average amount of time it takes on the worst input of a given size.
  This average is computed over all the possible outcomes of the coin flips during the execution of the algorithm.
  In studying randomized algorithms, we consider how to design a good randomized algorithm and how to prove that it works within given time or error bounds.
  The main difference is that it is often easier to design a randomized algorithm (randomness turns out to be a good substitute for cleverness more often than one might expect) but harder to analyze it.
  Much of the work, therefore, is developing good techniques for analyzing the often very complex random processes that arise in the execution of an algorithm.

23 Randomized Algorithms
 Randomized Algorithm
  In addition to the input, the algorithm uses a source of pseudo-random numbers.
  During execution, it makes random choices depending on those random numbers.
  The behaviour (output) can vary if the algorithm is run multiple times on the same input.
 (Figure: an algorithm box receiving the input together with random numbers and producing the output.)

24 Randomized Algorithms
 A Trivial Example
  Suppose we have two boxes. Inside one box is a valuable prize; inside the other is nothing.
  Our goal is to obtain the prize after opening the fewest possible boxes.
  A deterministic algorithm tries one box, then the next.
  In the worst case, two boxes are opened.
  In the average case, if we assume that both boxes are equally likely to hide the prize, we open only one box half the time and both boxes half the time, for an expected 1.5 boxes.
  We can obtain the same expected time even in the worst case by flipping a coin ourselves to decide which box to open first.
  This gives a randomized algorithm, and because we flip the coin (instead of nature, as in the average-case analysis), we can guarantee the good expected performance no matter what the person hiding the prize does.
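The coin-flip strategy above can be sketched directly; the representation of the boxes as a two-element boolean list and the function name are illustrative assumptions.

```python
import random

def boxes_opened(boxes):
    """boxes has exactly one True (the prize). Flip a fair coin to pick
    which box to open first; return the number of boxes opened."""
    first = random.randint(0, 1)
    if boxes[first]:
        return 1
    return 2   # the prize must be in the other box

# No matter where the adversary hides the prize, the expected number of
# boxes opened is 1.5: one box half the time, both boxes half the time.
```

Averaging boxes_opened over many runs, with the prize fixed in either box, gives a result near 1.5.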

25 Randomized Algorithms
 Basic Types of Randomized Algorithms
  Monte Carlo Algorithm
  Las Vegas Algorithm
 Monte Carlo Algorithm
  A Monte Carlo algorithm uses randomness, and its answer is guaranteed to be correct only most of the time.
 Las Vegas Algorithm
  A Las Vegas algorithm uses randomness and its answer is guaranteed to be correct, but the running time is polynomial only on average.
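Hedged sketches of each type, using standard textbook examples rather than anything from the slides: a Fermat primality test is Monte Carlo (always fast, occasionally wrong), and randomized quicksort is Las Vegas (always correct, with only an expected-time guarantee).

```python
import random

def fermat_is_probably_prime(n, trials=20):
    """Monte Carlo: always fast, but may (rarely) call a composite prime."""
    if n < 4:
        return n in (2, 3)
    for _ in range(trials):
        a = random.randrange(2, n - 1)
        if pow(a, n - 1, n) != 1:   # a is a Fermat witness: n is composite
            return False
    return True   # probably prime; can err, e.g. on Carmichael numbers

def randomized_quicksort(xs):
    """Las Vegas: the output is always correctly sorted; only the
    running time depends on the random pivot choices."""
    if len(xs) <= 1:
        return xs
    pivot = random.choice(xs)
    return (randomized_quicksort([x for x in xs if x < pivot])
            + [x for x in xs if x == pivot]
            + randomized_quicksort([x for x in xs if x > pivot]))
```

The contrast: fermat_is_probably_prime trades a small error probability for speed, while randomized_quicksort never errs but may occasionally run slowly.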

26 Randomized Algorithms
 Reasons for Using Randomized Algorithms
  Simplicity: randomized algorithms are usually much simpler than deterministic algorithms for the same problem.
  Speed: they are often much faster and provide a possibility of polynomial-time algorithms.
  De-randomization: there is the possibility, at least for some algorithms, to de-randomize them and get a deterministic algorithm.
  Various approaches: there are various approaches, or paradigms, available for designing a randomized algorithm for a given problem.

27 Summary
 Natural Algorithms
  Genetic Algorithm
  Simulated Annealing
  Artificial Neural Networks
 Randomized Algorithms
  Monte Carlo Algorithm
  Las Vegas Algorithm
  Reasons for Using Randomized Algorithms

