
CSC-305 Design and Analysis of Algorithms, BS(CS)-6, Fall 2014
Design and Analysis of Algorithms
Khawaja Mohiuddin, Assistant Professor, Department of Computer Sciences, Bahria University, Karachi Campus
Lecture # 14, 15 – Natural And Randomized Algorithms

Topics To Cover
 Natural Algorithms
 Genetic Algorithm
 Simulated Annealing
 Artificial Neural Networks
 Randomized Algorithms
 Monte Carlo Algorithm
 Las Vegas Algorithm
 Reasons for Using Randomized Algorithms

Natural Algorithms
 Algorithms that take inspiration from nature to develop novel problem-solving techniques are called natural algorithms.
 The computational paradigms studied by natural computing are abstracted from natural phenomena as diverse as biological evolution, the annealing processes used in metallurgy, and the central nervous system (CNS) of living beings.
 These have led to the following optimization methods:
 Genetic Algorithm (GA)
 Simulated Annealing (SA)
 Artificial Neural Networks (ANNs)
 An optimization problem is the search for the most suitable or optimum solution under the constraints of the problem, such as limited computational resources.

Natural Algorithms
 Genetic Algorithms
 Genetic algorithms are a particular class of evolutionary algorithms that use techniques inspired by evolutionary biology, such as inheritance, mutation, selection, and crossover (also called recombination).
 Genetic algorithms are implemented as a computer simulation in which a population of abstract representations (called chromosomes, or the genotype or genome) of candidate solutions (called individuals, creatures, or phenotypes) to an optimization problem evolves toward better solutions.
 Traditionally, solutions are represented in binary as strings of 0s and 1s, but other encodings are also possible.

Natural Algorithms
 Genetic Algorithms (contd.)
 The evolution usually starts from a population of randomly generated individuals and proceeds in generations.
 In each generation, the fitness of every individual in the population is evaluated; multiple individuals are selected from the current population (based on their fitness) and modified (recombined and possibly mutated) to form a new population.
 The new population is then used in the next iteration of the algorithm.
 Commonly, the algorithm terminates when either a maximum number of generations has been produced or a satisfactory fitness level has been reached for the population.
 If the algorithm has terminated due to reaching the maximum number of generations, a satisfactory solution may or may not have been found.

Natural Algorithms
 Genetic Algorithms (contd.)
Chromosomes could be:
 Bit strings
 Real numbers
 Permutations of elements (E11 E3 E7... E1 E15)
 Lists of rules (R1 R2 R3... R22 R23)
 Program elements (genetic programming)
 ... any data structure ...

Natural Algorithms
 Genetic Algorithm Requirements
 A typical genetic algorithm requires two things to be defined:
1. A genetic representation of the solution domain
2. A fitness function to evaluate the solution domain
 A standard representation of a solution is an array of bits. Arrays of other types and structures can be used in essentially the same way.
 The main property that makes these genetic representations convenient is that their parts are easily aligned due to their fixed size, which facilitates simple crossover operations.
 Variable-length representations may also be used, but crossover implementation is more complex in this case.
 Tree-like representations are explored in genetic programming.

Natural Algorithms
 Genetic Algorithm Requirements (contd.)
 The fitness function is defined over the genetic representation and measures the quality of the represented solution.
 The fitness function is always problem-dependent.
 For instance, suppose we want to maximize the total value of objects that we can put in a knapsack of some fixed capacity.
 A representation of a solution might be an array of bits, where each bit represents a different object, and the value of the bit (0 or 1) represents whether or not the object is in the knapsack.
 Not every such representation is valid, as the size of the objects may exceed the capacity of the knapsack.
 The fitness of the solution is the sum of the values of all objects in the knapsack if the representation is valid, or 0 otherwise.
 In some problems it is hard or even impossible to define a fitness expression; in these cases, interactive genetic algorithms are used.
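The knapsack fitness rule above can be sketched in Python. The item values, weights, and capacity below are made-up illustrative numbers, not from the lecture:

```python
# Hypothetical knapsack instance: values, weights, and capacity are invented.
VALUES = [60, 100, 120, 30]
WEIGHTS = [10, 20, 30, 5]
CAPACITY = 50

def fitness(bits):
    """Sum of values of the selected objects, or 0 if the knapsack overflows."""
    total_weight = sum(w for b, w in zip(bits, WEIGHTS) if b)
    if total_weight > CAPACITY:
        return 0  # invalid representation: objects exceed the capacity
    return sum(v for b, v in zip(bits, VALUES) if b)

print(fitness([1, 1, 0, 1]))  # weight 35 <= 50, so value 60+100+30 = 190
print(fitness([1, 1, 1, 1]))  # weight 65 > 50, so fitness 0
```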

Natural Algorithms
 Genetic Algorithms – Fitness Function
(Figure on this slide not reproduced in the transcript.)

Natural Algorithms
 Genetic Algorithm – How it Works
 A population is created with a group of randomly generated individuals.
 The individuals in the population are then evaluated.
 The evaluation function is provided by the programmer and gives the individuals a score based on how well they perform at the given task.
 Two individuals are then selected based on their fitness: the higher the fitness, the higher the chance of being selected.
 These individuals then "reproduce" to create one or more offspring, after which the offspring are mutated randomly.
 This continues until a suitable solution has been found or a certain number of generations have passed, depending on the needs of the programmer.

Natural Algorithms
 Genetic Algorithms (contd.)

    Set time t = 0
    Initialize population P(t)
    While termination condition not met:
        Evaluate fitness of each member of P(t)
        Select members from P(t) based on fitness
        Produce offspring from the selected pairs
        Replace members of P(t) with better offspring
        Set time t = t + 1
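As a sketch, the loop above might look like this in Python. The fitness function here is OneMax (count of 1-bits), and the population size, mutation rate, and selection scheme are illustrative choices, not prescribed by the lecture:

```python
import random

def run_ga(genome_len=20, pop_size=30, generations=100,
           mutation_rate=0.01, seed=0):
    """Minimal generational GA maximizing the number of 1-bits (OneMax)."""
    rng = random.Random(seed)
    fitness = sum  # fitness of a bit list = number of 1s

    # Initialize population P(0) with random bit strings
    pop = [[rng.randint(0, 1) for _ in range(genome_len)]
           for _ in range(pop_size)]

    for _ in range(generations):
        # Selection: fitness-proportionate ("roulette wheel") choice of parents
        weights = [fitness(ind) + 1 for ind in pop]  # +1 avoids all-zero weights
        new_pop = []
        while len(new_pop) < pop_size:
            mom, dad = rng.choices(pop, weights=weights, k=2)
            # Crossover: single cut point
            cut = rng.randrange(1, genome_len)
            child = mom[:cut] + dad[cut:]
            # Mutation: flip each bit with a small probability
            child = [b ^ 1 if rng.random() < mutation_rate else b
                     for b in child]
            new_pop.append(child)
        pop = new_pop  # P(t+1) replaces P(t)

    return max(pop, key=fitness)

best = run_ga()
print(sum(best))  # fitness of the best individual found
```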

Natural Algorithms
 Genetic Algorithms – Why Use Them?
 They can solve hard problems
 It is easy to interface genetic algorithms to existing simulations and models
 GAs are extensible
 GAs are easy to hybridize (crossbreed)
 GAs work by sampling, so populations can be sized to detect differences with specified error rates
 They use little problem-specific code

Natural Algorithms
 Simulated Annealing
 The name and inspiration come from annealing in metallurgy.
 Annealing is a technique involving heating and controlled cooling of a material to increase the size of its crystals and reduce their defects.
 Both are attributes of the material that depend on its thermodynamic free energy.
 Heating and cooling the material affects both its temperature and its thermodynamic free energy.
 While the same amount of cooling brings the same decrease in temperature, it brings a bigger or smaller decrease in thermodynamic free energy depending on the rate at which it occurs, with slower cooling producing a bigger decrease.

Natural Algorithms
 Simulated Annealing (contd.)
 This notion of slow cooling is implemented in the simulated annealing algorithm as a slow decrease in the probability of accepting worse solutions as it explores the solution space.
 Accepting worse solutions is a fundamental property of metaheuristics because it allows a more extensive search for the optimal solution.

Natural Algorithms
 Simulated Annealing – How it Works
 Each point s of the search space is analogous to a state of some physical system, and the function E(s) to be minimized is analogous to the internal energy of the system in that state.
 The goal is to bring the system from an arbitrary initial state to a state with the minimum possible energy.
 At each step, the SA heuristic considers some neighboring state s' of the current state s and probabilistically decides between moving the system to state s' or staying in state s.
 These probabilities ultimately lead the system to move to states of lower energy.
 Typically this step is repeated until the system reaches a state that is good enough for the application, or until a given computation budget has been exhausted.
 The neighbors of a state are new states of the problem that are produced by altering a given state in some well-defined way.
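A minimal sketch of that accept/reject loop in Python; the acceptance rule exp(-dE/T) is the standard one, while the toy energy function, cooling schedule, and parameters below are illustrative assumptions:

```python
import math
import random

def simulated_annealing(energy, neighbor, s0, t0=10.0, cooling=0.99,
                        steps=5000, seed=0):
    """Generic SA loop: always accept improvements; accept worse states
    with probability exp(-dE / T), where the temperature T slowly decreases."""
    rng = random.Random(seed)
    s, t = s0, t0
    best = s
    for _ in range(steps):
        s_new = neighbor(s, rng)
        d_e = energy(s_new) - energy(s)
        if d_e <= 0 or rng.random() < math.exp(-d_e / t):
            s = s_new
        if energy(s) < energy(best):
            best = s
        t *= cooling  # slow cooling: worse moves become ever less likely
    return best

# Toy problem: minimize (x - 3)^2 over the integers, moving +/-1 at a time
best = simulated_annealing(
    energy=lambda x: (x - 3) ** 2,
    neighbor=lambda x, rng: x + rng.choice([-1, 1]),
    s0=50)
print(best)
```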

Natural Algorithms
 Simulated Annealing – Example: TSP
 For example, in the traveling salesman problem each state is typically defined as a permutation of the cities to be visited.
 The neighbors of a state are the set of permutations that are produced, for example, by reversing the order of any two successive cities.
 The well-defined way in which the states are altered to find neighboring states is called a "move", and different moves give different sets of neighboring states.
 These moves usually result in minimal alterations of the last state, to help the algorithm keep the better parts of the solution and change only the worse parts.
 In the traveling salesman problem, the parts of the solution are the city connections.
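The "move" described here – reversing the order of two successive cities – might be sketched as (city names are placeholders):

```python
import random

def neighbor_tour(tour, rng=random):
    """Produce a neighboring tour by swapping two successive cities."""
    i = rng.randrange(len(tour) - 1)  # pick an adjacent pair at random
    new = list(tour)
    new[i], new[i + 1] = new[i + 1], new[i]
    return new

print(neighbor_tour(["A", "B", "C", "D"], random.Random(0)))
```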

Natural Algorithms
 Simulated Annealing – Example: TSP (contd.)
 Searching for neighbors of a state is fundamental to optimization because the final solution comes after a tour of successive neighbors.
 Simple heuristics move by finding best neighbor after best neighbor, and stop when they reach a solution that has no better neighbors. The problem with this approach is that the neighbors of a state are not guaranteed to contain any of the existing better solutions, which means that failure to find a better solution among them does not guarantee that no better solution exists.
 This is why the best solution found by such algorithms is called a local optimum, in contrast with the actual best solution, which is called a global optimum.

Natural Algorithms
 Simulated Annealing – Example: TSP (contd.)
 Metaheuristics use the neighbors of a solution as a way to explore the solution space, and although they prefer better neighbors, they also accept worse neighbors in order to avoid getting stuck in local optima. As a result, if the algorithm is run for an infinite amount of time, the global optimum will be found.
 A metaheuristic is a higher-level procedure or heuristic designed to find, generate, or select a lower-level procedure or heuristic (partial search algorithm) that may provide a sufficiently good solution to an optimization problem, especially with incomplete or imperfect information or limited computation capacity.

Natural Algorithms
 Artificial Neural Networks
 An artificial neural network (ANN) is a network of many simple processors ("units"), each possibly having a small amount of local memory. The units are connected by communication channels.
 The idea is inspired by biological neural networks of the central nervous system, particularly the brain.
 The units operate only on their local data and on the inputs they receive via the connections.
 ANNs have some sort of "training" rule whereby the weights of connections are adjusted on the basis of data.
 In other words, ANNs "learn" from examples and exhibit some capability for generalization beyond the training data.

Natural Algorithms
 Artificial Neural Networks – Example
 For example, a neural network for handwriting recognition is defined by a set of input neurons which may be activated by the pixels of an input image.
 After being weighted and transformed by a function (determined by the network's designer), the activations of these neurons are passed on to other neurons.
 This process is repeated until, finally, an output neuron is activated. This determines which character was read.
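The "weighted and transformed by a function" step is just a weighted sum passed through an activation function. A single-neuron sketch, where the pixel inputs, weights, bias, and the choice of a sigmoid activation are all illustrative assumptions:

```python
import math

def neuron(inputs, weights, bias):
    """One artificial neuron: weighted sum of inputs through a sigmoid."""
    z = sum(x * w for x, w in zip(inputs, weights)) + bias
    return 1.0 / (1.0 + math.exp(-z))  # sigmoid squashes z into (0, 1)

# Hypothetical pixel intensities and (already trained) weights
activation = neuron(inputs=[0.0, 1.0, 0.5], weights=[0.4, -0.2, 0.9], bias=0.1)
print(round(activation, 3))  # z = 0.35, so sigmoid(0.35) ~= 0.587
```

A full network simply feeds such activations forward as inputs to the next layer of neurons.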

Natural Algorithms
 Artificial Neural Networks – Applications
 Like other machine learning methods (systems that learn from data), neural networks have been used to solve a wide variety of tasks that are hard to solve using ordinary rule-based programming, including computer vision and speech recognition.
 In practice, ANNs are especially useful for classification and function approximation/mapping problems: problems which have lots of training data available, but to which hard-and-fast rules cannot easily be applied.

Randomized Algorithms
 Introduction
 A randomized algorithm employs a degree of randomness during its execution to determine what to do next (for example, flipping a coin).
 When considering a randomized algorithm, we usually care about its expected worst-case performance: the average amount of time it takes on the worst input of a given size.
 This average is computed over all possible outcomes of the coin flips during the execution of the algorithm.
 In studying randomized algorithms, we consider how to design a good randomized algorithm and how to prove that it works within given time or error bounds.
 The main difference is that it is often easier to design a randomized algorithm – randomness turns out to be a good substitute for cleverness more often than one might expect – but harder to analyze it.
 So much of the work consists of developing good techniques for analyzing the often very complex random processes that arise in the execution of an algorithm.

Randomized Algorithms
 Randomized Algorithm
 In addition to the input, the algorithm uses a source of pseudo-random numbers.
 During execution, it makes random choices depending on those random numbers.
 The behaviour (output) can vary if the algorithm is run multiple times on the same input.
(Diagram: the INPUT and the RANDOM NUMBERS feed into the Algorithm, which produces the OUTPUT.)

Randomized Algorithms
 A Trivial Example
 Suppose we have two boxes. Inside one box is a valuable prize; inside the other is nothing.
 Our goal is to obtain the prize after opening the fewest possible boxes.
 A deterministic algorithm tries one box, then the next.
 In the worst case, two boxes are opened.
 In the average case, if we assume that both boxes are equally likely to hide the prize, half the time we open only one box and half the time we open both, for an expected 1.5 boxes.
 We can obtain the same expected time even in the worst case by flipping a coin ourselves to decide which box to open first.
 This gives a randomized algorithm, and because we flip the coin (instead of nature, as in the average-case analysis), we can guarantee the good expected performance no matter what the person hiding the prize does.
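The coin-flip strategy can be checked by simulation; this sketch assumes the adversary hides the prize in one fixed box, which is the worst it can do against a deterministic strategy but makes no difference against the coin flip:

```python
import random

def boxes_opened(prize_box, rng):
    """Flip a fair coin to pick the first box; count how many boxes we open."""
    first = rng.randint(0, 1)
    return 1 if first == prize_box else 2

rng = random.Random(42)
trials = 100_000
# Adversary always hides the prize in box 1
avg = sum(boxes_opened(1, rng) for _ in range(trials)) / trials
print(avg)  # close to the expected 1.5 boxes
```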

Randomized Algorithms
 Basic Types of Randomized Algorithms
 Monte Carlo algorithm
 Las Vegas algorithm
 Monte Carlo Algorithm
 A Monte Carlo algorithm uses randomness, and its answer is guaranteed to be correct only most of the time.
 Las Vegas Algorithm
 A Las Vegas algorithm uses randomness and its answer is guaranteed to be correct, but its running time is polynomial only on average.
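A classic illustration of the two types is searching an array in which half the entries are 1s; the array contents and probe budget below are illustrative:

```python
import random

ARRAY = [0, 1] * 8  # illustrative input: half the entries are 1s

def find_one_las_vegas(arr, rng):
    """Las Vegas: the answer is always correct; only the running time is
    random (expected number of probes here is 2)."""
    while True:
        i = rng.randrange(len(arr))
        if arr[i] == 1:
            return i

def find_one_monte_carlo(arr, rng, k=10):
    """Monte Carlo: at most k probes, so the running time is bounded, but it
    may fail (return None) with probability 2**-k."""
    for _ in range(k):
        i = rng.randrange(len(arr))
        if arr[i] == 1:
            return i
    return None

rng = random.Random(1)
print(ARRAY[find_one_las_vegas(ARRAY, rng)])  # always prints 1
idx = find_one_monte_carlo(ARRAY, rng)
print(idx is None or ARRAY[idx] == 1)  # always prints True
```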

Randomized Algorithms
 Reasons for Using Randomized Algorithms
 Simplicity – randomized algorithms are usually much simpler than deterministic algorithms for the same problem
 Speed – they are often much faster and offer the possibility of polynomial-time algorithms
 De-randomization – there is the possibility, at least for some algorithms, to de-randomize them and get a deterministic algorithm
 Various approaches – there are various approaches, or paradigms, available for designing a randomized algorithm for a given problem

Summary
 Natural Algorithms
 Genetic Algorithm
 Simulated Annealing
 Artificial Neural Networks
 Randomized Algorithms
 Monte Carlo Algorithm
 Las Vegas Algorithm
 Reasons for Using Randomized Algorithms