
Last lecture summary

SOM
– supervised × unsupervised?
– regression × classification?
– Topology? Main features? Codebook vector? Output from the neuron?

Competitive learning, BMU, scaling. Which neuron gets updated? How will it be updated? Topology preservation, neighborhood.

Variable parameters
– NS (neighborhood strength)
– neighborhood size
– learning rate
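A minimal sketch of how these parameters drive one training step, assuming a 2-D rectangular grid and a Gaussian neighborhood strength (the function name, array shapes, and the `sigma` parameter controlling neighborhood size are illustrative):

```python
import numpy as np

def som_update(weights, x, bmu_idx, lr, sigma):
    """One SOM training step on a rows x cols grid of codebook vectors:
    every neuron moves toward input x, scaled by the learning rate lr and
    a Gaussian neighborhood strength that decays with grid distance to the BMU."""
    rows, cols, _ = weights.shape
    bmu = np.array(bmu_idx)
    for i in range(rows):
        for j in range(cols):
            grid_dist2 = np.sum((np.array([i, j]) - bmu) ** 2)
            ns = np.exp(-grid_dist2 / (2 * sigma ** 2))   # neighborhood strength
            weights[i, j] += lr * ns * (x - weights[i, j])
    return weights
```

In practice both `lr` and `sigma` are decayed over training: a wide neighborhood first establishes the topology, then a shrinking one fine-tunes individual codebook vectors.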

Multidimensional data: IRIS (attributes: sepal length, sepal width, petal length, petal width)

Since we have class labels, we can assess the classification accuracy of the map. First we train the map using all 150 patterns. Then we present the input patterns individually again and note the winning neuron.
– The class to which the input belongs is the class associated with the BMU's codebook vector (see previous slide, Class panel).
– Only the winner decides the classification.
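The winner-takes-all step can be sketched as follows (a sketch; `codebook` holding one codebook vector per neuron and `bmu_labels` holding the class attached to each neuron during training are illustrative names):

```python
import numpy as np

def classify_by_bmu(codebook, bmu_labels, x):
    """Winner-only classification: x gets the class of its best-matching unit."""
    dists = np.linalg.norm(codebook - x, axis=1)  # distance to every codebook vector
    return bmu_labels[int(np.argmin(dists))]      # label attached to the BMU
```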

Only the winner decides the classification:
– Vers (2) – 100% accuracy
– Set (1) – 86%
– Virg (3) – 88%
– Overall accuracy = 91.3%
A neighborhood of size 2 decides the classification:
– Vers (2) – 100% accuracy
– Set (1) – 90%
– Virg (3) – 94%
– Overall accuracy = 94.7%
Sandhya Samarasinghe, Neural Networks for Applied Sciences and Engineering, 2006

U-matrix
Distances between the neighboring codebook vectors can highlight different cluster regions in the map – a useful visualization tool.
Two neurons: w1 = {w11, w21, …, wn1}, w2 = {w12, w22, …, wn2}
Euclidean distance between them: d(w1, w2) = √( Σi (wi1 − wi2)² )
The average of the distances to the nearest neighbors gives the unified distance – the U-matrix.
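A small sketch of this computation for a rectangular grid, assuming a 4-connected neighborhood (the function name and array layout are illustrative):

```python
import numpy as np

def u_matrix(weights):
    """U-matrix: for each neuron, the average Euclidean distance between its
    codebook vector and those of its 4-connected grid neighbours."""
    rows, cols, _ = weights.shape
    U = np.zeros((rows, cols))
    for i in range(rows):
        for j in range(cols):
            dists = []
            for di, dj in ((-1, 0), (1, 0), (0, -1), (0, 1)):
                ni, nj = i + di, j + dj
                if 0 <= ni < rows and 0 <= nj < cols:   # stay inside the grid
                    dists.append(np.linalg.norm(weights[i, j] - weights[ni, nj]))
            U[i, j] = np.mean(dists)
    return U
```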

The larger the distance between neurons, the larger (i.e., the lighter the color) the U value. There is a large distance between this cluster (Iris versicolor) and the middle cluster (Iris setosa): large distances between codebook vectors indicate a sharp boundary between the clusters.

Surface graph: the height represents the distance. 3rd row – large height = separation. The other two clusters are not separated.

Quantization error
A measure of the distance between codebook vectors and inputs. If for input vector x the winner is wc, then the distortion error e can be calculated as e = ‖x − wc‖².
Compute e for all input vectors and take the average – the quantization error, or average map distortion error E.
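The computation can be sketched as below (a sketch assuming the squared Euclidean distance as the distortion measure; the function name is illustrative):

```python
import numpy as np

def quantization_error(codebook, inputs):
    """Average map distortion error E: mean over all inputs of the squared
    distance e = ||x - w_c||^2 to the winning codebook vector w_c."""
    errors = []
    for x in inputs:
        e = np.min(np.sum((codebook - x) ** 2, axis=1))  # distortion error of x
        errors.append(e)
    return np.mean(errors)
```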

Iris quantization error High distortion error indicates areas where the codebook vector is relatively far from the inputs. Such information can be used to refine the map to obtain a more uniform distortion error measure if a more faithful reproduction of the input distribution from the map is desired.

Genetic algorithms (new stuff)

Optimization in DM
f’(x) = 0
f’’(x) > 0 … minimum
f’’(x) < 0 … maximum

Optimization in DM
traditional methods (exact)
– e.g. gradient-based methods
heuristics (approximate)
– deterministic
– stochastic (chance), e.g. genetic algorithms, simulated annealing, ant colony optimization, particle swarm optimization

Optimization in DM
Applications of optimization techniques in DM are numerous.
– Optimize parameters to obtain the best performance.
– Optimize weights in a NN.
– From many features, find the best (small) subset giving the best performance (feature selection).
– …

Biology Inspiration
Every organism has a set of rules describing how that organism is built up from the tiny building blocks of life. These rules are encoded in genes, which are connected together into long strings called chromosomes. A gene's position in the chromosome is its locus; for example, at one locus sits the gene for the color of teeth, and "blue teeth" is one allele (variant) of that gene.
Genes + alleles = genotype. The physical expression of the genotype = phenotype.

When two organisms mate they share their genes. The resultant offspring may end up having half the genes from one parent and half from the other. This process is called recombination (crossover). Very occasionally a gene may be mutated.

Life on earth has evolved through the processes of natural selection, recombination and mutation. The individuals with better traits will survive longer and produce more offspring.
– Their survivability is given by their fitness.
This continues to happen, with the individuals becoming better suited to their environment every generation. It was this continuous improvement that inspired John Holland in the 1970s to create genetic algorithms.

GA step by step
Objective: find the maximum of the function O(x1, x2) = x1² + x2²
– This function is called the objective function.
– It will be used to evaluate the fitness.
Adopted from Genetic Algorithms – A step by step tutorial, Max Moorkap, Barcelona, 29th November 2005

Encoding
The model parameters (x1, x2) are encoded into binary strings. How to encode (and decode back) a real number as a binary string?
– For each real-valued variable x we need to know:
the domain of the variable, x ∈ [xL, xU]
the length of the gene, k

5-bit genes (k = 5), step size = (xU − xL) / (2⁵ − 1) = (xU − xL) / 31
x1 ∈ [−1, 1]: xL = −1, xU = 1, step ≈ 0.0645
x2 ∈ [0, 3.1]: xL = 0, xU = 3.1, step = 0.1
The chromosome c1 = (01011 10011) consists of two genes:
(01011) = 11 → x1 = −1 + 11 · 0.0645 ≈ −0.29
(10011) = 19 → x2 = 0 + 19 · 0.1 = 1.9
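The decoding rule above can be written as a small helper (a sketch; the function name is illustrative):

```python
def decode(bits, x_lo, x_hi):
    """Map a k-bit gene to a real value in [x_lo, x_hi]:
    value = x_lo + (integer value of bits) * step, step = (x_hi - x_lo) / (2^k - 1)."""
    k = len(bits)
    step = (x_hi - x_lo) / (2 ** k - 1)
    return x_lo + int(bits, 2) * step
```

For example, `decode('10011', 0.0, 3.1)` reproduces the x2 = 1.9 worked out above.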

At the start a population of N random models is generated:
c1 = (01011 10011) → x1 = −1 + 11 · 0.0645 ≈ −0.29, x2 = 19 · 0.1 = 1.9
c2 = (11110 10110) → x1 = −1 + 30 · 0.0645 ≈ 0.935, x2 = 22 · 0.1 = 2.2
c3 = (10010 10001) → x1 = −1 + 18 · 0.0645 ≈ 0.161, x2 = 17 · 0.1 = 1.7
c4 = (01101 00001) → x1 = −1 + 13 · 0.0645 ≈ −0.161, x2 = 1 · 0.1 = 0.1

For each member of the population calculate the value of the objective function O(x1, x2) = x1² + x2²:
O1 = O(−0.29, 1.9) = 3.69
O2 = O(0.935, 2.2) = 5.71
O3 = O(0.161, 1.7) = 2.92
O4 = O(−0.161, 0.1) = 0.04
The binary string is the genotype; the decoded (x1, x2) is the phenotype.
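Putting the decoding and the objective together gives a fitness function for a whole chromosome (a sketch; helper names are illustrative):

```python
def decode(bits, x_lo, x_hi):
    """Map a k-bit gene to a real value in [x_lo, x_hi]."""
    return x_lo + int(bits, 2) * (x_hi - x_lo) / (2 ** len(bits) - 1)

def fitness(chromosome):
    """Split a 10-bit chromosome into its two 5-bit genes, decode the
    phenotype (x1, x2) and evaluate the objective O(x1, x2) = x1^2 + x2^2."""
    x1 = decode(chromosome[:5], -1.0, 1.0)
    x2 = decode(chromosome[5:], 0.0, 3.1)
    return x1 ** 2 + x2 ** 2
```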

A chromosome with bigger fitness has a higher probability of being selected for breeding. We will use the following formula: Pi = Oi / Σj Oj
O1 = 3.69, O2 = 5.71, O3 = 2.92, O4 = 0.04, ΣOj = 12.36
P1 = 0.30, P2 = 0.46, P3 = 0.24, P4 = 0.003

Roulette wheel
p1 (30%), p2 (46%), p3 (24%), p4 (0.3%)
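Roulette-wheel selection can be sketched like this: spin a pointer on a wheel whose slice widths are the fitness values (a sketch; the function name is illustrative):

```python
import random

def roulette_select(population, fitnesses):
    """Pick one chromosome with probability proportional to its fitness."""
    total = sum(fitnesses)
    r = random.uniform(0, total)   # where the pointer lands on the wheel
    cum = 0.0
    for chrom, f in zip(population, fitnesses):
        cum += f                    # end of this chromosome's slice
        if r <= cum:
            return chrom
    return population[-1]           # guard against floating-point round-off
```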

Now select two chromosomes according to the roulette wheel.
– Allow the same chromosome to be selected more than once for breeding.
These two chromosomes will: 1. cross over, 2. mutate
Let's say chromosomes c2 = (11110 10110) and c3 = (10010 10001) were selected. With probability Pc these two chromosomes will exchange their parts at a randomly selected locus (crossover point).

[Figures: single-point crossover applied with probability Pc, followed by bitwise mutation with probability Pm]

The crossover point is selected randomly. Pc generally should be high, about 80%–95%.
– If the crossover is not performed, just clone the two parents into the new generation.
Pm should be low, about 0.5%–1%.
– Perform mutation on each of the two offspring at each locus.
A very big population size usually does not improve the performance of the GA.
– Good size: 20–30, though other sizes are sometimes reported as best.
– Depends on the size of the encoded string.
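A sketch of the two operators with the probabilities discussed above (Pc for single-point crossover, Pm for per-locus bit-flip mutation; function names are illustrative):

```python
import random

def crossover(p1, p2, pc=0.9):
    """With probability pc, exchange tails at a random crossover point;
    otherwise clone the two parents unchanged."""
    if random.random() < pc:
        point = random.randint(1, len(p1) - 1)   # randomly chosen locus
        return p1[:point] + p2[point:], p2[:point] + p1[point:]
    return p1, p2

def mutate(chromosome, pm=0.01):
    """Flip each bit independently with probability pm."""
    return ''.join(('1' if b == '0' else '0') if random.random() < pm else b
                   for b in chromosome)
```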

Repeat the previous steps till the size of the new population reaches N.
– The new population replaces the old one.
Each cycle through this algorithm is called a generation.
Check whether the termination criteria have been met, e.g.:
– the change in the mean fitness from generation to generation
– a preset number of generations

1. [Start] Generate a random population of N chromosomes (suitable solutions for the problem)
2. [Fitness] Evaluate the fitness f(x) of each chromosome x in the population
3. [New population] Create a new population by repeating the following steps until the new population is complete:
   1. [Selection] Select two parent chromosomes from the population according to their fitness (the better the fitness, the bigger the chance to be selected)
   2. [Crossover] With a crossover probability, cross over the parents to form new offspring (children). If no crossover was performed, the offspring is an exact copy of the parents.
   3. [Mutation] With a mutation probability, mutate the new offspring at each locus (position in the chromosome)
   4. [Accepting] Place the new offspring in the new population
4. [Replace] Use the newly generated population for a further run of the algorithm
5. [Test] If the end condition is satisfied, stop, and return the best solution in the current population
6. [Loop] Go to step 2
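The whole loop, specialized to the running example O(x1, x2) = x1² + x2² with two 5-bit genes, might look like the sketch below (parameter defaults follow the values suggested earlier; no elitism; all names are illustrative):

```python
import random

def decode(bits, x_lo, x_hi):
    return x_lo + int(bits, 2) * (x_hi - x_lo) / (2 ** len(bits) - 1)

def fitness(c):
    # phenotype (x1, x2); objective O = x1^2 + x2^2
    return decode(c[:5], -1.0, 1.0) ** 2 + decode(c[5:], 0.0, 3.1) ** 2

def select(pop, fits):
    # roulette-wheel selection
    r = random.uniform(0, sum(fits))
    cum = 0.0
    for c, f in zip(pop, fits):
        cum += f
        if r <= cum:
            return c
    return pop[-1]

def mutate(c, pm):
    return ''.join(('1' if b == '0' else '0') if random.random() < pm else b for b in c)

def run_ga(n=20, generations=50, pc=0.9, pm=0.01, length=10):
    # [Start] random population of n chromosomes
    pop = [''.join(random.choice('01') for _ in range(length)) for _ in range(n)]
    for _ in range(generations):
        fits = [fitness(c) for c in pop]                    # [Fitness]
        new_pop = []
        while len(new_pop) < n:
            p1, p2 = select(pop, fits), select(pop, fits)   # [Selection]
            if random.random() < pc:                        # [Crossover]
                pt = random.randint(1, length - 1)
                p1, p2 = p1[:pt] + p2[pt:], p2[:pt] + p1[pt:]
            new_pop += [mutate(p1, pm), mutate(p2, pm)]     # [Mutation], [Accepting]
        pop = new_pop[:n]                                   # [Replace]
    return max(pop, key=fitness)                            # best of final generation
```

The true maximum on this domain is O(±1, 3.1) = 10.61; a run of this sketch typically ends with a population concentrated near that corner, though without elitism the best solution found is not guaranteed to survive into the final generation.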