
Particle Swarm Optimization Using the HP Prime
Presented by Namir Shammas

Dedication
I dedicate this tutorial, with an immense sense of gratitude and thankfulness, to the fine Boston College Jesuits who spent decades working in Iraq to educate students at Baghdad College and Al-Hikmat University. They instilled in us the power to be self-taught! No words can fully thank them for that gift alone.

Particle Swarm Optimization
An efficient method for stochastic optimization. It is based on the movement and collective intelligence of swarms in the search space to locate the best solution. It was developed by James Kennedy and Russell Eberhart in 1995. The algorithm simulates particles (agents) that make up a swarm moving throughout the search space looking for the best solution.

Particle Swarm Optimization
Every particle represents a point flying in a multi-dimensional space. Each particle adjusts its flight influenced by its own experience and, in addition, by the experience of other particles. Each particle keeps track of its personal best (lowest/highest function value), partBest(i). The particles also keep track of the global best, globBest.

Particle Swarm Optimization
[Diagram: velocity-update vectors]
X(k) is the current point position. X(k+1) is the updated position. V(k) is the current velocity. V(k+1) is the updated velocity. V(partBest) is the velocity component directed toward the particle's personal best. V(globBest) is the velocity component directed toward the global best.

Particle Swarm Optimization
V(k+1) is computed using the following equations:

V(i,k+1) = w * V(i,k) + C1 * r1 * (partBest(i) − X(i,k)) + C2 * r2 * (globBest − X(i,k))
X(i,k+1) = X(i,k) + V(i,k+1)

C1 and C2 are constants. The inertia weight w decreases in value with each iteration; decreasing it from 1 to 0.3 is recommended. r1 and r2 are uniform random numbers in the range [0, 1].
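To make the update concrete, here is a minimal Python sketch of one particle's update step (an illustration only, not the author's HP Prime code; the function name pso_step and the default values C1 = C2 = 2 are assumptions):

import random

def pso_step(X, V, part_best, glob_best, w, c1=2.0, c2=2.0):
    # One PSO update for a single particle.
    # X, V: current position and velocity (lists of floats).
    # part_best: this particle's best-known position.
    # glob_best: the swarm's best-known position.
    for j in range(len(X)):
        r1, r2 = random.random(), random.random()
        # Velocity update: inertia term + cognitive pull + social pull.
        V[j] = (w * V[j]
                + c1 * r1 * (part_best[j] - X[j])
                + c2 * r2 * (glob_best[j] - X[j]))
        # Position update.
        X[j] = X[j] + V[j]
    return X, V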

Particle Swarm Optimization
Easy to code. Requires fewer input parameters, code lines, and subroutines than GA. Requires smaller population sizes than GA. Requires fewer generations/iterations than GA. Does not require a restart! Often returns accurate answers!

HP Prime Implementation
Function PSO returns the best solution for the optimized function MyFx. The parameters for PSO are: the number of variables, the population size (number of particles), the maximum number of generations (i.e. iterations), and two arrays that define the lower and upper bounds for each variable.

HP Prime Implementation
To locate the optimum of the Rosenbrock function, call PSO with the following arguments: number of variables = 4, population size of 40, a maximum number of generations, and vectors of lower and upper bounds for the variables. Store the results in matrix M3.
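For reference, here is a minimal Python sketch of the generalized Rosenbrock objective; the commented call below uses illustrative, assumed values for the bounds and generation count, and a hypothetical pso function signature:

def rosenbrock(x):
    # Generalized Rosenbrock function; global minimum of 0 at (1, 1, ..., 1).
    return sum(100.0 * (x[i + 1] - x[i] ** 2) ** 2 + (1.0 - x[i]) ** 2
               for i in range(len(x) - 1))

# Hypothetical call mirroring the slide's setup (bounds and generations assumed):
# best = pso(n_vars=4, pop_size=40, max_gens=100,
#            lower=[-5.0] * 4, upper=[5.0] * 4, fx=rosenbrock)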

GCPSO
An improved version of PSO that guarantees convergence (hence the GC in the abbreviation) by keeping track of the best point. It is similar to PSO, but it updates the velocity of the best particle using:

V(i,j) = −X(i,j) + globPop(j) + w * V(i,j) + ρ * (1 − 2 * Rand)

w is a constant, Rand is a random number in [0, 1], and ρ is a dynamically adjusted scaling factor. The indices i and j select the population member and the variable, respectively. The scaling factor ρ is doubled after 5 sequential successes in finding a better candidate for the optimum function value, and halved after 5 sequential failures.
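A minimal Python sketch of this best-particle update and the ρ adaptation (illustrative only; the function names and the success/failure bookkeeping are assumptions):

import random

def gcpso_best_update(X, V, glob_best, w, rho):
    # GCPSO update for the best particle: pull it back to the global best
    # position, then perturb it randomly within a radius controlled by rho.
    for j in range(len(X)):
        V[j] = -X[j] + glob_best[j] + w * V[j] + rho * (1.0 - 2.0 * random.random())
        X[j] = X[j] + V[j]
    return X, V

def adapt_rho(rho, successes, failures, threshold=5):
    # Double rho after `threshold` sequential successes; halve it after
    # `threshold` sequential failures.
    if successes >= threshold:
        return rho * 2.0
    if failures >= threshold:
        return rho / 2.0
    return rho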

HP Prime Implementation
Function GCPSO returns the best solution for the optimized function MyFx. The parameters for GCPSO are: the number of variables, the population size (number of particles), the maximum number of generations (i.e. iterations), and two arrays that define the lower and upper bounds for each variable.

HP Prime Implementation
To locate the optimum of the Rosenbrock function, call GCPSO with the following arguments: number of variables = 4, population size of 40, a maximum number of generations, and vectors of lower and upper bounds for the variables. Store the results in matrix M3.

BONUS MATERIAL!

Differential Evolution
A very simple and very efficient family of evolutionary algorithms, developed by Storn and Price in the mid-1990s. I present the simplest scheme, called DE/rand/1; there are at least 9 more schemes! It uses a simple equation to calculate a potential replacement for the current position:

Xtrl(j) = X(c,j) + F * (X(a,j) − X(b,j))

where a, b, and c are distinct indices (also not equal to the current population index) in the range [1, MaxPop], and j is the index of a variable. Old positions are replaced by better-fit candidates.
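A minimal Python sketch of one DE/rand/1 generation, following the simplified scheme above (no crossover step; the function name and the default F value are assumptions):

import random

def de_rand_1(pop, fitness, fx, f=0.8):
    # One DE/rand/1 generation over pop (a list of position lists),
    # minimizing fx. Each member is challenged by a trial vector built
    # from three distinct other members a, b, c.
    n = len(pop)
    for i in range(n):
        a, b, c = random.sample([k for k in range(n) if k != i], 3)
        trial = [pop[c][j] + f * (pop[a][j] - pop[b][j])
                 for j in range(len(pop[i]))]
        trial_fit = fx(trial)
        if trial_fit < fitness[i]:  # keep the better-fit candidate
            pop[i], fitness[i] = trial, trial_fit
    return pop, fitness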

HP Prime Implementation
Function EA returns the best solution for the optimized function MyFx. The parameters for EA are: the number of variables, the population size (number of probes), the maximum number of generations (i.e. iterations), and two arrays that define the lower and upper bounds for each variable.

HP Prime Implementation
To locate the optimum of the Rosenbrock function, call EA with the following arguments: number of variables = 4, population size of 40, a maximum number of generations, and vectors of lower and upper bounds for the variables. Store the results in matrix M3.
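A hypothetical driver loop in the same Python sketch style, reusing the de_rand_1 and rosenbrock sketches above (the bounds, generation count, and function names are illustrative assumptions, not the HP Prime EA program):

import random

def run_de(fx, n_vars=4, pop_size=40, max_gens=100,
           lower=-5.0, upper=5.0, f=0.8):
    # Randomly initialize a population within the bounds, then run
    # DE/rand/1 generations and return the best member found.
    pop = [[random.uniform(lower, upper) for _ in range(n_vars)]
           for _ in range(pop_size)]
    fitness = [fx(ind) for ind in pop]
    for _ in range(max_gens):
        pop, fitness = de_rand_1(pop, fitness, fx, f=f)
    best = min(range(pop_size), key=lambda i: fitness[i])
    return pop[best], fitness[best]

# Example: best_x, best_f = run_de(rosenbrock)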

HP Prime Implementation
Paraphrasing actor Steve Buscemi in Escape from L.A.: "This crowd loves a winner!"

Final Remarks
Evolutionary algorithms for optimization are a mix of science, experimentation, and art. There is a large number of variants of the various evolutionary algorithms; their number far exceeds the number of classical optimization methods and their variants. The complexity of an algorithm does not always equal superior performance: some cleverly designed simple algorithms can perform very well.

No Free Lunch in Search and Optimization
There is no single algorithm that can solve all optimization problems! The principle is abbreviated as NFL; the concept was proposed by Wolpert and Macready in 1997. You will need to experiment with several algorithms to find the best one for your problem. Given an algorithm A, researchers have found that A may succeed with p% of problems, while some other algorithms, collectively called B, will solve the remaining (100 − p)% of problems. B may represent one or more algorithms.

No Free Lunch in Search and Optimization (cont.)
See the Wikipedia article "No free lunch in search and optimization".

References
Jason Brownlee, Clever Algorithms: Nature-Inspired Programming Recipes, 1st ed., 2011 (available for online reading).
Fred Glover and Gary A. Kochenberger (eds.), Handbook of Metaheuristics, 1st ed., 2003.
Maurice Clerc, Particle Swarm Optimization, 1st ed., 2006.
Randy L. Haupt and Sue Ellen Haupt, Practical Genetic Algorithms, 2nd ed., 2004.
Dan Simon, Evolutionary Optimization Algorithms, 1st ed., 2015.

Thank You!!