Robin McDougall and Scott Nokleby
Mechatronic and Robotic Systems Laboratory

Outline
- Context and Background
- Multi-Objective PSO (MOPSO) via Pareto Dominance
- Parallelization of PSO (PAPSO)
- MOPAPSO
- Results

Introduction
- Increasing role for global optimization techniques in engineering design
- Ambition in design leads to more highly-parameterized systems
- More parameters lead to increasingly non-linear objective function surfaces

Introduction
- Particle Swarm Optimization (PSO) is gaining increasing attention in both research and applications
- Over time, a number of variants have been used with great success
- Many operate at the "particle" level: particle accelerations, transient social and personal weights, dynamic forms, and many more

Motivation
- Optimization-Based Mechanism Synthesis (OBMS): use optimization techniques to choose the parameters of a highly parameterized system
- More parameters typically lead to more non-linear objective function surfaces, whose effects can confound traditional optimization techniques

Motivation
- Replace the previously used deterministic techniques with a global optimization technique:
  - No need for parameter transforms
  - No need for "pseudo-global" techniques
  - Prevents artificial constriction of the search space

Motivation
- A typical OBMS objective could be to design a system to follow a given path…

Motivation
- OBMS with PSO synthesized mechanisms that could fulfill this task better than deterministic algorithms

Motivation
- With one major caveat: PSO took hours instead of minutes

Objectives
- Use Multi-Objective PSO (MOPSO) to handle multi-objective problem specifications
- Use Parallel Asynchronous PSO (PAPSO) to speed things up
- Both topics are well covered in the literature individually, but there is little mention of combining the two

Multi-Objective Optimization
- Engineering design choices often involve balancing competing objectives:
  - Cost vs. Performance
  - Size vs. Strength
  - Effectiveness vs. Efficiency
- What options are available to deal with these competing objectives?

Multi-Objective Optimization
- Could use a weighting scheme
- Concerns:
  - With no prior knowledge, how do you select the weights?
  - Potential to unfairly influence the optimization before execution begins
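The weighting concern can be made concrete with a toy example (hypothetical designs and weight values, not from the presentation): the same three candidate designs produce different "best" answers depending purely on the weights fixed before the run.

```python
# Hypothetical two-objective design space: each design maps to (f1, f2),
# e.g. cost and mass, both to be minimized.
designs = {"A": (1.0, 9.0), "B": (5.0, 5.0), "C": (9.0, 1.0)}

def weighted_score(objectives, w1, w2):
    """Collapse two objective values into one scalar (lower is better)."""
    f1, f2 = objectives
    return w1 * f1 + w2 * f2

# Emphasizing the first objective picks one design...
best_under_w1 = min(designs, key=lambda d: weighted_score(designs[d], 0.9, 0.1))
# ...while emphasizing the second picks a different one.
best_under_w2 = min(designs, key=lambda d: weighted_score(designs[d], 0.1, 0.9))
```

The optimizer never sees the trade-off; the weights decided it in advance, which is exactly the influence the slide warns about.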

Multi-Objective Optimization
- Rather than collapsing the objectives into a single weighted function, we would like to shift the decision on how influential each objective will be to after the optimization effort, instead of beforehand

Multi-Objective Optimization
- MOPSO uses Pareto dominance to determine a set of solutions for two or more competing objectives
- Each point in the optimal set constitutes a non-dominated solution
- Systems with two objective functions form a front; with more, a hyper-surface

Multi-Objective Optimization
- Imagine a system with two objective functions: instead of a single optimum solution, MOPSO delivers a front of non-dominated solutions
- Each point on the front represents the best possible solution for a given objective function with respect to the other objective functions
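The non-dominated front can be computed with a short filter (an illustrative minimization sketch with made-up candidate points, not the presentation's implementation):

```python
def dominates(a, b):
    """True if a Pareto-dominates b (minimization): a is no worse in every
    objective and strictly better in at least one."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def non_dominated(points):
    """Return the points not dominated by any other point."""
    return [p for p in points if not any(dominates(q, p) for q in points if q != p)]

# Two-objective candidates (f1, f2), both to be minimized.
candidates = [(1, 5), (2, 3), (3, 4), (4, 1), (5, 5)]
front = non_dominated(candidates)   # (3, 4) and (5, 5) are dominated
```

Here (3, 4) is dominated by (2, 3) and (5, 5) by every other point; the surviving points are exactly the trade-offs where neither objective can improve without worsening the other.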

MOPSO
- MOPSO requires two significant changes to the basic form of PSO:
  - Creation and active maintenance of a repository to collect the non-dominated candidate solutions
  - Modification of the basic form of the velocity equation to choose a social leader from this repository instead of a global best
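The modified velocity update can be sketched as follows (a minimal illustration; the coefficients w, c1, c2 are placeholder values, and `leader` stands for a repository member supplied by whatever selection scheme is in use):

```python
import random

# Placeholder PSO coefficients, not the values used in the presentation.
w, c1, c2 = 0.7, 1.5, 1.5

def update_velocity(velocity, position, personal_best, leader):
    """Per-dimension PSO velocity update; the social term pulls toward a
    leader drawn from the repository of non-dominated solutions rather
    than a single global best."""
    return [
        w * v
        + c1 * random.random() * (pb - x)   # cognitive pull toward personal best
        + c2 * random.random() * (ld - x)   # social pull toward repository leader
        for v, x, pb, ld in zip(velocity, position, personal_best, leader)
    ]
```

Only the social term changes relative to standard PSO; the inertia and cognitive terms are untouched.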

MOPSO
- There is no longer a single social leader (SL) in MOPSO
- Instead, a particle from the repository must be chosen to serve as the SL
- A weighted roulette-wheel procedure selects the SL, biased towards sparsely populated regions of the emerging front
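One way such a biased roulette wheel could look (an illustrative simplification that bins the repository on the first objective only; the actual implementation may grid the full objective space):

```python
import random
from collections import defaultdict

def select_leader(repository, n_bins=5, rng=random):
    """Choose a social leader from the repository with a roulette wheel
    biased toward sparsely populated regions of the emerging front."""
    lo = min(f1 for f1, _ in repository)
    hi = max(f1 for f1, _ in repository)
    width = (hi - lo) / n_bins or 1.0          # guard against a degenerate range
    bins = defaultdict(list)
    for point in repository:
        idx = min(int((point[0] - lo) / width), n_bins - 1)
        bins[idx].append(point)
    # Fewer occupants in a bin -> a larger slice of the roulette wheel.
    groups = list(bins.values())
    weights = [1.0 / len(g) for g in groups]
    chosen_bin = rng.choices(groups, weights=weights, k=1)[0]
    return rng.choice(chosen_bin)
```

Crowded stretches of the front thus contribute less pull per member, which spreads the swarm along the front instead of clustering it.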

PAPSO
- Reduce runtime by performing swarm activities simultaneously
- PSO lends itself well to parallelization: fitness, velocity, and position updates are independent for each particle
- Processors:
  - A master processor administrates the swarm
  - Slave processors perform objective function evaluations and particle updates

PAPSO
- Notes:
  - If the number of particles exceeds the number of processors, particles wait in a FIFO queue
  - The asynchronous design mitigates the negative performance effects caused by runtime variability
  - Runtime improvement is proportional to the ratio of objective function calculation time to network transmission time
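The FIFO queue and asynchronous dispatch can be sketched with threads (a simplified stand-in for the MPI-based master/slave arrangement; `evaluate` is a toy objective function):

```python
import concurrent.futures as cf
from queue import Queue, Empty

def evaluate(particle):
    """Toy objective function standing in for an expensive simulation."""
    return sum(x * x for x in particle)

def run_asynchronously(particles, n_workers=2):
    """Dispatch particle evaluations from a FIFO queue: each worker grabs
    the next waiting particle the moment it is free, so one slow
    evaluation does not stall the rest of the swarm."""
    todo = Queue()
    for p in particles:                 # FIFO particle queue
        todo.put(p)
    results = {}

    def worker():
        while True:
            try:
                p = todo.get_nowait()   # next particle job, or finished
            except Empty:
                return
            results[p] = evaluate(p)    # objective function evaluation

    with cf.ThreadPoolExecutor(max_workers=n_workers) as pool:
        for _ in range(n_workers):
            pool.submit(worker)
    return results
```

Because workers pull jobs rather than waiting for a synchronized iteration boundary, a particle whose evaluation runs long delays only itself, which is the runtime-variability benefit the slide describes.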

MOPAPSO
- The idea is to combine these variants:
  - MOPSO to provide formal multi-objective support
  - PAPSO to speed things up
- Requirements:
  - Should match MOPSO results
  - Should reduce overall runtime

MOPAPSO
Two roles for processors:
- One Master:
  - Initializes the swarm
  - Creates a FIFO particle queue
  - Dispatches the first "N" jobs
  - Catches updated particle specs
  - Dispatches the next particle job
  - Updates the repository every "m" iterations
- "N" Slaves:
  - Catch GBEST, PPOS, and PVEL
  - Update velocity
  - Calculate OFs for each objective
  - Return OFs, PPOS, and PVEL

MOPAPSO – Benchmark Tests
- To test the effectiveness of the MOPAPSO implementation, two MOPSO benchmarks from the literature were used before applying it to OBMS
- Configuration:
  - Nine-node grid running Rocks Cluster Distribution
  - 5 dual-core 2.0 GHz processors with 1 GB RAM each
  - acslX Interpconsole v2.4.1 using MPICH2
  - 40 particles, 100 "iterations" (100 x 40 = 4,000 updates)

MOPAPSO – Benchmark One

MOPAPSO – Benchmark One

MOPAPSO – Benchmark Two

MOPAPSO – Benchmark Two

OBMS Example

Objective Functions

Results

Runtime Improvements

Conclusions
- A high-level implementation of MOPAPSO has been developed
- Testing on two benchmark problems showed that MOPAPSO can readily locate Pareto fronts for multi-objective problems
- MOPAPSO effectively solved OBMS problems featuring multiple objectives

Acknowledgements
Ontario Research Fund