Ants in the Pants! An Overview
Real world insect examples
Theory of Swarm Intelligence
From insects to realistic A.I. algorithms
Examples of AI applications

Bees
Colony cooperation
Regulate hive temperature
Efficiency via specialization: division of labour in the colony
Communication: food sources are exploited according to quality and distance from the hive

Termites
Cone-shaped outer walls and ventilation ducts
Brood chambers in the central hive
Spiral cooling vents
Support pillars

Ants
Organize highways to and from their foraging sites by leaving pheromone trails
Form chains from their own bodies to create bridges, pulling and holding leaves together with silk
Division of labour between major and minor ants

Social Insects
Problem-solving benefits include:
Flexibility
Robustness
Decentralization
Self-organization

Summary of Insects
The complexity and sophistication of self-organization is achieved with no clear leader.
What we learn about social insects can be applied to the field of intelligent system design.
Modeling social insects by means of self-organization can help us design artificial distributed problem-solving devices, also known as swarm intelligent systems.

Interrupt The Flow

The Path Thickens!

The New Shortest Path

Adapting to Environment Changes

Four Ingredients of Self-Organization
Positive feedback
Negative feedback
Amplification of fluctuations (randomness)
Reliance on multiple interactions

Types of Interactions for Social Insects
Direct interactions: food/liquid exchange, visual contact, chemical contact (pheromones)
Indirect interactions (stigmergy): individual behavior modifies the environment, which in turn modifies the behavior of other individuals

Web Clustering
Why? The web has been doubling in size every year, with an estimated 2.1 billion pages as of July 2001. Organizing and categorizing documents by hand does not scale to the growth of the internet.
Document clustering is the operation of grouping similar documents into classes that can be used to analyze their content.
An ant clustering algorithm can categorize web documents into different interest domains.

Ant Colony Models for Data Clustering
Data clustering is the task of identifying groups of similar objects based on the values of their attributes.
Messor sancta ants collect and pile dead corpses to form "cemeteries" (Deneubourg et al.).
f: fraction of items in the neighborhood of the agent
k1, k2: threshold constants
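The pick-up and drop probabilities built from f, k1 and k2 are commonly written as (k1/(k1+f))^2 and (f/(k2+f))^2; the sketch below assumes that form, and the default threshold values are illustrative only, not values from the slides.

```python
def pick_up_probability(f, k1=0.10):
    """Probability that an unladen ant picks up an item; high when the
    perceived fraction f of similar items nearby is low (isolated item)."""
    return (k1 / (k1 + f)) ** 2

def drop_probability(f, k2=0.15):
    """Probability that a laden ant drops its item; high when the
    perceived fraction f of similar items nearby is high (dense cluster)."""
    return (f / (k2 + f)) ** 2
```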

Ant Colony Models for Data Clustering
The model was later extended by Lumer & Faieta to include a distance function d between data objects.
c is a cell, N(c) is the number of adjacent cells of c, and alpha is a constant.
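The Lumer & Faieta neighbourhood term is usually written as f(o_i) = max(0, (1/N(c)) * sum over o_j of (1 - d(o_i, o_j)/alpha)); the sketch below assumes that form, with d, alpha and the neighbourhood contents supplied by the caller.

```python
def local_density(o_i, neighbourhood_items, d, alpha, n_cells):
    """Perceived density f(o_i) of items similar to o_i in the surrounding
    cells: average of (1 - d/alpha) over the neighbourhood, clamped at 0."""
    total = sum(1.0 - d(o_i, o_j) / alpha for o_j in neighbourhood_items)
    return max(0.0, total / n_cells)
```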

Homogeneous Multi-agent System for Document Clustering
Main components: a colony of agents, feature vectors of web documents, and a 2D grid.
Rules: an agent moves one step at a time to an adjacent cell. Only a single agent and/or a single item may occupy a cell at a time. Items are picked up or dropped based on the probabilities P_p and P_d.
N(c) = 8; o_i is the item at cell i; g(o_i) measures the similarity (local density) of o_i with respect to the other items o_j, where j ∈ N(c). A sketch of one agent step is given below.
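A minimal sketch of one agent step under these rules, assuming a torus stored as plain dicts and pick/drop probabilities like the ones above; every name here (GRID, items, occupied, density, ...) is an illustrative assumption rather than the authors' implementation.

```python
import random

GRID = 30  # toroidal grid size (the experiments later use a 30x30 grid)

def neighbours(cell):
    """The 8 cells adjacent to `cell` on the torus (N(c) = 8)."""
    x, y = cell
    return [((x + dx) % GRID, (y + dy) % GRID)
            for dx in (-1, 0, 1) for dy in (-1, 0, 1) if (dx, dy) != (0, 0)]

def agent_step(agent, items, occupied, density, p_pick, p_drop):
    # Move one step to a randomly chosen adjacent cell, if no other agent is there.
    target = random.choice(neighbours(agent["cell"]))
    if target not in occupied:
        occupied.discard(agent["cell"])
        occupied.add(target)
        agent["cell"] = target
    cell, item = agent["cell"], items.get(agent["cell"])
    if agent["carrying"] is None and item is not None:
        if random.random() < p_pick(density(item, cell)):      # P_p
            agent["carrying"] = items.pop(cell)
    elif agent["carrying"] is not None and item is None:
        if random.random() < p_drop(density(agent["carrying"], cell)):  # P_d
            items[cell] = agent["carrying"]
            agent["carrying"] = None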

Homogeneous Multi-agent System for Document Clustering
Similarity measure: r is the number of common terms in doc_i and doc_j; m and n are the total numbers of terms in doc_i and doc_j, respectively; F is the term frequency.
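The exact similarity formula did not survive the transcript, so the sketch below uses a simple frequency-weighted term overlap normalised by m + n, purely as a stand-in built from the quantities named above.

```python
from collections import Counter

def doc_similarity(doc_i, doc_j):
    """Frequency-weighted overlap of the r shared terms, normalised by the
    total document lengths m + n (doc_i and doc_j are lists of terms)."""
    f_i, f_j = Counter(doc_i), Counter(doc_j)
    shared = set(f_i) & set(f_j)                        # the r common terms
    overlap = sum(min(f_i[t], f_j[t]) for t in shared)  # frequency-weighted (F)
    return 2.0 * overlap / (len(doc_i) + len(doc_j))    # m + n
```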

Homogeneous Multi-agent System for Document Clustering

Experimental Results
Experimental data: 84 web pages from 4 categories: Business, Computer, Health, and Science. These web pages contain 17,776 distinct words.
Setup: a 30x30 toroidal grid with 15 agents; t_max is 300,000; k1 and k2 are varied in [0.01, 0.2] in increments of 0.05 for each run.

Experimental Results t = 0

Experimental Results t = 50,000

Experimental Results t = 200,000

Experimental Results t = 300,000

Experimental Result Table

Particle Swarm Optimization
Particle Swarm Optimization (PSO) applies the concept of social interaction to problem solving. It was developed in 1995 by James Kennedy and Russ Eberhart [Kennedy, J. and Eberhart, R. (1995). "Particle Swarm Optimization", Proceedings of the 1995 IEEE International Conference on Neural Networks, IEEE Press.]
It has been applied successfully to a wide variety of search and optimization problems.
In PSO, a swarm of n individuals communicate search directions (gradients) to one another, either directly or indirectly.
PSO is a simple but powerful search technique.

Particle Swarm Optimization: Swarm Topology
Two basic topologies have been used in the PSO literature:
Ring topology (neighborhood of 3)
Star topology (global neighborhood)
[Diagrams: five particles I0 to I4 connected in a ring and in a star.]
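A small sketch of how the two topologies choose the "best" a particle listens to, assuming particle objects with p and p_fitness attributes that have already been evaluated, and that higher fitness is better; both are assumptions for illustration.

```python
def ring_best(swarm, i):
    """Best personal best among particle i and its two ring neighbours."""
    n = len(swarm)
    neighbourhood = (swarm[(i - 1) % n], swarm[i], swarm[(i + 1) % n])
    return max(neighbourhood, key=lambda q: q.p_fitness).p

def star_best(swarm):
    """Best personal best in the whole swarm (global neighbourhood)."""
    return max(swarm, key=lambda q: q.p_fitness).p
```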

Particle Swarm Optimization: The Anatomy of a Particle
A particle (individual) is composed of:
Three vectors: the x-vector records the current position (location) of the particle in the search space; the p-vector records the location of the best solution found so far by the particle; and the v-vector contains a gradient (direction) along which the particle will travel if undisturbed.
Two fitness values: the x-fitness records the fitness of the x-vector, and the p-fitness records the fitness of the p-vector.
[Diagram: a particle I_k with its X, P, and V vectors and its x_fitness and p_fitness values.]
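This anatomy maps directly onto a small data class; the use of Python lists and None for the not-yet-evaluated fitness values is an implementation assumption.

```python
from dataclasses import dataclass, field
from typing import Optional

@dataclass
class Particle:
    x: list                                # current position in the search space
    v: list                                # velocity: direction of travel if undisturbed
    p: list = field(default_factory=list)  # best position found so far
    x_fitness: Optional[float] = None      # fitness of x ("?" until evaluated)
    p_fitness: Optional[float] = None      # fitness of p ("?" until evaluated)
```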

Particle Swarm Optimization: Swarm Search
In PSO, particles never die! Particles can be seen as simple agents that fly through the search space and record (and possibly communicate) the best solution they have discovered.
So the question is: how does a particle move from one location in the search space to another? This is done by simply adding the v-vector to the x-vector to get a new x-vector (X_i = X_i + V_i).
Once the particle computes the new X_i, it evaluates its new location. If the x-fitness is better than the p-fitness, then P_i = X_i and p-fitness = x-fitness. A minimal sketch of this step is given below.
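A minimal sketch of this movement step, assuming the Particle layout above and that higher fitness is better.

```python
def move_and_evaluate(particle, fitness):
    """One movement step: X_i = X_i + V_i, then update the personal best."""
    particle.x = [x + v for x, v in zip(particle.x, particle.v)]
    particle.x_fitness = fitness(particle.x)
    if particle.p_fitness is None or particle.x_fitness > particle.p_fitness:
        particle.p = list(particle.x)            # P_i = X_i
        particle.p_fitness = particle.x_fitness  # p-fitness = x-fitness
```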

Particle Swarm Optimization
[Diagram: a particle's new velocity v(k+1) combines its previous velocity v(k) with attractions toward pbest and gbest.]

Particle Swarm Optimization: Swarm Search
Actually, we must adjust the v-vector before adding it to the x-vector, as follows:
v_id = v_id + φ1*rnd()*(p_id - x_id) + φ2*rnd()*(p_gd - x_id);
x_id = x_id + v_id;
where i is the particle index, φ1 and φ2 are learning rates governing the cognition and social components, g is the index of the particle with the best p-fitness, and d is the d-th dimension.
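Written out per dimension in Python, the update might look like the sketch below; the default learning rates and the optional Vmax clamp are illustrative assumptions.

```python
import random

def update_velocity(particle, p_g, phi1=2.0, phi2=2.0, v_max=None):
    """v_id = v_id + phi1*rnd()*(p_id - x_id) + phi2*rnd()*(p_gd - x_id),
    where p_g is the best position found by the best particle in the swarm."""
    for d in range(len(particle.x)):
        cognition = phi1 * random.random() * (particle.p[d] - particle.x[d])
        social    = phi2 * random.random() * (p_g[d] - particle.x[d])
        particle.v[d] += cognition + social
        if v_max is not None:                     # optional clamping to [-Vmax, Vmax]
            particle.v[d] = max(-v_max, min(v_max, particle.v[d]))
```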

Particle Swarm Optimization: Swarm Search Intially the values of the velocity vectors are randomly generated with the range [-Vmax, Vmax] where Vmax is the maximum value that can be assigned to any v id.

Particle Swarm Optimization: Swarm Types
In his paper [Kennedy, J. (1997), "The Particle Swarm: Social Adaptation of Knowledge", Proceedings of the 1997 International Conference on Evolutionary Computation, IEEE Press.], Kennedy identifies 4 types of PSO based on φ1 and φ2.
Given: v_id = v_id + φ1*rnd()*(p_id - x_id) + φ2*rnd()*(p_gd - x_id); x_id = x_id + v_id;
Full model (φ1, φ2 > 0)
Cognition only (φ1 > 0 and φ2 = 0)
Social only (φ1 = 0 and φ2 > 0)
Selfless (φ1 = 0, φ2 > 0, and g ≠ i)
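One convenient way to record the four models is as (φ1, φ2) settings. The numeric values below are placeholders, since the slide only constrains their signs; the selfless model additionally excludes the particle itself when choosing the social best (g ≠ i).

```python
# Placeholder (phi1, phi2) settings for the four swarm types.
SWARM_TYPES = {
    "full":           {"phi1": 2.0, "phi2": 2.0},
    "cognition_only": {"phi1": 2.0, "phi2": 0.0},
    "social_only":    {"phi1": 0.0, "phi2": 2.0},
    "selfless":       {"phi1": 0.0, "phi2": 2.0, "exclude_self": True},  # g != i
}
```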

Particle Swarm Optimization: Related Issues
There are a number of related issues concerning PSO:
Controlling velocities (determining the best value for Vmax)
Swarm size
Neighborhood size
Updating the X and velocity vectors
Robust settings for φ1 and φ2
An off-the-shelf PSO: Carlisle, A. and Dozier, G. (2001). "An Off-The-Shelf PSO", Proceedings of the 2001 Workshop on Particle Swarm Optimization, pp. 1-6, Indianapolis, IN.

Particle Swarm: Controlling Velocities
When using PSO, it is possible for the magnitude of the velocities to become very large, and performance can suffer if Vmax is inappropriately set.
Two methods were developed for controlling the growth of velocities:
A dynamically adjusted inertia factor
A constriction coefficient

Particle Swarm Optimization: The Inertia Factor
When the inertia factor is used, the equation for updating velocities is changed to:
v_id = ω*v_id + φ1*rnd()*(p_id - x_id) + φ2*rnd()*(p_gd - x_id);
where ω is initialized to 1.0 and is gradually reduced over time (measured in cycles through the algorithm).
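A sketch of the inertia-weighted update together with a simple linear decay schedule for ω; the end value of 0.4 and the linear schedule are common choices, not values given on the slide.

```python
import random

def update_velocity_inertia(particle, p_g, omega, phi1=2.0, phi2=2.0):
    """v_id = omega*v_id + phi1*rnd()*(p_id - x_id) + phi2*rnd()*(p_gd - x_id)."""
    for d in range(len(particle.x)):
        particle.v[d] = (omega * particle.v[d]
                         + phi1 * random.random() * (particle.p[d] - particle.x[d])
                         + phi2 * random.random() * (p_g[d] - particle.x[d]))

def inertia(cycle, max_cycles, omega_start=1.0, omega_end=0.4):
    """omega starts at 1.0 and is reduced linearly over the run."""
    return omega_start - (omega_start - omega_end) * cycle / max_cycles
```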

Particle Swarm Optimization: Swarm and Neighborhood Size
Concerning the swarm size for PSO, as with other evolutionary computation methods, there is a trade-off between solution quality and cost (in terms of function evaluations).
Global neighborhoods seem to be better in terms of computational cost, while their performance is similar to that of the ring topology (or of neighborhoods larger than 3).
There has been little research on the effects of swarm topology on the search behavior of PSO.

Particle Swarm Optimization: Particle Update Methods
There are two ways that particles can be updated: synchronously or asynchronously.
Asynchronous update allows newly discovered solutions to be used more quickly, as sketched below.
[Diagram: five particles I0 to I4 in a ring.]
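A rough sketch of the two update schemes, assuming a step function that moves and evaluates one particle and that higher fitness is better; note how the asynchronous loop refreshes the swarm best inside the particle loop.

```python
def _better(candidate, incumbent, fitness):
    """Return whichever position has the higher fitness."""
    return candidate if fitness(candidate) > fitness(incumbent) else incumbent

def synchronous_iteration(swarm, g_best, fitness, step):
    """All particles move against the same g_best; the swarm best is
    refreshed only after the whole pass is finished."""
    for particle in swarm:
        step(particle, g_best, fitness)
    for particle in swarm:
        g_best = _better(particle.p, g_best, fitness)
    return g_best

def asynchronous_iteration(swarm, g_best, fitness, step):
    """Each newly discovered personal best can influence the very next particle."""
    for particle in swarm:
        step(particle, g_best, fitness)
        g_best = _better(particle.p, g_best, fitness)
    return g_best
```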

The Future?
Satellite maintenance
Medical
Interacting chips in mundane objects
Cleaning ship hulls
Pipe inspection
Pest eradication
Miniaturization
Engine maintenance
Telecommunications
Self-assembling robots
Job scheduling
Vehicle routing
Data clustering
Distributed mail systems
Optimal resource allocation
Combinatorial optimization

Dumb parts, properly connected into a swarm, yield smart results.