MAE 552 Heuristic Optimization Instructor: John Eddy Lecture #36 4/29/02 Multi-Objective Optimization.

References: General

Multi-Objective Optimization Last time, we talked about multi-criteria optimization. We discussed the most common solution method, a linear convex combination of the objectives using weights, and many of the drawbacks associated with that approach. Today, we will look at some other techniques.

Multi-Objective Optimization Let us briefly describe what we would like to see in a Pareto frontier:
1 – Even or uniform sampling.
2 – Wide expanse.
3 – Distinctness of points. That is, two points are not distinct if they are so close to one another that the designer does not care to distinguish between them.

Multi-Objective Optimization Let’s first restate the optimization problem:
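The problem statement itself did not survive transcription; a standard form, consistent with the terms used on the later slides (the objective vector F and the feasible space C), is:

```latex
\begin{aligned}
\min_{x} \quad & F(x) = \left[\, f_1(x),\; f_2(x),\; \ldots,\; f_n(x) \,\right]^{T} \\
\text{s.t.} \quad & x \in C ,
\end{aligned}
```

where C is the feasible space defined by the design constraints.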

Multi-Objective Optimization Normal Boundary Intersection method (Das and Dennis): Properties:
1. Extendable to any number of objective functions.
2. Produces an even spread of Pareto points.
3. Requires first minimizing each function individually.
4. Requires repeated solution of the NBI subproblem.

Multi-Objective Optimization General Implementation: Consider the following animation for a 2D example [figure: the feasible space C in the (f1, f2) performance space, with the Convex Hull of Individual Minima (CHIM)]. Use the NBI subproblem to find the intersections of each normal vector with the efficient frontier.

Multi-Objective Optimization General Implementation (continued): The vector parameter β in the subproblem defines the take-off points.

Multi-Objective Optimization General Implementation (continued): This approach has the advantage that an even spread of β's produces an even spread of points on the CHIM, which in turn corresponds to an even spread of Pareto points.

Multi-Objective Optimization The subproblem is defined as follows: Terms defined on next page.
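The subproblem's equations were lost in transcription; Das and Dennis's NBI subproblem for a fixed parameter vector β has the form:

```latex
\begin{aligned}
\max_{x,\; t} \quad & t \\
\text{s.t.} \quad & \Phi\beta + t\,\hat{n} = F(x) - F^{*} \\
& x \in C ,
\end{aligned}
```

where the components of β are nonnegative and sum to 1, and n̂ is the unit normal to the CHIM pointing toward the origin.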

Multi-Objective Optimization C we showed as the feasible space. xi* is the optimal design configuration for objective i. fi* is the optimum value of objective i (fi* = fi(xi*)). F* = [f1*, …, fn*]T is the vector of individual objective optimums (it denotes the utopia point). Φ is the n x n payoff matrix such that the i-th column is given by F(xi*) − F*. So Φ shifts the axes in the performance space such that the utopia point is at the origin.

Multi-Objective Optimization So anyway, NBI works very well for the case of 2 objectives but may be unable to find certain points, termed "peripheral points," in 3 or more dimensions. For more information, see: I. Das and J. E. Dennis, "Normal-Boundary Intersection: An Alternate Approach for Generating Pareto-optimal Points in Multicriteria Optimization Problems," ICASE-NASA Tech. Report, submitted to SIAM J. on Optimization, July 1996.

Multi-Objective Optimization Goal Programming: Properties:
1. Extendable to any number of objective functions.
2. Not useful for populating the Pareto frontier.
3. Requires relatively intimate knowledge of desired values for the various objectives.

Multi-Objective Optimization Essentially, for each objective, the user must set one of the following:
A maximum allowable value,
A minimum allowable value, or
A desired range for the value.
The job is then to minimize the deviation of each objective from its goal.
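One common way to formalize this (the notation here is mine, not from the lecture) introduces deviation variables for each goal value Ti:

```latex
\begin{aligned}
\min_{x,\; d^{+},\; d^{-}} \quad & \sum_{i=1}^{n} \left( d_i^{+} + d_i^{-} \right) \\
\text{s.t.} \quad & f_i(x) - d_i^{+} + d_i^{-} = T_i , \qquad i = 1, \ldots, n \\
& d_i^{+} \ge 0 , \quad d_i^{-} \ge 0 ,
\end{aligned}
```

so that di+ and di− measure over- and under-achievement of goal i; a one-sided goal (a ceiling or a floor) simply leaves the corresponding deviation variable out of the sum.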

Multi-Objective Optimization So if all objectives are minimization, the goal approach limits each value to some ceiling and allows it to be less than that ceiling. So this method is not guaranteed to find a Pareto solution, nor is it useful for populating the Pareto frontier. It is useful for finding the design parameters that will satisfy a given set of performance values.

Multi-Objective Optimization There are some other less interesting but common approaches to multi-criteria optimization. You can read about some of them on the reference web pages. Does anyone know which of the methods we discussed this semester might be well suited to solving this type of problem? (Recall that we are seeking a group of solutions.)

Multi-Objective Optimization Recently, much attention has been paid to the usefulness of genetic algorithms for solving these problems. How might we adapt our genetic algorithm to deal with multiple objectives?

Multi-Objective Optimization Recall the progression of our algorithm as follows:

Initialize population.
Evaluate population.
t = 0
while (not converged)
    t = t + 1
    Variation of P(t)
    Evaluation of P(t)
    Select P(t) from P(t-1)
end
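As a concrete (if toy) illustration of this loop, here is a minimal generational GA in Python; the bit-string encoding, operators, and rates are illustrative choices, not the course's implementation:

```python
import random

def run_ga(fitness, n_pop=20, n_gen=50, n_bits=16, seed=0):
    """Minimal generational GA over bit strings (lower fitness is better)."""
    rng = random.Random(seed)
    # Initialize and evaluate population P(0).
    pop = [[rng.randint(0, 1) for _ in range(n_bits)] for _ in range(n_pop)]
    for t in range(n_gen):
        # Variation: one-point crossover plus occasional one-bit mutation.
        children = []
        for _ in range(n_pop):
            p1, p2 = rng.sample(pop, 2)
            cut = rng.randrange(1, n_bits)
            child = p1[:cut] + p2[cut:]
            if rng.random() < 0.1:
                i = rng.randrange(n_bits)
                child[i] ^= 1
            children.append(child)
        # Evaluation and selection: keep the best n_pop of parents + children.
        pop = sorted(pop + children, key=fitness)[:n_pop]
    return pop[0]

# Toy usage: minimize the number of 1-bits in the string.
best = run_ga(sum)
```

Called as `run_ga(sum)`, it minimizes the number of 1-bits, so the returned string ends up mostly (often entirely) zeros.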

Multi-Objective Optimization So what phase of this algorithm must be adjusted to accommodate multiple objectives? Clearly, the evaluation phase must take into account Pareto optimality. There are a number of ways to account for Pareto optimality in a GA, some of which involve using a fitness parameter and some of which do not.

Multi-Objective Optimization One method using a fitness parameter is to assign the fitness of a design as the number of designs in the current population that it dominates. A method that does not use a fitness parameter would be conditional entry. It goes as follows: Generate children as you normally would from a population. Then, on an individual basis, do the following:

Multi-Objective Optimization Find a member in the population that the child dominates. If such a design exists, replace it with the child. If such a design does not exist, do the following: Find a member of the population that dominates the child. If such a design exists, discard the child. Otherwise, grow the population and add the child in.
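Both schemes above can be sketched in Python; as a simplification for illustration, designs are represented here directly by their objective vectors, with all objectives minimized:

```python
def dominates(a, b):
    """True if objective vector a Pareto-dominates b: a is no worse
    in every objective and strictly better in at least one."""
    return all(x <= y for x, y in zip(a, b)) and \
           any(x < y for x, y in zip(a, b))

def domination_fitness(population):
    """Fitness-parameter method: each design's fitness is the number
    of designs in the current population that it dominates."""
    return [sum(dominates(a, b) for j, b in enumerate(population) if j != i)
            for i, a in enumerate(population)]

def conditional_entry(population, child):
    """Conditional entry: the child replaces a member it dominates;
    failing that, it is discarded if any member dominates it;
    otherwise the population grows to admit it."""
    for i, member in enumerate(population):
        if dominates(child, member):
            population[i] = child
            return
    if not any(dominates(member, child) for member in population):
        population.append(child)

pop = [(1, 5), (3, 3), (5, 1)]
conditional_entry(pop, (2, 2))  # dominates (3, 3), so it replaces it
conditional_entry(pop, (4, 4))  # dominated by (2, 2), so it is discarded
conditional_entry(pop, (0, 6))  # non-dominated, so the population grows
```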

Multi-Objective Optimization So clearly, this is not a steady-state type of GA, but it helps to allow as many efficient points as possible to participate in the evolution. So how then do we account for constraints in such a case? Clearly we are not using some penalty function in our assessment of fitness.

Multi-Objective Optimization There are a couple of options that we can choose from. The first is to treat each constraint as an implicit objective. That is, consider that the presence of each constraint implies the existence of an objective: satisfaction of that constraint.

Multi-Objective Optimization We have now converted our n-objective problem into one of n + num_of_cons objectives, which is a drawback. Another drawback is that we lose all information that tells us just how much better one design is than another. In exchange, we have alleviated one of the drawbacks of the previous approach.

Multi-Objective Optimization Another way to do it is to add a single additional objective to our problem, whereby that objective is satisfaction of all constraints. So we pool all of our constraint violations into a penalty-like term and use a "seek 0" approach (which is "minimize" under our standard penalty formulations), optimizing it as though it were an objective.
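A sketch of this pooling, assuming the constraints are expressed as gi(x) ≤ 0 (the function name is illustrative):

```python
def violation_objective(g_values):
    """Pool all constraint violations into a single extra objective.
    Each constraint is assumed to be of the form g_i(x) <= 0, so a
    feasible design scores exactly 0 and the MOGA "seeks 0" by
    minimizing this value like any other objective."""
    return sum(max(0.0, g) for g in g_values)

feasible = violation_objective([-1.0, -0.5])         # 0.0: all constraints satisfied
infeasible = violation_objective([0.25, -2.0, 1.0])  # 1.25: total violation
```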

Multi-Objective Optimization Consider the following example of a problem that I solved using a MOGA: [figure: a beam of length L and width b, with section dimensions d1, d2, and d3, supporting a vibrating motor].

Multi-Objective Optimization The objectives are to minimize the material cost and to maximize the fundamental frequency of the beam. There are a number of constraints for geometry, material selections, and mass. The following figure shows two curves. The blue curve is the data generated here and the pink curve is data generated by the I-SHOT method.

Multi-Objective Optimization [figure: the blue and pink curves of cost versus fundamental frequency]

The I-SHOT method of Azarm, Reynolds, and Narayanan is, in brief, a repeated application of a single-objective GA using special combinatorial rules for the objectives. It can be read about in: Azarm, S., Reynolds, B. J., and Narayanan, S., 1999, "Comparison of Two Multiobjective Optimization Techniques with and within Genetic Algorithms," ASME Design Engineering Technical Conferences, DETC99/DAC-8584.

Multi-Objective Optimization A comparison of the results is as follows:

Azarm et al.:
- 1 Pareto point for every 7,909 points visited.
- 53,963 total points visited.
- Frequency ranged from ~220 – 400 Hz.
- Cost ranged from ~$100 – $180.

These results:
- 1 Pareto point for every 195 points visited.
- 33,076 total points visited.
- Frequency ranged from ~88 – 380 Hz.
- Cost ranged from ~$75 – $185.