Computer-Aided Engineering, 2nd Edition
Knowledge Component 5: Information Processing
5.2.2 Optimization, Search and Exploration
Module Information
Intended audience: Novice
Key words: Optimization, Optimality, Boundedness, Feasibility, Constraint activity
Author: Ian Smith, EPFL, Switzerland
Reviewers (1st Edition): Esther Obonyo, U of Florida, USA; Ni-Bin Chang, U of Central Florida, USA; Frederic Bosche, U of Waterloo, Canada; Rafal Kicinger, George Mason U, USA
What there is to learn
See the quiz at the end for questions and answers. This sub-module introduces the following ideas:
Simple optimization (setting the derivative to zero) is best when there is one variable and the objective function is continuous and differentiable.
The Simplex method is best when the objective function and all constraints are linear.
Gradient methods are fast when the objective function is continuous and differentiable. They are reliable when the objective function has one minimum (maximum) within the feasible solution space.
Pareto analysis helps filter out sub-optimal solutions without having to use weights on criteria.
Outline: Simple Optimization Using Derivation, Linear Programming, Gradient-Based Optimization, Multi-Criteria Optimization
Simple optimization using derivation
Given an objective function y = f(x), take the derivative of f with respect to x, set the result to zero and solve for x. [Figure: curve of y against x with a maximum; at the maximum, a small change ∆x produces ∆y = 0, i.e. the slope dy/dx is zero.]
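For illustration, here is a minimal sketch of this procedure in Python, assuming the SymPy library is available; the function f below is an arbitrary example rather than one taken from the slides.

```python
# Minimal sketch: find the optimum of a one-variable function by setting
# its derivative to zero. The function f is an arbitrary illustrative example.
import sympy as sp

x = sp.Symbol('x', real=True)
f = -x**2 + 4*x + 1                         # concave parabola; maximum expected at x = 2

df = sp.diff(f, x)                          # f'(x) = -2x + 4
critical_points = sp.solve(sp.Eq(df, 0), x) # solve f'(x) = 0  ->  [2]

for xc in critical_points:
    # Second derivative test: negative curvature -> maximum, positive -> minimum
    curvature = sp.diff(f, x, 2).subs(x, xc)
    kind = "maximum" if curvature < 0 else "minimum"
    print(f"x = {xc}: f(x) = {f.subs(x, xc)} ({kind})")
```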
Outline: Simple Optimization Using Derivation, Linear Programming, Gradient-Based Optimization, Multi-Criteria Optimization
Linear Programming
Linear programming (LP) involves the optimization of a linear objective function, subject to linear equality and inequality constraints. Mathematically, the linear programming problem is defined as follows. Given a matrix A of size m × n, a vector b of size m and a vector c of size n, find a vector x ≥ 0 of size n that minimizes f(x) = cᵀ·x such that A·x = b.
Linear Programming (cont’d)
Example: Minimize Z = 1.8x1 + x2
Constraints: x1 + x2 ≥ 6; x1 + x2 ≤ 12; x2 ≥ 3; x1, x2 ≥ 0.
Optimal solutions are found at the vertices (extreme points) of the feasible region.
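For illustration, the example above could be solved numerically as sketched below, assuming SciPy is available. linprog expects upper-bound constraints of the form A_ub·x ≤ b_ub, so the "≥" constraints are multiplied by -1.

```python
# Sketch: solving the LP example above with SciPy's linprog.
from scipy.optimize import linprog

c = [1.8, 1.0]                      # minimize Z = 1.8*x1 + x2

A_ub = [[-1, -1],                   # -(x1 + x2) <= -6   i.e.  x1 + x2 >= 6
        [ 1,  1],                   #   x1 + x2  <= 12
        [ 0, -1]]                   #       -x2  <= -3   i.e.  x2 >= 3
b_ub = [-6, 12, -3]

res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=[(0, None), (0, None)])
print(res.x, res.fun)               # vertex solution: x = (0, 6) with Z = 6
```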
Simplex Method
The Simplex algorithm (George Dantzig, 1947) is a popular strategy for numerical solution of the LP problem. Unfortunately, there are few practical situations in civil engineering (involving multiple variables) that can be modeled using linear constraints and linear objective functions. Further information on linear programming and the Simplex algorithm is beyond the scope of this course; see textbooks in the field of operations research.
Outline: Simple Optimization Using Derivation, Linear Programming, Gradient-Based Optimization, Multi-Criteria Optimization
Usefulness
Complex objective functions may involve several variables. Partial differentiation of the objective function may not yield optimum values, so iterative methods become necessary. The majority of optimization methods used in practice employ gradient-based approaches, also called hill-climbing methods. There are entire books on this topic. The steps of gradient-based optimization (minimization) are listed on the following slide.
Steps
1. Start at an initial point x0.
2. Evaluate the objective function F(x) at x0.
3. Take a step xk+1 = xk - αk F'(xk) in the direction that decreases the objective function F(x). The value of αk is kept sufficiently small so that the incremental changes converge.
4. Determine a strategy for avoiding oscillations.
In multivariate optimization, x is a vector and F'(x) is the gradient of F.
Steps (cont’d.) [Figure: successive iterates xk, xk+1 moving toward the minimum of the objective function.]
Gradient Descent with Fixed Step Size
Aspects: convergence, divergence, oscillation, and difficulty in overcoming local minima. [Figures: sequences of iterates xk, xk+1, xk+2 illustrating each of these behaviours.]
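For illustration, a minimal Python sketch of the steps above with a fixed step size follows; the quadratic objective and the chosen step size are illustrative assumptions, not taken from the slides.

```python
# Minimal sketch of gradient descent with a fixed step size alpha.
# The quadratic objective F and the chosen alpha are illustrative only.
def F(x):                 # objective to minimize; its minimum is at x = 3
    return (x - 3.0) ** 2

def dF(x):                # derivative F'(x)
    return 2.0 * (x - 3.0)

x = 10.0                  # initial point x0
alpha = 0.1               # fixed step size; for this F, alpha > 0.5 causes
                          # oscillation and alpha > 1.0 causes divergence

for k in range(1000):
    step = alpha * dF(x)
    x = x - step          # x_{k+1} = x_k - alpha * F'(x_k)
    if abs(step) < 1e-9:  # stop when the steps become negligible
        break

print(x)                  # converges toward the minimum at x = 3
```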
Outline: Simple Optimization Using Derivation, Linear Programming, Gradient-Based Optimization, Multi-Criteria Optimization
Introduction Most engineering tasks involve consideration of multiple criteria. For example, we may wish to minimize f1(x) and maximize f2(x). How can we do this? The following slides present some approaches.
Weight Functions
Create a single objective function by assigning weights to the different objectives and minimize this combined objective function. For example, minimize F(x) = w1·f1(x) + w2·f2(x). Although this approach is widely used, it can lead to undesirable solutions when the weight values are inappropriate. The values chosen for the weights are often subjective, and the best values for the weights may depend on the values of f1(x) and f2(x).
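For illustration, a minimal sketch of the weighted-sum idea follows, assuming SciPy is available; the two criteria (both minimized here) and the weight values are illustrative assumptions.

```python
# Sketch of weighted-sum scalarization of two criteria, both minimized.
# The criteria f1, f2 and the weights w1, w2 are illustrative only.
from scipy.optimize import minimize_scalar

def f1(x):                  # first criterion; its own minimum is at x = 2
    return (x - 2.0) ** 2

def f2(x):                  # second criterion; its own minimum is at x = 5
    return (x - 5.0) ** 2

w1, w2 = 0.7, 0.3           # subjective weights; other choices shift the result

res = minimize_scalar(lambda x: w1 * f1(x) + w2 * f2(x))
print(res.x)                # a compromise between x = 2 and x = 5 (about 2.9 here)
```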
Utility Functions Utility functions: assign monetary values to objectives in order to maximize a certain utility that has been defined. This concept comes from economics. Utility is the measure of goal-attainment to which the given criteria contribute. For example, U(x) = F(f1(x), f2(x), … ) This approach works best when realistic utility estimates are available.
Simplification Select the most important criteria and simplify the other criteria by transforming them into constraints. This approach works best when one criterion is much more important than others.
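For illustration, a minimal sketch of this simplification follows, assuming SciPy is available; the functions and the limit value are illustrative assumptions.

```python
# Sketch: keep f1 as the single objective and turn the secondary criterion
# f2 into a constraint f2(x) <= limit. Functions and limit are illustrative only.
from scipy.optimize import minimize

def f1(x):                  # primary criterion, minimized; unconstrained optimum at x = 2
    return (x[0] - 2.0) ** 2

def f2(x):                  # secondary criterion, demoted to a constraint
    return (x[0] - 5.0) ** 2

limit = 4.0                 # acceptable upper bound on f2

cons = [{"type": "ineq", "fun": lambda x: limit - f2(x)}]   # limit - f2(x) >= 0

res = minimize(f1, x0=[4.0], constraints=cons, method="SLSQP")
print(res.x)                # about x = 3, where the constraint f2(x) <= 4 becomes active
```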
Pareto Optimization
When engineering tasks involve multiple criteria of similar importance, there may be solutions that are equivalent when all criteria are considered. A solution x0 is Pareto optimal if no other feasible solution is at least as good as x0 under every criterion and strictly better under at least one. Such equivalent optimal solutions are called Pareto optimal solutions.
Application of Pareto Optimization
Pareto optimization is a method that includes multiple criteria without using weights. It is usually used to filter out sub-optimal solutions. Users then select the most attractive Pareto optimal solution interactively through consideration of criteria that are not modeled. Other criteria may include factors such as aesthetics, local conditions, politics, etc.
Example of Pareto Filtering
All criteria are minimized. [Table: criterion values of six candidate solutions under Criterion 1, Criterion 2 and Criterion 3.] Solution 1 dominates solutions 3 and 5. Therefore, solutions 3 and 5 are not Pareto optimal and are filtered out. Solutions 1, 2, 4 and 6 are not dominated by any other solution.
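For illustration, a minimal sketch of Pareto filtering follows; the criterion values in the code are illustrative assumptions, not the values from the table on this slide.

```python
# Sketch of Pareto filtering when all criteria are minimized.
# The criterion values are illustrative only.
solutions = {
    1: (3, 5, 8),
    2: (6, 2, 7),
    3: (4, 6, 9),   # dominated by solution 1, so it is filtered out
    4: (2, 8, 5),
}

def dominates(a, b):
    """a dominates b if it is no worse in every criterion and better in at least one."""
    return all(ai <= bi for ai, bi in zip(a, b)) and any(ai < bi for ai, bi in zip(a, b))

pareto = [
    name for name, crit in solutions.items()
    if not any(dominates(other, crit)
               for other_name, other in solutions.items() if other_name != name)
]
print(pareto)   # [1, 2, 4] -- the non-dominated (Pareto optimal) solutions
```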
Example of a Pareto Surface
[Figure: Pareto surface in the plane of Criterion 1 versus Criterion 2, with a Pareto-optimal solution x0 marked on it.] Moving along the surface away from x0, other solutions improve on x0 in one criterion only by becoming worse in the other. Considering both criteria, all solutions on the Pareto surface are equivalent.
Example Task
Design a beam according to the following objectives: reduce deflections (serviceability) and reduce cost (economy). Three strategies for including these objectives are:
1. Treat cost as the primary objective; deflection is simplified by transformation into a constraint.
2. Combine the objectives using weight factors.
3. Determine a Pareto surface and let users select.
Review Quiz What are the drawbacks of gradient-based optimization methods? What is the most appropriate approach for multi-criteria optimization?
Answers to Review Quiz What are the drawbacks of gradient-based optimization methods? They require existence of derivatives at all points in the domain. Gradient-based methods are also susceptible to divergence, oscillation and termination in local minima. Therefore, the function to be optimized determines whether or not gradient-based optimization is useful.
Answers to Review Quiz What is the most appropriate method for multi-criteria optimization? There is no one appropriate method. Task characteristics point to the best approach.
Further Reading
Introduction to Optimum Design, J.S. Arora, McGraw-Hill Book Company, New York, 1989.
Introduction to Knowledge Systems, M. Stefik, Morgan Kaufmann Publishers, San Francisco, California, 1995.
Intelligent Systems for Engineering: A Knowledge-Based Approach, Chapter 2: Introduction to Search Methods, R.D. Sriram, Springer-Verlag, Berlin, 1997.
Further Reading (cont'd.)
J.S. Arora, O.A. Elwakeil, A.I. Chahande and C.C. Hsieh, “Global optimization methods for engineering applications: a review,” Structural Optimization, 9, 1995.
B. Raphael and I.F.C. Smith, “A direct stochastic algorithm for global search,” J. of Applied Mathematics and Computation, Vol. 146, No. 2-3, 2003.
Raphael, B. and Smith, I.F.C., Fundamentals of Computer-Aided Engineering, Wiley, 2003.