5.2.2 Optimization, Search and Exploration 2


Computer-Aided Engineering, Knowledge Component 5: Information Processing
5.2.2 Optimization, Search and Exploration 2 (2nd Edition)

Module Information
Intended audience: Novice
Key words: Optimization, Optimality, Boundedness, Feasibility, Constraint activity
Author: Ian Smith, EPFL, Switzerland
Reviewers (1st Edition): Esther Obonyo, U of Florida, USA; Ni-Bin Chang, U of Central Florida, USA; Frederic Bosche, U of Waterloo, Canada; Rafal Kicinger, George Mason U, USA

What there is to learn
See the quiz at the end for questions and answers. This sub-module introduces the following ideas:
- Simple optimization (setting the derivative to zero) is best when there is one variable and the objective function is continuous and differentiable.
- The Simplex method is best when the objective function and all constraints are linear.
- Gradient methods are fast when the objective function is continuous and differentiable. They are reliable when the objective function has one minimum (or maximum) within the feasible solution space.
- Pareto analysis helps filter out sub-optimal solutions without having to assign weights to criteria.

Outline
- Simple Optimization Using Differentiation
- Linear Programming
- Gradient-Based Optimization
- Multi-Criteria Optimization

Simple optimization using differentiation
Given an objective function y = f(x), take the derivative of f with respect to x, set the result to zero and solve for x. (The sign of the second derivative distinguishes a maximum from a minimum.)
[Figure: plot of y = f(x); at the maximum, ∆y/∆x = 0.]
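A minimal sketch of this procedure in Python, assuming the SymPy library is available; the objective function is a hypothetical example, not one from the course:

```python
# Minimal sketch, assuming SymPy: maximize the hypothetical
# objective f(x) = -(x - 2)**2 + 3 by setting df/dx = 0.
import sympy as sp

x = sp.symbols('x')
f = -(x - 2)**2 + 3                       # example objective function
stationary = sp.solve(sp.diff(f, x), x)   # solve df/dx = 0 for x
print(stationary)                         # [2] -> candidate optimum at x = 2
print(sp.diff(f, x, 2))                   # -2 < 0 confirms a maximum
```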

Outline
- Simple Optimization Using Differentiation
- Linear Programming
- Gradient-Based Optimization
- Multi-Criteria Optimization

Linear Programming
Linear programming (LP) involves the optimization of a linear objective function, subject to linear equality and inequality constraints. Mathematically, the linear programming problem is defined as follows. Given a matrix A of size m × n, a vector b of size m and a vector c of size n, find a vector x ≥ 0 of size n that minimizes f(x) = cᵀx subject to Ax = b.

Linear Programming (cont'd)
Example: minimize Z = 1.8x1 + x2
subject to:
- x1 + x2 ≥ 6
- x1 + x2 ≤ 12
- x2 ≥ 3
- x1, x2 ≥ 0
Optimal solutions are found at the vertices (extreme points) of the feasible region.
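A minimal sketch of solving this example numerically, assuming SciPy is available. linprog expects "less than or equal" inequalities, so the "≥" rows are negated:

```python
# Minimal sketch, assuming SciPy: solve the example LP above.
# linprog minimizes c @ x subject to A_ub @ x <= b_ub and x >= 0 (default bounds).
from scipy.optimize import linprog

c = [1.8, 1.0]                 # Z = 1.8*x1 + x2
A_ub = [[-1, -1],              # x1 + x2 >= 6   ->  -x1 - x2 <= -6
        [ 1,  1],              # x1 + x2 <= 12
        [ 0, -1]]              # x2 >= 3        ->  -x2 <= -3
b_ub = [-6, 12, -3]

res = linprog(c, A_ub=A_ub, b_ub=b_ub)
print(res.x, res.fun)          # optimum at x1 = 0, x2 = 6 with Z = 6
```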

Simplex Method
The Simplex algorithm (George Dantzig, 1947) is a popular strategy for numerical solution of the LP problem. Unfortunately, there are few practical situations in civil engineering (involving multiple variables) that can be modeled using linear constraints and linear objective functions. Further information related to linear programming and the Simplex algorithm is outside the scope of this course; see textbooks in the field of operations research.

Outline
- Simple Optimization Using Differentiation
- Linear Programming
- Gradient-Based Optimization
- Multi-Criteria Optimization

Usefulness
Complex objective functions may involve several variables, and partial differentiation of the objective function may not yield optimum values; iterative methods then become necessary. The majority of optimization methods used in practice employ gradient-based approaches, also called hill-climbing methods. There are entire books on this topic. The steps of an example gradient-based optimization (minimization) are listed on the following slide.

Steps
1. Start at an initial point x0.
2. Evaluate the objective function F(x) at x0.
3. Take a step x(k+1) = x(k) − αk F′(x(k)) in the direction that decreases the objective function F(x). The value of αk is kept sufficiently small so that the incremental changes converge.
4. Adopt a strategy for avoiding oscillations.
In multivariate optimization, x is a vector and F′(x) is the gradient ∇F(x).
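A minimal sketch of these steps in Python, using a hypothetical one-variable objective F(x) = (x − 3)²:

```python
# Minimal gradient-descent sketch with a fixed step size.
# Hypothetical objective: F(x) = (x - 3)**2, so F'(x) = 2*(x - 3).

def F_prime(x):
    return 2.0 * (x - 3.0)

x = 0.0            # initial point x0
alpha = 0.1        # step size alpha_k (kept fixed and small here)
for k in range(1000):
    step = alpha * F_prime(x)
    x = x - step                 # x_{k+1} = x_k - alpha_k * F'(x_k)
    if abs(step) < 1e-9:         # stop when the steps become negligible
        break
print(x)                         # approaches the minimum at x = 3
```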

Steps (cont'd.)
[Figure: successive iterates xk and xk+1 stepping down the objective function toward the minimum.]

Gradient Descent with Fixed Step Size
Aspects:
- Convergence
- Divergence
- Oscillation
- Difficulty in overcoming local minima
[Figure: iterates xk, xk+1, xk+2 illustrating convergent, oscillating and divergent behavior.]
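The influence of the fixed step size can be seen in a small sketch, using the hypothetical objective F(x) = x², so F′(x) = 2x:

```python
# Sketch: behavior of fixed-step gradient descent on F(x) = x**2.
def gd(alpha, x=1.0, iters=20):
    for _ in range(iters):
        x = x - alpha * 2.0 * x   # F'(x) = 2x
    return x

print(gd(0.1))   # ~0.01  -> converging toward the minimum at 0
print(gd(1.0))   # 1.0    -> oscillates between +1 and -1, never converges
print(gd(1.1))   # ~38    -> diverges; each step overshoots further
```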

Outline
- Simple Optimization Using Differentiation
- Linear Programming
- Gradient-Based Optimization
- Multi-Criteria Optimization

Introduction
Most engineering tasks involve consideration of multiple criteria. For example, we may wish to minimize f1(x) and maximize f2(x). How can we do this? The following slides present some approaches.

Weight Functions
Create a single objective function by assigning weights to the different objectives, then minimize this combined function. For example:
Minimize F(x) = w1·f1(x) + w2·f2(x)
Although this approach is widely used, it can lead to undesirable solutions when weight values are inappropriate. The values chosen for the weights are often subjective, and the best values for the weights may depend on the values of f1(x) and f2(x).
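A minimal sketch of the weighted-sum approach, assuming SciPy is available; the two criteria are hypothetical examples:

```python
# Minimal weighted-sum sketch, assuming SciPy.
from scipy.optimize import minimize

def f1(x):                 # hypothetical criterion 1 (best at x = 1)
    return (x[0] - 1.0) ** 2
def f2(x):                 # hypothetical criterion 2 (best at x = 4)
    return (x[0] - 4.0) ** 2

w1, w2 = 0.5, 0.5          # subjective weights
res = minimize(lambda x: w1 * f1(x) + w2 * f2(x), x0=[0.0])
print(res.x)               # [2.5] -- the compromise moves as w1, w2 change
```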

Utility Functions
Utility functions assign monetary values to objectives in order to maximize a defined utility. This concept comes from economics: utility is the measure of goal attainment to which the given criteria contribute. For example,
U(x) = F(f1(x), f2(x), …)
This approach works best when realistic utility estimates are available.

Simplification
Select the most important criterion and simplify the others by transforming them into constraints. This approach works best when one criterion is much more important than the others.

Pareto Optimization
When engineering tasks involve multiple criteria of similar importance, there may be solutions that are equivalent when all criteria are considered. A solution x0 is Pareto optimal if no other solution exists that is better than x0 according to all criteria. Equivalent optimal solutions are called Pareto optimal solutions.

Application of Pareto Optimization
Pareto optimization is a method that includes multiple criteria without using weights. It is usually used to filter out sub-optimal solutions. Users then select the most attractive Pareto optimal solution interactively, through consideration of criteria that are not modeled. Other criteria may include factors such as aesthetics, local conditions, politics, etc.

Example of Pareto Filtering
(Minimizing all criteria.)
[Table: values of criteria 1, 2 and 3 for six candidate solutions; dominated entries are marked.]
Solution 1 dominates solutions 3 and 5. Therefore, solutions 3 and 5 are not Pareto optimal and are filtered out. Solutions 1, 2, 4 and 6 are not dominated by any other solution.
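A minimal sketch of such a filter; the criterion values below are hypothetical, not those of the original table:

```python
# Minimal Pareto-filter sketch: keep only non-dominated solutions
# when all criteria are minimized. Criterion values are hypothetical.

def dominates(a, b):
    """a dominates b if it is no worse on every criterion
    and strictly better on at least one (minimization)."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

solutions = {
    1: (2, 4, 3),
    2: (5, 1, 6),
    3: (3, 5, 4),   # dominated by solution 1
    4: (1, 6, 5),
}
pareto = [s for s, v in solutions.items()
          if not any(dominates(w, v) for t, w in solutions.items() if t != s)]
print(pareto)       # [1, 2, 4] -- solution 3 is filtered out
```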

Example of a Pareto Surface
[Figure: Pareto surface in the criterion 1 vs. criterion 2 plane, with a Pareto-optimal solution X0 marked.]
X0 is better than all solutions on the surface to its left in terms of criterion 1, and worse than all solutions on the surface to its right in terms of criterion 2. Considering both criteria, all solutions on the Pareto surface are equivalent.

Example Task
Task: design a beam according to the following objectives:
- Reduce deflections (serviceability)
- Reduce cost (economy)
Three strategies for including these objectives are:
1. Treat cost as the primary objective; deflection is simplified by transforming it into a constraint.
2. Combine the objectives using weight factors.
3. Determine a Pareto surface and let users select from it.

Review Quiz
- What are the drawbacks of gradient-based optimization methods?
- What is the most appropriate approach for multi-criteria optimization?

Answers to Review Quiz
What are the drawbacks of gradient-based optimization methods?
They require the existence of derivatives at all points in the domain. Gradient-based methods are also susceptible to divergence, oscillation and termination in local minima. Therefore, the function to be optimized determines whether or not gradient-based optimization is useful.

Answers to Review Quiz
What is the most appropriate approach for multi-criteria optimization?
There is no single appropriate approach; the characteristics of the task point to the best one.

Further Reading
Introduction to Optimum Design, J.S. Arora, McGraw-Hill Book Company, New York, 1989.
Introduction to Knowledge Systems, M. Stefik, Morgan Kaufmann Publishers, San Francisco, California, 1995.
Intelligent Systems for Engineering: A Knowledge-Based Approach, Chapter 2: Introduction to Search Methods, R.D. Sriram, Springer-Verlag, Berlin, 1997.

Further Reading (cont'd.)
J.S. Arora, O.A. Elwakeil, A.I. Chahande and C.C. Hsieh, "Global optimization methods for engineering applications: a review," Structural Optimization, Vol. 9, 1995, pp. 137-159.
B. Raphael and I.F.C. Smith, "A direct stochastic algorithm for global search," Applied Mathematics and Computation, Vol. 146, No. 2-3, 2003, pp. 729-758.
B. Raphael and I.F.C. Smith, Fundamentals of Computer-Aided Engineering, Wiley, 2003.