Optimization in Engineering Design
Georgia Institute of Technology, Systems Realization Laboratory

Slide 123: “True” Constrained Minimization

Slide 124: Classification

Optimization algorithms for constrained optimization are often classified into primal and transformation methods:
- A primal method is a search method that works on the original problem directly, searching through the feasible region for the optimal solution.
- Transformation methods convert a constrained optimization problem into a sequence of unconstrained optimization problems; they include barrier and penalty function methods.
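As a concrete illustration of a transformation method, here is a minimal Python sketch of a quadratic exterior penalty method; the test problem, the penalty schedule, and the use of scipy are illustrative assumptions, not taken from the slides.

```python
import numpy as np
from scipy.optimize import minimize

# Hypothetical example problem: minimize f(x) subject to g(x) <= 0.
def f(x):
    return (x[0] - 2.0)**2 + (x[1] - 1.0)**2

def g(x):
    return x[0] + x[1] - 2.0  # feasible when g(x) <= 0

def penalized(x, r):
    # Quadratic exterior penalty: constraint violations cost r * max(0, g)^2.
    return f(x) + r * max(0.0, g(x))**2

x = np.array([0.0, 0.0])
for r in [1.0, 10.0, 100.0, 1000.0]:
    # Each unconstrained solve warm-starts from the previous solution,
    # giving the "sequence of unconstrained problems" described above.
    x = minimize(penalized, x, args=(r,)).x
print(x)  # approaches the constrained minimizer as r grows
```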

Slide 125: Sequential Linear Programming

Sequential Linear Programming (SLP) was developed in the early 1960s. It is also known as Kelley's cutting plane method, and as Griffith and Stewart's Method of Approximate Programming. SLP is considered unattractive by theoreticians; however, the concept has proven to be quite powerful and efficient for engineering design. The basic concept is to first linearize the objective and constraints and then solve this linear problem with an optimizer of choice.

Slide 126: SLP Algorithm

Basic idea: use linear approximations of the nonlinear functions and apply standard linear programming techniques, repeating the process successively as the optimization proceeds.

Major concern: how far from the point of interest are these approximations valid? This problem is generally addressed by introducing move limits (often set by trial and error). Note that suitable move limits depend on the degree of nonlinearity.

The traditional Griffith and Stewart method uses the first-order Taylor expansion:

$f(x) \approx f(x_0) + \nabla f(x_0)^T (x - x_0)$
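The following is a minimal sketch of an SLP loop in Python, assuming a simple smooth test problem; the finite-difference gradients, the shrinking move-limit schedule, and the use of scipy's linprog are illustrative choices, not the original implementation.

```python
import numpy as np
from scipy.optimize import linprog, approx_fprime

# Hypothetical test problem: minimize f(x) subject to g(x) <= 0.
def f(x):
    return x[0]**2 + x[1]**2

def g(x):
    return 1.0 - x[0] - x[1]  # feasible when x0 + x1 >= 1

x = np.array([2.0, 2.0])
move_limit = 0.5  # |d_i| <= move_limit keeps the linearization trustworthy

for _ in range(30):
    grad_f = approx_fprime(x, f, 1e-6)
    grad_g = approx_fprime(x, g, 1e-6)
    # Linearized subproblem in the step d:
    #   minimize  grad_f . d
    #   s.t.      g(x) + grad_g . d <= 0,   -move_limit <= d_i <= move_limit
    res = linprog(c=grad_f,
                  A_ub=grad_g.reshape(1, -1),
                  b_ub=np.array([-g(x)]),
                  bounds=[(-move_limit, move_limit)] * len(x))
    x = x + res.x
    move_limit *= 0.8  # shrink the move limits as the iterates settle
print(x)  # approaches the constrained optimum near (0.5, 0.5)
```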

Slide 127: Variations on SLP

Better results can be obtained by retaining the second-order terms of the Taylor series expansion, i.e.,

$f(x) \approx f(x_0) + \nabla f(x_0)^T \Delta x + \frac{1}{2} \Delta x^T H(x_0) \Delta x$

where $\Delta x = x - x_0$ and $H(x_0)$ is the Hessian at $x_0$.

Slide 128: ALP Algorithm in DSIDES

The ALP algorithm is a variation on SLP with some unique features:
- the use of second-order terms in the linearization,
- the normalization of the constraints and goals, and their transformation into generally well-behaved convex functions in the region of interest,
- an “intelligent” constraint suppression and accumulation scheme.

ALP uses only the diagonal terms of the Hessian, so the second-order Taylor series becomes:

$f(x) \approx f(x_0) + \nabla f(x_0)^T \Delta x + \frac{1}{2} \sum_i \left. \frac{\partial^2 f}{\partial x_i^2} \right|_{x_0} \Delta x_i^2$
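A short Python sketch of this diagonal-Hessian (separable quadratic) approximation; the test function and the finite-difference scheme are assumptions for illustration only.

```python
import numpy as np

# Illustrative test function (not from the slides).
def f(x):
    return x[0]**2 * x[1] + np.sin(x[1])

def diag_quadratic_model(f, x0, h=1e-4):
    """Build q(dx) ~ f(x0) + grad . dx + 0.5 * sum_i H_ii * dx_i^2."""
    n = len(x0)
    f0 = f(x0)
    grad = np.empty(n)
    hdiag = np.empty(n)
    for i in range(n):
        e = np.zeros(n)
        e[i] = h
        fp, fm = f(x0 + e), f(x0 - e)
        grad[i] = (fp - fm) / (2 * h)           # central-difference gradient
        hdiag[i] = (fp - 2 * f0 + fm) / h**2    # diagonal Hessian entry only
    return lambda dx: f0 + grad @ dx + 0.5 * np.sum(hdiag * dx**2)

q = diag_quadratic_model(f, np.array([1.0, 2.0]))
print(q(np.array([0.1, -0.1])))  # separable quadratic estimate of f near x0
```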

Slide 129: Creation of a Hyperplane

[Figure] The hatched area represents the linear hyperplane in the three-dimensional space. Note that g(x) is the objective function value.

Slide 130: Creation of a Second Improved Hyperplane

Given the solution of the first approximation, one can construct an improved hyperplane by using this solution together with the existing quadratic approximation.

Error test: $e_q = g_h(X_F) - g_q(X_F)$
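One way such an error test might be coded, assuming $g_h$ is the hyperplane (linear) approximation, $g_q$ the quadratic approximation, and X_F the solution of the linearized subproblem; the tolerance and the keep-or-rebuild interpretation are assumptions, not from the slides.

```python
def error_test(g_h, g_q, x_f, tol=1e-3):
    # e_q = g_h(X_F) - g_q(X_F): the gap between the hyperplane and the
    # quadratic approximation at the subproblem solution X_F.
    e_q = g_h(x_f) - g_q(x_f)
    # A small |e_q| suggests the current hyperplane still tracks the
    # quadratic model; otherwise build an improved hyperplane at X_F.
    return abs(e_q) <= tol
```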

Slide 131: Constraint Accumulation

[Figure] Three-dimensional view of constraint accumulation, i.e., previous approximations of a constraint are "remembered" in order to improve the overall approximation.
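A minimal sketch of accumulation for a convex constraint, where each linearization is kept as a cut and the pointwise maximum of the cuts forms the improved approximation; the constraint and the linearization points are illustrative assumptions.

```python
import numpy as np

# Illustrative convex constraint g(x) <= 0 (unit disk).
def g(x):
    return x @ x - 1.0

cuts = []  # accumulated (gradient, offset) pairs, one per linearization

def add_cut(x_k):
    grad = 2.0 * x_k  # analytic gradient of g at x_k
    cuts.append((grad, g(x_k) - grad @ x_k))

def accumulated_g(x):
    # Pointwise max of all stored linearizations. Because g is convex,
    # every cut underestimates g, so the max is a tighter approximation
    # than any single cut.
    return max(grad @ x + b for grad, b in cuts)

for x_k in [np.array([1.0, 0.0]), np.array([0.0, 1.0]), np.array([0.7, 0.7])]:
    add_cut(x_k)  # "remember" each previous approximation
print(accumulated_g(np.array([0.5, 0.5])))  # tighter than any single cut
```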

Slide 132: Convexity

Accumulation is only useful for convex constraints. Why not for non-convex ones? For a convex constraint every linearization underestimates the function, so accumulated cuts only tighten the approximation; for a non-convex constraint an old linearization can cut away feasible regions. [Figure: a constraint with a low degree of convexity]

Slide 133: Adaptation of Convex Constraint

[Figure] The new, improved constraint approximation is added to the first approximation of a constraint with a high degree of convexity.

Slide 134: Adaptation of Non-Convex Constraint

[Figure] The original (first) linear approximation is replaced (or modified) by the new approximation.

Slide 135: Other ALP Features

The software implementation has various features (see the manual):
- The linear solver is a version of Revised Multiplex; hence lexicographic multi-objective optimization is possible.
- Generation of an initial point and exploration of the design space.
- Adaptive reduced move (optional).
- Automatic constraint suppression (embodied in the code).
- Perturbation step size control.
- ...