MATLAB Optimization
Greg Reese, Ph.D.
Research Computing Support Group
Miami University

MATLAB Optimization © Greg Reese. All rights reserved.

Function Summary
Functions discussed (the original slide tabulates each one's parameter, linearity, constraints, and location):
fgoalattain, fminbnd, fmincon, fminimax, fminsearch, fminunc, fzero, lsqlin, lsqnonlin, lsqnonneg

Optimization
Optimization - finding the value of a parameter that maximizes or minimizes a function of that parameter
– Talking about mathematical optimization, not optimization of computer code!
– "function" means a mathematical function, not a MATLAB language function

Optimization
Optimization - finding the value of a parameter that maximizes or minimizes a function of that parameter
– Can have multiple parameters
– Can have multiple functions
– Parameters can appear linearly or nonlinearly

Linear programming
– "programming" means determining feasible programs (plans, schedules, allocations) that are optimal with respect to a certain criterion
– Most often used kind of optimization
– Tremendous number of practical applications
Won't discuss it further

fminbnd
The basic 1D optimizing routine is fminbnd, which attempts to find a minimum of a function of one variable within a fixed interval.
– function must be continuous
– only real values (not complex)
– only scalar values (not vector)
– might only find a local minimum

fminbnd
x = fminbnd( @fun, x1, x2 )
– x1 < x2 is the minimization interval
– x is the x-value in [x1,x2] where the minimum occurs
– fun is the function whose minimum we're looking for. fun must accept exactly one scalar argument and must return exactly one scalar value
Because there are no restrictions on the value of fun, finding the minimum is called unconstrained optimization.

fminbnd
x = fminbnd( @fun, x1, x2 )
@fun is called a function handle. You must include the @ when you call fminbnd.
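For example (not from the slides), you can pass a handle to MATLAB's built-in cos function and minimize it on the interval [3,4]:
>> x = fminbnd( @cos, 3, 4 )
x =
    3.1416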

fminbnd Try It
Find the minimum of f(x) = ( x – 3 )² – 1 on the interval [0,5]
Step 1 – make an m-file with the function
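The m-file appears only as an image on the slide; a minimal version consistent with the call on the next slide (which uses the name parabola) would be a file parabola.m containing:
function y = parabola( x )
% PARABOLA  f(x) = (x - 3)^2 - 1, the function to minimize
y = ( x - 3 ).^2 - 1;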

fminbnd Try It
Step 2 – call fminbnd with your function
>> x = fminbnd( @parabola, 0, 5 )
x =
     3
If you want to get the value of the function at its minimum, call the function with the minimum x value returned by fminbnd
>> parabola(x)
ans =
    -1

fminbnd Try It
Step 3 – verify the result is a global minimum by plotting the function over a range, say [-10 10]
>> ezplot( @parabola, [-10 10] )

fminunc
The multidimensional equivalent of fminbnd is fminunc. It attempts to find a local minimum of a function of one or more variables.
– function must be continuous
– only real values (not complex)
– only scalar values (not vector)
– might only find a local minimum

fminunc
x = fminunc( @fun, x0 )
– x0 is your guess of where the minimum is
– fun is the function whose minimum we're looking for. fun must accept exactly one scalar or vector argument and must return exactly one scalar value

fminunc
x = fminunc( @fun, x0 )
fun only has one argument. If the argument is a vector, each element represents the corresponding independent variable in the function.
Example: f(x,y,z) = x² + y² + z²
function w = f( x )
w = x(1)^2 + x(2)^2 + x(3)^2;
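To minimize this with fminunc you pass a three-element vector as the initial guess (the guess [1 1 1] below is just an illustration, not from the slides):
>> x0 = [1 1 1];
>> x = fminunc( @f, x0 )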

fminunc
Optimizing routines such as fminunc are often sensitive to the initial value. For bad initial values they may converge to the wrong answer, or not converge at all.
Tip – if fun is a function of one or two variables, plot it first to get an idea of where the minimum is

fminunc Try It
Find the minimum of f(x) = ( x – π )² – 1
Step 1 – make an m-file with the function
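The m-file is again shown only as an image; a minimal sketch, assuming the file is named parabola_pi.m (the slide's actual name isn't recoverable), would be:
function y = parabola_pi( x )
% PARABOLA_PI  f(x) = (x - pi)^2 - 1   (file/function name assumed)
y = ( x - pi ).^2 - 1;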

fminunc Try It
Step 2 – plot the function to get an estimate of the location of its minimum
Looks like the min is at about 3
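One way to produce a plot like the one on the slide (using the assumed name parabola_pi and an arbitrary plotting range):
>> ezplot( @parabola_pi, [0 6] )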

fminunc Try It
Step 3 – call fminunc with your function and an initial guess of 3
>> xmin = fminunc( @parabola_pi, 3 )
Warning: Gradient must be provided for trust-region method; using line-search method instead.
> In fminunc at 247
(Ignore this stuff)
Optimization terminated: relative infinity-norm of gradient less than options.TolFun.
xmin =

fminunc Try It
Find the minimum of f(x,y) = ( x – 22.22 )² + ( y – … )²
Step 1 – make an m-file with the function
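A sketch of the m-file, assuming the function name paraboloid used on the later slide; the second center coordinate was lost from the slide, so yc below is a stand-in, not the slide's actual number:
function w = paraboloid( x )
% PARABOLOID  f(x,y) = (x - 22.22)^2 + (y - yc)^2
yc = 55.55;    % stand-in value; the slide's actual constant isn't recoverable
w = ( x(1) - 22.22 )^2 + ( x(2) - yc )^2;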

fminunc Try It
Step 2 – plot the function to get an estimate of the location of its minimum
ezsurf expects a function that takes two arguments. Since paraboloid only takes one, we need to make a different version that takes two. Let's call it paraboloid_xy
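A two-argument version for ezsurf might look like this (same stand-in yc as above):
function w = paraboloid_xy( x, y )
% PARABOLOID_XY  two-argument version of paraboloid, for plotting with ezsurf
yc = 55.55;    % stand-in value
w = ( x - 22.22 ).^2 + ( y - yc ).^2;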

fminunc Try It
Step 2 – plot the function to get an estimate of the location of its minimum
>> ezsurf( @paraboloid_xy, [ … ] )
After zooming and panning some, it looks like the min is at about x=20 and y=50

fminunc Try It
Step 3 – call fminunc with your function and an initial guess of [20 50]
>> x = fminunc( @paraboloid, [20 50] )
x =
>> paraboloid( x )
ans =

fminunc Try It
Find the minimum of a 5D hypersphere centered at [1 2 3 4 5]:
f(x,y,z,a,b) = (x–1)² + (y–2)² + (z–3)² + (a–4)² + (b–5)²
Step 1 – make an m-file with the function
Use 3 dots (...) to continue a line in a file.
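A sketch of hypersphere.m (the name matches the call on the next slide; the file itself is an image on the slide), showing the three-dot line continuation:
function w = hypersphere( x )
% HYPERSPHERE  squared distance from x to the point [1 2 3 4 5]
w = ( x(1) - 1 )^2 + ( x(2) - 2 )^2 + ( x(3) - 3 )^2 ...
    + ( x(4) - 4 )^2 + ( x(5) - 5 )^2;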

fminunc Try It
Step 2 – can't plot 5D, so guess [ … ]
Step 3 – call fminunc with your function and an initial guess of [ … ]
>> xmin = fminunc( @hypersphere, [ … ] )
xmin =
>> hypersphere( xmin )
ans =
    …e-012 ≈ 0

MATLAB Optimization
Questions?

The End