Institute of Biophysics and Biomedical Engineering, Bulgarian Academy of Sciences
OLYMPIA ROEVA
105 Acad. George Bonchev Str., 1113 Sofia, Bulgaria


1. INTRODUCTION

Within the broad field of modelling and management of nonlinear fermentation process systems, the objective of global optimization is the analysis and application of [35]:
1. nonlinear decision models that (may) possess multiple optimal solutions;
2. suitable solution algorithms that are able to find the absolutely best (global) solution(s).

Dynamic modelling of nonlinear fermentation process systems is formulated as an inverse problem: it requires a well-suited mathematical model and an efficient computational method to determine the model structure and parameters. Numerical integration of the differential equations and the search for globally optimal parameter values remain the two major challenges in parameter estimation of nonlinear dynamic fermentation process (FP) systems.

The use of optimization in engineering grows every day as the computational capabilities of computers increase. Today, calculations can be performed in a fraction of the time they took just a few years ago, and the applications of numerical optimization have therefore increased dramatically. A great part of the design process is, and will always be, intuitive; nevertheless, analytical techniques as well as numerical optimization are of great value and can permit vast improvements in designs.

Real engineering design problems are generally characterized by the presence of many, often conflicting and incommensurable, objectives. This raises two questions: how the different objectives should be combined to yield a final solution, and how to search for an optimal solution to the design problem.
This chapter presents a survey of methods and techniques for conducting numerical optimization in a parameter estimation context, and of different ways of developing hybrids among them.

2. METHODOLOGY

Parameter estimation of a nonlinear fed-batch fermentation model. The mathematical formulation of a nonlinear fed-batch fermentation model is derived from the mass balance of the process.

2.1. Problem formulation

The basic ingredients of an optimization problem are:
1. An objective function. Two interesting exceptions are problems with no objective function and problems with multiple objective functions.
2. Unknowns or variables. In a fitting-the-data problem, the unknowns are the parameters that define the FP model.
3. A set of constraints.

The optimization problem is then: find values of the variables that minimize or maximize the objective function while satisfying the constraints.

2.2. General optimization methods

A first distinction is between local search and global search (global optimization). More generally, optimization methods can be classified as: unconstrained versus constrained; derivative-based versus derivative-free; deterministic versus stochastic; local versus global; continuous versus discrete.

Unconstrained optimization methods include the Newton method and modifications such as line-search variants, trust-region variants and truncated Newton methods, as well as difference approximations, quasi-Newton methods, nonlinear conjugate gradient methods and the nonlinear Simplex method. For nonlinear least squares there are the Gauss-Newton method, the Levenberg-Marquardt method and hybrid methods (a hybrid strategy combines the Gauss-Newton and BFGS quasi-Newton algorithms).
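The fitting-the-data formulation above can be made concrete with a small sketch. The model below is an assumption for illustration only, not the chapter's exact equations: it uses generic Monod-type kinetics, a constant feed rate F and simple Euler integration, and the parameter names (mu_max, k_S, Y_SX) and all numeric defaults except the initial conditions X(0) = 1.25 g/l, S(0) = 0.81 g/l, V(0) = 1.35 l and S_in = 100 g/l (quoted later in the chapter) are hypothetical.

```python
# Hedged sketch of the fed-batch parameter estimation problem.
# Assumed mass-balance model (NOT the chapter's exact equations):
#   dX/dt = mu*X - (F/V)*X
#   dS/dt = -mu*X/Y_SX + (F/V)*(S_in - S)
#   dV/dt = F,   with  mu = mu_max*S/(k_S + S)   (Monod kinetics)

def simulate_fed_batch(params, t_end=10.0, dt=0.01,
                       X0=1.25, S0=0.81, V0=1.35, S_in=100.0, F=0.05):
    """Integrate the assumed model with Euler steps; return the (X, S) trajectory."""
    mu_max, k_S, Y_SX = params          # hypothetical parameter vector
    X, S, V = X0, S0, V0
    traj = []
    for _ in range(int(t_end / dt)):
        mu = mu_max * S / (k_S + S)     # Monod specific growth rate
        D = F / V                       # dilution rate
        dX = mu * X - D * X
        dS = -mu * X / Y_SX + D * (S_in - S)
        X, S, V = X + dt * dX, max(S + dt * dS, 0.0), V + dt * F
        traj.append((X, S))
    return traj

def objective(params, data):
    """Least-squares distance between data and model-predicted X and S --
    the fitting-the-data objective function of the problem formulation."""
    model = simulate_fed_batch(params)
    return sum((xm - xd) ** 2 + (sm - sd) ** 2
               for (xm, sm), (xd, sd) in zip(model, data))
```

An optimizer (local or global) would then minimize `objective` over bounds on (mu_max, k_S, Y_SX); the unknowns are the kinetic parameters, and the constraints are the parameter bounds.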
Methods for systems of nonlinear equations include trust-region and line-search methods, the truncated Newton method, the Broyden method, tensor methods, and homotopy and continuation methods.

Constrained optimization covers nonlinearly constrained optimization, bound-constrained optimization, quadratic programming, linear programming (the Simplex algorithm, interior-point and primal-dual interior-point algorithms) and semidefinite programming.

2.3. Global optimization methods

The main elements of a global optimization method are its strategy for choosing points, its stopping conditions and its notion of solvability (e.g. convergence with probability one).

Global optimization strategies include branch and bound, multistart and clustering methods, and evolutionary algorithms. The evolutionary algorithms include: genetic programming, which evolves programs; evolutionary programming, which focuses on optimizing continuous functions without recombination; evolution strategies, which focus on optimizing continuous functions with recombination; and genetic algorithms (GAs), which focus on optimizing general combinatorial problems.

Genetic algorithms. There is an abundance of different types of GAs, such as simple GAs, steady-state GAs, GAs with multiple populations, GAs with crowding and sharing techniques, and many more. The different GAs have different features in order to solve different types of problems. GAs are very robust and can handle all types of fitness landscapes and mixtures of real and discrete parameters.

Further global methods include simulated annealing; other metaheuristics such as Tabu search (TS), ant colony optimization (ACO) and particle swarm methods; adaptive stochastic search algorithms; and statistical global optimization algorithms.

Hybrid methods. A wide range of hybrid global optimization algorithms has been developed. Some examples are: Mixed Integer Nonlinear Programming (MINLP); tree annealing; simulated annealing pipelining hybrids; asynchronous hybrids; hierarchical hybrids; and hybrids with additional operators.
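The GA loop described above (a population evolved by selection, recombination and mutation) can be sketched minimally. This is a generic real-coded GA with tournament selection, blend crossover and Gaussian mutation, assumed for illustration; it is not the chapter's simple, multipopulation or modified GA, and all rates and sizes are illustrative defaults.

```python
import random

def genetic_algorithm(fitness, bounds, pop_size=60, generations=120,
                      crossover_rate=0.9, mutation_rate=0.1, seed=1):
    """Minimal real-coded GA (illustrative): maximizes `fitness` over the
    box given by `bounds`, a list of (low, high) pairs per parameter."""
    rng = random.Random(seed)

    def clip(v, i):
        lo, hi = bounds[i]
        return min(max(v, lo), hi)

    # Random initial population; only objective-function values are needed.
    pop = [[rng.uniform(lo, hi) for lo, hi in bounds] for _ in range(pop_size)]
    best = max(pop, key=fitness)
    for _ in range(generations):
        children = []
        while len(children) < pop_size:
            # Tournament selection: the fitter of two random individuals.
            p1 = max(rng.sample(pop, 2), key=fitness)
            p2 = max(rng.sample(pop, 2), key=fitness)
            if rng.random() < crossover_rate:   # blend (arithmetic) crossover
                a = rng.random()
                child = [a * x + (1 - a) * y for x, y in zip(p1, p2)]
            else:
                child = list(p1)
            for i in range(len(child)):         # Gaussian mutation, clipped
                if rng.random() < mutation_rate:
                    span = bounds[i][1] - bounds[i][0]
                    child[i] = clip(child[i] + rng.gauss(0, 0.1 * span), i)
            children.append(child)
        pop = children
        gen_best = max(pop, key=fitness)        # track the best-so-far point
        if fitness(gen_best) > fitness(best):
            best = gen_best
    return best, fitness(best)
```

Note that the loop uses only fitness values, never derivatives, which is exactly the property the chapter's conclusion emphasizes.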
3. RESULTS AND DISCUSSION

The following optimization methods are compared: a simple GA, a multipopulation GA and a modified GA, as well as the methods available in Matlab: the sequential quadratic programming method, the Nelder-Mead Simplex method, the BFGS quasi-Newton method, the steepest descent method and the Minimax method. Matlab implementations of the optimization procedures are used for all tests, and the computations are performed on a PC/Pentium IV (3 GHz) platform running Windows XP. All methods are compared on the following criteria: number of iterations, CPU time, parameter values and function value.

3.1. Maximization of a function of two variables

The test function, shown in Fig. 1, is a 2-D landscape to be maximized.

Fig. 1. Test function

The maximum f(x1, x2) = 1 is at (x1, x2) = (0.6, 0.1) and corresponds to the peak of the second, narrower Gaussian. The test function is a hard global optimization problem: there are only two local maxima, with the global maximum covering about 1% of the parameter space, and for this function moving towards the secondary maximum pulls solutions away from the global maximum. Detailed results of the function optimization are given in Table 1, and the time variations of f(x1, x2) and of the parameters for the Simplex method are presented in Fig. 2.

Table 1. Optimization results

Fig. 2. Simplex method: a) time variation of x1 and x2; b) time variation of f(x1, x2)

3.2. Parameter estimation with generated data

In this case generated data for the state variables X and S are used. The substrate concentration in the feed is 100 g/l, and the initial values of the variables are X(0) = 1.25 g/l, S(0) = 0.81 g/l and V(0) = 1.35 l. The parameter estimation problem of the presented nonlinear dynamic system is stated as the minimization of a distance measure J between the generated and the model-predicted values of the considered state variables X and S (Eqs. (7) and (8)).

Table 2. Optimization results

Fig. 3. Time profiles of the state variables: a) biomass concentration; b) substrate concentration; c) acetate concentration
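The difficulty just described, a narrow global peak next to a broad local one, is easy to reproduce. Since the chapter's exact expression is not reproduced in this transcript, the function below is a hypothetical stand-in with the same qualitative features (peak of height 1 at (0.6, 0.1); the widths and the 0.8 local-peak height are assumptions). A naive local search started near the broad peak stalls there, while a multistart strategy recovers by restarting from many random points.

```python
import math, random

def landscape(x1, x2):
    """Hypothetical two-Gaussian stand-in for the chapter's test function:
    a broad local peak (height 0.8) and a narrow global peak of height ~1
    at (0.6, 0.1)."""
    broad = 0.8 * math.exp(-((x1 - 0.2) ** 2 + (x2 - 0.7) ** 2) / 0.05)
    narrow = math.exp(-((x1 - 0.6) ** 2 + (x2 - 0.1) ** 2) / 0.001)
    return broad + narrow

def hill_climb(f, start, step=0.01, iters=2000, seed=0):
    """Naive local search on the unit square (domain assumed): accept
    random nearby moves that improve f."""
    rng = random.Random(seed)
    x, fx = list(start), f(*start)
    for _ in range(iters):
        cand = [min(max(xi + rng.uniform(-step, step), 0.0), 1.0) for xi in x]
        fc = f(*cand)
        if fc > fx:
            x, fx = cand, fc
    return x, fx

def multistart(f, n_starts=30, seed=0):
    """Multistart strategy: local search from many random points, keep the
    best result -- the simple global remedy listed among the chapter's
    global optimization strategies."""
    rng = random.Random(seed)
    best = None
    for _ in range(n_starts):
        start = [rng.random(), rng.random()]
        x, fx = hill_climb(f, start, seed=rng.randrange(10 ** 6))
        if best is None or fx > best[1]:
            best = (x, fx)
    return best
```

Started near the broad peak, the climber converges to the secondary maximum of about 0.8 and never crosses the valley to the global peak, which is the trap behaviour the test function is designed to expose.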
3.3. Parameter estimation of a fed-batch fermentation model of E. coli MC4110

Parameter identification using the multipopulation GA, the quasi-Newton method, the Simplex search method and the steepest descent method is performed. The model predictions of the state variables, based on the four sets of estimated parameters, are compared to experimental data points of the E. coli cultivation. The simulation results are presented in Fig. 4.

4. CONCLUSION

The concurrent nature of GAs implies that they are much more likely to locate a global peak than traditional techniques. Conventional search methods work extremely well provided one knows where to start; in the problem considered here, proper initial values of the parameters are unknown. Due to the parallel nature of the genetic algorithm, its performance is much less sensitive to the initial conditions: in effect, a GA makes hundreds or even thousands of initial guesses. Compared with traditional optimization methods, a GA simultaneously evaluates many points in the parameter space and is therefore more likely to converge towards the global solution. A GA does not assume that the search space is differentiable or continuous, and it can iterate several times on each data set received.

A GA requires only information about the quality of the solution produced by each parameter set (objective function values). This differs from optimization methods that require derivative information or, worse yet, complete knowledge of the problem structure and parameters. Since GAs do not demand such problem-specific information, they are more flexible than most search methods; they also do not require linearity in the parameters, which is needed in iterative searching optimization techniques. Simulation results reveal that accurate and consistent results can be obtained using GAs. These properties make GAs suitable and applicable for parameter estimation of fermentation process models.
ACKNOWLEDGEMENTS

The present work is supported by the Bulgarian National Science Fund, grants № DMU 02/4 (2009) and DID 02-29/2009.