INCLUDING UNCERTAINTY MODELS FOR MORE EFFICIENT SURROGATE BASED DESIGN OPTIMIZATION The EGO algorithm STRUCTURAL AND MULTIDISCIPLINARY OPTIMIZATION GROUP Raphael T. Haftka Felipe A. C. Viana
2 BACKGROUND: SURROGATE MODELING Forrester AIJ and Keane AJ, Recent advances in surrogate-based optimization, Progress in Aerospace Sciences, Vol. 45, No. 1–3. Surrogates replace expensive simulations in design optimization: kriging (KRG), polynomial response surfaces (PRS), support vector regression (SVR), and radial basis neural networks (RBNN). Example: the surrogate prediction ŷ(x) is an estimate of the true response y(x).
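The surrogate idea can be sketched in a few lines. This is a minimal illustration assuming a scikit-learn toolchain (the slides do not name an implementation), with a cheap 1-D test function standing in for the expensive simulation:

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.svm import SVR

def expensive_simulation(x):
    # Stand-in for an expensive solver: a simple 1-D test function.
    return (6 * x - 2) ** 2 * np.sin(12 * x - 4)

X = np.linspace(0, 1, 8).reshape(-1, 1)      # design of experiments (DOE)
y = expensive_simulation(X).ravel()

krg = GaussianProcessRegressor().fit(X, y)   # kriging (KRG)
svr = SVR(kernel="rbf").fit(X, y)            # support vector regression (SVR)

x_new = np.array([[0.35]])
y_hat_krg = krg.predict(x_new)               # ŷ(x): an estimate of y(x)
y_hat_svr = svr.predict(x_new)
```

Once fitted, either surrogate can be evaluated thousands of times at negligible cost, which is what makes surrogate-based optimization practical.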
3 BACKGROUND: UNCERTAINTY Some surrogates also provide an uncertainty estimate, the standard error s(x); examples include kriging and polynomial response surfaces. These estimates are what EGO uses.
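A kriging-type surrogate returns s(x) alongside the prediction. A minimal sketch, again assuming scikit-learn (not named in the slides):

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor

X = np.linspace(0, 1, 6).reshape(-1, 1)
y = np.sin(10 * X).ravel()

krg = GaussianProcessRegressor().fit(X, y)

x_new = np.array([[0.55]])
# return_std=True gives the standard error s(x) with the prediction ŷ(x).
y_hat, s = krg.predict(x_new, return_std=True)
```

The standard error is small near sampled points and grows between them, which is exactly the information EGO exploits.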
4 Outline Efficient Global Optimization (EGO); importing uncertainty models so that EGO can be used with surrogates that lack them; a simple multiple-surrogate EGO to benefit from parallel computation.
5 KRIGING FIT AND PRESENT BEST SOLUTION Jones DR, Schonlau M, and Welch W, Efficient global optimization of expensive black-box functions, Journal of Global Optimization, Vol. 13, No. 4. First we sample the function and fit a kriging model; we note the present best sample (PBS).
6 THE EXPECTED IMPROVEMENT QUESTION Then we ask: Of all the points where we will improve, where are we most likely to improve significantly upon the present best sample?
7 WHAT IS EXPECTED IMPROVEMENT? Consider the point x = 0.8 and the random variable Y, which represents the possible values of the function there. Its mean is the kriging prediction, which is near zero.
8 EXPLORATION AND EXPLOITATION EGO maximizes E[I(x)] to find the next point to be sampled. The expected improvement balances exploration and exploitation because it can be high either because of high uncertainty or low surrogate prediction. When can we say that the next point is “exploration?”
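The exploration/exploitation balance can be made concrete. Below is a sketch of the standard expected-improvement formula from Jones et al. for minimization, E[I(x)] = (y_best − ŷ)Φ(z) + s·φ(z) with z = (y_best − ŷ)/s, assuming NumPy and SciPy:

```python
import numpy as np
from scipy.stats import norm

def expected_improvement(y_hat, s, y_best):
    """E[I(x)] for minimization: large either when the prediction y_hat is
    low (exploitation) or when the standard error s is high (exploration)."""
    y_hat = np.asarray(y_hat, dtype=float)
    s = np.asarray(s, dtype=float)
    with np.errstate(divide="ignore", invalid="ignore"):
        z = (y_best - y_hat) / s
        ei = (y_best - y_hat) * norm.cdf(z) + s * norm.pdf(z)
    return np.where(s > 0, ei, 0.0)   # zero uncertainty gives zero E[I]
```

At a point whose prediction equals the present best sample, E[I] reduces to s·φ(0) ≈ 0.3989·s, so it is driven purely by uncertainty; that is the "exploration" limit of the formula.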
9 THE BASIC EGO WORKS WITH KRIGING (a) Kriging. (b) Support vector regression. Why not run EGO with the most accurate surrogate, as judged by the root mean square error of the fit?
10 BUT SVR DOES NOT HAVE UNCERTAINTY Say we have two surrogates: (1) kriging (KRG), which comes with an uncertainty estimate, and (2) support vector regression (SVR), which has NO uncertainty estimate. But we want SVR for EGO. Viana FAC and Haftka RT, Importing Uncertainty Estimates from One Surrogate to Another, 50th AIAA/ASME/ASCE/AHS/ASC Structures, Structural Dynamics, and Materials Conference, Palm Springs, USA, May 4–7, AIAA.
11 IMPORTATION AT A GLANCE First, we generate the standard error of kriging.
12 IMPORTATION AT A GLANCE Then we combine the prediction from support vector regression with the standard error from kriging.
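The importation idea can be sketched end to end: use the SVR prediction as the mean and the kriging standard error as the uncertainty inside expected improvement. This is an illustrative sketch with an assumed scikit-learn stack and a 1-D test function, not the authors' original implementation:

```python
import numpy as np
from scipy.stats import norm
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.svm import SVR

def expensive_simulation(x):
    return (6 * x - 2) ** 2 * np.sin(12 * x - 4)

X = np.linspace(0, 1, 8).reshape(-1, 1)
y = expensive_simulation(X).ravel()

krg = GaussianProcessRegressor().fit(X, y)
svr = SVR(kernel="rbf").fit(X, y)

x_grid = np.linspace(0, 1, 201).reshape(-1, 1)
_, s_krg = krg.predict(x_grid, return_std=True)   # imported standard error
y_hat_svr = svr.predict(x_grid)                   # non-kriging prediction
y_best = y.min()                                  # present best sample

# Expected improvement with SVR mean and kriging uncertainty.
with np.errstate(divide="ignore", invalid="ignore"):
    z = (y_best - y_hat_svr) / s_krg
    ei = (y_best - y_hat_svr) * norm.cdf(z) + s_krg * norm.pdf(z)
ei = np.where(s_krg > 0, ei, 0.0)

x_next = x_grid[np.argmax(ei)]                    # next point to sample
```

EGO then evaluates the true function at `x_next`, adds the result to the DOE, refits both surrogates, and repeats.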
13 BENEFITS OF IMPORTATION OF UNCERTAINTY Hartman3 function (initially fitted with 20 points): after 20 iterations (i.e., a total of 40 points), improvement (I) over the initial best sample (results table not reproduced in this transcript). Run EGO with non-kriging models!!!
14 TWO OTHER DESIGNS OF EXPERIMENTS (figures of the first and second DOEs not reproduced in this transcript)
15 SUMMARY OF THE HARTMAN3 EXERCISE In 34 of the 100 DOEs, KRG outperforms RBNN (and in those cases the difference between the improvements has a mean of only 0.8%). Box plot of the difference between the improvements offered by the different surrogates (over 100 DOEs).
16 EGO WITH MULTIPLE SURROGATES Traditional EGO uses kriging to generate one point at a time. We use multiple surrogates to get multiple points.
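One cycle of the multiple-surrogate idea can be sketched as follows: each surrogate proposes its own maximum-E[I] point (the non-kriging surrogate borrowing the kriging standard error), so the new simulations can run in parallel. As before, this is an assumed scikit-learn sketch, not the authors' code:

```python
import numpy as np
from scipy.stats import norm
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.svm import SVR

def f(x):
    # Cheap stand-in for the expensive simulation.
    return (6 * x - 2) ** 2 * np.sin(12 * x - 4)

X = np.linspace(0, 1, 8).reshape(-1, 1)
y = f(X).ravel()
krg = GaussianProcessRegressor().fit(X, y)
svr = SVR(kernel="rbf").fit(X, y)

x_grid = np.linspace(0, 1, 201).reshape(-1, 1)
y_hat_krg, s = krg.predict(x_grid, return_std=True)
y_hat_svr = svr.predict(x_grid)
y_best = y.min()

def ei(y_hat):
    # Both surrogates share the kriging standard error s.
    with np.errstate(divide="ignore", invalid="ignore"):
        z = (y_best - y_hat) / s
        val = (y_best - y_hat) * norm.cdf(z) + s * norm.pdf(z)
    return np.where(s > 0, val, 0.0)

# One proposal per surrogate per cycle -> two simulations in parallel.
x_next = [x_grid[np.argmax(ei(m))] for m in (y_hat_krg, y_hat_svr)]
```

With k surrogates, each EGO cycle yields k candidate points, so the same number of function evaluations is reached in a fraction of the wall-clock iterations.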
17 POTENTIAL OF EGO WITH MULTIPLE SURROGATES Hartman3 function (100 DOEs with 20 points) Overall, surrogates are comparable in performance.
18 POTENTIAL OF EGO WITH MULTIPLE SURROGATES “krg” runs EGO for 20 iterations, adding one point at a time; “krg-svr” and “krg-rbnn” run 10 iterations, adding two points per iteration. Multiple surrogates offer good results in half the time!!!