Joshua Knowles

ParEGO: an algorithm for multiobjective optimization of expensive functions

Supporting material for the papers:

  • Cristescu, B.-C. and Knowles, J. (2015) Surrogate-based multiobjective optimization: ParEGO update and test. Workshop on Computational Intelligence (UKCI 2015). PDF | Slides | NEW ParEGO Software
  • Knowles, J. (2006) ParEGO: A hybrid algorithm with on-line landscape approximation for expensive multiobjective optimization problems. IEEE Transactions on Evolutionary Computation, 10 (1): 50-66. Preprint version: PDF (981 KB) | Published version from IEEE Xplore | BibTeX
  • Knowles, J. (2004) ParEGO: A hybrid algorithm with on-line landscape approximation for expensive multiobjective optimization problems. Technical Report TR-COMPSYSBIO-2004-01, University of Manchester, Manchester, UK, September 2004. PDF | BibTeX
  • Knowles, J. and Hughes, E. J. (2005) Multiobjective optimization on a budget of 250 evaluations. Evolutionary Multi-Criterion Optimization (EMO 2005), LNCS 3410, pp. 176-190, Springer-Verlag. PDF | BibTeX

Abstract of the original 2004 technical report:

A great many optimization scenarios, particularly those arising in the experimental sciences and engineering, involve fitness evaluations that are expensive to perform, either financially or temporally, or both. In such applications, the computational overhead of the optimization algorithm is effectively irrelevant, whereas the quality of solutions obtained after realistically small numbers of function evaluations assumes primary significance, with worst-case performance, as well as average-case, being especially pertinent. These simple principles apply equally to multiobjective optimization as to optimization of a single objective function, yet nearly all multiobjective evolutionary algorithm (MOEA) studies (even some of those relating to expensive fitness functions) report performance after tens of thousands of evaluations, leaving open the question of performance on much shorter runs. In this paper, we concentrate on relatively low-dimensional, non-pathological, real-valued functions, such as arise in many applications, and assess the performance of two algorithms at just 100 and 250 function evaluations. The results show that NSGA-II, a popular MOEA, performs well compared to random search, even under the restricted number of evaluations used. A better performance (particularly in the worst case) is, however, achieved on our test set by an algorithm proposed herein - ParEGO - which is a simple extension of the single-objective efficient global optimization (EGO) algorithm of Jones et al. ParEGO uses a design-of-experiments inspired initialization procedure and learns a Gaussian process model of the search landscape, which is updated after every function evaluation. Overall, ParEGO exhibits a promising performance for multiobjective optimization problems where evaluations are expensive or otherwise restricted in number.
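The loop sketched in the abstract (design-of-experiments initialization, a random scalarizing weight vector each iteration, a Gaussian process model of the scalarized landscape, and selection of the next point by expected improvement) can be illustrated in compact form. The sketch below is a simplified illustration, not the published implementation: it uses fixed GP hyperparameters where ParEGO fits a DACE model by maximum likelihood, and random candidate sampling where ParEGO runs an evolutionary inner search; the augmented Tchebycheff parameter `rho=0.05` follows the paper, and all other names and defaults are illustrative assumptions.

```python
import math
import numpy as np

# Standard normal pdf and cdf (cdf via the error function).
_phi = lambda z: np.exp(-0.5 * z**2) / math.sqrt(2 * math.pi)
_Phi = np.vectorize(lambda z: 0.5 * (1.0 + math.erf(z / math.sqrt(2))))

def tchebycheff(F, w, rho=0.05):
    """Augmented Tchebycheff scalarization; objectives assumed normalized to [0, 1]."""
    wf = F * w
    return wf.max(axis=1) + rho * wf.sum(axis=1)

def fit_gp(X, y, length=0.3, noise=1e-8):
    """Tiny RBF-kernel GP with fixed hyperparameters (ParEGO instead fits a
    DACE/Kriging model by maximum likelihood; that step is omitted here)."""
    d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    K = np.exp(-d2 / (2 * length**2)) + noise * np.eye(len(X))
    L = np.linalg.cholesky(K)
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y))

    def predict(Q):
        d2q = ((Q[:, None, :] - X[None, :, :]) ** 2).sum(-1)
        Kq = np.exp(-d2q / (2 * length**2))
        mu = Kq @ alpha
        v = np.linalg.solve(L, Kq.T)
        var = np.maximum(1.0 - (v**2).sum(axis=0), 1e-12)
        return mu, np.sqrt(var)

    return predict

def parego(f, n_obj, bounds, n_init=11, budget=30, n_cand=2000, seed=0):
    """Hypothetical ParEGO-style loop; `f` maps a point to an objective vector."""
    rng = np.random.default_rng(seed)
    lo, hi = np.asarray(bounds, float).T
    dim = len(lo)
    # Latin hypercube initial design (the design-of-experiments step).
    u = (np.argsort(rng.random((n_init, dim)), axis=0)
         + rng.random((n_init, dim))) / n_init
    X = lo + (hi - lo) * u
    F = np.array([f(x) for x in X])
    while len(X) < budget:
        w = rng.dirichlet(np.ones(n_obj))              # fresh weight vector each iteration
        Fn = (F - F.min(0)) / (np.ptp(F, 0) + 1e-12)   # normalize objectives
        y = tchebycheff(Fn, w)
        ys = (y - y.mean()) / (y.std() + 1e-12)
        predict = fit_gp((X - lo) / (hi - lo), ys)     # model updated after every evaluation
        # Pick the next point by expected improvement over random candidates
        # (ParEGO itself uses an evolutionary inner search here).
        C = rng.random((n_cand, dim))
        mu, sd = predict(C)
        best = ys.min()
        z = (best - mu) / sd
        ei = (best - mu) * _Phi(z) + sd * _phi(z)
        x_new = lo + (hi - lo) * C[np.argmax(ei)]
        X = np.vstack([X, x_new])
        F = np.vstack([F, f(x_new)])
    return X, F

# Usage on a toy biobjective problem with a 30-evaluation budget:
toy = lambda x: np.array([x[0]**2 + x[1]**2, (x[0] - 1)**2 + x[1]**2])
X, F = parego(toy, n_obj=2, bounds=[(-1, 2), (-1, 2)], budget=30)
```

Drawing a new weight vector every iteration is what lets a single-objective surrogate method cover the whole Pareto front: each scalarized subproblem targets a different trade-off region, and all evaluated points are retained.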

Keywords: multiobjective optimization, expensive black-box functions, EGO, DACE, NSGA-II, landscape approximation, response surfaces, Pareto optima, test suites, performance assessment
