Tutorial (IEEE CEC 2023)
How to Compare Evolutionary Multi-Objective Optimization Algorithms: Parameter Specifications, Indicators and Test Problems
IEEE 2023 Congress on Evolutionary Computation (CEC)
1-5 July, 2023, Chicago, USA
Short introduction:
Evolutionary multi-objective optimization (EMO) has been a very active research area in recent years, with new EMO algorithms proposed almost every year. When a new EMO algorithm is proposed, computational experiments are usually conducted to compare its performance with that of existing algorithms.
The experimental results are then summarized and reported as tables, together with statistical significance test results, and they usually show higher performance for the new algorithm than for the existing ones. However, a fair comparison of different EMO algorithms is not easy, partly because the evaluated performance of each algorithm depends on the experimental settings, and partly because solution sets, rather than single solutions, are evaluated.
In this tutorial, we will first explain some commonly-used software platforms and experimental settings for the comparison of EMO algorithms. Then, we will discuss how to specify the common settings of computational experiments that are used by all the compared EMO algorithms. More specifically, this tutorial focuses on the settings related to the following four issues: (i) termination condition, (ii) population size, (iii) performance indicators, and (iv) test problems. For each issue, we will clearly demonstrate its strong effects on the comparison results of EMO algorithms, and then discuss how to handle it for a fair comparison. These discussions aim to encourage the future development of the EMO research field without focusing too much on overly specialized new algorithms tailored to a specific setting. Finally, we will also suggest some promising future research topics related to each issue.
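To see why evaluating solution sets (rather than single solutions) complicates fair comparison, consider the inverted generational distance (IGD), a commonly used performance indicator. The sketch below is only an illustration, not code from the tutorial; the two solution sets and the reference set are hypothetical values chosen to show that an indicator scores an entire set, and that the score depends on the chosen reference set.

```python
import math

def igd(solution_set, reference_set):
    """Inverted Generational Distance: the mean Euclidean distance from
    each reference point to its nearest solution (lower is better)."""
    def dist(a, b):
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))
    return sum(min(dist(r, s) for s in solution_set)
               for r in reference_set) / len(reference_set)

# Two solution SETS for a bi-objective minimization problem
# (hypothetical values, for illustration only).
set_a = [(0.0, 1.0), (0.5, 0.5), (1.0, 0.0)]
set_b = [(0.1, 0.9), (0.9, 0.1)]

# The score depends on this reference set, which is one reason why
# comparison results are sensitive to experimental settings.
reference = [(0.0, 1.0), (0.25, 0.75), (0.5, 0.5),
             (0.75, 0.25), (1.0, 0.0)]

print(igd(set_a, reference))  # the better-covering set scores lower
print(igd(set_b, reference))
```

With a different reference set (or a different indicator, such as hypervolume), the ranking of the two sets can change, which is exactly the kind of setting-dependence the tutorial examines.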
Outline of the tutorial:
The tutorial will last two hours and will consist of the following parts:
1. Brief introduction to multi-objective optimization
2. Explanations on some commonly-used software platforms and experimental settings for the comparison of EMO algorithms
3. The difficulties in fair performance comparison of EMO algorithms related to the following four issues and how to handle them:
    - Termination condition
    - Population size
    - Performance indicators
    - Test problems
4. Future research topics related to each issue
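As a small illustration of the termination-condition issue above (all numbers are hypothetical): stopping two EMO algorithms after the same number of generations gives them different solution-evaluation budgets whenever their population sizes differ.

```python
def evaluation_budget(population_size, generations):
    """Total solution evaluations consumed, assuming one evaluation per
    individual per generation, plus the initial population."""
    return population_size * (generations + 1)

# Hypothetical settings: same generation count, different population sizes.
budget_a = evaluation_budget(population_size=100, generations=500)
budget_b = evaluation_budget(population_size=300, generations=500)

print(budget_a, budget_b)  # algorithm B consumes 3x the search effort
```

A common way to avoid this imbalance is to fix the total evaluation budget for all compared algorithms and let each algorithm derive its own generation count from it.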
Speakers:
Lie Meng Pang, Southern University of Science and Technology, China.
Lie Meng Pang received her Bachelor of Engineering degree in Electronic and Telecommunication Engineering and her Ph.D. degree in Electronic Engineering from the Faculty of Engineering, Universiti Malaysia Sarawak, Malaysia, in 2012 and 2018, respectively.
She is currently a Research Associate with the Department of Computer Science and Engineering, Southern University of Science and Technology (SUSTech), China.
Her current research interests include evolutionary multi-objective optimization and fuzzy systems.
Ke Shang, Southern University of Science and Technology, China.
Ke Shang received the B.S. and Ph.D. degrees from Xi’an Jiaotong University, China, in 2009 and 2016, respectively.
He is currently a Research Associate Professor at Southern University of Science and Technology, China.
His current research interests include evolutionary multi-objective optimization and its applications.
He received the GECCO 2018 Best Paper Award, the CEC 2019 First Runner-up Conference Paper Award, the GECCO 2021 Best Paper Award, and a best paper nomination at PPSN 2020.