On Scalability of Multi-Objective Evolutionary Algorithms on Combinatorial Optimisation Problems
Menghao Tang, Zimin Liang, Miqing Li
TLDR
The study finds that simpler MOEAs such as SEMO converge markedly more slowly on large-scale combinatorial problems, largely because they lack crossover.
Key contributions
- Empirically investigates MOEA scalability on combinatorial problems with sizes from 50 to 5,000.
- SEMO's convergence speed declines significantly as problem size increases, relative to NSGA-II, SMS-EMOA and MOEA/D.
- Absence of crossover is a key reason for SEMO's underperformance on large problems.
- Adding crossover to SEMO substantially accelerates convergence, despite worsening the spread of solutions over the Pareto front.
Why it matters
Multi-objective combinatorial optimization problems pose unique challenges for MOEAs. This paper clarifies how MOEAs, particularly simpler designs, scale on these complex problems. It underscores the critical role of crossover for efficient convergence in large-scale discrete optimization.
Original Abstract
Scalability of evolutionary algorithms refers to assessing how their performance changes as problem size increases. In the area of multi-objective optimisation, research on the scalability of multi-objective evolutionary algorithms (MOEAs) has predominantly focussed on continuous problems. However, multi-objective combinatorial optimisation problems (MOCOPs) differ from continuous ones. Their discrete and rigid structure often brings rugged landscape, numerous local optimal solutions and disjoint global optimal regions. This leads to different behaviour of MOEAs. For example, SEMO, a simple MOEA without mating selection and diversity maintenance mechanisms, has been shown to be highly competitive, and in many cases to outperform more sophisticated MOEAs on MOCOPs. Yet, it remains unclear whether such findings hold for large-scale cases. In this paper, we conduct an empirical investigation into the scalability of MOEAs on combinatorial problems, with problem size from 50 to 5,000. Our results show that SEMO experiences a decline in convergence speed as dimensionality increases, compared to other MOEAs such as NSGA-II, SMS-EMOA and MOEA/D. We further demonstrate that the absence of crossover is a major contributor to SEMO's underperformance in large-scale problems, and that incorporating crossover into SEMO can substantially accelerate convergence in general, despite being detrimental in spreading solutions over the Pareto front.
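To make the mechanism under study concrete, here is a minimal sketch of SEMO on the standard bi-objective LOTZ (LeadingOnes/TrailingZeros) benchmark, with an optional uniform-crossover step controlled by a hypothetical rate `pc`. The function names, the `pc` parameter, and the choice of uniform crossover and of LOTZ as the test problem are illustrative assumptions, not the paper's exact experimental setup.

```python
import random

def lotz(x):
    """Bi-objective LeadingOnes/TrailingZeros benchmark (both maximised)."""
    n = len(x)
    lo = next((i for i, b in enumerate(x) if b == 0), n)            # leading ones
    tz = next((i for i, b in enumerate(reversed(x)) if b == 1), n)  # trailing zeros
    return (lo, tz)

def dominates(a, b):
    """Pareto dominance for maximisation: a is no worse everywhere, better somewhere."""
    return all(u >= v for u, v in zip(a, b)) and a != b

def semo(n, evals, pc=0.0, seed=0):
    """SEMO sketch: keep an archive of mutually non-dominated solutions;
    each step, pick a uniform-random archive member and flip one random bit.
    pc is an assumed crossover rate: with probability pc, uniform crossover
    between two archive members is applied before mutation."""
    rng = random.Random(seed)
    x = tuple(rng.randint(0, 1) for _ in range(n))
    archive = {x: lotz(x)}
    for _ in range(evals):
        p1 = rng.choice(list(archive))
        child = list(p1)
        if pc > 0 and len(archive) > 1 and rng.random() < pc:
            p2 = rng.choice(list(archive))
            child = [a if rng.random() < 0.5 else b for a, b in zip(p1, p2)]
        i = rng.randrange(n)
        child[i] = 1 - child[i]                      # one-bit mutation
        child = tuple(child)
        f = lotz(child)
        if any(g == f or dominates(g, f) for g in archive.values()):
            continue                                 # child is (weakly) dominated
        archive = {s: g for s, g in archive.items() if not dominates(f, g)}
        archive[child] = f                           # keep child, drop dominated
    return archive
```

Note the absence of mating selection and explicit diversity maintenance: the archive itself is the entire population, which is what makes SEMO a natural baseline for isolating the effect of crossover.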