Code for "A Generalized Scalarization Method for Evolutionary Multi-objective Optimization" (AAAI-23 Oral).
MOEA/D-GGR is implemented on the PlatEMO platform (version 3.x). Please place the code in the path "PlatEMO\Algorithms\Multi-objective optimization\MOEADGGR".
We develop a generational version of MOEA/D-GGR (denoted as gMOEA/D-GGR), whose framework is similar to that of gMOEA/D-AGR. gMOEA/D-GGR can perform better when the reference point changes frequently. Optionally, a greedy trick can be employed in the replacement procedure of the algorithm, which enhances its convergence performance. Specifically, the trick amounts to replacing the default index assignment with the greedy one:
% index_Pt(i) = i; % default
[~,index_Pt(i)] = min(g(:,i)); % greedy
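The two lines above contrast the default assignment (each subproblem keeps its own index) with the greedy choice (each subproblem takes the solution with the smallest scalarized value). The following Python/NumPy sketch illustrates the same idea outside PlatEMO; the weighted Tchebycheff function is used only as an assumed stand-in for the paper's GGR scalarization, and all names here (`tchebycheff`, `replacement_indices`) are illustrative, not part of the repository.

```python
import numpy as np

def tchebycheff(F, W, z):
    """Weighted Tchebycheff scalarization, used here as an assumed
    stand-in for the GGR function. Returns g of shape (n_sub, n_sol),
    where g[i, j] is the value of solution j on subproblem i.
    F: (n_sol, m) objectives; W: (n_sub, m) weights; z: (m,) reference point."""
    return np.max(W[:, None, :] * np.abs(F[None, :, :] - z), axis=2)

def replacement_indices(g, greedy=True):
    """For each subproblem i, pick the solution index used in replacement.
    Default: subproblem i keeps its own solution (index_Pt(i) = i).
    Greedy: subproblem i takes the solution minimizing g[i, :]."""
    n_sub, _ = g.shape
    if greedy:
        return np.argmin(g, axis=1)  # mirrors [~, index_Pt(i)] = min(g(:, i))
    return np.arange(n_sub)          # mirrors index_Pt(i) = i
```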
We integrate a weight vector adjustment strategy, similar to that proposed in AdaW, into gMOEA/D-GGR (with the greedy trick). Specifically, cone dominance replaces Pareto dominance during the archive-updating process of AdaW. The resulting algorithm is denoted as gMOEA/D-GGRAW.
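To make the cone-dominance check concrete, here is a minimal Python sketch under one common formulation: each objective is widened by a fraction of the other objectives before an ordinary Pareto comparison. The transform and the parameter name `a` are assumptions for illustration; the repository's actual cone definition may differ.

```python
import numpy as np

def cone_dominates(fx, fy, a=0.1):
    """Check whether fx cone-dominates fy (minimization assumed).
    Each objective vector is mapped through f'_k = f_k + a * sum_{j != k} f_j,
    which widens the dominance cone (a = 0 recovers plain Pareto dominance),
    and then standard Pareto dominance is tested on the transformed vectors."""
    fx, fy = np.asarray(fx, float), np.asarray(fy, float)
    tx = fx + a * (fx.sum() - fx)  # transformed objectives of x
    ty = fy + a * (fy.sum() - fy)  # transformed objectives of y
    return bool(np.all(tx <= ty) and np.any(tx < ty))
```

With a > 0, some pairs that are mutually non-dominated under Pareto dominance become comparable, which is what makes the cone useful for pruning the archive during weight adjustment.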