Yousef Seyfari

Academic rank: Assistant Professor
Education: Ph.D.

Research

Title: μMOSM: A hybrid multi-objective micro evolutionary algorithm
Type: Journal paper
Keywords: Multi-objective optimization, micro algorithm, multi-operator, hybrid evolutionary algorithm
Year: 2023
Journal: Engineering Applications of Artificial Intelligence
DOI:
Researchers: Yousef Abdi, Mohammad Asadpour, Yousef Seyfari

Abstract

In multi-objective optimization problems (MOPs), several mutually conflicting objectives are optimized simultaneously. In such scenarios there is no unique solution to the problem; instead, there is a set of solutions, known as the Pareto front, that represents the trade-offs between the objectives. Multi-objective evolutionary algorithms (MOEAs) can approximate these solutions in a single run. However, because they are resource-intensive, MOEAs are not well suited to real-time and engineering MOPs, such as optimizing manufacturing processes or energy consumption in wireless networks, where fast convergence at low computational cost is required. Micro versions of MOEAs can meet this requirement by using a tiny population. The tiny population, however, can cause a rapid loss of diversity, so the algorithm may easily fall into a local optimum. Although some remedies such as the restart technique have been proposed, hybrid techniques, including integrative, collaborative, and decomposition-based methods, have not been effectively considered in the design of micro algorithms, even though hybridization is a widely accepted way of enhancing the diversity of evolutionary algorithms. In this study, we propose a hybrid micro MOEA called μMOSM that effectively tackles the diversity-loss problem and accelerates convergence when approximating Pareto-front solutions. Experimental results on benchmark test suites and a real-world MOP demonstrate the advantages of the proposed algorithm and confirm that μMOSM outperforms state-of-the-art MOEAs and micro MOEAs, including MOSM, ADE-MOIA, MMOPSO, NSGA-III, MOEA/D-FRRMAB, μFAME, and ASMiGA.
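
To make the ideas in the abstract concrete, the following is a minimal, illustrative Python sketch, not the paper's μMOSM: it shows Pareto dominance (which defines the Pareto front) and the skeleton of a micro MOEA with a tiny population, an external archive of non-dominated solutions, and an elitist restart. All function names, parameter values, and the mutation-only variation operator are assumptions chosen purely for illustration.

# Illustrative sketch only -- NOT the authors' μMOSM algorithm.
import random
from typing import Callable, List, Sequence, Tuple

def dominates(a: Sequence[float], b: Sequence[float]) -> bool:
    # a Pareto-dominates b (minimization): no worse in every objective
    # and strictly better in at least one.
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def micro_moea(evaluate: Callable[[List[float]], List[float]],
               dim: int,
               pop_size: int = 5,      # "tiny" population of a micro MOEA
               generations: int = 200,
               ) -> List[Tuple[List[float], List[float]]]:
    # Random initial population in [0, 1]^dim.
    pop = [[random.random() for _ in range(dim)] for _ in range(pop_size)]
    archive: List[Tuple[List[float], List[float]]] = []  # (solution, objectives)
    for _ in range(generations):
        # Mutation-only variation; real micro MOEAs (and μMOSM's hybrid,
        # multi-operator design) use richer operators to preserve diversity.
        for parent in pop:
            child = [min(1.0, max(0.0, g + random.gauss(0.0, 0.1)))
                     for g in parent]
            archive.append((child, evaluate(child)))
        # Prune the archive to its non-dominated members (Pareto set so far).
        archive = [p for p in archive
                   if not any(dominates(q[1], p[1]) for q in archive if q is not p)]
        # Restart the tiny population from archive members (elitist restart),
        # a common fix for the diversity loss the abstract mentions.
        pop = [random.choice(archive)[0] for _ in range(pop_size)]
    return archive

# Toy bi-objective problem: minimize f1(x) = sum(x_i^2), f2(x) = sum((x_i - 1)^2).
front = micro_moea(lambda x: [sum(g * g for g in x),
                              sum((g - 1.0) ** 2 for g in x)], dim=2)
print(f"{len(front)} non-dominated solutions found")

The archive pruning makes the trade-off explicit: no retained solution is better than another in every objective at once, which is exactly the Pareto-front notion the abstract describes, while the five-individual population illustrates why micro algorithms are cheap but diversity-starved.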