Abstract
Optimization challenges are often characterized by the existence of multiple feasible solutions. Optimization refers to the process of selecting the best solution from a set of alternatives for a given problem [1]. An optimization problem is typically composed of three key components [2]: an objective function, constraints, and decision variables. The main aim of optimization is to assign values to the decision variables so that the objective function is maximized while the constraints imposed by the problem are respected. As technology advances, the complexity of optimization problems has grown, necessitating more advanced techniques to address these challenges [3].
Optimization studies classify problem-solving techniques into two main categories: deterministic and stochastic approaches [4]. Deterministic methods are effective for problems that are linear, convex, continuous, low-dimensional, and differentiable [5]. However, their performance declines in more complex optimization tasks, as they can become trapped in local optima. Many modern optimization problems, along with their real-world applications, exhibit characteristics such as non-linearity, non-convexity, discontinuity, non-differentiability, and high dimensionality. Due to the limitations and inefficiency of deterministic methods in solving these complex problems, researchers have developed stochastic approaches. These methods do not rely on derivative information from the objective function and constraints but instead employ a random search within the solution space to solve optimization challenges effectively [6,7]. Metaheuristic algorithms, a type of stochastic approach, have proven to be highly efficient in tackling optimization problems. Their popularity stems from several advantages, including simplicity in design and implementation, applicability to a wide range of problem types, independence from derivative information, and their ability to handle nonlinear, non-convex, high-dimensional problems.
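To illustrate the derivative-free random search that underlies stochastic methods, the following minimal sketch samples candidate solutions uniformly within the decision-variable bounds and keeps the best one found. The objective function, bounds, and sample budget used here are illustrative assumptions, not part of any specific algorithm from the literature:

```python
import random

def random_search(objective, bounds, n_samples=1000, seed=0):
    """Pure random search: draw candidates uniformly within the given
    bounds and retain the incumbent best.  No derivative information
    about the objective function is used."""
    rng = random.Random(seed)
    best_x, best_f = None, float("inf")
    for _ in range(n_samples):
        # Sample one candidate decision vector inside the search space.
        x = [rng.uniform(lo, hi) for lo, hi in bounds]
        f = objective(x)
        if f < best_f:  # keep the best solution found so far
            best_x, best_f = x, f
    return best_x, best_f

# Illustrative objective: the sphere function, whose minimum is 0 at the origin.
sphere = lambda x: sum(v * v for v in x)
x_best, f_best = random_search(sphere, bounds=[(-5.0, 5.0)] * 2)
```

Unlike a gradient-based (deterministic) method, this search makes no smoothness or convexity assumptions about the objective; metaheuristics refine this idea by guiding the sampling rather than drawing candidates blindly.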