New Criteria for Comparing Global Stochastic Derivative-Free Optimization Algorithms

Jonathan McCart, Mathematics Department, SUNY Geneseo
Advisor: Dr. Ahmad Almomani

Abstract

For many situations, the function that best models a situation or data set can have a derivative that is difficult or impossible to find. Numerical methods have therefore been developed to locate the optimal value of such functions without direct involvement of the derivative; this is our motivation for using derivative-free optimization (DFO) algorithms. In our analysis, we tested three global solvers, Genetic Algorithm (GA), Particle Swarm Optimization (PSO), and Simulated Annealing (SA), on a set of 25 problems varying in convexity (convex/non-convex), separability (separable/non-separable), differentiability (differentiable/non-differentiable), and modality (unimodal/multimodal). For each algorithm, we used MATLAB's built-in implementation, without edits or revisions. For all problems, we varied the number of dimensions, increasing up to 100 dimensions. We introduce new criteria for comparing DFO solver performance using generalized characteristics: Speed, Accuracy, Efficiency, and Difficulty. Numerical results are presented for most of the known standard benchmark problems.

Fundamental Question

Is there a set of characteristics that, based on output data, can describe a set of attributes that all solvers share? If so, what are these attributes? How are they calculated? What would be their significance?

Objective

We look carefully at the behaviors and trends that arise in performance and identify which of these trends generalize into patterns that can be defined as attributes common to all solvers. Our motivation for answering this fundamental question comes from an interest in comparing optimization algorithms in order to quantify how often solvers converge on the global minimum, how fast the minimum can be found, how many evaluations it takes to find the minimum, and where algorithms may fall short in completing the desired optimization problem. Additionally, we seek to identify the specific strengths and weaknesses of the algorithms tested.

Algorithms Tested

These algorithms are global solvers that are stochastic in nature. We tested the following:
• Genetic Algorithm (GA)
• Particle Swarm Optimization (PSO)
• Simulated Annealing (SA)

Groups

We selected a set of 25 n-dimensional benchmark problems varying in convexity, separability, differentiability, and modality, then organized the problems into groups according to type:
• G1: non-convex, differentiable, continuous, non-separable, multimodal
• G2: convex, non-differentiable, continuous, separable, unimodal
• G3: non-convex, non-differentiable, non-separable, multimodal
• G4: all 25 test problems, of any type

Results

Difficulty

A consolidated representation of the results from the numerical experiments is described below. When any solver is applied to a problem requiring high accuracy, we expect difficulties for the solver, resulting in more CPU time or more function evaluations; the chance of failing to achieve a highly accurate solution is very high. We define Difficulty as follows [1]:

$\text{Difficulty} = -\ln(1 - \text{failure rate}) = -\ln(\text{success rate})$

[Figure: G4 success rate at lower dimensions and at 100 dimensions]

Accuracy

The Accuracy of a solver is defined as the percentage of successful runs (the success rate) within a given tolerance level ε, where a run is successful if the difference between the best value found by the solver and the optimal value is within ε. The success rate is represented as a decimal ranging from 0 to 1, and Accuracy is denoted by the variable $A_\varepsilon$.
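To make the Accuracy and Difficulty definitions concrete, here is a minimal MATLAB sketch of how these measurements might be collected for one solver on one problem. It is not the poster's actual experiment: the benchmark function (a sphere-function stand-in), bounds, tolerance, and number of runs are all illustrative assumptions, and only particleswarm is shown; ga and simulannealbnd would follow the same loop with their own call signatures.

```matlab
% Minimal sketch (illustrative, not the poster's setup): estimate the
% success rate A_eps and Difficulty for one solver on one benchmark.
fun   = @(x) sum(x.^2);          % stand-in benchmark (sphere function)
fOpt  = 0;                       % known global minimum value
nvars = 100;                     % problem dimension
lb    = -100*ones(1,nvars);      % assumed lower bounds
ub    =  100*ones(1,nvars);      % assumed upper bounds
tol   = 1e-10;                   % tolerance eps defining a successful run
nRuns = 30;                      % number of independent runs (assumed)

success = false(nRuns,1);
opts = optimoptions('particleswarm','Display','off');
for r = 1:nRuns
    [~,fval] = particleswarm(fun,nvars,lb,ub,opts);  % one stochastic run
    success(r) = abs(fval - fOpt) <= tol;            % within tolerance?
end

A_eps      = mean(success);   % Accuracy: success rate as a decimal in [0,1]
Difficulty = -log(A_eps);     % Difficulty = -ln(success rate), per [1]
                              % (infinite if no run succeeded)
```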
[Figure: G4 speed]

Speed

Speed is the number of function evaluations per unit of CPU time:

$\text{Speed} = \dfrac{\text{Average Number of Function Evaluations}}{\text{Average CPU Time}}$

Efficiency

The Efficiency of a solver is designed to quantify how well an algorithm solves a problem in terms of how fast and how accurate it is, as well as its ability to achieve the global minimum in as few evaluations as possible and to obtain the minimum in a low number of evaluations on average:

$\text{Efficiency} = (M \cdot \chi \cdot S)^{A_\varepsilon}, \qquad M = \dfrac{\log M^*}{\log M_i}, \quad \chi = \dfrac{\log \chi^*}{\log \chi_i}, \quad S = \dfrac{\log S_i}{\log S^*}$

where:
• $M_i$ = the mean number of evaluations taken by solver $i$ for a given problem
• $M^*$ = the lowest mean number of evaluations taken by any solver for a given problem
• $\chi_i$ = the mean lowest number of evaluations taken by solver $i$ for a given problem
• $\chi^*$ = the lowest number of evaluations taken by any solver for a given problem
• $S_i$ = the speed of solver $i$ for a given problem
• $S^*$ = the speed of the fastest solver for a given problem

(A small numerical sketch of the Speed and Efficiency computation is given after the references.)

[Figure: G4 efficiency at 100 dimensions and ε = 10⁻¹⁰]
[Figure: G1 and G2 efficiency at ε = 10⁻¹⁰]

Conclusion

• There in fact exist general characteristics of algorithms, demonstrated by the adherence of distinct metaheuristics to patterns in performance.
• These methods for comparing performance led to clear trends in performance and uncovered unexpected behavior on problems of the type seen in G3.
• The stochastic nature of the solvers did not lead to significant deviations from the major trends in larger problem sets.
• The ability to accurately reveal strengths and weaknesses demonstrates promise for using Speed, Accuracy, and Efficiency as general characteristics for comparison.
• It was NOT the case that solver behavior was completely described by the observed trends in dimensionality, as G3 displayed unanticipated and counterintuitive results.

Acknowledgements

This work was partially supported by funds made available under a SUNY Expanded Investment and Performance Award to SUNY Geneseo.

References

[1] Maurice Clerc. Particle Swarm Optimization. ISTE, 2006.
[2] Benchmark Functions, http://benchmarkfcns.xyz/fcns, accessed 2018-06-23.
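As the worked illustration referenced in the Efficiency section, the MATLAB sketch below computes Speed and Efficiency for three solvers on a single problem. The per-solver numbers are invented placeholders, not results from the poster; in practice they would come from the recorded runs (for example, evaluation counts from the output structures returned by MATLAB's solvers and tic/toc timings).

```matlab
% Minimal sketch (invented placeholder numbers, not the poster's data):
% compute Speed and Efficiency for three solvers on one problem.
solvers   = {'GA','PSO','SA'};
meanEvals = [52000 34000 61000];  % M_i: mean function evaluations per run
minEvals  = [21000 12000 30000];  % chi_i: lowest evaluation count observed
cpuTime   = [14.2   6.8  10.5];   % mean CPU time per run, in seconds
A_eps     = [0.80  0.95  0.60];   % accuracy (success rate) at tolerance eps

speed = meanEvals ./ cpuTime;     % Speed = average evaluations / average CPU time

Mstar   = min(meanEvals);         % lowest mean evaluation count of any solver
chiStar = min(minEvals);          % lowest evaluation count of any solver
Sstar   = max(speed);             % speed of the fastest solver

M   = log(Mstar)   ./ log(meanEvals);  % M   = log M* / log M_i
chi = log(chiStar) ./ log(minEvals);   % chi = log chi* / log chi_i
S   = log(speed)   ./ log(Sstar);      % S   = log S_i / log S*

efficiency = (M .* chi .* S) .^ A_eps; % Efficiency = (M * chi * S)^(A_eps)

for i = 1:numel(solvers)
    fprintf('%-3s  Speed = %8.1f evals/s   Efficiency = %.3f\n', ...
            solvers{i}, speed(i), efficiency(i));
end
```

Assuming evaluation counts and speeds greater than 1, each of M, χ, and S is at most 1 (equal to 1 for the best solver on that measure), so the resulting Efficiency values fall in (0, 1].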
