
Simulated Annealing



Structure

  • Preface&Contents Simulated Annealing

  • 01_Anghinolfi

  • 02_Saefai

  • 03_Werner

  • 04_Dawei

  • 05_Ghazanfari

  • 06_Duh

  • 07_ChenHuang

  • 08_Kokubugata

  • 09_Wah

  • 10_Liang

  • 11_Politis

  • 12_Ming

  • 13_Lin

  • 14_Sonmez

  • 15_Yepes

  • 16_Orsilla

  • 17_Mascarenhas

  • 18_Pardalos

  • 19_Duczmal

  • 20_Ledesma


Simulated Annealing

Edited by Cher Ming Tan

Published by In-Teh. In-Teh is the Croatian branch of I-Tech Education and Publishing KG, Vienna, Austria.

Abstracting and non-profit use of the material is permitted with credit to the source. Statements and opinions expressed in the chapters are those of the individual contributors and not necessarily those of the editors or publisher. No responsibility is accepted for the accuracy of information contained in the published articles. The publisher assumes no responsibility or liability for any damage or injury to persons or property arising out of the use of any materials, instructions, methods or ideas contained inside. After this work has been published by In-Teh, authors have the right to republish it, in whole or in part, in any publication of which they are an author or editor, and to make other personal use of the work.

© 2008 In-teh, www.in-teh.org. Additional copies can be obtained from: publication@ars-journal.com

First published September 2008. Printed in Croatia.

A catalogue record for this book is available from the University Library Rijeka under no. 111224063.

Simulated Annealing, Edited by Cher Ming Tan. p. cm. ISBN 978-953-7619-07-7

Preface

Optimization is important in all branches of engineering because the available resources are limited. Through optimization, maximum usage of a resource can be achieved. However, global optimization can be difficult, because it requires knowledge of the behaviour of the system under analysis and the solution space can be large. Without this knowledge, the optimization obtained may only be a local one.

Metaheuristic algorithms, on the other hand, are effective in exploring the solution space. They are often referred to as "black box" algorithms, as they use very limited knowledge about the specific system to be tackled and usually do not require a mathematical model of the system under study. Hence they can be used to solve a broad range of problems, and have thus received increasing attention.

One of the most commonly used metaheuristic algorithms is Simulated Annealing (SA). SA is an optimization algorithm that is not fooled by false minima and is easy to implement. It is also superior to many other metaheuristic algorithms, as presented in this book.

In this book, the different applications of Simulated Annealing are presented. The first 11 chapters are devoted to applications in industrial engineering, such as scheduling problems, decision making, allocation problems, routing problems and general optimization problems. The subsequent chapters focus on applications of Simulated Annealing in material engineering (the study of porous materials), electrical engineering (integrated circuit technology), mechanical engineering (mechanical structure design), structural engineering (concrete structures), computer engineering (task mapping) and bio-engineering (protein structure). The last three chapters cover the methodology for optimizing Simulated Annealing itself, its comparison with other metaheuristic algorithms, and the various practical considerations in applying Simulated Annealing.

This book provides readers with a knowledge of Simulated Annealing and its vast applications in the various branches of engineering. We encourage readers to explore the application of Simulated Annealing in their own work for the task of optimization.

Editor
Cher Ming Tan
Nanyang Technological University, Singapore

Contents
Preface ... V

1. Simulated Annealing as an Intensification Component in Hybrid Population-Based Metaheuristics ... 001
   Davide Anghinolfi and Massimo Paolucci

2. Multi-objective Simulated Annealing for a Maintenance Workforce Scheduling Problem: A Case Study ... 027
   Nima Safaei, Dragan Banjevic and Andrew K. S. Jardine

3. Using Simulated Annealing for Open Shop Scheduling with Sum Criteria ... 049
   Michael Andresen, Heidemarie Bräsel, Mathias Plauschin and Frank Werner

4. Real Time Multiagent Decision Making by Simulated Annealing ... 077
   Dawei Jiang and Jingyu Han

5. Learning FCM with Simulated Annealing ... 089
   M. Ghazanfari and S. Alizadeh

6. Knowledge-Informed Simulated Annealing for Spatial Allocation Problems ... 105
   Jiunn-Der Duh

7. An Efficient Quasi-Human Heuristic Algorithm for Solving the Rectangle-Packing Problem ... 119
   Wenqi Huang and Duanbing Chen

8. Application of Simulated Annealing to Routing Problems in City Logistics ... 131
   Hisafumi Kokubugata and Hironao Kawashima

9. Theory and Applications of Simulated Annealing for Nonlinear Constrained Optimization ... 155
   Benjamin W. Wah, Yixin Chen and Tao Wang

10. Annealing Stochastic Approximation Monte Carlo for Global Optimization ... 187
    Faming Liang

11. Application of Simulated Annealing on the Study of Multiphase Systems ... 207
    Maurice G. Politis, Michael E. Kainourgiakis, Eustathios S. Kikkinides and Athanasios K. Stubos

12. Simulated Annealing for Mixture Distribution Analysis and its Applications to Reliability Testing ... 227
    Cher Ming Tan and Nagarajan Raghavan

13. Reticle Floorplanning and Simulated Wafer Dicing for Multiple-project Wafers by Simulated Annealing ... 257
    Rung-Bin Lin, Meng-Chiou Wu and Shih-Cheng Tsai

14. Structural Optimization Using Simulated Annealing ... 281
    Fazil O. Sonmez

15. Optimization of Reinforced Concrete Structures by Simulated Annealing ... 307
    F. González-Vidosa, V. Yepes, J. Alcalá, M. Carrera, C. Perea and I. Payá-Zaforteza

16. Best Practices for Simulated Annealing in Multiprocessor Task Distribution Problems ... 321
    Heikki Orsila, Erno Salminen and Timo D. Hämäläinen

17. Simulated Annealing of Two Electron Density Solution Systems ... 343
    Mario de Oliveira Neto, Ronaldo Luiz Alonso, Fabio Lima Leite, Osvaldo N. Oliveira Jr., Igor Polikarpov and Yvonne Primerano Mascarenhas

18. Improving the Neighborhood Selection Strategy in Simulated Annealing Using the Optimal Stopping Problem ... 363
    Saed Alizamir, Steffen Rebennack and Panos M. Pardalos

19. A Comparison of Simulated Annealing, Elliptic and Genetic Algorithms for Finding Irregularly Shaped Spatial Clusters ... 383
    Luiz Duczmal, André L. F. Cançado, Ricardo H. C. Takahashi and Lupércio F. Bessegato

20. Practical Considerations for Simulated Annealing Implementation ... 401
    Sergio Ledesma, Gabriel Aviña and Raul Sanchez


1. Simulated Annealing as an Intensification Component in Hybrid Population-Based Metaheuristics

Davide Anghinolfi and Massimo Paolucci
Department of Communication, Computer and Systems Sciences, University of Genova, Italy

Introduction

The use of hybrid metaheuristics applied to combinatorial optimization problems has received continuously increasing attention in the literature. Metaheuristic algorithms differ from most classical optimization techniques in that they aim at defining effective general-purpose methods to explore the solution space, avoiding tailoring them to the specific problem at hand. Metaheuristics are often referred to as "black-box" algorithms because they use limited knowledge about the specific problem to be tackled, instead usually taking inspiration from concepts and behaviours far from the optimization field. This is exactly the case of
metaheuristics like simulated annealing (SA), genetic algorithms (GA), ant colony optimization (ACO) or particle swarm optimization (PSO). Metaheuristics are based on a subset of features (e.g., the use of exploration history as short- or long-term memory, learning mechanisms, or candidate solution generation techniques) that represent a general algorithmic fingerprint which can usually be easily adapted to face different complex real-world problems. The effectiveness of any metaheuristic applied to a specific combinatorial problem may depend on a number of factors: most of the time no single dominating algorithm can be identified, but several distinct mechanisms exploited by different metaheuristics appear to be profitable for searching for high quality solutions. For this reason a growing number of metaheuristic approaches to combinatorial problems try to put together several techniques and concepts from different methods in order to design new and highly effective algorithms. Hybrid approaches usually seem both to combine complementary strengths and to overcome the drawbacks of single methods by embedding in them one or more steps based on different techniques. As an example, in (Anghinolfi & Paolucci, 2007a) the SA probabilistic candidate solution acceptance rule is coupled with the tabu list and neighbourhood change mechanisms characterizing, respectively, tabu search (TS) and variable neighbourhood search (VNS) approaches, to face parallel machine total tardiness scheduling problems. Several surveys exist proposing both classifications of metaheuristics and unified views of hybrid metaheuristics (e.g., (Blum & Roli, 2003), (Doerner et al., 2007), (Raidl, 2006) and (Talbi, 2002)). We will not replicate here the various definitions and classifications through which the different approaches can be analysed and organized (the interested reader can refer to (Blum & Roli, 2003) for a valuable review). However, we should underline a few basic concepts that allow us to focus on the different characteristics of the kinds of methods used in the hybrid algorithms presented in this chapter.

SA, ACO and PSO are all stochastic algorithms, but SA is commonly classified as a trajectory-based method, since at each iteration it determines a new single current solution, whereas ACO and PSO are population-based methods, since at each iteration they explore a set of distinct solutions which they make evolve iteration after iteration. The concept behind these two population-based methods is that the overall exploration process can be improved by learning from the individual exploring experiences of a population of very simple agents (the ants or the particles). As will be made clear later in the chapter, ACO explicitly exploits a learning mechanism in order to identify, iteration after iteration, which features should characterize good, i.e., the most promising, solutions. Whereas in ACO the communication among the exploring agents (the ants) is indirect, PSO drives the search of its population of agents (the swarm of particles) on the basis of simple pieces of information (e.g., where the current best is located), making the agents move towards promising solutions. Therefore, both ACO and PSO use memory structures, more complex in ACO, simpler in PSO, to elaborate their exploration strategies; agents in ACO and PSO perform a learning- or information-driven sampling of the solution space that could in general be considered wide but also quite coarse, and that
can be trapped in local optima (the so-called stagnation (Dorigo & Stützle, 2004)). SA, on the other hand, is a memoryless method which combines the local search aptitude for exploring regions of the solution space in depth with the ability, ruled by the cooling schedule mechanism, to escape from local optima. From this brief overview the possible advantage of coupling the different complementary abilities of the two types of metaheuristics should begin to emerge. Therefore, in this chapter our purpose is to focus attention on hybrid population-based metaheuristic algorithms with specific reference to the use of SA as a hybridizing component. According to the classification proposed in (Raidl, 2006), the kind of hybrid algorithms considered here results from the combination of two distinct metaheuristics (the "what is hybridized" aspect) among which a low-level strong coupling is established (the "level of hybridization" aspect); in particular, the execution of SA is interleaved with the iterations of the population-based metaheuristics (the "order of execution" aspect), so that SA can be viewed as an integrated component of the latter (the "control strategy" aspect).

Several works that recently appeared in the literature show the interest in embedding SA into population-based approaches such as ACO, PSO and GA. Examples of PSO hybridized by incorporating SA intensification can be found in (Liu et al., 2008), where the proposed hybrid PSO (HPSO), which includes a probabilistically applied local search (LS) and a learning-guided multi-neighbourhood SA, is applied to makespan minimization in a permutation flow shop scheduling problem with limited buffers between consecutive machines; in (He & Wang, 2007), where constrained optimization problems are faced by an HPSO which applies the SA search from the best solution found by the swarm in order to avoid premature convergence; in (Li et al., 2006), where the hybrid algorithm, named PSOSA, is used for non-linear systems parameter estimation; and in (Ge et al., 2007), where the HPSO is used to face job shop scheduling. Differently, in (Xia & Wu, 2005) multiobjective flexible job shop scheduling problems are confronted by a hierarchical approach exploiting PSO to assign operations to machines and then SA to schedule the operations on each machine. Hybrid ACO approaches, which combine pheromone trail based learning […]

[The excerpt breaks off here and jumps from Chapter 1 to Chapter 20, "Practical Considerations for Simulated Annealing Implementation" (p. 406).]

[…] negative, the solution is always accepted. However, the algorithm may accept a new solution even if that solution does not have a smaller error than the previous one (a positive ΔE), and the probability of this decreases when the temperature decreases or when ΔE increases. Consequently, at high temperatures the algorithm may wander wildly, accepting bad solutions; as the temperature decreases, the algorithm becomes more selective and accepts perturbed solutions only when the respective ΔE is small. This is the theory behind simulated annealing, and it should be clearly understood in order to properly implement the algorithm.

[Figure: Probability of acceptance following the Metropolis algorithm; the acceptance probability Pa is plotted against kΔE/T.]

Consider now the figure showing the probability of acceptance as a function of ΔE for several values of k/T:

[Figure: Probability of acceptance Pa plotted against ΔE for several values of k/T.]

From this figure, it can be seen that the constant k plays an important role in the algorithm's success; if k is equal to T, the algorithm will accept solutions with high probability even if ΔE is not small. This is not good, as the method will spend a great deal of time trying bad solutions; even if an excellent solution is found, the method will easily discard it. Generally, a moderate ratio k/T is desired at the beginning of the process. The authors suggest estimating the value of k as a preliminary step of the annealing process. This can save a lot of time, as there is no unique value of k that can be used for all optimization problems.

Estimating k

When an optimization problem is not properly solved using simulated annealing, it may seem suitable to increase the number of temperatures and the number of iterations at each temperature. Additionally, it may seem logical to start at a high temperature and end with a very low final temperature. However, it is most advisable to carefully choose the simulated annealing parameters in order to minimize the number of calculations and, consequently, reduce the time spent on vain perturbations.
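The acceptance-probability equation that the following estimates refer to is not included in this excerpt. Judging from the axes of the first figure (Pa plotted against kΔE/T) and from the way k is solved for below, it presumably has the exponential form

    Pa = exp(−k ΔE / T)

so that a negative ΔE is always accepted, while a positive ΔE is accepted with a probability that shrinks as the temperature falls or as ΔE grows. This reconstruction is an assumption, offered here because Equations 6 to 8 below are derived from it.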
found, the method will easily discard it Generally, a medium ration k/T is desired at the beginning of the process The authors suggest estimating the value of k as a previous step of the annealing process This can save a lot of time, as there is not unique value of k that can be used for all optimization problems Estimating k When an optimization problem is not properly solved using simulated annealing, it may sound suitable to increase the number of temperatures and the number of iterations at each temperature Additionally, it may sound logical to start at a high temperature and end with a very low final temperature However, it is most recommended to carefully choose the simulated annealing parameters in order to minimize the number of calculations and, consequently, reduce the time spend on vain perturbations 407 Practical Considerations for Simulated Annealing Implementation Acceptance Probability (Pa) k/T = 0.5 k/T = k/T = 0 0.5 ΔE 1.0 Fig Probability of acceptance for several values of k/T At the beginning of annealing, it is necessary to have an initial solution to start working with Let X0 be the initial solution before applying any perturbation to it, and E0 the error associated with X0 Typically, X0 may be created by assigning random values to {x1, x2, x3, …,xM}, however, in most cases, it is strongly recommended to use the problem requirements to create X0., this will warranty at least a good starting point As it was described before, the constant k plays an important role on simulated annealing for global optimization In this section, the authors suggest a simple method to estimate k using the essential operations of simulated annealing (perturb and evaluate) After inspecting Equation 9, it is clear that the value of k must be estimated using the initial temperate and the delta error An estimate for ΔE can be computed from ΔE = σE (6) which can be estimated as ΔE ≈ Q Q Σ Ei Σ (Ei ) Q - i=1 Q(Q - 1) i=1 (7) that is, the sample variance of E when the solution X0 has been perturbed Q times In practice, Equation is an excellent estimator of the initial value of ΔE as long as Q is at least 1000 or more It is important to mention that an exact value of ΔE is not required as this value is used only to a get rough estimate of k; this implies that a big value for Q is not necessary Once an estimate for the delta error of the solution has been found, finding an estimate for k is straightforward as Equation can be directly used to solve for k However, an initial value for the probability of acceptance needs to be defined It is clear that the initial probability of acceptance must not be close to one, neither must be close to zero A value 408 Simulated Annealing between 0.7 and 0.9 is recommended A probability of acceptance bigger than 0.9 has not practical purpose as the algorithm will accept too many bad solutions On the other hand, a value that is less than 0.7 will rob the algorithm the opportunity to search abroad, loosing one of the main advantages of simulated annealing In general, an initial value for the probability of acceptance should be 0.8 Thus, an estimate of k can be express as k= T0 ln (0.8) σE (8) where an estimate for the standard deviation of the solution error can be computed using Equation The performance of the algorithm is dramatically increased when Equation is used because unnecessary and vain perturbations are not computed; instead the algorithm uses this precious CPU time on doing actual work Implementing simulated annealing For now, the reader should have a sense 
Implementing simulated annealing

By now, the reader should have a sense of how simulated annealing works. However, the reader may have some doubts about how to implement it. As was established before, common sense is required for a proper implementation of the algorithm, as there are no hard rules. This section describes how to use a programming language to correctly implement simulated annealing.

The UML diagram below describes a class to implement simulated annealing. At the top of the diagram the class name (SimulatedAnnealing) is shown; the second block contains the member variables and the third block the member functions.

SimulatedAnnealing
    +numTemps : int
    +numIterations : int
    +initialTemp : double
    -finalTemp : double
    +isCoolingScheduleLinear : bool
    +cycles : int
    -k : double
    +SimulatedAnnealing()
    +~SimulatedAnnealing()
    +Start(Solution& solution, Solution& wk1, Solution& wk2, double goal) : double
    -GetTemperature(int index) : double
    -IsAcceptedByMetropolis(double temperature, double deltaError) : bool
    -Anneal(Solution& solution, Solution& wk1, Solution& wk2, double goal) : double
    -EstimateK(Solution& solution, int N) : double

Fig. UML diagram for a class to implement simulated annealing.

The member variables' names are self-explanatory. However, note that k and finalTemp are declared as private, as the class itself computes these values from the other setup parameters. The only public function is Start; it should be called once we are ready to start the annealing process.

The SimulatedAnnealing class makes reference to the abstract class Solution, depicted in the next diagram together with the derived class NumEq. The class Solution contains the actual implementation of the problem; it maps the real problem to the solution coding. The class NumEq will be discussed in the next section; for now, just note that NumEq implements the pure abstract functions of the class Solution: operator=, OnInitialize, OnPerturb and OnComputeError. These are the four functions that need to be implemented to solve a global optimization problem by simulated annealing. Note that these functions correspond to the basic operations required by annealing (perturb and evaluate) plus two extra: OnInitialize, to initialize the solution, and operator=, which is useful whenever a solution needs to be copied from one variable to another. It is important to mention that for some optimization problems it may be inefficient to implement operator=, as this operator can consume a considerable amount of CPU time; in these cases, other techniques to store and manipulate the solution may be used.

Solution
    #error : double
    +Solution()
    +~Solution()
    +Initialize() : double
    +Perturb(temperature : double, initialTemperature : double) : double
    +GetError() : double
    +operator=(init : const Solution&) : Solution&
    -OnInitialize()
    -OnPerturb(temperature : double, initialTemperature : double)
    -OnComputeError() : double

NumEq
    +x : double
    +y : double
    +NumEq()
    +~NumEq()
    +operator=(init : const Solution&) : Solution&
    -OnInitialize()
    -OnPerturb(temperature : double, initialTemperature : double)
    -OnComputeError() : double

Fig. UML diagram of the classes that implement the solution of an optimization problem.

The next two listings show a typical implementation of the SimulatedAnnealing class using the C++ language. The class may be implemented in other programming languages, such as Java or C#, with minor changes. Let us now briefly discuss the implementation of this class.
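The member function bodies of NumEq are discussed in the book's next section, which is not part of this excerpt. To make the four overrides concrete, here is an illustrative sketch of a Solution subclass; the objective (a small system of two linear equations, suggested by the name NumEq) and all function bodies are invented for the example, not taken from the book.

// Illustrative Solution subclass (bodies are assumptions).
#include <cstdlib>
#include "Solution.h"

class NumEq : public Solution
{
public:
    double x, y;
    NumEq() : x(0.0), y(0.0) {}
    ~NumEq() {}
    Solution& operator=(const Solution& init)
    {
        // Copy the candidate values and the cached error.
        const NumEq& other = static_cast<const NumEq&>(init);
        x = other.x;
        y = other.y;
        error = other.error;
        return *this;
    }
private:
    void OnInitialize()
    {
        // Random starting point in [-10, 10] x [-10, 10] (an assumption).
        x = 20.0 * rand() / RAND_MAX - 10.0;
        y = 20.0 * rand() / RAND_MAX - 10.0;
    }
    void OnPerturb(double temperature, double initialTemperature)
    {
        // Random step whose size shrinks as the system cools.
        double scale = temperature / initialTemperature;
        x += scale * (2.0 * rand() / RAND_MAX - 1.0);
        y += scale * (2.0 * rand() / RAND_MAX - 1.0);
    }
    double OnComputeError()
    {
        // Hypothetical system: x + y = 3, x - y = 1. The error is the
        // sum of squared residuals, which is zero at the solution (2, 1).
        double r1 = x + y - 3.0;
        double r2 = x - y - 1.0;
        return r1 * r1 + r2 * r2;
    }
};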
by the class itself The function GetTemperature() is called every time the temperatures changes, its implementation is straightforward once the cooling scheduled has been defined; on the shown code there are two cooling schedules: exponential and linear The function IsAcceptedByMetropolis() implements the metropolis algorithm of Equation 5, returns true when the perturbed solution must be accepted, and returns false otherwise The function EstimateK() implements Equation All the magic of the process is implemented in the function Anneal(), which is called several times if temperature cycling is used (i.e., the variable 'cycles' has a value bigger than one) To use the SimulatedAnnealing class described, the function Start() must be called, this function requires three variables of the class Solution, namely 'solution', 'wk1' and 'wk2' ('solution' is the variable where the actual solution is stored; 'wk1' and 'wk2' are working solutions to perform the annealing process.) In the next section, it will be discussed how to use the SimulatedAnnealing class to solve a simple optimization problem SimulatedAnnealing.h #pragma once #include "Solution.h" class SimulatedAnnealing { public: SimulatedAnnealing(void); ~SimulatedAnnealing(void); int numTemps; int numIterations; double initialTemp; bool isCoolingScheduleLinear; int cycles; double Start(Solution& solution, Solution& wk1, Solution& wk2, double goal); private: double GetTemperature(int index); bool IsAcceptedByMetropolis(double temperature, double deltaError); double Anneal(Solution& solution, Solution& wk1, Solution& wk2, double goal); double EstimateK(Solution& solution, int N); double finalTemp; double k; }; Fig Header file using C++ to implement the SimulatedAnnealing class of Figure SimulatedAnnealing.cpp #include "SimulatedAnnealing.h" SimulatedAnnealing::SimulatedAnnealing(void) { numTemps=100; numIterations=100; initialTemp=100.0; finalTemp=0.0001; isCoolingScheduleLinear=false; Practical Considerations for Simulated Annealing Implementation 411 k = 10; cycles = 4; } SimulatedAnnealing::~SimulatedAnnealing(void) { } double SimulatedAnnealing::Start(Solution& solution, Solution& wk1, Solution& wk2, double goal) { for(int i=0; i
