Republic of Iraq
Ministry of Higher Education and Scientific Research
Al-Nahrain University
College of Science

A COMPARISON APPROACH BETWEEN GA AND PSO

A Thesis Submitted to the College of Science, Al-Nahrain University, in Partial Fulfillment of the Requirements for the Degree of Master of Science in Computer Science

By
Azhar Waleed Hammad
(B.Sc. 1999)

Supervisor
Dr. Ban Nadeem Thanoon

December 2006                                  Dhu al-Hijjah 1427

In the name of Allah, the Most Gracious, the Most Merciful:
"Read in the name of your Lord who created; created man from a clot. Read, and your Lord is the Most Generous, who taught by the pen; taught man that which he knew not." (Quran, Al-'Alaq 96:1-5)

Dedication

To
The Wind Beneath My Wings
My Beloved Mother

Acknowledgment

Great thanks to Allah, who gave me the ability to complete this work. I would like to express my deepest gratitude and thanks to all those who helped me bring this thesis to actuality, and in particular to my supervisor, Dr. Ban Nadeem, without whose help and encouragement this thesis would not have seen the light. Grateful thanks to the Head of the Department of Computer Science, Dr. Taha S. Bashaga. I also wish to thank the staff of both the Computer Science and the Mathematics and Computer Applications Departments at Al-Nahrain University for their help.

Abstract

Genetic Algorithms (GAs) are general-purpose search and optimization procedures. They were inspired by the biological evolution principle of survival of the fittest, which led to the metaphoric use of terminology borrowed from the field of biological evolution. Another optimization method, Particle Swarm Optimization (PSO), is able to accomplish the same goal as the GA in a new way. The thinking behind the algorithm was inspired by the social behavior of animals, such as bird flocking. PSO is similar to the GA in that it begins with a random population; unlike the GA, however, PSO has no evolution operators such as crossover and mutation.

In this thesis, three problems were chosen to compare GA and PSO performance: Solving Linear Algebraic Equations (SLAE), Solving the N-Queens Problem (SNQP), and the Substitution Cipher (SC). SLAE is a simple problem with a search space of size 2^12 in which the solution of a set of linear equations is sought; both GA and PSO consistently find good solutions. SNQP is the problem of placing n queens on an n×n chessboard, with a search space of size (n-k)!, such that none of them is able to attack any other; eight and sixteen queens were tried in this implementation. Good results are obtained, but the GA performs better and faster than PSO as the number of queens increases. Finally, the Substitution Cipher is a difficult problem with a search space of size 26!; the full key space of all possible substitution ciphers was searched, and this implementation met with limited success.
Abbreviations

BGA     Breeder Genetic Algorithm
BP      Back Propagation
EA      Evolutionary Algorithm
EAs     Evolutionary Algorithms
EP      Evolutionary Programming
ES      Evolutionary Strategies
G       Current number of Generations
GA      Genetic Algorithm
GAs     Genetic Algorithms
GP      Genetic Programming
GPGA    Global Parallel Genetic Algorithm
NoG     Maximum Number of Generations
PMX     Partial Matched Crossover
PSO     Particle Swarm Optimization
SC      Substitution Cipher
SLAE    Solving Linear Algebraic Equations
SNQP    Solving N-Queens Problem
TGA     Traditional Genetic Algorithm

Contents

Abstract
Abbreviations
Chapter One: Introduction
  1.1 Background
  1.2 Literature Survey
  1.3 Thesis Objective
  1.4 Thesis Layout
Chapter Two: Genetic Algorithms and Particle Swarm Optimization
  2.1 Introduction
  2.2 Evolutionary Algorithms
  2.3 Genetic Algorithms
    2.3.1 Elements of GAs
    2.3.2 The GA Procedure
    2.3.3 Advantages and Disadvantages of GAs
  2.4 Particle Swarm Optimization
    2.4.1 PSO Topology
    2.4.2 PSO Neighborhood Topologies
    2.4.3 PSO Algorithm
    2.4.4 The Pseudo Code of the PSO
    2.4.5 Binary PSO
    2.4.6 PSO Drawbacks
  2.5 GA versus PSO
  2.6 Summary
Chapter Three: Design and Implementation
  3.1 Introduction
  3.2 System Design
  3.3 Coding, Fitness Function, and Stopping Conditions
  3.4 Experiments
    3.4.1 Experiment One: Solving Linear Algebraic Equations
    3.4.2 Experiment Two: Solving N-Queens Problem
    3.4.3 Experiment Three: The Substitution Cipher
  3.5 Summary
Chapter Four: The Results and Comparisons
  4.1 Introduction
  4.2 Individuals Description
  4.3 Initialization
  4.4 Fitness Function
  4.5 Experiments Results
    4.5.1 Solving Linear Algebraic Equations
    4.5.2 Solving N-Queens Problem
    4.5.3 The Substitution Cipher
  4.6 Discussion
  4.7 Summary
Chapter Five: Conclusions and Future Works
  5.1 Conclusions
  5.2 Suggested Works
References

Chapter One
Introduction

1.1 Background

In the early 1950s, computer scientists studied evolutionary systems as an optimization tool, introducing the basics of evolutionary computing. Until the 1960s, the field of evolutionary systems developed in parallel with Genetic Algorithm (GA) research. When the two started to interact, a new field of evolutionary programming appeared, introducing new concepts of evolution, selection, and mutation.

Holland defined the concept of the GA as a metaphor of the Darwinian theory of evolution applied to biology. Implementation of a GA begins with a population of random chromosomes. The algorithm then evaluates these structures and allocates reproductive opportunities such that chromosomes which represent better solutions to the problem are given more chances to "reproduce". In selection, the best candidates are kept; new, fitter offspring are produced and reinserted, and the less fit are removed. Using operators such as crossover and mutation, the chromosomes exchange their characteristics. The suitability of a solution is typically defined with respect to the current population. GA techniques have a solid theoretical foundation. GAs are often viewed as function optimizers, although the range of problems to which they have been applied is broad [Hol75, Neg02].
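The GA cycle just described (evaluate, select, reproduce through crossover and mutation, reinsert) can be summarized in a short sketch. This is an illustrative outline only, not the implementation used in the thesis: the selection scheme here is plain truncation rather than, for example, the tournament selection listed in table (4.8), the stopping test assumes a fitness normalized so that 1.0 means a perfect solution, and the problem-specific pieces (fitness, random_chromosome, crossover, mutate) are placeholder callbacks.

```python
import random

def genetic_algorithm(fitness, random_chromosome, crossover, mutate,
                      pop_size=50, max_generations=100, p_mut=0.05):
    """Generic GA loop: evaluate, select, recombine, mutate, reinsert.
    All problem-specific parts (representation, fitness, operators)
    are supplied as callbacks."""
    population = [random_chromosome() for _ in range(pop_size)]
    for generation in range(max_generations):
        scored = sorted(population, key=fitness, reverse=True)
        best = scored[0]
        if fitness(best) >= 1.0:            # assumes fitness of 1.0 means a solution
            return best, generation
        parents = scored[:pop_size // 2]    # simple truncation selection
        offspring = []
        while len(parents) + len(offspring) < pop_size:
            p1, p2 = random.sample(parents, 2)
            child = crossover(p1, p2)
            if random.random() < p_mut:     # mutate with small probability
                child = mutate(child)
            offspring.append(child)
        population = parents + offspring    # reinsert fitter half plus offspring
    return max(population, key=fitness), max_generations
```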
The implicit rules followed by the members of fish schools and bird flocks, which allow them to move in a synchronized way without colliding, have been studied by several scientists. There is a general belief that social sharing of information among the individuals of a population may provide an evolutionary advantage, and there are numerous examples from nature to support this. This was the core idea behind the development of Particle Swarm Optimization (PSO). The PSO method is a member of the wide category of swarm intelligence methods [Rey87, Hep90, Ken01].

Kennedy originally proposed PSO as a simulation of social behavior, and it was introduced as an optimization method in 1995. PSO can be easily implemented and is computationally inexpensive, since its memory and CPU requirements are low. Furthermore, it does not require gradient information of the objective function being considered, only its values. PSO has proved to be an efficient method for numerous general optimization problems, and in some cases it does not suffer from the problems encountered by other evolutionary computation techniques. PSO typically moves quickly towards the best general area of the solution space for a problem [Ken95, Ebe96, Jon05].

Plain text:         geneticalgorithmsandparticleswarmoptimization
Key for ciphering:  kxahfwequodmczjbpygnriltsv
Cipher text:        efzfnuakmejyunqcgkzhbkynuamfglkycjbnucuvknujz

Figure (4.1): Plain text, ciphering key, and cipher text.

For the SC problem, tables (4.8) and (4.9) summarize the parameters and operators of the GA and the PSO algorithm respectively.

Table (4.8): GA operators and parameters for solving SC.

Initialization                        Random
Representation                        ASCII-code string of length 26
Selection                             Tournament selection
Recombination                         Partial Matched Crossover (PMX)
Mutation                              Swap mutation
Probability                           0.5
Population size                       10, 20, 30, ..., 100
Maximum number of iterations (NoG)    100, 500, 1000, 2000
Stopping condition                    Solution found, or number of generations = NoG

Table (4.9): PSO parameters for solving SC.

Initialization                             Random
Representation                             ASCII-code string of length 26
w                                          w = ((Tmax - G) * (0.9 - 0.4) / Tmax) + 0.4
c1 and c2                                  (not given in this excerpt)
r1 and r2                                  Random
Neighborhood topology                      Star topology
Population size (number of particles)      10, 20, 30, ..., 100
Maximum number of iterations (Tmax)        100, 500, 1000, 30000
Stopping condition                         Solution found, or number of iterations = Tmax
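For reference, a global-best ("star" topology) PSO update using the linearly decreasing inertia weight from table (4.9) can be sketched as follows. This is a generic sketch rather than the thesis code: the values of c1 and c2, the real-valued position representation, and the search bounds are assumptions, since they are not given in this excerpt, and fitness is assumed to be maximized.

```python
import random

def pso(fitness, dim, n_particles=30, t_max=1000, c1=2.0, c2=2.0,
        lo=-10.0, hi=10.0):
    """Global-best PSO; the inertia weight w decreases linearly from
    0.9 to 0.4 over the run, as in table (4.9)."""
    x = [[random.uniform(lo, hi) for _ in range(dim)] for _ in range(n_particles)]
    v = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in x]                    # each particle's best position so far
    gbest = max(pbest, key=fitness)[:]           # best position found by the swarm
    for g in range(t_max):
        w = ((t_max - g) * (0.9 - 0.4) / t_max) + 0.4   # inertia weight from table (4.9)
        for i in range(n_particles):
            for d in range(dim):
                r1, r2 = random.random(), random.random()
                v[i][d] = (w * v[i][d]
                           + c1 * r1 * (pbest[i][d] - x[i][d])   # cognitive pull
                           + c2 * r2 * (gbest[d] - x[i][d]))     # social pull
                x[i][d] += v[i][d]
            if fitness(x[i]) > fitness(pbest[i]):
                pbest[i] = x[i][:]
                if fitness(pbest[i]) > fitness(gbest):
                    gbest = pbest[i][:]
    return gbest
```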
Table (4.10) shows the results obtained by the GA in solving the SC problem; for each population size, three (generations, time) result pairs are listed.

Table (4.10): Results of GA to solve SC.

PopSize    No. of Generations    Time (sec)
10         604                   37.0000
10         974                   59.9999
10         1500                  91.9999
20         213                   25.9999
20         320                   40.0000
20         451                   56.0000
30         177                   32.9999
30         185                   34.0000
30         267                   48.9999
40         118                   28.9999
40         144                   35.9999
40         146                   35.9999
50         99                    29.9999
50         102                   31.9999
50         114                   35.9999
60         72                    26.9999
60         74                    27.9999
60         93                    34.9999
70         35                    14.9999
70         98                    43.0000
70         114                   50.0000
80         40                    21.0000
80         45                    22.9999
80         99                    50.0000
90         46                    26.0000
90         71                    41
90         74                    41.9999
100        32                    19.9999
100        37                    23.9999
100        58                    37.0000

In this problem (SC), the PSO algorithm does not reach any solution under any of the population/swarm sizes (Pop/Swarm Size) and maximum numbers of generations (NoG) tried.

4.6 Discussion

The three problems (SLAE, SNQP, and SC) were solved by both GA and PSO; ten runs were carried out for each population size on each problem. In the first problem, SLAE, both GA and PSO consistently found good solutions for population sizes of 10, 20, 30, 40, and 50, as shown in table (4.3). In the second problem, SNQP, the results of both GA and PSO are satisfactory when the number of queens (n) equals 8, but as n is increased, the GA does better and runs faster than PSO, as illustrated in table (4.7). In the third problem, SC, the GA finds a solution, though with a long running time and a large number of generations, as the results listed in table (4.10) show; when the number of generations is small, it reaches only an indication of the correct solution, while the PSO algorithm does not reach or converge to any possible solution. The overall results indicate that the GA determines solutions which are closer to the optimal solutions than PSO does.
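The comparison above rests on problem-specific fitness functions, which are defined in Section 4.4 and are not reproduced in this excerpt. As an illustration only, one common SNQP fitness counts the number of non-attacking queen pairs, so that a conflict-free placement of n queens scores the maximum n(n-1)/2; the sketch below uses that formulation, not necessarily the one adopted in the thesis.

```python
def n_queens_conflicts(board):
    """Number of attacking queen pairs; board[i] is the column of the
    queen placed in row i (one queen per row)."""
    n = len(board)
    conflicts = 0
    for i in range(n):
        for j in range(i + 1, n):
            same_column = board[i] == board[j]
            same_diagonal = abs(board[i] - board[j]) == j - i
            if same_column or same_diagonal:
                conflicts += 1
    return conflicts

def n_queens_fitness(board):
    """Higher is better; a conflict-free board reaches n*(n-1)/2,
    the total number of queen pairs."""
    n = len(board)
    return n * (n - 1) // 2 - n_queens_conflicts(board)

# A known 8-queens solution scores the maximum of 28.
print(n_queens_fitness([0, 4, 7, 5, 2, 6, 1, 3]))   # -> 28
```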
4.7 Summary

The results of the three problems (SLAE, SNQP, and SC), solved using the GA and the PSO algorithm, were analyzed in this chapter, and a comparison between the two methods was made. In the next chapter, the conclusions of this work are presented and some future work is suggested.

Chapter Five
Conclusions and Suggested Works

5.1 Conclusions

For the linear problem (Solving Linear Algebraic Equations), with a search space of size 2^12, both PSO and GA perform well in terms of time and number of generations.

For the non-linear problems (Solving the N-Queens Problem with n = 8 and n = 16, and the Substitution Cipher), with search spaces of size (8-k)!, (16-k)!, and 26! respectively, we conclude that the GA, with its simple operators (selection, crossover, and mutation), was stable in its performance across the different search spaces and can reach optimal or near-optimal solutions. PSO, by contrast, performs well on small search spaces but loses capability on more complicated problems with large search spaces.

The main difference of the PSO approach compared to the GA is that PSO does not have genetic operators such as crossover and mutation. Particles update themselves with an internal velocity; they also have a memory, which is important to the algorithm. In PSO, only the "best" particle gives out its information to the others; it is a one-way information-sharing mechanism, and the evolution only looks for the best solution. The computational effort of PSO is less than that of the GA because there are fewer parameters to adjust, so it is faster than the GA on the same problem.

5.2 Suggested Works

1. A hybrid system between GA and PSO may be implemented and evaluated on different applications.
2. The PSO algorithm may be applied to problems already attacked successfully using the GA (such as the Transposition Cipher, the Traveling Salesman Problem, and image and voice registration).
3. GA and PSO may be compared with other evolutionary optimization algorithms (such as the Memetic Algorithm, Ant Colony Optimization, and the Shuffled Frog Leaping Algorithm).

References

[Bar97] M. Baright, J. Timmins, G. Heliker, "Multiobjective Optimization of Control Systems Via Genetic Algorithms", University of Illinois, IlliGAL Report No. 07009, 1997.
[Boz03] M. Bozikovic, M. Golub, L. Budin, "Solving n-Queen Problem Using Global Parallel Genetic Algorithm", EUROCON 2003, Ljubljana, Slovenia, 2003.
[Bry00] K. Bryant, A. Benjamin, "Genetic Algorithms and the Traveling Salesman Problem", Harvey Mudd College, December 2000, http://www.math.hmc.edu/seniorthesis/archives/2001/kbryant_2001_thesis.pdf
[Cha99] L. D. Chambers, Practical Handbook of Genetic Algorithms: Complex Coding Systems, Volume III, CRC Press LLC, 1999.
[Cor01] O. Cordon, F. Herrera, F. Hoffmann, L. Magdalena, Genetic Fuzzy Systems, World Scientific Publishing Co. Pte. Ltd., 2001.
[Del04] B. Delman, "Genetic Algorithms in Cryptography", Rochester Institute of Technology, Kate Gleason College of Engineering, Computer Engineering, M.Sc. thesis, 2004.
[Ebe96] R. C. Eberhart, P. Simpson, and R. Dobbins, Computational Intelligence PC Tools, Academic Press, 1996.
[Gol89] D. E. Goldberg, Genetic Algorithms in Search, Optimization, and Machine Learning, Addison-Wesley Publishing Company, Inc., 1989.
[Gra95] K. Grant, "An Introduction to Genetic Algorithms", C++ Users Journal, March 1995.
[Has05] R. Hassan, B. Cohanim, O. de Weck, "A Comparison of Particle Swarm Optimization and the Genetic Algorithm", AIAA, Massachusetts Institute of Technology, Cambridge, MA, 02139, 2005. See also: http://web.mit.edu/deweck/www/pdf_archive/3%20Refereed%20conference/3-50-ATAA-2005-1897.pdf
[Hep90] F. Heppner and U. Grenander, "A Stochastic Nonlinear Model for Coordinated Bird Flocks", in S. Krasner (Ed.), The Ubiquity of Chaos, AAAS Publications, USA, 1990.
[Hol75] J. Holland, Adaptation in Natural and Artificial Systems, University of Michigan Press, 1975.
[Hos00] A. S. Hosagrahara, "Development of a Pattern Recognition Program Using Genetic Algorithms", IlliGAL Report No. 2000003, USA, January 2000.
[Hun94] S. L. Hung and H. Adeli, "A Parallel Genetic/Neural Network Learning Algorithm for MIMD Shared-Memory Machines", IEEE Transactions on Neural Networks, Vol. 5, No. 6, November 1994.
[Jon05] K. O. Jones, "Comparison of Genetic Algorithm and Particle Swarm Optimization", Liverpool John Moores University, 2005, k.o.jones@livjm.ac.uk. See also: http://ecet.ecs.ru.acad.bg/cstos/Docs/cp/SIII/IIIA.1.pdf
[Kad00] A. M. Kadim, "Effects of Selection Schemes and Scaling Strategies on the Performance of Genetic Algorithms", Al-Nahrain University, M.Sc. thesis, 2000.
[Ken95] J. Kennedy and R. Eberhart, "Particle Swarm Optimization", Proc. IEEE International Conference on Neural Networks, Vol. 4, pp. 1942-1948, 1995.
[Ken97] J. Kennedy and R. C. Eberhart, "A Discrete Binary Version of the Particle Swarm Algorithm", Proc. Conf. on Systems, Man, and Cybernetics, pp. 4104-4109, Piscataway, NJ: IEEE Service Center, 1997.
[Ken01] J. Kennedy and R. Eberhart, Swarm Intelligence, Morgan Kaufmann Publishers, Inc., San Francisco, CA, 2001.
[Ken02] J. Kennedy and R. Mendes, "Population Structure and Particle Swarm Performance", Proceedings of the 2002 Congress on Evolutionary Computation, Honolulu, Hawaii, May 2002.
[Løv02] M. Løvbjerg and T. Krink, "Extending Particle Swarms with Self-Organized Criticality", Proceedings of the Fourth Congress on Evolutionary Computation (CEC-2002), 2002.
[Mah05] A. G. Mahmoud, "Evolving Neural Networks Using Particle Swarm Optimization", Al-Mustansiriyah University, M.Sc. thesis, 2005.
[Mic95] Z. Michalewicz, "A Survey of Constraint Handling Techniques in Evolutionary Computation Methods", in Proceedings of the 4th Annual Conference on Evolutionary Programming, San Diego, CA, 1995, pp. 35-155.
[Mit96] M. Mitchell, An Introduction to Genetic Algorithms, Massachusetts Institute of Technology, 1996.
[Mou05] C. R. Mouser, S. A. Dunn, "Comparing Genetic Algorithms and Particle Swarm Optimisation for an Inverse Problem Exercise", Australian Mathematical Society, 2005. See: http://anziamj.austms.org.au/V46/CTAC2004/Mous/Mous.pdf
[Neg02] M. Negnevitsky, Artificial Intelligence, Addison-Wesley, 2002.
[Nih98] A. I. Nihad, "Image Registration Using Genetic Algorithms", Al-Nahrain University, M.Sc. thesis, 1998.
[Omr05] M. G. H. Omran, "Particle Swarm Optimization Methods for Pattern Recognition and Image Processing", University of Pretoria, Ph.D. thesis, 2005, http://www.upetd.up.ac.za/thesis/aveilable/etd_021720_110834/univericted/pdf
[Pha95] D. T. Pham and L. Xing, "Neural Networks for Identification, Prediction and Control", 1995.
[Rey87] C. W. Reynolds, "Flocks, Herds, and Schools: A Distributed Behavioral Model", Computer Graphics, pp. 25-34, 1987.
[Sch98] M. Schoenauer, Z. Michalewicz, "Sphere Operators and Their Applicability for Constrained Parameter Optimization Problems", in Proceedings of the Annual Conference on Evolutionary Programming, San Diego, CA, 1998, pp. 241-250.
[Set05] M. Settles, "An Introduction to Particle Swarm Optimization", Department of Computer Science, University of Idaho, Moscow, Idaho, U.S.A., November 7, 2005.
[Shi04] Y. Shi, "Particle Swarm Optimization", Electronic Data Systems, Inc., Kokomo, IN 46902, USA; Feature Article, IEEE Neural Networks Society, February 2004.
[Spi93] R. Spillman, M. Janssen, B. Nelson, M. Kepner, "Use of a Genetic Algorithm in the Cryptanalysis of Simple Substitution Ciphers", Cryptologia, Vol. XVII, No. 1, January 1993.
[Whi94] D. Whitley, "A Genetic Algorithm Tutorial", Colorado State University, Fort Collins, CO 80523, 1994, whitley@cs.colostate.edu, http://www.cs.uga.edu/~potter/comIntell/ga-tutorial.pdf
[Zho03] Y. Zhou, G. Zeng, and F. Yu, "Particle Swarm Optimization-Based Approach for Optical Finite Impulse Response Filter Design", Optical Society of America, 2003.

Web Sites:

[Web1] J. Gester, "Solving Substitution Ciphers with Genetic Algorithms", 2003, http://www.cs.rochester.edu/u/brown/Crypto/studprojs/susbstGen.pdf
[Web2] "What is a Genetic or Evolutionary Algorithm?", http://www.solver.com/gabasics.htm
[Web3] "Genetic Algorithm", http://www.subsimple.com/genealgo.asp
[Web4] "Particle Swarm Optimization and Neural Network Application for QSAR", http://www.hicomb.org/papers/HICOMB2004-13.pdf, 2004.
[Web5] "Particle Swarm Optimization: Tutorial", http://www.swarmintelligence.org/tutorials.php

(The thesis closes with an Arabic-language abstract and title page, which restate the English abstract and title page given above.)
