MINISTRY OF EDUCATION AND TRAINING          MINISTRY OF NATIONAL DEFENCE
MILITARY TECHNICAL ACADEMY

NGUYEN XUAN HUNG

Objective reduction methods in evolutionary many-objective optimization

Major: Mathematical Foundation for Informatics
Code: 46 01 10

THE SUMMARY OF DOCTORAL THESIS IN MATHEMATICS

Hanoi - 2022

THIS THESIS IS COMPLETED AT MILITARY TECHNICAL ACADEMY - MINISTRY OF NATIONAL DEFENCE

Supervisor: Assoc. Prof. Dr. Bui Thu Lam

Reviewer 1: Assoc. Prof. Dr. Tran Quang Anh, Posts and Telecommunications Institute of Technology
Reviewer 2: Assoc. Prof. Dr. Tran Dang Hung, Hanoi National University of Education
Reviewer 3: Assoc. Prof. Dr. Nguyen Long, National Defense Academy

The thesis will be defended before the academy-level evaluating council established under Decision No. 4840/QĐ-HV of November 9th, 2022, signed by the President of the Military Technical Academy.

The thesis can be found at:
- The library of Military Technical Academy
- The National Library of Vietnam

Introduction

Optimization is one of the classical topics in mathematics and influences most areas of social life. In fact, finding the optimal solution to a problem is at the heart of any decision-making issue. Decision-making usually requires choosing between different alternatives, and the optimal plan is the most reasonable one: it saves cost and resources while remaining highly effective. Single-objective optimization problems (single-problems) involve optimizing one objective function, and the task of seeking the optimal solution is called single-objective optimization. Multi-objective optimization problems (multi-problems) involve more than one objective, and these objectives are in conflict with each other. One effective approach for solving multi-problems, which utilizes evolutionary algorithms, is the multi-objective evolutionary algorithm (multi-algorithm). Multi-algorithms use Pareto dominance relations to compare solutions during optimization. As a result, a set of non-dominated solutions, representing different trade-offs among objectives, is obtained; it is called the set of Pareto optimal solutions (PS).

Multi-problems with more than three objectives are considered many-objective optimization problems (many-problems). These problems pose several challenges to multi-algorithms and cause their performance to degrade severely. To alleviate these challenges, many-objective evolutionary algorithms (many-algorithms) have been proposed. Although many-algorithms are powerful tools for dealing with many-problems, they still face "the curse of dimensionality". Numerous practical optimization problems can easily involve a large number of objectives (often more than ten), since many distinct objectives or criteria are of interest to practitioners. A question that arises is whether all objectives are actually necessary, or whether some may be omitted without, or with only slightly, changing the problem characteristics. Methods for objective reduction are helpful for both decision making and search. On the one hand, the decision makers have to consider fewer objective values per solution, and the number of non-dominated solutions is likely to decrease. On the other hand, the search algorithms may work more efficiently and require less computational resources, especially for problems with computationally expensive objectives.

Aim of the study

The study was carried out with a view to understanding and improving the performance of many-algorithms by objective dimensionality reduction.

Objectives of the study

In order to achieve the aforementioned aim, the study is designed to
fulfill the following objectives:

• Investigating the efficiency of combining many-algorithms with objective dimensionality reduction (ODR) in solving many-problems. For this objective, linear principal component analysis (L-PCA) is used to reduce the dimensionality of the non-dominated sets generated by many-algorithms. First, in order to examine how multi-algorithms/many-algorithms affect the performance of an ODR, the thesis designs experiments that compare the performance of an ODR when it is combined with either multi-algorithms or many-algorithms. Second, to check whether a many-algorithm gains any advantage when combined with an ODR, the thesis designs experiments that compare the performance of a many-algorithm integrated with an ODR against the many-algorithm alone.

• Proposing methods that can eliminate the redundant objectives when solving many-problems by using machine learning algorithms. The first proposed method uses many-algorithms to generate the complete PF. The objectives of the many-problem are regarded as objects (points) in space; the Partitioning Around Medoids (PAM) algorithm is employed to cluster these objects, and one object is retained per cluster. The retained objects are the essential objectives of the problem. The second method takes advantage of the Pareto corner search evolutionary algorithm (PCSEA) to generate only several parts of the PF, then uses machine learning algorithms such as L-PCA, k-means or DBSCAN (Density-Based Spatial Clustering of Applications with Noise) to keep the essential objectives and discard the redundant ones.

Research questions

In order to achieve the aforementioned objectives, the following three research questions were posed:

1. What are the impacts of using multi-algorithms/many-algorithms in combination with an ODR for solving many-problems?
2. How effective is using many-algorithms in combination with an ODR for solving many-problems?
3. What are the proposals to improve the performance of many-algorithms in terms of objective dimensionality reduction?
Contributions

The main contributions of the thesis are as follows.

First, the thesis validates that the performance of objective reduction strongly depends on which multi-algorithms/many-algorithms generate the non-dominated solution sets. It shows that many-algorithms give better results than multi-algorithms when combined with an ODR. It also reveals that the combination gives better results than many-algorithms alone, and it demonstrates that combining with an ODR to remove redundant objectives can significantly improve the performance of many-algorithms. This contribution has been published in [J1].

Second, the thesis proposes a complete PF-based objective reduction algorithm (ORA). The algorithm utilizes the non-dominated set generated by many-algorithms, then uses the PAM algorithm to remove redundant objectives of many-problems. To determine the number of clusters, the algorithm uses the Silhouette index. This contribution has been published in [J2].

Third, the thesis proposes two partial PF-based ORAs. The algorithms employ PCSEA to obtain a partial PF, namely only the "corner" solutions of the PF. Based on these solutions, machine learning algorithms are then used to remove the redundant objectives. The results show that the integration of PCSEA and ODRs finds the essential objectives and eliminates the redundant ones for problems with a large number of objectives better than other ORAs do. This contribution has been published in [J3], [C1] and [C2].

The thesis has three chapters. Chapter 1 surveys background literature and related works. Chapter 2 investigates the impact of many-algorithms on an ORA and vice versa; it also proposes the COR objective reduction algorithm, which utilizes many-algorithms to generate complete PFs and then employs the PAM algorithm to cluster and remove the redundant objectives of many-problems. Chapter 3 proposes two objective reduction algorithms, namely the PCS-LPCA and PCS-Cluster objective reduction algorithms; they utilize the Pareto corner search evolutionary algorithm to generate a partial PF, then employ L-PCA and clustering algorithms to keep essential objectives and discard redundant ones.

Chapter 1
Literature Review

1.1 Background

Definition 1.1 (Optimization problem): A general optimization problem is defined as

    minimize  f(x),   subject to  x ∈ Ω.                                    (1.1)

The function f(x): R^n → R, which is real-valued and is called the objective function, needs to be minimized. Such a problem is considered a single-problem.

Definition 1.2: A multi-problem is stated as

    minimize  f(x) = [f1(x), f2(x), ..., fM(x)]^T,   subject to  x ∈ Ω,     (1.2)

where the M objective functions, each mapping R^n to R, are in conflict with each other and need to be minimized simultaneously.

The approach of using evolutionary algorithms for solving multi-problems is called the multi-algorithm. It uses Pareto dominance relations to compare solutions during optimization. As a result, a set of non-dominated solutions, representing different trade-offs among objectives, is obtained and called the PS. Multi-problems containing more than three objectives are regarded as many-problems. These many-problems cause challenges for multi-algorithms, so many-algorithms have been proposed to deal with them more effectively. However, "the curse of dimensionality" is unavoidable, and it is difficult to know whether all objectives are essential or not.
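To make the Pareto dominance relation of Definition 1.2 concrete, the following minimal Python sketch (illustrative only, not code from the thesis) compares solutions of a minimization problem and filters a population down to its non-dominated set:

# Minimal sketch of Pareto dominance and non-dominated filtering (minimization).
import numpy as np

def dominates(a: np.ndarray, b: np.ndarray) -> bool:
    """True if solution a Pareto-dominates solution b:
    a is no worse in every objective and strictly better in at least one."""
    return bool(np.all(a <= b) and np.any(a < b))

def non_dominated(F: np.ndarray) -> np.ndarray:
    """Return indices of the non-dominated rows of an (N x M) objective matrix F."""
    keep = []
    for i in range(len(F)):
        if not any(dominates(F[j], F[i]) for j in range(len(F)) if j != i):
            keep.append(i)
    return np.array(keep)

if __name__ == "__main__":
    F = np.array([[1.0, 4.0], [2.0, 3.0], [3.0, 3.0], [4.0, 1.0]])
    print(non_dominated(F))   # [3, 3] is dominated by [2, 3]; the rest form the front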
Objective reduction is a method that determines a subset of the objectives that can generate the same PF as the full set of objectives.

1.2 Related works

1.2.1 Many-objective optimization

There is no generally agreed definition of many-problems; however, when the number of objectives of a multi-problem is greater than three, the multi-problem is regarded as a many-problem. The primary motivation behind many-problems is to highlight the challenges posed by a large number of objectives to existing multi-algorithms, so the definition presented here is an evolving one and serves a practical rather than theoretical purpose.

When dealing with many-problems, multi-algorithms encounter a number of well-identified obstacles. First, Pareto-dominance based multi-algorithms such as NSGA-II and SPEA2 have difficulty in determining which solutions are better for the next generation, since a large portion of the population becomes equally good even at early stages of solving many-problems. Second, in order to depict the resulting front, the population size must be increased exponentially, while in practice the population size is limited; under non-degenerate scenarios, the PF of an m-objective problem is an (m-1)-dimensional manifold. Third, once a PF is obtained, the decision makers have to choose the final solution, and at this stage the visualization of the alternative solutions becomes very important. Fourth, it is difficult to evaluate the performance of an algorithm: if distance-based performance measures such as GD and IGD are deployed, the PF of the problem must be known prior to evaluation and a huge number of solutions in the PF are required within the reference set to get reliable results; if the hypervolume (HV) is used, it requires a large computational effort.

Based on these difficulties, algorithms for solving many-problems (Pareto-based algorithms, non-Pareto-based algorithms and their improvements) can be classified into six approaches: relaxed-dominance based, diversity based, aggregation based, indicator based, reference-set based and objective-reduction based approaches.
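The first difficulty above, the loss of Pareto-based selection pressure, can be illustrated with a small numerical sketch. The experiment below is illustrative only (random populations, not a benchmark from the thesis): it estimates the fraction of mutually non-dominated solutions in a random population as the number of objectives grows.

# Illustrative sketch: as the number of objectives M grows, almost every member of a
# random population becomes non-dominated, i.e. Pareto-based selection pressure is lost.
import numpy as np

rng = np.random.default_rng(0)

def fraction_non_dominated(pop: np.ndarray) -> float:
    n = len(pop)
    dominated = np.zeros(n, dtype=bool)
    for i in range(n):
        for j in range(n):
            if i != j and np.all(pop[j] <= pop[i]) and np.any(pop[j] < pop[i]):
                dominated[i] = True
                break
    return 1.0 - dominated.mean()

for M in (2, 5, 10, 20):
    pop = rng.random((200, M))          # random population of 200 solutions
    print(M, round(fraction_non_dominated(pop), 2))
# Typically the fraction rises from well below 1 at M = 2 to close to 1 at M = 20.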
1.2.2 Objective reduction

Objectives in multi-problems or many-problems are regarded as features, and applying dimensionality reduction when solving many-problems is called objective reduction. Objective reduction procedures essentially have two major modules, as shown in Figure 1.1: a generator module and a reduction module. The first module, which is a multi-/many-algorithm, generates an approximation of the PF. The second module, which is responsible for the objective reduction itself by using reduction techniques, is called objective dimensionality reduction (ODR).

Figure 1.1: Flowchart of objective reduction (a loop in which a multi-/many-algorithm generates a non-dominated set and an ODR removes redundant objectives, repeated until the stop condition is met).

Objective reductions can be classified in several ways: objective selection versus objective extraction; online versus offline; Pareto dominance structure based versus correlation based; and without error versus allowing error.

The first categorization, based on the form of the objectives after reduction, distinguishes objective selection from objective extraction. Objective extraction creates novel objective(s) from the original objectives, formulating each essential/reduced objective as a linear combination of the original ones. Objective selection finds the smallest subset of the given objectives that generates the same PF as the set of original objectives.

Another classification, based on when the ODR is incorporated into multi-algorithms/many-algorithms, divides objective reductions into offline and online classes. Offline class: the objective reduction is applied right after the many-algorithms finish their search, in order to effectively assist the decision making after the search. Online class: while offline objective reduction supports decision making, online objective reduction can also improve the search itself.

Objective reduction is also categorized into the Pareto dominance structure based approach and the correlation based approach. The Pareto dominance structure based approach aims to preserve the dominance relations among the given non-dominated solutions, so as to retain as many of those solutions as possible after redundant objectives are removed. The correlation based approach, which relies on the correlation between each pair of objectives, aims to keep the most conflicting objectives and to remove objectives that are only weakly conflicting or non-conflicting with the others; the correlation between objectives may be measured by the correlation coefficient or by mutual information.

The above objective reductions can efficiently remove redundant objectives; however, they have a number of limitations. First, although many objective reductions have been proposed, many of them have not considered reducing the objectives of redundant problems or have not been validated by testing on redundant multi-/many-problems. Second, almost all objective reductions use NSGA-II, which is a multi-algorithm, to generate the non-dominated set, while many state-of-the-art many-algorithms with various improvements have been proposed. Third, among the objective reductions that are validated on redundant many-problems, most are tested with only one redundant many-problem, namely DTLZ5(I,M), or only with instances with a small number of objectives. Furthermore, in several objective reductions, parameters must be provided before the objective reduction is performed (MICA-NORMOEA, PCSEA-based objective reduction).
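As a minimal illustration of the correlation-based approach described above (and not the exact procedure of OC-ORA, MICA-NORMOEA or any other algorithm cited in the thesis), one can greedily keep only objectives that conflict with every objective kept so far; the threshold value below is an arbitrary illustrative choice:

# Hedged sketch of correlation-based objective selection: objectives whose values over a
# non-dominated set are almost perfectly positively correlated carry redundant information,
# so only one of each such group is kept. Threshold and greedy order are illustrative.
import numpy as np

def correlation_based_selection(F: np.ndarray, threshold: float = 0.95):
    """F: (N x M) objective values of a non-dominated set. Returns kept objective indices."""
    corr = np.corrcoef(F, rowvar=False)            # M x M correlation matrix
    kept = []
    for m in range(F.shape[1]):
        if all(corr[m, k] < threshold for k in kept):
            kept.append(m)                          # m conflicts with every kept objective
    return kept

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    f1 = rng.random(100)
    F = np.column_stack([f1, 1.0 - f1, 2.0 * f1 + 0.01 * rng.random(100)])
    print(correlation_based_selection(F))           # f3 ~ f1, so only two objectives remain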
1.3 Benchmarks and performance measures

1.3.1 Benchmark methods

The thesis compares the performance of the proposed algorithms with two existing methods (OC-ORA and L-PCA). Five many-algorithms are used to obtain non-dominated solutions for both the proposed and the existing methods: the grid-based evolutionary algorithm (GrEA), the knee point driven evolutionary algorithm (KnEA), reference-point based non-dominated sorting (NSGA-III), the reference vector guided evolutionary algorithm (RVEA*), and the new dominance relation-based evolutionary algorithm (θ-DEA). The partial PF-based ORAs in Chapter 3 are also compared with a Pareto dominance based one, namely the PCSEA-based ORA.

1.3.2 Benchmark problems

In order to compare and challenge the reduction capabilities of objective dimensionality reduction algorithms, the thesis uses two test problems, namely DTLZ5(I,M) and WFG3(M). The DTLZ5(I,M) problem has three properties. First, the dimensionality I of the PF can be changed by setting I to an integer between 2 and M. Second, the PF is non-convex and follows the relationship sum_{i=1}^{M} (f_i)^2 = 1. Third, the first M - I + 1 objectives are perfectly correlated, while the remaining objectives and any one of the first M - I + 1 objectives are in conflict with each other. The Pareto optimal front of the WFG3(M) problem degenerates into a linear hyperplane such that sum_{i=1}^{M} f_i = 1, in which the first M - 1 objectives are perfectly correlated, while the last objective is in conflict with each of them in turn.

1.3.3 Performance measures

Two typical metrics, GD and IGD, are used to measure the algorithms in Chapter 2. GD represents how 'far' PF_known is from PF_true, while IGD can reflect both the convergence and the diversity of a multi-/many-algorithm. Means and standard deviations of these measures over repeated runs are reported.
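For reference, GD and IGD can be computed from a reference front as in the sketch below. This is a generic formulation (plain Euclidean distances and simple averaging), which may differ from the exact normalisation used in the thesis's experiments:

# Hedged sketch of the two distance-based indicators in Section 1.3.3.
# GD: average distance from the obtained front to the reference (true) front.
# IGD: average distance from the reference front to the obtained front, so it
# reflects both convergence and diversity.
import numpy as np

def _min_dists(A: np.ndarray, B: np.ndarray) -> np.ndarray:
    """For each point in A, distance to its nearest neighbour in B."""
    d = np.linalg.norm(A[:, None, :] - B[None, :, :], axis=2)
    return d.min(axis=1)

def gd(obtained: np.ndarray, reference: np.ndarray) -> float:
    return float(_min_dists(obtained, reference).mean())

def igd(obtained: np.ndarray, reference: np.ndarray) -> float:
    return float(_min_dists(reference, obtained).mean())

if __name__ == "__main__":
    t = np.linspace(0.0, np.pi / 2.0, 200)
    reference = np.column_stack([np.cos(t), np.sin(t)])   # quarter circle, e.g. a spherical PF
    obtained = 1.05 * reference[::20]                      # few, slightly off-front points
    print(round(gd(obtained, reference), 4), round(igd(obtained, reference), 4))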
2.1 Efficiency of many-objective evolutionary algorithms in objective reduction

This section investigates the interaction between multi-/many-algorithms and an ODR. First, the study examines how multi-/many-algorithms affect the performance of an ODR. Second, it evaluates what benefits many-algorithms can obtain when they are combined with an ODR.

2.1.1 Method

Figure 2.1 shows the method. The objective set of the problem is the input of the method. The loop starts with a multi-algorithm or a many-algorithm, which is used to generate a non-dominated solution set. An ODR analyzes this set to determine which objectives are essential and which are redundant; the redundant objectives are removed while the essential ones are kept. The loop continues until no more objectives are discarded.

Figure 2.1: The integration of an ODR into multi-algorithms/many-algorithms (generate a non-dominated solution set, reduce the objective set, and repeat until no objective is removed).

In order to examine whether many-algorithms can obtain advantages when they are combined with an ODR, experiments are designed to compare the performance of the integration of a many-algorithm with an ODR against the performance of the many-algorithm alone. Figure 2.2 shows two ways of using many-algorithms to deal with many-problems: Figure 2.2a shows the integration of an ODR into a many-algorithm, while Figure 2.2b shows the common way of using a many-algorithm alone for dealing with a many-problem. The operation of Figure 2.2a is similar to Figure 2.1; L-PCA is used as the ODR to remove the redundant objectives.

Figure 2.2: Two ways of using many-algorithms to deal with many-problems: (a) integration of an ODR into many-algorithms; (b) using many-algorithms alone.
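The loop of Figure 2.1 (and of Figure 2.2a) can be sketched as follows. The function names run_many_moea and reduce_objectives are placeholders for a many-algorithm (for example NSGA-III) and an ODR (for example L-PCA); they are not names used in the thesis:

# Hedged sketch of integrating an ODR into a many-objective EA (Figure 2.1 / 2.2a):
# generate a non-dominated set on the current objective subset, let the ODR keep the
# essential objectives, and repeat until no objective is discarded.
from typing import Callable, List, Sequence
import numpy as np

def iterative_objective_reduction(
    objectives: Sequence[int],
    run_many_moea: Callable[[Sequence[int]], np.ndarray],                  # placeholder many-algorithm
    reduce_objectives: Callable[[np.ndarray, Sequence[int]], List[int]],   # placeholder ODR
) -> List[int]:
    current = list(objectives)
    while True:
        front = run_many_moea(current)                 # non-dominated set for current objectives
        selected = reduce_objectives(front, current)   # essential objectives kept by the ODR
        if set(selected) == set(current):              # stop condition: nothing was removed
            return selected
        current = selected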
2.1.2 Experiment

The method is validated with (i) two pairs of multi-/many-algorithms, namely SPEA2/SPEA2+SDE and NSGA-II/NSGA-III, and (ii) five many-algorithms, namely GrEA, KnEA, NSGA-III, RVEA* and θ-DEA, which are used in the setting of Figure 2.2 to search for non-dominated solutions. The test problem used for this method is the DTLZ5(I,M) problem.

2.1.3 Result

The results show that the quality of the non-dominated solution sets generated by multi- or many-algorithms plays an important role in the performance of an objective reduction algorithm. The combination of an ODR with many-algorithms can successfully remove redundant objectives even if the number of original objectives is large, whereas the combination of an ODR with multi-algorithms can often remove redundant objectives only when the number of original objectives is small. In other words, many-algorithms give better results than multi-algorithms. The results for the setting of Figure 2.2 show that many-algorithms combined with an ODR achieve significantly better performance than the algorithms alone in terms of removing redundant objectives.

Table 2.5: Means and standard deviations of the number of objectives retained (Retain), and the number of successes out of 20 runs (Success), when integrating objective reduction (L-PCA) into multi-algorithms/many-algorithms.

             DTLZ5(2,5)          DTLZ5(3,5)          DTLZ5(5,10)         DTLZ5(7,10)         DTLZ5(5,20)
             Retain     Success  Retain     Success  Retain     Success  Retain     Success  Retain      Success
NSGA-II      2.00±0.00  20       3.00±0.00  20       9.25±1.83  -        10.0±0.00  -        20.00±0.00  -
SPEA2        2.00±0.00  20       3.00±0.00  20       9.25±1.83  -        10.0±0.00  -        20.00±0.00  -
NSGA-III     2.00±0.00  20       3.00±0.00  20       5.00±0.00  20       7.00±0.00  20       4.90±0.31   18
SPEA2+SDE    2.00±0.00  20       3.00±0.00  20       5.00±0.00  20       7.00±0.00  20       5.00±0.00   20

Table 2.6: Means and standard deviations of the number of objectives retained (Retain); of the IGD and GD of the approximate PFs (IGD1, GD1); and of the IGD and GD (IGD2, GD2) after carrying out objective reduction (L-PCA), for GrEA, KnEA, NSGA-III, RVEA* and θ-DEA on DTLZ5(I,M) instances.

2.2 COR objective reduction algorithm

An improvement of the MICA-NORMOEA and OC-ORA algorithms is proposed. The algorithm is built on two key ideas. The first idea is that the proposed algorithm can determine the number of clusters by itself when it uses PAM for partitioning the objects (the set of objectives in this case), instead of requiring a specific number of clusters before running PAM. The second idea is that the proposed algorithm does not need a threshold parameter while determining and removing redundant objectives. Algorithm 2.1 below describes the proposed method.

2.2.1 The proposed algorithm

The COR objective reduction algorithm has two variants. Based on the objectives in the solution set, a distance coefficient matrix is computed, and the proposed algorithm uses two methods for calculating it. The first distance coefficient is the interdependence coefficient proposed in the OC-ORA algorithm; the COR variant using this method is called CORi. The second distance coefficient is the correlation; the COR variant using this method is called CORc.

2.2.2 Experiment

The thesis tests the proposed, OC-ORA and L-PCA objective reductions on instances of the redundant problem DTLZ5(I,M) with I relevant objectives and M - I redundant ones.

2.2.3 Result

The results in Table 2.7 show that the performance of the CORc variant is statistically equivalent to that of the other algorithms, namely OC-ORA and L-PCA.

Table 2.7: The number of times out of 20 runs that the ORAs (L-PCA, OC-ORA, CORi, CORc) successfully found the set of conflicting objectives when combined with the many-algorithms GrEA, KnEA, NSGA-III, RVEA* and θ-DEA on DTLZ5(I,M) instances.

Algorithm 2.1: COR algorithm
Input: iteration counter t ← 0; original objectives Ft ← {f1, f2, ..., fM}
Output: reduced objective set Fs
begin
  repeat
    Run a many-algorithm to obtain a non-dominated set At corresponding to Ft
    /* Calculate the distance coefficient matrix */
    Based on the non-dominated set At, calculate the distance coefficient matrix Dt
    /* Determine the number of clusters: PAM and the Silhouette index are used to
       cluster the objectives Ft into k clusters */
    Silhouette_max ← -∞; i ← 2
    while i ≤ sizeof(Ft) do
      /* Partition Ft into i clusters (based on Dt) */
      Execute PAM for Ft with i clusters and calculate Silhouette_i
      if Silhouette_i > Silhouette_max then
        k ← i; Silhouette_max ← Silhouette_i; store the k clusters (with replacement)
      end
      i ← i + 1
    end
    Fs ← ∅
    /* For each cluster: retain one objective, discard the others */
    for i = 1 to k do
      Fs ← Fs ∪ {an objective in the i-th cluster}
    end
    if Ft = Fs then
      stop ← true
    else
      t ← t + 1; Ft ← Fs; stop ← false
    end
  until stop
end
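The clustering step of Algorithm 2.1 could be prototyped as in the sketch below. It is only an approximation of the method: the KMedoids implementation of scikit-learn-extra stands in for PAM, and a correlation-based distance is used, as in the CORc variant; the thesis's exact distance coefficients and implementation details may differ:

# Hedged prototype of COR's clustering step (Algorithm 2.1), CORc-style: build a distance
# matrix from objective correlations, pick the number of clusters k by the Silhouette index,
# cluster the objectives with a PAM-type algorithm, and keep one objective (medoid) per cluster.
# Assumes scikit-learn and scikit-learn-extra are installed.
import numpy as np
from sklearn.metrics import silhouette_score
from sklearn_extra.cluster import KMedoids   # PAM-family k-medoids

def cor_style_reduction(F: np.ndarray):
    """F: (N x M) objective values of a non-dominated set. Returns kept objective indices."""
    M = F.shape[1]
    D = np.clip(1.0 - np.corrcoef(F, rowvar=False), 0.0, None)   # correlation-based distances
    best_sil, best_model = -np.inf, None
    for k in range(2, M):                                        # try every cluster count
        model = KMedoids(n_clusters=k, metric="precomputed", random_state=0).fit(D)
        sil = silhouette_score(D, model.labels_, metric="precomputed")
        if sil > best_sil:
            best_sil, best_model = sil, model
    return sorted(best_model.medoid_indices_)                    # one objective per cluster

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    f1, f2 = rng.random(200), rng.random(200)
    # five objectives, but only two independent "directions": f1-like and f2-like
    F = np.column_stack([f1, 0.9 * f1, 1.1 * f1, f2, 0.8 * f2])
    print(cor_style_reduction(F))        # expected: one f1-like and one f2-like index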
Chapter 3
The partial PF-based objective reduction algorithms

Chapter 2 utilized multi-/many-algorithms to generate a complete-PF non-dominated solution set for objective reduction. However, to generate a complete PF, multi-algorithms, and especially many-algorithms, have to run for a large number of generations, resulting in an excessive number of function evaluations, which may be impractical, especially if each function evaluation is computationally expensive. Furthermore, even after running for so long, the population may still be far from the true PF, which can make the information extracted for objective reduction inefficient or even meaningless. To overcome these difficulties, PCSEA is used to generate a partial PF. By taking advantage of the PCSEA algorithm, this chapter proposes two objective reduction algorithms (ORAs), namely PCS-LPCA and PCS-Cluster, to remove redundant objectives and keep essential objectives when solving redundant many-problems. PCS-LPCA uses PCSEA to generate a partial PF, then uses linear principal component analysis to analyze the objectives of the solutions obtained by PCSEA; it is published in [J3] and [C1]. PCS-Cluster, which is published in [J3] and [C2], uses PCSEA to generate a partial PF, then uses clustering machine learning algorithms to keep the essential objectives.

3.1 PCS-LPCA objective reduction algorithm

3.1.1 The proposed algorithm

The main purpose of this algorithm is to take advantage of PCSEA and to alleviate the limitations of the PCSEA-based objective reduction algorithm. The proposed algorithm uses PCSEA to generate non-dominated solutions, which are then analyzed by L-PCA to eliminate redundant objectives. The thesis names the proposed algorithm, described in Algorithm 3.1 below, PCS-LPCA.

Algorithm 3.1: PCS-LPCA algorithm
Input: t ← 0; original objective set Ft ← {f1, f2, ..., fM}
Output: reduced objective set Fs
begin
  repeat
    P ← PCSEA(Ft)                  // Get corner solutions corresponding to the remaining objective set
    Pu ← Unique-Nondominated(P)    // Retain the unique non-dominated solutions
    Fs ← L-PCA(Pu)                 // Perform objective reduction using L-PCA
    if Ft = Fs then
      stop ← true
    else
      t ← t + 1; Ft ← Fs; stop ← false
    end
  until stop
end
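The reduction step inside PCS-LPCA relies on the eigenvalue analysis of L-PCA. The sketch below is a heavily simplified illustration of that idea, not the full L-PCA procedure from the literature or from the thesis, which involves additional selection and thresholding rules:

# Heavily simplified sketch of the eigenvalue-analysis idea behind L-PCA (NOT the full
# procedure): eigen-decompose the correlation matrix of the non-dominated set, take the
# leading components explaining most of the variance, and keep, per component, the dominant
# contributor plus its strongest opposite-sign (conflicting) counterpart.
import numpy as np

def simplified_lpca_selection(F: np.ndarray, variance_threshold: float = 0.95):
    """F: (N x M) objective values. Returns a reduced set of objective indices."""
    R = np.corrcoef(F, rowvar=False)                   # M x M correlation matrix
    eigvals, eigvecs = np.linalg.eigh(R)               # ascending eigenvalues
    order = np.argsort(eigvals)[::-1]                  # sort components by eigenvalue, descending
    eigvals, eigvecs = eigvals[order], eigvecs[:, order]
    explained = np.cumsum(eigvals) / eigvals.sum()
    n_comp = int(np.searchsorted(explained, variance_threshold) + 1)
    kept = set()
    for c in range(n_comp):
        v = eigvecs[:, c]
        top = int(np.argmax(np.abs(v)))                # dominant contributor to this component
        kept.add(top)
        opposite = int(np.argmin(v * np.sign(v[top]))) # strongest contributor of opposite sign
        if v[opposite] * v[top] < 0 and abs(v[opposite]) > 0.1 * abs(v[top]):
            kept.add(opposite)                         # keep the conflicting counterpart too
    return sorted(kept)

if __name__ == "__main__":
    rng = np.random.default_rng(2)
    f1, f2 = rng.random(300), rng.random(300)
    F = np.column_stack([f1, 1.0 - f1, f2, 0.5 * f1 + 0.2])   # last objective redundant with f1
    print(simplified_lpca_selection(F))   # typically one of the two f1-clones is dropped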
3.1.2 Experiment

The thesis designs an experiment with a view to comparing the existing algorithms with PCS-LPCA. First, the PCS-LPCA algorithm is compared with four well-known many-algorithms incorporated with L-PCA, so as to know whether PCSEA is better than the others in terms of generating non-dominated solution sets for objective reduction; the four many-algorithms, namely NSGA-III, GrEA, RVEA* and θ-DEA, are adopted to search for non-dominated solution sets. Second, PCS-LPCA is compared with the PCSEA-based objective reduction, in order to know which objective reduction is better when the same PCSEA is used for generating non-dominated solutions. The experiment is carried out on two problems, namely DTLZ5(I,M) and WFG3(M), with 36 instances of the DTLZ5(I,M) problem and 10 instances of the WFG3(M) problem, respectively.

3.1.3 Result

The results are presented in Table 3.12, except for the two last columns. The table reveals that the performance of PCS-LPCA is statistically better than that of the PCSEA-based objective reduction at the 0.05 level. The results also indicate that the performance of PCS-LPCA is better than the best case of L-PCA combined with θ-DEA.

3.2 PCS-Cluster objective reduction algorithm

3.2.1 The proposed algorithm

The proposed method uses PCSEA to generate non-dominated solutions. The objectives in the solution set are considered as objects, which are then clustered to eliminate redundant objectives. Algorithm 3.2 shows the main steps of the proposed algorithm, called the PCS-Cluster algorithm.

Algorithm 3.2: PCS-Cluster algorithm
Input: t ← 0; original objective set Ft ← {f1, f2, ..., fM}
Output: reduced objective set Fs
begin
  repeat
    P ← PCSEA(Ft)                  // Get corner solutions corresponding to the remaining (current) objective set
    /* Partition Ft into k clusters (Ft = C1 ∪ C2 ∪ ... ∪ Ck) */
    {C1, C2, ..., Ck} ← Clustering(Ft(P))
    Fs ← ∅
    /* For each cluster: retain one objective, discard the others */
    for i = 1 to k do
      Fs ← Fs ∪ {an objective in cluster Ci}
    end
    /* Compare the objective sets before and after reduction */
    if Ft = Fs then
      stop ← true
    else
      t ← t + 1; Ft ← Fs; stop ← false
    end
  until stop
end
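The clustering step of Algorithm 3.2 might look as follows with scikit-learn, where each objective is represented by its vector of values over the corner solutions returned by PCSEA; the k-means and DBSCAN parameter values are illustrative choices, not values taken from the thesis:

# Hedged sketch of PCS-Cluster's reduction step: treat each objective as a point (its column
# of values over the PCSEA corner solutions), cluster those points with k-means or DBSCAN,
# and keep one objective per cluster. Parameter values (k, eps, min_samples) are illustrative.
import numpy as np
from sklearn.cluster import KMeans, DBSCAN

def keep_one_per_cluster(F: np.ndarray, labels: np.ndarray):
    """Return one objective index per cluster label (DBSCAN noise points, label -1, are all kept)."""
    kept = [m for m in range(F.shape[1]) if labels[m] == -1]
    for lab in sorted(set(labels) - {-1}):
        kept.append(int(np.flatnonzero(labels == lab)[0]))   # first member of the cluster
    return sorted(kept)

def pcs_cluster_step(F: np.ndarray, method: str = "dbscan", k: int = 2):
    X = F.T                                                  # one row per objective
    X = (X - X.mean(axis=1, keepdims=True)) / (X.std(axis=1, keepdims=True) + 1e-12)
    if method == "kmeans":
        labels = KMeans(n_clusters=k, n_init=10, random_state=0).fit_predict(X)
    else:
        labels = DBSCAN(eps=0.5 * np.sqrt(X.shape[1]), min_samples=1).fit_predict(X)
    return keep_one_per_cluster(F, labels)

if __name__ == "__main__":
    rng = np.random.default_rng(3)
    f1, f2 = rng.random(50), rng.random(50)                  # stand-in for corner solutions
    F = np.column_stack([f1, 1.2 * f1, f2])                  # objective 1 duplicates objective 0
    print(pcs_cluster_step(F, method="kmeans", k=2))         # e.g. [0, 2]
    print(pcs_cluster_step(F, method="dbscan"))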
3.2.2 Experiment

The thesis designs an experiment to compare PCS-Cluster with the algorithms that were mentioned and compared in Section 3.1, in solving the two redundant problems DTLZ5(I,M) and WFG3(M). Two clustering algorithms, k-means and DBSCAN, are used in the PCS-Cluster algorithm; hence, PCS-Cluster(k-means) and PCS-Cluster(DBSCAN) are both included in the comparison.

3.2.3 Result

The results are shown in the two last columns of Table 3.12. According to the results, it can be concluded that the two algorithms PCS-Cluster(DBSCAN) and PCS-LPCA are the best, with no statistically significant difference between them.

Table 3.12: Comparison of the number of successes in finding the correct relevant objective set over 30 runs of PCS-Cluster (k-means and DBSCAN) with the PCSEA-based ORA, the many-algorithms (GrEA, NSGA-III, RVEA*, θ-DEA) combined with L-PCA, and PCS-LPCA, on DTLZ5(I,M) and WFG3(M) instances.

Summary

This chapter proposed two objective reduction algorithms, PCS-LPCA and PCS-Cluster. In the first step, they use PCSEA to generate a non-dominated solution set located only in the "corners" of the PF. In the second step, PCS-LPCA uses L-PCA for the reduction, while PCS-Cluster uses a clustering algorithm, k-means or DBSCAN, to cluster the objectives of the non-dominated solution set. The two steps are repeated until the algorithms cannot reduce any more objectives. Experiments show that the PCS-LPCA algorithm and PCS-Cluster (using DBSCAN) give better results than the others.

Conclusion and future works

In this thesis, many-problems which have redundant objectives have been addressed. To that end, approaches for solving many-problems have been investigated and analyzed, the impacts of multi-/many-algorithms on objective reduction have been pinpointed, and three ORAs have been proposed.

The thesis has given an overview of multi-problems together with common approaches to solving them, with a focus on multi-algorithms. It has also discussed many-problems and the difficulties that multi-algorithms have to face. Many many-algorithms have appeared in the hope of eliminating or reducing these difficulties; however, in several many-problems it still remains unknown whether all objectives are essential or not. An approach to alleviating the difficulties is to determine which objectives of a problem are redundant and then discard them; this approach is called objective reduction.

This thesis has investigated the interaction between objective dimensionality reduction (ODR) and many-algorithms. Experiments were designed to evaluate the impacts of many-algorithms on the performance of an ODR, and then to examine the benefits that many-algorithms can achieve when combined with an ODR to remove redundant objectives. The results unveiled that the performance of an ODR strongly depends on the multi-/many-algorithms which generate the non-dominated solutions. By combining with many-algorithms, an objective reduction can successfully remove redundant objectives even when the number of original objectives gets larger.
The results also demonstrated that combining with an ODR to remove redundant objectives can significantly improve the performance of many-algorithms. This thesis focused on L-PCA for performing objective reduction.

Three ORAs have been proposed to deal with redundant many-problems. These algorithms use multi-/many-algorithms to generate PFs, then use machine learning algorithms to perform the objective reduction. (1) The COR algorithm utilizes multi-/many-algorithms to generate a complete PF, then uses the PAM algorithm to determine the redundant and essential objectives. It considers objectives as objects (points) in space and partitions them into clusters; the number of clusters is determined using the Silhouette index, and one objective is kept in each cluster while the others are discarded. The proposed algorithm has been compared with two existing ones on different instances of the DTLZ5(I,M) problem; the results showed that the proposed algorithm is equivalent to or more efficient than the existing ones. (2) The PCS-LPCA algorithm uses PCSEA to generate a partial PF, then utilizes L-PCA to identify the relevant and redundant objectives. The experiments proved that the proposed algorithm is better than the original ones. The thesis also examined the effect of the threshold values of the PCSEA-based objective reduction; it indicated that the PCSEA-based objective reduction is very efficient in solving redundant problems with a small number of relevant objectives, but becomes inefficient as this number grows larger. (3) The PCS-Cluster algorithm uses PCSEA to generate a partial PF and clustering algorithms to identify the relevant and redundant objectives. It takes advantage of PCSEA and of the simplicity of the clustering algorithms k-means and DBSCAN. Experiments on a wider range of instances of the DTLZ5(I,M) problem have been designed and run 30 times independently. The results proved that the proposed algorithm is better than the original ones.

Publications

[J1] H.X. Nguyen, L.T. Bui, and C.T. Tran. "Improving many objective optimisation algorithms using objective dimensionality reduction". In: Evolutionary Intelligence, ISSN 1864-5909, Springer, 2020, 13.3, pp. 365-380.
[J2] H.X. Nguyen, L.T. Bui, and C.T. Tran. "An improvement of clustering-based objective reduction method for many-objective optimization problems". In: Journal of Science and Technique - Le Quy Don Technical University, ISSN 1859-0209, No. 202 (10-2019), Section on Information and Communication Technology (ICT), No. 14, 2019.
[J3] H.X. Nguyen, C.T. Tran, and L.T. Bui. "Improve performance of pareto corner search-based objective reduction in many-objective optimization". In: Evolutionary Intelligence, ISSN 1864-5909, Springer, Oct. 2022 (accepted). doi: 10.1007/s12065-022-00787-y.
[C1] H.X. Nguyen, L.T. Bui, and C.T. Tran. "A Pareto Corner Search Evolutionary Algorithm and Principal Component Analysis for Objective Dimensionality Reduction". In: 2019 11th International Conference on Knowledge and Systems Engineering (KSE'19), IEEE, Da Nang, Vietnam, Oct. 2019.
[C2] H.X. Nguyen, L.T. Bui, and C.T. Tran. "Clustering Method Using Pareto Corner Search Evolutionary Algorithm for Objective Reduction in Many-Objective Optimization Problems". In: Proceedings of the Tenth International Symposium on Information and Communication Technology (SoICT 2019), Hanoi and Ha Long Bay, Viet Nam, ACM, 2019, pp. 78-84. ISBN 978-1-4503-7245-9.
