At present, salient object detection (SOD) has achieved considerable progress. However, the methods that perform well still face the issue of inadequate detection accuracy. For example, sometimes there are problems of missed and false detections. Effectively optimizing features to capture key information and better integrating different levels of features to enhance their complementarity are two significant challenges in the domain of SOD. In response to these challenges, this study proposes a novel SOD method based on multi-strategy feature optimization. We propose the multi-size feature extraction module (MSFEM), which uses the attention mechanism, the multi-level feature fusion, and the residual block to obtain finer features. This module provides robust support for the subsequent accurate detection of the salient object. In addition, we use two rounds of feature fusion and the feedback mechanism to optimize the features obtained by the MSFEM to improve detection accuracy. The first round of feature fusion is applied to integrate the features extracted by the MSFEM to obtain more refined features. Subsequently, the feedback mechanism and the second round of feature fusion are applied to refine the features, thereby providing a stronger foundation for accurately detecting salient objects. To improve the fusion effect, we propose the feature enhancement module (FEM) and the feature optimization module (FOM). The FEM integrates the upper and lower features with the optimized features obtained by the FOM to enhance feature complementarity. The FOM uses different receptive fields, the attention mechanism, and the residual block to more effectively capture key information. Experimental results demonstrate that our method outperforms 10 state-of-the-art SOD methods.
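As a loose illustration of the ingredients the FOM combines (parallel branches with different receptive fields, an attention step, and a residual connection), the PyTorch sketch below shows one plausible arrangement; the layer sizes, layout, and names are assumptions and do not reproduce the module described in the paper.

```python
import torch
import torch.nn as nn

class FOMSketch(nn.Module):
    def __init__(self, channels=64):
        super().__init__()
        self.branch3 = nn.Conv2d(channels, channels, 3, padding=1)   # smaller receptive field
        self.branch5 = nn.Conv2d(channels, channels, 5, padding=2)   # larger receptive field
        self.attn = nn.Sequential(                                   # simple channel attention
            nn.AdaptiveAvgPool2d(1), nn.Conv2d(channels, channels, 1), nn.Sigmoid()
        )
        self.fuse = nn.Conv2d(2 * channels, channels, 1)             # merge the two branches

    def forward(self, x):
        merged = self.fuse(torch.cat([self.branch3(x), self.branch5(x)], dim=1))
        return x + merged * self.attn(merged)                        # attention-weighted residual update

print(FOMSketch()(torch.randn(1, 64, 32, 32)).shape)
```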
In recent years, particle swarm optimization (PSO) has received widespread attention in feature selection due to its simplicity and potential for global search. However, in traditional PSO, particles primarily update based on two extreme values: personal best and global best, which limits the diversity of information. Ideally, particles should learn from multiple advantageous particles to enhance interactivity and optimization efficiency. Accordingly, this paper proposes a PSO that simulates the evolutionary dynamics of species survival in mountain peak ecology (PEPSO) for feature selection. Based on the pyramid topology, the algorithm simulates the features of mountain peak ecology in nature and the competitive-cooperative strategies among species. According to the principles of the algorithm, the population is first adaptively divided into many subgroups based on the fitness level of particles. Then, particles within each subgroup are divided into three different types based on their evolutionary levels, employing different adaptive inertia weight rules and dynamic learning mechanisms to define distinct learning modes. Consequently, all particles play their respective roles in promoting the global optimization performance of the algorithm, similar to different species in the ecological pattern of mountain peaks. Experimental validation of the PEPSO performance was conducted on 18 public datasets. The experimental results demonstrate that the PEPSO outperforms other PSO variant-based feature selection methods and mainstream feature selection methods based on intelligent optimization algorithms in terms of overall performance in global search capability, classification accuracy, and reduction of feature space dimensions. The Wilcoxon signed-rank test also confirms the excellent performance of the PEPSO.
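For orientation, the following is a minimal sketch of the conventional binary-PSO feature-selection loop that PEPSO departs from: every particle learns only from its personal best and the global best, the limited-information scheme criticized above. The dataset, coefficients, and wrapper fitness are illustrative assumptions, not the paper's setup.

```python
import numpy as np
from sklearn.datasets import load_breast_cancer
from sklearn.neighbors import KNeighborsClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
X, y = load_breast_cancer(return_X_y=True)
n_particles, n_features, n_iters = 12, X.shape[1], 10

def fitness(mask):
    # Wrapper fitness: cross-validated KNN accuracy on the selected columns.
    if mask.sum() == 0:
        return 0.0
    return cross_val_score(KNeighborsClassifier(), X[:, mask.astype(bool)], y, cv=3).mean()

pos = rng.random((n_particles, n_features))            # continuous positions, thresholded at 0.5
vel = np.zeros_like(pos)
pbest, pbest_fit = pos.copy(), np.array([fitness(p > 0.5) for p in pos])
gbest = pbest[pbest_fit.argmax()].copy()

for _ in range(n_iters):
    r1, r2 = rng.random(pos.shape), rng.random(pos.shape)
    # Classic update: information comes only from the personal best and the global best.
    vel = 0.7 * vel + 1.5 * r1 * (pbest - pos) + 1.5 * r2 * (gbest - pos)
    pos = np.clip(pos + vel, 0, 1)
    fit = np.array([fitness(p > 0.5) for p in pos])
    improved = fit > pbest_fit
    pbest[improved], pbest_fit[improved] = pos[improved], fit[improved]
    gbest = pbest[pbest_fit.argmax()].copy()

print("selected features:", int((gbest > 0.5).sum()), "accuracy:", pbest_fit.max().round(3))
```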
In classification problems, datasets often contain a large number of features, but not all of them are relevant for accurate classification. In fact, irrelevant features may even hinder classification accuracy. Feature selection aims to alleviate this issue by minimizing the number of features in the subset while simultaneously minimizing the classification error rate. Single-objective optimization approaches employ an evaluation function designed as an aggregate function with a parameter, but the results obtained depend on the value of the parameter. To eliminate this parameter's influence, the problem can be reformulated as a multi-objective optimization problem. The Whale Optimization Algorithm (WOA) is widely used in optimization problems because of its simplicity and easy implementation. In this paper, we propose a multi-strategy assisted multi-objective WOA (MSMOWOA) to address feature selection. To enhance the algorithm's search ability, we integrate multiple strategies such as Levy flight, Grey Wolf Optimizer, and adaptive mutation into it. Additionally, we utilize an external repository to store non-dominated solution sets, and grid technology is used to maintain diversity. Results on fourteen University of California Irvine (UCI) datasets demonstrate that our proposed method effectively removes redundant features and improves classification performance. The source code can be accessed from the website: https://github.com/zc0315/MSMOWOA.
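A small sketch of the two objectives (subset size and error rate) and the dominance test behind the external repository may help; it is illustrative only and does not reproduce the MSMOWOA search operators, grid maintenance, or the strategies listed above.

```python
import numpy as np
from sklearn.datasets import load_wine
from sklearn.neighbors import KNeighborsClassifier
from sklearn.model_selection import cross_val_score

X, y = load_wine(return_X_y=True)

def objectives(mask):
    """Return (feature ratio, classification error) for a binary feature mask."""
    if mask.sum() == 0:
        return (1.0, 1.0)
    err = 1.0 - cross_val_score(KNeighborsClassifier(), X[:, mask.astype(bool)], y, cv=3).mean()
    return (mask.sum() / mask.size, err)

def dominates(a, b):
    """True if objective vector a Pareto-dominates b."""
    return all(x <= z for x, z in zip(a, b)) and any(x < z for x, z in zip(a, b))

def update_archive(archive, cand_mask):
    """External repository of non-dominated (objectives, mask) pairs."""
    cand_obj = objectives(cand_mask)
    if any(dominates(obj, cand_obj) for obj, _ in archive):
        return archive                                   # candidate is dominated: discard it
    archive = [(obj, m) for obj, m in archive if not dominates(cand_obj, obj)]
    archive.append((cand_obj, cand_mask))
    return archive

rng = np.random.default_rng(0)
archive = []
for _ in range(20):                                      # stand-in for whale position updates
    archive = update_archive(archive, rng.integers(0, 2, X.shape[1]))
print("non-dominated solutions kept:", len(archive))
```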
The selection of important factors in machine learning-based susceptibility assessments is crucial to obtain reliable susceptibility results. In this study, metaheuristic optimization and feature selection techniques were applied to identify the most important input parameters for mapping debris flow susceptibility in the southern mountain area of Chengde City in Hebei Province, China, by using machine learning algorithms. In total, 133 historical debris flow records and 16 related factors were selected. The support vector machine (SVM) was first used as the base classifier, and then a hybrid model was introduced by a two-step process. First, the particle swarm optimization (PSO) algorithm was employed to select the SVM model hyperparameters. Second, two feature selection algorithms, namely principal component analysis (PCA) and PSO, were integrated into the PSO-based SVM model, which generated the PCA-PSO-SVM and FS-PSO-SVM models, respectively. Three statistical metrics (accuracy, recall, and specificity) and the area under the receiver operating characteristic curve (AUC) were employed to evaluate and validate the performance of the models. The results indicated that the feature selection-based models exhibited the best performance, followed by the PSO-based SVM and SVM models. Moreover, the performance of the FS-PSO-SVM model was better than that of the PCA-PSO-SVM model, showing the highest AUC, accuracy, recall, and specificity values in both the training and testing processes. It was found that the selection of optimal features is crucial to improving the reliability of debris flow susceptibility assessment results. Moreover, the PSO algorithm was found to be not only an effective tool for hyperparameter optimization, but also a useful feature selection algorithm to improve prediction accuracies of debris flow susceptibility by using machine learning algorithms. The high and very high debris flow susceptibility zones cover approximately 38.01% of the study area, where debris flows may occur under intensive human activities and heavy rainfall events.
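A simplified illustration of the two hybrid pipelines being compared, dimensionality reduction by PCA versus explicit feature selection ahead of an SVM, is sketched below; the synthetic data, the fixed hyperparameters, and the use of a univariate filter in place of the PSO feature-selection step are all assumptions.

```python
from sklearn.datasets import make_classification
from sklearn.decomposition import PCA
from sklearn.feature_selection import SelectKBest, f_classif
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

# Stand-in for the susceptibility data: 16 conditioning factors, binary debris-flow label.
X, y = make_classification(n_samples=266, n_features=16, random_state=0)

pca_svm = make_pipeline(StandardScaler(), PCA(n_components=8), SVC(C=10, gamma=0.1))
fs_svm = make_pipeline(StandardScaler(), SelectKBest(f_classif, k=8), SVC(C=10, gamma=0.1))

for name, model in [("PCA-SVM", pca_svm), ("FS-SVM", fs_svm)]:
    auc = cross_val_score(model, X, y, cv=5, scoring="roc_auc").mean()
    print(f"{name}: mean AUC = {auc:.3f}")
```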
The diversity of data sources has created a need for effective data manipulation and dissemination. The challenge that arises from increasing dimensionality has a negative effect on computational performance, efficiency, and stability. One of the most successful optimization algorithms is Particle Swarm Optimization (PSO), which has proved its effectiveness in exploring the most influential features in the search space thanks to its fast convergence and its ability to use a small set of parameters in the search task. This research proposes an effective enhancement of PSO that tackles the challenge of randomness in the search, which directly improves PSO performance. In addition, this research proposes a generic intelligent framework for the early prediction of order delays and the elimination of order backlogs, which could be considered an efficient potential solution for raising supply chain performance. The proposed adapted algorithm was applied to a supply chain dataset and reduced the feature set from twenty-one features to ten significant features. To confirm the proposed algorithm's results, the updated data were examined by eight well-known classification algorithms, which reached a minimum accuracy of 94.3% for random forest and a maximum of 99.0% for Naïve Bayes. Moreover, the proposed algorithm adaptation was compared with other PSO adaptations from the literature over different datasets. The proposed PSO adaptation reached a higher accuracy than the literature, ranging from 97.8% to 99.36%, which also demonstrates the advancement of the current research.
Feature Selection (FS) is a key pre-processing step in pattern recognition and data mining tasks, which can effectively avoid the impact of irrelevant and redundant features on the performance of classification models. In recent years, meta-heuristic algorithms have been widely used in FS problems, so a Hybrid Binary Chaotic Salp Swarm Dung Beetle Optimization (HBCSSDBO) algorithm is proposed in this paper to improve the effect of FS. In this hybrid algorithm, the original continuous optimization algorithm is converted into binary form by an S-type transfer function and applied to the FS problem. By combining the K-nearest neighbor (KNN) classifier, comparative FS experiments are carried out between the proposed method and four advanced meta-heuristic algorithms on 16 UCI (University of California, Irvine) datasets. Seven evaluation metrics, such as average fitness, average prediction accuracy, and average running time, are chosen to judge and compare the algorithms. The selected datasets are also discussed by categorizing them into three groups: high, medium, and low dimensionality. Experimental results show that the HBCSSDBO feature selection method is able to obtain a good subset of features while maintaining high classification accuracy, and shows better optimization performance. In addition, the results of statistical tests confirm the significant validity of the method.
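The two building blocks named above, an S-type transfer function that turns a continuous position into a binary feature mask and a KNN-based fitness that balances error against subset size, can be sketched as follows; the weights and dataset are illustrative assumptions.

```python
import numpy as np
from sklearn.datasets import load_wine
from sklearn.neighbors import KNeighborsClassifier
from sklearn.model_selection import cross_val_score

def s_transfer(position, rng):
    prob = 1.0 / (1.0 + np.exp(-position))              # S-type (sigmoid) transfer function
    return (rng.random(position.shape) < prob).astype(int)

def fitness(mask, X, y, alpha=0.99):
    # Common FS fitness form: weighted sum of error rate and feature ratio (lower is better).
    if mask.sum() == 0:
        return 1.0
    err = 1.0 - cross_val_score(KNeighborsClassifier(n_neighbors=5),
                                X[:, mask.astype(bool)], y, cv=5).mean()
    return alpha * err + (1 - alpha) * mask.sum() / mask.size

rng = np.random.default_rng(0)
X, y = load_wine(return_X_y=True)
mask = s_transfer(rng.normal(size=X.shape[1]), rng)
print("subset size:", mask.sum(), "fitness:", round(fitness(mask, X, y), 4))
```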
In the realm of data privacy protection, federated learning aims to collaboratively train a global model. However, heterogeneous data between clients presents challenges, often resulting in slow convergence and inadequate accuracy of the global model. Utilizing shared feature representations alongside customized classifiers for individual clients emerges as a promising personalized solution. Nonetheless, previous research has frequently neglected the integration of global knowledge into local representation learning and the synergy between global and local classifiers, thereby limiting model performance. To tackle these issues, this study proposes a hierarchical optimization method for federated learning with feature alignment and the fusion of classification decisions (FedFCD). FedFCD regularizes the relationship between global and local feature representations to achieve alignment and incorporates decision information from the global classifier, facilitating the late fusion of decision outputs from both global and local classifiers. Additionally, FedFCD employs a hierarchical optimization strategy to flexibly optimize model parameters. Through experiments on the Fashion-MNIST, CIFAR-10 and CIFAR-100 datasets, we demonstrate the effectiveness and superiority of FedFCD. For instance, on the CIFAR-100 dataset, FedFCD exhibited a significant improvement in average test accuracy by 6.83% compared to four outstanding personalized federated learning approaches. Furthermore, extended experiments confirm the robustness of FedFCD across various hyperparameter values.
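A rough PyTorch sketch of the two ideas described above, an alignment penalty between local and global feature representations and late fusion of global and local classifier outputs, is given below; the model sizes, loss weight, and fusion rule are assumptions rather than the FedFCD implementation.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

feat_dim, n_classes = 64, 10
local_encoder = nn.Sequential(nn.Linear(784, feat_dim), nn.ReLU())
global_encoder = nn.Sequential(nn.Linear(784, feat_dim), nn.ReLU())   # frozen copy from the server
local_head, global_head = nn.Linear(feat_dim, n_classes), nn.Linear(feat_dim, n_classes)

x, y = torch.randn(32, 784), torch.randint(0, n_classes, (32,))       # one local mini-batch

z_local = local_encoder(x)
with torch.no_grad():
    z_global = global_encoder(x)

align_loss = F.mse_loss(z_local, z_global)                  # pull local features toward global ones
logits = 0.5 * local_head(z_local) + 0.5 * global_head(z_local)   # late fusion of decision outputs
loss = F.cross_entropy(logits, y) + 0.1 * align_loss        # 0.1 is an assumed trade-off weight
loss.backward()
print("local training loss:", float(loss))
```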
Conventional machine learning (CML) methods have been successfully applied to gas reservoir prediction. Their prediction accuracy largely depends on the quality of the sample data; therefore, feature optimization of the input samples is particularly important. Commonly used feature optimization methods increase the interpretability of gas reservoirs; however, their steps are cumbersome, and the selected features cannot sufficiently guide CML models to mine the intrinsic features of sample data efficiently. In contrast to CML methods, deep learning (DL) methods can directly extract the important features of targets from raw data. Therefore, this study proposes a feature optimization and gas-bearing prediction method based on a hybrid fusion model that combines a convolutional neural network (CNN) and an adaptive particle swarm optimization-least squares support vector machine (APSO-LSSVM). This model adopts an end-to-end algorithm structure to directly extract features from sensitive multicomponent seismic attributes, considerably simplifying the feature optimization. A CNN was used for feature optimization to highlight sensitive gas reservoir information. APSO-LSSVM was used to fully learn the relationship between the features extracted by the CNN to obtain the prediction results. The constructed hybrid fusion model improves gas-bearing prediction accuracy through the two processes of feature optimization and intelligent prediction, exploiting the respective advantages of DL and CML methods. The prediction results obtained are better than those of a single CNN model or APSO-LSSVM model. In the feature optimization process for multicomponent seismic attribute data, the CNN demonstrated better gas reservoir feature extraction capabilities than commonly used attribute optimization methods. In the prediction process, the APSO-LSSVM model can learn the gas reservoir characteristics better than the LSSVM model and has a higher prediction accuracy. The constructed CNN-APSO-LSSVM model had lower errors and a better fit on the test dataset than the other individual models. This method proves the effectiveness of DL technology for the feature extraction of gas reservoirs and provides a feasible way to combine DL and CML technologies to predict gas reservoirs.
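The end-to-end structure can be pictured with the schematic below, in which a small 1-D CNN extracts features from multicomponent attribute traces and a kernel model consumes them; an RBF SVM stands in for the APSO-tuned LSSVM, and the data are random placeholders.

```python
import torch
import torch.nn as nn
from sklearn.svm import SVC

n_samples, n_attrs, trace_len = 200, 4, 64
traces = torch.randn(n_samples, n_attrs, trace_len)           # multicomponent attribute windows
labels = torch.randint(0, 2, (n_samples,))                     # gas-bearing / non-gas-bearing

cnn = nn.Sequential(
    nn.Conv1d(n_attrs, 8, kernel_size=5, padding=2), nn.ReLU(),
    nn.AdaptiveAvgPool1d(1), nn.Flatten(),
)
with torch.no_grad():
    feats = cnn(traces).numpy()                                # CNN-optimized features

clf = SVC(C=1.0, gamma="scale").fit(feats, labels.numpy())     # stand-in for APSO-LSSVM
print("training accuracy:", clf.score(feats, labels.numpy()))
```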
The world produces vast quantities of high-dimensional multi-semantic data. However, extracting valuable information from such a large amount of high-dimensional and multi-label data is undoubtedly arduous and challenging. Feature selection aims to mitigate the adverse impacts of high dimensionality in multi-label data by eliminating redundant and irrelevant features. The ant colony optimization algorithm has demonstrated encouraging outcomes in multi-label feature selection because of its simplicity, efficiency, and similarity to reinforcement learning. Nevertheless, existing methods do not consider crucial correlation information, such as dynamic redundancy and label correlation. To tackle these concerns, this paper proposes a multi-label feature selection technique based on the ant colony optimization algorithm (MFACO), focusing on dynamic redundancy and label correlation. Initially, the dynamic redundancy is assessed between the selected feature subset and potential features. Meanwhile, the ant colony optimization algorithm extracts label correlation from the label set, which is then combined into the heuristic factor as label weights. Experimental results demonstrate that our proposed strategies can effectively enhance the optimal search ability of the ant colony, outperforming the other algorithms involved in the paper.
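One plausible form of a heuristic factor that folds in label weights and dynamic redundancy is sketched below; the exact weighting used by MFACO is not reproduced, and the data and weights are illustrative.

```python
import numpy as np
from sklearn.feature_selection import mutual_info_classif

def heuristic(X, Y, selected, candidate, label_weights):
    """Label-weighted relevance of a candidate feature, discounted by its
    redundancy with the already selected subset (dynamic redundancy)."""
    relevance = sum(
        w * mutual_info_classif(X[:, [candidate]], Y[:, j], random_state=0)[0]
        for j, w in enumerate(label_weights)
    )
    if not selected:
        return relevance
    redundancy = np.mean([abs(np.corrcoef(X[:, candidate], X[:, s])[0, 1]) for s in selected])
    return relevance / (1.0 + redundancy)

rng = np.random.default_rng(0)
X = rng.normal(size=(150, 10))
Y = (X[:, :3] + 0.5 * rng.normal(size=(150, 3)) > 0).astype(int)    # three correlated labels
w = Y.mean(axis=0) / Y.mean(axis=0).sum()                            # toy label weights
print(round(heuristic(X, Y, selected=[0], candidate=1, label_weights=w), 4))
```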
High-dimensional datasets present significant challenges for classification tasks. Dimensionality reduction, a crucial aspect of data preprocessing, has gained substantial attention due to its ability to improve classification performance. However, identifying the optimal features within high-dimensional datasets remains a computationally demanding task, necessitating the use of efficient algorithms. This paper introduces the Arithmetic Optimization Algorithm (AOA), a novel approach for finding the optimal feature subset. AOA is specifically modified to address feature selection problems based on a transfer function. Additionally, two enhancements are incorporated into the AOA algorithm to overcome limitations such as limited precision, slow convergence, and susceptibility to local optima. The first enhancement proposes a new method for selecting solutions to be improved during the search process. This method effectively improves the original algorithm's accuracy and convergence speed. The second enhancement introduces a local search with neighborhood strategies (AOA_NBH) during the AOA exploitation phase. AOA_NBH explores the vast search space, aiding the algorithm in escaping local optima. Our results demonstrate that incorporating neighborhood methods enhances the output and achieves significant improvement over state-of-the-art methods.
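The neighborhood-style local search used during exploitation can be sketched as single bit flips around the current best mask, keeping any improvement; the fitness, dataset, and neighborhood size below are placeholders, not the AOA_NBH implementation.

```python
import numpy as np
from sklearn.datasets import load_wine
from sklearn.neighbors import KNeighborsClassifier
from sklearn.model_selection import cross_val_score

X, y = load_wine(return_X_y=True)

def fitness(mask):
    if mask.sum() == 0:
        return 0.0
    return cross_val_score(KNeighborsClassifier(), X[:, mask.astype(bool)], y, cv=3).mean()

def neighborhood_search(mask, n_neighbors=5, seed=0):
    rng = np.random.default_rng(seed)
    best, best_fit = mask.copy(), fitness(mask)
    for idx in rng.choice(mask.size, size=n_neighbors, replace=False):
        cand = best.copy()
        cand[idx] ^= 1                      # flip one feature in or out of the subset
        f = fitness(cand)
        if f > best_fit:                    # accept improving neighbors only
            best, best_fit = cand, f
    return best, best_fit

mask0 = np.random.default_rng(1).integers(0, 2, X.shape[1])
print("accuracy after local search:", round(neighborhood_search(mask0)[1], 3))
```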
Feature selection (FS) is a data preprocessing step in machine learning (ML) that selects a subset of relevant and informative features from a large feature pool. FS helps ML models improve their predictive accuracy at lower computational costs. Moreover, FS can handle the model overfitting problem on a high-dimensional dataset. A major problem with the filter and wrapper FS methods is that they consume a significant amount of time during FS on high-dimensional datasets. The proposed “HDFS(PSO-MI): hybrid distributed feature selection using particle swarm optimization-mutual information (PSO-MI)” is a PSO-based hybrid method that can overcome the problem mentioned above. This method hybridizes the filter and wrapper techniques in a distributed manner. A new combiner is also introduced to merge the effective features selected from multiple data distributions. The effectiveness of the proposed HDFS(PSO-MI) method is evaluated using five ML classifiers, i.e., logistic regression (LR), k-NN, support vector machine (SVM), decision tree (DT), and random forest (RF), on various datasets in terms of accuracy and Matthew's correlation coefficient (MCC). From the experimental analysis, we observed that the HDFS(PSO-MI) method yielded more than 98%, 95%, 92%, 90%, and 85% accuracy for the unbalanced, kidney disease, emotions, wafer manufacturing, and breast cancer datasets, respectively. Our method shows promising results compared to other methods, such as mutual information, gain ratio, Spearman correlation, analysis of variance (ANOVA), Pearson correlation, and an ensemble feature selection with ranking method (EFSRank).
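The distributed filter stage and the combiner idea can be sketched as each data partition ranking features by mutual information and the combiner keeping features selected in most partitions; the thresholds and partitioning below are assumptions, and the PSO wrapper stage is omitted.

```python
import numpy as np
from sklearn.datasets import load_breast_cancer
from sklearn.feature_selection import mutual_info_classif

X, y = load_breast_cancer(return_X_y=True)
partitions = np.array_split(np.random.default_rng(0).permutation(len(y)), 3)

votes = np.zeros(X.shape[1], dtype=int)
for idx in partitions:
    mi = mutual_info_classif(X[idx], y[idx], random_state=0)
    top = np.argsort(mi)[-10:]               # filter step: top-10 features in this partition
    votes[top] += 1

merged = np.flatnonzero(votes >= 2)           # combiner: kept if selected in most partitions
print("features surviving the combiner:", merged)
```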
Additive manufacturing (AM) has made significant progress in recent years and has been successfully applied in various fields owing to its ability to manufacture complex geometries. This method efficiently expands the design space, allowing for the creation of products with better performance than ever before. With the emergence of new manufacturing technologies, new design methods are required to efficiently utilize the expanded design space. Therefore, topology optimization methods have attracted the attention of researchers because of their ability to generate new and optimized designs without requiring prior experience. The combination of AM and topology optimization has proven to be a powerful tool for structural innovation in design and manufacturing. However, it is important to note that AM does not eliminate all manufacturing restrictions but instead replaces them with a different set of design considerations that designers must account for in the successful implementation of these technologies. This has motivated research on topology optimization methods that incorporate manufacturability constraints for AM structures. In this paper, we present a survey of the latest studies in this research area, with a particular focus on developments in China. Additionally, we discuss the existing research gaps and future development trends.
[Objective] The aim was to study the feature extraction of stored-grain insects based on ant colony optimization and support vector machine algorithm, and to explore the feasibility of the feature extraction of stored-grain insects. [Method] Through the analysis of feature extraction in the image recognition of the stored-grain insects, the recognition accuracy of the cross-validation training model in support vector machine (SVM) algorithm was taken as an important factor of the evaluation principle of feature extraction of stored-grain insects. The ant colony optimization (ACO) algorithm was applied to the automatic feature extraction of stored-grain insects. [Result] The algorithm extracted the optimal feature subspace of seven features from the 17 morphological features, including area and perimeter. The ninety image samples of the stored-grain insects were automatically recognized by the optimized SVM classifier, and the recognition accuracy was over 95%. [Conclusion] The experiment shows that the application of ant colony optimization to the feature extraction of grain insects is practical and feasible.
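The evaluation principle stated above, cross-validated SVM accuracy of a candidate feature subset as the fitness the ACO search maximizes, can be sketched as follows; the insect images and 17 morphological features are replaced by synthetic data.

```python
from sklearn.datasets import make_classification
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVC

# Stand-in for 90 insect samples described by 17 morphological features (area, perimeter, ...).
X, y = make_classification(n_samples=90, n_features=17, n_informative=7, random_state=0)

def subset_fitness(feature_idx):
    # Cross-validated SVM accuracy on the chosen morphological features.
    return cross_val_score(SVC(kernel="rbf", gamma="scale"), X[:, feature_idx], y, cv=5).mean()

candidate = [0, 2, 3, 7, 9, 12, 16]            # a 7-feature subset an ant might construct
print("cross-validated accuracy:", round(subset_fitness(candidate), 3))
```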
The suddenness, uncertainty, and randomness of rockbursts directly affect the safety of tunnel construction. The prediction of rockbursts is a fundamental aspect of mitigating or even eliminating rockburst hazards. To address the shortcomings of the current rockburst prediction models, which have a limited number of samples and rely on manual test results as the majority of their input features, this paper proposes rockburst prediction models based on multi-featured drilling parameters of rock drilling jumbo. Firstly, four original drilling parameters, namely hammer pressure (Ph), feed pressure (Pf), rotation pressure (Pr), and feed speed (VP), together with the rockburst grades, were collected from 1093 rockburst cases. Then, a feature expansion investigation was performed based on the four original drilling parameters to establish a drilling parameter feature system and a rockburst prediction database containing 42 features. Furthermore, rockburst prediction models based on multi-featured drilling parameters were developed using the extreme tree (ET) algorithm and Bayesian optimization. The models take drilling parameters as input parameters and rockburst grades as output parameters. The effects of Bayesian optimization and the number of drilling parameter features on the model performance were analyzed using the accuracy, precision, recall and F1 value of the prediction set as the model performance evaluation indices. The results show that the Bayesian optimized model with 42 drilling parameter features as inputs performs best, with an accuracy of 91.89%. Finally, the reliability of the models was validated through field tests.
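The feature-expansion idea and the extreme-tree stage can be pictured with the sketch below, in which the four raw drilling parameters are combined into derived ratios and interactions before training; the expansion rules and synthetic data are assumptions, and the Bayesian hyperparameter search is omitted.

```python
import numpy as np
from sklearn.ensemble import ExtraTreesClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
n = 1093
Ph, Pf, Pr, Vp = (rng.uniform(1, 10, n) for _ in range(4))    # hammer/feed/rotation pressure, feed speed
grade = rng.integers(0, 4, n)                                  # rockburst grade (placeholder labels)

base = np.column_stack([Ph, Pf, Pr, Vp])
expanded = np.column_stack([
    base,
    Ph / Pf, Ph / Pr, Pf / Pr,            # pressure ratios (assumed expansion rules)
    Vp * Ph, Vp * Pf, Vp * Pr,            # speed-pressure interactions
])

et = ExtraTreesClassifier(n_estimators=200, random_state=0)
print("CV accuracy with expanded features:",
      cross_val_score(et, expanded, grade, cv=5).mean().round(3))
```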
Speech emotion recognition (SER) uses acoustic analysis to find features for emotion recognition and examines variations in voice that are caused by emotions. The number of features acquired with acoustic analysis is extremely high, so we introduce a hybrid filter-wrapper feature selection algorithm based on an improved equilibrium optimizer for constructing an emotion recognition system. The proposed algorithm implements multi-objective emotion recognition with the minimum number of selected features and maximum accuracy. First, we use the information gain and Fisher Score to sort the features extracted from signals. Then, we employ a multi-objective ranking method to evaluate these features and assign different importance to them. Features with high rankings have a large probability of being selected. Finally, we propose a repair strategy to address the problem of duplicate solutions in multi-objective feature selection, which can improve the diversity of solutions and avoid falling into local traps. Using random forest and K-nearest neighbor classifiers, four English speech emotion datasets are employed to test the proposed algorithm (MBEO) as well as other multi-objective emotion identification techniques. The results illustrate that it performs well in inverted generational distance, hypervolume, Pareto solutions, and execution time, and MBEO is appropriate for high-dimensional English SER.
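The filter stage named above, ranking features by information gain and the Fisher Score before the wrapper search, can be sketched as follows; the synthetic data stand in for the acoustic SER features.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.feature_selection import mutual_info_classif

def fisher_score(X, y):
    # Ratio of between-class scatter to within-class scatter, per feature.
    classes = np.unique(y)
    overall_mean = X.mean(axis=0)
    num = sum(np.sum(y == c) * (X[y == c].mean(axis=0) - overall_mean) ** 2 for c in classes)
    den = sum(np.sum(y == c) * X[y == c].var(axis=0) for c in classes)
    return num / (den + 1e-12)

X, y = make_classification(n_samples=300, n_features=20, n_informative=6, random_state=0)
rank_ig = np.argsort(mutual_info_classif(X, y, random_state=0))[::-1]   # information gain proxy
rank_fs = np.argsort(fisher_score(X, y))[::-1]
print("top-5 by information gain:", rank_ig[:5])
print("top-5 by Fisher Score:   ", rank_fs[:5])
```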
One exciting area within computer vision is classifying human activities, which has diverse applications like medical informatics, human-computer interaction, surveillance, and task monitoring systems. In the healthcare field, understanding and classifying patients’ activities is crucial for providing doctors with essential information for medication reactions and diagnosis. While some research methods already exist, utilizing machine learning and soft computational algorithms to recognize human activity from videos and images, there’s ongoing exploration of more advanced computer vision techniques. This paper introduces a straightforward and effective automated approach that involves five key steps: preprocessing, feature extraction, feature selection, feature fusion, and finally classification. To evaluate the proposed approach, two commonly used benchmark datasets, KTH and Weizmann, are employed for training, validation, and testing of ML classifiers. The study’s findings show that the first and second datasets had remarkable accuracy rates of 99.94% and 99.80%, respectively. When compared to existing methods, our approach stands out in terms of sensitivity, accuracy, precision, and specificity evaluation metrics. In essence, this paper demonstrates a practical method for automatically classifying human activities using an optimal feature fusion and deep learning approach, promising results that could benefit various fields, particularly in healthcare.
Particle Swarm Optimization (PSO) is a popular and bionic algorithm based on the social behavior associated with bird flocking for optimization problems. To maintain the diversity of swarms, a few studies of multi-swarm strategy have been reported. However, the competition among swarms, i.e., the reservation or destruction of a swarm, has not been considered further. In this paper, we formulate four rules by introducing the mechanism for survival of the fittest, which simulates the competition among the swarms. Based on this mechanism, we design a modified Multi-Swarm PSO (MSPSO) to solve discrete problems, which consists of a number of sub-swarms and a multi-swarm scheduler that can monitor and control each sub-swarm using the rules. To further address feature selection problems, we propose an Improved Feature Selection (IFS) method by integrating MSPSO and Support Vector Machines (SVM) with the F-score method. The IFS method aims to achieve higher generalization capability by performing kernel parameter optimization and feature selection simultaneously. The performance of the proposed method is compared with that of the standard PSO based, Genetic Algorithm (GA) based, and grid search based methods on 10 benchmark datasets taken from the UCI machine learning and StatLog databases. The numerical results and statistical analysis show that the proposed IFS method performs significantly better than the other three methods in terms of prediction accuracy with a smaller subset of features.
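The joint encoding idea, each particle carrying the SVM kernel parameters together with a binary feature mask so that both are optimized simultaneously, can be sketched as below; the decoding and fitness are illustrative, not the authors' exact scheme.

```python
import numpy as np
from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVC

X, y = load_breast_cancer(return_X_y=True)
n_features = X.shape[1]

def decode_and_score(particle):
    log_c, log_gamma = particle[0], particle[1]      # first two dimensions: kernel parameters
    mask = particle[2:] > 0.5                        # remaining dimensions: binary feature mask
    if mask.sum() == 0:
        return 0.0
    clf = SVC(C=10 ** log_c, gamma=10 ** log_gamma)
    return cross_val_score(clf, X[:, mask], y, cv=3).mean()

particle = np.concatenate([[0.5, -2.0], np.random.default_rng(0).random(n_features)])
print("fitness of one particle:", round(decode_and_score(particle), 3))
```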
Considering that the surface defects of cold rolled strips are hard to recognize with human eyes under high-speed circumstances, an automatic recognition technique was discussed. Spectrum images of defects can be obtained by the fast Fourier transform (FFT) and the sum of valid pixels (SVP), and their optimized center regions, which concentrate nearly all of the energy, are extracted as an original feature set. Using a genetic algorithm to optimize the feature set, an optimized feature set with 51 features can be obtained. Using the optimized feature set as the input vector of neural networks, the recognition effects of LVQ neural networks were studied. Experimental results show that the new method can achieve a higher classification rate and can solve the automatic recognition problem of surface defects on cold rolled strips well.
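The spectrum-feature idea can be sketched as taking the 2-D FFT magnitude of a defect image and flattening a centered low-frequency block into a candidate feature vector; the block size and image below are placeholders, and the GA selection and LVQ stages are omitted.

```python
import numpy as np

image = np.random.default_rng(0).random((128, 128))           # stand-in for a strip-surface patch
spectrum = np.abs(np.fft.fftshift(np.fft.fft2(image)))        # energy concentrates near the center
c, half = 64, 8
center_block = spectrum[c - half:c + half, c - half:c + half]  # optimized center region
features = center_block.ravel()                                # candidate feature set for GA selection
print("feature vector length:", features.size)
```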
Feature optimization is important to agricultural text mining. Usually, the vector space model is used to represent text documents. However, this basic approach still suffers from two drawbacks: the curse of dimensionality and the lack of semantic information. In this paper, a novel ontology-based feature optimization method for agricultural text is proposed. First, terms of the vector space model were mapped into concepts of an agricultural ontology, whose concept frequency weights were computed statistically from term frequency weights; second, weights of concept similarity were assigned to the concept features according to the structure of the agricultural ontology. By combining feature frequency weights and feature similarity weights based on the agricultural ontology, the dimensionality of the feature space can be reduced drastically. Moreover, semantic information can be incorporated into this method. The results showed that this method yields a significant improvement in agricultural text clustering through feature optimization.
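A toy sketch of the weighting scheme, term frequencies aggregated into concept frequencies through a term-to-concept map and then combined with ontology-derived similarity weights, is given below; the map, similarity values, and combination rule are assumptions.

```python
term_freq = {"wheat": 3, "maize": 2, "drought": 1}                      # term frequency weights
term_to_concept = {"wheat": "cereal_crop", "maize": "cereal_crop", "drought": "abiotic_stress"}
concept_similarity = {"cereal_crop": 0.9, "abiotic_stress": 0.6}        # assumed structure-based similarity weights

# Aggregate term frequencies into concept frequencies via the ontology mapping.
concept_freq = {}
for term, f in term_freq.items():
    concept = term_to_concept[term]
    concept_freq[concept] = concept_freq.get(concept, 0) + f

# Combine concept frequency weights with concept similarity weights.
feature_vector = {c: f * concept_similarity[c] for c, f in concept_freq.items()}
print(feature_vector)   # lower-dimensional, semantically weighted representation
```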
The dipper throated optimization (DTO) algorithm is a novel and very efficient metaheuristic inspired by the dipper throated bird. DTO has a unique hunting technique based on performing rapid bowing movements. To show the efficiency of the proposed algorithm, DTO is tested and compared to the Particle Swarm Optimization (PSO), Whale Optimization Algorithm (WOA), Grey Wolf Optimizer (GWO), and Genetic Algorithm (GA) algorithms on seven unimodal benchmark functions. Then, ANOVA and Wilcoxon rank-sum tests are performed to confirm the effectiveness of DTO compared to other optimization techniques. Additionally, to demonstrate the proposed algorithm's suitability for solving complex real-world issues, DTO is used to solve the feature selection problem. The strategy of using DTO for feature selection is evaluated on commonly used datasets from the University of California at Irvine (UCI) repository. The findings indicate that DTO outperforms all other algorithms in addressing feature selection issues, demonstrating the proposed algorithm's capability to solve complex real-world situations.
文摘At present, salient object detection (SOD) has achieved considerable progress. However, the methods that perform well still face the issue of inadequate detection accuracy. For example, sometimes there are problems of missed and false detections. Effectively optimizing features to capture key information and better integrating different levels of features to enhance their complementarity are two significant challenges in the domain of SOD. In response to these challenges, this study proposes a novel SOD method based on multi-strategy feature optimization. We propose the multi-size feature extraction module (MSFEM), which uses the attention mechanism, the multi-level feature fusion, and the residual block to obtain finer features. This module provides robust support for the subsequent accurate detection of the salient object. In addition, we use two rounds of feature fusion and the feedback mechanism to optimize the features obtained by the MSFEM to improve detection accuracy. The first round of feature fusion is applied to integrate the features extracted by the MSFEM to obtain more refined features. Subsequently, the feedback mechanism and the second round of feature fusion are applied to refine the features, thereby providing a stronger foundation for accurately detecting salient objects. To improve the fusion effect, we propose the feature enhancement module (FEM) and the feature optimization module (FOM). The FEM integrates the upper and lower features with the optimized features obtained by the FOM to enhance feature complementarity. The FOM uses different receptive fields, the attention mechanism, and the residual block to more effectively capture key information. Experimental results demonstrate that our method outperforms 10 state-of-the-art SOD methods.
文摘In recent years, particle swarm optimization (PSO) has received widespread attention in feature selection due to its simplicity and potential for global search. However, in traditional PSO, particles primarily update based on two extreme values: personal best and global best, which limits the diversity of information. Ideally, particles should learn from multiple advantageous particles to enhance interactivity and optimization efficiency. Accordingly, this paper proposes a PSO that simulates the evolutionary dynamics of species survival in mountain peak ecology (PEPSO) for feature selection. Based on the pyramid topology, the algorithm simulates the features of mountain peak ecology in nature and the competitive-cooperative strategies among species. According to the principles of the algorithm, the population is first adaptively divided into many subgroups based on the fitness level of particles. Then, particles within each subgroup are divided into three different types based on their evolutionary levels, employing different adaptive inertia weight rules and dynamic learning mechanisms to define distinct learning modes. Consequently, all particles play their respective roles in promoting the global optimization performance of the algorithm, similar to different species in the ecological pattern of mountain peaks. Experimental validation of the PEPSO performance was conducted on 18 public datasets. The experimental results demonstrate that the PEPSO outperforms other PSO variant-based feature selection methods and mainstream feature selection methods based on intelligent optimization algorithms in terms of overall performance in global search capability, classification accuracy, and reduction of feature space dimensions. Wilcoxon signed-rank test also confirms the excellent performance of the PEPSO.
基金supported in part by the Natural Science Youth Foundation of Hebei Province under Grant F2019403207in part by the PhD Research Startup Foundation of Hebei GEO University under Grant BQ2019055+3 种基金in part by the Open Research Project of the Hubei Key Laboratory of Intelligent Geo-Information Processing under Grant KLIGIP-2021A06in part by the Fundamental Research Funds for the Universities in Hebei Province under Grant QN202220in part by the Science and Technology Research Project for Universities of Hebei under Grant ZD2020344in part by the Guangxi Natural Science Fund General Project under Grant 2021GXNSFAA075029.
文摘In classification problems,datasets often contain a large amount of features,but not all of them are relevant for accurate classification.In fact,irrelevant features may even hinder classification accuracy.Feature selection aims to alleviate this issue by minimizing the number of features in the subset while simultaneously minimizing the classification error rate.Single-objective optimization approaches employ an evaluation function designed as an aggregate function with a parameter,but the results obtained depend on the value of the parameter.To eliminate this parameter’s influence,the problem can be reformulated as a multi-objective optimization problem.The Whale Optimization Algorithm(WOA)is widely used in optimization problems because of its simplicity and easy implementation.In this paper,we propose a multi-strategy assisted multi-objective WOA(MSMOWOA)to address feature selection.To enhance the algorithm’s search ability,we integrate multiple strategies such as Levy flight,Grey Wolf Optimizer,and adaptive mutation into it.Additionally,we utilize an external repository to store non-dominant solution sets and grid technology is used to maintain diversity.Results on fourteen University of California Irvine(UCI)datasets demonstrate that our proposed method effectively removes redundant features and improves classification performance.The source code can be accessed from the website:https://github.com/zc0315/MSMOWOA.
基金supported by the Second Tibetan Plateau Scientific Expedition and Research Program(Grant no.2019QZKK0904)Natural Science Foundation of Hebei Province(Grant no.D2022403032)S&T Program of Hebei(Grant no.E2021403001).
文摘The selection of important factors in machine learning-based susceptibility assessments is crucial to obtain reliable susceptibility results.In this study,metaheuristic optimization and feature selection techniques were applied to identify the most important input parameters for mapping debris flow susceptibility in the southern mountain area of Chengde City in Hebei Province,China,by using machine learning algorithms.In total,133 historical debris flow records and 16 related factors were selected.The support vector machine(SVM)was first used as the base classifier,and then a hybrid model was introduced by a two-step process.First,the particle swarm optimization(PSO)algorithm was employed to select the SVM model hyperparameters.Second,two feature selection algorithms,namely principal component analysis(PCA)and PSO,were integrated into the PSO-based SVM model,which generated the PCA-PSO-SVM and FS-PSO-SVM models,respectively.Three statistical metrics(accuracy,recall,and specificity)and the area under the receiver operating characteristic curve(AUC)were employed to evaluate and validate the performance of the models.The results indicated that the feature selection-based models exhibited the best performance,followed by the PSO-based SVM and SVM models.Moreover,the performance of the FS-PSO-SVM model was better than that of the PCA-PSO-SVM model,showing the highest AUC,accuracy,recall,and specificity values in both the training and testing processes.It was found that the selection of optimal features is crucial to improving the reliability of debris flow susceptibility assessment results.Moreover,the PSO algorithm was found to be not only an effective tool for hyperparameter optimization,but also a useful feature selection algorithm to improve prediction accuracies of debris flow susceptibility by using machine learning algorithms.The high and very high debris flow susceptibility zone appropriately covers 38.01%of the study area,where debris flow may occur under intensive human activities and heavy rainfall events.
基金funded by the University of Jeddah,Jeddah,Saudi Arabia,under Grant No.(UJ-23-DR-26)。
文摘The diversity of data sources resulted in seeking effective manipulation and dissemination.The challenge that arises from the increasing dimensionality has a negative effect on the computation performance,efficiency,and stability of computing.One of the most successful optimization algorithms is Particle Swarm Optimization(PSO)which has proved its effectiveness in exploring the highest influencing features in the search space based on its fast convergence and the ability to utilize a small set of parameters in the search task.This research proposes an effective enhancement of PSO that tackles the challenge of randomness search which directly enhances PSO performance.On the other hand,this research proposes a generic intelligent framework for early prediction of orders delay and eliminate orders backlogs which could be considered as an efficient potential solution for raising the supply chain performance.The proposed adapted algorithm has been applied to a supply chain dataset which minimized the features set from twenty-one features to ten significant features.To confirm the proposed algorithm results,the updated data has been examined by eight of the well-known classification algorithms which reached a minimum accuracy percentage equal to 94.3%for random forest and a maximum of 99.0 for Naïve Bayes.Moreover,the proposed algorithm adaptation has been compared with other proposed adaptations of PSO from the literature over different datasets.The proposed PSO adaptation reached a higher accuracy compared with the literature ranging from 97.8 to 99.36 which also proved the advancement of the current research.
基金This research was funded by the Short-Term Electrical Load Forecasting Based on Feature Selection and optimized LSTM with DBO which is the Fundamental Scientific Research Project of Liaoning Provincial Department of Education(JYTMS20230189)the Application of Hybrid Grey Wolf Algorithm in Job Shop Scheduling Problem of the Research Support Plan for Introducing High-Level Talents to Shenyang Ligong University(No.1010147001131).
文摘Feature Selection(FS)is a key pre-processing step in pattern recognition and data mining tasks,which can effectively avoid the impact of irrelevant and redundant features on the performance of classification models.In recent years,meta-heuristic algorithms have been widely used in FS problems,so a Hybrid Binary Chaotic Salp Swarm Dung Beetle Optimization(HBCSSDBO)algorithm is proposed in this paper to improve the effect of FS.In this hybrid algorithm,the original continuous optimization algorithm is converted into binary form by the S-type transfer function and applied to the FS problem.By combining the K nearest neighbor(KNN)classifier,the comparative experiments for FS are carried out between the proposed method and four advanced meta-heuristic algorithms on 16 UCI(University of California,Irvine)datasets.Seven evaluation metrics such as average adaptation,average prediction accuracy,and average running time are chosen to judge and compare the algorithms.The selected dataset is also discussed by categorizing it into three dimensions:high,medium,and low dimensions.Experimental results show that the HBCSSDBO feature selection method has the ability to obtain a good subset of features while maintaining high classification accuracy,shows better optimization performance.In addition,the results of statistical tests confirm the significant validity of the method.
基金the National Natural Science Foundation of China(Grant No.62062001)Ningxia Youth Top Talent Project(2021).
文摘In the realm of data privacy protection,federated learning aims to collaboratively train a global model.However,heterogeneous data between clients presents challenges,often resulting in slow convergence and inadequate accuracy of the global model.Utilizing shared feature representations alongside customized classifiers for individual clients emerges as a promising personalized solution.Nonetheless,previous research has frequently neglected the integration of global knowledge into local representation learning and the synergy between global and local classifiers,thereby limiting model performance.To tackle these issues,this study proposes a hierarchical optimization method for federated learning with feature alignment and the fusion of classification decisions(FedFCD).FedFCD regularizes the relationship between global and local feature representations to achieve alignment and incorporates decision information from the global classifier,facilitating the late fusion of decision outputs from both global and local classifiers.Additionally,FedFCD employs a hierarchical optimization strategy to flexibly optimize model parameters.Through experiments on the Fashion-MNIST,CIFAR-10 and CIFAR-100 datasets,we demonstrate the effectiveness and superiority of FedFCD.For instance,on the CIFAR-100 dataset,FedFCD exhibited a significant improvement in average test accuracy by 6.83%compared to four outstanding personalized federated learning approaches.Furthermore,extended experiments confirm the robustness of FedFCD across various hyperparameter values.
基金funded by the Natural Science Foundation of Shandong Province (ZR2021MD061ZR2023QD025)+3 种基金China Postdoctoral Science Foundation (2022M721972)National Natural Science Foundation of China (41174098)Young Talents Foundation of Inner Mongolia University (10000-23112101/055)Qingdao Postdoctoral Science Foundation (QDBSH20230102094)。
文摘Conventional machine learning(CML)methods have been successfully applied for gas reservoir prediction.Their prediction accuracy largely depends on the quality of the sample data;therefore,feature optimization of the input samples is particularly important.Commonly used feature optimization methods increase the interpretability of gas reservoirs;however,their steps are cumbersome,and the selected features cannot sufficiently guide CML models to mine the intrinsic features of sample data efficiently.In contrast to CML methods,deep learning(DL)methods can directly extract the important features of targets from raw data.Therefore,this study proposes a feature optimization and gas-bearing prediction method based on a hybrid fusion model that combines a convolutional neural network(CNN)and an adaptive particle swarm optimization-least squares support vector machine(APSO-LSSVM).This model adopts an end-to-end algorithm structure to directly extract features from sensitive multicomponent seismic attributes,considerably simplifying the feature optimization.A CNN was used for feature optimization to highlight sensitive gas reservoir information.APSO-LSSVM was used to fully learn the relationship between the features extracted by the CNN to obtain the prediction results.The constructed hybrid fusion model improves gas-bearing prediction accuracy through two processes of feature optimization and intelligent prediction,giving full play to the advantages of DL and CML methods.The prediction results obtained are better than those of a single CNN model or APSO-LSSVM model.In the feature optimization process of multicomponent seismic attribute data,CNN has demonstrated better gas reservoir feature extraction capabilities than commonly used attribute optimization methods.In the prediction process,the APSO-LSSVM model can learn the gas reservoir characteristics better than the LSSVM model and has a higher prediction accuracy.The constructed CNN-APSO-LSSVM model had lower errors and a better fit on the test dataset than the other individual models.This method proves the effectiveness of DL technology for the feature extraction of gas reservoirs and provides a feasible way to combine DL and CML technologies to predict gas reservoirs.
基金supported by National Natural Science Foundation of China(Grant Nos.62376089,62302153,62302154,62202147)the key Research and Development Program of Hubei Province,China(Grant No.2023BEB024).
文摘The world produces vast quantities of high-dimensional multi-semantic data.However,extracting valuable information from such a large amount of high-dimensional and multi-label data is undoubtedly arduous and challenging.Feature selection aims to mitigate the adverse impacts of high dimensionality in multi-label data by eliminating redundant and irrelevant features.The ant colony optimization algorithm has demonstrated encouraging outcomes in multi-label feature selection,because of its simplicity,efficiency,and similarity to reinforcement learning.Nevertheless,existing methods do not consider crucial correlation information,such as dynamic redundancy and label correlation.To tackle these concerns,the paper proposes a multi-label feature selection technique based on ant colony optimization algorithm(MFACO),focusing on dynamic redundancy and label correlation.Initially,the dynamic redundancy is assessed between the selected feature subset and potential features.Meanwhile,the ant colony optimization algorithm extracts label correlation from the label set,which is then combined into the heuristic factor as label weights.Experimental results demonstrate that our proposed strategies can effectively enhance the optimal search ability of ant colony,outperforming the other algorithms involved in the paper.
文摘High-dimensional datasets present significant challenges for classification tasks.Dimensionality reduction,a crucial aspect of data preprocessing,has gained substantial attention due to its ability to improve classification per-formance.However,identifying the optimal features within high-dimensional datasets remains a computationally demanding task,necessitating the use of efficient algorithms.This paper introduces the Arithmetic Optimization Algorithm(AOA),a novel approach for finding the optimal feature subset.AOA is specifically modified to address feature selection problems based on a transfer function.Additionally,two enhancements are incorporated into the AOA algorithm to overcome limitations such as limited precision,slow convergence,and susceptibility to local optima.The first enhancement proposes a new method for selecting solutions to be improved during the search process.This method effectively improves the original algorithm’s accuracy and convergence speed.The second enhancement introduces a local search with neighborhood strategies(AOA_NBH)during the AOA exploitation phase.AOA_NBH explores the vast search space,aiding the algorithm in escaping local optima.Our results demonstrate that incorporating neighborhood methods enhances the output and achieves significant improvement over state-of-the-art methods.
基金The work is funded by the University Grant Commission(UGC)under(Start-up-Grant No.:F 30-592/2021(BSR)).
文摘Feature selection(FS)is a data preprocessing step in machine learning(ML)that selects a subset of relevant and informative features from a large feature pool.FS helps ML models improve their predictive accuracy at lower computational costs.Moreover,FS can handle the model overfitting problem on a high-dimensional dataset.A major problem with the filter and wrapper FS methods is that they consume a significant amount of time during FS on high-dimensional datasets.The proposed“HDFS(PSO-MI):hybrid distribute feature selection using particle swarm optimization-mutual information(PSO-MI)”,is a PSO-based hybrid method that can overcome the problem mentioned above.This method hybridizes the filter and wrapper techniques in a distributed manner.A new combiner is also introduced to merge the effective features selected from multiple data distributions.The effectiveness of the proposed HDFS(PSO-MI)method is evaluated using five ML classifiers,i.e.,logistic regression(LR),k-NN,support vector machine(SVM),decision tree(DT),and random forest(RF),on various datasets in terms of accuracy and Matthew’s correlation coefficient(MCC).From the experimental analysis,we observed that HDFS(PSO-MI)method yielded more than 98%,95%,92%,90%,and 85%accuracy for the unbalanced,kidney disease,emotions,wafer manufacturing,and breast cancer datasets,respectively.Our method shows promising results comapred to other methods,such as mutual information,gain ratio,Spearman correlation,analysis of variance(ANOVA),Pearson correlation,and an ensemble feature selection with ranking method(EFSRank).
基金supported by National Natural Science Foundation of China(Grant Nos.12272076,U2341232,11332004,and U1808215)the 111 Project of China(Grant No.B14013).
文摘Additive manufacturing(AM)has made significant progress in recent years and has been successfully applied in various fields owing to its ability to manufacture complex geometries.This method efficiently expands the design space,allowing for the creation of products with better performance than ever before.With the emergence of new manufacturing technologies,new design methods are required to efficiently utilize the expanded design space.Therefore,topology optimization methods have attracted the attention of researchers because of their ability to generate new and optimized designs without requiring prior experience.The combination of AM and topology optimization has proven to be a powerful tool for structural innovation in design and manufacturing.However,it is important to note that AM does not eliminate all manufacturing restrictions but instead replaces them with a different set of design considerations that designers must consider for the successful implementation of these technologies.This has motivated research on topology optimization methods that incorporate manufacturable constraints for AM structures.In this paper,we present a survey of the latest studies in this research area,with a particular focus on developments in China.Additionally,we discuss the existing research gaps and future development trends.
基金Supported by the National Natural Science Foundation of China(31101085)the Program for Young Core Teachers of Colleges in Henan(2011GGJS-094)the Scientific Research Project for the High Level Talents,North China University of Water Conservancy and Hydroelectric Power~~
文摘[Objective] The aim was to study the feature extraction of stored-grain insects based on ant colony optimization and support vector machine algorithm, and to explore the feasibility of the feature extraction of stored-grain insects. [Method] Through the analysis of feature extraction in the image recognition of the stored-grain insects, the recognition accuracy of the cross-validation training model in support vector machine (SVM) algorithm was taken as an important factor of the evaluation principle of feature extraction of stored-grain insects. The ant colony optimization (ACO) algorithm was applied to the automatic feature extraction of stored-grain insects. [Result] The algorithm extracted the optimal feature subspace of seven features from the 17 morphological features, including area and perimeter. The ninety image samples of the stored-grain insects were automatically recognized by the optimized SVM classifier, and the recognition accuracy was over 95%. [Conclusion] The experiment shows that the application of ant colony optimization to the feature extraction of grain insects is practical and feasible.
Funding: Supported by the China Railway Corporation Science and Technology Research and Development Program (Grant Nos. K2020G035 and K2021G024) and the National Natural Science Foundation of China (Grant No. 52378411).
Abstract: The suddenness, uncertainty, and randomness of rockbursts directly affect the safety of tunnel construction. The prediction of rockbursts is a fundamental aspect of mitigating or even eliminating rockburst hazards. To address the shortcomings of current rockburst prediction models, which have a limited number of samples and rely on manual test results for the majority of their input features, this paper proposes rockburst prediction models based on multi-featured drilling parameters of a rock drilling jumbo. Firstly, four original drilling parameters, namely hammer pressure (Ph), feed pressure (Pf), rotation pressure (Pr), and feed speed (VP), together with the rockburst grades, were collected from 1093 rockburst cases. Then, a feature expansion investigation was performed based on the four original drilling parameters to establish a drilling parameter feature system and a rockburst prediction database containing 42 features. Furthermore, rockburst prediction models based on multi-featured drilling parameters were developed using the extreme tree (ET) algorithm and Bayesian optimization. The models take drilling parameters as input parameters and rockburst grades as output parameters. The effects of Bayesian optimization and the number of drilling parameter features on model performance were analyzed using the accuracy, precision, recall, and F1 value of the prediction set as the model performance evaluation indices. The results show that the Bayesian-optimized model with 42 drilling parameter features as inputs performs best, with an accuracy of 91.89%. Finally, the reliability of the models was validated through field tests.
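A minimal sketch of the modelling step is given below: an extra-trees classifier tuned by cross-validated hyper-parameter search on synthetic stand-in data. The paper's Bayesian optimization is replaced here by scikit-learn's randomized search purely to keep the example dependency-free, and the 42 synthetic features, four classes, and search ranges are assumptions for illustration.

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import ExtraTreesClassifier
from sklearn.model_selection import RandomizedSearchCV, train_test_split

# Synthetic stand-in: 42 features (as in the drilling-parameter database), 4 grades.
X, y = make_classification(n_samples=1000, n_features=42, n_informative=12,
                           n_classes=4, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2, random_state=0)

# Assumed search space; a Bayesian optimizer would explore a space like this.
param_space = {"n_estimators": [100, 200, 400],
               "max_depth": [None, 10, 20],
               "min_samples_leaf": [1, 2, 5]}
search = RandomizedSearchCV(ExtraTreesClassifier(random_state=0), param_space,
                            n_iter=10, cv=5, random_state=0)
search.fit(X_tr, y_tr)
print("Best parameters:", search.best_params_)
print("Held-out accuracy:", round(search.score(X_te, y_te), 3))
```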
Abstract: Speech emotion recognition (SER) uses acoustic analysis to find features for emotion recognition and examines variations in voice that are caused by emotions. The number of features acquired with acoustic analysis is extremely high, so we introduce a hybrid filter-wrapper feature selection algorithm based on an improved equilibrium optimizer for constructing an emotion recognition system. The proposed algorithm implements multi-objective emotion recognition with the minimum number of selected features and maximum accuracy. First, we use the information gain and the Fisher score to sort the features extracted from signals. Then, we employ a multi-objective ranking method to evaluate these features and assign different importance to them. Features with high rankings have a large probability of being selected. Finally, we propose a repair strategy to address the problem of duplicate solutions in multi-objective feature selection, which can improve the diversity of solutions and avoid falling into local traps. Using random forest and k-nearest neighbor classifiers, four English speech emotion datasets are employed to test the proposed algorithm (MBEO) as well as other multi-objective emotion identification techniques. The results illustrate that it performs well in inverted generational distance, hypervolume, Pareto solutions, and execution time, and that MBEO is appropriate for high-dimensional English SER.
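The sketch below illustrates only the filter stage described above, ranking features by information gain (approximated here with mutual information) and by a hand-rolled Fisher score and then merging the two rankings; the equilibrium-optimizer search and the repair strategy are not reproduced, and the digits dataset is an assumed stand-in for the acoustic features.

```python
import numpy as np
from sklearn.datasets import load_digits
from sklearn.feature_selection import mutual_info_classif

X, y = load_digits(return_X_y=True)        # assumed stand-in for acoustic features

def fisher_score(X, y):
    """Between-class separation over within-class spread, computed per feature."""
    overall_mean = X.mean(axis=0)
    num = np.zeros(X.shape[1])
    den = np.zeros(X.shape[1])
    for c in np.unique(y):
        Xc = X[y == c]
        num += len(Xc) * (Xc.mean(axis=0) - overall_mean) ** 2
        den += len(Xc) * Xc.var(axis=0)
    return num / (den + 1e-12)

# Two filter rankings: information gain (mutual information) and Fisher score.
rank_ig = np.argsort(np.argsort(-mutual_info_classif(X, y, random_state=0)))
rank_fs = np.argsort(np.argsort(-fisher_score(X, y)))

# Merge the rankings; features with a better combined rank would be given a
# larger selection probability by the metaheuristic search stage.
combined = (rank_ig + rank_fs) / 2.0
print("Top 10 features by combined ranking:", np.argsort(combined)[:10].tolist())
```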
Abstract: One exciting area within computer vision is classifying human activities, which has diverse applications such as medical informatics, human-computer interaction, surveillance, and task monitoring systems. In the healthcare field, understanding and classifying patients' activities is crucial for providing doctors with essential information about medication reactions and diagnosis. While some research methods already exist that utilize machine learning and soft computational algorithms to recognize human activity from videos and images, more advanced computer vision techniques are still being explored. This paper introduces a straightforward and effective automated approach that involves five key steps: preprocessing, feature extraction, feature selection, feature fusion, and finally classification. To evaluate the proposed approach, two commonly used benchmark datasets, KTH and Weizmann, are employed for training, validation, and testing of ML classifiers. The study's findings show that the first and second datasets yielded remarkable accuracy rates of 99.94% and 99.80%, respectively. When compared to existing methods, our approach stands out in terms of sensitivity, accuracy, precision, and specificity evaluation metrics. In essence, this paper demonstrates a practical method for automatically classifying human activities using an optimal feature fusion and deep learning approach, promising results that could benefit various fields, particularly healthcare.
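As a hedged skeleton of the five-step pipeline, the sketch below fuses two synthetic feature matrices by concatenation, applies univariate feature selection, and classifies with an SVM; the preprocessing and deep feature extraction from KTH/Weizmann frames are not reproduced, and all array shapes, class counts, and the choice of k=64 are assumptions made for the example.

```python
import numpy as np
from sklearn.feature_selection import SelectKBest, f_classif
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(0)
n_clips = 600
shape_features = rng.normal(size=(n_clips, 128))   # stand-in for appearance descriptors
motion_features = rng.normal(size=(n_clips, 64))   # stand-in for motion descriptors
labels = rng.integers(0, 6, size=n_clips)          # assumed six activity classes

# Feature fusion: concatenate the per-clip descriptors.
X = np.hstack([shape_features, motion_features])
X_tr, X_te, y_tr, y_te = train_test_split(X, labels, test_size=0.2, random_state=0)

# Feature selection + classification.
clf = make_pipeline(StandardScaler(),
                    SelectKBest(f_classif, k=64),
                    SVC(kernel="rbf"))
clf.fit(X_tr, y_tr)
print("Held-out accuracy on synthetic data:", round(clf.score(X_te, y_te), 3))
```

With random synthetic features the reported accuracy is only chance level; the skeleton is meant to show how the fusion, selection, and classification stages chain together, not to reproduce the paper's results.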
Funding: This work was supported by the National Natural Science Foundation of China (Grant No. 60971089), the National Electronic Development Foundation of China (Grant No. 2009537), and the Jilin Province Science and Technology Department Project of China (Grant No. 20090502).
Abstract: Particle Swarm Optimization (PSO) is a popular bionic algorithm, based on the social behavior associated with bird flocking, for optimization problems. To maintain the diversity of swarms, a few studies of multi-swarm strategies have been reported. However, the competition among swarms, and the reservation or destruction of a swarm, have not been considered further. In this paper, we formulate four rules by introducing a survival-of-the-fittest mechanism, which simulates the competition among the swarms. Based on this mechanism, we design a modified Multi-Swarm PSO (MSPSO) to solve discrete problems, which consists of a number of sub-swarms and a multi-swarm scheduler that can monitor and control each sub-swarm using the rules. To further address feature selection problems, we propose an Improved Feature Selection (IFS) method by integrating MSPSO and Support Vector Machines (SVM) with the F-score method. The IFS method aims to achieve higher generalization capability by performing kernel parameter optimization and feature selection simultaneously. The performance of the proposed method is compared with that of the standard PSO based, Genetic Algorithm (GA) based, and grid search based methods on 10 benchmark datasets taken from the UCI machine learning and StatLog databases. The numerical results and statistical analysis show that the proposed IFS method performs significantly better than the other three methods in terms of prediction accuracy with a smaller subset of features.
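The snippet below is a small sketch of the classic F-score criterion used to rank features before an SVM wrapper stage; the multi-swarm PSO scheduler and the kernel-parameter optimization are not reproduced, and the breast cancer dataset is an assumed two-class example.

```python
import numpy as np
from sklearn.datasets import load_breast_cancer

X, y = load_breast_cancer(return_X_y=True)

def f_score(X, y):
    """Two-class F-score: between-class separation over within-class spread."""
    pos, neg = X[y == 1], X[y == 0]
    mean_all, mean_pos, mean_neg = X.mean(0), pos.mean(0), neg.mean(0)
    numerator = (mean_pos - mean_all) ** 2 + (mean_neg - mean_all) ** 2
    denominator = pos.var(0, ddof=1) + neg.var(0, ddof=1)
    return numerator / (denominator + 1e-12)

scores = f_score(X, y)
print("Top 5 features by F-score:", np.argsort(scores)[::-1][:5].tolist())
```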
Funding: This work was financially supported by the National High Technology Research and Development Program of China (Nos. 2003AA331080 and 2001AA339030) and the Talent Science Research Foundation of Henan University of Science & Technology (No. 09001121).
Abstract: Considering that the surface defects of cold rolled strips are hard to recognize by human eyes under high-speed circumstances, an automatic recognition technique was discussed. Spectrum images of defects can be obtained by the fast Fourier transform (FFT) and the sum of valid pixels (SVP), and the optimized center region, which concentrates nearly all of the energy, is extracted as an original feature set. Using a genetic algorithm to optimize the feature set, an optimized feature set with 51 features can be achieved. Using the optimized feature set as the input vector of neural networks, the recognition effects of LVQ neural networks have been studied. Experimental results show that the new method can achieve a higher classification rate and can settle the automatic recognition problem of surface defects on cold rolled strips ideally.
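The sketch below shows the spectral-feature idea in isolation: a 2-D FFT of a (synthetic) strip-surface image, with the central low-frequency block of the shifted spectrum taken as raw features. The block size, the synthetic image, and the omission of the SVP step, the genetic-algorithm optimization, and the LVQ classifier are all simplifications for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)
image = rng.normal(size=(128, 128))               # stand-in for a strip-surface image

# Magnitude spectrum with the zero-frequency component shifted to the center.
spectrum = np.abs(np.fft.fftshift(np.fft.fft2(image)))
center = spectrum.shape[0] // 2
half = 8                                          # assumed half-width of the energy block
block = spectrum[center - half:center + half, center - half:center + half]

features = block.flatten()                        # raw spectral features before optimization
print("Feature vector length:", features.size)
print("Energy in the central block:", round(float(block.sum()), 2))
```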
基金supported by the National Natural Science Foundation of China (60774096)the National HighTech R&D Program of China (2008BAK49B05)
Abstract: Feature optimization is important to agricultural text mining. Usually, the vector space model is used to represent text documents. However, this basic approach still suffers from two drawbacks: the curse of dimensionality and the lack of semantic information. In this paper, a novel ontology-based feature optimization method for agricultural text is proposed. First, terms of the vector space model were mapped into concepts of an agricultural ontology, whose concept frequency weights are computed statistically from term frequency weights; second, concept similarity weights were assigned to the concept features according to the structure of the agricultural ontology. By combining feature frequency weights and feature similarity weights based on the agricultural ontology, the dimensionality of the feature space can be reduced drastically. Moreover, semantic information can be incorporated into this method. The results showed that this method yields a significant improvement in agricultural text clustering through feature optimization.
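A toy sketch of the ontology-based weighting is given below: term frequencies are aggregated into concept frequencies through a term-to-concept map and then combined with a concept-similarity weight derived from the ontology structure. The mapping, the similarity values, and the sample document are invented purely for illustration.

```python
from collections import Counter

# Assumed term-to-concept mapping and structural similarity weights.
term_to_concept = {"wheat": "cereal", "maize": "cereal",
                   "aphid": "pest", "locust": "pest"}
concept_similarity = {"cereal": 0.8, "pest": 0.6}

document = "wheat aphid wheat locust maize".split()

# Concept frequency weights: sum the frequencies of terms mapped to each concept.
term_freq = Counter(document)
concept_freq = Counter()
for term, freq in term_freq.items():
    concept_freq[term_to_concept[term]] += freq

# Final feature weight: concept frequency combined with the similarity weight.
weights = {c: f * concept_similarity[c] for c, f in concept_freq.items()}
print(weights)   # e.g. {'cereal': 2.4, 'pest': 1.2}
```

Note how the four distinct terms collapse into two concept features, which is the dimensionality reduction the abstract refers to.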
Abstract: The dipper throated optimization (DTO) algorithm is a novel and very efficient metaheuristic inspired by the dipper throated bird. DTO has a unique hunting technique based on performing rapid bowing movements. To show the efficiency of the proposed algorithm, DTO is tested and compared to Particle Swarm Optimization (PSO), the Whale Optimization Algorithm (WOA), the Grey Wolf Optimizer (GWO), and the Genetic Algorithm (GA) on seven unimodal benchmark functions. Then, ANOVA and Wilcoxon rank-sum tests are performed to confirm the effectiveness of DTO compared to the other optimization techniques. Additionally, to demonstrate the proposed algorithm's suitability for solving complex real-world issues, DTO is used to solve the feature selection problem. The strategy of using DTO for feature selection is evaluated on commonly used datasets from the University of California at Irvine (UCI) repository. The findings indicate that DTO outperforms all other algorithms in addressing feature selection issues, demonstrating the proposed algorithm's capability to solve complex real-world situations.
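The sketch below shows the kind of wrapper evaluation a metaheuristic such as DTO would optimize for feature selection: a binary mask over features scored by cross-validated k-NN accuracy with a small penalty on subset size. The dipper-throated search moves are not reproduced (a plain random search stands in), and the wine dataset, the weighting factor alpha, and the use of k-NN are assumptions for the example.

```python
import numpy as np
from sklearn.datasets import load_wine
from sklearn.model_selection import cross_val_score
from sklearn.neighbors import KNeighborsClassifier

X, y = load_wine(return_X_y=True)           # assumed UCI-style example dataset

def wrapper_fitness(mask, alpha=0.99):
    """Higher is better: weighted accuracy minus a feature-count penalty."""
    if not mask.any():
        return 0.0
    acc = cross_val_score(KNeighborsClassifier(5), X[:, mask], y, cv=5).mean()
    return alpha * acc + (1 - alpha) * (1 - mask.mean())

rng = np.random.default_rng(0)
best_mask, best_fit = None, -1.0
for _ in range(30):                          # random search stands in for the DTO moves
    mask = rng.random(X.shape[1]) < 0.5
    fit = wrapper_fitness(mask)
    if fit > best_fit:
        best_mask, best_fit = mask, fit

print("Selected features:", np.flatnonzero(best_mask).tolist(),
      "fitness:", round(best_fit, 3))
```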