Funding: Supported by the Science and Technology Innovation Program for Postgraduate Students in IDP Subsidized by Fundamental Research Funds for the Central Universities (Project No. ZY20240335); the Research Project of the Key Technology of Malicious Code Detection Based on Data Mining in APT Attack (Project No. 2022IT173); and the Research Project of the Big Data Sensitive Information Supervision Technology Based on Convolutional Neural Network (Project No. 2022011033).
Abstract: Previous studies have shown that deep learning is very effective in detecting known attacks. However, when facing unknown attacks, models such as Deep Neural Networks (DNN) combined with Long Short-Term Memory (LSTM) and Convolutional Neural Networks (CNN) combined with LSTM are built by simple stacking, which leads to feature loss, low efficiency, and low accuracy. Therefore, this paper proposes an autonomous detection model for Distributed Denial of Service attacks, the Multi-Scale Convolutional Neural Network-Bidirectional Gated Recurrent Units-Single Headed Attention (MSCNN-BiGRU-SHA) model, which is based on a Multi-strategy Integrated Zebra Optimization Algorithm (MI-ZOA). The model undergoes training and testing with the CICDDoS2019 dataset, and its performance is evaluated on a new GINKS2023 dataset. The hyperparameters Conv_filter and GRU_unit are optimized using the MI-ZOA. The experimental results show that the test accuracy of the MSCNN-BiGRU-SHA model based on the MI-ZOA proposed in this paper is as high as 0.9971 on the CICDDoS2019 dataset. The evaluation accuracy on the new GINKS2023 dataset created in this paper is 0.9386. Compared to the MSCNN-BiGRU-SHA model based on the Zebra Optimization Algorithm (ZOA), the detection accuracy on the GINKS2023 dataset has improved by 5.81%, precision has increased by 1.35%, recall has improved by 9%, and the F1 score has increased by 5.55%. Compared to the MSCNN-BiGRU-SHA models developed using Grid Search, Random Search, and Bayesian Optimization, the MSCNN-BiGRU-SHA model optimized with the MI-ZOA exhibits better performance in terms of accuracy, precision, recall, and F1 score.
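Since the abstract names the model components but not their configuration, the sketch below shows, under assumed input shapes and layer sizes, how a multi-scale CNN, a bidirectional GRU, and a single-headed attention layer could be stacked in tf.keras; conv_filter and gru_unit are the two hyperparameters the paper tunes with MI-ZOA, and every value here is an illustrative placeholder rather than the authors' setting.

```python
# Minimal sketch of an MSCNN-BiGRU-SHA-style classifier in tf.keras.
# conv_filter and gru_unit are the hyperparameters the paper tunes with MI-ZOA;
# the values below are placeholders, not the authors' settings.
import tensorflow as tf
from tensorflow.keras import layers, Model

def build_mscnn_bigru_sha(timesteps=64, features=77, n_classes=2,
                          conv_filter=32, gru_unit=64):
    inp = layers.Input(shape=(timesteps, features))
    # Multi-scale convolution: parallel Conv1D branches with different kernel sizes.
    branches = []
    for k in (3, 5, 7):
        b = layers.Conv1D(conv_filter, kernel_size=k, padding="same", activation="relu")(inp)
        b = layers.MaxPooling1D(pool_size=2)(b)
        branches.append(b)
    x = layers.Concatenate()(branches)
    # Bidirectional GRU over the fused multi-scale features.
    x = layers.Bidirectional(layers.GRU(gru_unit, return_sequences=True))(x)
    # Single-headed self-attention, then pooling and classification.
    x = layers.MultiHeadAttention(num_heads=1, key_dim=gru_unit)(x, x)
    x = layers.GlobalAveragePooling1D()(x)
    out = layers.Dense(n_classes, activation="softmax")(x)
    model = Model(inp, out)
    model.compile(optimizer="adam", loss="sparse_categorical_crossentropy", metrics=["accuracy"])
    return model

if __name__ == "__main__":
    build_mscnn_bigru_sha().summary()
```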
Funding: Supported by the National Natural Science Foundation of China (NSFC) under Grant No. 51677058.
Abstract: Precisely estimating the state of health (SOH) of lithium-ion batteries is essential for battery management systems (BMS), as it plays a key role in ensuring the safe and reliable operation of battery systems. However, current SOH estimation methods often overlook the valuable temperature information that can effectively characterize battery aging during capacity degradation. Additionally, the Elman neural network, which is commonly employed for SOH estimation, exhibits several drawbacks, including slow training speed, a tendency to become trapped in local minima, and the initialization of weights and thresholds using pseudo-random numbers, leading to unstable model performance. To address these issues, this study proposes a method for estimating the SOH of lithium-ion batteries based on differential thermal voltammetry (DTV) and an SSA-Elman neural network. Firstly, two health features (HFs) considering temperature factors and battery voltage are extracted from the differential thermal voltammetry curves and incremental capacity curves. Next, the Sparrow Search Algorithm (SSA) is employed to optimize the initial weights and thresholds of the Elman neural network, forming the SSA-Elman neural network model. To validate the performance, various neural networks, including the proposed SSA-Elman network, are tested using the Oxford battery aging dataset. The experimental results demonstrate that the method developed in this study achieves superior accuracy and robustness, with a mean absolute error (MAE) of less than 0.9% and a root mean square error (RMSE) below 1.4%.
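As a rough illustration of what the SSA would be optimizing, here is a minimal NumPy sketch of an Elman recurrent network whose initial weights and thresholds are flattened into one parameter vector, together with an RMSE fitness function; the layer sizes and synthetic data are assumptions, and the sparrow search update rules themselves are not reproduced.

```python
# Sketch of an Elman network forward pass whose initial weights/thresholds could be
# encoded as a flat vector and optimized by SSA (layer sizes are illustrative).
import numpy as np

def elman_forward(params, X, n_in=2, n_hidden=8):
    """Run an Elman RNN over a sequence of health features X (T, n_in),
    returning one SOH estimate per time step."""
    i = 0
    def take(shape):
        nonlocal i
        size = int(np.prod(shape))
        w = params[i:i + size].reshape(shape)
        i += size
        return w
    W_in  = take((n_hidden, n_in))      # input -> hidden
    W_rec = take((n_hidden, n_hidden))  # context (previous hidden) -> hidden
    b_h   = take((n_hidden,))
    W_out = take((1, n_hidden))
    b_out = take((1,))
    h = np.zeros(n_hidden)
    outputs = []
    for x in X:
        h = np.tanh(W_in @ x + W_rec @ h + b_h)   # Elman recurrence
        outputs.append(W_out @ h + b_out)
    return np.concatenate(outputs)

def fitness(params, X, soh_true):
    """RMSE between predicted and reference SOH; the quantity SSA would minimize."""
    pred = elman_forward(params, X)
    return float(np.sqrt(np.mean((pred - soh_true) ** 2)))

# Example: evaluate one random candidate (stand-in for one sparrow in the population).
rng = np.random.default_rng(0)
n_params = 8 * 2 + 8 * 8 + 8 + 8 + 1
X = rng.random((50, 2))            # two health features per cycle (DTV- and IC-based)
soh = np.linspace(1.0, 0.85, 50)   # synthetic SOH trajectory
print(fitness(rng.normal(0, 0.1, n_params), X, soh))
```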
Funding: National Natural Science Foundation of China (62373187); Forward-looking Layout Special Projects (ILA220591A22).
Abstract: In the field of calculating the attack area of air-to-air missiles in modern air combat scenarios, the limitations of existing research, including real-time calculation, the accuracy-efficiency trade-off, and the absence of a three-dimensional attack area model, restrict its practical application. To address these issues, an improved backtracking algorithm is proposed to improve calculation efficiency. A significant reduction in solution time and maintenance of accuracy in the three-dimensional attack area are achieved by using the proposed algorithm. Furthermore, the age-layered population structure genetic programming (ALPS-GP) algorithm is introduced to determine an analytical polynomial model of the three-dimensional attack area, considering real-time requirements. The accuracy of the polynomial model is enhanced through coefficient correction using an improved gradient descent algorithm. The study reveals a remarkable combination of high accuracy and efficient real-time computation, with a mean error of 91.89 m using the analytical polynomial model of the three-dimensional attack area solved in just 10^(-4) s, thus meeting the requirements of real-time combat scenarios.
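The coefficient-correction step can be illustrated with plain gradient descent on a polynomial model; since the abstract does not give the polynomial found by ALPS-GP, the feature terms, learning rate, and synthetic data below are assumptions.

```python
# Sketch of the coefficient-correction step: refine the coefficients of a fitted
# polynomial attack-area model by gradient descent on mean squared error. The polynomial
# terms here are assumed; ALPS-GP would supply the actual structure.
import numpy as np

def design_matrix(X):
    """Assumed feature terms: bias, altitude h, speed v, aspect angle q, and a few products."""
    h, v, q = X[:, 0], X[:, 1], X[:, 2]
    return np.column_stack([np.ones_like(h), h, v, q, h * v, v * q, h * q, v ** 2])

def correct_coefficients(X, y, coef, lr=0.05, epochs=5000):
    """Plain batch gradient descent on MSE, starting from the GP-fitted coefficients."""
    Phi = design_matrix(X)
    scale = np.abs(Phi).max(axis=0)       # scale columns so one learning rate works for all terms
    Phi = Phi / scale
    c = coef * scale                      # keep predictions unchanged after scaling
    for _ in range(epochs):
        err = Phi @ c - y                 # residuals of the attack-area boundary (m)
        grad = 2.0 * Phi.T @ err / len(y) # dMSE/dc
        c -= lr * grad
    return c / scale                      # map back to the original feature scale

rng = np.random.default_rng(1)
X = rng.uniform([1000, 200, 0], [10000, 600, np.pi], size=(500, 3))
true_c = rng.normal(size=8)
y = design_matrix(X) @ true_c + rng.normal(0, 50, 500)   # synthetic boundary distances
c0 = np.zeros(8)
c1 = correct_coefficients(X, y, coef=c0)
print("mean |error| before:", np.abs(design_matrix(X) @ c0 - y).mean())
print("mean |error| after :", np.abs(design_matrix(X) @ c1 - y).mean())
```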
Abstract: The original intention of the algorithmic recommender system is to grapple with the negative impacts caused by information overload, but the system can also be used as a "hypernudge", a new form of online manipulation, to intentionally exploit people's cognitive and decision-making gaps and influence their decisions in practice, which is particularly detrimental to the sustainable development of the digital market. Limiting harmful algorithmic online manipulation in digital markets has become a challenging task. Globally, both the EU and China have responded to this issue, and the differences between them are so evident that their governance measures can serve as typical cases. The EU focuses on improving citizens' digital literacy and their ability to integrate into digital social life to independently address this issue, and expects to address harmful manipulation behavior through binding and applicable hard law, which is part of its digital strategy. By comparison, although certain legal norms have made relevant stipulations on manipulation issues, China continues to issue specific departmental regulations to regulate algorithmic recommender services and pays more attention to addressing the collective harm caused by algorithmic online manipulation through a multiple co-governance approach, led by the government or industry associations, to implement supervision.
Abstract: Reliable Cluster Head (CH) selection-based routing protocols are necessary for increasing packet transmission efficiency with optimal path discovery that never introduces degradation of transmission reliability. In this paper, the Hybrid Golden Jackal and Improved Whale Optimization Algorithm (HGJIWOA) is proposed as an effective and optimal routing protocol that guarantees efficient routing of data packets along the routes established between the CHs and the movable sink. HGJIWOA includes the phases of a Dynamic Lens-Imaging Learning Strategy and Novel Update Rules for determining the reliable route essential for broadcasting data packets, attained through fitness-measure-estimation-based CH selection. The CH selection process, achieved using the Golden Jackal Optimization Algorithm (GJOA), depends completely on the factors of maintainability, consistency, trust, delay, and energy. The adopted GJOA plays a dominant role in determining the optimal routing path depending on the parameters of reduced delay and minimal distance. The protocol further utilizes the Improved Whale Optimization Algorithm (IWOA) for forwarding the data from the chosen CHs to the Base Station (BS) via an optimized route depending on the parameters of energy and distance. It also includes a reliable route maintenance process that aids in deciding the selected route through which data need to be transmitted or re-routed. The simulation outcomes of the proposed HGJIWOA mechanism with different numbers of sensor nodes confirmed an improved mean throughput of 18.21%, sustained residual energy of 19.64%, and a minimized end-to-end delay of 21.82%, better than competing CH selection approaches.
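The abstract does not give the fitness formulation, so the following is only a hedged sketch of how a weighted fitness over the named factors (energy, trust, delay, consistency, maintainability) could rank candidate cluster heads; the weights and metric scaling are invented for illustration and are not the HGJIWOA formulation.

```python
# Sketch of a fitness measure for CH selection over the factors named in the abstract
# (energy, trust, delay, consistency, maintainability); weights are illustrative.
import numpy as np

WEIGHTS = dict(energy=0.30, trust=0.25, delay=0.20, consistency=0.15, maintainability=0.10)

def ch_fitness(nodes):
    """nodes: dict of per-node metric arrays in [0, 1]; lower delay is better."""
    return (WEIGHTS["energy"] * nodes["energy"]
            + WEIGHTS["trust"] * nodes["trust"]
            + WEIGHTS["delay"] * (1.0 - nodes["delay"])      # penalize high delay
            + WEIGHTS["consistency"] * nodes["consistency"]
            + WEIGHTS["maintainability"] * nodes["maintainability"])

def select_cluster_heads(nodes, n_ch):
    """Pick the n_ch highest-fitness nodes as cluster heads."""
    return np.argsort(ch_fitness(nodes))[::-1][:n_ch]

rng = np.random.default_rng(7)
metrics = {k: rng.random(100) for k in WEIGHTS}   # 100 candidate sensor nodes
print(select_cluster_heads(metrics, n_ch=5))
```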
Abstract: This paper examines the impact of algorithmic recommendations and data-driven marketing on consumer engagement and business performance. By leveraging large volumes of user data, businesses can deliver personalized content that enhances user experiences and increases conversion rates. However, the growing reliance on these technologies introduces significant risks, including privacy violations, algorithmic bias, and ethical concerns. This paper explores these challenges and provides recommendations for businesses to mitigate the associated risks while optimizing marketing strategies. It highlights the importance of transparency, fairness, and user control in ensuring responsible and effective data-driven marketing.
Funding: Supported by the National Key R&D Program of China (Grant No. 2022YFA1005000) and the National Natural Science Foundation of China (Grant Nos. 62025110 and 62101308).
Abstract: Satellite Internet (SI) provides broadband access as a critical information infrastructure in 6G. However, with the integration of the terrestrial Internet, the influx of massive terrestrial traffic will bring significant threats to SI, among which DDoS attacks will intensify the erosion of limited bandwidth resources. Therefore, this paper proposes a DDoS attack tracking scheme using a multi-round iterative Viterbi algorithm to achieve high-accuracy attack path reconstruction and fast internal source locking, protecting SI from the source. Firstly, to reduce communication overhead, the logarithmic representation of the traffic volume is added to the digests after modeling SI, generating a lightweight deviation degree to construct the observation probability matrix for the Viterbi algorithm. Secondly, the path node matrix is expanded to multi-index matrices in the Viterbi algorithm to store index information for all probability values, deriving the path with non-repeatability and maximum probability. Finally, multiple rounds of iterative Viterbi tracking are performed locally to track DDoS attacks based on trimming the tracking results. Simulation and experimental results show that the scheme can achieve 96.8% tracking accuracy for external and internal DDoS attacks within 2.5 seconds, with a communication overhead of 268 KB/s, effectively protecting the limited bandwidth resources of SI.
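A single round of the underlying idea, Viterbi decoding over an observation-probability matrix built from traffic-deviation degrees, can be sketched as follows; the transition and observation values are synthetic, and the paper's multi-index matrices and multi-round trimming are not reproduced.

```python
# Standard Viterbi decoding over satellite nodes: given transition probabilities between
# neighboring nodes and observation probabilities derived from traffic-deviation degrees,
# recover the most likely attack path.
import numpy as np

def viterbi(trans, obs, init):
    """trans[i, j]: P(next node j | current node i); obs[t, i]: P(observation at step t | node i)."""
    T, N = obs.shape
    logp = np.log(init + 1e-12) + np.log(obs[0] + 1e-12)
    back = np.zeros((T, N), dtype=int)
    for t in range(1, T):
        cand = logp[:, None] + np.log(trans + 1e-12)     # (from, to) scores
        back[t] = cand.argmax(axis=0)
        logp = cand.max(axis=0) + np.log(obs[t] + 1e-12)
    # Backtrack the maximum-probability path.
    path = [int(logp.argmax())]
    for t in range(T - 1, 0, -1):
        path.append(int(back[t, path[-1]]))
    return path[::-1]

rng = np.random.default_rng(3)
N, T = 6, 5                                  # 6 candidate satellite nodes, 5 hops observed
trans = rng.dirichlet(np.ones(N), size=N)    # synthetic inter-satellite link probabilities
obs = rng.dirichlet(np.ones(N), size=T)      # synthetic deviation-degree observations
print(viterbi(trans, obs, init=np.full(N, 1 / N)))
```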
Abstract: In recent years, particle swarm optimization (PSO) has received widespread attention in feature selection due to its simplicity and potential for global search. However, in traditional PSO, particles primarily update based on two extreme values, the personal best and the global best, which limits the diversity of information. Ideally, particles should learn from multiple advantageous particles to enhance interactivity and optimization efficiency. Accordingly, this paper proposes a PSO that simulates the evolutionary dynamics of species survival in mountain peak ecology (PEPSO) for feature selection. Based on a pyramid topology, the algorithm simulates the features of mountain peak ecology in nature and the competitive-cooperative strategies among species. According to the principles of the algorithm, the population is first adaptively divided into many subgroups based on the fitness level of particles. Then, particles within each subgroup are divided into three different types based on their evolutionary levels, employing different adaptive inertia weight rules and dynamic learning mechanisms to define distinct learning modes. Consequently, all particles play their respective roles in promoting the global optimization performance of the algorithm, similar to different species in the ecological pattern of mountain peaks. Experimental validation of PEPSO's performance was conducted on 18 public datasets. The experimental results demonstrate that PEPSO outperforms other PSO variant-based feature selection methods and mainstream feature selection methods based on intelligent optimization algorithms in terms of overall performance in global search capability, classification accuracy, and reduction of feature space dimensions. The Wilcoxon signed-rank test also confirms the excellent performance of PEPSO.
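For reference, the baseline that PEPSO builds on can be sketched as a plain binary PSO for feature selection with a linearly decreasing inertia weight; the peak-ecology subgrouping, particle types, and dynamic learning mechanisms are not reproduced, and the dataset and parameters are illustrative.

```python
# Baseline binary PSO for feature selection (the starting point that PEPSO refines with
# subgroups, particle types, and adaptive learning modes). Fitness is k-NN cross-validated
# accuracy minus a small penalty on the number of selected features.
import numpy as np
from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import cross_val_score
from sklearn.neighbors import KNeighborsClassifier

rng = np.random.default_rng(0)
X, y = load_breast_cancer(return_X_y=True)
D, P, ITERS = X.shape[1], 20, 20

def fitness(mask):
    if mask.sum() == 0:
        return 0.0
    acc = cross_val_score(KNeighborsClassifier(5), X[:, mask == 1], y, cv=3).mean()
    return acc - 0.01 * mask.sum() / D            # favour smaller feature subsets

vel = np.zeros((P, D))
masks = rng.integers(0, 2, (P, D))
fits = np.array([fitness(m) for m in masks])
pbest, pbest_f = masks.copy(), fits.copy()
gbest = pbest[pbest_f.argmax()].copy()

for it in range(ITERS):
    w = 0.9 - 0.5 * it / ITERS                    # linearly decreasing inertia weight
    r1, r2 = rng.random((P, D)), rng.random((P, D))
    vel = w * vel + 2.0 * r1 * (pbest - masks) + 2.0 * r2 * (gbest - masks)
    prob = 1.0 / (1.0 + np.exp(-vel))             # sigmoid transfer function
    masks = (rng.random((P, D)) < prob).astype(int)
    fits = np.array([fitness(m) for m in masks])
    improved = fits > pbest_f
    pbest[improved], pbest_f[improved] = masks[improved], fits[improved]
    gbest = pbest[pbest_f.argmax()].copy()

print("selected features:", int(gbest.sum()), "best fitness:", round(pbest_f.max(), 4))
```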
Funding: The National Natural Science Foundation of China (Nos. 72001107, 72271120); the Fundamental Research Funds for the Central Universities (Nos. NS2024047, NP2024106); the China Postdoctoral Science Foundation (Nos. 2020T130297, 2019M660119).
Abstract: The presence of circles in the network maximum flow problem increases the complexity of the preflow algorithm. This study proposes a novel two-stage preflow algorithm to address this issue. First, this study proves that at least one zero-flow arc must be present when the flow of the network reaches its maximum value. This result indicates that the maximum flow of the network will remain constant if a zero-flow arc within a circle is removed; therefore, the maximum flow of each network without circles can be calculated. The first stage involves identifying the zero-flow arc in the circle when the network flow reaches its maximum. The second stage aims to remove the zero-flow arc identified and modified in the first stage, thereby producing a new network without circles. The maximum flow of the original looped network can be obtained by solving the maximum flow of the newly generated acyclic network. Finally, an example is provided to demonstrate the validity and feasibility of this algorithm. This algorithm not only improves computational efficiency but also provides new perspectives and tools for solving similar network optimization problems.
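The central observation can be illustrated with a small example; the sketch below uses networkx's max-flow routine as a stand-in for the preflow algorithm and shows that deleting a zero-flow arc on a circle leaves the maximum flow unchanged. The network and capacities are invented for illustration.

```python
# Illustration of the key observation: a zero-flow arc lying on a circle can be removed
# without changing the maximum flow, so a looped network can be reduced to an acyclic one.
# networkx's max-flow routine stands in for the paper's two-stage preflow algorithm.
import networkx as nx

G = nx.DiGraph()
arcs = [("s", "a", 10), ("a", "b", 2), ("b", "a", 3),   # a <-> b form a circle
        ("a", "t", 8), ("b", "t", 5), ("s", "b", 3)]
for u, v, cap in arcs:
    G.add_edge(u, v, capacity=cap)

value, flow = nx.maximum_flow(G, "s", "t")

# Stage 1 (illustrative): find a zero-flow arc that lies on a circle.
zero_arcs = [(u, v) for u, v in G.edges if flow[u][v] == 0]
u, v = next((u, v) for u, v in zero_arcs if nx.has_path(G, v, u))  # a v -> u path closes the circle

# Stage 2: remove it and re-solve on the reduced (acyclic) network.
H = G.copy()
H.remove_edge(u, v)
value2, _ = nx.maximum_flow(H, "s", "t")
print(f"max flow {value}; after removing zero-flow arc {(u, v)}: {value2} (unchanged: {value == value2})")
```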
Abstract: With the continuous growth of power demand and the diversification of the power consumption structure, the loss of the distribution network has gradually become a focus of attention. Given the problems in existing research of relying on a single loss reduction measure and lacking economy and practicality, this paper proposes an optimization method for distribution network loss reduction based on the tabu search algorithm and optimizes the combination and parameter configuration of loss reduction measures. The optimization model is developed with the goal of maximizing comprehensive benefits, incorporating both economic and environmental factors and accounting for investment costs, including the loss of power reduction. Additionally, the model ensures that constraint conditions such as power flow equations, voltage deviations, and line transmission capacities are satisfied. The solution is obtained through a tabu search algorithm, which is well suited for solving nonlinear problems with multiple constraints. Combined with the example of a 10 kV, 25-node network, the simulation results show that the method can significantly reduce network loss while ensuring the economy and environmental protection of the system, which provides a theoretical basis for distribution network planning.
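A generic tabu search over a binary vector of candidate loss-reduction measures conveys the solution procedure; the benefit, cost, and budget figures below are toy stand-ins for the paper's comprehensive objective and power-flow constraints.

```python
# Generic tabu search over combinations of loss-reduction measures (reactive power
# compensation, reconductoring, transformer replacement, ...). The benefit function is a
# toy stand-in for the paper's comprehensive economic/environmental objective.
import random

MEASURES = 8                            # number of candidate measures
BENEFIT  = [12, 7, 9, 4, 15, 6, 8, 5]   # assumed annual benefit per measure
COST     = [10, 3, 6, 2, 14, 5, 7, 4]   # assumed investment cost per measure
BUDGET   = 25

def objective(x):
    cost = sum(c for c, xi in zip(COST, x) if xi)
    if cost > BUDGET:                   # infeasible: violates the investment constraint
        return -1e9
    return sum(b for b, xi in zip(BENEFIT, x) if xi) - 0.2 * cost

def tabu_search(iters=200, tenure=5, seed=0):
    random.seed(seed)
    x = [random.randint(0, 1) for _ in range(MEASURES)]
    best, best_f = x[:], objective(x)
    tabu = {}                           # move (bit index) -> iteration until which it is tabu
    for it in range(iters):
        moves = []
        for i in range(MEASURES):       # evaluate all single-bit-flip neighbours
            y = x[:]; y[i] ^= 1
            f = objective(y)
            # Aspiration: accept a tabu move if it beats the global best.
            if tabu.get(i, -1) < it or f > best_f:
                moves.append((f, i, y))
        f, i, x = max(moves)            # best admissible neighbour (may be worse than current)
        tabu[i] = it + tenure           # forbid flipping the same bit for `tenure` iterations
        if f > best_f:
            best, best_f = x[:], f
    return best, best_f

print(tabu_search())
```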
Abstract: The widespread adoption of cloud computing has underscored the critical importance of efficient resource allocation and management, particularly in task scheduling, which involves assigning tasks to computing resources for optimized resource utilization. Several meta-heuristic algorithms have shown effectiveness in task scheduling, among which the relatively recent Willow Catkin Optimization (WCO) algorithm has demonstrated potential, albeit with apparent needs for enhanced global search capability and convergence speed. To address these limitations of WCO in cloud computing task scheduling, this paper introduces an improved version termed the Advanced Willow Catkin Optimization (AWCO) algorithm. AWCO enhances the algorithm's performance by augmenting its global search capability through a quasi-opposition-based learning strategy and accelerating its convergence speed via sinusoidal mapping. A comprehensive evaluation utilizing the CEC2014 benchmark suite, comprising 30 test functions, demonstrates that AWCO achieves superior optimization outcomes, surpassing conventional WCO and a range of established meta-heuristics. The proposed algorithm also considers trade-offs among the cost, makespan, and load-balancing objectives. Experimental results of AWCO are compared with those obtained using the other meta-heuristics, illustrating that the proposed algorithm provides superior performance in task scheduling. The method offers a robust foundation for enhancing the utilization of cloud computing resources in the domain of task scheduling.
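The two enhancements named for AWCO, quasi-opposition-based learning and sinusoidal mapping, are standard components and can be shown in isolation; the constants used (for example, the map coefficient 2.3) follow common usage and are not necessarily the paper's values.

```python
# The two enhancements described for AWCO, in isolation: quasi-opposition-based learning
# (QOBL) for population diversity, and a sinusoidal chaotic map often used to replace
# uniform random sequences. Constants follow common usage.
import numpy as np

def quasi_opposite(x, lb, ub, rng):
    """Quasi-opposite point: uniform between the interval centre and the opposite point."""
    centre = (lb + ub) / 2.0
    opposite = lb + ub - x
    lo, hi = np.minimum(centre, opposite), np.maximum(centre, opposite)
    return rng.uniform(lo, hi)

def sinusoidal_map(x0=0.7, a=2.3, n=10):
    """Sinusoidal chaotic map x_{k+1} = a * x_k^2 * sin(pi * x_k); values stay in (0, 1)."""
    xs = [x0]
    for _ in range(n - 1):
        xs.append(a * xs[-1] ** 2 * np.sin(np.pi * xs[-1]))
    return np.array(xs)

rng = np.random.default_rng(42)
lb, ub = np.zeros(5), np.ones(5) * 10
x = rng.uniform(lb, ub)
print("candidate      :", np.round(x, 3))
print("quasi-opposite :", np.round(quasi_opposite(x, lb, ub, rng), 3))
print("chaotic weights:", np.round(sinusoidal_map(), 3))
```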
Abstract: Evolutionary algorithms have been extensively utilized in practical applications. However, manually designed population updating formulas are inherently prone to the subjective influence of the designer. Genetic programming (GP), characterized by its tree-based solution structure, is a widely adopted technique for optimizing the structure of mathematical models tailored to real-world problems. This paper introduces a GP-based framework (GP-EAs) for the autonomous generation of update formulas, aiming to reduce human intervention. Partial modifications to tree-based GP have been made, encompassing adjustments to its initialization process and fundamental update operations such as crossover and mutation within the algorithm. Suitable function sets and terminal sets are designed for the selected evolutionary algorithm to ultimately derive an improved update formula. The Cat Swarm Optimization Algorithm (CSO) is chosen as a case study, and GP-EAs is employed to regenerate the speed update formulas of the CSO. To validate the feasibility of GP-EAs, the comprehensive performance of the enhanced algorithm (GP-CSO) was evaluated on the CEC2017 benchmark suite. Furthermore, GP-CSO is applied to deduce suitable embedding factors, thereby improving the robustness of the digital watermarking process. The experimental results indicate that the update formulas generated through training with GP-EAs possess excellent performance scalability and practical application proficiency.
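A minimal tree-based GP representation makes the idea concrete: random expression trees over an assumed function set and terminal set (velocity, position, personal and global best, random constants) that could stand in for a velocity-update formula; crossover, mutation, and the CSO coupling are omitted.

```python
# Minimal tree-based GP ingredients for evolving an update formula: a function set,
# a terminal set (velocity, position, best positions, a random constant), random tree
# generation, and evaluation. Crossover/mutation and the CSO coupling are omitted.
import operator, random

FUNCS = [("add", operator.add), ("sub", operator.sub), ("mul", operator.mul)]
TERMS = ["v", "x", "pbest", "gbest", "r"]       # r becomes a fresh random constant

def random_tree(depth=3, rng=random):
    if depth == 0 or rng.random() < 0.3:
        t = rng.choice(TERMS)
        return ("const", rng.uniform(0, 2)) if t == "r" else ("var", t)
    name, fn = rng.choice(FUNCS)
    return (name, fn, random_tree(depth - 1, rng), random_tree(depth - 1, rng))

def evaluate(tree, env):
    kind = tree[0]
    if kind == "const":
        return tree[1]
    if kind == "var":
        return env[tree[1]]
    _, fn, left, right = tree
    return fn(evaluate(left, env), evaluate(right, env))

def show(tree):
    if tree[0] == "const":
        return f"{tree[1]:.2f}"
    if tree[0] == "var":
        return tree[1]
    return f"({show(tree[2])} {tree[0]} {show(tree[3])})"

random.seed(4)
formula = random_tree()
env = dict(v=0.5, x=1.0, pbest=1.4, gbest=2.0)  # one particle's state
print("candidate velocity update:", show(formula), "=", round(evaluate(formula, env), 3))
```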
Funding: Funded by the Ministry of Science and Higher Education of the Republic of Kazakhstan, grant numbers AP14969403 and AP23485820.
Abstract: Myocardial infarction (MI) is one of the leading causes of death globally among cardiovascular diseases, necessitating modern and accurate diagnostics for cardiac patient conditions. Among the available functional diagnostic methods, electrocardiography (ECG) is particularly well known for its ability to detect MI. However, confirming its accuracy, particularly in identifying the localization of myocardial damage, often presents challenges in practice. This study therefore proposes a new approach based on machine learning models for the analysis of 12-lead ECG data to accurately identify the localization of MI. In particular, the learning vector quantization (LVQ) algorithm was applied, considering the contribution of each ECG lead in the 12-channel system, which achieved an accuracy of 87% in localizing damaged myocardium. The developed model was tested on verified data from the PTB database, including 445 ECG recordings from both healthy individuals and MI-diagnosed patients. The results demonstrated that the 12-lead ECG system allows for a comprehensive understanding of cardiac activity in myocardial infarction patients, serving as an essential tool for diagnosing myocardial conditions and localizing their damage. A comprehensive comparison was performed, including CNN, SVM, and Logistic Regression, to evaluate the proposed LVQ model. The results demonstrate that the LVQ model achieves competitive performance in diagnostic tasks while maintaining computational efficiency, making it suitable for resource-constrained environments. This study also applies a carefully designed data pre-processing flow, including class balancing and noise removal, which improves the reliability and reproducibility of the results. These aspects highlight the potential application of the LVQ model in cardiac diagnostics, opening up prospects for its use alongside more complex neural network architectures.
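A compact LVQ1 implementation conveys the core update rule, pulling the best-matching prototype toward same-class samples and pushing it away otherwise; the synthetic 12-dimensional inputs stand in for per-lead ECG features, and the PTB preprocessing, class balancing, and per-lead weighting are not reproduced.

```python
# Plain LVQ1: prototypes are pulled toward samples of their own class and pushed away
# from samples of other classes. Inputs stand in for 12-lead ECG feature vectors.
import numpy as np

def train_lvq1(X, y, prototypes_per_class=2, lr=0.05, epochs=30, seed=0):
    rng = np.random.default_rng(seed)
    classes = np.unique(y)
    # Initialize prototypes from random samples of each class.
    protos, labels = [], []
    for c in classes:
        idx = rng.choice(np.where(y == c)[0], prototypes_per_class, replace=False)
        protos.append(X[idx]); labels.extend([c] * prototypes_per_class)
    W, Wy = np.vstack(protos), np.array(labels)
    for epoch in range(epochs):
        a = lr * (1 - epoch / epochs)                         # decaying learning rate
        for i in rng.permutation(len(X)):
            j = np.argmin(np.linalg.norm(W - X[i], axis=1))   # best-matching prototype
            W[j] += a * (X[i] - W[j]) if Wy[j] == y[i] else -a * (X[i] - W[j])
    return W, Wy

def predict(W, Wy, X):
    d = np.linalg.norm(X[:, None, :] - W[None, :, :], axis=2)
    return Wy[d.argmin(axis=1)]

rng = np.random.default_rng(1)
# Synthetic 12-feature vectors (one aggregate feature per ECG lead), 3 MI localizations.
X = np.vstack([rng.normal(m, 1.0, (80, 12)) for m in (0, 2, 4)])
y = np.repeat([0, 1, 2], 80)
W, Wy = train_lvq1(X, y)
print("training accuracy:", (predict(W, Wy, X) == y).mean())
```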
Funding: Supported by the National Natural Science Foundation of China under Grant Nos. U21A20464 and 62066005, and the Innovation Project of Guangxi Graduate Education under Grant No. YCSW2024313.
Abstract: Wireless sensor network deployment optimization is a classic NP-hard problem and a popular topic in academic research. However, current research on wireless sensor network deployment problems uses overly simplistic models, and there is a significant gap between the research results and actual wireless sensor networks. Some scholars have now modeled data fusion networks to make them more suitable for practical applications. This paper explores the deployment problem of a stochastic data fusion wireless sensor network (SDFWSN), a model that reflects the randomness of environmental monitoring and uses the data fusion techniques widely employed in actual sensor networks for information collection. The deployment problem of the SDFWSN is modeled as a multi-objective optimization problem. The network life cycle, spatiotemporal coverage, detection rate, and false alarm rate of the SDFWSN are used as optimization objectives to optimize the deployment of network nodes. This paper proposes an enhanced multi-objective mongoose optimization algorithm (EMODMOA) to solve the deployment problem of the SDFWSN. First, to overcome the shortcomings of the DMOA algorithm, such as its low convergence and tendency to get stuck in a local optimum, an encircling and hunting strategy is introduced into the original algorithm to propose the EDMOA algorithm. The EDMOA algorithm is extended to the EMODMOA algorithm by selecting reference points using the K-Nearest Neighbor (KNN) algorithm. To verify the effectiveness of the proposed algorithm, the EMODMOA algorithm was tested on the CEC 2020 benchmark suite and achieved good results. For the SDFWSN deployment problem, the algorithm was compared with the Non-dominated Sorting Genetic Algorithm II (NSGA-II), Multiple Objective Particle Swarm Optimization (MOPSO), the Multi-Objective Evolutionary Algorithm based on Decomposition (MOEA/D), and the Multi-Objective Grey Wolf Optimizer (MOGWO). By comparing and analyzing the performance evaluation metrics and the optimization results of the objective functions of the multi-objective algorithms, the proposed algorithm outperforms the other algorithms in the SDFWSN deployment results. To better demonstrate its superiority, simulations of diverse test cases were also performed, and good results were obtained.
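The multi-objective bookkeeping such a deployment optimizer relies on can be sketched with a Pareto dominance test and extraction of the non-dominated front over (lifetime, coverage, detection rate, false-alarm rate) vectors; the objective values below are synthetic.

```python
# Multi-objective bookkeeping used by EMODMOA-style deployment optimizers: a Pareto
# dominance test and extraction of the non-dominated front. Objectives are synthetic
# (lifetime, coverage, and detection rate to maximize; false-alarm rate to minimize).
import numpy as np

def dominates(a, b, maximize):
    """True if solution a Pareto-dominates b: at least as good on all objectives and strictly
    better on at least one, after flipping the sign of minimized objectives."""
    a = np.where(maximize, a, -a)
    b = np.where(maximize, b, -b)
    return np.all(a >= b) and np.any(a > b)

def non_dominated_front(F, maximize):
    front = []
    for i, fi in enumerate(F):
        if not any(dominates(fj, fi, maximize) for j, fj in enumerate(F) if j != i):
            front.append(i)
    return front

rng = np.random.default_rng(5)
# Columns: network lifetime, spatiotemporal coverage, detection rate, false-alarm rate.
F = rng.random((40, 4))
maximize = np.array([True, True, True, False])
idx = non_dominated_front(F, maximize)
print(f"{len(idx)} non-dominated deployments out of {len(F)}:", idx[:8], "...")
```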
Funding: National Key Research and Development Program (2021YFB2900604).
Abstract: Low Earth orbit (LEO) satellite networks exhibit distinct characteristics, e.g., limited resources of individual satellite nodes and a dynamic network topology, which have brought many challenges for routing algorithms. To satisfy the quality of service (QoS) requirements of various users, it is critical to research efficient routing strategies that fully utilize satellite resources. This paper proposes a multi-QoS information optimized routing algorithm based on reinforcement learning for LEO satellite networks. Under limited satellite resources, it guarantees that services with high-level assurance demands are prioritized, while considering the load-balancing performance of the satellite network for services with low-level assurance demands to ensure the full and effective utilization of satellite resources. An auxiliary path search algorithm is proposed to accelerate the convergence of the satellite routing algorithm. Simulation results show that the generated routing strategy can promptly process and fully meet the QoS demands of high-assurance services while effectively improving the load-balancing performance of the links.
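As a toy illustration of QoS-aware route learning, the sketch below runs tabular Q-learning on a small invented satellite graph where the reward mixes link delay with residual load, so learned routes trade path length against load balancing; the paper's auxiliary path search and assurance-level prioritization are not reproduced.

```python
# Toy Q-learning for next-hop selection on a small static graph: the reward mixes
# (negative) link delay with link load, so learned routes balance path length against
# congestion. Topology, delays, and loads are invented for illustration.
import random

NEIGHBORS = {0: [1, 2], 1: [0, 3], 2: [0, 3], 3: [1, 2, 4], 4: [3]}
DELAY     = {(0, 1): 3, (0, 2): 1, (1, 3): 1, (2, 3): 4, (3, 4): 1}
LOAD      = {(0, 1): .2, (0, 2): .8, (1, 3): .3, (2, 3): .5, (3, 4): .4}   # utilization in [0, 1]
for (u, v) in list(DELAY):                        # links are bidirectional
    DELAY[(v, u)], LOAD[(v, u)] = DELAY[(u, v)], LOAD[(u, v)]

SRC, DST, ALPHA, GAMMA, EPS = 0, 4, 0.1, 0.9, 0.2
Q = {(n, m): 0.0 for n in NEIGHBORS for m in NEIGHBORS[n]}

def reward(u, v):
    # Weighted QoS reward: penalize delay and congested links; bonus on reaching DST.
    r = -DELAY[(u, v)] - 5.0 * LOAD[(u, v)]
    return r + 20.0 if v == DST else r

random.seed(0)
for episode in range(2000):
    node = SRC
    while node != DST:
        nxt = (random.choice(NEIGHBORS[node]) if random.random() < EPS
               else max(NEIGHBORS[node], key=lambda m: Q[(node, m)]))
        target = 0.0 if nxt == DST else max(Q[(nxt, m)] for m in NEIGHBORS[nxt])
        Q[(node, nxt)] += ALPHA * (reward(node, nxt) + GAMMA * target - Q[(node, nxt)])
        node = nxt

# Greedy route extracted from the learned Q-table.
route, node = [SRC], SRC
while node != DST:
    node = max(NEIGHBORS[node], key=lambda m: Q[(node, m)])
    route.append(node)
print("learned route:", route)
```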
Abstract: Objective To study the key technologies in the field of ginsenosides and to offer a guide for the future development of ginsenosides through the main path identification method based on the genetic knowledge persistence algorithm (GKPA). Methods The global ginsenoside invention authorized patents were used as the data source to construct a ginsenoside patent self-citation network, to identify high knowledge persistence patents (HKPP) of ginsenoside technology based on the GKPA, and to extract its high knowledge persistence main path (HKPMP). Finally, the genetic forward and backward path (GFBP) was used to search the nodes on the main path and draw the genetic forward and backward main path (GFBMP) of ginsenoside technology. Results and Conclusion The algorithm was applied to the field of ginsenosides. The research results show the milestone patents in ginsenoside technology and the main evolution process of three key technologies, pointing out the future direction for the technological development of ginsenosides. The results obtained by this algorithm are more interpretable, comprehensive, and scientific.
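GKPA itself is not described in enough detail here to reproduce, so the sketch below shows a generic main-path analysis on an invented patent self-citation network: arcs are weighted by search path count (SPC) and the forward main path follows the heaviest arcs. This is a conventional stand-in for illustration, not the genetic knowledge-persistence algorithm.

```python
# Generic main-path analysis on a toy patent self-citation DAG as a stand-in for GKPA:
# arcs are weighted by search path count (SPC), and the main path greedily follows the
# heaviest arc from a source. Patent IDs and citations are invented for illustration.
import networkx as nx

G = nx.DiGraph()
# Edge u -> v means "later patent v cites earlier patent u" (knowledge flows u -> v).
G.add_edges_from([("P1", "P3"), ("P2", "P3"), ("P3", "P4"), ("P3", "P5"),
                  ("P4", "P6"), ("P5", "P6"), ("P6", "P7"), ("P2", "P5")])

def spc_weights(G):
    """SPC of arc (u, v) = (#source-to-u paths) * (#v-to-sink paths)."""
    order = list(nx.topological_sort(G))
    n_from_source = {v: 1 if G.in_degree(v) == 0 else 0 for v in G}
    for v in order:
        for _, w in G.out_edges(v):
            n_from_source[w] += n_from_source[v]
    n_to_sink = {v: 1 if G.out_degree(v) == 0 else 0 for v in G}
    for v in reversed(order):
        for u, _ in G.in_edges(v):
            n_to_sink[u] += n_to_sink[v]
    return {(u, v): n_from_source[u] * n_to_sink[v] for u, v in G.edges}

w = spc_weights(G)
# Forward main path: start from the heaviest arc leaving a source, then follow the
# heaviest outgoing arc until a sink is reached.
sources = [v for v in G if G.in_degree(v) == 0]
u, v = max(((s, t) for s in sources for t in G.successors(s)), key=lambda e: w[e])
path = [u, v]
while G.out_degree(path[-1]) > 0:
    path.append(max(G.successors(path[-1]), key=lambda t: w[(path[-1], t)]))
print("main path:", " -> ".join(path))
```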