Funding: Supported by the Research on the Key Technology of Damage Identification Method of Dam Concrete Structure Based on Transformer Image Processing (242102521031), the project Research on Situational Awareness and Behavior Anomaly Prediction of Social Media Based on Multimodal Time Series Graph (232102520004), and the Key Scientific Research Project of Higher Education Institutions in Henan Province (25B520019).
Abstract: Image enhancement utilizes intensity transformation functions to maximize the information content of enhanced images. This paper approaches the topic as an optimization problem and uses the bald eagle search (BES) algorithm to achieve optimal results. In our proposed model, gamma correction and Retinex address color cast issues and enhance image edges and details. The final enhanced image is obtained through color balancing. The BES algorithm seeks the optimal solution through the selection, search, and swooping stages. However, it is prone to getting stuck in local optima and converges slowly. To overcome these limitations, we propose an improved BES algorithm (ABES) with enhanced population learning, position updates, and control parameters. ABES is employed to optimize the core parameters of gamma correction and Retinex to improve image quality, and the maximization of information entropy is utilized as the objective function. Real benchmark images are collected to validate its performance. Experimental results demonstrate that ABES outperforms existing image enhancement methods, including the flower pollination algorithm, the chimp optimization algorithm, particle swarm optimization, and BES, in terms of information entropy, peak signal-to-noise ratio (PSNR), structural similarity index (SSIM), and patch-based contrast quality index (PCQI). ABES demonstrates superior performance both qualitatively and quantitatively, enhancing prominent features and contrast while maintaining the natural appearance of the original images.
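As a rough illustration of the objective this abstract describes — maximizing the information entropy of an intensity-transformed image — here is a minimal Python sketch. The gamma-correction form, the search range, and the random search standing in for BES/ABES are assumptions for illustration only; the paper also tunes Retinex parameters, which are omitted here.

```python
import numpy as np

def shannon_entropy(img_u8: np.ndarray) -> float:
    """Shannon entropy (bits) of an 8-bit grayscale image."""
    hist = np.bincount(img_u8.ravel(), minlength=256).astype(float)
    p = hist / hist.sum()
    p = p[p > 0]
    return float(-(p * np.log2(p)).sum())

def gamma_correct(img_u8: np.ndarray, gamma: float) -> np.ndarray:
    """Classic power-law intensity transform on a normalized image."""
    x = img_u8.astype(float) / 255.0
    return np.clip(255.0 * x ** gamma, 0, 255).astype(np.uint8)

def objective(gamma: float, img_u8: np.ndarray) -> float:
    """Fitness to maximize: entropy of the gamma-corrected image."""
    return shannon_entropy(gamma_correct(img_u8, gamma))

# Tiny stand-in for the metaheuristic: random search over gamma.
rng = np.random.default_rng(0)
img = rng.integers(0, 256, size=(64, 64), dtype=np.uint8)  # placeholder image
candidates = rng.uniform(0.2, 3.0, size=200)               # assumed search range
best = max(candidates, key=lambda g: objective(g, img))
print(f"best gamma = {best:.3f}, entropy = {objective(best, img):.3f} bits")
```

In the paper's setting, the scalar `gamma` would be replaced by the full parameter vector of gamma correction and Retinex, and the random search by the ABES population update.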
Abstract: The widespread adoption of cloud computing has underscored the critical importance of efficient resource allocation and management, particularly in task scheduling, which involves assigning tasks to computing resources for optimized resource utilization. Several meta-heuristic algorithms have shown effectiveness in task scheduling, among which the relatively recent Willow Catkin Optimization (WCO) algorithm has demonstrated potential, albeit with a clear need for stronger global search capability and faster convergence. To address these limitations of WCO in cloud computing task scheduling, this paper introduces an improved version termed the Advanced Willow Catkin Optimization (AWCO) algorithm. AWCO augments the algorithm's global search capability through a quasi-opposition-based learning strategy and accelerates its convergence via sinusoidal mapping. A comprehensive evaluation on the CEC2014 benchmark suite, comprising 30 test functions, demonstrates that AWCO achieves superior optimization outcomes, surpassing conventional WCO and a range of established meta-heuristics. The proposed algorithm also considers trade-offs among the cost, makespan, and load-balancing objectives. Experimental results show that AWCO outperforms the other meta-heuristics in task scheduling, offering a robust foundation for improving the utilization of cloud computing resources.
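The quasi-opposition-based learning strategy mentioned above is easy to sketch. The recipe below — sample quasi-opposite points uniformly between the search-space center and each individual's opposite point, then keep the fittest half of the combined pool — is one common QOBL formulation, not necessarily AWCO's exact one.

```python
import numpy as np

def quasi_opposite(pop: np.ndarray, lo: np.ndarray, hi: np.ndarray,
                   rng: np.random.Generator) -> np.ndarray:
    """Quasi-opposite points: uniform samples between the search-space
    center c and the opposite point (lo + hi - x) of each individual."""
    c = (lo + hi) / 2.0
    opposite = lo + hi - pop
    low = np.minimum(c, opposite)
    high = np.maximum(c, opposite)
    return rng.uniform(low, high)

def qobl_init(n: int, lo: np.ndarray, hi: np.ndarray, fitness,
              rng: np.random.Generator) -> np.ndarray:
    """Initialize with n random points plus their quasi-opposites,
    then keep the n fittest (minimization)."""
    pop = rng.uniform(lo, hi, size=(n, lo.size))
    cand = np.vstack([pop, quasi_opposite(pop, lo, hi, rng)])
    scores = np.apply_along_axis(fitness, 1, cand)
    return cand[np.argsort(scores)[:n]]

rng = np.random.default_rng(1)
lo, hi = np.full(5, -10.0), np.full(5, 10.0)
sphere = lambda x: float(np.sum(x * x))       # toy objective
pop = qobl_init(30, lo, hi, sphere, rng)
print("best initial fitness:", sphere(pop[0]))
```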
Abstract: BACKGROUND: Eyelid reconstruction is an intricate process, addressing both aesthetic and functional aspects post-trauma or oncological surgery. Aesthetic concerns and oncological radicality guide personalized approaches. The complex anatomy, involving the anterior and posterior lamellae, requires tailored reconstruction for optimal functionality. AIM: To formulate an eyelid reconstruction algorithm through an extensive literature review and to validate it by juxtaposing surgical outcomes from Cattinara Hos-… in dry eye and tears, which may lead to long-term consequences such as chronic conjunctivitis, discomfort, or photophobia. To prevent this issue, scars should be oriented vertically or perpendicularly to the free eyelid margin when the size of the tumor allows. In employing a malar flap to repair a lower eyelid defect, the malar incision must ascend diagonally; this facilitates enhanced flap advancement and mitigates ectropion by restricting vertical traction. Consequently, it is imperative that the generated tension remains consistently horizontal and never vertical[9]. Lagophthalmos is a disorder characterized by the inability to completely close the eyelids, leading to corneal exposure and an increased risk of keratitis or ulceration; it may arise following upper eyelid surgery. To avert this issue, it is essential to preserve a minimum of 1 cm of skin between the superior edge of the excision and the inferior boundary of the eyebrow. Epiphora may occur in cancers involving the lacrimal puncta, requiring their removal. As previously stated, when employing a glabellar flap to rectify medial canthal abnormalities, it is essential to prevent a trapdoor effect or thickening of the flap relative to the eyelid skin to which it is affixed. Constraints of our proposed algorithm include limited sample sizes and possible publication biases in existing studies. Subsequent investigations ought to examine long-term results to further refine the algorithm. Future research should evaluate the algorithm across varied populations and examine the impact of novel graft materials on enhancing reconstructive outcomes. CONCLUSION: Eyelid reconstruction remains one of the most intriguing challenges for a plastic surgeon today. The most fascinating aspect of this discipline is the need to restore the functionality of such an essential structure while maintaining its aesthetics. In our opinion, creating decision-making algorithms can facilitate reaching this goal by allowing for the individualization of the reconstructive path while minimizing the incidence of complications. The fact that we have decreased the incidence of severe complications is a sign that the work is moving in the right direction. The fact that there has been no need for reintervention, neither for reconstructive issues nor for inadequate oncological radicality, overall signifies greater patient satisfaction, as patients do not have to undergo the stress of new surgeries. Even the minor complications recorded are in line with those reported in the literature and, even more importantly for patients, are of limited duration. In our experience, after a year of application, we can say that the objective has been achieved, but much more can still be done. Behind every work, a scientific basis must be continually renewed and refreshed to maintain high-quality standards. Therefore, searching for possible alternative solutions to be included in one's surgical armamentarium is fundamental to providing the patient with a fully personalized option.
Funding: Supported by the National Natural Science Foundation of China (62173333, 12271522), the Beijing Natural Science Foundation (Z210002), and the Research Fund of Renmin University of China (2021030187).
Abstract: For unachievable tracking problems, where the system output cannot precisely track a given reference, achieving the best possible approximation of the reference trajectory becomes the objective. This study investigates solutions using the P-type learning control scheme. Initially, we demonstrate the necessity of gradient information for achieving the best approximation. Subsequently, we propose an input-output-driven learning gain design to handle the imprecise gradients of a class of uncertain systems. However, it is discovered that the desired performance may not be attainable when faced with incomplete information. To address this issue, an extended iterative learning control scheme is introduced. In this scheme, the tracking errors are modified through output data sampling, which incorporates a low-memory footprint and offers flexibility in learning gain design. The input sequence is shown to converge towards the desired input, resulting in an output that is closest to the given reference in the least-squares sense. Numerical simulations are provided to validate the theoretical findings.
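A minimal sketch of a P-type learning update on a toy first-order system may help fix ideas; the plant, the learning gain, and the reference below are illustrative assumptions, not the paper's uncertain-system setting or its gain design.

```python
import numpy as np

# Plant: stable discrete SISO system x_{t+1} = a*x_t + b*u_t, y_t = c*x_t.
a, b, c = 0.9, 0.5, 1.0
N = 50
ref = np.sin(np.linspace(0.0, 2.0 * np.pi, N))   # reference trajectory

def run_trial(u: np.ndarray) -> np.ndarray:
    x, y = 0.0, np.zeros(N)
    for t in range(N):
        y[t] = c * x
        x = a * x + b * u[t]
    return y

gamma = 0.8 / (c * b)      # P-type gain; convergence needs |1 - gamma*c*b| < 1
u = np.zeros(N)
for k in range(60):        # learning iterations (repeated trials)
    e = ref - run_trial(u)
    e_shift = np.append(e[1:], 0.0)   # u(t) influences y(t+1): shift the error
    u = u + gamma * e_shift           # P-type update u_{k+1} = u_k + Γ e_k(·+1)

print("final tracking error norm:", np.linalg.norm(ref - run_trial(u)))
```

When the reference is unachievable, this error norm plateaus at a nonzero value; the paper's extended scheme characterizes that plateau as the least-squares-closest output.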
Abstract: By analyzing the correlation between courses in students' grades, we can provide a decision-making basis for the revision of courses and syllabi, rationally optimize courses, and further improve teaching effects. With the help of IBM SPSS Modeler data mining software, this paper uses the Apriori algorithm for association rule mining to conduct an in-depth analysis of the grades of nursing students at Shandong College of Traditional Chinese Medicine and to explore the correlation between professional basic courses and professional core courses. Finally, through a detailed analysis of the mining results, valuable curriculum information is extracted from the actual teaching data.
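For readers unfamiliar with Apriori, a self-contained toy version is sketched below; the course names and support threshold are illustrative, not the paper's dataset or settings.

```python
from itertools import combinations

# Toy transactions: courses in which each student scored "high".
transactions = [
    {"Anatomy", "Physiology", "Nursing Fundamentals"},
    {"Anatomy", "Physiology"},
    {"Physiology", "Nursing Fundamentals"},
    {"Anatomy", "Physiology", "Nursing Fundamentals"},
    {"Anatomy", "Nursing Fundamentals"},
]

def apriori(transactions, min_support=0.4):
    """Return {frozenset: support} for all frequent itemsets."""
    n = len(transactions)
    level = {frozenset([i]) for t in transactions for i in t}
    freq = {}
    while level:
        counts = {s: sum(s <= t for t in transactions) for s in level}
        level_freq = {s: c / n for s, c in counts.items() if c / n >= min_support}
        freq.update(level_freq)
        # Candidate generation: join frequent k-sets into (k+1)-sets.
        keys = list(level_freq)
        level = {a | b for a, b in combinations(keys, 2) if len(a | b) == len(a) + 1}
    return freq

for s, sup in sorted(apriori(transactions).items(), key=lambda kv: -kv[1]):
    print(sorted(s), f"support={sup:.2f}")
```

The confidence of a rule A→B is then support(A∪B)/support(A), which is what links performance in a basic course to performance in a core course in the kind of analysis described above.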
Abstract: This paper studies a strongly convergent inertial forward-backward-forward algorithm for the variational inequality problem in Hilbert spaces. In our convergence analysis, we do not assume the on-line rule for the inertial parameters and the iterates, which has been assumed by several authors whenever a strongly convergent algorithm with an inertial extrapolation step is proposed for a variational inequality problem. Consequently, our proof arguments differ from those in the relevant literature. Finally, we give numerical tests to confirm the theoretical analysis and show that our proposed algorithm is superior to related ones in the literature.
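The classical forward-backward-forward (Tseng) core step underlying such algorithms can be sketched compactly; the inertial extrapolation and strong-convergence modifications studied in the paper are omitted here, and the box constraint and affine monotone operator are assumptions for illustration.

```python
import numpy as np

def project_box(x, lo, hi):
    """Projection onto the box [lo, hi] -- the feasible set C here."""
    return np.clip(x, lo, hi)

def tseng_fbf(F, x0, lo, hi, lam=0.1, iters=500):
    """Plain forward-backward-forward iteration for VI(F, C):
       y = P_C(x - lam*F(x));  x+ = y - lam*(F(y) - F(x)).
    Requires lam < 1/L for L-Lipschitz monotone F; iterates need not
    stay in C but converge to a solution."""
    x = x0.copy()
    for _ in range(iters):
        Fx = F(x)
        y = project_box(x - lam * Fx, lo, hi)
        x = y - lam * (F(y) - Fx)
    return x

# Monotone affine operator F(x) = M x + q (symmetric part of M is PSD).
M = np.array([[2.0, 1.0], [-1.0, 2.0]])
q = np.array([-1.0, 1.0])
F = lambda x: M @ x + q
sol = tseng_fbf(F, np.zeros(2), lo=np.zeros(2), hi=np.ones(2))
print("approximate VI solution:", sol)
```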
Funding: Project supported by the National Natural Science Foundation of China (Grant Nos. 32371280 and T2350011).
Abstract: Considering the pivotal role of single-wavelength anomalous diffraction (SAD) in macromolecular crystallography, our objective was to introduce DSAS, a novel program designed for efficient anomalous scattering substructure determination. DSAS stands out with its core components: a modified phase-retrieval algorithm and automated parameter tuning. The software features an intuitive graphical user interface (GUI), facilitating seamless input of essential data and real-time monitoring. Extensive testing of DSAS has involved diverse datasets, encompassing proteins, nucleic acids, and various anomalous scatterers such as sulfur (S), selenium (Se), metals, and halogens. The results confirm DSAS's exceptional performance in accurately determining heavy-atom positions, making it a highly effective tool in the field.
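DSAS's modified phase-retrieval algorithm is not specified in the abstract; as a generic point of reference, the classical error-reduction loop — alternately imposing the measured Fourier magnitudes and a real-space positivity constraint — looks like the 1D sketch below. It is only a stand-in and may stagnate without the support constraints and modifications that substructure-determination programs add.

```python
import numpy as np

rng = np.random.default_rng(2)
true = np.zeros(64)
true[[5, 20, 33]] = [1.0, 0.7, 0.4]          # sparse "atoms" (toy substructure)
mag = np.abs(np.fft.fft(true))               # measured diffraction amplitudes

x = rng.random(64)                           # random starting density
for _ in range(2000):
    X = np.fft.fft(x)
    X = mag * np.exp(1j * np.angle(X))       # impose measured magnitudes
    x = np.fft.ifft(X).real
    x[x < 0] = 0                             # impose positivity in real space

print("magnitude residual:", np.linalg.norm(np.abs(np.fft.fft(x)) - mag))
```

Note that any recovered density is only defined up to translation and inversion, which is why real programs compare peak positions rather than raw arrays.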
Funding: Supported by the National Natural Science Foundation of China (Grant No. 52078493); the Natural Science Foundation of Hunan Province (Grant No. 2022JJ30700); the Natural Science Foundation for Excellent Young Scholars of Hunan (Grant No. 2021JJ20057); the Science and Technology Plan Project of Changsha (Grant No. kq2305006); and the Innovation Driven Program of Central South University (Grant No. 2023CXQD033).
Abstract: Estimation of the velocity profile within mud depth is a long-standing and essential problem in debris flow dynamics. Until now, various velocity profiles have been proposed based on fitting analysis of experimental measurements, but these are often limited by the observation conditions, such as the number of configured sensors. The resulting linear velocity profiles therefore usually exhibit limitations in reproducing the temporally varying and nonlinear behavior of the debris flow process. In this study, we present a novel approach to explore the debris flow velocity profile in detail, building on our previous 3D-HBPSPH numerical model, i.e., the three-dimensional Smoothed Particle Hydrodynamics model incorporating the Herschel-Bulkley-Papanastasiou rheology. Specifically, we propose a stratification aggregation algorithm for interpreting the details of SPH particles, which enables the recording of temporal velocities of debris flow at different mud depths. To analyze the velocity profile, we introduce a logarithmic-based nonlinear model with two key parameters: a, controlling the shape of the velocity profile, and b, governing its temporal evolution. We verify the proposed velocity profile and explore its sensitivity using 34 sets of velocity data from three individual flume experiments in the previous literature. Our results demonstrate that the proposed temporally varying nonlinear velocity profile outperforms the previous linear profiles.
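The abstract names a logarithmic two-parameter model with a shape parameter a and a temporal parameter b but does not give its closed form; the sketch below fits an assumed illustrative form u(z) = a·ln(1 + b·z) to synthetic depth-velocity data, which is how such a profile would be calibrated against flume measurements.

```python
import numpy as np
from scipy.optimize import curve_fit

def log_profile(z, a, b):
    """Illustrative logarithmic velocity profile u(z) = a*ln(1 + b*z).
    This is NOT the paper's exact formulation; a shapes the profile and
    b plays the role of the second tunable parameter."""
    return a * np.log1p(b * z)

# Synthetic "measurements": velocities at a few normalized mud depths, plus noise.
z = np.linspace(0.05, 1.0, 8)
rng = np.random.default_rng(3)
u_meas = log_profile(z, 2.0, 5.0) + rng.normal(0.0, 0.05, z.size)

(a_fit, b_fit), _ = curve_fit(log_profile, z, u_meas, p0=(1.0, 1.0))
print(f"fitted a = {a_fit:.2f}, b = {b_fit:.2f}")
```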
Funding: National Natural Science Foundation of China (62202118); Scientific and Technological Research Projects of the Guizhou Education Department ([2023]003); Guizhou Provincial Department of Science and Technology Hundred Levels of Innovative Talents Project (GCC[2023]018); Top Technology Talent Project of the Guizhou Education Department ([2022]073).
Abstract: The development of technologies such as big data and blockchain has brought convenience to life, but at the same time privacy and security issues are becoming more and more prominent. The K-anonymity algorithm is an effective privacy-preserving algorithm with low computational complexity that can safeguard users' privacy by anonymizing big data. However, the algorithm currently focuses only on improving user privacy while ignoring data availability. In addition, ignoring the impact of quasi-identifier attributes on sensitive attributes reduces the usability of the processed data for statistical analysis. On this basis, we propose a new K-anonymity algorithm to solve the privacy security problem in the context of big data while improving data usability. Specifically, we construct a new information loss function based on information quantity theory. Considering that different quasi-identifier attributes have different impacts on sensitive attributes, we set a weight for each quasi-identifier attribute when designing the information loss function. In addition, to reduce information loss, we improve K-anonymity in two ways. First, we make the information loss smaller than in the original table while guaranteeing privacy, based on common artificial intelligence algorithms, i.e., the greedy algorithm and the 2-means clustering algorithm. Second, we improve the 2-means clustering algorithm by designing a mean-center method to select the initial centroids. We then design the K-anonymity algorithm of this scheme based on the constructed information loss function, the improved 2-means clustering algorithm, and the greedy algorithm, which reduces the information loss. Finally, we experimentally demonstrate the effectiveness of the algorithm in improving the 2-means clustering results and reducing information loss.
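One way to picture a weighted information-loss function over quasi-identifiers is the normalized-spread loss below; the paper's information-quantity formulation differs, and the weights, toy table, and hand-picked clusters are assumptions used only to show the mechanics.

```python
import numpy as np

def weighted_info_loss(cluster: np.ndarray, ranges: np.ndarray,
                       weights: np.ndarray) -> float:
    """Weighted normalized loss for one equivalence class of numeric
    quasi-identifiers: sum_j w_j * (max_j - min_j) / range_j.
    Generalizing records to the class-wide interval destroys more
    information the wider the interval is, scaled by each attribute's
    assumed impact on the sensitive attribute."""
    spread = cluster.max(axis=0) - cluster.min(axis=0)
    return float(np.sum(weights * spread / ranges))

# Toy table: rows = records, columns = quasi-identifiers (age, zip-ish, income).
table = np.array([[25, 100, 3.0], [29, 104, 3.4], [61, 300, 8.0], [58, 310, 7.5]])
ranges = table.max(axis=0) - table.min(axis=0)
weights = np.array([0.5, 0.2, 0.3])   # assumed impact on the sensitive attribute

# 2-anonymity via a 2-means-style split (clusters chosen by hand here).
groups = [table[:2], table[2:]]
loss = sum(weighted_info_loss(g, ranges, weights) for g in groups)
print(f"total weighted information loss: {loss:.3f}")
```

A clustering that mixed young and old records into one class would drive the age spread, and hence the weighted loss, much higher — which is exactly what the improved 2-means initialization tries to avoid.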
Funding: Supported by the National Natural Science Foundation of China (Grant No. 51890881).
Abstract: As electro-hydrostatic actuator (EHA) technology advances towards lightweight design and integration, the demand for enhanced internal flow pathways in hydraulic valve blocks intensifies. However, owing to the constraints imposed by traditional manufacturing processes, conventional hydraulic integrated valve blocks fail to satisfy the demands of a more compact channel layout and lower energy dissipation. Notably, the subjectivity in the arrangement of internal passages results in a time-consuming and labor-intensive process. This study employed additive manufacturing technology, the ant colony algorithm, and B-spline curves for the meticulous design of internal passages within an aviation EHA valve block. The layout environment for the valve block passages was established, and path optimization was achieved using the ant colony algorithm, complemented by smoothing with B-spline curves. Three-dimensional modeling was performed using SolidWorks software, revealing a 10.03% reduction in volume for the optimized passages compared with the original passages. Computational fluid dynamics (CFD) simulations were performed using Fluent software, demonstrating that the algorithmically optimized passages effectively prevented the occurrence of vortices at right-angled locations, exhibited superior flow characteristics, and reduced pressure losses by 34.09%-36.36%. The small discrepancy between the experimental and simulation results validated the efficacy of the ant colony algorithm and B-spline curves in optimizing the passage design, offering a viable solution for channel design in additive manufacturing.
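B-spline smoothing of a jagged planner path is standard; a minimal sketch with SciPy's spline routines follows, using illustrative waypoints rather than the paper's valve-block geometry.

```python
import numpy as np
from scipy.interpolate import splprep, splev

# A right-angled path such as an ant-colony planner might return.
waypoints = np.array([[0, 0, 0], [10, 0, 0], [10, 8, 0], [10, 8, 6], [18, 8, 6]],
                     dtype=float)

# Fit a cubic B-spline through the waypoints; the smoothing factor s > 0
# trades fidelity for smoothness, rounding off the sharp right angles that
# would otherwise promote vortices in the flow passage.
tck, _ = splprep(waypoints.T, s=2.0, k=3)
u_fine = np.linspace(0.0, 1.0, 100)
smooth = np.array(splev(u_fine, tck)).T   # (100, 3) smoothed channel centerline

print("start:", smooth[0], "end:", smooth[-1])
```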
Funding: Supported by the Liaoning Province Natural Science Foundation Project (2022-MS-291); the National Programme for Foreign Expert Projects (G2022006008L); the Basic Research Projects of the Liaoning Provincial Department of Education (LJKMZ20220781, LJKMZ20220783, LJKQZ20222457); and the Researcher Support Program (RSPD2023R704), King Saud University, Riyadh, Saudi Arabia.
Abstract: The existing algorithms for solving multi-objective optimization problems fall into three main categories: decomposition-based, dominance-based, and indicator-based. Traditional multi-objective optimization problems mainly focus on objectives, treating the decision variables as a single aggregate variable without considering the critical role of decision variables in objective optimization. Accordingly, a variety of decision variable grouping algorithms have been proposed. However, these algorithms treat the changes of most decision variables during evolution relatively broadly and are time-consuming in finding the Pareto frontier. To solve these problems, a multi-objective optimization algorithm for grouping decision variables based on the extreme-point Pareto frontier (MOEA-DV/EPF) is proposed. This algorithm adopts a preprocessing rule to solve the Pareto-optimal solution set of extreme points generated by simultaneous evolution in various target directions, obtains the basic Pareto front surface to determine the convergence effect, and analyzes the convergence and distribution effects of decision variables. In the later stages of optimization, different mutation strategies are adopted according to the nature of the decision variables to speed up the rate of evolution and obtain excellent individuals, thus enhancing the performance of the algorithm. Evaluation on the test functions shows that this algorithm can solve multi-objective optimization problems more efficiently.
Funding: Supported by the National Natural Science Foundation of China (41825011, 42030608, 42105128, and 42075079) and the Opening Foundation of the Key Laboratory of Atmospheric Sounding, CMA, and the CMA Research Center on Meteorological Observation Engineering Technology (U2021Z03).
Abstract: The Advanced Geosynchronous Radiation Imager (AGRI) is a mission-critical instrument for the Fengyun series of satellites. AGRI acquires full-disk images every 15 min and views East Asia every 5 min through 14 spectral bands, enabling the detection of highly variable aerosol optical depth (AOD). Quantitative retrieval of AOD has hitherto been challenging, especially over land. In this study, an AOD retrieval algorithm is proposed that combines deep learning and transfer learning. The algorithm uses core concepts from both the Dark Target (DT) and Deep Blue (DB) algorithms to select features for the machine learning (ML) algorithm, allowing for AOD retrieval at 550 nm over both dark and bright surfaces. The algorithm consists of two steps: ① a baseline deep neural network (DNN) with skip connections is developed using 10 min Advanced Himawari Imager (AHI) AODs as the target variable, and ② sunphotometer AODs from 89 ground-based stations are used to fine-tune the DNN parameters. Out-of-station validation shows that the retrieved AOD attains high accuracy, characterized by a coefficient of determination (R²) of 0.70, a mean bias error (MBE) of 0.03, and a percentage of data within the expected error (EE) of 70.7%. A sensitivity study reveals that the top-of-atmosphere reflectance at 650 and 470 nm, as well as the surface reflectance at 650 nm, are the two largest sources of uncertainty impacting the retrieval. In a case study of monitoring an extreme aerosol event, the AGRI AOD is found to capture the detailed temporal evolution of the event. This work demonstrates the superiority of the transfer-learning technique in satellite AOD retrievals and the applicability of the retrieved AGRI AOD in monitoring extreme pollution events.
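A hedged sketch of the two-step recipe — a skip-connection DNN pre-trained on one target, then fine-tuned on sparse ground truth with early layers frozen — is given below in PyTorch; the layer sizes, the 16-feature input, and the freezing choice are assumptions, not the paper's configuration.

```python
import torch
import torch.nn as nn

class SkipBlock(nn.Module):
    """Fully connected block with an identity skip connection."""
    def __init__(self, dim: int):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(dim, dim), nn.ReLU(),
                                 nn.Linear(dim, dim))
    def forward(self, x):
        return torch.relu(x + self.net(x))

class AODNet(nn.Module):
    """Baseline DNN: features (reflectances, geometry, ...) -> AOD at 550 nm."""
    def __init__(self, n_features: int = 16, width: int = 64, depth: int = 3):
        super().__init__()
        self.inp = nn.Linear(n_features, width)
        self.blocks = nn.Sequential(*[SkipBlock(width) for _ in range(depth)])
        self.out = nn.Linear(width, 1)
    def forward(self, x):
        return self.out(self.blocks(torch.relu(self.inp(x))))

model = AODNet()
# Step 1 would train on abundant AHI AODs; step 2 fine-tunes on sparse
# sunphotometer AODs with the early layers frozen -- one common recipe.
for p in model.inp.parameters():
    p.requires_grad_(False)
for p in model.blocks[:-1].parameters():
    p.requires_grad_(False)
opt = torch.optim.Adam((p for p in model.parameters() if p.requires_grad), lr=1e-4)

x, y = torch.randn(8, 16), torch.randn(8, 1)   # placeholder fine-tuning batch
loss = nn.functional.mse_loss(model(x), y)
loss.backward()
opt.step()
print("fine-tune loss:", float(loss))
```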
Funding: Supported by the National Natural Science Foundation of China (71904006); the Henan Province Key R&D Special Project (231111322200); the Science and Technology Research Plan of Henan Province (232102320043, 232102320232, 232102320046); and the Natural Science Foundation of Henan (232300420317, 232300420314).
Abstract: Reducing casualties and property losses through effective evacuation route planning has been a key focus for researchers in recent years. As part of this effort, an enhanced sparrow search algorithm (MSSA) is proposed. Firstly, the Golden Sine algorithm and a nonlinear weight factor optimization strategy are added in the discoverer position update stage of the SSA. Secondly, a Cauchy-Gaussian perturbation is applied to the optimal position of the SSA to improve its ability to jump out of local optima. Finally, a local search mechanism based on the hill-climbing method is incorporated into the local search stage of the SSA, improving its local search ability. To evaluate the effectiveness of the proposed algorithm, the Whale Algorithm, Gray Wolf Algorithm, Improved Gray Wolf Algorithm, Sparrow Search Algorithm, and MSSA were employed to solve various test functions. The accuracy and convergence speed of each algorithm were then compared and analyzed. The results indicate that the MSSA has superior solving ability and stability compared with the other algorithms. To further validate the enhanced algorithm's capabilities for path planning, evacuation experiments were conducted using different maps featuring various obstacle types. Additionally, a multi-exit evacuation scenario was constructed according to the actual building environment of a teaching building. Both the sparrow search algorithm and the MSSA were employed in the simulation experiment for multi-exit evacuation path planning. The findings demonstrate that the MSSA outperforms the comparison algorithm, showcasing greater advantages and higher application potential.
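The Cauchy-Gaussian perturbation of the best position can be sketched as below; the annealed mixing schedule (heavy-tailed Cauchy jumps early for escape, Gaussian refinement late) and the greedy acceptance are common choices in such hybrids, not necessarily MSSA's exact formulas.

```python
import numpy as np

def cauchy_gauss_perturb(best: np.ndarray, t: int, t_max: int,
                         rng: np.random.Generator) -> np.ndarray:
    """Perturb the current best position with a mix of Cauchy and
    Gaussian noise whose balance shifts over the run (assumed schedule)."""
    w = 1.0 - t / t_max                       # decays from 1 to 0
    cauchy = rng.standard_cauchy(best.shape)  # heavy-tailed: big escape jumps
    gauss = rng.standard_normal(best.shape)   # light-tailed: local refinement
    return best * (1.0 + w * cauchy + (1.0 - w) * gauss)

rng = np.random.default_rng(4)
best = np.array([0.5, -1.2, 3.0])
sphere = lambda x: float(np.sum(x * x))       # toy objective
for t in range(100):                          # greedy acceptance of perturbations
    trial = cauchy_gauss_perturb(best, t, 100, rng)
    if sphere(trial) < sphere(best):
        best = trial
print("best after perturbation phase:", best, sphere(best))
```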
Funding: Supported in part by the Central Government Guides Local Science and Technology Development Funds (Grant No. YDZJSX2021A038); in part by the National Natural Science Foundation of China (Grant No. 61806138); and in part by the China University Industry-University-Research Collaborative Innovation Fund (Future Network Innovation Research and Application Project) (Grant 2021FNA04014).
Abstract: The large-scale multi-objective optimization algorithm (LSMOA), based on the grouping of decision variables, is an advanced method for handling high-dimensional decision variables. However, in practical problems the interaction among decision variables is intricate, leading to large group sizes and suboptimal optimization effects; hence, a large-scale multi-objective optimization algorithm based on weighted overlapping grouping of decision variables (MOEAWOD) is proposed in this paper. Initially, the decision variables are perturbed and categorized into convergence and diversity variables; subsequently, the convergence variables are subdivided into groups based on the interactions among different decision variables. If the size of a group surpasses the set threshold, that group undergoes weighted and overlapping grouping. Specifically, the interaction strength is evaluated based on the interaction frequency and the number of objectives among the various decision variables. The decision variable with the highest interaction in the group is identified and set aside, and the remaining variables are reclassified into subgroups; finally, the decision variable with the strongest interaction is added back to each subgroup. MOEAWOD minimizes the interactivity between different groups and maximizes the interactivity of decision variables within groups, which contributes to well-directed convergence and diversity exploration within the different groups. MOEAWOD was tested on 18 benchmark large-scale optimization problems, and the experimental results demonstrate the effectiveness of our methods. Compared with the other algorithms, our method retains an advantage.
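Grouping by interaction usually starts from a pairwise finite-difference test of separability; the basic check is sketched below, with MOEAWOD's weighting and overlapping steps left out. Two variables interact when the effect of perturbing one depends on the value of the other.

```python
import numpy as np

def interacts(f, x, i, j, eps=1e-6, d=1.0):
    """Finite-difference interaction test used by differential-grouping-style
    decomposers: i and j interact if perturbing x_i has a different effect
    depending on whether x_j was also perturbed."""
    base = f(x)
    xi = x.copy(); xi[i] += d
    xj = x.copy(); xj[j] += d
    xij = x.copy(); xij[i] += d; xij[j] += d
    delta1 = f(xi) - base          # effect of moving x_i alone
    delta2 = f(xij) - f(xj)        # effect of moving x_i after moving x_j
    return abs(delta1 - delta2) > eps

f = lambda x: x[0] ** 2 + x[1] * x[2] + x[3] ** 2   # x[1] and x[2] are coupled
x0 = np.ones(4)
print(interacts(f, x0, 1, 2))   # True  (coupled through the product term)
print(interacts(f, x0, 0, 3))   # False (separable pair)
```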
Funding: Supported by the Opening Fund of the Shandong Provincial Key Laboratory of Network-based Intelligent Computing; the National Natural Science Foundation of China (52205529, 61803192); the Natural Science Foundation of Shandong Province (ZR2021QE195); the Youth Innovation Team Program of Shandong Higher Education Institutions (2023KJ206); and the Guangyue Youth Scholar Innovation Talent Program of Liaocheng University (LCUGYTD2022-03).
Abstract: Effective path planning is crucial for mobile robots to quickly reach the rescue destination and complete rescue tasks in a post-disaster scenario. In this study, we investigated the post-disaster rescue path planning problem and modeled it as a variant of the traveling salesman problem (TSP) with life-strength constraints. To address this problem, we proposed an improved iterated greedy (IIG) algorithm. First, a push-forward insertion heuristic (PFIH) strategy was employed to generate a high-quality initial solution. Second, a greedy-based insertion strategy was designed and used in the destruction-construction stage to increase the algorithm's exploration ability. Furthermore, three problem-specific swap operators were developed to improve the algorithm's exploitation ability. Additionally, an improved simulated annealing (SA) strategy was used as the acceptance criterion to effectively prevent the algorithm from falling into local optima. To verify the effectiveness of the proposed algorithm, the Solomon dataset was extended to generate 27 instances for simulation. Finally, the proposed IIG was compared with five state-of-the-art algorithms. The parameter analysis was conducted using the design-of-experiments (DOE) Taguchi method, and the effectiveness of each component was verified one by one. Simulation results indicate that IIG outperforms the compared algorithms in terms of the number of rescued survivors and convergence speed, proving the effectiveness of the proposed algorithm.
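The SA-style acceptance criterion inside an iterated-greedy loop is compact enough to show directly; the neighbor-cost model and geometric cooling rate below are placeholders, not the paper's improved SA schedule.

```python
import math
import random

def sa_accept(cand_cost: float, cur_cost: float, temp: float,
              rng: random.Random) -> bool:
    """Simulated-annealing acceptance: always accept improvements; accept
    worse candidates with probability exp(-delta/T) to escape local optima."""
    delta = cand_cost - cur_cost
    return delta <= 0 or rng.random() < math.exp(-delta / temp)

rng = random.Random(5)
cur, temp = 100.0, 10.0
for step in range(200):                      # destruction-construction stand-in
    cand = cur + rng.uniform(-3.0, 2.0)      # placeholder neighbor cost
    if sa_accept(cand, cur, temp, rng):
        cur = cand
    temp *= 0.98                             # assumed geometric cooling
print(f"final cost: {cur:.2f}")
```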
Funding: Supported by the Key Research and Development Project of Hubei Province (Nos. 2020BAB114 and 2023BAB094).
Abstract: In the manufacturing industry, reasonable scheduling can greatly improve production efficiency, while excessive resource consumption highlights the growing significance of energy conservation in production. This paper studies the energy-efficient distributed heterogeneous permutation flowshop problem with variable processing speed (DHPFSP-VPS), considering both the minimum makespan and the total energy consumption (TEC) as objectives. A discrete multi-objective squirrel search algorithm (DMSSA) is proposed to solve the DHPFSP-VPS. DMSSA makes four improvements to the squirrel search algorithm. Firstly, in terms of the population initialization strategy, four hybrid initialization methods targeting different objectives are proposed to enhance the quality of initial solutions. Secondly, enhancements are made to the population hierarchy system and the position updating methods of the squirrel search algorithm, making it more suitable for discrete scheduling problems. Additionally, regarding the search strategy, six local searches are designed based on problem characteristics to enhance search capability. Moreover, a dynamic predator strategy based on Q-learning is devised to effectively balance DMSSA's capabilities for global exploration and local exploitation. Finally, two speed-control energy-efficient strategies are designed to reduce TEC. Extensive comparative experiments are conducted to validate the effectiveness of the proposed strategies. The results of comparing DMSSA with other algorithms demonstrate its superior performance and its potential for efficiently solving the DHPFSP-VPS problem.
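A Q-learning loop of the kind the dynamic predator strategy suggests — learning which search operator pays off in which phase — can be sketched as follows; the states, actions, reward signal, and hyperparameters here are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(6)
n_states, n_actions = 4, 6       # e.g., search phases x local-search operators
Q = np.zeros((n_states, n_actions))
alpha, gamma_rl, eps = 0.1, 0.9, 0.2   # assumed Q-learning hyperparameters

def choose(state: int) -> int:
    """Epsilon-greedy pick of an operator for the current state."""
    if rng.random() < eps:
        return int(rng.integers(n_actions))
    return int(np.argmax(Q[state]))

def update(s: int, a: int, reward: float, s_next: int) -> None:
    """Standard Q-learning update; the reward would be the improvement
    the chosen operator achieved on the schedule's objectives."""
    Q[s, a] += alpha * (reward + gamma_rl * Q[s_next].max() - Q[s, a])

# Toy loop: pretend operator 2 works best in every state.
for _ in range(500):
    s = int(rng.integers(n_states))
    a = choose(s)
    r = 1.0 if a == 2 else 0.0
    update(s, a, r, int(rng.integers(n_states)))
print("learned best operator per state:", Q.argmax(axis=1))
```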
Funding: The authors thank the National Science and Technology Council, Taiwan, for financial support (Grant Number NSTC 111-2221-E-019-048).
Abstract: This study sets up two new merit functions, which are minimized for the detection of real and complex eigenvalues in nonlinear eigenvalue problems. For each eigen-parameter, the vector variable is solved from a nonhomogeneous linear system obtained by reducing the number of eigen-equations by one: one nonzero component of the eigenvector is normalized to unity, and the column containing that component is moved to the right-hand side as a nonzero input vector. 1D and 2D golden section search algorithms are employed to minimize the merit functions and locate real and complex eigenvalues; simultaneously, the real and complex eigenvectors can be computed very accurately. A simpler approach to nonlinear eigenvalue problems is also proposed, which implements a normalization condition for the uniqueness of the eigenvector directly into the eigen-equation. The real eigenvalues can then be computed by the fictitious time integration method (FTIM), which saves computational cost compared with the one-dimensional golden section search algorithm (1D GSSA). The simpler method is also combined with the Newton iteration method, which converges very quickly. All the proposed methods are easily programmed to compute eigenvalues and eigenvectors with high accuracy and efficiency.
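The 1D golden section search over a scalar merit function is easy to reproduce; the merit used below — the smallest singular value of A − λB, which vanishes exactly at a generalized eigenvalue — is a stand-in for the paper's reduced-nonhomogeneous-system construction.

```python
import numpy as np

def merit(lam: float, A: np.ndarray, B: np.ndarray) -> float:
    """Illustrative merit function: smallest singular value of A - lam*B.
    It is zero precisely when lam is a generalized eigenvalue of (A, B)."""
    return float(np.linalg.svd(A - lam * B, compute_uv=False)[-1])

def golden_section(f, lo: float, hi: float, tol: float = 1e-10) -> float:
    """Classic 1D golden section search for a minimum on [lo, hi]."""
    invphi = (np.sqrt(5.0) - 1.0) / 2.0
    a, b = lo, hi
    c, d = b - invphi * (b - a), a + invphi * (b - a)
    while b - a > tol:
        if f(c) < f(d):
            b, d = d, c
            c = b - invphi * (b - a)
        else:
            a, c = c, d
            d = a + invphi * (b - a)
    return (a + b) / 2.0

A = np.diag([1.0, 4.0, 9.0])      # eigenvalues of (A, I): 1, 4, 9
B = np.eye(3)
lam = golden_section(lambda t: merit(t, A, B), 3.0, 5.0)
print(f"eigenvalue found: {lam:.8f}")   # ~ 4
```

The bracketing interval must isolate a single minimum, which is why such methods scan the eigen-parameter axis (or, for complex eigenvalues, a 2D region) before refining.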
Funding: Supported by the Natural Science Foundation of Chongqing (General Program, No. CSTB2022NSCQ-MSX0884) and the Discipline Teaching Special Project of Yangtze Normal University (csxkjx14).
Abstract: In this paper, we prove that Euclid's algorithm, Bezout's equation, and the division algorithm are equivalent to one another. Our result shows that Euclid preliminarily established the theory of divisibility and the greatest common divisor. We further provide several suggestions for teaching.
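The equivalence the paper discusses is visible in the extended Euclidean algorithm, where repeated division steps simultaneously produce gcd(a, b) and the Bezout coefficients:

```python
def extended_gcd(a: int, b: int) -> tuple[int, int, int]:
    """Euclid's algorithm extended to return (g, x, y) with
    a*x + b*y = g = gcd(a, b) -- Bezout's equation, obtained by
    iterating the division algorithm and carrying the coefficients."""
    old_r, r = a, b
    old_x, x = 1, 0
    old_y, y = 0, 1
    while r != 0:
        q = old_r // r                 # one step of the division algorithm
        old_r, r = r, old_r - q * r
        old_x, x = x, old_x - q * x
        old_y, y = y, old_y - q * y
    return old_r, old_x, old_y

g, x, y = extended_gcd(240, 46)
print(g, x, y, 240 * x + 46 * y)       # 2 -9 47 2
```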
Funding: This research was funded by the Faculty of Engineering, King Mongkut's University of Technology North Bangkok (Contract No. ENG-NEW-66-39).
Abstract: This research introduces a novel approach to enhancing bucket elevator design and operation through the integration of discrete element method (DEM) simulation, design of experiments (DOE), and metaheuristic optimization algorithms. Specifically, the study employs the firefly algorithm (FA), a metaheuristic optimization technique, to optimize bucket elevator parameters for maximizing the transport mass and the mass flow rate of discharge of granular materials under specified working conditions. The experimental methodology involves several key steps: screening experiments to identify significant factors affecting bucket elevator operation, central composite design (CCD) experiments to further explore these factors, and response surface methodology (RSM) to create predictive models for transport mass and mass flow rate of discharge. The FA is then applied to optimize these models, and the optimized parameters are validated through simulation and empirical experiments, with the results compared against DEM simulation. The outcomes demonstrate the effectiveness of the FA in identifying optimal bucket parameters, with less than 10% and 15% deviation between predicted and actual values for transport mass and mass flow rate of discharge, respectively. Overall, this research provides insights into the critical factors influencing bucket elevator operation and offers a systematic methodology for optimizing bucket parameters, contributing to more efficient material handling in various industrial applications.
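A bare-bones firefly algorithm applied to a quadratic stand-in for the fitted RSM model shows the mechanics; the hyperparameters are textbook defaults, not the study's tuned values, and the objective is a placeholder for the regression model of transport mass or discharge rate.

```python
import numpy as np

def firefly_minimize(f, lo, hi, n=20, iters=200, beta0=1.0, gamma=1.0,
                     alpha=0.2, seed=7):
    """Bare-bones firefly algorithm: dimmer fireflies move toward brighter
    ones with attractiveness beta0*exp(-gamma*r^2) plus a small random walk."""
    rng = np.random.default_rng(seed)
    lo, hi = np.asarray(lo, float), np.asarray(hi, float)
    X = rng.uniform(lo, hi, size=(n, lo.size))
    light = np.array([f(x) for x in X])          # lower cost = brighter
    for _ in range(iters):
        for i in range(n):
            for j in range(n):
                if light[j] < light[i]:          # j is brighter: move i toward j
                    r2 = np.sum((X[i] - X[j]) ** 2)
                    beta = beta0 * np.exp(-gamma * r2)
                    X[i] += beta * (X[j] - X[i]) + alpha * rng.uniform(-0.5, 0.5, lo.size)
                    X[i] = np.clip(X[i], lo, hi)
                    light[i] = f(X[i])
    best = int(np.argmin(light))
    return X[best], light[best]

# Usage on a 2-D quadratic surrogate (stand-in for the RSM response surface).
f = lambda x: (x[0] - 1.0) ** 2 + (x[1] + 2.0) ** 2
x_best, f_best = firefly_minimize(f, lo=[-5, -5], hi=[5, 5])
print("optimum ~", x_best, "cost ~", f_best)
```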