The purpose of this review is to explore the intersection of computational engineering and biomedical science, highlighting the transformative potential this convergence holds for innovation in healthcare and medical research. The review covers key topics such as computational modelling, bioinformatics, machine learning in medical diagnostics, and the integration of wearable technology for real-time health monitoring. Major findings indicate that computational models have significantly enhanced the understanding of complex biological systems, while machine learning algorithms have improved the accuracy of disease prediction and diagnosis. The synergy between bioinformatics and computational techniques has led to breakthroughs in personalized medicine, enabling more precise treatment strategies. Additionally, the integration of wearable devices with advanced computational methods has opened new avenues for continuous health monitoring and early disease detection. The review emphasizes the need for interdisciplinary collaboration to further advance this field. Future research should focus on developing more robust and scalable computational models, enhancing data integration techniques, and addressing ethical considerations related to data privacy and security. By fostering innovation at the intersection of these disciplines, the potential to revolutionize healthcare delivery and outcomes becomes increasingly attainable.
Recurrent neural networks (RNNs) have proven to be indispensable for processing sequential and temporal data, with extensive applications in language modeling, text generation, machine translation, and time-series forecasting. Despite their versatility, RNNs are frequently beset by significant training expenses and slow convergence, which impinge upon their deployment in edge AI applications. Reservoir computing (RC), a specialized RNN variant, is attracting increased attention as a cost-effective alternative for processing temporal and sequential data at the edge. RC's distinctive advantage stems from its compatibility with emerging memristive hardware, which leverages the energy efficiency and reduced footprint of analog in-memory and in-sensor computing, offering a streamlined and energy-efficient solution. This review explains RC's underlying principles and fabrication processes, and surveys recent progress in nano-memristive-device-based RC systems from the viewpoints of in-memory and in-sensor RC function. It covers a spectrum of memristive devices, from established oxide-based devices to cutting-edge material science developments, providing readers with a lucid understanding of RC's hardware implementation and fostering innovative designs for in-sensor RC systems. Lastly, we identify prevailing challenges and suggest viable solutions, paving the way for future advancements in in-sensor RC technology.
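The RC principle summarized above can be illustrated in software with a minimal echo state network: a fixed random recurrent reservoir whose only trained part is a linear readout, which is exactly what makes RC cheap enough for memristive hardware. This is a hedged toy sketch; the reservoir size, spectral radius, toy signal, and ridge parameter are illustrative assumptions, not values from the review.

```python
import numpy as np

rng = np.random.default_rng(0)
N_IN, N_RES, T = 1, 100, 500

# Fixed random input and reservoir weights; only W_out is trained.
W_in = rng.uniform(-0.5, 0.5, (N_RES, N_IN))
W = rng.uniform(-0.5, 0.5, (N_RES, N_RES))
W *= 0.9 / np.abs(np.linalg.eigvals(W)).max()   # spectral radius below 1

u = np.sin(np.linspace(0, 8 * np.pi, T + 1))    # toy input signal
target = u[1:]                                   # task: predict next sample

x = np.zeros(N_RES)
states = np.empty((T, N_RES))
for t in range(T):
    x = np.tanh(W_in @ u[t:t + 1] + W @ x)       # reservoir state update
    states[t] = x

# Ridge-regression readout: the only trained parameters in RC.
lam = 1e-6
W_out = np.linalg.solve(states.T @ states + lam * np.eye(N_RES),
                        states.T @ target)
pred = states @ W_out
mse = float(np.mean((pred - target) ** 2))
```

Because the reservoir itself is never trained, a physical device (e.g. a memristor array) can play its role, leaving only the cheap linear readout to be fitted.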
We present a comprehensive mathematical framework establishing the foundations of holographic quantum computing, a novel paradigm that leverages holographic phenomena to achieve superior error correction and algorithmic efficiency. We rigorously demonstrate that quantum information can be encoded and processed using holographic principles, establishing fundamental theorems characterizing the error-correcting properties of holographic codes. We develop a complete set of universal quantum gates with explicit constructions and prove exponential speedups for specific classes of computational problems. Our framework demonstrates that holographic quantum codes achieve a code rate scaling as O(1/log n), superior to traditional quantum LDPC codes, while providing inherent protection against errors via geometric properties of the code structures. We prove a threshold theorem establishing that arbitrary quantum computations can be performed reliably when physical error rates fall below a constant threshold. Notably, our analysis suggests that certain algorithms, including those involving high-dimensional state spaces and long-range interactions, achieve exponential speedups over both classical and conventional quantum approaches. This work establishes the theoretical foundations for a new approach to quantum computation that provides natural fault tolerance and scalability, directly addressing longstanding challenges of the field.
Low Earth orbit (LEO) satellites with wide coverage can carry mobile edge computing (MEC) servers with powerful computing capabilities to form a LEO satellite edge computing system, providing computing services for ground users worldwide. In this paper, the computation offloading and resource allocation problems are formulated as a mixed integer nonlinear program (MINLP). A computation offloading algorithm based on deep deterministic policy gradient (DDPG) is proposed to obtain the user offloading decisions and user uplink transmission power, and a convex optimization algorithm based on the Lagrange multiplier method is used to obtain the optimal MEC server resource allocation scheme. In addition, an expression for the suboptimal user local CPU cycles is derived by a relaxation method. Simulation results show that the proposed algorithm achieves excellent convergence and significantly reduces the system utility value compared with other algorithms, at a considerable time cost.
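The Lagrange-multiplier step can be sketched for a simplified version of the resource allocation subproblem: splitting a server's total CPU frequency F among offloaded tasks to minimize total computing delay. The closed form below follows from the KKT stationarity condition of this simplified objective; the task workloads and server capacity are illustrative assumptions, not values from the paper.

```python
import numpy as np

def allocate(cycles, F):
    """Minimize sum_i c_i / f_i subject to sum_i f_i <= F.

    Stationarity of the Lagrangian L = sum_i c_i/f_i + mu*(sum_i f_i - F)
    gives c_i / f_i**2 = mu, i.e. f_i proportional to sqrt(c_i).
    """
    w = np.sqrt(np.asarray(cycles, dtype=float))
    return F * w / w.sum()

cycles = [2e9, 8e9, 18e9]   # assumed task workloads (CPU cycles)
F = 10e9                     # assumed server capacity (Hz)
f = allocate(cycles, F)
delay = sum(c / fi for c, fi in zip(cycles, f))          # total delay (s)
equal = sum(c / (F / len(cycles)) for c in cycles)       # naive equal split
```

The square-root rule beats an equal split whenever workloads differ, which is why this subproblem is worth solving exactly inside the outer DDPG loop.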
Low Earth orbit (LEO) satellite networks have the advantages of low transmission delay and low deployment cost, playing an important role in providing reliable services to ground users. This paper studies an efficient inter-satellite cooperative computation offloading (ICCO) algorithm for LEO satellite networks. Specifically, an ICCO system model is constructed that considers using neighboring satellites in the LEO satellite network to collaboratively process tasks generated by ground user terminals, effectively improving resource utilization efficiency. Additionally, the optimization objective of minimizing the system's task computation offloading delay and energy consumption is established and decoupled into two sub-problems. For computational resource allocation, the convexity of the problem is proved through theoretical derivation, and the Lagrange multiplier method is used to obtain the optimal allocation of computational resources. For the task offloading decision, a dynamic sticky binary particle swarm optimization algorithm is designed to obtain the offloading decision iteratively. Simulation results show that the ICCO algorithm can effectively reduce delay and energy consumption.
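A plain binary PSO (without the paper's sticky-probability refinement) already conveys how the offloading decision is searched: each particle is a bit vector saying which tasks offload, velocities bias a per-bit flip probability through a sigmoid, and personal/global bests guide the swarm. The cost model and all parameters below are simplified assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)
N_TASKS, N_PARTICLES, ITERS = 8, 20, 60
local_cost = rng.uniform(2.0, 5.0, N_TASKS)    # assumed local delay+energy
offload_cost = rng.uniform(1.0, 4.0, N_TASKS)  # assumed offload delay+energy

def cost(x):
    """Total cost of a binary decision vector (1 = offload task)."""
    return float(np.where(x == 1, offload_cost, local_cost).sum())

X = rng.integers(0, 2, (N_PARTICLES, N_TASKS))           # random decisions
V = np.zeros((N_PARTICLES, N_TASKS))
pbest, pcost = X.copy(), np.array([cost(x) for x in X])
g = pbest[pcost.argmin()].copy()                          # global best

for _ in range(ITERS):
    r1, r2 = rng.random(X.shape), rng.random(X.shape)
    V = 0.7 * V + 1.5 * r1 * (pbest - X) + 1.5 * r2 * (g - X)
    # Binary PSO: sigmoid of velocity acts as the probability of a 1-bit.
    X = (rng.random(X.shape) < 1 / (1 + np.exp(-V))).astype(int)
    c = np.array([cost(x) for x in X])
    better = c < pcost
    pbest[better], pcost[better] = X[better], c[better]
    g = pbest[pcost.argmin()].copy()

best = cost(g)
optimum = float(np.minimum(local_cost, offload_cost).sum())  # separable LB
```

The "sticky" variant in the paper additionally damps how fast bits change, which helps on non-separable costs; this toy cost is separable, so the swarm quickly approaches the per-task lower bound.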
Recently, the Fog-Radio Access Network (F-RAN) has gained considerable attention because of its flexible architecture, which allows rapid response to user requirements. In this paper, computational offloading in the F-RAN is considered, where multiple User Equipments (UEs) offload their computational tasks to the F-RAN through fog nodes. Each UE can select one of the fog nodes to offload its task, and each fog node may serve multiple UEs. The tasks are computed by the fog nodes or further offloaded to the cloud via a capacity-limited fronthaul link. In order to compute all UEs' tasks quickly, joint optimization of the UE-fog association and the radio and computation resources of the F-RAN is proposed to minimize the maximum latency of all UEs. This min-max problem is formulated as a Mixed Integer Nonlinear Program (MINP). To tackle it, the MINP is first reformulated as a continuous optimization problem, and then the Majorization Minimization (MM) method is used to find a solution. The MM approach that we develop is unconventional in that each MM subproblem is solved inexactly with the same provable convergence guarantee as the exact MM, thereby reducing the complexity of the MM iteration. In addition, a cooperative offloading model is considered, where the fog nodes compress-and-forward their received signals to the cloud. Under this model, a similar min-max latency optimization problem is formulated and tackled by the inexact MM. Simulation results show that the proposed algorithms outperform several offloading strategies, and that cooperative offloading exploits transmission diversity better than noncooperative offloading to achieve better latency performance.
Over-the-air computation (AirComp) enables federated learning (FL) to rapidly aggregate local models at the central server by exploiting the waveform superposition property of the wireless channel. In this paper, a robust transmission scheme for an AirComp-based FL system with imperfect channel state information (CSI) is proposed. To model CSI uncertainty, an expectation-based error model is utilized. The main objective is to maximize the number of selected devices that meet mean-squared error (MSE) requirements for model broadcast and model aggregation. The problem is formulated as a combinatorial optimization problem and is solved in two steps. First, the priority order of devices is determined by a sparsity-inducing procedure. Then, a feasibility detection scheme is used to select the maximum number of devices while guaranteeing that the MSE requirements are met. An alternating optimization (AO) scheme is used to transform the resulting nonconvex problem into two convex subproblems. Numerical results illustrate the effectiveness and robustness of the proposed scheme.
With the rapid growth of social media, the spread of fake news has become a growing problem, misleading the public and causing significant harm. As social media content is often composed of both images and text, multimodal approaches to fake news detection have gained significant attention. To address the shortcomings of previous multimodal fake news detection algorithms, such as insufficient feature extraction and insufficient use of the semantic relations between modalities, this paper proposes the MFFFND-Co (Multimodal Feature Fusion Fake News Detection with Co-Attention Block) model. First, the model deeply explores textual content, image content, and frequency-domain features. Then, it employs a co-attention mechanism for cross-modal fusion. Additionally, a semantic consistency detection module is designed to quantify semantic deviations, thereby enhancing fake news detection performance. Experimentally verified on two commonly used datasets, Twitter and Weibo, the model achieved F1 scores of 90.0% and 94.0%, respectively, significantly outperforming the pre-modification MFFFND (Multimodal Feature Fusion Fake News Detection with Attention Block) model and surpassing other baseline models, thus improving the accuracy of fake-information detection for artificial intelligence and engineering software applications.
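The co-attention fusion step can be sketched with plain matrix operations: an affinity matrix between text-token and image-region features yields attention weights over each modality, and the attended contexts are concatenated into a joint feature. Dimensions and the single shared projection are illustrative assumptions; MFFFND-Co's exact architecture is not reproduced here.

```python
import numpy as np

rng = np.random.default_rng(0)
d, n_txt, n_img = 16, 5, 7
T = rng.normal(size=(n_txt, d))   # text token features
V = rng.normal(size=(n_img, d))   # image region features
W = rng.normal(size=(d, d))       # learnable affinity projection (random here)

def softmax(z, axis):
    z = z - z.max(axis=axis, keepdims=True)   # numerically stable softmax
    e = np.exp(z)
    return e / e.sum(axis=axis, keepdims=True)

C = T @ W @ V.T                   # (n_txt, n_img) cross-modal affinity
A_img = softmax(C, axis=1)        # per text token: weights over image regions
A_txt = softmax(C, axis=0)        # per image region: weights over text tokens
img_ctx = A_img @ V               # image context attended by text
txt_ctx = A_txt.T @ T             # text context attended by image
fused = np.concatenate([img_ctx.mean(0), txt_ctx.mean(0)])  # joint feature
```

In a trained model `W` (and the feature extractors) would be learned, and `fused` would feed a classifier head; the point here is only the symmetric attention in both directions.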
This paper develops a comprehensive computational modeling and simulation framework based on Complex Adaptive Systems (CAS) theory to unveil the underlying mechanisms of self-organization, nonlinear evolution, and emergence in social systems. By integrating mathematical models, agent-based modeling, network dynamics analysis, and hybrid modeling approaches, the study applies CAS theory to case studies in economic markets, political decision-making, and social interactions. The experimental results demonstrate that local interactions among individual agents can give rise to complex global phenomena, such as market fluctuations, opinion polarization, and sudden outbreaks of social movements. This framework not only provides a more robust explanation for the nonlinear dynamics and abrupt transitions that traditional models often fail to capture, but also offers valuable decision-support tools for public policy formulation, social governance, and risk management. Emphasizing the importance of interdisciplinary approaches, this work outlines future research directions in high-performance computing, artificial intelligence, and real-time data integration to further advance the theoretical and practical applications of CAS in the social sciences.
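The claim that local interactions generate global phenomena can be demonstrated with one of the simplest agent-based models in the CAS family: a bounded-confidence opinion model in which agents average opinions only when they are already close. A single tolerance parameter then decides whether the population reaches consensus or splits into polarized camps. All parameter values are illustrative.

```python
import numpy as np

def simulate(eps, n=200, steps=50000, seed=0):
    """Bounded-confidence (Deffuant-style) opinion dynamics on [0, 1]."""
    rng = np.random.default_rng(seed)
    x = rng.uniform(0, 1, n)                 # initial opinions
    for _ in range(steps):
        i, j = rng.integers(0, n, 2)         # random pair interacts
        if abs(x[i] - x[j]) < eps:           # only if opinions are close
            m = (x[i] + x[j]) / 2
            x[i] = x[j] = m                  # both move to the midpoint
    return x

def clusters(x):
    return len(np.unique(np.round(x, 1)))    # crude opinion-cluster count

consensus = simulate(eps=0.5)   # high tolerance: opinions collapse together
polarised = simulate(eps=0.2)   # low tolerance: distinct opinion camps form
```

No agent intends polarization; it emerges from the interaction rule alone, which is exactly the kind of macro phenomenon the framework above is built to analyze.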
Food allergy has become a global concern. Spleen tyrosine kinase (SYK) inhibitors are promising therapeutics against allergic disorders. In this study, a total of 300 natural phenolic compounds were first subjected to virtual screening. Sesamin and its metabolites, sesamin monocatechol (SC-1) and sesamin dicatechol (SC-2), were identified as potential SYK inhibitors, showing high binding affinity and inhibition efficiency towards SYK. Compared with R406 (a traditional SYK inhibitor), sesamin, SC-1, and SC-2 had lower binding energies and inhibition constants (Ki) in molecular docking, exhibited higher bioavailability, safety, metabolism/clearance rate, and distribution uniformity in ADMET predictions, and showed high stability in occupying the ATP-binding pocket of SYK during molecular dynamics simulations. In anti-dinitrophenyl-immunoglobulin E (anti-DNP-IgE)/dinitrophenyl-human serum albumin (DNP-HSA)-stimulated rat basophilic leukemia (RBL-2H3) cells, sesamin in the concentration range of 5-80 μmol/L significantly inhibited degranulation and cytokine release, with 54.00% inhibition of β-hexosaminidase release and a 58.45% decrease in histamine. In BALB/c mice, sesamin ameliorated anti-DNP-IgE/DNP-HSA-induced passive cutaneous anaphylaxis (PCA) and ovalbumin (OVA)-induced active systemic anaphylaxis (ASA) reactions, reduced the levels of allergic mediators (immunoglobulins and pro-inflammatory cytokines), partially corrected the imbalance of T helper (Th) cell differentiation in the spleen, and inhibited the phosphorylation of SYK and its downstream signaling proteins, including p38 mitogen-activated protein kinase (p38 MAPK), extracellular signal-regulated kinase (ERK), and p65 nuclear factor-κB (p65 NF-κB), in the spleen. Thus, sesamin may be a safe and versatile SYK inhibitor that can alleviate IgE-mediated food allergies.
This paper proposes an innovative approach to social science research based on quantum theory, integrating quantum probability, quantum game theory, and quantum statistical methods into a comprehensive interdisciplinary framework for both theoretical and empirical investigation. The study elaborates on how core quantum concepts such as superposition, interference, and measurement collapse can be applied to model social decision-making, cognition, and interactions. Advanced quantum computational methods and algorithms are employed to transition from theoretical model development to simulation and experimental validation. Through case studies in international relations, economic games, and political decision-making, the research demonstrates that quantum models possess significant advantages in explaining irrational and context-dependent behaviors that traditional methods often fail to capture. The paper also explores the potential applications of quantum social science in policy formulation and public decision-making, addresses the ethical, privacy, and social equity challenges posed by quantum artificial intelligence, and outlines future research directions at the convergence of quantum AI, quantum machine learning, and big data analytics. The findings suggest that quantum social science not only offers a novel perspective for understanding complex social phenomena but also lays the foundation for more accurate and efficient systems for social forecasting and decision support.
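The quantum-probability idea can be made concrete with a two-path amplitude calculation: when an intermediate deliberation state is unobserved, path amplitudes interfere, so the total probability deviates from the classical law of total probability P(B) = P(B|A1)P(A1) + P(B|A2)P(A2), which is one way such models capture context-dependent behavior. The amplitude values and relative phase below are illustrative assumptions.

```python
import numpy as np

# Amplitudes into intermediate states A1, A2 (probabilities 0.6 and 0.4),
# then amplitudes from each A_i to outcome B, with an assumed relative phase.
a1, a2 = np.sqrt(0.6), np.sqrt(0.4)
b1 = np.sqrt(0.5)
b2 = np.sqrt(0.5) * np.exp(1j * np.pi / 3)

# Classical case: the intermediate state is measured, so paths add as
# probabilities -- this reproduces the law of total probability.
p_classical = abs(a1 * b1) ** 2 + abs(a2 * b2) ** 2

# Quantum case: the intermediate state is unobserved, so paths add as
# amplitudes before squaring, producing an interference term.
p_quantum = abs(a1 * b1 + a2 * b2) ** 2
interference = p_quantum - p_classical   # 2*Re(a1*b1*conj(a2*b2))
```

Here `p_classical` is exactly 0.5, while the unobserved path gives a strictly larger probability; fitting such phases to behavioral data is how quantum cognition models account for order and conjunction effects.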
The integration of technologies like artificial intelligence, 6G, and vehicular ad-hoc networks holds great potential to meet the communication demands of the Internet of Vehicles (IoV) and drive the advancement of vehicle applications. However, these advancements also generate a surge in data processing requirements, necessitating the offloading of vehicular tasks to edge servers due to the limited computational capacity of vehicles. Despite recent advancements, the robustness and scalability of existing approaches with respect to the number of vehicles, the number of edge servers and their resources, and privacy remain a concern. In this paper, a lightweight offloading strategy is proposed that leverages ubiquitous connectivity through the Space-Air-Ground Integrated Vehicular Network architecture while ensuring privacy preservation. The IoV environment is first modeled as a graph, with vehicles and base stations as nodes and their communication links as edges. Secondly, vehicular applications are offloaded to suitable servers based on latency using an attention-based heterogeneous graph neural network (HetGNN) algorithm. Subsequently, a differential privacy stochastic gradient descent training mechanism is employed for privacy preservation of vehicles and offloading inference. Finally, simulation results demonstrate that the proposed HetGNN method performs well, with an inference time of 0.321 s, which is 42.68%, 63.93%, 30.22%, and 76.04% less than baseline methods such as Deep Deterministic Policy Gradient, Deep Q-Learning, Deep Neural Network, and Genetic Algorithm, respectively.
Semisubmersible naval ships are versatile military craft that combine the advantageous features of high-speed planing craft and submarines. At the surface, these ships are designed to provide sufficient speed and maneuverability; additionally, they can perform shallow dives, offering low visual and acoustic detectability. Therefore, the hydrodynamic design of a semisubmersible naval ship should address both at-surface and submerged conditions. In this study, numerical analyses were performed using a semisubmersible hull form to analyze its hydrodynamic features, including resistance, powering, and maneuvering. The simulations were conducted with Star-CCM+ version 2302, a commercial package that solves the URANS equations using the SST k-ω turbulence model. The flow analysis was divided into two parts: at-surface simulations and shallowly submerged simulations. The at-surface simulations cover resistance, powering, trim, and sinkage in the transition and planing regimes, with corresponding Froude numbers ranging from 0.42 to 1.69. The shallowly submerged simulations were performed at seven submergence depths, ranging from D/LOA = 0.0635 to D/LOA = 0.635, and at two speeds with Froude numbers of 0.21 and 0.33. The behaviors of the hydrodynamic forces and pitching moment at different operating depths were comprehensively analyzed. The results of the numerical analyses provide valuable insights into the hydrodynamic performance of semisubmersible naval ships, highlighting the critical factors influencing their resistance, powering, and maneuvering capabilities in both at-surface and submerged conditions.
The rapid evolution of international trade necessitates the adoption of intelligent digital solutions to enhance trade facilitation. The Single Window System (SWS) has emerged as a key mechanism for streamlining trade documentation, customs clearance, and regulatory compliance. However, traditional SWS implementations face challenges such as data fragmentation, inefficient processing, and limited real-time intelligence. This study proposes a computational social science framework that integrates artificial intelligence (AI), machine learning, network analytics, and blockchain to optimize SWS operations. By employing predictive modeling, agent-based simulations, and algorithmic governance, this research demonstrates how computational methodologies improve trade efficiency, enhance regulatory compliance, and reduce transaction costs. Empirical case studies on AI-driven customs clearance, blockchain-enabled trade transparency, and network-based trade policy simulation illustrate the practical applications of these techniques. The study concludes that interdisciplinary collaboration and algorithmic governance are essential for advancing digital trade facilitation, ensuring resilience, transparency, and adaptability in global trade ecosystems.
Streptococcus suis (S. suis) is a major pathogen impacting pig farming globally; it can also be transmitted to humans through the consumption of raw pork. A comprehensive study was recently carried out to determine the relevant indices across multiple geographic regions in China. Methods: Well-posedness theorems were employed to conduct a thorough analysis of the model's features, including positivity, boundedness, equilibria, the reproduction number, and parameter sensitivity. Stochastic Euler, stochastic Runge-Kutta, and Euler-Maruyama are among the numerical techniques used to replicate the behavior of S. suis infection in the pig population; however, these techniques cannot preserve the dynamic qualities of the proposed model. Results: For the stochastic delay differential equations of the model, a non-standard finite difference (NSFD) approach in the stochastic sense is developed to avoid problems such as negativity, unboundedness, inconsistency, and instability of the findings. Results from traditional stochastic methods either converge conditionally or diverge over time, whereas the stochastic NSFD method converges unconditionally to the model's true states for any non-negative step size. Conclusions: This study improves our understanding of the dynamics of S. suis infection using stochastic approaches with delay and opens up new avenues for the study of cognitive processes and neuronal analysis. The interaction behaviour and comparison profiles of the new solutions are plotted.
Disordered ferromagnets with a domain structure that exhibit a hysteresis loop when driven by an external magnetic field are essential materials for modern technological applications. Therefore, understanding and potentially controlling the hysteresis phenomenon in these materials, especially concerning the disorder-induced critical behavior on the hysteresis loop, has attracted significant experimental, theoretical, and numerical research efforts. We review the challenges of numerically modeling the physical phenomena behind the critical behavior of the hysteresis loop in disordered ferromagnetic systems, which are related to the non-equilibrium stochastic dynamics of domain walls driven by external fields. Specifically, using the extended Random Field Ising Model, we present different simulation approaches and advanced numerical techniques that adequately describe the hysteresis loop shapes and the collective nature of the magnetization fluctuations associated with the criticality of the hysteresis loop for different sample shapes and varied parameters of disorder and rate of change of the external field, as well as the influence of thermal fluctuations and demagnetizing fields. The studied examples demonstrate how these numerical approaches reveal new physical insights, providing quantitative measures of pertinent variables extracted from the systems' simulated or experimentally measured Barkhausen noise signals. The described computational techniques, which exploit inherent scale invariance, can be applied to the analysis of various complex systems, both quantum and classical, exhibiting a non-equilibrium dynamical critical point or self-organized criticality.
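The avalanche dynamics behind the hysteresis loop can be sketched with a zero-temperature random field Ising model on a one-dimensional chain: the external field is ramped slowly, and any down spin whose local field turns positive flips, possibly destabilizing its neighbors in a cascade (a Barkhausen avalanche). Chain length, disorder strength, and the field ramp are illustrative assumptions; the studies reviewed use the extended model in higher dimensions.

```python
import numpy as np

def half_loop(R=1.0, n=2000, seed=3):
    """Rising branch of a zero-temperature RFIM hysteresis loop (J = 1)."""
    rng = np.random.default_rng(seed)
    h = rng.normal(0.0, R, n)            # quenched random fields, strength R
    s = -np.ones(n)                       # start fully magnetized down
    fields = np.linspace(-3, 3, 121)      # slow adiabatic field ramp
    mags = []
    for H in fields:
        while True:                       # relax: cascade of avalanche flips
            local = h + H + np.roll(s, 1) + np.roll(s, -1)  # periodic chain
            unstable = (s < 0) & (local > 0)
            if not unstable.any():
                break
            s[unstable] = 1.0             # flips may destabilize neighbors
        mags.append(s.mean())
    return fields, np.array(mags)

H, m = half_loop()
```

The magnetization m(H) rises in discrete jumps (avalanches) rather than smoothly; recording the jump sizes over many disorder realizations is what produces the Barkhausen noise statistics discussed above.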
The conventional computing architecture faces substantial challenges, including high latency and energy consumption between memory and processing units. In response, in-memory computing has emerged as a promising alternative architecture, enabling computing operations within memory arrays to overcome these limitations. Memristive devices have gained significant attention as key components for in-memory computing due to their high-density arrays, rapid response times, and ability to emulate biological synapses. Among these devices, two-dimensional (2D) material-based memristor and memtransistor arrays have emerged as particularly promising candidates for next-generation in-memory computing, thanks to their exceptional performance driven by the unique properties of 2D materials, such as layered structures, mechanical flexibility, and the capability to form heterojunctions. This review delves into the state-of-the-art research on 2D material-based memristive arrays, encompassing critical aspects such as material selection, device performance metrics, array structures, and potential applications. Furthermore, it provides a comprehensive overview of the current challenges and limitations associated with these arrays, along with potential solutions. The primary objective of this review is to serve as a significant milestone in realizing next-generation in-memory computing utilizing 2D materials and to bridge the gap from single-device characterization to array-level and system-level implementations of neuromorphic computing, leveraging the potential of 2D material-based memristive devices.
Powered by advanced information technology, more and more complex systems are exhibiting characteristics of cyber-physical-social systems (CPSS). In this context, the computational experiments method has emerged as a novel approach for the design, analysis, management, control, and integration of CPSS, which can realize the causal analysis of complex systems by means of the "algorithmization" of "counterfactuals". However, because CPSS involve human and social factors (e.g., autonomy, initiative, and sociality), it is difficult for traditional design of experiments (DOE) methods to achieve a generative explanation of system emergence. To address this challenge, this paper proposes an integrated approach to the design of computational experiments, incorporating three key modules: 1) a descriptive module, which determines the influencing factors and response variables of the system by modeling an artificial society; 2) an interpretative module, which selects a factorial experimental design to identify the relationship between influencing factors and macro phenomena; and 3) a predictive module, which builds a meta-model equivalent to the artificial society to explore its operating laws. Finally, a case study of crowd-sourcing platforms is presented to illustrate the application process and effectiveness of the proposed approach, which can reveal the social impact of algorithmic behavior on the "rider race".
With the rapid advancement of Internet of Vehicles (IoV) technology, the demands of real-time navigation, advanced driver-assistance systems (ADAS), vehicle-to-vehicle (V2V) and vehicle-to-infrastructure (V2I) communications, and multimedia entertainment systems have made in-vehicle applications increasingly computing-intensive and delay-sensitive. These applications require significant computing resources, which can overwhelm the limited computing capabilities of vehicle terminals, despite advancements in computing hardware, due to the complexity of tasks, energy consumption, and cost constraints. To address this issue in IoV-based edge computing, particularly in scenarios where the computing resources available in vehicles are scarce, a multi-master multi-slave double-layer game model based on task offloading and pricing strategies is proposed. The existence of a Nash equilibrium of the game is proven, and a distributed artificial bee colony algorithm is employed to reach the game equilibrium. The proposed solution addresses these bottlenecks by leveraging a game-theoretic approach to task offloading and resource allocation in mobile edge computing (MEC)-enabled IoV environments. Simulation results demonstrate that the proposed scheme outperforms existing solutions in terms of convergence speed and system utility; specifically, the total revenue achieved by our scheme surpasses other algorithms by at least 8.98%.
Practical real-world scenarios such as the Internet, social networks, and biological networks present the challenges of data scarcity and complex correlations, which limit the applications of artificial intelligence. The graph structure is a typical tool used to formulate such correlations, but it is incapable of modeling high-order correlations among different objects in systems; thus, the graph structure cannot fully convey the intricate correlations among objects. Confronted with these two challenges, hypergraph computation models high-order correlations among data, knowledge, and rules through hyperedges and leverages these high-order correlations to enhance the data. Additionally, hypergraph computation achieves collaborative computation using data and high-order correlations, thereby offering greater modeling flexibility. In particular, we introduce three types of hypergraph computation methods: ① hypergraph structure modeling, ② hypergraph semantic computing, and ③ efficient hypergraph computing. We then specify how to adopt hypergraph computation in practice by focusing on specific tasks such as three-dimensional (3D) object recognition, revealing that hypergraph computation can reduce the data requirement by 80% while achieving comparable performance, or improve performance by 52% given the same data, compared with a traditional data-based method. A comprehensive overview of the applications of hypergraph computation in diverse domains, such as intelligent medicine and computer vision, is also provided. Finally, we introduce an open-source deep learning library, DeepHypergraph (DHG), which can serve as a tool for the practical usage of hypergraph computation.
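The basic hypergraph-computation primitive can be sketched with an incidence matrix: vertices connect to hyperedges, and one smoothing step propagates vertex features through shared hyperedges using the commonly used symmetric normalization. The tiny hypergraph and features are illustrative assumptions, and this does not show DHG's actual API.

```python
import numpy as np

# Vertex-hyperedge incidence matrix: 5 vertices, 2 hyperedges.
# e0 = {0, 1, 2} and e1 = {2, 3, 4} overlap at vertex 2, so a hyperedge
# relates more than two objects at once -- unlike a plain graph edge.
H = np.array([[1, 0],
              [1, 0],
              [1, 1],
              [0, 1],
              [0, 1]], dtype=float)

Dv = np.diag(H.sum(axis=1))            # vertex degrees
De = np.diag(H.sum(axis=0))            # hyperedge degrees
Dv_i = np.linalg.inv(np.sqrt(Dv))      # Dv^{-1/2} (diagonal, safe to invert)

# Symmetrically normalized propagation operator Dv^-1/2 H De^-1 H^T Dv^-1/2.
theta = Dv_i @ H @ np.linalg.inv(De) @ H.T @ Dv_i

X = np.array([[1.0], [1.0], [1.0], [0.0], [0.0]])  # vertex features
X1 = theta @ X                          # one hypergraph smoothing step
```

After one step, vertices 3 and 4, which start with zero signal, receive some through the bridging vertex 2 in hyperedge e1: exactly the high-order information flow the paragraph above describes.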
Abstract: The purpose of this review is to explore the intersection of computational engineering and biomedical science, highlighting the transformative potential this convergence holds for innovation in healthcare and medical research. The review covers key topics such as computational modelling, bioinformatics, machine learning in medical diagnostics, and the integration of wearable technology for real-time health monitoring. Major findings indicate that computational models have significantly enhanced the understanding of complex biological systems, while machine learning algorithms have improved the accuracy of disease prediction and diagnosis. The synergy between bioinformatics and computational techniques has led to breakthroughs in personalized medicine, enabling more precise treatment strategies. Additionally, the integration of wearable devices with advanced computational methods has opened new avenues for continuous health monitoring and early disease detection. The review emphasizes the need for interdisciplinary collaboration to further advance this field. Future research should focus on developing more robust and scalable computational models, enhancing data integration techniques, and addressing ethical considerations related to data privacy and security. By fostering innovation at the intersection of these disciplines, the potential to revolutionize healthcare delivery and outcomes becomes increasingly attainable.
Funding: Supported by the National Key Research and Development Program of China (Grant No. 2022YFA1405600), the Beijing Natural Science Foundation (Grant No. Z210006), the National Natural Science Foundation of China—Young Scientists Fund (Grant Nos. 12104051 and 62122004), the Hong Kong Research Grant Council (Grant Nos. 27206321, 17205922, 17212923, and C1009-22GF), and the Shenzhen Science and Technology Innovation Commission (SGDX20220530111405040); partially supported by ACCESS—AI Chip Center for Emerging Smart Systems, sponsored by the Innovation and Technology Fund (ITF), Hong Kong SAR.
Abstract: Recurrent neural networks (RNNs) have proven indispensable for processing sequential and temporal data, with extensive applications in language modeling, text generation, machine translation, and time-series forecasting. Despite their versatility, RNNs are frequently beset by significant training expense and slow convergence, which impede their deployment in edge AI applications. Reservoir computing (RC), a specialized RNN variant, is attracting increased attention as a cost-effective alternative for processing temporal and sequential data at the edge. RC's distinctive advantage stems from its compatibility with emerging memristive hardware, which leverages the energy efficiency and reduced footprint of analog in-memory and in-sensor computing, offering a streamlined and energy-efficient solution. This review explains RC's underlying principles and fabrication processes, and surveys recent progress in nano-memristive-device-based RC systems from the viewpoints of in-memory and in-sensor RC. It covers a spectrum of memristive devices, from established oxide-based devices to cutting-edge materials-science developments, providing readers with a lucid understanding of RC's hardware implementation and fostering innovative designs for in-sensor RC systems. Lastly, we identify prevailing challenges and suggest viable solutions, paving the way for future advances in in-sensor RC technology.
Abstract: We present a comprehensive mathematical framework establishing the foundations of holographic quantum computing, a novel paradigm that leverages holographic phenomena to achieve superior error correction and algorithmic efficiency. We rigorously demonstrate that quantum information can be encoded and processed using holographic principles, establishing fundamental theorems characterizing the error-correcting properties of holographic codes. We develop a complete set of universal quantum gates with explicit constructions and prove exponential speedups for specific classes of computational problems. Our framework demonstrates that holographic quantum codes achieve a code rate scaling as O(1/log n), superior to traditional quantum LDPC codes, while providing inherent protection against errors via geometric properties of the code structures. We prove a threshold theorem establishing that arbitrary quantum computations can be performed reliably when physical error rates fall below a constant threshold. Notably, our analysis suggests that certain algorithms, including those involving high-dimensional state spaces and long-range interactions, achieve exponential speedups over both classical and conventional quantum approaches. This work establishes the theoretical foundations for a new approach to quantum computation that provides natural fault tolerance and scalability, directly addressing longstanding challenges of the field.
Funding: Supported by the National Natural Science Foundation of China (No. 62231012), the Natural Science Foundation for Outstanding Young Scholars of Heilongjiang Province (Grant YQ2020F001), and the Heilongjiang Province Postdoctoral General Foundation (Grant AUGA4110004923).
Abstract: Low Earth orbit (LEO) satellites with wide coverage can carry mobile edge computing (MEC) servers with powerful computing capabilities to form a LEO satellite edge computing system that provides computing services for ground users worldwide. In this paper, the computation offloading and resource allocation problems are formulated as a mixed-integer nonlinear program (MINLP). We propose a computation offloading algorithm based on the deep deterministic policy gradient (DDPG) to obtain the user offloading decisions and user uplink transmission power, and use a convex optimization algorithm based on the Lagrange multiplier method to obtain the optimal MEC server resource allocation. In addition, an expression for the suboptimal user local CPU cycles is derived by a relaxation method. Simulation results show that the proposed algorithm achieves excellent convergence and significantly reduces the system utility values, at a considerable time cost, compared with other algorithms.
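The Lagrange multiplier step has a well-known closed form when a server splits a cycle budget F among tasks to minimize total delay sum(c_i/f_i): the optimal share is proportional to sqrt(c_i). A minimal sketch under that standard formulation (the paper's exact objective and constraints may differ):

```python
import math

def allocate_cycles(task_cycles, F_total):
    """Closed-form CPU-cycle split minimizing sum(c_i / f_i)
    subject to sum(f_i) = F_total, f_i > 0.

    Setting d/df_i [sum(c_i/f_i) + mu*(sum(f_i) - F)] = 0 gives
    f_i* = sqrt(c_i / mu), so f_i* is proportional to sqrt(c_i).
    """
    roots = [math.sqrt(c) for c in task_cycles]
    s = sum(roots)
    return [F_total * r / s for r in roots]
```

A quick check: any other feasible split yields a larger total delay, which is the defining property of the multiplier solution.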
Funding: Supported in part by a subproject of the 2020 National Key Research and Development Plan (No. 2020YFC1511704), Beijing Information Science and Technology University (Nos. 2020KYNH212 and 2021CGZH302), the Beijing Science and Technology Project (Grant No. Z211100004421009), and in part by the National Natural Science Foundation of China (Grant No. 62301058).
Abstract: Low Earth orbit (LEO) satellite networks have the advantages of low transmission delay and low deployment cost, playing an important role in providing reliable services to ground users. This paper studies an efficient inter-satellite cooperative computation offloading (ICCO) algorithm for LEO satellite networks. Specifically, an ICCO system model is constructed that uses neighboring satellites in the LEO network to collaboratively process tasks generated by ground user terminals, effectively improving resource utilization. The optimization objective of minimizing the system's task offloading delay and energy consumption is established and decoupled into two sub-problems. For computational resource allocation, the convexity of the problem is proved through theoretical derivation, and the Lagrange multiplier method is used to obtain the optimal allocation of computational resources. For the task offloading decision, a dynamic sticky binary particle swarm optimization algorithm is designed to obtain the offloading decision iteratively. Simulation results show that the ICCO algorithm effectively reduces delay and energy consumption.
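The sticky binary particle swarm idea, bits that resist flipping for a while after they change, can be sketched as follows. The update probabilities and decay schedule here are illustrative guesses, not the paper's tuned rule:

```python
import random

def sticky_bpso(cost, n_bits, n_particles=15, iters=150, max_stick=8, seed=1):
    """Simplified sticky binary PSO for 0/1 offloading decisions.

    Each bit carries a 'stickiness' counter: a freshly flipped bit resists
    flipping again, and the resistance decays over `max_stick` steps.
    """
    rng = random.Random(seed)
    pos = [[rng.randint(0, 1) for _ in range(n_bits)] for _ in range(n_particles)]
    stick = [[0] * n_bits for _ in range(n_particles)]
    pbest = [p[:] for p in pos]
    pcost = [cost(p) for p in pos]
    g = min(range(n_particles), key=lambda i: pcost[i])
    gbest, gcost = pbest[g][:], pcost[g]
    for _ in range(iters):
        for i in range(n_particles):
            for j in range(n_bits):
                s = stick[i][j] / max_stick            # 1 = fully sticky
                p_pb = 0.35 * (1 - s) * (pos[i][j] != pbest[i][j])
                p_gb = 0.45 * (1 - s) * (pos[i][j] != gbest[j])
                p_rand = 0.02 * (1 - s)                # exploration term
                if rng.random() < p_pb + p_gb + p_rand:
                    pos[i][j] ^= 1
                    stick[i][j] = max_stick            # flipped bit sticks
                else:
                    stick[i][j] = max(0, stick[i][j] - 1)
            c = cost(pos[i])
            if c < pcost[i]:
                pbest[i], pcost[i] = pos[i][:], c
                if c < gcost:
                    gbest, gcost = pos[i][:], c
    return gbest, gcost
```

In the ICCO setting, `cost` would combine the delay and energy of a candidate offloading assignment; a toy Hamming-distance cost suffices to exercise the search.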
Funding: Supported in part by the Natural Science Foundation of China (62171110, U19B2028, and U20B2070).
Abstract: Recently, the Fog Radio Access Network (F-RAN) has gained considerable attention because of its flexible architecture, which allows rapid response to user requirements. In this paper, computational offloading in F-RAN is considered, where multiple user equipments (UEs) offload their computational tasks to the F-RAN through fog nodes. Each UE can select one fog node to offload its task, and each fog node may serve multiple UEs. The tasks are computed by the fog nodes or further offloaded to the cloud via a capacity-limited fronthaul link. To compute all UEs' tasks quickly, joint optimization of the UE-fog association and the radio and computation resources of the F-RAN is proposed to minimize the maximum latency over all UEs. This min-max problem is formulated as a mixed-integer nonlinear program (MINP). To tackle it, the MINP is first reformulated as a continuous optimization problem, and the majorization-minimization (MM) method is then used to find a solution. The MM approach we develop is unconventional in that each MM subproblem is solved inexactly, with the same provable convergence guarantee as exact MM, thereby reducing the complexity of each MM iteration. In addition, a cooperative offloading model is considered, in which the fog nodes compress-and-forward their received signals to the cloud. Under this model, a similar min-max latency optimization problem is formulated and tackled by the inexact MM. Simulation results show that the proposed algorithms outperform several offloading strategies, and that cooperative offloading exploits transmission diversity better than non-cooperative offloading, achieving better latency performance.
Abstract: Over-the-air computation (AirComp) enables federated learning (FL) to rapidly aggregate local models at the central server by using the waveform superposition property of the wireless channel. In this paper, a robust transmission scheme for an AirComp-based FL system with imperfect channel state information (CSI) is proposed. To model CSI uncertainty, an expectation-based error model is utilized. The main objective is to maximize the number of selected devices that meet the mean-squared error (MSE) requirements for model broadcast and model aggregation. The problem is formulated as a combinatorial optimization problem and solved in two steps. First, the priority order of devices is determined by a sparsity-inducing procedure. Then, a feasibility detection scheme selects the maximum number of devices while guaranteeing that the MSE requirements are met. An alternating optimization (AO) scheme transforms the resulting nonconvex problem into two convex subproblems. Numerical results illustrate the effectiveness and robustness of the proposed scheme.
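The two-step selection, a priority order followed by feasibility detection, reduces to a simple greedy scan once an MSE oracle is available. A hedged sketch with a placeholder `mse_of` evaluator standing in for the AO-based MSE computation:

```python
def select_devices(priority, mse_of, mse_max):
    """Greedy feasibility detection: walk devices in priority order and keep
    the largest prefix whose aggregation MSE stays within `mse_max`.

    `priority` is the device order produced by the sparsity-inducing step;
    `mse_of(subset)` is a placeholder for evaluating the achievable MSE of a
    candidate device set (in the paper, this is where AO is invoked).
    """
    chosen = []
    for d in priority:
        trial = chosen + [d]
        if mse_of(trial) <= mse_max:
            chosen = trial          # device d fits; keep it
        else:
            break                   # adding any lower-priority device fails
    return chosen
```

The toy `mse_of` below makes MSE grow linearly with the number of devices, just to exercise the stopping rule.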
Funding: Supported by the Communication University of China (HG23035) and in part by the Fundamental Research Funds for the Central Universities (CUC230A013).
Abstract: With the rapid growth of social media, the spread of fake news has become a growing problem, misleading the public and causing significant harm. As social media content is often composed of both images and text, multimodal approaches to fake news detection have gained significant attention. To address the shortcomings of previous multimodal fake news detection algorithms, such as insufficient feature extraction and underuse of the semantic relations between modalities, this paper proposes the MFFFND-Co (Multimodal Feature Fusion Fake News Detection with Co-Attention Block) model. First, the model deeply explores textual content, image content, and frequency-domain features. Then, it employs a co-attention mechanism for cross-modal fusion. Additionally, a semantic consistency detection module is designed to quantify semantic deviations, thereby enhancing detection performance. Verified experimentally on two commonly used datasets, Twitter and Weibo, the model achieved F1 scores of 90.0% and 94.0%, respectively, significantly outperforming the pre-modified MFFFND (Multimodal Feature Fusion Fake News Detection with Attention Block) model and surpassing other baseline models. This improves the accuracy of fake-information detection in AI-based and engineering-software-based detection.
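The co-attention fusion at the heart of MFFFND-Co can be miniaturized to its essentials: a shared affinity matrix between text-token and image-region features drives attention in both directions. The sketch below omits the learned projection matrices and scaling, so it shows only the mechanism, not the model:

```python
import math

def softmax(row):
    m = max(row)
    e = [math.exp(v - m) for v in row]
    s = sum(e)
    return [v / s for v in e]

def matmul(A, B):
    return [[sum(a * b for a, b in zip(row, col)) for col in zip(*B)] for row in A]

def co_attention(T, V):
    """Bare-bones co-attention between text features T (one vector per
    token) and image features V (one vector per region)."""
    # affinity matrix: C[i][j] = <text token i, image region j>
    C = [[sum(a * b for a, b in zip(t, v)) for v in V] for t in T]
    att_t2v = [softmax(row) for row in C]         # tokens attend over regions
    att_v2t = [softmax(list(col)) for col in zip(*C)]  # regions over tokens
    T_ctx = matmul(att_t2v, V)                    # image-aware text features
    V_ctx = matmul(att_v2t, T)                    # text-aware image features
    return T_ctx, V_ctx
```

With near-orthogonal inputs, each token locks onto its matching region and vice versa, which is the behavior the fusion block relies on.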
Abstract: This paper develops a comprehensive computational modeling and simulation framework based on Complex Adaptive Systems (CAS) theory to unveil the underlying mechanisms of self-organization, nonlinear evolution, and emergence in social systems. By integrating mathematical models, agent-based modeling, network dynamic analysis, and hybrid modeling approaches, the study applies CAS theory to case studies in economic markets, political decision-making, and social interactions. The experimental results demonstrate that local interactions among individual agents can give rise to complex global phenomena, such as market fluctuations, opinion polarization, and sudden outbreaks of social movements. This framework not only provides a more robust explanation for the nonlinear dynamics and abrupt transitions that traditional models often fail to capture, but also offers valuable decision-support tools for public policy formulation, social governance, and risk management. Emphasizing the importance of interdisciplinary approaches, this work outlines future research directions in high-performance computing, artificial intelligence, and real-time data integration to further advance the theoretical and practical applications of CAS in the social sciences.
Funding: Supported by the Incubation Program of Youth Innovation in Shandong Province and the Key Research and Development Program of Shandong Province (2021TZXD007).
Abstract: Food allergy has become a global concern. Spleen tyrosine kinase (SYK) inhibitors are promising therapeutics against allergic disorders. In this study, a total of 300 natural phenolic compounds were first subjected to virtual screening. Sesamin and its metabolites, sesamin monocatechol (SC-1) and sesamin dicatechol (SC-2), were identified as potential SYK inhibitors, showing high binding affinity and inhibition efficiency towards SYK. Compared with R406 (a traditional SYK inhibitor), sesamin, SC-1, and SC-2 had a lower binding energy and inhibition constant (Ki) in molecular docking; exhibited higher bioavailability, safety, metabolism/clearance rate, and distribution uniformity in ADMET predictions; and showed high stability in occupying the ATP-binding pocket of SYK during molecular dynamics simulations. In anti-dinitrophenyl immunoglobulin E (anti-DNP-IgE)/dinitrophenyl-human serum albumin (DNP-HSA)-stimulated rat basophilic leukemia (RBL-2H3) cells, sesamin in the concentration range of 5-80 μmol/L significantly influenced degranulation and cytokine release, with 54.00% inhibition of β-hexosaminidase release and a 58.45% decrease in histamine. In BALB/c mice, sesamin ameliorated anti-DNP-IgE/DNP-HSA-induced passive cutaneous anaphylaxis (PCA) and ovalbumin (OVA)-induced active systemic anaphylaxis (ASA) reactions; reduced the levels of allergic mediators (immunoglobulins and pro-inflammatory cytokines); partially corrected the imbalance of T helper (Th) cell differentiation in the spleen; and inhibited the phosphorylation of SYK and its downstream signaling proteins in the spleen, including p38 mitogen-activated protein kinase (p38 MAPK), extracellular signal-regulated kinase (ERK), and p65 nuclear factor-κB (p65 NF-κB). Thus, sesamin may be a safe and versatile SYK inhibitor that can alleviate IgE-mediated food allergies.
Abstract: This paper proposes an innovative approach to social science research based on quantum theory, integrating quantum probability, quantum game theory, and quantum statistical methods into a comprehensive interdisciplinary framework for both theoretical and empirical investigation. The study elaborates on how core quantum concepts such as superposition, interference, and measurement collapse can be applied to model social decision-making, cognition, and interactions. Advanced quantum computational methods and algorithms are employed to move from theoretical model development to simulation and experimental validation. Through case studies in international relations, economic games, and political decision-making, the research demonstrates that quantum models possess significant advantages in explaining irrational and context-dependent behaviors that traditional methods often fail to capture. The paper also explores the potential applications of quantum social science in policy formulation and public decision-making, addresses the ethical, privacy, and social equity challenges posed by quantum artificial intelligence, and outlines future research directions at the convergence of quantum AI, quantum machine learning, and big data analytics. The findings suggest that quantum social science not only offers a novel perspective for understanding complex social phenomena but also lays the foundation for more accurate and efficient systems for social forecasting and decision support.
Abstract: The integration of technologies such as artificial intelligence, 6G, and vehicular ad hoc networks holds great potential to meet the communication demands of the Internet of Vehicles (IoV) and drive the advancement of vehicle applications. However, these advancements also generate a surge in data processing requirements, necessitating the offloading of vehicular tasks to edge servers due to the limited computational capacity of vehicles. Despite recent advances, the robustness and scalability of existing approaches with respect to the number of vehicles, the number of edge servers and their resources, as well as privacy, remain a concern. In this paper, a lightweight offloading strategy is proposed that leverages ubiquitous connectivity through the Space-Air-Ground Integrated Vehicular Network architecture while ensuring privacy preservation. The IoV environment is first modeled as a graph, with vehicles and base stations as nodes and their communication links as edges. Second, vehicular applications are offloaded to suitable servers based on latency using an attention-based heterogeneous graph neural network (HetGNN) algorithm. Subsequently, a differentially private stochastic gradient descent training mechanism is employed for privacy preservation of vehicles and offloading inference. Finally, the simulation results demonstrate that the proposed HetGNN method performs well, with an inference time of 0.321 s, which is 42.68%, 63.93%, 30.22%, and 76.04% less than baseline methods such as Deep Deterministic Policy Gradient, Deep Q-Learning, Deep Neural Network, and Genetic Algorithm, respectively.
Abstract: Semisubmersible naval ships are versatile military craft that combine the advantageous features of high-speed planing craft and submarines. At the surface, these ships are designed to provide sufficient speed and maneuverability; in addition, they can perform shallow dives, offering low visual and acoustic detectability. Therefore, the hydrodynamic design of a semisubmersible naval ship should address both at-surface and submerged conditions. In this study, numerical analyses were performed using a semisubmersible hull form to analyze its hydrodynamic features, including resistance, powering, and maneuvering. The simulations were conducted with Star-CCM+ version 2302, a commercial package that solves the URANS equations using the SST k-ω turbulence model. The flow analysis was divided into two parts: at-surface simulations and shallowly submerged simulations. The at-surface simulations cover resistance, powering, trim, and sinkage in the transition and planing regimes, with corresponding Froude numbers ranging from 0.42 to 1.69. The shallowly submerged simulations were performed at seven submergence depths, ranging from D/LOA = 0.0635 to D/LOA = 0.635, and at two speeds with Froude numbers of 0.21 and 0.33. The behavior of the hydrodynamic forces and pitching moment at different operating depths was comprehensively analyzed. The results of the numerical analyses provide valuable insights into the hydrodynamic performance of semisubmersible naval ships, highlighting the critical factors influencing their resistance, powering, and maneuvering capabilities in both at-surface and submerged conditions.
Abstract: The rapid evolution of international trade necessitates the adoption of intelligent digital solutions to enhance trade facilitation. The Single Window System (SWS) has emerged as a key mechanism for streamlining trade documentation, customs clearance, and regulatory compliance. However, traditional SWS implementations face challenges such as data fragmentation, inefficient processing, and limited real-time intelligence. This study proposes a computational social science framework that integrates artificial intelligence (AI), machine learning, network analytics, and blockchain to optimize SWS operations. Employing predictive modeling, agent-based simulations, and algorithmic governance, this research demonstrates how computational methodologies improve trade efficiency, enhance regulatory compliance, and reduce transaction costs. Empirical case studies on AI-driven customs clearance, blockchain-enabled trade transparency, and network-based trade policy simulation illustrate the practical applications of these techniques. The study concludes that interdisciplinary collaboration and algorithmic governance are essential for advancing digital trade facilitation, ensuring resilience, transparency, and adaptability in global trade ecosystems.
Funding: Supported by the Deanship of Scientific Research, Vice Presidency for Graduate Studies and Scientific Research, King Faisal University, Saudi Arabia [KFU250259].
Abstract: Streptococcus suis (S. suis) is a major disease impacting pig farming globally; it can also be transferred to humans by eating raw pork. A comprehensive study was recently carried out to determine the relevant indices across multiple geographic regions in China. Methods: Well-posedness theorems were employed to conduct a thorough analysis of the model's features, including positivity, boundedness, equilibria, the reproduction number, and parameter sensitivity. Stochastic Euler, Runge-Kutta, and Euler-Maruyama are among the numerical techniques used to replicate the behavior of S. suis infection in the pig population; however, the dynamic qualities of the proposed model cannot be preserved using these techniques. Results: For the stochastic delay differential equations of the model, a non-standard finite difference (NSFD) approach in the stochastic sense is developed to avoid problems such as negativity, unboundedness, inconsistency, and instability of the findings. Results from traditional stochastic methods either converge conditionally or diverge over time, whereas the stochastic NSFD method converges unconditionally to the model's true states for any non-negative step size. Conclusions: This study improves our understanding of the dynamics of S. suis infection using stochastic approaches with delay and opens up new avenues for the study of cognitive processes and neuronal analysis. The interaction behaviors and new solution comparison profiles are plotted.
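The positivity argument behind the NSFD construction is easy to demonstrate on the deterministic decay dx/dt = -λx: explicit Euler goes negative once λh > 1, while the NSFD denominator form stays positive for any step size. The paper's scheme extends this idea to stochastic delay equations; the sketch below covers only the deterministic kernel.

```python
def euler_step(x, lam, h):
    # explicit Euler: x_{n+1} = x_n * (1 - lam*h); sign flips when lam*h > 1
    return x * (1.0 - lam * h)

def nsfd_step(x, lam, h):
    # non-standard finite difference: x_{n+1} = x_n / (1 + lam*h)
    # positive and monotonically decreasing for any step size h > 0
    return x / (1.0 + lam * h)

def simulate(step, x0, lam, h, n):
    """Iterate a one-step scheme n times from x0."""
    xs = [x0]
    for _ in range(n):
        xs.append(step(xs[-1], lam, h))
    return xs
```

With λ = 1 and the deliberately large step h = 1.5, Euler oscillates through negative values while NSFD decays monotonically to zero, mirroring the negativity/instability issues the abstract attributes to traditional schemes.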
Funding: Djordje Spasojevic and Svetislav Mijatovic acknowledge support from the Ministry of Science, Technological Development and Innovation of the Republic of Serbia (Agreement No. 451-03-65/2024-03/200162); S.J. from the same Ministry (Agreement No. 451-03-65/2024-03/200122); and Bosiljka Tadic from the Slovenian Research Agency (program P1-0044).
Abstract: Disordered ferromagnets with a domain structure that exhibit a hysteresis loop when driven by an external magnetic field are essential materials for modern technological applications. Therefore, understanding and potentially controlling the hysteresis phenomenon in these materials, especially the disorder-induced critical behavior on the hysteresis loop, has attracted significant experimental, theoretical, and numerical research effort. We review the challenges of numerically modeling the physical phenomena behind the critical behavior of the hysteresis loop in disordered ferromagnetic systems, related to the non-equilibrium stochastic dynamics of domain walls driven by external fields. Specifically, using the extended random-field Ising model, we present different simulation approaches and advanced numerical techniques that adequately describe the hysteresis loop shapes and the collective nature of the magnetization fluctuations associated with hysteresis loop criticality, for different sample shapes and varied parameters of disorder and rate of change of the external field, as well as the influence of thermal fluctuations and demagnetizing fields. The studied examples demonstrate how these numerical approaches reveal new physical insights, providing quantitative measures of pertinent variables extracted from the systems' simulated or experimentally measured Barkhausen noise signals. The described computational techniques, which exploit inherent scale invariance, can be applied to the analysis of various complex systems, both quantum and classical, exhibiting a non-equilibrium dynamical critical point or self-organized criticality.
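The zero-temperature dynamics behind such hysteresis simulations fit in a few lines for a 1D toy version of the random-field Ising model: sweep the external field upward and relax avalanches of spin flips at each step. The names and parameters below are illustrative, not the review's production codes:

```python
import random

def rfim_hysteresis_branch(n, disorder, J=1.0, seed=3):
    """Ascending branch of the zero-temperature random-field Ising chain.

    A down spin flips up when its local field J*(neighbors) + h_i + H turns
    positive; each flip can destabilize neighbors, producing an avalanche.
    Returns a list of (H, magnetization) pairs along the branch.
    """
    rng = random.Random(seed)
    h = [rng.gauss(0.0, disorder) for _ in range(n)]  # quenched random fields
    s = [-1] * n                                      # start fully magnetized down

    def local(i, H):
        left = s[i - 1] if i > 0 else 0
        right = s[i + 1] if i < n - 1 else 0
        return J * (left + right) + h[i] + H

    branch = []
    H = -6.0
    while H <= 6.0:
        unstable = [i for i in range(n) if s[i] < 0 and local(i, H) > 0]
        while unstable:                               # relax one avalanche
            for i in unstable:
                s[i] = 1
            unstable = [i for i in range(n) if s[i] < 0 and local(i, H) > 0]
        branch.append((H, sum(s) / n))
        H += 0.05
    return branch
```

Because spins only flip up along the ascending branch, the magnetization is monotone non-decreasing, and the avalanche sizes between consecutive field steps are the toy analogue of Barkhausen noise pulses.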
Funding: This work was supported by the National Research Foundation, Singapore, under Award No. NRF-CRP24-2020-0002.
Abstract: The conventional computing architecture faces substantial challenges, including high latency and energy consumption between memory and processing units. In response, in-memory computing has emerged as a promising alternative architecture, enabling computing operations within memory arrays to overcome these limitations. Memristive devices have gained significant attention as key components for in-memory computing due to their high-density arrays, rapid response times, and ability to emulate biological synapses. Among these devices, two-dimensional (2D) material-based memristor and memtransistor arrays have emerged as particularly promising candidates for next-generation in-memory computing, thanks to their exceptional performance driven by the unique properties of 2D materials, such as layered structures, mechanical flexibility, and the capability to form heterojunctions. This review delves into the state-of-the-art research on 2D material-based memristive arrays, encompassing critical aspects such as material selection, device performance metrics, array structures, and potential applications. Furthermore, it provides a comprehensive overview of the current challenges and limitations associated with these arrays, along with potential solutions. The primary objective of this review is to serve as a significant milestone in realizing next-generation in-memory computing utilizing 2D materials and to bridge the gap from single-device characterization to array-level and system-level implementations of neuromorphic computing, leveraging the potential of 2D material-based memristive devices.
Funding: Supported by the National Key Research and Development Program of China (2021YFF0900800), the National Natural Science Foundation of China (61972276, 62206116, and 62032016), the New Liberal Arts Reform and Practice Project of the National Ministry of Education (2021170002), the Open Research Fund of the State Key Laboratory for Management and Control of Complex Systems (20210101), and the Tianjin University Talent Innovation Reward Program for Literature and Science Graduate Students (C1-2022-010).
Abstract: Powered by advanced information technology, more and more complex systems are exhibiting characteristics of cyber-physical-social systems (CPSS). In this context, the computational experiments method has emerged as a novel approach for the design, analysis, management, control, and integration of CPSS, which can realize the causal analysis of complex systems by means of the "algorithmization" of "counterfactuals". However, because CPSS involve human and social factors (e.g., autonomy, initiative, and sociality), it is difficult for traditional design of experiments (DOE) methods to achieve a generative explanation of system emergence. To address this challenge, this paper proposes an integrated approach to the design of computational experiments, incorporating three key modules: 1) Descriptive module: determining the influencing factors and response variables of the system by modeling an artificial society; 2) Interpretative module: selecting a factorial experimental design to identify the relationship between influencing factors and macro phenomena; 3) Predictive module: building a meta-model equivalent to the artificial society to explore its operating laws. Finally, a case study of crowd-sourcing platforms is presented to illustrate the application process and effectiveness of the proposed approach, which can reveal the social impact of algorithmic behavior on the "rider race".
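The interpretative module's factorial experimental design boils down to enumerating combinations of factor levels, one artificial-society run per combination. A minimal stand-in (the factor names are invented for the crowd-sourcing example, not taken from the paper):

```python
from itertools import product

def full_factorial(levels_per_factor):
    """Enumerate a full factorial design: one run per combination of levels.

    `levels_per_factor` maps each influencing factor to its list of levels;
    the result is a list of run configurations (dicts) to feed into the
    artificial-society simulator.
    """
    names = sorted(levels_per_factor)
    return [dict(zip(names, combo))
            for combo in product(*(levels_per_factor[n] for n in names))]
```

For k factors with two levels each, this yields the classic 2^k design; fractional designs would subsample these runs when simulation budgets are tight.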
Funding: Supported by the Central University Basic Research Business Fee Fund Project (J2023-027) and the China Postdoctoral Science Foundation (No. 2022M722248).