This work proposes that quantum circuit complexity—the minimal number of elementary operations needed to implement a quantum transformation—be established as a legitimate physical observable. We prove that circuit complexity satisfies all requirements for physical observables, including self-adjointness, gauge invariance, and a consistent measurement theory with well-defined uncertainty relations. We develop complete protocols for measuring complexity in quantum systems and demonstrate its connections to gauge theory and quantum gravity. Our results suggest that computational requirements may constitute physical laws as fundamental as energy conservation. This framework grants insights into the relationship between quantum information, gravity, and the emergence of spacetime geometry while offering practical methods for experimental verification. Our results indicate that the physical universe may be governed by both energetic and computational constraints, with profound implications for our understanding of fundamental physics.
The suprachiasmatic nucleus in the hypothalamus is the master circadian clock in mammals, coordinating physiological processes with the 24-hour day–night cycle. Comprising various cell types, the suprachiasmatic nucleus (SCN) integrates environmental signals to maintain complex and robust circadian rhythms. Understanding the complexity and synchrony within SCN neurons is essential for effective circadian clock function. Synchrony involves coordinated neuronal firing for robust rhythms, while complexity reflects diverse activity patterns and interactions, indicating adaptability. Interestingly, the SCN retains circadian rhythms in vitro, demonstrating intrinsic rhythmicity. This study introduces the multiscale structural complexity method, based on Bagrov et al.'s approach, to analyze changes in SCN neuronal activity and complexity at macro and micro levels. By examining structural complexity and local complexities across scales, we aim to understand how tetrodotoxin (TTX), a neurotoxin that inhibits action potentials, affects SCN neurons. Our method captures critical scales in neuronal interactions that traditional methods may overlook. Validation with the Goodwin model confirms the reliability of our observations. By integrating experimental data with theoretical models, this study provides new insights into the effects of TTX on neuronal complexities, contributing to the understanding of circadian rhythms.
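The multiscale structural complexity referred to above can be sketched as successive coarse-grainings of a pattern, accumulating the mismatch between overlaps at neighboring scales. The sketch below is a minimal illustration in the spirit of Bagrov et al.'s measure; the 2x2 block-averaging kernel and the number of scales are assumptions, not the study's exact settings:

```python
import numpy as np

def coarsen(img):
    """One coarse-graining step: 2x2 block averaging (halves each dimension)."""
    h, w = img.shape
    return img.reshape(h // 2, 2, w // 2, 2).mean(axis=(1, 3))

def overlap(a, b):
    """Mean scalar product of two patterns; the coarser one is
    upsampled (block-repeated) to the finer grid first."""
    if a.shape != b.shape:
        factor = a.shape[0] // b.shape[0]
        b = np.kron(b, np.ones((factor, factor)))
    return (a * b).mean()

def structural_complexity(img, n_scales=3):
    """Sum over scale pairs of |O(k,k+1) - (O(k,k) + O(k+1,k+1)) / 2|:
    zero for featureless patterns, positive when structure changes
    across scales."""
    patterns = [img.astype(float)]
    for _ in range(n_scales):
        patterns.append(coarsen(patterns[-1]))
    return sum(abs(overlap(a, b) - 0.5 * (overlap(a, a) + overlap(b, b)))
               for a, b in zip(patterns, patterns[1:]))
```

A constant image yields zero complexity, while a disordered one does not, which is the qualitative behavior the measure is built to capture.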
The dynamic and interconnected nature of construction projects requires a comprehensive understanding of complexity during pre-construction. Traditional tools such as Gantt charts, CPM, and PERT often overlook uncertainties. This study identifies 20 complexity factors through expert interviews and literature, categorising them into six groups. The Analytical Hierarchy Process evaluated the significance of the factors, establishing their corresponding weights to enhance adaptive project scheduling. A system dynamics (SD) model is developed and tested to evaluate the dynamic behaviour of the identified complexity factors. The model simulates the impact of complexity on total project duration (TPD), revealing significant deviations from initial deterministic estimates. Data collection and analysis included reliability tests, such as normality checks and Cronbach's alpha, to validate the model's components and expert feedback. Sensitivity analysis confirmed a positive relationship between complexity and project duration, with higher complexity levels resulting in increased TPD. This relationship highlights the inadequacy of static planning approaches and underscores the importance of addressing complexity dynamically. The study provides a framework for enhancing planning systems through system dynamics and recommends expanding the model to ensure broader applicability in diverse construction projects.
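The AHP weighting step described above (priority weights from pairwise comparisons, checked for consistency) can be sketched as follows. The 3-factor comparison matrix is illustrative only, not the study's elicited judgments:

```python
import numpy as np

def ahp_weights(pairwise):
    """AHP priority weights: the normalized principal eigenvector
    of the pairwise-comparison matrix."""
    vals, vecs = np.linalg.eig(pairwise)
    k = np.argmax(vals.real)
    w = np.abs(vecs[:, k].real)
    return w / w.sum()

def consistency_ratio(pairwise):
    """CR = CI / RI with Saaty's random index; judgments are usually
    deemed acceptable when CR < 0.1."""
    n = pairwise.shape[0]
    lam = np.max(np.linalg.eigvals(pairwise).real)
    ci = (lam - n) / (n - 1)
    ri = {3: 0.58, 4: 0.90, 5: 1.12, 6: 1.24}[n]
    return ci / ri

# Hypothetical 3-factor comparison matrix (values are illustrative):
A = np.array([[1.0, 3.0, 5.0],
              [1/3, 1.0, 2.0],
              [1/5, 1/2, 1.0]])
w = ahp_weights(A)
```

The resulting weights sum to one and rank the factors, which is exactly what feeds the adaptive scheduling step.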
Viral infectious diseases, characterized by their intricate nature and wide-ranging diversity, pose substantial challenges in the domain of data management. The vast volume of data generated by these diseases, spanning from the molecular mechanisms within cells to large-scale epidemiological patterns, has surpassed the capabilities of traditional analytical methods. In the era of artificial intelligence (AI) and big data, there is an urgent necessity for the optimization of these analytical methods to more effectively handle and utilize the information. Despite the rapid accumulation of data associated with viral infections, the lack of a comprehensive framework for integrating, selecting, and analyzing these datasets has left numerous researchers uncertain about which data to select, how to access them, and how to utilize them most effectively in their research. This review endeavors to fill these gaps by exploring the multifaceted nature of viral infectious diseases and summarizing relevant data across multiple levels, from the molecular details of pathogens to broad epidemiological trends. The scope extends from the micro-scale to the macro-scale, encompassing pathogens, hosts, and vectors. In addition to data summarization, this review thoroughly investigates various dataset sources. It also traces the historical evolution of data collection in the field of viral infectious diseases, highlighting the progress achieved over time. Simultaneously, it evaluates the current limitations that impede data utilization. Furthermore, we propose strategies to surmount these challenges, focusing on the development and application of advanced computational techniques, AI-driven models, and enhanced data integration practices. By providing a comprehensive synthesis of existing knowledge, this review is designed to guide future research and contribute to more informed approaches in the surveillance, prevention, and control of viral infectious diseases, particularly within the context of the expanding big-data landscape.
Visual question answering (VQA) is a multimodal task, involving a deep understanding of the image scene and the question's meaning and capturing the relevant correlations between both modalities to infer the appropriate answer. In this paper, we propose a VQA system intended to answer yes/no questions about real-world images, in Arabic. To support a robust VQA system, we work in two directions: (1) using deep neural networks, namely ResNet-152 and Gated Recurrent Units (GRU), to semantically represent the given image and question in a fine-grained manner; (2) studying the role of the utilized multimodal bilinear pooling fusion technique in the trade-off between model complexity and overall model performance. Some fusion techniques can significantly increase the model complexity, which seriously limits their applicability for VQA models. So far, there is no evidence of how efficient these multimodal bilinear pooling fusion techniques are for VQA systems dedicated to yes/no questions. Hence, a comparative analysis is conducted between eight bilinear pooling fusion techniques, in terms of their ability to reduce the model complexity and improve the model performance in this class of VQA systems. Experiments indicate that these multimodal bilinear pooling fusion techniques improved the VQA model's performance, reaching a best accuracy of 89.25%. Further, experiments have shown that the number of answers in the developed VQA system is a critical factor that affects the effectiveness of these multimodal bilinear pooling techniques in achieving their main objective of reducing the model complexity. The Multimodal Local Perception Bilinear Pooling (MLPB) technique showed the best balance between model complexity and performance for VQA systems designed to answer yes/no questions.
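The fusion step discussed above can be sketched with a generic low-rank bilinear (MLB-style) scheme: project both modality vectors, take an elementwise product, and sum-pool the joint vector. This is an illustrative variant, not the MLPB technique the paper identifies as best, and all dimensions and weights below are assumptions:

```python
import numpy as np

rng = np.random.default_rng(0)

def mlb_fuse(v, q, W_v, W_q, d_out):
    """Low-rank bilinear fusion: tanh-projected features are multiplied
    elementwise, then groups of the joint vector are sum-pooled."""
    joint = np.tanh(W_v @ v) * np.tanh(W_q @ q)
    return joint.reshape(d_out, -1).sum(axis=1)

d_img, d_txt, d_joint, d_out = 2048, 512, 1024, 256
v = rng.standard_normal(d_img)        # e.g. a ResNet-152 image feature
q = rng.standard_normal(d_txt)        # e.g. a GRU question encoding
W_v = rng.standard_normal((d_joint, d_img)) * 0.01
W_q = rng.standard_normal((d_joint, d_txt)) * 0.01
z = mlb_fuse(v, q, W_v, W_q, d_out)   # fused vector fed to the classifier
```

The low-rank factorization is what keeps the parameter count far below a full bilinear (outer-product) fusion, which is the complexity trade-off the comparison targets.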
This paper introduces a novel approach for parameter sensitivity evaluation and efficient slope reliability analysis based on the quantile-based first-order second-moment method (QFOSM). The core principles of the QFOSM are elucidated geometrically from the perspective of expanding ellipsoids. Based on this geometric interpretation, the QFOSM is further extended to estimate sensitivity indices and assess the significance of the various uncertain parameters involved in the slope system. The proposed method has the advantage of computational simplicity, akin to the conventional first-order second-moment method (FOSM), while providing estimation accuracy close to that of the first-order reliability method (FORM). Its performance is demonstrated with a numerical example and three slope examples. The results show that the proposed method can efficiently estimate the slope reliability and simultaneously evaluate the sensitivity of the uncertain parameters. The proposed method does not involve the complex optimization or iteration required by the FORM. It can provide a valuable complement to existing approximate reliability analysis methods, offering rapid sensitivity evaluation and slope reliability analysis.
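The conventional FOSM baseline mentioned above can be sketched in a few lines: the reliability index is the mean factor of safety minus one, divided by a standard deviation obtained from a first-order Taylor expansion. The infinite-slope factor of safety and every parameter value below are illustrative assumptions, not the paper's examples:

```python
import numpy as np

def fosm_beta(g, mu, sigma, h=1e-6):
    """FOSM reliability index for a factor-of-safety function g(x):
    beta = (E[FS] - 1) / std[FS], with the variance from first-order
    Taylor terms (finite-difference gradients at the mean point)."""
    mu = np.asarray(mu, dtype=float)
    fs0 = g(mu)
    var = 0.0
    for i, s in enumerate(sigma):
        x = mu.copy()
        x[i] += h
        var += (((g(x) - fs0) / h) * s) ** 2
    return (fs0 - 1.0) / np.sqrt(var)

# Hypothetical infinite-slope model (all numbers illustrative):
# FS = c / (gamma * H * sin(b) * cos(b)) + tan(phi) / tan(b)
gamma, H, b = 18.0, 5.0, np.radians(30)

def fs(x):
    c, phi = x
    return c / (gamma * H * np.sin(b) * np.cos(b)) + np.tan(phi) / np.tan(b)

# Uncertain cohesion c (kPa) and friction angle phi (rad):
beta = fosm_beta(fs, mu=[10.0, np.radians(30)], sigma=[2.0, np.radians(2)])
```

No optimization or iteration is needed, which is the computational simplicity the QFOSM inherits while improving on FOSM's accuracy.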
Investigating nature-inspired applications is a perennially appealing subject for scientists. The growing interest in structures of natural origin may be linked to their superior mechanical properties and environmental resilience. Biological composite structures with helicoidal schemes and designs have remarkable capacities to absorb impact energy and withstand damage. However, there is a dearth of extensive study on the influence of fiber redirection and reorientation inside the matrix of a helicoid structure on its mechanical performance and reactivity. The present study aimed to explore the static and transient responses of a bio-inspired helicoid laminated composite (B-iHLC) shell under the influence of an explosive load using an isomorphic method. The structural integrity of the shell is maintained by a viscoelastic basis known as the Pasternak foundation, which encompasses two coefficients of stiffness and one coefficient of damping. The equilibrium equations governing shell dynamics are obtained by using Hamilton's principle and the modified first-order shear theory, thereby obviating the need to employ a shear correction factor. The paper's model and approach are validated through numerical comparisons with respected publications. The findings of this study may be used in the construction of military and civilian infrastructure in situations where the structure is subjected to severe stresses that might result in catastrophic collapse. The findings also serve as the foundation for several other issues, including geometric optimization and the dynamic response of similar mechanical structures.
Using Euler's first-order explicit (EE) method and the peridynamic differential operator (PDDO) to discretize the time and internal crystal-size derivatives, respectively, the Euler's first-order explicit method–peridynamic differential operator (EE–PDDO) scheme was obtained for solving the one-dimensional population balance equation in crystallization. Four different conditions during crystallization were studied: size-independent growth, size-dependent growth in a batch process, nucleation and size-independent growth, and nucleation and size-dependent growth in a continuous process. The high accuracy of the EE–PDDO method was confirmed by comparing it with the numerical results obtained using the second-order upwind and HR-van methods. The method is characterized by non-oscillation and high accuracy, especially for discontinuous and sharp crystal size distributions. The stability of the EE–PDDO method, the choice of weight function in the PDDO method, and the optimal time step are also discussed.
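The time-stepping side of such a scheme can be sketched for the simplest case, size-independent growth, where the population balance reduces to pure advection in crystal size. For illustration the size derivative below uses a standard first-order upwind stencil rather than the PDDO operator, and the grid, growth rate, and initial distribution are assumptions:

```python
import numpy as np

def advect_upwind(n0, G, dL, dt, steps):
    """Explicit-Euler stepping of  dn/dt + G * dn/dL = 0  (size-independent
    growth) with a first-order upwind size derivative, standing in for the
    paper's PDDO discretization."""
    n = n0.astype(float).copy()
    c = G * dt / dL            # CFL number; explicit Euler needs c <= 1
    assert c <= 1.0
    for _ in range(steps):
        n[1:] -= c * (n[1:] - n[:-1])
        n[0] = 0.0             # no crystals entering at the smallest size
    return n

L = np.linspace(0.0, 100.0, 401)         # crystal-size grid (e.g. um)
n0 = np.exp(-0.5 * ((L - 20.0) / 4.0)**2)  # initial Gaussian distribution
n = advect_upwind(n0, G=1.0, dL=L[1] - L[0], dt=0.2, steps=100)
```

The distribution translates to larger sizes at speed G; the visible peak smearing is the numerical diffusion of the upwind stencil, precisely the artifact the higher-accuracy EE–PDDO discretization is designed to suppress on sharp distributions.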
This work introduces a modification to the Heisenberg Uncertainty Principle (HUP) by incorporating quantum complexity, including potential nonlinear effects. Our theoretical framework extends the traditional HUP to consider the complexity of quantum states, offering a more nuanced understanding of measurement precision. By adding a complexity term to the uncertainty relation, we explore nonlinear modifications such as polynomial, exponential, and logarithmic functions. Rigorous mathematical derivations demonstrate the consistency of the modified principle with classical quantum mechanics and quantum information theory. We investigate the implications of this modified HUP for various aspects of quantum mechanics, including quantum metrology, quantum algorithms, quantum error correction, and quantum chaos. Additionally, we propose experimental protocols to test the validity of the modified HUP, evaluating their feasibility with current and near-term quantum technologies. This work highlights the importance of quantum complexity in quantum mechanics and provides a refined perspective on the interplay between complexity, entanglement, and uncertainty in quantum systems. The modified HUP has the potential to stimulate interdisciplinary research at the intersection of quantum physics, information theory, and complexity theory, with significant implications for the development of quantum technologies and the understanding of the quantum-to-classical transition.
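One illustrative way to write such a complexity-modified uncertainty relation, where the coupling constant and the candidate functional forms are assumptions for illustration rather than the paper's derived result:

```latex
\Delta x \,\Delta p \;\ge\; \frac{\hbar}{2}\Bigl[\,1 + \alpha\, f(\mathcal{C})\,\Bigr],
\qquad
f(\mathcal{C}) \in \Bigl\{\, \mathcal{C}^{\,k},\;\; e^{\beta \mathcal{C}} - 1,\;\; \ln\!\bigl(1 + \mathcal{C}\bigr) \,\Bigr\},
```

where $\mathcal{C}$ denotes the complexity of the measured state and $\alpha, \beta, k > 0$ are small parameters. In the limit $\alpha \to 0$, or for states of vanishing complexity, the standard HUP is recovered, which is the consistency requirement the abstract refers to.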
Methane generation in landfills and its inadequate management represent the major avoidable source of anthropogenic methane today. This paper models methane production and the potential resources expected (electrical energy production and potential carbon credits from avoided CH4 emissions) from its proper management in a municipal solid waste landfill located in Ouagadougou, Burkina Faso. The modeling was carried out using two first-order decay (FOD) models (LandGEM V3.02 and SWANA), with parameters evaluated on the basis of the characteristics of the waste admitted to the landfill and weather data for the site. In parallel, production data have been collected since 2016 in order to compare them with the model results. For the simulation of methane production, the SWANA model showed better consistency with the experimental data, with a coefficient of determination (R²) of 0.59, compared with the LandGEM V3.02 model, which obtained a coefficient of 0.006. Thus, despite the low correlation values linked to the poor consistency of the experimental data, the SWANA model reproduces methane production much better than the LandGEM V3.02 model. The poor consistency of the experimental data explains these low coefficients, which can be improved in the future through ongoing in situ measurements. According to the SWANA model prediction, over 27 years of operation a biogas plant with 33% electrical efficiency using biogas from the Polesgo landfill would avoid 1,340 GgCO2e. Also, the evaluation of revenues from electricity and carbon credits gave a total revenue derived from methane production of US$27.38 million at a cost of US$10.5/tonne CO2e.
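The first-order decay idea behind both models can be sketched as follows: each yearly waste cohort generates methane at a rate proportional to its remaining potential, decaying exponentially with age. This is a simplified annual-increment version (LandGEM itself steps in 0.1-year increments), and the decay rate k, potential L0, and waste masses are illustrative assumptions:

```python
import numpy as np

def fod_methane(masses, k=0.05, L0=100.0):
    """First-order-decay methane generation.
    masses[i] = waste placed in year i (Mg); k = decay rate (1/yr);
    L0 = methane generation potential (m^3 CH4 per Mg waste).
    Returns annual CH4 generation (m^3/yr)."""
    years = len(masses)
    q = np.zeros(years)
    for n in range(years):
        for i in range(n + 1):        # every cohort placed up to year n
            age = n - i
            q[n] += k * L0 * masses[i] * np.exp(-k * age)
    return q

# Illustrative filling history: 1000 Mg/yr for 10 years
q = fod_methane([1000.0] * 10)
```

While the landfill is still being filled, each year adds a fresh cohort on top of the decaying older ones, so annual generation rises; after closure it would decay exponentially.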
Nowadays, collaborative writing has gained much attention from scholars, and task complexity is a crucial factor that influences second language (L2) writing. However, little research has explored how task complexity affects the quality of L2 collaborative writing. This study investigates the impact of task complexity on the syntactic complexity, lexical complexity, and accuracy of second language collaborative writing. English learners (N=50) at a Chinese university were required to complete two writing tasks collaboratively: a simple task and a complex task. Through analyzing their compositions, we found that task complexity has a significant impact on syntactic complexity, and that high-complexity writing tasks help increase the syntactic complexity of second language collaborative writing. However, task complexity has little impact on lexical complexity and accuracy. The accuracy of writing tasks is largely influenced by the task requirements. The results may enhance the understanding of collaborative writing and task complexity and provide valuable guidance for second language teaching.
Elementary information theory is used to model cybersecurity complexity, where the model assumes that security risk management is a binomial stochastic process. Complexity is shown to increase exponentially with the number of vulnerabilities in combination with security risk management entropy. However, vulnerabilities can be either local or non-local, where the former are confined to networked elements and the latter result from interactions between elements. Furthermore, interactions involve multiple methods of communication, where each method can contain vulnerabilities specific to that method. Importantly, the number of possible interactions scales quadratically with the number of elements in standard network topologies. Minimizing these interactions can significantly reduce the number of vulnerabilities and the accompanying complexity. Two network configurations that yield sub-quadratic and linear scaling relations are presented.
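The two scaling claims above are easy to make concrete: a full mesh has n(n-1)/2 potential pairwise interactions (quadratic), while a hub-and-spoke topology has only n-1 (linear). The sketch below pairs this with a binomial entropy term; treating each vulnerability as an independent binary mitigation outcome is a simplifying assumption for illustration, not necessarily the paper's exact model:

```python
import math

def mesh_links(n):
    """Full mesh: every element pair can interact -> quadratic scaling."""
    return n * (n - 1) // 2

def star_links(n):
    """Hub-and-spoke: each element talks only to a hub -> linear scaling."""
    return n - 1

def risk_entropy(p, n_vuln):
    """Total binomial entropy (bits) of risk-management outcomes, assuming
    n_vuln independent vulnerabilities each mitigated with probability p."""
    h = -(p * math.log2(p) + (1 - p) * math.log2(1 - p))
    return n_vuln * h
```

For 100 elements the mesh admits 4,950 interaction channels versus 99 for the star, so topology alone changes the vulnerability surface by a factor of 50.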
Continuous-flow microchannels are widely employed for synthesizing various materials, including nanoparticles, polymers, and metal-organic frameworks (MOFs), to name a few. Microsystem technology allows precise control over reaction parameters, resulting in purer, more uniform, and structurally stable products due to more effective mass transfer manipulation. However, continuous-flow synthesis processes may be accompanied by the emergence of spatial convective structures initiating convective flows. On the one hand, convection can accelerate reactions by intensifying mass transfer. On the other hand, it may lead to non-uniformity in the final product or defects, especially in MOF microcrystal synthesis. The ability to distinguish regions of convective and diffusive mass transfer may be the key to performing higher-quality reactions and obtaining purer products. In this study, we investigate, for the first time, the possibility of using the information complexity measure as a criterion for assessing the intensity of mass transfer in microchannels, considering both spatial and temporal non-uniformities of the liquids' distributions resulting from convection formation. We calculate the complexity using a shearlet transform based on a local approach. In contrast to existing methods for calculating complexity, the shearlet-transform-based approach provides a more detailed representation of local heterogeneities. Our analysis involves experimental images illustrating the mixing process of two non-reactive liquids in a Y-type continuous-flow microchannel under conditions of double-diffusive convection formation. The obtained complexity fields characterize the mixing process and structure formation, revealing variations in mass transfer intensity along the microchannel. We compare the results with cases of liquid mixing via a pure diffusive mechanism. The analysis revealed that the complexity measure is sensitive to variations in the type of mass transfer, establishing its feasibility as an indirect criterion for assessing mass transfer intensity. The method presented can extend beyond flow analysis, finding application in the control of the microstructures of various materials (porosity, for instance) or of surface defects in metals, optical systems, and other materials of significant relevance in materials science and engineering.
Task-based Language Teaching (TBLT) research has provided ample evidence that cognitive complexity is an important aspect of task design that influences learners' performance in terms of fluency, accuracy, and syntactic complexity. Despite the substantial number of empirical investigations into task complexity in journal articles, storyline complexity, one of its features, is scarcely investigated. Previous research has mainly focused on the impact of storyline complexity on learners' oral performance, while the impact on learners' written performance is less investigated. Thus, this study aims to investigate the effects of the narrative complexity of the storyline on senior high school students' written performance, as displayed by its complexity, fluency, and accuracy. The present study has important pedagogical implications: task design and assessment should distinguish between different types of narrative tasks, for example, tasks with a single or a dual storyline. Results on task complexity may help inform the pedagogical choices made by teachers when prioritizing work on a specific linguistic dimension.
Living objects have complex internal and external interactions. The complexity is regulated and controlled by homeostasis, which is the balance of multiple opposing influences. Environmental effects ultimately guide the self-organized structure. Living systems are open, dynamic structures performing random, stationary, stochastic, self-organizing processes. The self-organizing procedure is defined by the spatial-temporal fractal structure, which is self-similar both in space and time. The system's complexity appears in its energetics, which strives for the most efficient use of the available energies; for that, it organizes various well-connected networks. The controller of environmental relations is Darwinian selection on a long time scale. The energetics optimize the healthy processes, tuned to the highest efficacy and minimal loss (minimization of entropy production). The organism is built up by morphogenetic rules and develops various networks from the genetic level to the whole organism. The networks have intensive crosstalk and form a balance in the Nash equilibrium, which is the homeostatic state in healthy conditions. Homeostasis may be described as a Nash equilibrium, which ensures energy distribution in a "democratic" way regarding the functions of the parts in the complete system. Cancer radically changes the network system in the organism. Cancer is a network disease. Deviation from healthy networking appears at every level, from the genetic (molecular) level to cells, tissues, organs, and organisms. The strong proliferation of malignant tissue is the origin of most life-threatening processes. The weak side of cancer development is the change of complex information networking in the system, which makes it vulnerable to immune attacks. Cancer cells are masters of adaptation and evade immune surveillance. This hiding process can be broken by nonionizing electromagnetic radiation, for which the malignant structure has no adaptation strategy. Our objective is to review the different sides of living complexity and use this knowledge to fight against cancer.
The security of Federated Learning (FL) / Distributed Machine Learning (DML) is gravely threatened by data poisoning attacks, which destroy the usability of the model by contaminating training samples; such attacks are therefore called causative availability indiscriminate attacks. Facing the problem that existing data sanitization methods are hard to apply to real-time applications due to their tedious process and heavy computations, we propose a new supervised batch detection method for poison, which can quickly sanitize the training dataset before local model training. We design a training dataset generation method that helps to enhance accuracy and uses data complexity features to train a detection model, which is then used in an efficient batch hierarchical detection process. Our model stockpiles knowledge about poison, which can be expanded by retraining to adapt to new attacks. Being neither attack-specific nor scenario-specific, our method is applicable to FL/DML and to other online or offline scenarios.
The rhetorical structure of abstracts has been a widely discussed topic, as it can greatly enhance the abstract writing skills of second-language writers. This study aims to provide guidance on the syntactic features that L2 learners can employ, as well as suggest which features they should focus on in English academic writing. To achieve this, all samples were analyzed for rhetorical moves using Hyland's five-move model. Additionally, all sentences were evaluated for syntactic complexity, considering measures of global, clausal, and phrasal complexity. The findings reveal that expert writers exhibit a more balanced use of syntactic complexity across moves, effectively fulfilling the rhetorical objectives of abstracts. On the other hand, MA students tend to rely excessively on embedded structures and dependent clauses in an attempt to increase complexity. The implications of these findings for academic writing research, pedagogy, and assessment are thoroughly discussed.
Introduction: High-sensitivity cardiac troponin (hs-cTn) assays have higher analytical precision at lower concentrations for detecting myocardial injury. The change in troponin concentration between two assays conducted within a specified time interval is referred to as "delta troponin". This study aimed to assess the correlation between the complexity of coronary lesions and significant delta high-sensitivity troponin I levels in patients with non-ST-elevation myocardial infarction. Methods: This cross-sectional study was conducted in the Department of Cardiology, Ibrahim Cardiac Hospital & Research Institute, Dhaka, Bangladesh, from July 2022 to June 2023. A total of 70 patients with significant delta hs-cTnI were included and divided into two groups: Group A (n = 36) with a delta hs-cTnI rise between >20% and 49%, and Group B (n = 34) with a delta hs-cTnI rise ≥ 50%. Coronary angiography was performed and the SYNTAX score was calculated for both groups. Data were analyzed using SPSS version 25.0. Results: Patients with a high-rise delta cTnI (≥50%) showed a significantly higher proportion of lesions in the major coronary arteries LCx and LAD compared to those with a low rise of cTnI (20% - 49%) (p = 0.007 and 0.004, respectively). The presence of triple-vessel disease was higher in the former group than in the latter (p 22, compared to none in the low-rise group (p Conclusion: A high rise in delta hs-cTnI is linked to higher SYNTAX scores, signifying complex coronary lesions in NSTEMI patients, with a significant linear correlation between them. Patients with a high rise in delta cTnI may exhibit more significant coronary artery lesions and triple-vessel disease compared to those with a low rise in cTnI.
This study examines the role of the syntactic complexity of texts in the reading comprehension skills of students. Utilizing a qualitative method of research, this paper used structured interview questions as the main data-gathering instrument. English language teachers from Coral na Munti National High School were selected as the respondents of the study. Findings of the study suggest that the syntactic complexity of a text affects the reading comprehension of students. Students found it challenging to understand the message that the author conveyed if he or she used a large number of phrases and clauses in one sentence. Furthermore, the complex-sentence syntactic structure was deemed the most challenging for students to understand. To overcome these challenges in comprehending texts, teachers utilized various reading intervention programs. These interventions include focused or targeted instruction and the implementation of Project Dear, suggested by the Department of Education. These programs were proven to help students improve their comprehension as well as their knowledge of the syntactic structure of sentences. This study underscores the importance of selecting appropriate reading materials and implementing suitable reading intervention programs to enhance students' comprehension skills.
To expand the study of the structures and biological activities of anthracycline anticancer drugs and to reduce their toxic side effects, two new anthraquinone derivatives, 9-pyridylanthrahydrazone (9-PAH) and 9,10-bispyridylanthrahydrazone (9,10-PAH), were designed and synthesized. Utilizing 9-PAH and 9,10-PAH as promising anticancer ligands, their respective copper complexes, namely [Cu(L1)Cl₂]Cl (1) and {[Cu₄(μ₂-Cl)₃Cl₄(9,10-PAH)₂(DMSO)₂]Cl₂}ₙ (2), were subsequently synthesized, where the new ligand L1 is formed by the coupling of two 9-PAH ligands in the coordination reaction. The chemical and crystal structures of 1 and 2 were elucidated by IR, MS, elemental analysis, and single-crystal X-ray diffraction. Complex 1 forms a mononuclear structure: L1 coordinates with Cu through its three N atoms, together with two Cl atoms, to form a five-coordinate square-pyramidal geometry. Complex 2 constitutes a polymeric structure, wherein each structural unit centrosymmetrically encompasses two five-coordinate binuclear copper complexes (Cu1, Cu2) of 9,10-PAH, with similar square-pyramidal geometry. A chlorine atom (Cl2), located at the symmetry center, bridges Cu1 and Cu1A to connect the two binuclear copper structures. Meanwhile, the two five-coordinate Cu2 atoms symmetrically bridge the adjacent structural units via one coordinated Cl atom each, thus forming a 1D chain-like polymeric structure. In vitro anticancer activity assessments revealed that 1 and 2 show significant cytotoxicity, even higher than that of cisplatin. Specifically, the IC₅₀ values of 2 against the HeLa-229 and SK-OV-3 cancer cell lines were determined to be (5.92 ± 0.32) μmol·L⁻¹ and (6.48 ± 0.39) μmol·L⁻¹, respectively. Complex 2 could also block the proliferation of HeLa-229 cells in the S phase and significantly induce cell apoptosis. In addition, fluorescence quenching competition experiments suggested that 2 might interact with DNA via an intercalative binding mode, offering insights into its underlying anticancer mechanism. CCDC: 2388918, 1; 2388919, 2.
Abstract: This work proposes that quantum circuit complexity—the minimal number of elementary operations needed to implement a quantum transformation—be established as a legitimate physical observable. We prove that circuit complexity satisfies all requirements for physical observables, including self-adjointness, gauge invariance, and a consistent measurement theory with well-defined uncertainty relations. We develop complete protocols for measuring complexity in quantum systems and demonstrate its connections to gauge theory and quantum gravity. Our results suggest that computational requirements may constitute physical laws as fundamental as energy conservation. This framework grants insights into the relationship between quantum information, gravity, and the emergence of spacetime geometry while offering practical methods for experimental verification. Our results indicate that the physical universe may be governed by both energetic and computational constraints, with profound implications for our understanding of fundamental physics.
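The quantity at stake, minimal gate count, can be made concrete with a toy brute-force search: given a fixed elementary gate set, find the shortest gate sequence reproducing a target unitary up to a global phase. The gate set {H, T}, the phase-invariant distance, and the search bound below are illustrative assumptions, not the paper's measurement protocol:

```python
import numpy as np
from itertools import product

# Elementary gate set (an illustrative assumption): Hadamard and T.
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)
T = np.diag([1, np.exp(1j * np.pi / 4)])
GATES = {"H": H, "T": T}

def distance(U, V):
    """Phase-invariant distance: 0 exactly when U equals V up to a global phase."""
    return 1 - abs(np.trace(U.conj().T @ V)) / 2

def circuit_complexity(target, max_len=8, tol=1e-9):
    """Minimal number of gates from GATES whose product reproduces `target`."""
    for length in range(1, max_len + 1):
        for names in product(GATES, repeat=length):
            U = np.eye(2, dtype=complex)
            for g in names:          # apply gates left to right
                U = GATES[g] @ U
            if distance(U, target) < tol:
                return length, names
    return None

print(circuit_complexity(H @ T @ H))  # -> (3, ('H', 'T', 'H'))
```

The exhaustive search scales exponentially in circuit depth, which is precisely why treating complexity as a directly measurable observable, rather than something computed by enumeration, is attractive.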
Funding: Project supported by the National Natural Science Foundation of China (Grant Nos. 12275179, 11875042, and 12150410309) and the Natural Science Foundation of Shanghai (Grant No. 21ZR1443900).
Abstract: The suprachiasmatic nucleus in the hypothalamus is the master circadian clock in mammals, coordinating physiological processes with the 24-hour day–night cycle. Comprising various cell types, the suprachiasmatic nucleus (SCN) integrates environmental signals to maintain complex and robust circadian rhythms. Understanding the complexity and synchrony within SCN neurons is essential for effective circadian clock function. Synchrony involves coordinated neuronal firing for robust rhythms, while complexity reflects diverse activity patterns and interactions, indicating adaptability. Interestingly, the SCN retains circadian rhythms in vitro, demonstrating intrinsic rhythmicity. This study introduces the multiscale structural complexity method, based on Bagrov et al.'s approach, to analyze changes in SCN neuronal activity and complexity at macro and micro levels. By examining structural complexity and local complexities across scales, we aim to understand how tetrodotoxin (TTX), a neurotoxin that inhibits action potentials, affects SCN neurons. Our method captures critical scales in neuronal interactions that traditional methods may overlook. Validation with the Goodwin model confirms the reliability of our observations. By integrating experimental data with theoretical models, this study provides new insights into the effects of TTX on neuronal complexity, contributing to the understanding of circadian rhythms.
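The structural complexity measure of Bagrov et al. can be sketched for a 1-D signal as a sum, over successive coarse-graining scales, of dissimilarities between a pattern and its coarse-grained version. The block-averaging scheme, scale count, and overlap normalization below are illustrative assumptions, not the authors' exact pipeline:

```python
import numpy as np

def coarse_grain(x, factor=2):
    """Block-average x by `factor`, then upsample back to the original length."""
    n = len(x) - len(x) % factor
    blocks = x[:n].reshape(-1, factor).mean(axis=1)
    return np.repeat(blocks, factor)

def structural_complexity(x, n_scales=6):
    """Multiscale structural complexity of a 1-D signal (Bagrov et al. style):
    sum over scales of the dissimilarity between successive coarse-grainings."""
    patterns = [np.asarray(x, dtype=float)]
    for _ in range(n_scales):
        patterns.append(coarse_grain(patterns[-1]))
    C = 0.0
    for k in range(n_scales):
        a, b = patterns[k], patterns[k + 1]
        m = min(len(a), len(b))
        a, b = a[:m], b[:m]
        o_ab = np.dot(a, b) / m                      # cross-scale overlap
        C += abs(o_ab - 0.5 * (np.dot(a, a) / m + np.dot(b, b) / m))
    return C

rng = np.random.default_rng(0)
noise = rng.normal(size=1024)   # rich fine-scale structure
flat = np.ones(1024)            # no structure lost under coarse-graining
print(structural_complexity(noise) > structural_complexity(flat))  # True
```

A signal whose appearance changes between scales (such as noisy firing-rate traces) accumulates dissimilarity terms, while a scale-invariant signal contributes nothing; this is the sense in which the measure flags "critical scales" in neuronal activity.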
Abstract: The dynamic and interconnected nature of construction projects requires a comprehensive understanding of complexity during pre-construction. Traditional tools such as Gantt charts, CPM, and PERT often overlook uncertainties. This study identifies 20 complexity factors through expert interviews and literature, categorising them into six groups. The Analytical Hierarchy Process evaluated the significance of the factors, establishing their corresponding weights to enhance adaptive project scheduling. A system dynamics (SD) model is developed and tested to evaluate the dynamic behaviour of the identified complexity factors. The model simulates the impact of complexity on total project duration (TPD), revealing significant deviations from initial deterministic estimates. Data collection and analysis included reliability tests, such as normality checks and Cronbach's alpha, to validate the model's components and expert feedback. Sensitivity analysis confirmed a positive relationship between complexity and project duration, with higher complexity levels resulting in increased TPD. This relationship highlights the inadequacy of static planning approaches and underscores the importance of addressing complexity dynamically. The study provides a framework for enhancing planning systems through system dynamics and recommends expanding the model to ensure broader applicability in diverse construction projects.
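A minimal system-dynamics sketch of the reported complexity–duration relationship: productivity is eroded in proportion to a complexity level, stretching total project duration (TPD) beyond the deterministic estimate. The stock-and-flow structure and all parameter values are illustrative assumptions, not the paper's calibrated SD model:

```python
def simulate_tpd(base_duration=100.0, complexity=0.3, feedback=0.5,
                 dt=0.25, max_steps=4000):
    """Euler-integrated stock-and-flow toy: `work` is the stock of remaining
    work (in nominal crew-days); the depletion rate is eroded by complexity
    acting through a feedback gain, so higher complexity yields a longer TPD.
    All parameter values are illustrative assumptions."""
    work = base_duration
    t = 0.0
    for _ in range(max_steps):
        progress = work / base_duration                  # fraction still open
        rate = 1.0 - complexity * feedback * progress    # eroded productivity
        work -= max(rate, 0.05) * dt                     # floor avoids stalling
        t += dt
        if work <= 0:
            return t
    return t

for c in (0.0, 0.3, 0.6):
    print(c, round(simulate_tpd(complexity=c), 1))
```

With complexity 0.0 the simulated TPD equals the deterministic 100-day plan; raising complexity monotonically stretches it, mirroring the positive complexity–TPD relationship the sensitivity analysis reports.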
Funding: Supported by the National Natural Science Foundation of China (32370703), the CAMS Innovation Fund for Medical Sciences (CIFMS) (2022-I2M-1-021, 2021-I2M-1-061), and the Major Project of Guangzhou National Laboratory (GZNL2024A01015).
Abstract: Viral infectious diseases, characterized by their intricate nature and wide-ranging diversity, pose substantial challenges in the domain of data management. The vast volume of data generated by these diseases, spanning from the molecular mechanisms within cells to large-scale epidemiological patterns, has surpassed the capabilities of traditional analytical methods. In the era of artificial intelligence (AI) and big data, there is an urgent necessity for the optimization of these analytical methods to more effectively handle and utilize the information. Despite the rapid accumulation of data associated with viral infections, the lack of a comprehensive framework for integrating, selecting, and analyzing these datasets has left numerous researchers uncertain about which data to select, how to access them, and how to utilize them most effectively in their research. This review endeavors to fill these gaps by exploring the multifaceted nature of viral infectious diseases and summarizing relevant data across multiple levels, from the molecular details of pathogens to broad epidemiological trends. The scope extends from the micro-scale to the macro-scale, encompassing pathogens, hosts, and vectors. In addition to data summarization, this review thoroughly investigates various dataset sources. It also traces the historical evolution of data collection in the field of viral infectious diseases, highlighting the progress achieved over time. Simultaneously, it evaluates the current limitations that impede data utilization. Furthermore, we propose strategies to surmount these challenges, focusing on the development and application of advanced computational techniques, AI-driven models, and enhanced data integration practices. By providing a comprehensive synthesis of existing knowledge, this review is designed to guide future research and contribute to more informed approaches in the surveillance, prevention, and control of viral infectious diseases, particularly within the context of the expanding big-data landscape.
Abstract: Visual question answering (VQA) is a multimodal task, involving a deep understanding of the image scene and the question's meaning and capturing the relevant correlations between both modalities to infer the appropriate answer. In this paper, we propose a VQA system intended to answer yes/no questions about real-world images, in Arabic. To support a robust VQA system, we work in two directions: (1) using deep neural networks to semantically represent the given image and question in a fine-grained manner, namely ResNet-152 and Gated Recurrent Units (GRU); (2) studying the role of the utilized multimodal bilinear pooling fusion technique in the trade-off between the model complexity and the overall model performance. Some fusion techniques can significantly increase the model complexity, which seriously limits their applicability for VQA models. So far, there is no evidence of how efficient these multimodal bilinear pooling fusion techniques are for VQA systems dedicated to yes/no questions. Hence, a comparative analysis is conducted between eight bilinear pooling fusion techniques, in terms of their ability to reduce the model complexity and improve the model performance in this class of VQA systems. Experiments indicate that these multimodal bilinear pooling fusion techniques have improved the VQA model's performance, reaching a best performance of 89.25%. Further, experiments have shown that the number of answers in the developed VQA system is a critical factor that affects the effectiveness of these multimodal bilinear pooling techniques in achieving their main objective of reducing the model complexity. The Multimodal Local Perception Bilinear Pooling (MLPB) technique showed the best balance between the model complexity and its performance, for VQA systems designed to answer yes/no questions.
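A representative low-rank bilinear fusion step, in the spirit of the pooling techniques the paper compares (not necessarily MLPB itself): both modalities are projected into a joint space and fused with an element-wise (Hadamard) product. The dimensions and random initialization are assumptions for illustration:

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical dimensions: 2048-d image feature (e.g. a ResNet-152 pooling
# layer) and 512-d question embedding (e.g. a final GRU hidden state).
d_img, d_q, d_joint = 2048, 512, 256

W_img = rng.normal(0, 0.01, (d_img, d_joint))   # image projection
W_q = rng.normal(0, 0.01, (d_q, d_joint))       # question projection

def low_rank_bilinear_fusion(v_img, v_q):
    """Low-rank bilinear pooling: project both modalities to a joint space,
    then fuse with a Hadamard product. This approximates a full bilinear map
    while keeping parameter count at (d_img + d_q) * d_joint instead of the
    d_img * d_q * d_joint a full bilinear tensor would require."""
    return np.tanh(v_img @ W_img) * np.tanh(v_q @ W_q)

fused = low_rank_bilinear_fusion(rng.normal(size=d_img), rng.normal(size=d_q))
print(fused.shape)  # (256,)
```

The parameter-count comparison in the docstring is the "model complexity" lever the paper's trade-off study turns: a full bilinear tensor here would hold over 268 million parameters, versus about 655 thousand for the two projections.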
Funding: Supported by the National Natural Science Foundation of China (Grant Nos. 52109144, 52025094, and 52222905).
Abstract: This paper introduces a novel approach for parameter sensitivity evaluation and efficient slope reliability analysis based on the quantile-based first-order second-moment method (QFOSM). The core principles of the QFOSM are elucidated geometrically from the perspective of expanding ellipsoids. Based on this geometric interpretation, the QFOSM is further extended to estimate sensitivity indices and assess the significance of the various uncertain parameters involved in the slope system. The proposed method has the advantage of computational simplicity, akin to the conventional first-order second-moment method (FOSM), while providing estimation accuracy close to that of the first-order reliability method (FORM). Its performance is demonstrated with a numerical example and three slope examples. The results show that the proposed method can efficiently estimate the slope reliability and simultaneously evaluate the sensitivity of the uncertain parameters. The proposed method does not involve the complex optimization or iteration required by the FORM. It can provide a valuable complement to existing approximate reliability analysis methods, offering rapid sensitivity evaluation and slope reliability analysis.
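For context, the conventional FOSM baseline whose simplicity the paper cites can be sketched as follows: the reliability index is the mean of the performance function over its first-order standard deviation, with the gradient taken at the mean point. The infinite-slope performance function and all parameter values below are illustrative assumptions, not the paper's examples:

```python
import numpy as np

def fosm(g, mu, sigma, h=1e-6):
    """Conventional FOSM reliability index: beta = E[g] / std[g], with the
    gradient of the performance function g estimated by central differences
    at the mean point. Assumes independent random variables."""
    mu = np.asarray(mu, float)
    sigma = np.asarray(sigma, float)
    g0 = g(mu)
    grad = np.zeros_like(mu)
    for i in range(len(mu)):
        e = np.zeros_like(mu)
        e[i] = h
        grad[i] = (g(mu + e) - g(mu - e)) / (2 * h)
    return g0 / np.sqrt(np.sum((grad * sigma) ** 2))

# Toy infinite-slope stability problem (an illustrative assumption):
# g = tan(phi) / tan(psi) - 1, with failure when g < 0.
psi = np.deg2rad(30)                           # slope angle, fixed
g = lambda x: np.tan(np.deg2rad(x[0])) / np.tan(psi) - 1.0
beta = fosm(g, mu=[38.0], sigma=[3.0])         # friction angle: 38 deg +/- 3 deg
print(round(beta, 2))
```

FOSM needs only one mean-point evaluation plus two evaluations per random variable, which is the computational simplicity QFOSM preserves while improving accuracy toward FORM.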
Abstract: Investigating nature-inspired applications is a perennially appealing subject for scientists. The current increase in the speed of natural-origin structure growth may be linked to their superior mechanical properties and environmental resilience. Biological composite structures with helicoidal schemes and designs have remarkable capacities to absorb impact energy and withstand damage. However, there is a dearth of extensive study on the influence of fiber redirection and reorientation inside the matrix of a helicoid structure on its mechanical performance and reactivity. The present study aimed to explore the static and transient responses of a bio-inspired helicoid laminated composite (B-iHLC) shell under the influence of an explosive load using an isomorphic method. The structural integrity of the shell is maintained by a viscoelastic basis known as the Pasternak foundation, which encompasses two stiffness coefficients and one damping coefficient. The equilibrium equations governing shell dynamics are obtained by using Hamilton's principle and the modified first-order shear theory, thereby obviating the need for a shear correction factor. The model and approach are validated by numerical comparisons with respected publications. The findings of this study may be used in the construction of military and civilian infrastructure in situations where the structure is subjected to severe stresses that might result in catastrophic collapse. The findings also serve as the foundation for several further issues, including geometric optimization and the dynamic response of similar mechanical structures.
Abstract: Using Euler's first-order explicit (EE) method and the peridynamic differential operator (PDDO) to discretize the time and internal crystal-size derivatives, respectively, the Euler first-order explicit method–peridynamic differential operator (EE–PDDO) scheme was obtained for solving the one-dimensional population balance equation in crystallization. Four different conditions during crystallization were studied: size-independent growth, size-dependent growth in a batch process, nucleation and size-independent growth, and nucleation and size-dependent growth in a continuous process. The high accuracy of the EE–PDDO method was confirmed by comparing it with the numerical results obtained using the second-order upwind and HR-van methods. The method is characterized by non-oscillation and high accuracy, especially for discontinuous and sharp crystal size distributions. The stability of the EE–PDDO method, the choice of weight function in the PDDO method, and the optimal time step are also discussed.
Abstract: This work introduces a modification to the Heisenberg Uncertainty Principle (HUP) by incorporating quantum complexity, including potential nonlinear effects. Our theoretical framework extends the traditional HUP to consider the complexity of quantum states, offering a more nuanced understanding of measurement precision. By adding a complexity term to the uncertainty relation, we explore nonlinear modifications such as polynomial, exponential, and logarithmic functions. Rigorous mathematical derivations demonstrate the consistency of the modified principle with classical quantum mechanics and quantum information theory. We investigate the implications of this modified HUP for various aspects of quantum mechanics, including quantum metrology, quantum algorithms, quantum error correction, and quantum chaos. Additionally, we propose experimental protocols to test the validity of the modified HUP, evaluating their feasibility with current and near-term quantum technologies. This work highlights the importance of quantum complexity in quantum mechanics and provides a refined perspective on the interplay between complexity, entanglement, and uncertainty in quantum systems. The modified HUP has the potential to stimulate interdisciplinary research at the intersection of quantum physics, information theory, and complexity theory, with significant implications for the development of quantum technologies and the understanding of the quantum-to-classical transition.
Abstract: Methane generation in landfills and its inadequate management represent the major avoidable source of anthropogenic methane today. This paper models methane production and the potential resources expected (electrical energy production and potential carbon credits from avoided CH4 emissions) from its proper management in a municipal solid waste landfill located in Ouagadougou, Burkina Faso. The modeling was carried out using two first-order decay (FOD) models (LandGEM V3.02 and SWANA) with parameters evaluated on the basis of the characteristics of the waste admitted to the landfill and weather data for the site. In parallel, production data have been collected since 2016 for comparison with the model results. For the simulation of methane production, the SWANA model showed better consistency with the experimental data, with a coefficient of determination (R²) of 0.59, compared with the LandGEM V3.02 model, which obtained a coefficient of 0.006. Thus, despite the low correlation values linked to the poor consistency of the experimental data, the SWANA model models methane production much better than the LandGEM V3.02 model. The poor consistency of the experimental data accounts for these low coefficients, which can be improved in the future thanks to ongoing in situ measurements. According to the SWANA model prediction, over 27 years of operation a biogas plant with 33% electrical efficiency using biogas from the Polesgo landfill would avoid 1,340 GgCO2e. The evaluation of revenues from electricity and carbon credits gave a total revenue derived from methane production of US$27.38 million at a cost of US$10.5/tonne CO2e.
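A first-order-decay generation model of the LandGEM type can be sketched as follows; each year's waste deposit contributes methane that decays exponentially with its age. The annual time step, waste schedule, and parameter values (k, L0) are illustrative assumptions, not the Polesgo site's calibrated inputs:

```python
import numpy as np

def landgem_annual(masses, k=0.05, L0=100.0, years=30):
    """Simplified first-order-decay (LandGEM-style) methane generation.
    masses[i] = waste accepted in year i (Mg); k = decay rate (1/yr);
    L0 = methane generation potential (m^3 CH4 per Mg waste).
    Returns annual CH4 generation (m^3/yr). Annual time step only;
    LandGEM V3.02 itself sums over 0.1-year sub-increments."""
    q = np.zeros(years)
    for t in range(years):
        for i, m in enumerate(masses):
            age = t - i
            if age > 0:
                q[t] += k * L0 * m * np.exp(-k * age)
    return q

# Hypothetical landfill: 50,000 Mg/yr accepted during the first 10 years.
q = landgem_annual([50_000] * 10, k=0.05, L0=100.0, years=30)
print(int(q.argmax()))  # generation peaks the year after filling stops
```

The characteristic FOD shape, generation rising while waste is accepted, peaking just after closure, then decaying at rate k, is what both LandGEM and SWANA fit against the measured production data.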
Abstract: Nowadays, collaborative writing has gained much attention from scholars, and task complexity is a crucial factor that influences second language (L2) writing. However, little research has explored how task complexity affects the quality of L2 collaborative writing. This study investigates the impact of task complexity on the syntactic complexity, lexical complexity, and accuracy of second language collaborative writing. English learners (N=50) at a Chinese university were required to complete two writing tasks collaboratively: a simple task and a complex task. Through analyzing their compositions, we found that task complexity has a significant impact on syntactic complexity, and high-complexity writing tasks help increase the syntactic complexity of second language collaborative writing. However, task complexity has little impact on lexical complexity and accuracy. The accuracy of writing tasks is largely influenced by the task requirements. The research results may enhance the understanding of collaborative writing and task complexity and provide valuable guidance for second language teaching.
Abstract: Elementary information theory is used to model cybersecurity complexity, where the model assumes that security risk management is a binomial stochastic process. Complexity is shown to increase exponentially with the number of vulnerabilities in combination with security risk management entropy. However, vulnerabilities can be either local or non-local, where the former are confined to networked elements and the latter result from interactions between elements. Furthermore, interactions involve multiple methods of communication, where each method can contain vulnerabilities specific to that method. Importantly, the number of possible interactions scales quadratically with the number of elements in standard network topologies. Minimizing these interactions can significantly reduce the number of vulnerabilities and the accompanying complexity. Two network configurations that yield sub-quadratic and linear scaling relations are presented.
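The quadratic scaling of interactions noted above, and the linear alternatives, can be sketched directly; the specific topologies chosen here (full mesh versus hub-and-spoke versus ring) are illustrative assumptions, not necessarily the two configurations the paper presents:

```python
def interaction_count(n: int, topology: str = "mesh") -> int:
    """Count pairwise interaction channels among n networked elements."""
    if topology == "mesh":       # every element talks to every other element
        return n * (n - 1) // 2  # quadratic: O(n^2)
    if topology == "star":       # all traffic routed through a single hub
        return n - 1             # linear: O(n)
    if topology == "ring":       # each element talks only to two neighbors
        return n                 # linear: O(n)
    raise ValueError(f"unknown topology: {topology}")

# Quadratic vs. linear growth of potential interaction-borne vulnerabilities
for n in (10, 100, 1000):
    print(n, interaction_count(n, "mesh"), interaction_count(n, "star"))
```

At 1,000 elements a full mesh exposes 499,500 interaction channels versus 999 for a hub topology, which is why constraining interactions attacks the non-local vulnerability count, and hence the exponential complexity term, so effectively.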
Funding: Supported by the Ministry of Science and Higher Education of Russia (Theme No. 368121031700169-1 of ICMM UrB RAS).
Abstract: Continuous-flow microchannels are widely employed for synthesizing various materials, including nanoparticles, polymers, and metal-organic frameworks (MOFs), to name a few. Microsystem technology allows precise control over reaction parameters, resulting in purer, more uniform, and structurally stable products due to more effective mass transfer manipulation. However, continuous-flow synthesis processes may be accompanied by the emergence of spatial convective structures initiating convective flows. On the one hand, convection can accelerate reactions by intensifying mass transfer. On the other hand, it may lead to non-uniformity in the final product or defects, especially in MOF microcrystal synthesis. The ability to distinguish regions of convective and diffusive mass transfer may be the key to performing higher-quality reactions and obtaining purer products. In this study, we investigate, for the first time, the possibility of using an information complexity measure as a criterion for assessing the intensity of mass transfer in microchannels, considering both spatial and temporal non-uniformities of the liquid distributions resulting from convection formation. We calculate the complexity using a shearlet transform based on a local approach. In contrast to existing methods for calculating complexity, the shearlet-transform-based approach provides a more detailed representation of local heterogeneities. Our analysis involves experimental images illustrating the mixing process of two non-reactive liquids in a Y-type continuous-flow microchannel under conditions of double-diffusive convection formation. The obtained complexity fields characterize the mixing process and structure formation, revealing variations in mass transfer intensity along the microchannel. We compare the results with cases of liquid mixing via a purely diffusive mechanism. The analysis revealed that the complexity measure is sensitive to variations in the type of mass transfer, establishing its feasibility as an indirect criterion for assessing mass transfer intensity. The method presented can extend beyond flow analysis, finding application in the control of the microstructure of various materials (porosity, for instance) or of surface defects in metals, optical systems, and other materials that hold significant relevance in materials science and engineering.
Abstract: Task-based Language Teaching (TBLT) research has provided ample evidence that cognitive complexity is an important aspect of task design that influences learners' performance in terms of fluency, accuracy, and syntactic complexity. Despite the substantial number of empirical investigations into task complexity in journal articles, storyline complexity, one of its features, is scarcely investigated. Previous research mainly focused on the impact of storyline complexity on learners' oral performance; its impact on learners' written performance is less investigated. Thus, this study investigates the effects of the narrative complexity of the storyline on senior high school students' written performance, as displayed by its complexity, fluency, and accuracy. The present study has important pedagogical implications: task design and assessment should distinguish between different types of narrative tasks, for example, tasks with a single or a dual storyline. Results on task complexity may help inform the pedagogical choices made by teachers when prioritizing work with a specific linguistic dimension.
Abstract: Living objects have complex internal and external interactions. The complexity is regulated and controlled by homeostasis, which is the balance of multiple opposing influences. The environmental effects finally guide the self-organized structure. Living systems are open, dynamic structures performing random, stationary, stochastic, self-organizing processes. The self-organizing procedure is defined by the spatial-temporal fractal structure, which is self-similar both in space and time. The system's complexity appears in its energetics, which strives for the most efficient use of the available energies; for that, it organizes various well-connected networks. The controller of environmental relations is Darwinian selection on a long time scale. The energetics optimize the healthy processes, tuned to the highest efficacy and minimal loss (minimization of entropy production). The organism is built up by morphogenetic rules and develops various networks from the genetic level to the organism. The networks have intensive crosstalk and form a balance in the Nash equilibrium, which is the homeostatic state in healthy conditions. Homeostasis may be described as a Nash equilibrium, which ensures energy distribution in a "democratic" way regarding the functions of the parts in the complete system. Cancer radically changes the network system in the organism. Cancer is a network disease. Deviation from healthy networking appears at every level, from the genetic (molecular) level to cells, tissues, organs, and organisms. The strong proliferation of malignant tissue is the origin of most of the life-threatening processes. The weak side of cancer development is the change of complex information networking in the system, which leaves it vulnerable to immune attacks. Cancer cells are masters of adaptation and evade immune surveillance. This hiding process can be broken by nonionizing electromagnetic radiation, for which the malignant structure has no adaptation strategy. Our objective is to review the different sides of living complexity and to use that knowledge to fight against cancer.
Funding: Supported in part by the "Pioneer" and "Leading Goose" R&D Program of Zhejiang (Grant No. 2022C03174), the National Natural Science Foundation of China (No. 92067103), the Key Research and Development Program of Shaanxi, China (No. 2021ZDLGY06-02), the Natural Science Foundation of Shaanxi Province (No. 2019ZDLGY12-02), the Shaanxi Innovation Team Project (No. 2018TD-007), the Xi'an Science and Technology Innovation Plan (No. 201809168CX9JC10), the Fundamental Research Funds for the Central Universities (No. YJS2212), and the National 111 Program of China (B16037).
Abstract: The security of Federated Learning (FL) / Distributed Machine Learning (DML) is gravely threatened by data poisoning attacks, which destroy the usability of the model by contaminating training samples; such attacks are therefore called causative availability indiscriminate attacks. Facing the problem that existing data sanitization methods are hard to apply to real-time applications due to their tedious process and heavy computations, we propose a new supervised batch detection method for poison, which can quickly sanitize the training dataset before local model training. We design a training dataset generation method that helps to enhance accuracy and uses data complexity features to train a detection model, which is then used in an efficient batch hierarchical detection process. Our model stockpiles knowledge about poison, which can be expanded by retraining to adapt to new attacks. Being neither attack-specific nor scenario-specific, our method is applicable to FL/DML and to other online or offline scenarios.
Abstract: The rhetorical structure of abstracts has been a widely discussed topic, as understanding it can greatly enhance the abstract-writing skills of second-language writers. This study aims to provide guidance on the syntactic features that L2 learners can employ, as well as suggest which features they should focus on in English academic writing. To achieve this, all samples were analyzed for rhetorical moves using Hyland's five-move model. Additionally, all sentences were evaluated for syntactic complexity, considering measures such as global, clausal, and phrasal complexity. The findings reveal that expert writers exhibit a more balanced use of syntactic complexity across moves, effectively fulfilling the rhetorical objectives of abstracts. On the other hand, MA students tend to rely excessively on embedded structures and dependent clauses in an attempt to increase complexity. The implications of these findings for academic writing research, pedagogy, and assessment are thoroughly discussed.
Abstract: Introduction: The role of high-sensitivity cardiac troponin (hs-cTn) assays lies in their higher analytical precision at lower concentrations for detecting myocardial injury. The change in troponin concentration between two assays conducted within a specified time interval is referred to as "delta troponin". This study aimed to assess the correlation between the complexity of coronary lesions and significant delta high-sensitivity troponin I levels in patients with non-ST elevation myocardial infarction. Methods: This cross-sectional study was conducted in the Department of Cardiology, Ibrahim Cardiac Hospital & Research Institute, Dhaka, Bangladesh, from July 2022 to June 2023. A total of 70 patients with significant delta hs-cTnI were included and divided into two groups: Group A (n = 36) with a delta hs-cTnI rise between >20% and 49%, and Group B (n = 34) with a delta hs-cTnI rise ≥ 50%. Coronary angiography was performed and the SYNTAX score was calculated for both groups. Data were analyzed using SPSS version 25.0. Results: Patients with a high-rise delta cTnI (≥50%) showed a significantly higher proportion of lesions in the major coronary arteries LCx and LAD compared to those with a low rise of cTnI (20% - 49%) (p = 0.007 and 0.004, respectively). The presence of triple vessel disease was higher in the former group than in the latter (p 22, compared to none in the low-rise group (p Conclusion: A high rise in delta hs-cTnI is linked to higher SYNTAX scores, signifying complex coronary lesions in NSTEMI patients, with a significant linear correlation between them. Patients with a high rise in delta cTnI may exhibit more significant coronary artery lesions and triple vessel disease compared to those with a low rise in cTnI.