This work proposes that quantum circuit complexity—the minimal number of elementary operations needed to implement a quantum transformation—be established as a legitimate physical observable. We prove that circuit complexity satisfies all requirements for physical observables, including self-adjointness, gauge invariance, and a consistent measurement theory with well-defined uncertainty relations. We develop complete protocols for measuring complexity in quantum systems and demonstrate its connections to gauge theory and quantum gravity. Our results suggest that computational requirements may constitute physical laws as fundamental as energy conservation. This framework offers insights into the relationship between quantum information, gravity, and the emergence of spacetime geometry, while providing practical methods for experimental verification. Our results indicate that the physical universe may be governed by both energetic and computational constraints, with profound implications for our understanding of fundamental physics.
The dynamic and interconnected nature of construction projects requires a comprehensive understanding of complexity during pre-construction. Traditional tools such as Gantt charts, CPM, and PERT often overlook uncertainties. This study identifies 20 complexity factors through expert interviews and the literature, categorising them into six groups. The Analytical Hierarchy Process (AHP) was used to evaluate the significance of these factors and establish their weights to support adaptive project scheduling. A system dynamics (SD) model is developed and tested to evaluate the dynamic behaviour of the identified complexity factors. The model simulates the impact of complexity on total project duration (TPD), revealing significant deviations from initial deterministic estimates. Data collection and analysis included reliability tests, such as normality checks and Cronbach's alpha, to validate the model's components and the expert feedback. Sensitivity analysis confirmed a positive relationship between complexity and project duration, with higher complexity levels resulting in increased TPD. This relationship highlights the inadequacy of static planning approaches and underscores the importance of addressing complexity dynamically. The study provides a framework for enhancing planning systems through system dynamics and recommends expanding the model to ensure broader applicability across diverse construction projects.
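For the AHP step, the weighting of complexity factors can be illustrated with a minimal sketch. The factor names, the pairwise comparison values, and the 3x3 size below are hypothetical; only the procedure (principal eigenvector for weights, consistency ratio check) follows the standard AHP recipe, not the study's actual data.

```python
import numpy as np

# Hypothetical pairwise comparison matrix (Saaty 1-9 scale) for three
# illustrative complexity factors; the study's 20 factors are not reproduced.
A = np.array([
    [1.0, 3.0, 5.0],   # design uncertainty vs. the others
    [1/3, 1.0, 2.0],   # stakeholder interfaces
    [1/5, 1/2, 1.0],   # site conditions
])

# Factor weights = normalised principal eigenvector (standard AHP).
eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)
w = np.abs(eigvecs[:, k].real)
weights = w / w.sum()

# Consistency check: CI = (lambda_max - n) / (n - 1), CR = CI / RI.
n = A.shape[0]
lambda_max = eigvals[k].real
CI = (lambda_max - n) / (n - 1)
RI = 0.58                      # Saaty's random index for n = 3
CR = CI / RI

print("weights:", np.round(weights, 3))
print("consistency ratio:", round(CR, 3))   # CR < 0.1 is usually acceptable
```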
Nowadays, collaborative writing has gained much attention from scholars, and task complexity is a crucial factor that influences second language (L2) writing. However, little research has explored how task complexity affects the quality of L2 collaborative writing. This study investigates the impact of task complexity on the syntactic complexity, lexical complexity, and accuracy of second language collaborative writing. English learners (N = 50) at a Chinese university were required to complete two writing tasks collaboratively: a simple task and a complex task. Analysis of their compositions showed that task complexity has a significant impact on syntactic complexity, and that highly complex writing tasks help increase the syntactic complexity of second language collaborative writing. However, task complexity has little impact on lexical complexity and accuracy; the accuracy of writing tasks is largely influenced by the task requirements. These results may enhance the understanding of collaborative writing and task complexity and provide valuable guidance for second language teaching.
Elementary information theory is used to model cybersecurity complexity, where the model assumes that security risk management is a binomial stochastic process. Complexity is shown to increase exponentially with the number of vulnerabilities in combination with security risk management entropy. However, vulnerabilities can be either local or non-local, where the former are confined to networked elements and the latter result from interactions between elements. Furthermore, interactions involve multiple methods of communication, where each method can contain vulnerabilities specific to that method. Importantly, the number of possible interactions scales quadratically with the number of elements in standard network topologies. Minimizing these interactions can significantly reduce the number of vulnerabilities and the accompanying complexity. Two network configurations that yield sub-quadratic and linear scaling relations are presented.
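A minimal sketch of the scaling argument: pairwise interactions grow as n(n-1)/2 in a fully meshed topology, whereas a hub-and-spoke (star) configuration grows only linearly, and the entropy term is the standard binary entropy H(p). The parameter values are illustrative and not taken from the paper.

```python
import math

def pairwise_interactions(n: int) -> int:
    """Possible element-to-element interactions in a full mesh: n(n-1)/2."""
    return n * (n - 1) // 2

def hub_and_spoke_interactions(n: int) -> int:
    """A star topology reduces possible interactions to n-1 (linear scaling)."""
    return n - 1

def binary_entropy(p: float) -> float:
    """Entropy of a binomial risk-management decision with outcome probability p."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

# Illustrative numbers only: quadratic vs. linear growth of interaction count.
for n in (10, 100, 1000):
    print(n, pairwise_interactions(n), hub_and_spoke_interactions(n))
print("H(0.3) =", round(binary_entropy(0.3), 3))
```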
Task-based Language Teaching (TBLT) research has provided ample evidence that cognitive complexity is an important aspect of task design that influences learners' performance in terms of fluency, accuracy, and syntactic complexity. Despite the substantial number of empirical investigations into task complexity in journal articles, storyline complexity, one such feature, has scarcely been investigated. Previous research has mainly focused on the impact of storyline complexity on learners' oral performance, while its impact on learners' written performance is less investigated. Thus, this study investigates the effects of the narrative complexity of the storyline on senior high school students' written performance, as displayed by its complexity, fluency, and accuracy. The study has important pedagogical implications: task design and assessment should distinguish between different types of narrative tasks, for example, tasks with a single or a dual storyline. Results on task complexity may help inform the pedagogical choices made by teachers when prioritizing work on a specific linguistic dimension.
Aim To present a quantitative method for structural complexity analysis and evaluation of information systems. Methods Based on Petri net modeling and analysis techniques, and with the aid of mathematical tools from general net theory (GNT), a quantitative method for structure description and analysis of information systems was introduced. Results The structural complexity index and two related factors, i.e. the element complexity factor and the connection complexity factor, were defined, and the relations between them and the parameters of the Petri-net-based model of the system were derived. An application example is presented. Conclusion The proposed method provides a theoretical basis for quantitative analysis and evaluation of structural complexity and can be applied in the general planning and design processes of information systems.
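The abstract defines the index only in terms of the Petri-net model's parameters, without giving formulas. As a rough illustration only, the sketch below derives an element factor from the place and transition counts and a connection factor from the arc count, then combines them; the formulas, field names, and weighting are assumptions, not the authors' definitions.

```python
from dataclasses import dataclass

@dataclass
class PetriNet:
    places: int       # |P|
    transitions: int  # |T|
    arcs: int         # |F|, directed arcs between places and transitions

def element_complexity(net: PetriNet) -> int:
    # Illustrative: total number of net elements.
    return net.places + net.transitions

def connection_complexity(net: PetriNet) -> float:
    # Illustrative: arcs normalised by the maximum possible place-transition links.
    max_arcs = 2 * net.places * net.transitions
    return net.arcs / max_arcs if max_arcs else 0.0

def structural_complexity(net: PetriNet, alpha: float = 0.5) -> float:
    # Hypothetical combination: element count scaled by interconnection density.
    return element_complexity(net) * (alpha + (1 - alpha) * connection_complexity(net))

net = PetriNet(places=6, transitions=5, arcs=14)
print(round(structural_complexity(net), 3))
```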
The security of Federated Learning (FL) / Distributed Machine Learning (DML) is gravely threatened by data poisoning attacks, which destroy the usability of the model by contaminating training samples; such attacks are therefore called causative availability indiscriminate attacks. Because existing data sanitization methods are hard to apply to real-time applications due to their tedious processes and heavy computation, we propose a new supervised batch detection method for poisoning that can quickly sanitize the training dataset before local model training. We design a training dataset generation method that helps to enhance accuracy and uses data complexity features to train a detection model, which is then used in an efficient batch hierarchical detection process. Our model accumulates knowledge about poisoning, which can be expanded by retraining to adapt to new attacks. Being neither attack-specific nor scenario-specific, our method is applicable to FL/DML as well as other online or offline scenarios.
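The pipeline is described only at a high level, so the snippet below is a hedged stand-in: it trains a generic classifier on labelled clean/poisoned samples described by simple per-sample "complexity" features and then screens an incoming batch before local training. The feature choices, the toy data, and the scikit-learn RandomForestClassifier are assumptions, not the authors' method.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

def complexity_features(batch: np.ndarray) -> np.ndarray:
    """Stand-in per-sample features: norm, feature variance, distance to batch mean."""
    mean = batch.mean(axis=0)
    return np.column_stack([
        np.linalg.norm(batch, axis=1),
        batch.var(axis=1),
        np.linalg.norm(batch - mean, axis=1),
    ])

rng = np.random.default_rng(0)
clean = rng.normal(0, 1, size=(500, 20))
poison = rng.normal(3, 2, size=(100, 20))        # toy poisoned samples
X = complexity_features(np.vstack([clean, poison]))
y = np.array([0] * len(clean) + [1] * len(poison))

detector = RandomForestClassifier(n_estimators=100, random_state=0).fit(X, y)

def sanitize(batch: np.ndarray) -> np.ndarray:
    """Drop samples flagged as poisoned before local model training."""
    keep = detector.predict(complexity_features(batch)) == 0
    return batch[keep]

incoming = np.vstack([rng.normal(0, 1, size=(50, 20)), rng.normal(3, 2, size=(10, 20))])
print("kept", len(sanitize(incoming)), "of", len(incoming), "samples")
```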
Continuous-flow microchannels are widely employed for synthesizing various materials, including nanoparticles, polymers, and metal-organic frameworks (MOFs), to name a few. Microsystem technology allows precise control over reaction parameters, resulting in purer, more uniform, and structurally stable products due to more effective manipulation of mass transfer. However, continuous-flow synthesis processes may be accompanied by the emergence of spatial convective structures initiating convective flows. On the one hand, convection can accelerate reactions by intensifying mass transfer. On the other hand, it may lead to non-uniformity in the final product or to defects, especially in MOF microcrystal synthesis. The ability to distinguish regions of convective and diffusive mass transfer may be the key to performing higher-quality reactions and obtaining purer products. In this study, we investigate, for the first time, the possibility of using an information complexity measure as a criterion for assessing the intensity of mass transfer in microchannels, considering both spatial and temporal non-uniformities of the liquids' distributions resulting from convection formation. We calculate the complexity using a shearlet transform based on a local approach. In contrast to existing methods for calculating complexity, the shearlet-transform-based approach provides a more detailed representation of local heterogeneities. Our analysis involves experimental images illustrating the mixing of two non-reactive liquids in a Y-type continuous-flow microchannel under conditions of double-diffusive convection formation. The obtained complexity fields characterize the mixing process and structure formation, revealing variations in mass transfer intensity along the microchannel. We compare the results with cases of liquid mixing via a purely diffusive mechanism. The analysis revealed that the complexity measure is sensitive to variations in the type of mass transfer, establishing its feasibility as an indirect criterion for assessing mass transfer intensity. The method presented can extend beyond flow analysis, finding application in the control of microstructures of various materials (porosity, for instance) or of surface defects in metals, optical systems, and other materials of significant relevance in materials science and engineering.
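The study's measure is built on a shearlet transform; as a much simpler stand-in that conveys the idea of a spatially resolved complexity field, the sketch below computes local Shannon entropy over non-overlapping windows of a grayscale mixing image. This replaces the shearlet-based measure with a generic local-entropy one and is illustrative only; the toy image is synthetic.

```python
import numpy as np

def local_entropy_field(image: np.ndarray, window: int = 16, bins: int = 32) -> np.ndarray:
    """Shannon entropy of intensity histograms in non-overlapping windows.

    A crude substitute for the shearlet-based local complexity measure:
    higher entropy indicates stronger local heterogeneity of the
    concentration field (e.g. convective structures vs. pure diffusion).
    """
    h, w = image.shape
    rows, cols = h // window, w // window
    field = np.zeros((rows, cols))
    for i in range(rows):
        for j in range(cols):
            patch = image[i*window:(i+1)*window, j*window:(j+1)*window]
            hist, _ = np.histogram(patch, bins=bins, range=(0.0, 1.0))
            p = hist / hist.sum()
            p = p[p > 0]
            field[i, j] = -(p * np.log2(p)).sum()
    return field

# Toy image: smooth (diffusion-like) left half vs. structured (convection-like) right half.
x = np.linspace(0, 1, 256)
img = np.tile(x, (256, 1))
img[:, 128:] += 0.2 * np.sin(np.linspace(0, 40, 128)) * np.random.default_rng(1).random((256, 128))
img = np.clip(img, 0, 1)
print(local_entropy_field(img).round(2))
```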
This work introduces a modification to the Heisenberg Uncertainty Principle (HUP) by incorporating quantum complexity, including potential nonlinear effects. Our theoretical framework extends the traditional HUP to account for the complexity of quantum states, offering a more nuanced understanding of measurement precision. By adding a complexity term to the uncertainty relation, we explore nonlinear modifications such as polynomial, exponential, and logarithmic functions. Rigorous mathematical derivations demonstrate the consistency of the modified principle with standard quantum mechanics and quantum information theory. We investigate the implications of this modified HUP for various aspects of quantum mechanics, including quantum metrology, quantum algorithms, quantum error correction, and quantum chaos. Additionally, we propose experimental protocols to test the validity of the modified HUP, evaluating their feasibility with current and near-term quantum technologies. This work highlights the importance of quantum complexity in quantum mechanics and provides a refined perspective on the interplay between complexity, entanglement, and uncertainty in quantum systems. The modified HUP has the potential to stimulate interdisciplinary research at the intersection of quantum physics, information theory, and complexity theory, with significant implications for the development of quantum technologies and the understanding of the quantum-to-classical transition.
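The abstract does not reproduce the modified relation itself. A plausible form consistent with the description (a complexity-dependent correction added to the standard bound, with polynomial, exponential, and logarithmic variants) is sketched below; C denotes the state's complexity, and the parameters α, β, n and the exact functional forms are illustrative assumptions, not the paper's definitions. Each variant reduces to the standard bound as f(C) → 0, matching the stated consistency with ordinary quantum mechanics.

```latex
% Hypothetical complexity-corrected uncertainty relation (illustrative only):
\Delta x \,\Delta p \;\ge\; \frac{\hbar}{2}\,\bigl[\,1 + f(\mathcal{C})\,\bigr],
\qquad
f_{\mathrm{poly}}(\mathcal{C}) = \alpha\,\mathcal{C}^{\,n},\quad
f_{\mathrm{exp}}(\mathcal{C})  = \alpha\,\bigl(e^{\beta\mathcal{C}} - 1\bigr),\quad
f_{\mathrm{log}}(\mathcal{C})  = \alpha\,\ln\!\bigl(1 + \mathcal{C}\bigr).
```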
Living objects have complex internal and external interactions. This complexity is regulated and controlled by homeostasis, the balance of multiple opposing influences. Environmental effects ultimately guide the self-organized structure. Living systems are open, dynamic structures performing random, stationary, stochastic, self-organizing processes. The self-organizing process is defined by a spatial-temporal fractal structure that is self-similar in both space and time. The system's complexity appears in its energetics, which strives for the most efficient use of the available energies; to that end, it organizes various well-connected networks. The controller of environmental relations is Darwinian selection on a long time scale. The energetics optimize healthy processes, tuned for the highest efficacy and minimal loss (minimization of entropy production). The organism is built up by morphogenetic rules and develops various networks from the genetic level up to the whole organism. These networks have intensive crosstalk and form a balance in a Nash equilibrium, which is the homeostatic state under healthy conditions. Homeostasis may be described as a Nash equilibrium that ensures energy distribution in a "democratic" way with regard to the functions of the parts within the complete system. Cancer radically changes the network system in the organism: cancer is a network disease. Deviations from healthy networking appear at every level, from the genetic (molecular) level to cells, tissues, organs, and the organism. The strong proliferation of malignant tissue is the origin of most life-threatening processes. The weak side of cancer development is the change of complex information networking in the system, which makes it vulnerable to immune attacks. Cancer cells are masters of adaptation and evade immune surveillance. This hiding process can be broken by nonionizing electromagnetic radiation, for which the malignant structure has no adaptation strategy. Our objective is to review the different sides of living complexity and to use this knowledge to fight cancer.
This study examines the role of the syntactic complexity of texts in the reading comprehension skills of students. Using a qualitative research method, the paper employed structured interview questions as the main data-gathering instrument. English language teachers from Coral na Munti National High School were selected as the respondents of the study. The findings suggest that the syntactic complexity of a text affects students' reading comprehension: students found it challenging to understand the message the author conveyed when a sentence contained a large number of phrases and clauses. Furthermore, the complex-sentence syntactic structure was deemed the most challenging for students to understand. To overcome these challenges in comprehending texts, teachers used various reading intervention programs, including focused or targeted instruction and the implementation of Project Dear, suggested by the Department of Education. These programs were shown to help students improve their comprehension as well as their knowledge of the syntactic structure of sentences. This study underscores the importance of selecting appropriate reading materials and implementing suitable reading intervention programs to enhance students' comprehension skills.
Introduction: High-sensitivity cardiac troponin (hs-cTn) assays have higher analytical precision at lower concentrations for detecting myocardial injury. The change in troponin concentration between two assays conducted within a specified time interval is referred to as the "delta troponin". This study aimed to assess the correlation between the complexity of coronary lesions and significant delta high-sensitivity troponin I levels in patients with non-ST elevation myocardial infarction. Methods: This cross-sectional study was conducted in the Department of Cardiology, Ibrahim Cardiac Hospital & Research Institute, Dhaka, Bangladesh, from July 2022 to June 2023. A total of 70 patients with a significant delta hs-cTnI were included and divided into two groups: Group A (n = 36) with a delta hs-cTnI rise between >20% and 49%, and Group B (n = 34) with a delta hs-cTnI rise ≥50%. Coronary angiography was performed and the SYNTAX score was calculated for both groups. Data were analyzed using SPSS version 25.0. Results: Patients with a high-rise delta cTnI (≥50%) showed a significantly higher proportion of lesions in the major coronary arteries LCx and LAD compared to those with a low rise in cTnI (20%-49%) (p = 0.007 and 0.004, respectively). The presence of triple vessel disease was higher in the former group than in the latter, and SYNTAX scores above 22 were observed only in the high-rise group, with none in the low-rise group. Conclusion: A high rise in delta hs-cTnI is linked to higher SYNTAX scores, signifying complex coronary lesions in NSTEMI patients, with a significant linear correlation between them. Patients with a high rise in delta cTnI may exhibit more significant coronary artery lesions and triple vessel disease compared to those with a low rise in cTnI.
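The grouping rests on a simple percentage change between two serial hs-cTnI measurements. The helper below shows that arithmetic and the study's 20%-49% versus ≥50% grouping; the concentration values are illustrative, not patient data.

```python
def delta_troponin_percent(first_ng_l: float, second_ng_l: float) -> float:
    """Relative change between two serial hs-cTnI assays, in percent."""
    return (second_ng_l - first_ng_l) / first_ng_l * 100.0

def study_group(delta_percent: float) -> str:
    """Grouping used in the study: Group A (>20% to 49%), Group B (>=50%)."""
    if delta_percent >= 50:
        return "Group B (high rise)"
    if delta_percent > 20:
        return "Group A (low rise)"
    return "not a significant delta"

# Illustrative values only.
print(study_group(delta_troponin_percent(40.0, 52.0)))   # 30% rise  -> Group A
print(study_group(delta_troponin_percent(40.0, 90.0)))   # 125% rise -> Group B
```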
The aim of this paper is to explore the main paradigms and methodology of social research, framing them in their historical path and highlighting their epistemological foundations. It moves from reflection on research methodology as a 'discourse of method' to focus on the paradigmatic dimension of the social sciences, following Kuhn's meaning, for which a paradigm indicates a shared and recognized theoretical perspective within the scientific community. The paper highlights the role of paradigms in shaping theoretical and empirical inquiry. It further examines the positivist and neo-positivist paradigms, which emphasize observation and empirical verifiability, quantification, the formulation of laws, and cause-and-effect relationships, arguing for the uniqueness of the scientific method. Lazarsfeld brought to the social sciences the language of 'variables', borrowed from mathematics and statistics. The distinction introduced by Windelband between 'nomothetic' and 'idiographic' sciences is followed by Weber's elaboration of the concept of 'Verstehen', which shifts the focus to the understanding of social reality through the meanings that individuals attribute to their actions. The interpretive paradigm paves the way for qualitative research methods. Finally, the paper delves into the complexity paradigm, which challenges the reductionist and deterministic models of classical science and outlines an epistemological shift in the key notions of science, introducing concepts such as 'emergence', 'auto-eco-organization', and 'recursive processes'. The complexity of social reality calls for a rethinking of sociological methods, favoring multidimensional and event-based analysis over statistical regularities, and privileging observation, intervention, and the 'in vivo method' at the level of empirical research. Complexity pushes sociology to redefine itself along with its object, traditionally understood as 'society'.
The rhetorical structure of abstracts has been a widely discussed topic, as it can greatly enhance the abstract-writing skills of second-language writers. This study aims to provide guidance on the syntactic features that L2 learners can employ, as well as to suggest which features they should focus on in English academic writing. To achieve this, all samples were analyzed for rhetorical moves using Hyland's five-move model. Additionally, all sentences were evaluated for syntactic complexity, considering measures of global, clausal, and phrasal complexity. The findings reveal that expert writers exhibit a more balanced use of syntactic complexity across moves, effectively fulfilling the rhetorical objectives of abstracts. MA students, on the other hand, tend to rely excessively on embedded structures and dependent clauses in an attempt to increase complexity. The implications of these findings for academic writing research, pedagogy, and assessment are thoroughly discussed.
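Measures of the kind referred to here (global, clausal, phrasal) are typically ratios such as mean length of sentence, dependent clauses per clause, or complex nominals per clause, often obtained with an analyzer such as Lu's L2SCA. The toy calculation below is a hedged illustration using pre-counted units; the counts and the specific ratio set are assumptions, not the study's instrument.

```python
from dataclasses import dataclass

@dataclass
class MoveCounts:
    words: int
    sentences: int
    clauses: int
    dependent_clauses: int
    complex_nominals: int

def complexity_ratios(c: MoveCounts) -> dict:
    return {
        "mean_length_of_sentence": c.words / c.sentences,                 # global
        "dependent_clauses_per_clause": c.dependent_clauses / c.clauses,  # clausal
        "complex_nominals_per_clause": c.complex_nominals / c.clauses,    # phrasal
    }

# Hypothetical counts for one rhetorical move (e.g. a Purpose move).
print(complexity_ratios(MoveCounts(words=62, sentences=2, clauses=5,
                                   dependent_clauses=2, complex_nominals=7)))
```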
For many continuous biomedical signals with both strong nonlinearity and non-stationarity, two criteria were proposed for their complexity estimation: (1) only a short data set should be needed for robust estimation; (2) no over-coarse graining preprocessing, such as transforming the original signal into a binary time series, should be required. The C0 complexity measure we proposed previously is one such measure. However, it lacks a solid mathematical foundation and thus its use has been limited. A modified version of this measure is proposed, and some of its important properties are proved rigorously. According to these properties, the measure can be considered an index of the randomness of a time series in some sense, and thus also a quantitative index of complexity in the sense of randomness-finding complexity. Compared with other similar measures, this measure seems more suitable for estimating a large number of complexity values for a given task, such as studying the dynamic variation of the measure in sliding windows of a long process, owing to its fast estimation speed.
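The C0 measure is commonly computed in the frequency domain by splitting the signal into a "regular" part (spectral components whose power exceeds the mean power) and an "irregular" remainder, and taking the irregular fraction of the total energy. The sketch below follows that common recipe and should be read as an approximation of the family of measures discussed here, not the exact modified definition of the paper.

```python
import numpy as np

def c0_complexity(x: np.ndarray, r: float = 1.0) -> float:
    """Approximate C0 complexity of a 1-D signal.

    Spectral components whose power exceeds r times the mean power form the
    'regular' part; C0 is the energy fraction of the irregular remainder
    (close to 0 for a regular signal, larger for an irregular one).
    """
    x0 = x - x.mean()
    X = np.fft.fft(x0)
    power = np.abs(X) ** 2
    mask = power > r * power.mean()              # keep only dominant components
    regular = np.fft.ifft(np.where(mask, X, 0)).real
    residual = x0 - regular
    return float(np.sum(residual ** 2) / np.sum(x0 ** 2))

rng = np.random.default_rng(0)
t = np.linspace(0, 1, 1024, endpoint=False)
print(round(c0_complexity(np.sin(2 * np.pi * 5 * t)), 3))   # near 0: regular tone
print(round(c0_complexity(rng.standard_normal(1024)), 3))   # clearly larger: noise
```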
The complexity and applicability of three representative car-following models are investigated: the optimal velocity model (OVM), the generalized force model (GFM), and the full velocity difference model (FVDM). The vehicle trajectory data used were extracted from digital pictures taken from a 30-storey building near the I-80 freeway. Three different calibration methods are used to estimate the model parameters and to study the relationships between model complexity and applicability through overall, inter-driver, and intra-driver analyses. Results of the three methods for the OVM, GFM, and FVDM show that complexity and applicability are not consistent, and that complicated models are not always superior to simple ones in modeling car-following. The findings of this study can provide useful information for car-following behavior modeling.
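For reference, the three models share an optimal-velocity core. A minimal sketch of one FVDM update step is given below; setting lam = 0 recovers the OVM (the GFM differs in applying the velocity-difference term only when closing in). The optimal-velocity function and all parameter values are illustrative, not the calibrated values from this study.

```python
import numpy as np

def optimal_velocity(gap: float, v_max: float = 30.0, h_c: float = 25.0) -> float:
    """Illustrative optimal-velocity function V(gap) (Bando-type tanh form)."""
    return 0.5 * v_max * (np.tanh(0.1 * (gap - h_c)) + np.tanh(0.1 * h_c))

def fvdm_acceleration(gap: float, v: float, dv: float,
                      kappa: float = 0.6, lam: float = 0.5) -> float:
    """FVDM: a = kappa * (V(gap) - v) + lam * dv; lam = 0 recovers the OVM.

    gap : spacing to the leader [m], v : own speed [m/s],
    dv  : leader speed minus own speed [m/s].
    """
    return kappa * (optimal_velocity(gap) - v) + lam * dv

# One explicit-Euler update of the follower's speed (illustrative values).
dt, gap, v, dv = 0.1, 20.0, 12.0, -2.0
v_next = v + dt * fvdm_acceleration(gap, v, dv)
print(round(v_next, 3))
```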
Using the method of gradient pattern analysis, twenty plots were set at altitudes of 700-2600 m, at intervals of 100 m, on the northern slope of the Changbai Mountain. The dissimilarity of sub-plots within the same community was measured, and the complexity of plant communities at different altitudes was analyzed. The results from binary data of canopy tree species indicated that the sub-plots of all communities except the subalpine Betula ermanii forest showed comparatively high dissimilarity in species composition. In particular, the dissimilarity index (0.7) of the broadleaved/Korean pine forest at low altitudes was clearly higher than that of the other communities. The differences between communities belonging to dark coniferous forest are not obvious. Comparatively, the dissimilarity among sub-plots of the communities at an altitude of 1400 m was slightly higher than that of other communities, which reflects the complexity of the tree species composition of transitional communities. For the subalpine Betula ermanii forest, tree species composition was simple and showed high similarity between sub-plots. The results derived from binary data of shrubs showed that the dissimilarity index of shrub species in the broadleaved/Korean pine forest at low altitudes was higher than in other communities, but the divergence tendency was not as obvious as that of the tree species. The dissimilarity derived from binary data of herbs and of all plant species at different altitudes showed very similar tendencies, and the differences in herb and all plant species between sub-plots were greatest for the communities of the broadleaved/Korean pine forest and the alpine tundra zone.
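The abstract does not state which dissimilarity index was used; a common choice for binary presence/absence data is the Jaccard (or Sørensen) dissimilarity, sketched below with hypothetical sub-plot species lists. Only the general procedure, pairwise dissimilarity averaged within a community, is shown here.

```python
def jaccard_dissimilarity(a: set, b: set) -> float:
    """1 - |A ∩ B| / |A ∪ B| for binary presence/absence data."""
    union = a | b
    return (1.0 - len(a & b) / len(union)) if union else 0.0

def mean_pairwise_dissimilarity(subplots: list[set]) -> float:
    """Average dissimilarity over all sub-plot pairs within one community."""
    pairs = [(i, j) for i in range(len(subplots)) for j in range(i + 1, len(subplots))]
    return sum(jaccard_dissimilarity(subplots[i], subplots[j]) for i, j in pairs) / len(pairs)

# Hypothetical canopy-tree species lists for three sub-plots of one plot.
subplots = [
    {"Pinus koraiensis", "Tilia amurensis", "Fraxinus mandshurica"},
    {"Pinus koraiensis", "Quercus mongolica"},
    {"Pinus koraiensis", "Tilia amurensis", "Acer mono"},
]
print(round(mean_pairwise_dissimilarity(subplots), 2))
```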
Linear complexity and k-error linear complexity of a stream cipher are two important standards for measuring the randomness of keystreams. For 2^n-periodic binary sequences with linear complexity 2^n - 1 and k = 2, 3, the number of sequences with a given k-error linear complexity and the expected k-error linear complexity are provided. Moreover, the proportion of sequences whose k-error linear complexity is larger than the expected value is analyzed.
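For 2^n-periodic binary sequences, the linear complexity itself is computed with the standard Games-Chan algorithm, sketched below; the paper's k-error counting results build on such structure but go beyond this routine.

```python
def games_chan_linear_complexity(s: list[int]) -> int:
    """Linear complexity of a binary sequence with period 2^n (Games-Chan)."""
    assert (len(s) & (len(s) - 1)) == 0 and len(s) > 0, "period must be a power of two"
    c = 0
    while len(s) > 1:
        half = len(s) // 2
        left, right = s[:half], s[half:]
        if left == right:
            s = left                      # period halves, complexity unchanged
        else:
            c += half                     # complexity gains half the length
            s = [l ^ r for l, r in zip(left, right)]
    return c + s[0]

# Example: the all-ones period of length 8 has linear complexity 1,
# while a period with a single 1 attains the maximum complexity 2^n = 8.
print(games_chan_linear_complexity([1] * 8))                    # -> 1
print(games_chan_linear_complexity([1, 0, 0, 0, 0, 0, 0, 0]))   # -> 8
```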
A large unified hybrid network model with variable-speed growth (LUHNM-VSG) is proposed as the third model of the unified hybrid network theoretical framework (UHNTF). A hybrid growth ratio vg of the deterministic linking number to the random linking number and a variable-speed growth index a are introduced, and their main effects on the topological transition features of the LUHNM-VSG are revealed. For comparison with the other models, we construct a network complexity pyramid with seven levels, in which simplicity-universality increases while complexity-diversity decreases from the bottom level-1 to the top level-7. The transition relations between the levels depend on the matching of the four hybrid ratios (dr, fd, gr, vg); thus most network models can be investigated in a unified way via these four ratios. The LUHNM-VSG, as level-1 of the pyramid, comes much closer to describing real-world networks and has potential applications.
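The abstract describes growth driven by a ratio of deterministic to random link creation and a variable-speed growth index. The sketch below is only a schematic reading of that mechanism: the attachment rules, the step**a link-count law, and all parameter names are assumptions, not the model's actual definition.

```python
import random

def grow_hybrid_network(steps: int = 200, vg: float = 2.0, a: float = 0.5,
                        seed: int = 1) -> dict[int, set[int]]:
    """Schematic hybrid growth: at each step the number of new links grows
    roughly as step**a (variable-speed growth), and each link is made either
    deterministically (to the current highest-degree node) or at random,
    with odds vg : 1 (deterministic : random)."""
    rng = random.Random(seed)
    adj: dict[int, set[int]] = {0: {1}, 1: {0}}       # seed network
    p_det = vg / (vg + 1.0)
    for t in range(2, steps + 2):
        adj[t] = set()
        m = max(1, round(t ** a))                     # variable-speed link count
        for _ in range(m):
            if rng.random() < p_det:                  # deterministic attachment
                target = max((n for n in adj if n != t), key=lambda n: len(adj[n]))
            else:                                     # random attachment
                target = rng.choice([n for n in adj if n != t])
            adj[t].add(target)
            adj[target].add(t)
    return adj

net = grow_hybrid_network()
print("nodes:", len(net), "max degree:", max(len(v) for v in net.values()))
```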
Along with the rapid development of air traffic, the contradiction between conventional air traffic management (ATM) and increasingly complex air traffic situations has become more severe, which substantially reduces the operational efficiency of air transport systems. Thus, objectively measuring air traffic situation complexity has become a concern in the field of ATM. Most existing studies focus on air traffic complexity assessment, and rarely on the scientific guidance of complex traffic situations. Based on the projected time at which aircraft arrive at the target sector boundary, we formulated two control strategies to reduce air traffic complexity: entry time optimization, applied to controllable flights in the adjacent upstream sectors, and dynamic speed optimization, applied to flights in the target sector. In solving the complexity control models, we introduced a physical programming method and transformed the multi-objective optimization problem involving complexity and delay into single-objective problems by designing different preference functions. Actual data validated that the two complexity control strategies can eliminate the high-complexity situations observed in practice, and that the strategy based on entry time optimization is more efficient than that based on dynamic speed optimization. A basic framework for studying air traffic complexity management was thus preliminarily established. Our findings will support the implementation of complexity-based ATM.
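Physical programming replaces fixed objective weights with designer preference functions over each objective. As a simplified stand-in for that idea, the sketch below scalarizes a complexity objective and a delay objective with hypothetical piecewise-linear preference functions and picks the best candidate entry-time shift; every function, threshold, and number here is illustrative, not taken from the paper's models.

```python
def preference(value: float, desirable: float, tolerable: float) -> float:
    """Piecewise-linear preference: 0 when fully desirable, 1 at the tolerable
    limit, rising linearly in between and steeply beyond it (illustrative)."""
    if value <= desirable:
        return 0.0
    if value <= tolerable:
        return (value - desirable) / (tolerable - desirable)
    return 1.0 + 2.0 * (value - tolerable) / (tolerable - desirable)

def aggregate(complexity: float, delay_min: float) -> float:
    """Single objective replacing the (complexity, delay) pair."""
    return (preference(complexity, desirable=0.4, tolerable=0.7)
            + preference(delay_min, desirable=0.0, tolerable=10.0))

# Candidate entry-time shifts for an upstream flight and their predicted effect
# on sector complexity and on the flight's delay (hypothetical numbers).
candidates = {
    "shift 0 min": (0.85, 0.0),
    "shift 3 min": (0.65, 3.0),
    "shift 8 min": (0.45, 8.0),
}
best = min(candidates, key=lambda k: aggregate(*candidates[k]))
print(best)
```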