Detecting coupling patterns between elements in a complex system is a basic task in data-driven analysis. The trajectory of each element is the cooperative result of its intrinsic dynamics, its couplings with other elements, and the environment. It is consequently composed of many components, only some of which take part in the couplings. In this paper we present a framework to detect the component correlation pattern. First, the trajectories of interest are decomposed into components using methods such as the Fourier expansion and the wavelet transform. Second, the cross-correlations between the components are calculated, yielding a component cross-correlation matrix (network). Finally, the dominant structure in the network is identified to characterize the coupling pattern in the system. Several deterministic dynamical models turn out to be characterized by rich structures such as clustering of the components. The pattern of correlation between respiratory (RESP) and ECG signals is composed of five sub-clusters that are mainly formed by components of the ECG signal. Interestingly, only 7 components from RESP (scattered across four sub-clusters) take part in realizing the coupling between the two signals.
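The pipeline in this abstract (decompose each trajectory into components, then correlate every component pair) can be sketched in a few lines of NumPy. The band-splitting below is a simplified stand-in for the Fourier/wavelet decompositions the authors use, and all signals and names are illustrative:

```python
import numpy as np

def fourier_components(x, n_bands):
    """Split a 1-D signal into n_bands non-overlapping Fourier bands and
    return each band's time-domain reconstruction (one 'component' per band)."""
    X = np.fft.rfft(x)
    edges = np.linspace(0, len(X), n_bands + 1, dtype=int)
    comps = []
    for lo, hi in zip(edges[:-1], edges[1:]):
        Xk = np.zeros_like(X)
        Xk[lo:hi] = X[lo:hi]
        comps.append(np.fft.irfft(Xk, n=len(x)))
    return np.array(comps)

def component_correlation_matrix(sig_a, sig_b, n_bands=4):
    """Pearson cross-correlation matrix over all components of both signals."""
    comps = np.vstack([fourier_components(sig_a, n_bands),
                       fourier_components(sig_b, n_bands)])
    return np.corrcoef(comps)

# Two toy signals that share only a 5 Hz component (the "coupled" part).
rng = np.random.default_rng(0)
t = np.linspace(0.0, 1.0, 512, endpoint=False)
shared = np.sin(2 * np.pi * 5 * t)
sig_a = shared + np.sin(2 * np.pi * 100 * t) + 0.01 * rng.standard_normal(512)
sig_b = shared + np.sin(2 * np.pi * 200 * t) + 0.01 * rng.standard_normal(512)
C = component_correlation_matrix(sig_a, sig_b)
```

Rows 0–3 are the components of the first signal and rows 4–7 those of the second; the single large off-diagonal entry (row 0 vs. row 4) flags the only component pair that takes part in the coupling, mirroring how the RESP–ECG sub-clusters are read off the matrix.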
To address the low measurement accuracy caused by noise interference when acquiring low fluid flow rate signals with ultrasonic Doppler flow meters, a novel signal processing algorithm combining ensemble empirical mode decomposition (EEMD) with a cross-correlation algorithm was proposed. First, fast Fourier transform (FFT) spectrum analysis was used to determine the frequency range of the signal. Second, data were acquired at an appropriate sampling frequency, and the acquired Doppler flow rate signal was decomposed into a series of intrinsic mode functions (IMFs) by EEMD. These IMFs were then recombined based on their energy entropy, and the noise of the recombined Doppler flow rate signal was removed by cross-correlation filtering. Finally, an ideal ultrasonic Doppler flow rate signal was extracted. Simulation and experimental verification show that the proposed method effectively enhances the signal-to-noise ratio (SNR) and extends the lower measurement limit of the ultrasonic Doppler flow meter.
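The recombination-and-filtering stage can be illustrated without a full EEMD implementation (which would normally come from a library such as PyEMD). The sketch below assumes the IMFs are already available and shows the two ingredients the abstract names, energy entropy and cross-correlation selection; the threshold and the stand-in "IMFs" are illustrative choices, not the authors' values:

```python
import numpy as np

def energy_entropy(imfs):
    """Energy entropy H = -sum(p_i log p_i), where p_i is the share of
    total energy carried by the i-th IMF."""
    e = np.array([np.sum(np.square(imf)) for imf in imfs])
    p = e / e.sum()
    return float(-np.sum(p * np.log(p)))

def cross_correlation_filter(imfs, raw, threshold=0.5):
    """Keep the IMFs whose correlation with the raw signal exceeds the
    threshold and sum them into a denoised estimate."""
    keep = [imf for imf in imfs
            if abs(np.corrcoef(imf, raw)[0, 1]) > threshold]
    return np.sum(keep, axis=0)

# Stand-in "IMFs": one clean Doppler-like tone plus two noise components.
rng = np.random.default_rng(1)
t = np.linspace(0.0, 1.0, 1024, endpoint=False)
tone = np.sin(2 * np.pi * 12 * t)
imfs = [tone,
        0.3 * rng.standard_normal(1024),
        0.2 * rng.standard_normal(1024)]
raw = np.sum(imfs, axis=0)
denoised = cross_correlation_filter(imfs, raw)
```

The signal-bearing IMF correlates strongly with the raw trace and survives the filter, while the noise-dominated IMFs fall below the threshold and are discarded.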
The elastic thickness parameter was estimated using a moving-correlation technique between the observed isostatic disturbance and the gravity disturbance calculated through direct gravimetric modeling. We computed the vertical flexure of the crust for a specific elastic thickness using a given topographic dataset, and the gravity disturbance due to the topography was then determined. A grid of candidate values for the elastic thickness parameter was generated, and a moving correlation was performed between the observed gravity data (representing actual surface data) and the data calculated from the forward modeling. The optimum elastic thickness at each point corresponded to the highest correlation coefficient. The methodology was tested on synthetic data, where the recovered depth closely matched the original depth, including the elastic thickness value. To validate the results, the described procedure was applied to a real dataset from the Barreirinhas Basin, situated in the northeastern region of Brazil. The results show that the obtained crustal depth is highly correlated with the depth from known models. Additionally, the elastic thickness behaves as expected, decreasing from the continent towards the ocean. Based on these results, the method has the potential to serve as a direct estimate of crustal depth and elastic thickness for any region.
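The grid-search-with-moving-correlation idea can be sketched in one dimension: for each candidate elastic thickness Te, correlate the modeled disturbance with the observation inside a sliding window and keep, at every point, the Te with the highest coefficient. The profiles and Te values below are synthetic placeholders, not the Barreirinhas data:

```python
import numpy as np

def moving_correlation(a, b, window=21):
    """Pearson r between two profiles inside a sliding window of odd length;
    edges where the window does not fit are left as NaN."""
    half = window // 2
    r = np.full(len(a), np.nan)
    for i in range(half, len(a) - half):
        r[i] = np.corrcoef(a[i - half:i + half + 1],
                           b[i - half:i + half + 1])[0, 1]
    return r

def best_te(observed, modeled_by_te, window=21):
    """Point-wise optimum Te: the candidate whose modeled disturbance
    correlates best with the observation."""
    tes = np.array(sorted(modeled_by_te))
    rs = np.vstack([moving_correlation(observed, modeled_by_te[te], window)
                    for te in tes])
    valid = ~np.isnan(rs[0])
    idx = np.zeros(rs.shape[1], dtype=int)
    idx[valid] = np.argmax(rs[:, valid], axis=0)
    return np.where(valid, tes[idx], np.nan)

x = np.linspace(0.0, 10.0, 200)
observed = np.sin(x)                    # stand-in observed gravity disturbance
modeled = {5.0: np.sin(x),              # the Te = 5 km model fits the data
           20.0: -np.sin(x)}            # the Te = 20 km model does not
te_map = best_te(observed, modeled)
```

In the real workflow, `observed` would be the measured gravity disturbance and each `modeled_by_te` entry the forward-modeled disturbance for one value on the Te grid.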
This document presents a framework for recognizing people by palm vein distribution analysis, using cross-correlation-based signatures to obtain descriptors. Haar wavelets are useful for reducing the number of features while maintaining high recognition rates: with two levels of Haar wavelets, this experiment classified 97.5% of individuals correctly. The study used twelve versions of RGB and NIR (near-infrared) wavelength images per individual; one hundred people were studied, so 4,800 instances compose the complete database. A Multilayer Perceptron (MLP) was trained to improve the recognition rate in a k-fold cross-validation test with k = 10. Classification results with the MLP neural network were obtained using Weka (open-source machine learning software).
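The two-level Haar reduction applied to the cross-correlation signatures can be sketched directly, without a wavelet library: each Haar level replaces adjacent pairs by scaled sums and differences, and keeping only the approximation halves the feature count per level. A minimal, illustrative version:

```python
import numpy as np

def haar_level(x):
    """One Haar DWT step: pairwise scaled sums (approximation) and
    differences (detail)."""
    x = np.asarray(x, dtype=float)
    even, odd = x[0::2], x[1::2]
    return (even + odd) / np.sqrt(2.0), (even - odd) / np.sqrt(2.0)

def haar_features(x, levels=2):
    """Keep only the approximation after `levels` Haar steps, shrinking
    the feature vector by a factor of 2**levels."""
    for _ in range(levels):
        x, _ = haar_level(x)
    return x

signature = np.arange(16, dtype=float)   # stand-in cross-correlation signature
feat = haar_features(signature, levels=2)
```

Two levels shrink a length-16 signature to 4 coefficients, the kind of 4x reduction that lets the MLP train on far fewer inputs while preserving the coarse shape of the signature.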
In the article “Deep Learning-Enhanced Brain Tumor Prediction via Entropy-Coded BPSO in CIELAB Color Space” by Mudassir Khalil, Muhammad Imran Sharif, Ahmed Naeem, Muhammad Umar Chaudhry, Hafiz Tayyab Rauf, and Adham E. Ragab (Computers, Materials & Continua, 2023, Vol. 77, No. 2, pp. 2031–2047, DOI: 10.32604/cmc.2023.043687, URL: https://www.techscience.com/cmc/v77n2/54831), there was an error in the affiliation of the author Hafiz Tayyab Rauf. Instead of “Centre for Smart Systems, AI and Cybersecurity, Staffordshire University, Stoke-on-Trent, ST42DE, UK”, the affiliation should be “Independent Researcher, Bradford, BD80HS, UK”.
The temperature change and the rate of CO2 change are correlated with a time lag, as reported in a previous paper. In this study, the correlation was investigated by calculating a correlation coefficient r of these changes for selected ENSO events. Annual periodic increases and decreases in the CO2 concentration were considered, with a regular pattern of minimum values in August and maximum values in May each year. An increased deviation in CO2 and temperature was found in response to the occurrence of El Niño, but the increase in CO2 lagged behind the change in temperature by 5 months. This pattern was not observed for La Niña events. The sequence proposed by the IPCC, an increase in global CO2 emissions followed by an increase in global temperature, was not observed; instead, an increase in global temperature, an increase in soil respiration, and a subsequent increase in global CO2 emissions were noticed. This natural process can be clearly detected during periods of increasing temperature, specifically during El Niño events. The results cast strong doubt on anthropogenic CO2 being the cause of global warming.
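The lag analysis described here amounts to computing the correlation coefficient at a range of candidate lags and reading off the lag that maximizes it. A minimal sketch on synthetic monthly series with a built-in 5-month lag (the data and magnitudes are illustrative, not the study's):

```python
import numpy as np

def lagged_correlation(temp, dco2, max_lag=12):
    """Pearson r between temperature change and CO2 rate-of-change,
    with the CO2 series shifted `lag` months behind temperature."""
    out = {0: float(np.corrcoef(temp, dco2)[0, 1])}
    for lag in range(1, max_lag + 1):
        out[lag] = float(np.corrcoef(temp[:-lag], dco2[lag:])[0, 1])
    return out

# Synthetic monthly series in which the CO2 rate lags temperature by 5 months.
rng = np.random.default_rng(7)
months = np.arange(240)
temp = np.sin(2 * np.pi * months / 60) + 0.1 * rng.standard_normal(240)
dco2 = np.roll(temp, 5)
dco2[:5] = 0.0            # drop the wrapped-around samples
r = lagged_correlation(temp, dco2)
best_lag = max(r, key=r.get)
```

The peak of r over the lag axis recovers the built-in 5-month offset, which is the same diagnostic the abstract applies to the observed temperature and CO2 records.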
A two-stage deep learning algorithm for detecting and recognizing spray-printed codes and numbers on can bottoms is proposed to address the problems of small character areas and fast production line speeds in can bottom spray code recognition. In the code number detection stage, a Differentiable Binarization Network is used as the backbone, combined with an Attention and Dilation Convolutions Path Aggregation Network feature fusion structure to enhance detection. For text recognition, end-to-end training of the Scene Visual Text Recognition network alleviates recognition errors caused by image color distortion due to variations in lighting and background noise. In addition, model pruning and quantization are used to reduce the number of model parameters to meet deployment requirements in resource-constrained environments. A comparative experiment was conducted on a dataset of can bottom spray code numbers collected on-site, and a transfer experiment was conducted on a dataset of packaging box production dates. The experimental results show that the proposed algorithm can effectively locate the codes of cans at different positions on the roller conveyor and accurately identify the code numbers at high production line speeds. The Hmean of code number detection is 97.32%, and the accuracy of code number recognition is 98.21%, verifying that the proposed algorithm achieves high accuracy in both detection and recognition.
The care of a patient involved in major trauma with exsanguinating haemorrhage is time-critical for achieving definitive haemorrhage control and requires coordinated multidisciplinary care. During initial resuscitation of a patient in the emergency department (ED), Code Crimson activation facilitates rapid decision-making by multidisciplinary specialists for definitive haemorrhage control in the operating theatre (OT) and/or interventional radiology (IR) suite. Once this decision has been made, various factors may still delay transport of the patient from the ED to the OT/IR suite. The Red Blanket protocol identifies and addresses the factors and processes that cause delay, and aims to facilitate rapid and safe transport of the haemodynamically unstable patient from the ED to the OT while minimizing interruption of resuscitation during the transfer. The two processes, Code Crimson and Red Blanket, complement each other, and it would be ideal to merge them into a single protocol rather than maintaining two separate workflows. Introducing these quality improvement strategies and coordinated processes within the trauma framework of hospitals and healthcare systems will further improve multidisciplinary care for complex trauma patients requiring rapid and definitive haemorrhage control.
Multilevel coding (MLC) is a commonly used polar coded modulation scheme, but it is challenging to implement in engineering because of its high complexity and long decoding delay for high-order modulations. To address these limitations, a novel two-level serially concatenated MLC scheme is proposed, in which bit-levels with similar reliability are bundled and transmitted together. The proposed scheme protects the two bit-level sets hierarchically: the bit-level sets at the higher level are sufficiently reliable and do not require excessive resources for protection, whereas only the bit-level sets at the lower level are encoded by polar codes. The scheme offers low power consumption, low delay, and high reliability. Moreover, an optimized constellation signal labeling rule that enhances performance is proposed. Finally, the superiority of the proposed scheme is validated through theoretical analysis and simulation results. Compared with the bit-interleaved coded modulation (BICM) scheme under 256-quadrature amplitude modulation (QAM), the proposed scheme attains a performance gain of 1.0 dB while reducing decoding complexity by 54.55%.
Neuroscience (also known as neurobiology) is the science that studies the structure, function, development, pharmacology, and pathology of the nervous system. In recent years, C. Cotardo introduced coding theory into neuroscience, proposing the concept of combinatorial neural codes, which was further studied in depth by C. Curto using algebraic methods. In this paper, we construct a class of combinatorial neural codes with special properties based on classical combinatorial structures such as orthogonal Latin rectangles, disjoint Steiner systems, group divisible designs, and transversal designs. These neural codes have significant weight distribution properties and large minimum distances, and are thus valuable for potential applications in information representation and neuroscience. This study provides new ideas for the construction and analysis of combinatorial neural codes and enriches the study of algebraic coding theory.
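As a toy illustration of how a combinatorial structure yields a code with a guaranteed minimum distance, the sketch below turns each row of a Latin square into a binary codeword by one-hot encoding its symbols; distinct rows of a Latin square disagree in every column, so any two codewords sit at Hamming distance 2n. This construction and its names are illustrative, not the paper's actual constructions:

```python
from itertools import combinations

def min_distance(code):
    """Minimum pairwise Hamming distance of a block code."""
    return min(sum(a != b for a, b in zip(c1, c2))
               for c1, c2 in combinations(code, 2))

def code_from_latin_square(square):
    """One codeword per row: each symbol is one-hot encoded, so two rows
    that disagree in every column end up at Hamming distance 2*n."""
    n = len(square)
    code = []
    for row in square:
        word = []
        for s in row:
            one_hot = [0] * n
            one_hot[s] = 1
            word.extend(one_hot)
        code.append(tuple(word))
    return code

square = [[0, 1, 2], [1, 2, 0], [2, 0, 1]]   # a 3x3 Latin square
code = code_from_latin_square(square)
d = min_distance(code)
```

For n = 3 this gives three codewords of length 9 with minimum distance 6, the "large minimum distance from a combinatorial design" pattern the abstract exploits with richer structures.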
National fire codes, mandated by government authorities to tackle technical challenges in fire prevention and control, establish fundamental standards for construction practices. International collaboration in fire protection technologies has opened avenues for China to access a wealth of documents and codes, which are crucial for crafting regulations and developing a robust, scientific framework for fire code formulation. However, the translation of these codes into Chinese has been inadequate, diminishing the benefits of technological exchange and collaborative learning. This underscores the necessity of comprehensive research into code translation, striving for higher-quality translations guided by established translation theories. In this study, we translated the initial segment of the NFPA 1 Fire Code into Chinese and examined both the source text and the target text through the lens of Translation Shift Theory, a concept introduced by Catford. The analysis culminated in identifying four key shifts across various linguistic levels (lexis, groups, and sentences) to ensure an accurate and precise translation of fire codes. This study offers a thorough and lucid explanation of how the translator applies Catford's theories to solve technical challenges in translating the NFPA 1 Fire Code, and it establishes essential standards for construction translation practices.
Landslides significantly threaten lives and infrastructure, especially in seismically active regions. This study conducts a probabilistic analysis of seismic landslide runout behavior, leveraging a large-deformation finite-element (LDFE) model that accounts for the three-dimensional (3D) spatial variability and cross-correlation in soil strength, a reflection of natural soils' inherent properties. LDFE model results are validated by comparing them against previous studies, followed by an examination of the effects of univariable, uncorrelated bivariable, and cross-correlated bivariable random fields on landslide runout behavior. The study's findings reveal that integrating variability in both friction angle and cohesion within uncorrelated bivariable random fields markedly influences runout distances when compared with univariable random fields. Moreover, the cross-correlation of soil cohesion and friction angle dramatically affects runout behavior, with positive correlations enlarging and negative correlations reducing runout distances. Transitioning from two-dimensional (2D) to 3D analyses, a more realistic representation of sliding surface, landslide velocity, runout distance and final deposit morphology is achieved. The study highlights that 2D random analyses substantially underestimate the mean value and overestimate the variability of runout distance, underscoring the importance of 3D modeling in accurately predicting landslide behavior. Overall, this work emphasizes the essential role of understanding 3D cross-correlation in soil strength for landslide hazard assessment and mitigation strategies.
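The cross-correlated bivariable random fields at the heart of this analysis can be generated, in one dimension, by Cholesky factorization of an autocovariance matrix plus a point-wise mixing step that imposes the cross-correlation. The exponential autocorrelation model, the field length, and rho = 0.7 below are illustrative choices:

```python
import numpy as np

def cross_correlated_fields(n, corr_len, rho, seed=0):
    """Two stationary Gaussian fields (stand-ins for cohesion and friction
    angle) with exponential autocorrelation and point-wise cross-correlation
    rho, generated by Cholesky factorization of the autocovariance matrix."""
    x = np.arange(n)
    C = np.exp(-np.abs(x[:, None] - x[None, :]) / corr_len)
    L = np.linalg.cholesky(C + 1e-10 * np.eye(n))   # jitter for stability
    rng = np.random.default_rng(seed)
    z1 = L @ rng.standard_normal(n)
    z2 = rho * z1 + np.sqrt(1.0 - rho ** 2) * (L @ rng.standard_normal(n))
    return z1, z2

cohesion, friction = cross_correlated_fields(n=2000, corr_len=10.0, rho=0.7)
r = np.corrcoef(cohesion, friction)[0, 1]
```

A positive rho makes low-cohesion zones coincide with low-friction zones along the slope, the configuration the study finds enlarges runout distances; a negative rho does the opposite.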
We construct an infinite family of minimal linear codes over the ring F2 + uF2. These codes are defined through trace functions and Boolean functions, and their Lee weight distribution is completely computed via the Walsh transform. By the Gray map, we obtain a family of minimal binary linear codes from a generic construction, which have prominent applications in secret sharing and secure two-party computation.
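The Gray map and Lee weight mentioned here are concrete enough to sketch. For R = F2 + uF2 (with u² = 0), one common convention maps a + ub to the binary pair (b, a XOR b), under which the Lee weight of a ring element equals the Hamming weight of its Gray image; the small demo below uses that convention (the paper may use a different but equivalent one):

```python
def gray(elem):
    """Gray map phi(a + u*b) = (b, a XOR b) for R = F2 + uF2 (u**2 = 0)."""
    a, b = elem
    return (b, a ^ b)

def lee_weight(elem):
    """Lee weight on R: wt(0) = 0, wt(1) = 1, wt(u) = 2, wt(1+u) = 1;
    it equals the Hamming weight of the Gray image."""
    return sum(gray(elem))

def gray_image(word):
    """Binary Gray image of a length-n codeword over R (length 2n)."""
    out = []
    for e in word:
        out.extend(gray(e))
    return out

word = [(1, 0), (0, 1), (1, 1)]            # the ring elements 1, u, 1+u
lee = sum(lee_weight(e) for e in word)     # Lee weight of the codeword
binary = gray_image(word)                  # its binary Gray image
```

Because the Gray map is a weight-preserving isometry from (R, Lee) to (F2², Hamming), computing the Lee weight distribution over R immediately gives the Hamming weight distribution of the binary image codes.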
Quantum computing has the potential to solve complex problems that are handled inefficiently by classical computation. However, the high sensitivity of qubits to environmental interference and the high error rates of current quantum devices exceed the error correction thresholds required for effective algorithm execution, so quantum error correction technology is crucial to achieving reliable quantum computing. In this work, we study a topological surface code with a two-dimensional lattice structure that protects quantum information by introducing redundancy across multiple qubits and using syndrome qubits to detect and correct errors. However, errors can occur not only in data qubits but also in syndrome qubits, and different types of errors may generate the same syndromes, complicating the decoding task and creating a need for more efficient decoding methods. To address this challenge, we used a transformer decoder based on an attention mechanism. By mapping the surface code lattice, the decoder performs self-attention over all input syndromes, thereby obtaining a global receptive field. The performance of the decoder was evaluated under a phenomenological error model. Numerical results demonstrate that the decoder achieved a decoding accuracy of 93.8%. Additionally, we obtained decoding thresholds of 5% and 6.05% at maximum code distances of 7 and 9, respectively. These results indicate that the decoder demonstrates a clear capability to correct noise errors in surface codes.
Constituted by BCH component codes and their ordered statistics decoding (OSD), the successive cancellation list (SCL) decoding of U-UV structural codes can provide competent error-correction performance in the short-to-medium length regime. However, the list decoding complexity, primarily incurred by the OSD, becomes formidable as the decoding output list size increases. Addressing this challenge, this paper proposes low-complexity SCL decoding that reduces the complexity of component code decoding and prunes redundant SCL decoding paths. For the former, an efficient skipping rule is introduced for the OSD so that higher-order decoding can be skipped when it cannot provide a more likely codeword candidate; the rule is further extended to an OSD variant, the box-and-match algorithm (BMA), to facilitate component code decoding. Moreover, by estimating the correlation distance lower bounds (CDLBs) of the component code decoding outputs, a path-pruning (PP)-SCL decoding is proposed to further facilitate the decoding of U-UV codes; in particular, its integration with the improved OSD and BMA is discussed. Simulation results show that significant complexity reduction can be achieved. Consequently, the U-UV codes can outperform cyclic redundancy check (CRC)-polar codes at a similar decoding complexity.
Quantum error-correcting codes are essential for fault-tolerant quantum computing, as they effectively detect and correct noise-induced errors by distributing information across multiple physical qubits. The subsystem surface code with three-qubit check operators demonstrates significant application potential due to its simplified measurement operations and low logical error rates. However, the existing minimum-weight perfect matching (MWPM) algorithm exhibits high computational complexity and lacks flexibility in large-scale systems. Therefore, this paper proposes a decoder based on a graph attention network (GAT), representing error syndromes as undirected graphs with edge weights and employing a multi-head attention mechanism to aggregate node features efficiently and enable parallel computation. Compared to MWPM, the GAT decoder exhibits linear growth in computational complexity, adapts to different quantum code structures, and demonstrates stronger robustness under high physical error rates. The experimental results demonstrate that the proposed decoder achieves an overall accuracy of 89.95% across various small code lattice sizes (L = 2, 3, 4, 5), with the logical error rate threshold increasing to 0.0078, an improvement of approximately 13.04% over the MWPM decoder. This result significantly outperforms traditional methods, showcasing superior performance at small code lattice sizes and providing a more efficient decoding solution for large-scale quantum error correction.
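The syndrome-to-graph step the decoder relies on can be sketched with NumPy alone: flagged stabilizers become nodes, edge weights decay with lattice distance, and a row-wise softmax mimics the normalization a single attention head applies when aggregating neighbor features. The inverse-distance weighting and the coordinates are hypothetical choices for illustration, not the paper's trained GAT:

```python
import numpy as np

def syndrome_graph(coords):
    """Complete weighted graph over flagged stabilizer sites; the edge
    weight 1/d decays with Manhattan distance d on the lattice."""
    n = len(coords)
    W = np.zeros((n, n))
    for i in range(n):
        for j in range(i + 1, n):
            d = abs(coords[i][0] - coords[j][0]) + abs(coords[i][1] - coords[j][1])
            W[i, j] = W[j, i] = 1.0 / d
    return W

def attention_weights(W):
    """Row-wise softmax over edge weights: the normalization a single
    attention head applies before aggregating neighbor features."""
    W = W.copy()
    np.fill_diagonal(W, -np.inf)            # exclude self-edges
    e = np.exp(W - W.max(axis=1, keepdims=True))
    return e / e.sum(axis=1, keepdims=True)

coords = [(0, 0), (0, 1), (2, 2)]           # flagged syndrome positions
A = attention_weights(syndrome_graph(coords))
```

Nearby syndrome pairs, which are more likely to share an error chain, receive the largest attention weights, which is the inductive bias the GAT then refines with learned parameters.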
The traditional Feng Shui pattern embodies rich ecological wisdom and philosophical thought, which are of great significance to modern sustainable space design. The core concepts of Feng Shui patterns from traditional civilization provide a theoretical foundation and research framework for this study. By integrating principles such as “hiding the wind and gathering the Qi” and “backing the mountain and facing the water”, a functional relationship between urban structures can be established. This approach can help optimize the spatial layout of urban elements, minimize energy consumption, and enhance environmental comfort. The study also examines the influence of the ShanShui City pattern in traditional Feng Shui on guiding the development of modern urban ecological networks, as well as its role in protecting and restoring biodiversity through ecological corridors and ecological nodes. Modern urban design informed by traditional Feng Shui culture focuses on inheritance and innovation, combining traditional Feng Shui concepts with modern design concepts to form ecological spaces with cultural connotations. This paper hopes to offer inspiration and methods for contemporary urban design and, through these ideas, to reconcile the relationship between humans and nature.
In this paper, we propose a hybrid decode-and-forward and soft information relaying (HDFSIR) strategy to mitigate error propagation in coded cooperative communications. In the HDFSIR approach, the relay operates in decode-and-forward (DF) mode when it successfully decodes the received message; otherwise, it switches to soft information relaying (SIR) mode. The benefits of the DF and SIR forwarding strategies are combined to achieve better performance than deploying either strategy alone. Closed-form expressions for the outage probability and symbol error rate (SER) are derived for coded cooperative communication with HDFSIR and energy-harvesting relays. Additionally, we introduce a novel normalized log-likelihood-ratio-based soft estimation symbol (NL-SES) mapping technique, which enhances soft symbol accuracy for higher-order modulation, and propose a model characterizing the relationship between the estimated complex soft symbol and the actual high-order modulated symbol. Furthermore, the hybrid DF-SIR strategy is extended to a distributed Alamouti space-time-coded cooperative network. To evaluate the performance of the proposed HDFSIR strategy, we implement extensive Monte Carlo simulations under varying channel conditions. The results demonstrate significant improvements, with the hybrid technique outperforming the individual DF and SIR strategies in both conventional and distributed Alamouti space-time-coded cooperative networks. Moreover, at an SER of 10^(-3), the proposed NL-SES mapping demonstrates a 3.5 dB performance gain over conventional averaging, highlighting its superior accuracy in estimating soft symbols for quadrature phase-shift keying modulation.
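The core DF/SIR switch can be sketched for BPSK: if the relay's decode check (e.g., a CRC) passes, it re-modulates hard decisions; otherwise it forwards soft symbols, for which tanh(L/2) is the standard soft BPSK estimate from a bit LLR L. The CRC flag and LLR values below are placeholders, and the paper's NL-SES mapping for higher-order modulation is not reproduced here:

```python
import numpy as np

def relay_forward(llrs, crc_ok):
    """Hybrid DF/SIR decision: forward hard re-modulated bits when the
    relay's decode check passes (DF), otherwise forward tanh-based soft
    BPSK estimates built from the bit LLRs (SIR)."""
    if crc_ok:
        bits = (llrs < 0).astype(int)      # hard decision per bit
        return 1 - 2 * bits, "DF"          # BPSK remodulation: 0 -> +1, 1 -> -1
    return np.tanh(llrs / 2), "SIR"        # E[x | LLR] for BPSK

llrs = np.array([4.0, -3.0, 0.4, -0.2])
hard, mode_df = relay_forward(llrs, crc_ok=True)
soft, mode_sir = relay_forward(llrs, crc_ok=False)
```

Note how the soft symbols shrink toward zero for unreliable bits (small |LLR|), so the destination automatically discounts them, which is the mechanism that limits error propagation when decoding at the relay fails.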
Funding for the first study: Project supported by the National Natural Science Foundation of China (Grant Nos. 11875042 and 11505114) and the Shanghai Project for Construction of Top Disciplines (Grant No. USST-SYS-01).
Funding for the second study: Supported by the National Natural Science Foundation of China (No. 61973234) and the Tianjin Science and Technology Plan Project (No. 22YDTPJC00090).
文摘The elastic thickness parameter was estimated using the mobile correlation technique between the observed isostatic disturbance and the gravity disturbance calculated through direct gravimetric modeling. We computed the vertical flexure value of the crust for a specific elastic thickness using a given topographic dataset. The gravity disturbance due to the topography was determined after the calculation. A grid of values for the elastic thickness parameter was generated. Then, a moving correlation was performed between the observed gravity data(representing actual surface data) and the calculated data from the forward modeling. The optimum elastic thickness of the particular point corresponded to the highest correlation coefficient. The methodology was tested on synthetic data and showed that the synthetic depth closely matched the original depth, including the elastic thickness value. To validate the results, the described procedure was applied to a real dataset from the Barreirinhas Basin, situated in the northeastern region of Brazil. The results show that the obtained crustal depth is highly correlated with the depth from known models. Additionally, we noted that the elastic thickness behaves as expected, decreasing from the continent towards the ocean. Based on the results, this method has the potential to be employed as a direct estimate of crustal depth and elastic thickness for any region.
文摘This document presents a framework for recognizing people by palm vein distribution analysis using cross-correlation based signatures to obtain descriptors. Haar wavelets are useful in reducing the number of features while maintaining high recognition rates. This experiment achieved 97.5% of individuals classified correctly with two levels of Haar wavelets. This study used twelve-version of RGB and NIR (near infrared) wavelength images per individual. One hundred people were studied;therefore 4,800 instances compose the complete database. A Multilayer Perceptron (MLP) was trained to improve the recognition rate in a k-fold cross-validation test with k = 10. Classification results using MLP neural network were obtained using Weka (open source machine learning software).
文摘In the article“Deep Learning-Enhanced Brain Tumor Prediction via Entropy-Coded BPSO in CIELAB Color Space”by Mudassir Khalil,Muhammad Imran Sharif,Ahmed Naeem,Muhammad Umar Chaudhry,Hafiz Tayyab Rauf,Adham E.Ragab Computers,Materials&Continua,2023,Vol.77,No.2,pp.2031–2047.DOI:10.32604/cmc.2023.043687,URL:https://www.techscience.com/cmc/v77n2/54831,there was an error regarding the affiliation for the author Hafiz Tayyab Rauf.Instead of“Centre for Smart Systems,AI and Cybersecurity,Staffordshire University,Stoke-on-Trent,ST42DE,UK”,the affiliation should be“Independent Researcher,Bradford,BD80HS,UK”.
文摘The temperature change and rate of CO2 change are correlated with a time lag, as reported in a previous paper. The correlation was investigated by calculating a correlation coefficient r of these changes for selected ENSO events in this study. Annual periodical increases and decreases in the CO2 concentration were considered, with a regular pattern of minimum values in August and maximum values in May each year. An increased deviation in CO2 and temperature was found in response to the occurrence of El Niño, but the increase in CO2 lagged behind the change in temperature by 5 months. This pattern was not observed for La Niña events. An increase in global CO2 emissions and a subsequent increase in global temperature proposed by IPCC were not observed, but an increase in global temperature, an increase in soil respiration, and a subsequent increase in global CO2 emissions were noticed. This natural process can be clearly detected during periods of increasing temperature specifically during El Niño events. The results cast strong doubts that anthropogenic CO2 is the cause of global warming.
Abstract: A two-stage deep-learning algorithm for the detection and recognition of can-bottom spray codes and numbers is proposed to address the problems of small character areas and fast production-line speeds in can-bottom spray-code number recognition. In the coding-number detection stage, a Differentiable Binarization Network is used as the backbone, combined with an Attention and Dilation Convolutions Path Aggregation Network feature-fusion structure to enhance detection. For text recognition, end-to-end training with the Scene Visual Text Recognition coding-number recognition network alleviates coding-recognition errors caused by image color distortion due to variations in lighting and background noise. In addition, model pruning and quantization are used to reduce the number of model parameters to meet deployment requirements in resource-constrained environments. A comparative experiment was conducted on a dataset of can-bottom spray-code numbers collected on-site, and a transfer experiment was conducted on a dataset of packaging-box production dates. The experimental results show that the proposed algorithm can effectively locate the coding of cans at different positions on the roller conveyor and can accurately identify the coding numbers at high production-line speeds. The Hmean value of coding-number detection is 97.32%, and the accuracy of coding-number recognition is 98.21%, verifying that the proposed algorithm achieves high accuracy in coding-number detection and recognition.
Abstract: The care of a patient involved in major trauma with exsanguinating haemorrhage is time-critical for achieving definitive haemorrhage control, and it requires coordinated multidisciplinary care. During initial resuscitation of a patient in the emergency department (ED), Code Crimson activation facilitates rapid decision-making by multidisciplinary specialists for definitive haemorrhage control in the operating theatre (OT) and/or interventional radiology (IR) suite. Once this decision has been made, various factors may still delay transporting the patient from the ED to the OT/IR. The Red Blanket protocol identifies and addresses the factors and processes that cause delay, and it aims to facilitate rapid and safe transport of the haemodynamically unstable patient from the ED to the OT while minimizing delay in resuscitation during the transfer. The two processes, Code Crimson and Red Blanket, complement each other, and it would be ideal to merge them into a single protocol rather than maintaining two separate workflows. Introducing these quality-improvement strategies and coordinated processes within the trauma framework of hospitals and healthcare systems will further improve multidisciplinary care for complex trauma patients requiring rapid and definitive haemorrhage control.
Funding: Supported by the External Cooperation Program of Science and Technology of Fujian Province, China (2024I0016), and the Fundamental Research Funds for the Central Universities (ZQN-1005).
Abstract: Multilevel coding (MLC) is a commonly used polar-coded modulation scheme, but it is challenging to implement in engineering because of its high complexity and long decoding delay for high-order modulations. To address these limitations, a novel two-level serially concatenated MLC scheme is proposed, in which bit-levels with similar reliability are bundled and transmitted together. The proposed scheme protects the two bit-level sets hierarchically: the bit-level sets at the higher level are sufficiently reliable and do not require excessive resources for protection, whereas only the bit-level sets at the lower level are encoded by polar codes. The scheme offers low power consumption, low delay, and high reliability. Moreover, an optimized constellation signal-labeling rule that enhances performance is proposed. Finally, the superiority of the proposed scheme is validated through theoretical analysis and simulation results. Compared with the bit-interleaved coded modulation (BICM) scheme under 256-quadrature amplitude modulation (QAM), the proposed scheme attains a performance gain of 1.0 dB while reducing decoding complexity by 54.55%.
Abstract: Neuroscience (also known as neurobiology) is the science that studies the structure, function, development, pharmacology, and pathology of the nervous system. In recent years, C. Cotardo has introduced coding theory into neuroscience, proposing the concept of combinatorial neural codes, which C. Curto further studied in depth using algebraic methods. In this paper, we construct a class of combinatorial neural codes with special properties based on classical combinatorial structures such as orthogonal Latin rectangles, disjoint Steiner systems, groupable designs, and transversal designs. These neural codes have significant weight-distribution properties and large minimum distances, and are thus valuable for potential applications in information representation and neuroscience. This study provides new ideas for the construction and analysis of combinatorial neural codes and enriches the study of algebraic coding theory.
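The minimum-distance property claimed for such codes can be checked mechanically for any explicit code. A generic sketch on toy codewords (the paper's combinatorial constructions are not reproduced here):

```python
from itertools import combinations

def min_distance(code):
    """Minimum pairwise Hamming distance of a block code,
    given as a list of equal-length 0/1 tuples."""
    return min(sum(a != b for a, b in zip(u, v))
               for u, v in combinations(code, 2))

# Toy examples: the length-5 repetition code has distance 5;
# a subset of the even-weight code of length 4 has distance 2.
rep5 = [(0,) * 5, (1,) * 5]
even4 = [(0, 0, 0, 0), (1, 1, 0, 0), (0, 0, 1, 1), (1, 1, 1, 1)]
print(min_distance(rep5), min_distance(even4))  # 5 2
```

This brute-force check is quadratic in the number of codewords, which is fine for the small codes arising from designs but not for large linear codes, where one computes the minimum weight instead.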
Funding: Hangzhou Philosophy and Social Science Planning Program (24JD15).
Abstract: National fire codes, mandated by government authorities to tackle technical challenges in fire prevention and control, establish fundamental standards for construction practices. International collaboration in fire-protection technologies has given China access to a wealth of documents and codes, which are crucial for crafting regulations and developing a robust, scientific framework for fire-code formulation. However, the translation of these codes into Chinese has been inadequate, diminishing the benefits of technological exchange and collaborative learning. This underscores the need for comprehensive research into code translation, striving for higher-quality translations guided by established translation theories. In this study, we translated the initial segment of the NFPA 1 Fire Code into Chinese and examined both the source text and the target text through the lens of Translation Shift Theory, a concept introduced by Catford. The study culminated in identifying four key shifts across linguistic levels (lexis, sentences, and groups) to ensure an accurate and precise translation of fire codes. This study offers a thorough and lucid explanation of how the translator applies Catford's theory to solve technical challenges in NFPA 1 Fire Code translation, and it establishes essential standards for construction-translation practices.
Funding: Supported by the National Natural Science Foundation of China (Grant No. U22A20596), the Shenzhen Science and Technology Program (Grant No. GJHZ20220913142605010), and the Jinan Lead Researcher Project (Grant No. 202333051).
Abstract: Landslides significantly threaten lives and infrastructure, especially in seismically active regions. This study conducts a probabilistic analysis of seismic landslide runout behavior, leveraging a large-deformation finite-element (LDFE) model that accounts for the three-dimensional (3D) spatial variability and cross-correlation in soil strength, a reflection of natural soils' inherent properties. The LDFE model results are validated against previous studies, followed by an examination of the effects of univariable, uncorrelated bivariable, and cross-correlated bivariable random fields on landslide runout behavior. The study's findings reveal that integrating variability in both friction angle and cohesion within uncorrelated bivariable random fields markedly influences runout distances compared with univariable random fields. Moreover, the cross-correlation of soil cohesion and friction angle dramatically affects runout behavior, with positive correlations enlarging and negative correlations reducing runout distances. Transitioning from two-dimensional (2D) to 3D analyses yields a more realistic representation of the sliding surface, landslide velocity, runout distance, and final deposit morphology. The study highlights that 2D random analyses substantially underestimate the mean value and overestimate the variability of runout distance, underscoring the importance of 3D modeling in accurately predicting landslide behavior. Overall, this work emphasizes the essential role of understanding 3D cross-correlation in soil strength for landslide hazard assessment and mitigation strategies.
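Cross-correlated bivariable random sampling of the kind used here is conventionally built from the Cholesky factor of the correlation matrix. A minimal single-point sketch with made-up means and standard deviations (the LDFE random fields themselves also carry spatial correlation, which this omits):

```python
import math
import random

def correlated_pair(mean_c, sd_c, mean_phi, sd_phi, rho, rng):
    """One (cohesion, friction-angle) draw with cross-correlation rho,
    via the Cholesky factor [[1, 0], [rho, sqrt(1 - rho**2)]] of the
    2x2 correlation matrix applied to independent standard normals."""
    z1, z2 = rng.gauss(0.0, 1.0), rng.gauss(0.0, 1.0)
    c = mean_c + sd_c * z1
    phi = mean_phi + sd_phi * (rho * z1 + math.sqrt(1.0 - rho ** 2) * z2)
    return c, phi

# Negative cross-correlation between cohesion and friction angle,
# as in the runout-reducing case discussed in the abstract.
rng = random.Random(0)
pairs = [correlated_pair(10.0, 2.0, 30.0, 5.0, -0.5, rng) for _ in range(20000)]
cs, ps = [p[0] for p in pairs], [p[1] for p in pairs]
mc, mp = sum(cs) / len(cs), sum(ps) / len(ps)
cov = sum((a - mc) * (b - mp) for a, b in zip(cs, ps)) / len(cs)
est = cov / ((sum((a - mc) ** 2 for a in cs) / len(cs)) ** 0.5
             * (sum((b - mp) ** 2 for b in ps) / len(ps)) ** 0.5)
print(round(est, 2))  # close to the target rho of -0.5
```

The recovered sample correlation confirms the construction; a full random-field model would additionally correlate values across neighbouring mesh points.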
Funding: National Natural Science Foundation of China (12201171).
Abstract: We construct an infinite family of minimal linear codes over the ring F_2 + uF_2. These codes are defined through trace functions and Boolean functions, and their Lee weight distribution is completely computed via the Walsh transform. By the Gray map, we obtain a family of minimal binary linear codes from a generic construction, which have prominent applications in secret sharing and secure two-party computation.
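One standard choice of Gray map for F_2 + uF_2 (conventions vary between papers) sends a + ub to (b, a + b), and the Lee weight of a ring element is then the Hamming weight of its Gray image. A sketch, not reproducing the paper's specific codes:

```python
def gray(a, b):
    """Gray map F_2 + uF_2 -> F_2^2 under one standard convention:
    a + u*b  |->  (b, (a + b) % 2)."""
    return (b, (a + b) % 2)

def lee_weight(a, b):
    """Lee weight of a + u*b = Hamming weight of its Gray image."""
    return sum(gray(a, b))

# Lee weights of the four ring elements 0, 1, u, 1 + u:
print([lee_weight(a, b) for (a, b) in [(0, 0), (1, 0), (0, 1), (1, 1)]])
# -> [0, 1, 2, 1]
```

Because the Gray map is a weight-preserving bijection onto F_2^2, the Lee weight distribution of a code over the ring equals the Hamming weight distribution of its binary image, which is how the binary minimal codes are obtained.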
Funding: Project supported by the Natural Science Foundation of Shandong Province, China (Grant No. ZR2021MF049), the Joint Fund of the Natural Science Foundation of Shandong Province (Grant Nos. ZR2022LLZ012 and ZR2021LLZ001), and the Key R&D Program of Shandong Province, China (Grant No. 2023CXGC010901).
Abstract: Quantum computing has the potential to solve complex problems that classical computation handles inefficiently. However, the high sensitivity of qubits to environmental interference and the high error rates of current quantum devices exceed the error-correction thresholds required for effective algorithm execution, so quantum error-correction technology is crucial for reliable quantum computing. In this work, we study a topological surface code with a two-dimensional lattice structure that protects quantum information by introducing redundancy across multiple qubits and using syndrome qubits to detect and correct errors. However, errors can occur not only in data qubits but also in syndrome qubits, and different types of errors may generate the same syndrome, complicating the decoding task and creating a need for more efficient decoding methods. To address this challenge, we used a transformer decoder based on an attention mechanism. By mapping the surface-code lattice, the decoder performs self-attention over all input syndromes, thereby obtaining a global receptive field. The decoder's performance was evaluated under a phenomenological error model. Numerical results show that the decoder achieved a decoding accuracy of 93.8%, with decoding thresholds of 5% and 6.05% at maximum code distances of 7 and 9, respectively. These results indicate that the decoder is capable of correcting noise errors in surface codes.
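Surface-code decoding itself is beyond a short example, but the underlying idea of syndrome-based correction that the abstract relies on can be shown on the three-qubit bit-flip repetition code (a toy stand-in, not the paper's decoder):

```python
def syndrome(data):
    """Parity checks Z1Z2 and Z2Z3 of the three-qubit bit-flip code,
    measured here classically on a 0/1 triple."""
    return (data[0] ^ data[1], data[1] ^ data[2])

# Syndrome table: which single data-qubit flip explains each syndrome.
# Note the decoding ambiguity the abstract mentions: distinct errors
# (e.g. a two-qubit flip) can produce the same syndrome.
TABLE = {(0, 0): None, (1, 0): 0, (1, 1): 1, (0, 1): 2}

def correct(data):
    """Apply the most likely (lowest-weight) correction."""
    flip = TABLE[syndrome(data)]
    if flip is not None:
        data = list(data)
        data[flip] ^= 1
    return tuple(data)

print(correct((0, 1, 0)))  # (0, 0, 0): middle-qubit flip detected and undone
```

A surface code plays the same game on a 2D lattice with far more syndromes, which is why learned decoders such as the transformer described above become attractive.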
Funding: Supported by the National Natural Science Foundation of China (NSFC) under project ID 62071498 and the Guangdong National Science Foundation (GDNSF) under project ID 2024A1515010213.
Abstract: Constituted by BCH component codes and their ordered statistics decoding (OSD), the successive cancellation list (SCL) decoding of U-UV structural codes provides competent error-correction performance in the short-to-medium length regime. However, the list-decoding complexity, incurred primarily by the OSD, becomes formidable as the decoding output list size increases. Addressing this challenge, this paper proposes low-complexity SCL decoding that reduces the complexity of component-code decoding and prunes redundant SCL decoding paths. For the former, an efficient skipping rule is introduced for the OSD so that higher-order decoding can be skipped when it cannot provide a more likely codeword candidate; the rule is further extended to the OSD variant, the box-and-match algorithm (BMA), to facilitate component-code decoding. Moreover, by estimating the correlation-distance lower bounds (CDLBs) of the component-code decoding outputs, a path-pruning (PP)-SCL decoding is proposed to further facilitate the decoding of U-UV codes; in particular, its integration with the improved OSD and BMA is discussed. Simulation results show that significant complexity reduction can be achieved. Consequently, U-UV codes can outperform cyclic redundancy check (CRC)-polar codes with similar decoding complexity.
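The order-0 step of OSD, sorting positions by reliability and hard-deciding the most reliable ones, can be sketched as follows (illustrative LLR values and the common sign convention LLR < 0 means bit 1; the skipping rule and BMA are not reproduced):

```python
def most_reliable_basis(llrs, k):
    """Order positions by reliability |LLR| (descending) and hard-decide
    the k most reliable ones -- the order-0 step of OSD, before any
    reprocessing of test-error patterns."""
    order = sorted(range(len(llrs)), key=lambda i: -abs(llrs[i]))
    hard = {i: int(llrs[i] < 0) for i in order[:k]}  # LLR < 0 -> bit 1
    return order[:k], hard

llrs = [0.2, -3.1, 1.5, -0.4, 2.7]
pos, bits = most_reliable_basis(llrs, 3)
print(pos, bits)  # [1, 4, 2] {1: 1, 4: 0, 2: 0}
```

Higher-order OSD then flips small subsets of these reliable bits and re-encodes; the skipping rule in the abstract avoids those extra orders when they provably cannot beat the best candidate found so far.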
Funding: Project supported by the Natural Science Foundation of Shandong Province, China (Grant No. ZR2021MF049), the Joint Fund of the Natural Science Foundation of Shandong Province, China (Grant Nos. ZR2022LLZ012 and ZR2021LLZ001), and the Key Research and Development Program of Shandong Province, China (Grant No. 2023CXGC010901).
Abstract: Quantum error-correcting codes are essential for fault-tolerant quantum computing, as they effectively detect and correct noise-induced errors by distributing information across multiple physical qubits. The subsystem surface code with three-qubit check operators demonstrates significant application potential due to its simplified measurement operations and low logical error rates. However, the existing minimum-weight perfect matching (MWPM) algorithm exhibits high computational complexity and lacks flexibility in large-scale systems. This paper therefore proposes a decoder based on a graph attention network (GAT), representing error syndromes as undirected graphs with edge weights and employing a multi-head attention mechanism to efficiently aggregate node features and enable parallel computation. Compared with MWPM, the GAT decoder exhibits linear growth in computational complexity, adapts to different quantum-code structures, and demonstrates stronger robustness under high physical error rates. The experimental results show that the proposed decoder achieves an overall accuracy of 89.95% across small code lattice sizes (L = 2, 3, 4, 5), with the logical error-rate threshold increasing to 0.0078, an improvement of approximately 13.04% over the MWPM decoder. This result significantly outperforms traditional methods, showcasing superior performance at small code lattice sizes and providing a more efficient decoding solution for large-scale quantum error correction.
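The attention aggregation at the heart of a GAT can be sketched for a single head with a plain dot-product score (a simplification: real GAT layers use learned linear maps and a LeakyReLU scoring function, and run several heads in parallel):

```python
import math

def softmax(xs):
    """Numerically stable softmax over a list of scores."""
    m = max(xs)
    es = [math.exp(x - m) for x in xs]
    s = sum(es)
    return [e / s for e in es]

def attention_aggregate(node, neighbors, score):
    """Aggregate neighbor features into a new node feature as the
    softmax-weighted sum of neighbor vectors (one attention head)."""
    w = softmax([score(node, nb) for nb in neighbors])
    dim = len(neighbors[0])
    return [sum(wi * nb[d] for wi, nb in zip(w, neighbors))
            for d in range(dim)]

dot = lambda a, b: sum(x * y for x, y in zip(a, b))
agg = attention_aggregate([1.0, 0.0], [[1.0, 0.0], [0.0, 1.0]], dot)
print(agg)  # leans toward the similar neighbor: agg[0] > agg[1]
```

In the syndrome-decoding setting, the nodes are syndrome defects and the edge weights bias the attention, so a node attends most to the defects it is most likely matched with.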
Abstract: The traditional Feng Shui pattern embodies rich ecological wisdom and philosophical thought, which are of great significance to modern sustainable spatial design. The core concepts of Feng Shui patterns from traditional civilization provide a theoretical foundation and research framework for this study. By integrating principles such as "hiding the wind and gathering the Qi" and "backing the mountain and facing the water", a functional relationship between urban structures can be established. This approach can help optimize the spatial layout of urban elements, minimize energy consumption, and enhance environmental comfort. The paper also examines the influence of the ShanShui City pattern in traditional Feng Shui on the development of modern urban ecological networks, as well as its role in protecting and restoring biodiversity through ecological corridors and ecological nodes. Modern urban design informed by traditional Feng Shui culture focuses on inheriting and innovating these traditions and on combining traditional Feng Shui concepts with modern design concepts to form ecological spaces with cultural connotation. This paper hopes to offer inspiration and methods for contemporary urban design and to reconcile the relationship between humans and nature through these ideas.
Funding: Funded by the Deanship of Graduate Studies and Scientific Research at Jouf University under grant No. DGSSR-2024-02-02160.
Abstract: In this paper, we propose a hybrid decode-and-forward and soft information relaying (HDFSIR) strategy to mitigate error propagation in coded cooperative communications. In the HDFSIR approach, the relay operates in decode-and-forward (DF) mode when it successfully decodes the received message; otherwise, it switches to soft information relaying (SIR) mode. The benefits of the DF and SIR forwarding strategies are combined to achieve better performance than deploying either strategy alone. Closed-form expressions for the outage probability and symbol error rate (SER) are derived for coded cooperative communication with HDFSIR and energy-harvesting relays. Additionally, we introduce a novel normalized log-likelihood-ratio-based soft estimation symbol (NL-SES) mapping technique, which enhances soft-symbol accuracy for higher-order modulation, and propose a model characterizing the relationship between the estimated complex soft symbol and the actual high-order modulated symbol. Furthermore, the hybrid DF-SIR strategy is extended to a distributed Alamouti space-time-coded cooperative network. To evaluate the performance of the proposed HDFSIR strategy, we implement extensive Monte Carlo simulations under varying channel conditions. The results demonstrate significant improvements, with the hybrid technique outperforming the individual DF and SIR strategies in both conventional and distributed Alamouti space-time-coded cooperative networks. Moreover, at an SER of 10^(-3), the proposed NL-SES mapping demonstrates a 3.5 dB performance gain over the conventional averaging approach, highlighting its superior accuracy in estimating soft symbols for quadrature phase-shift keying modulation.
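For BPSK, the textbook soft-symbol estimate underlying SIR is the conditional mean E[s | LLR] = tanh(LLR / 2), with the convention that bit 0 maps to +1 and LLR = log(P(b = 0)/P(b = 1)); the NL-SES mapping in the abstract generalizes this idea to higher-order constellations. A minimal sketch of the BPSK case only:

```python
import math

def soft_symbol(llr):
    """Soft estimate of a BPSK symbol (+1 for bit 0, -1 for bit 1)
    from its LLR = log(P(b=0)/P(b=1)): E[s | LLR] = tanh(llr / 2)."""
    return math.tanh(llr / 2)

# Confident LLRs give near-hard symbols; llr = 0 carries no information.
print([round(soft_symbol(l), 3) for l in (-8.0, 0.0, 8.0)])
# -> [-0.999, 0.0, 0.999]
```

Forwarding tanh-shaped soft symbols instead of hard decisions lets the destination weight the relay's contribution by its reliability, which is why SIR limits error propagation when the relay fails to decode.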