Iced transmission line galloping poses a significant threat to the safety and reliability of power systems, leading directly to line tripping, disconnections, and power outages. Existing early warning methods for iced transmission line galloping suffer from issues such as reliance on a single data source, neglect of irregular time series, and lack of attention-based closed-loop feedback, resulting in high rates of missed and false alarms. To address these challenges, we propose an Internet of Things (IoT) empowered early warning method for transmission line galloping that integrates time series data from optical fiber sensing and weather forecasts. Initially, the method applies a primary adaptive weighted fusion to the IoT-empowered optical fiber real-time sensing data and weather forecast data, followed by a secondary fusion based on a Back Propagation (BP) neural network, and uses the K-medoids algorithm for clustering the fused data. Furthermore, an adaptive irregular time series perception adjustment module is introduced into the traditional Gated Recurrent Unit (GRU) network, and closed-loop feedback based on an attention mechanism is employed to update network parameters through gradient feedback of the loss function, enabling closed-loop training and time series prediction by the GRU network model. Subsequently, considering the various types of prediction data and the duration of icing, an iced transmission line galloping risk coefficient is established, and warnings are categorized based on this coefficient. Finally, using an IoT-driven realistic dataset of iced transmission line galloping, the effectiveness of the proposed method is validated through multi-dimensional simulation scenarios.
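As a concrete illustration of the first fusion-and-clustering stage, the sketch below combines two time-aligned series with inverse-variance weights and clusters the fused values with a minimal K-medoids loop. The inverse-variance weighting rule, the function names, and the 1-D setting are assumptions for illustration only; the abstract does not give the paper's exact fusion rule.

```python
import numpy as np

def adaptive_weighted_fusion(sensor, forecast):
    # Inverse-variance weights: the lower-variance source gets more weight.
    # This specific weighting form is an assumption, not the paper's rule.
    v_s, v_f = np.var(sensor), np.var(forecast)
    w_s = v_f / (v_s + v_f)
    w_f = v_s / (v_s + v_f)
    return w_s * sensor + w_f * forecast

def k_medoids(points, k, iters=50):
    # Minimal PAM-style K-medoids on 1-D fused data,
    # initialized with evenly spaced order statistics.
    pts_sorted = np.sort(points)
    medoids = pts_sorted[np.linspace(0, points.size - 1, k).astype(int)]
    for _ in range(iters):
        # Assign each point to its nearest medoid.
        labels = np.argmin(np.abs(points[:, None] - medoids[None, :]), axis=1)
        new_medoids = medoids.copy()
        for j in range(k):
            cluster = points[labels == j]
            if cluster.size:
                # Medoid = cluster member minimizing total in-cluster distance.
                costs = np.abs(cluster[:, None] - cluster[None, :]).sum(axis=1)
                new_medoids[j] = cluster[np.argmin(costs)]
        if np.array_equal(new_medoids, medoids):
            break
        medoids = new_medoids
    return np.sort(medoids)
```

In a real pipeline the fused values would then feed the BP-network secondary fusion; here they go straight to clustering.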
It is shown that time asymmetry is essential for deriving thermodynamic law and arises from the turnover of energy while reducing its information content and driving entropy increase. A dynamically interpreted principle of least action enables time asymmetry and time flow as a generation of action and redefines useful energy as an information system which implements a form of acting information. This is demonstrated using a basic formula, originally applied to time symmetry/energy conservation considerations, relating time asymmetry (which is conventionally denied but here expressly allowed) to energy behaviour. The derived results show that a dynamic energy drives time asymmetry by decreasing the information content of useful energy, thus generating action and entropy increase and explaining action-time as an information phenomenon. Thermodynamic laws follow directly. The formalism derived readily explains what energy is, why it is conserved (1st law of thermodynamics), why entropy increases (2nd law), and how maximum entropy production within the restraints of the system controls self-organized processes of non-linear irreversible thermodynamics. The general significance of the principle of least action arises from its role of controlling the action-generating oriented time of nature. These results contrast with the present understanding of time neutrality and clock-time, which are here considered a source of paradoxes, intellectual contradictions, and dead-end roads in models explaining nature and the universe.
The acquisition of neutron time spectrum data plays a pivotal role in the precise quantification of uranium via prompt fission neutron uranium logging (PFNUL). However, the detector dead-time effect remains a paramount obstacle to the accurate acquisition of the neutron time spectrum. It is therefore imperative for neutron logging instruments to have a dead-time correction method that is not only uncomplicated but also practical and suited to various logging sites. This study formulates an innovative equation for determining dead time and introduces a dead-time correction method for the neutron time spectrum, called the "dual flux method." Using this approach, a logging instrument captures two neutron time spectra under disparate neutron fluxes. By carefully selecting specific "windows" on the neutron time spectrum, the dead time can be accurately ascertained. To substantiate its efficacy and discern the influencing factors, experiments were conducted using a deuterium-tritium (D-T) neutron source, a helium-3 (³He) detector, and polyethylene shielding to collect and analyze the neutron time spectrum under varying neutron fluxes (at high voltages). The findings underscore that the "height" (f_d) and "spacing" (t_wd) of the two windows are the most pivotal influencing factors: f_d should surpass 2, and t_wd should exceed 200 μs. The dead time of the ³He detector determined in the experiment was 7.35 μs. After the dead-time correction, the deviation of the decay coefficients from the theoretical values for the neutron time spectrum under varying neutron fluxes decreased from 12.4% to within 5%. Similarly, for the PFNUL instrument, the deviation in the decay coefficients decreased from 22.94% to 0.49% after correcting for the dead-time effect. These results demonstrate the exceptional efficacy of the proposed method in ensuring precise uranium quantification. The dual flux method was experimentally validated as a universal approach applicable to pulsed neutron logging instruments and holds immense significance for uranium exploration.
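The dual-flux idea can be sketched with the standard non-paralyzable dead-time model: counting the same window under two fluxes whose true ratio f_d is known gives two equations in the single unknown τ, which then has a closed form. The formula below follows from that textbook model and is an illustrative assumption; the paper's own dead-time equation may differ.

```python
def corrected_rate(m, tau):
    # Non-paralyzable dead-time correction: true rate n = m / (1 - m*tau),
    # where m is the measured rate and tau the dead time (seconds).
    return m / (1.0 - m * tau)

def dead_time_from_dual_flux(m1, m2, fd):
    # Measured rates m1, m2 in the same window under two fluxes with
    # true-rate ratio fd.  Solving m2/(1 - m2*tau) = fd * m1/(1 - m1*tau)
    # for tau gives the closed form below.
    return (fd * m1 - m2) / (m1 * m2 * (fd - 1.0))
```

With simulated measured rates generated from a known τ of 7.35 μs (the value reported above), the formula recovers τ exactly in the noiseless case.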
The time-varying periodic variations at Global Navigation Satellite System (GNSS) stations affect reliable time series analysis and appropriate geophysical interpretation. In this study, we apply the singular spectrum analysis (SSA) method to characterize and interpret the periodic patterns of GNSS deformations in China using multiple geodetic datasets. These include 23-year observations from the Crustal Movement Observation Network of China (CMONOC), displacements inferred from the Gravity Recovery and Climate Experiment (GRACE), and loadings derived from geophysical models (GM). The results reveal that all CMONOC time series exhibit seasonal signals characterized by amplitude and phase modulations, and the SSA method outperforms the traditional least squares fitting (LSF) method in extracting and interpreting the time-varying seasonal signals from the original time series. The decrease in the root mean square (RMS) correlates well with the annual cycle variance estimated by the SSA method, and the average reduction in noise amplitudes is nearly twice as large for SSA-filtered results as for those from the LSF method. With SSA analysis, the time-varying seasonal signals for all the selected stations can be identified in the reconstructed components corresponding to the first ten eigenvalues. Moreover, both RMS reduction and correlation analysis imply the advantages of GRACE solutions in explaining the GNSS periodic variations: the geophysical effects can account for 71% of the GNSS annual amplitudes, with an average RMS reduction of 15%. The SSA method has proved useful for investigating GNSS time-varying seasonal signals and could serve as an auxiliary tool for improving investigations of nonlinear variations.
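The SSA extraction step described above can be sketched with the standard embed–SVD–diagonal-average pipeline: build a Hankel trajectory matrix from the series, keep the leading singular components, and average anti-diagonals back into a 1-D series. The window length and component count below are illustrative choices, not the study's settings.

```python
import numpy as np

def ssa_reconstruct(x, window, n_components):
    # Embed the series into a Hankel trajectory matrix (window x K).
    N = x.size
    K = N - window + 1
    X = np.column_stack([x[i:i + window] for i in range(K)])
    # SVD and rank-truncated reconstruction of the trajectory matrix.
    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    Xr = (U[:, :n_components] * s[:n_components]) @ Vt[:n_components]
    # Diagonal (Hankel) averaging back to a 1-D series.
    recon = np.zeros(N)
    counts = np.zeros(N)
    for j in range(K):
        recon[j:j + window] += Xr[:, j]
        counts[j:j + window] += 1
    return recon / counts
```

On a noisy sinusoid, the two leading components recover the seasonal signal closely, which is the mechanism behind the RMS reductions reported above.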
This article broadens terminology and approaches that continue to advance time modelling within a relationalist framework. Time is modelled as a single dimension, flowing continuously through independent privileged points. Introduced as absolute point-time, abstract continuous time is a backdrop for concrete relational-based time that is finite and discrete, bound to the limits of a real-world system. We discuss how discrete signals at a point are used to temporally anchor zero-temporal points [t = 0] in linear time. Object-oriented temporal line elements, flanked by temporal point elements, have a proportional geometric identity quantifiable by a standard unit system and can be mapped onto a natural number line. Durations, as line elements, are divisible into ordered unit-ratio elements using ancient timekeeping formulas. The divisional structure provides temporal classes for rotational (Rt24t) and orbital (Rt18) sample periods, as well as a more general temporal class (Rt12) applicable to either sample or frame periods. We introduce notation for additive cyclic counts of sample periods, including divisional units, for calendar-like formatting. For system modelling, unit structures with dihedral symmetry, group order, and numerical order are shown to be applicable to Euclidean modelling. We introduce new functions for bijective and non-bijective mapping, modular arithmetic for cyclic-based time counts, and a novel formula relating to a subgroup of Pythagorean triples, preserving dihedral n-polygon symmetries. This article thus presents a new approach to modelling time in a relationalist framework.
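The additive cyclic counting of sample periods mentioned above amounts to modular arithmetic; a minimal sketch, where the function names and the calendar-style tuple layout are illustrative assumptions rather than the article's notation:

```python
def cyclic_count(samples, period):
    # Split a running sample count into completed cycles and the
    # position (phase) within the current cycle: a modular count.
    cycles, phase = divmod(samples, period)
    return cycles, phase

def calendar_format(samples, periods):
    # Nested cyclic counts, most significant unit first,
    # e.g. periods = [365, 24] -> (years, day-in-year, hour-in-day).
    out = []
    for p in reversed(periods):
        samples, phase = divmod(samples, p)
        out.append(phase)
    out.append(samples)
    return tuple(reversed(out))
```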
This study identified castor oil and phosphate ester as effective retarders through setting time, tensile, and flexural tests, and determined their optimal dosages. The mechanism by which phosphate ester affects the setting time of polyurethane was further investigated using molecular dynamics simulations. Fourier transform infrared spectroscopy was also employed to systematically study the physical and chemical interactions between phosphate esters and polyurethane materials. The results demonstrate that a 1% concentration of phosphate ester provides the most effective retarding effect with minimal impact on the strength of the polyurethane. When phosphate ester is added to the B component of the two-component polyurethane system, its interaction energy with component A decreases, as do the diffusion coefficient and the degree of aggregation of component B on the surface of component A. This reduction in interaction slows setting. Additionally, the addition of phosphate ester to polyurethane leads to the disappearance or weakening of certain functional groups, indicating competitive interactions within the phosphate ester components that inhibit the reaction rate.
Multivariate time series forecasting is widely used in traffic planning, weather forecasting, and energy consumption. Series decomposition algorithms can help models better understand the underlying patterns of the original series and improve the forecasting accuracy of multivariate time series. However, the decomposition kernel of previous decomposition-based models is fixed, and these models have not considered the differences in frequency fluctuations between components. These problems make it difficult to analyze the intricate temporal variations of real-world time series. In this paper, we propose a series decomposition-based Mamba model, DecMamba, to capture the intricate temporal dependencies and the dependencies among different variables of multivariate time series. A variable-level adaptive kernel combination search module is designed to exchange information on the different trends and periods between variables. Two backbone structures are proposed to emphasize the differences in frequency fluctuations of the seasonal and trend components. Mamba, with its superior performance, is used instead of a Transformer in the backbone structures to capture the dependencies among different variables. A new embedding block is designed to better capture temporal features, especially for the high-frequency seasonal component, whose semantic information is difficult to acquire. A gating mechanism is introduced into the decoder of the seasonal backbone to improve prediction accuracy. A comparison with ten state-of-the-art models on seven real-world datasets demonstrates that DecMamba better models the temporal dependencies and the dependencies among different variables, guaranteeing better prediction performance for multivariate time series.
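The trend–seasonal split that decomposition-based forecasters start from can be sketched with a single moving-average kernel; the fixed kernel below is exactly what DecMamba's adaptive kernel-combination search generalizes. The padding choice and function name are illustrative assumptions.

```python
import numpy as np

def decompose(x, kernel_size):
    # Moving-average trend extraction with edge padding; the seasonal
    # component is defined as the residual, so trend + seasonal == x.
    pad = kernel_size // 2
    xp = np.pad(x, (pad, kernel_size - 1 - pad), mode="edge")
    kernel = np.ones(kernel_size) / kernel_size
    trend = np.convolve(xp, kernel, mode="valid")
    seasonal = x - trend
    return trend, seasonal
```

A model then forecasts the smooth trend and the high-frequency seasonal residual with separate backbones, as the abstract describes.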
Several promising plasma biomarker proteins, such as amyloid-β (Aβ), tau, neurofilament light chain, and glial fibrillary acidic protein, are widely used for the diagnosis of neurodegenerative diseases. However, little is known about the long-term stability of these biomarker proteins in plasma samples stored at −80°C. We aimed to explore how storage time affects the diagnostic accuracy of these biomarkers using a large cohort. Plasma samples from 229 cognitively unimpaired individuals, encompassing healthy controls and those experiencing subjective cognitive decline, as well as 99 patients with cognitive impairment, comprising those with mild cognitive impairment and dementia, were acquired from the Sino Longitudinal Study on Cognitive Decline project. These samples had been stored at −80°C for up to 6 years before being used in this study. Our results showed that plasma levels of Aβ42, Aβ40, neurofilament light chain, and glial fibrillary acidic protein were not significantly correlated with sample storage time. However, the level of total tau showed a negative correlation with sample storage time. Notably, in individuals without cognitive impairment, plasma levels of total tau and tau phosphorylated at threonine 181 (p-tau181) also showed a negative correlation with sample storage time; this was not observed in individuals with cognitive impairment. Consequently, we speculate that the diagnostic accuracy of plasma p-tau181 and the p-tau181 to total tau ratio may be influenced by sample storage time. Therefore, caution is advised when using these plasma biomarkers for the identification of neurodegenerative diseases such as Alzheimer's disease. Furthermore, in cohort studies, it is important to consider the impact of storage time on the overall results.
BACKGROUND: Meniscal tears are among the most common knee injuries. After a meniscal tear has been diagnosed, physicians use several factors to guide clinical decision-making. The influence of the time between injury and isolated meniscus repair on patient outcomes is not well described. Assessing this relationship is important, as it may influence clinical decision-making and can add to the preoperative patient education process. We hypothesized that increasing the time from injury to meniscus surgery would worsen postoperative outcomes. AIM: To investigate the current literature for data on the relationship between the time from meniscus injury to repair and patient outcomes. METHODS: PubMed, Academic Search Complete, MEDLINE, CINAHL, and SPORTDiscus were searched for studies published between January 1, 1995 and July 13, 2023 on isolated meniscus repair. Exclusion criteria included concomitant ligament surgery, incomplete outcomes or time-to-surgery data, and meniscectomies. Patient demographics, time to injury, and postoperative outcomes from each study were abstracted and analyzed. RESULTS: Five studies met all inclusion and exclusion criteria, comprising 204 patients (121 male, 83 female). Three of the five studies (60%) found that the time between injury and surgery was not statistically significant for postoperative Lysholm scores (P = 0.62), Tegner scores (P = 0.46), failure rate (P = 0.45, P = 0.86), or International Knee Documentation Committee scores (P = 0.65). Two of the five studies (40%) found a statistically significant increase in Lysholm scores with shorter time to surgery (P = 0.03) and a statistically significant association between progression of the medial meniscus extrusion ratio (P = 0.01) and increasing time to surgery. CONCLUSION: Our results do not support the hypothesis that increased time from injury to isolated meniscus surgery worsens postoperative outcomes. Decision-making based primarily on the injury interval is thus not recommended.
Time series forecasting is essential for generating predictive insights across various domains, including healthcare, finance, and energy. This study focuses on forecasting patient health data by comparing the performance of traditional linear time series models, namely Autoregressive Integrated Moving Average (ARIMA), Seasonal ARIMA (SARIMA), and Moving Average (MA) models, against neural network architectures. The primary goal is to evaluate the effectiveness of these models in predicting healthcare outcomes using patient records, specifically the Cancerpatient.xlsx dataset, which tracks variables such as patient age, symptoms, genetic risk factors, and environmental exposures over time. The proposed strategy involves training each model on historical patient data to predict age progression and other related health indicators, with performance evaluated using Mean Squared Error (MSE) and Root Mean Squared Error (RMSE) metrics. Our findings reveal that neural networks consistently outperform ARIMA and SARIMA by capturing non-linear patterns and complex temporal dependencies within the dataset, resulting in lower forecasting errors. This research highlights the potential of neural networks to enhance predictive accuracy in healthcare applications, supporting better resource allocation, patient monitoring, and long-term health outcome predictions.
This paper presents a comparative study of ARIMA and Neural Network AutoRegressive (NNAR) models for time series forecasting. The study focuses on simulated data generated from an ARIMA(1, 1, 0) process and applies both models for training and forecasting. Model performance is evaluated using MSE, AIC, and BIC. The models are then applied to neonatal mortality data from Saudi Arabia to assess their predictive capabilities. The results indicate that the NNAR model outperforms ARIMA in both training and forecasting.
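The simulated setting above is easy to reproduce: ARIMA(1, 1, 0) data is an AR(1) process on first differences, so simulating it, fitting φ by least squares on the differenced series, and issuing one-step forecasts gives the linear baseline of the comparison. This sketch covers only the ARIMA side; the NNAR model and the MSE/AIC/BIC bookkeeping are omitted.

```python
import numpy as np

def simulate_arima_1_1_0(n, phi, sigma=1.0, seed=0):
    # ARIMA(1,1,0): the first difference follows AR(1) with coefficient phi.
    rng = np.random.default_rng(seed)
    d = np.zeros(n)
    for t in range(1, n):
        d[t] = phi * d[t - 1] + rng.normal(0.0, sigma)
    return np.cumsum(d)  # integrate the differences back to levels

def fit_ar1_on_diff(x):
    # Least-squares estimate of phi from the differenced series.
    d = np.diff(x)
    return float(np.dot(d[:-1], d[1:]) / np.dot(d[:-1], d[:-1]))

def one_step_forecast(x, phi):
    # x_{t+1|t} = x_t + phi * (x_t - x_{t-1})
    return x[-1] + phi * (x[-1] - x[-2])
```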
In the present paper, we study the finite-time-domain dynamics of a scalar field interacting with external sources. We expand both the scalar field and the corresponding Hamiltonian in annihilation and creation operators and evaluate the relevant path integral, obtaining the Green function within a finite time interval. We apply the solution to the relevant Cauchy problem and further study the dynamics of scalar fields coupled to electromagnetic fields via perturbative methods.
This paper presents an optimized strategy for multiple integrations of photovoltaic distributed generation (PV-DG) within radial distribution power systems. The proposed methodology focuses on identifying the optimal allocation and sizing of multiple PV-DG units to minimize power losses, using a probabilistic PV model and time-series power flow analysis. Addressing the uncertainties in PV output due to weather variability and diurnal cycles is critical, and a probabilistic assessment offers a more robust analysis of DG integration's impact on the grid, potentially leading to more reliable system planning. The presented approach employs a genetic algorithm (GA) together with a deterministic PV output profile and a probabilistic PV generation profile based on one year of experimental solar radiation measurements in Cairo, Egypt. The proposed algorithms are validated using a co-simulation framework that integrates MATLAB and OpenDSS, enabling analysis on a 33-bus test system. This framework can serve as a guideline for creating other co-simulation algorithms to enhance computing platforms for modern distribution systems within the smart grid concept. The paper presents comparisons with previous research studies and several noteworthy findings, for example, that the choice of hours considered when developing the probabilistic model yields different results.
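A toy version of the search loop: a GA over candidate (bus, size) pairs, with a surrogate loss standing in for the MATLAB/OpenDSS time-series power-flow evaluation. The encoding, operators, and parameters here are illustrative assumptions, not the paper's configuration.

```python
import random

def ga_minimize(loss, buses, size_range, pop=30, gens=60, seed=1):
    # Elitist GA over (bus index, DG size in MW).  `loss` would be a
    # power-loss evaluation in the real setting; here it is a surrogate.
    rng = random.Random(seed)
    lo, hi = size_range
    popn = [(rng.choice(buses), rng.uniform(lo, hi)) for _ in range(pop)]
    for _ in range(gens):
        popn.sort(key=loss)          # best candidates first
        elite = popn[: pop // 2]     # elitism: top half survives
        children = []
        while len(elite) + len(children) < pop:
            a, b = rng.sample(elite, 2)
            bus = a[0] if rng.random() < 0.5 else b[0]       # crossover
            size = 0.5 * (a[1] + b[1]) + rng.gauss(0, 0.1)   # blend + mutate
            size = min(max(size, lo), hi)
            if rng.random() < 0.2:
                bus = rng.choice(buses)                      # bus mutation
            children.append((bus, size))
        popn = elite + children
    return min(popn, key=loss)
```

On a quadratic surrogate with a known optimum, the loop converges to the right bus and size, which is all the real framework asks of the optimizer.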
This paper aims to define the concept of time and justify its properties within the universal context, shedding new light on the nature of time. By employing the concept of the extrinsic universe, the paper explains the observable universe as the three-dimensional surface of a four-dimensional 3-sphere (hypersphere) expanding at the speed of light. This expansion process gives rise to what we perceive as time and its associated aspects, providing a novel interpretation of time as a geometric property emerging from the dynamics of the universe's expansion. The work offers insights into how this extrinsic perspective can address phenomena such as the universe's accelerated expansion and dark matter, aligning the model with current observational data.
In this article, a finite volume element algorithm is presented and discussed for the numerical solution of a time-fractional nonlinear fourth-order diffusion equation with time delay. By choosing the second-order spatial derivative of the original unknown as an additional variable, the fourth-order problem is transformed into a second-order system. The fully discrete finite volume element scheme is then formulated by using the L1 approximation for the temporal Caputo derivative and the finite volume element method in the spatial direction. The unique solvability and stability of the proposed scheme are proved. An a priori estimate in the L2-norm with optimal convergence order O(h^2 + τ^(2−α)), where τ and h are the time step length and the space mesh parameter, respectively, is obtained. The efficiency of the scheme is supported by numerical experiments.
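For reference, the standard L1 discretization of the Caputo derivative of order α ∈ (0, 1) on a uniform grid t_n = nτ, which is what underlies the stated O(h^2 + τ^(2−α)) bound (a sketch of the usual formula; the paper's exact scheme may differ in details):

```latex
\partial_t^{\alpha} u(t_n) \approx \frac{\tau^{-\alpha}}{\Gamma(2-\alpha)}
\sum_{k=0}^{n-1} b_k \,\bigl(u^{\,n-k} - u^{\,n-k-1}\bigr),
\qquad b_k = (k+1)^{1-\alpha} - k^{1-\alpha},
```

whose truncation error is O(τ^(2−α)), consistent with the stated temporal convergence order.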
A performer breathes fire during Chinese New Year celebrations in the Binondo district, considered the world's oldest Chinatown, on January 29 in Manila, Philippines. The celebrations, lasting approximately 15 days, were filled with traditional activities such as family gatherings, lion dances, and the exchange of red envelopes, joining the vibrant cultural event observed by Chinese communities worldwide.
Working toward an efficient duration and timeline for the preconstruction phase should be one of the main objectives for project owners. Failing to plan for and coordinate preconstruction decisions in order to control preconstruction duration and manage time variances can lead to financial insecurity, incomplete contract documents, permitting issues, and unrealistic schedules and resource allocation during this phase. To minimize time variances and ensure a productive decision-making process, project owners should be familiar with the critical elements of a project that cause variances in the preconstruction timeline. In this study, the impacts of eleven critical preconstruction elements on time variances were analyzed. These eleven elements were deemed critical either because they significantly impacted time variance during the preconstruction phase or because findings from previous studies suggested they were critical, even where this study showed no significant impact on time variances. Whereas most previous studies of the elements impacting project schedules collected data by surveying construction professionals, this study used objective and quantitative data related to project preconstruction elements rather than self-reported data. Using the results of this study, project owners and stakeholders will be able to evaluate the critical preconstruction elements impacting the timing of their projects and prioritize decisions related to those elements early in the preconstruction phase.
To quantify physicians' risk and time preferences and explore the association between these preferences and their willingness to detect and disclose secondary findings (SFs) derived from genome-scale sequencing, we designed a web-based survey incorporating a multiple price list (MPL) as the instrument for risk and time preference measurement. The estimation was performed under the expected utility theory (EUT) and rank-dependent utility (RDU) frameworks, applying the isoelastic and power-function utility models. We received responses from 87 physicians, among whom 46 completed the questionnaire (a completion rate of 52.9%). We observed positive risk-aversion coefficients under EUT (0.33, 95% CI 0.15-0.51) and RDU (0.51, 95% CI 0.32-0.71), suggesting that physicians were generally risk-averse. Assuming heterogeneous risk perception, we found that respondents tended to underestimate low to moderate probabilities and slightly overestimate high probabilities. Physicians supporting the detection and disclosure of SFs had a larger risk-aversion coefficient and a smaller discounting parameter than non-supporters, suggesting that they were more risk-averse and discounted future utility less. This study indicates that physicians who are risk-averse and discount future utility only slightly are willing to detect and return SFs. The findings contribute to the debate surrounding SF disclosure and have implications for shared decision-making in clinical genome-scale sequencing.
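The isoelastic (CRRA) utility used above, paired with a bisection-based certainty equivalent for a two-outcome EUT lottery, illustrates how a risk-aversion coefficient r maps onto MPL-style choices. The lottery amounts are illustrative, and the RDU probability-weighting step is omitted.

```python
import math

def isoelastic_utility(x, r):
    # CRRA utility: u(x) = (x**(1-r) - 1) / (1-r); log utility at r = 1.
    if abs(r - 1.0) < 1e-12:
        return math.log(x)
    return (x ** (1.0 - r) - 1.0) / (1.0 - r)

def certainty_equivalent(p, x_hi, x_lo, r, tol=1e-10):
    # Bisection for the sure amount with the same expected utility as
    # the lottery (x_hi with probability p, else x_lo) under EUT.
    eu = p * isoelastic_utility(x_hi, r) + (1 - p) * isoelastic_utility(x_lo, r)
    lo, hi = x_lo, x_hi
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if isoelastic_utility(mid, r) < eu:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)
```

For a risk-neutral agent (r = 0) the certainty equivalent equals the expected value; for the EUT coefficient reported above (r ≈ 0.33) it falls below it, which is the behavioural signature of risk aversion in an MPL.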
We experimentally analyze the effect of optical power on time delay signature identification and random bit generation in a chaotic semiconductor laser with optical feedback. Because of the inevitable noise introduced during photoelectric detection and analog-to-digital conversion, varying the output optical power changes the signal-to-noise ratio, which in turn affects time delay signature identification and random bit generation. Our results show that when the optical power is less than −14 dBm, the identified time delay signature degrades and the entropy of the chaotic signal increases as the optical power decreases. Moreover, random bit sequences extracted at lower optical power pass the randomness tests more easily.
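Time delay signature identification is typically done by locating the secondary peak of the intensity autocorrelation; a minimal sketch of that step on a surrogate delayed-feedback series (the actual detection chain, noise model, and any thresholds are not specified in the abstract):

```python
import numpy as np

def time_delay_signature(x, max_lag):
    # Autocorrelation of the intensity series; the time delay signature
    # appears as the dominant secondary peak at the feedback delay.
    x = x - x.mean()
    acf = np.array([np.dot(x[:-k], x[k:]) for k in range(1, max_lag + 1)])
    acf /= np.dot(x, x)
    lag = int(np.argmax(acf)) + 1
    return lag, float(acf[lag - 1])
```

Lower signal-to-noise ratio flattens this peak, which is the degradation mechanism the experiment reports below −14 dBm.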
Funding: This research was funded by the Science and Technology Project of State Grid Corporation of China under grant number 5200-202319382A-2-3-XG.
Funding: Supported by the National Natural Science Foundation of China (No. 42374226), the Jiangxi Provincial Natural Science Foundation (Nos. 20232BAB201043 and 20232BCJ23006), the Nuclear Energy Development Project (20201192-01), the National Key Laboratory of Uranium Resource Exploration-Mining and Nuclear Remote Sensing (ECUT) (2024QZ-TD-09), and the Fundamental Science on Radioactive Geology and Exploration Technology Laboratory (2022RGET20).
Abstract: The acquisition of neutron time spectrum data plays a pivotal role in the precise quantification of uranium via prompt fission neutron uranium logging (PFNUL). However, the detector dead-time effect remains paramount in the accurate acquisition of the neutron time spectrum. It is therefore imperative for neutron logging instruments to establish a dead-time correction method that is not only uncomplicated but also practical and suited to various logging sites. This study formulates an innovative equation for determining dead time and introduces a dead-time correction method for the neutron time spectrum, called the "dual flux method." Using this approach, a logging instrument captures two neutron time spectra under disparate neutron fluxes. By carefully selecting specific "windows" on the neutron time spectrum, the dead time can be accurately ascertained. To substantiate its efficacy and discern the influencing factors, experiments were conducted using a deuterium-tritium (D-T) neutron source, a helium-3 (3He) detector, and polyethylene shielding to collate and analyze the neutron time spectrum under varying neutron fluxes (at high voltages). The findings underscore that the "height" and "spacing" of the two windows are the most pivotal influencing factors: the "height" (fd) should surpass 2, and the "spacing" (twd) should exceed 200 μs. The dead time of the 3He detector determined in the experiment was 7.35 μs. After the dead-time correction, the deviation of the decay coefficients from the theoretical values for the neutron time spectrum under varying neutron fluxes decreased from 12.4% to within 5%. Similarly, for the PFNUL instrument, the deviation in the decay coefficients decreased from 22.94% to 0.49% after correcting for the dead-time effect. These results demonstrate the exceptional efficacy of the proposed method in ensuring precise uranium quantification. The dual flux method was experimentally validated as a universal approach applicable to pulsed neutron logging instruments and holds immense significance for uranium exploration.
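The paper's dual-flux equation is not reproduced in the abstract, but the standard non-paralyzable dead-time correction it relates to, n = m / (1 − m·τ), is easy to sketch. The measured rate below is hypothetical; the dead time is the 7.35 μs value reported above:

```python
def deadtime_correct(measured_rate_cps, dead_time_s):
    """Non-paralyzable dead-time correction: n = m / (1 - m * tau),
    where m is the observed count rate (counts/s), tau the detector
    dead time (s), and n the recovered true rate."""
    loss = measured_rate_cps * dead_time_s  # fraction of time the detector is blind
    if loss >= 1.0:
        raise ValueError("measured rate saturates the non-paralyzable model")
    return measured_rate_cps / (1.0 - loss)

tau = 7.35e-6      # dead time reported in the abstract (s)
m = 20000.0        # hypothetical measured count rate (counts/s)
n = deadtime_correct(m, tau)  # true rate, larger than the measured rate
```

At 20 kcps the detector is blind about 14.7% of the time, so the corrected rate is roughly 17% higher than the measured one; at low fluxes the correction becomes negligible, which is why the dual-flux comparison in the paper needs two distinctly different fluxes.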
Funding: Supported by the National Natural Science Foundation of China (Nos. 42104028, 42174030, and 42004017), the Open Fund of Hubei Luojia Laboratory (Nos. 220100048 and 230100021), the Scientific Research Project of the Hubei Provincial Department of Education, and the Research Foundation of the Department of Natural Resources of Hunan Province (No. 20230104CH).
Abstract: The time-varying periodic variations in Global Navigation Satellite System (GNSS) stations affect reliable time series analysis and appropriate geophysical interpretation. In this study, we apply the singular spectrum analysis (SSA) method to characterize and interpret the periodic patterns of GNSS deformations in China using multiple geodetic datasets. These include 23-year observations from the Crustal Movement Observation Network of China (CMONOC), displacements inferred from the Gravity Recovery and Climate Experiment (GRACE), and loadings derived from geophysical models (GM). The results reveal that all CMONOC time series exhibit seasonal signals characterized by amplitude and phase modulations, and the SSA method outperforms the traditional least squares fitting (LSF) method in extracting and interpreting the time-varying seasonal signals from the original time series. The decrease in the root mean square (RMS) correlates well with the annual cycle variance estimated by the SSA method, and the average reduction in noise amplitudes is nearly twice as large for SSA-filtered results as for those from the LSF method. With SSA analysis, the time-varying seasonal signals for all the selected stations can be identified in the reconstructed components corresponding to the first ten eigenvalues. Moreover, both RMS reduction and correlation analysis imply the advantages of GRACE solutions in explaining the GNSS periodic variations: the geophysical effects can account for 71% of the GNSS annual amplitudes, and the average RMS reduction is 15%. The SSA method has proved useful for investigating the GNSS time-varying seasonal signals and could serve as an auxiliary tool for improving investigations of nonlinear variations.
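The SSA pipeline described here — embedding into a trajectory matrix, SVD, grouping the leading components, and diagonal averaging — can be sketched as follows (NumPy assumed; the GNSS-like series, window length, and component count are made-up illustrations, not CMONOC data):

```python
import numpy as np

def ssa_reconstruct(x, window, n_components):
    """Basic singular spectrum analysis: embed the series into a
    Hankel trajectory matrix, take its SVD, keep the leading
    components, and diagonal-average back to a series."""
    N = len(x)
    K = N - window + 1
    X = np.column_stack([x[i:i + window] for i in range(K)])  # window x K
    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    Xr = sum(s[i] * np.outer(U[:, i], Vt[i]) for i in range(n_components))
    # Diagonal averaging (Hankelization) back to a length-N series
    out = np.zeros(N)
    cnt = np.zeros(N)
    for i in range(window):
        for j in range(K):
            out[i + j] += Xr[i, j]
            cnt[i + j] += 1
    return out / cnt

# Hypothetical GNSS-like daily series: annual cycle plus noise
t = np.arange(365)
series = (np.sin(2 * np.pi * t / 365.25)
          + 0.1 * np.random.default_rng(0).standard_normal(365))
smooth = ssa_reconstruct(series, window=60, n_components=2)
```

A slowly modulated annual signal shows up as a pair of leading singular components, which is why keeping two components here recovers the seasonal cycle; the LSF method, by contrast, would force a constant amplitude and phase.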
Abstract: This article broadens the terminology and approaches that continue to advance time modelling within a relationalist framework. Time is modelled as a single dimension, flowing continuously through independent privileged points. Introduced as absolute point-time, abstract continuous time is a backdrop for concrete relational-based time that is finite and discrete, bound to the limits of a real-world system. We discuss how discrete signals at a point are used to temporally anchor zero-temporal points [t = 0] in linear time. Object-oriented temporal line elements, flanked by temporal point elements, have a proportional geometric identity quantifiable by a standard unit system and can be mapped onto a natural number line. Durations, as line elements, are divisible into ordered unit-ratio elements using ancient timekeeping formulas. The divisional structure provides temporal classes for rotational (Rt24t) and orbital (Rt18) sample periods, as well as a more general temporal class (Rt12) applicable to either sample or frame periods. We introduce notation for additive cyclic counts of sample periods, including divisional units, for calendar-like formatting. For system modelling, unit structures with dihedral symmetry, group order, and numerical order are shown to be applicable to Euclidean modelling. We introduce new functions for bijective and non-bijective mapping, modular arithmetic for cyclic-based time counts, and a novel formula relating to a subgroup of Pythagorean triples that preserves dihedral n-polygon symmetries. In sum, this article presents a new approach to modelling time in a relationalist framework.
Funding: Funded by the National Natural Science Foundation of China (No. 52370128) and the Fundamental Research Funds for the Central Universities (No. 2572022AW54).
Abstract: This study identified castor oil and phosphate ester as effective retarders through setting time, tensile, and flexural tests, and determined their optimal dosages. The mechanism by which phosphate ester affects the setting time of polyurethane was further investigated using molecular dynamics simulations. Fourier transform infrared spectroscopy was also employed to systematically study the physical and chemical interactions between phosphate esters and polyurethane materials. The results demonstrate that a 1% concentration of phosphate ester provides the most effective retarding effect with minimal impact on the strength of the polyurethane. When phosphate ester is added to the B component of the two-component polyurethane system, its interaction energy with component A decreases, as do the diffusion coefficient and the degree of aggregation of component B on the surface of component A. This reduction in interaction extends the setting time. Additionally, the addition of phosphate ester to polyurethane leads to the disappearance or weakening of functional groups, indicating competitive interactions within the phosphate ester components that inhibit the reaction rate.
Funding: Supported in part by the Interdisciplinary Project of Dalian University (DLUXK-2023-ZD-001).
Abstract: Multivariate time series forecasting is widely used in traffic planning, weather forecasting, and energy consumption. Series decomposition algorithms can help models better understand the underlying patterns of the original series and thereby improve the forecasting accuracy of multivariate time series. However, the decomposition kernel of previous decomposition-based models is fixed, and these models have not considered the differences in frequency fluctuations between components. These problems make it difficult to analyze the intricate temporal variations of real-world time series. In this paper, we propose a series decomposition-based Mamba model, DecMamba, to capture the intricate temporal dependencies and the dependencies among different variables of multivariate time series. A variable-level adaptive kernel combination search module is designed to exchange information on the different trends and periods between variables. Two backbone structures are proposed to emphasize the differences in frequency fluctuations between the seasonal and trend components. Mamba, with its superior performance, is used instead of a Transformer in the backbone structures to capture the dependencies among different variables. A new embedding block is designed to better capture the temporal features, especially for the high-frequency seasonal component, whose semantic information is difficult to acquire. A gating mechanism is introduced into the decoder of the seasonal backbone to improve prediction accuracy. A comparison with ten state-of-the-art models on seven real-world datasets demonstrates that DecMamba better models the temporal dependencies and the dependencies among different variables, guaranteeing better prediction performance for multivariate time series.
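DecMamba's variable-level adaptive kernel search is not spelled out in this abstract, but the fixed-kernel trend/seasonal split that decomposition-based forecasters start from can be sketched (toy data; the kernel size and edge-padding scheme here are illustrative assumptions):

```python
def decompose(x, kernel):
    """Split a series into a trend (moving average with the given
    kernel size) and a seasonal/residual part (original minus trend),
    padding the ends by edge replication so lengths match."""
    half = kernel // 2
    padded = [x[0]] * half + list(x) + [x[-1]] * half
    trend = [sum(padded[i:i + kernel]) / kernel for i in range(len(x))]
    seasonal = [a - b for a, b in zip(x, trend)]
    return trend, seasonal

# Toy series with a repeating spike every 6 steps
series = [1, 2, 3, 10, 3, 2, 1, 2, 3, 10, 3, 2, 1]
trend, seasonal = decompose(series, kernel=5)
```

The two parts reconstruct the original exactly (trend + seasonal = series), which is the invariant an adaptive-kernel variant must also preserve while it searches over kernel sizes per variable.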
Funding: Supported by the National Key Research & Development Program of China (Nos. 2021YFC2501205 (to YC) and 2022YFC24069004 (to JL)), the STI2030-Major Project (Nos. 2021ZD0201101 (to YC) and 2022ZD0211800 (to YH)), the National Natural Science Foundation of China (Major International Joint Research Project, No. 82020108013 (to YH)), the Sino-German Center for Research Promotion (No. M-0759 (to YH)), and a grant from the Beijing Municipal Science & Technology Commission (Beijing Brain Initiative), No. Z201100005520018 (to JL).
Abstract: Several promising plasma biomarker proteins, such as amyloid-β (Aβ), tau, neurofilament light chain, and glial fibrillary acidic protein, are widely used for the diagnosis of neurodegenerative diseases. However, little is known about the long-term stability of these biomarker proteins in plasma samples stored at -80°C. We aimed to explore how storage time affects the diagnostic accuracy of these biomarkers using a large cohort. Plasma samples from 229 cognitively unimpaired individuals, encompassing healthy controls and those experiencing subjective cognitive decline, as well as 99 patients with cognitive impairment, comprising those with mild cognitive impairment and dementia, were acquired from the Sino Longitudinal Study on Cognitive Decline project. These samples were stored at -80°C for up to 6 years before being used in this study. Our results showed that plasma levels of Aβ42, Aβ40, neurofilament light chain, and glial fibrillary acidic protein were not significantly correlated with sample storage time. However, the level of total tau showed a negative correlation with sample storage time. Notably, in individuals without cognitive impairment, plasma levels of total tau and tau phosphorylated at threonine 181 (p-tau181) also showed a negative correlation with sample storage time. This was not observed in individuals with cognitive impairment. Consequently, we speculate that the diagnostic accuracy of plasma p-tau181 and the p-tau181 to total tau ratio may be influenced by sample storage time. Therefore, caution is advised when using these plasma biomarkers for the identification of neurodegenerative diseases such as Alzheimer's disease. Furthermore, in cohort studies, it is important to consider the impact of storage time on the overall results.
Abstract: BACKGROUND Meniscal tears are one of the most common knee injuries. After the diagnosis of a meniscal tear has been made, there are several factors physicians use to guide clinical decision-making. The influence of the time between injury and isolated meniscus repair on patient outcomes is not well described. Assessing this relationship is important, as it may influence clinical decision-making and can add to the preoperative patient education process. We hypothesized that increasing the time from injury to meniscus surgery would worsen postoperative outcomes. AIM To investigate the current literature for data on the relationship between the time from meniscus injury to repair and patient outcomes. METHODS PubMed, Academic Search Complete, MEDLINE, CINAHL, and SPORTDiscus were searched for studies published between January 1, 1995 and July 13, 2023 on isolated meniscus repair. Exclusion criteria included concomitant ligament surgery, incomplete outcomes or time-to-surgery data, and meniscectomies. Patient demographics, time to injury, and postoperative outcomes from each study were abstracted and analyzed. RESULTS Five studies met all inclusion and exclusion criteria. There were 204 (121 male, 83 female) patients included. Three of five (60%) studies determined that time between injury and surgery was not statistically significant for postoperative Lysholm scores (P = 0.62), Tegner scores (P = 0.46), failure rate (P = 0.45, P = 0.86), and International Knee Documentation Committee scores (P = 0.65). Two of five (40%) studies found a statistically significant increase in Lysholm scores with shorter time to surgery (P = 0.03) and a statistically significant association between progression of the medial meniscus extrusion ratio (P = 0.01) and increasing time to surgery. CONCLUSION Our results do not support the hypothesis that increased time from injury to isolated meniscus surgery worsens postoperative outcomes. Decision-making based primarily on the injury interval is thus not recommended.
Abstract: Time series forecasting is essential for generating predictive insights across various domains, including healthcare, finance, and energy. This study focuses on forecasting patient health data by comparing the performance of traditional linear time series models, namely Autoregressive Integrated Moving Average (ARIMA), Seasonal ARIMA (SARIMA), and Moving Average (MA), against neural network architectures. The primary goal is to evaluate the effectiveness of these models in predicting healthcare outcomes using patient records, specifically the Cancerpatient.xlsx dataset, which tracks variables such as patient age, symptoms, genetic risk factors, and environmental exposures over time. The proposed strategy involves training each model on historical patient data to predict age progression and other related health indicators, with performance evaluated using Mean Squared Error (MSE) and Root Mean Squared Error (RMSE) metrics. Our findings reveal that neural networks consistently outperform ARIMA and SARIMA by capturing non-linear patterns and complex temporal dependencies within the dataset, resulting in lower forecasting errors. This research highlights the potential of neural networks to enhance predictive accuracy in healthcare applications, supporting better resource allocation, patient monitoring, and long-term health outcome predictions.
Abstract: This paper presents a comparative study of ARIMA and Neural Network AutoRegressive (NNAR) models for time series forecasting. The study focuses on simulated data generated using ARIMA(1, 1, 0) and applies both models for training and forecasting. Model performance is evaluated using MSE, AIC, and BIC. The models are further applied to neonatal mortality data from Saudi Arabia to assess their predictive capabilities. The results indicate that the NNAR model outperforms ARIMA in both training and forecasting.
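Both of the forecasting abstracts above score models with MSE and RMSE. A minimal sketch of these metrics, applied to hypothetical baseline forecasts on a simulated ARIMA(1,1,0)-style series (the actual model fitting, e.g. via statsmodels or a neural network, is out of scope here):

```python
import random, math

def mse(actual, predicted):
    """Mean squared error over paired observations."""
    return sum((a - p) ** 2 for a, p in zip(actual, predicted)) / len(actual)

def rmse(actual, predicted):
    """Root mean squared error, in the units of the series itself."""
    return math.sqrt(mse(actual, predicted))

# Simulate an ARIMA(1,1,0)-style series: first differences follow AR(1)
rng = random.Random(42)
phi, d, x = 0.6, 0.0, [0.0]
for _ in range(300):
    d = phi * d + rng.gauss(0, 1)
    x.append(x[-1] + d)

train, test = x[:250], x[250:]
# Naive one-step forecast: tomorrow = today (random-walk baseline)
naive = [test[i - 1] if i else train[-1] for i in range(len(test))]
# Constant forecast at the training mean, for contrast
mean_f = [sum(train) / len(train)] * len(test)

err_naive, err_mean = rmse(test, naive), rmse(test, mean_f)
```

RMSE is generally preferred for reporting because it is in the same units as the data; MSE is what models actually minimize during training.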
Abstract: In the present paper, we study the finite-time-domain dynamics of a scalar field interacting with external sources. We expand both the scalar field and the corresponding Hamiltonian in annihilation and creation operators and evaluate the relevant path integral, thereby obtaining the Green function within a finite time interval. We apply the solution to the relevant Cauchy problem and, further, study the dynamics of scalar fields coupled to electromagnetic fields via perturbative methods.
Abstract: This paper presents an optimized strategy for multiple integrations of photovoltaic distributed generation (PV-DG) within radial distribution power systems. The proposed methodology focuses on identifying the optimal allocation and sizing of multiple PV-DG units to minimize power losses, using a probabilistic PV model and time-series power flow analysis. Addressing the uncertainties in PV output due to weather variability and diurnal cycles is critical: a probabilistic assessment offers a more robust analysis of DG integration's impact on the grid, potentially leading to more reliable system planning. The presented approach employs a genetic algorithm (GA) together with a deterministic PV output profile and a probabilistic PV generation profile based on experimental measurements of one year of solar radiation in Cairo, Egypt. The proposed algorithms are validated using a co-simulation framework that integrates MATLAB and OpenDSS, enabling analysis on a 33-bus test system. This framework can act as a guideline for creating other co-simulation algorithms to enhance computing platforms for contemporary distribution systems within the smart grid concept. The paper presents comparisons with previous research studies and reports several interesting findings, for example, that the choice of hours considered in developing the probabilistic model leads to different results.
Abstract: This paper aims to define the concept of time and justify its properties within the universal context, shedding new light on the nature of time. By employing the concept of the extrinsic universe, the paper explains the observable universe as the three-dimensional surface of a four-dimensional 3-sphere (hypersphere), expanding at the speed of light. This expansion process gives rise to what we perceive as time and its associated aspects, providing a novel interpretation of time as a geometric property emerging from the dynamics of the universe's expansion. The work offers insights into how this extrinsic perspective can address phenomena such as the universe's accelerated expansion and dark matter, aligning the model with current observational data.
Abstract: In this article, a finite volume element algorithm is presented and discussed for the numerical solution of a time-fractional nonlinear fourth-order diffusion equation with time delay. By choosing the second-order spatial derivative of the original unknown as an additional variable, the fourth-order problem is transformed into a second-order system. The fully discrete finite volume element scheme is then formulated by using the L1 approximation for the temporal Caputo derivative and the finite volume element method in the spatial direction. The unique solvability and stability of the proposed scheme are proved. An a priori L2-norm estimate with optimal convergence order O(h^2 + τ^(2-α)), where τ and h are the time step length and the space mesh parameter, respectively, is obtained. The efficiency of the scheme is supported by some numerical experiments.
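The L1 approximation for the Caputo derivative mentioned in this abstract has a compact closed form. The sketch below (plain Python, not taken from the paper) implements it and checks it against the exact Caputo derivative of u(t) = t, for which the L1 formula is exact because u is linear:

```python
import math

def caputo_l1(u_vals, tau, alpha):
    """L1 approximation of the Caputo derivative of order alpha in (0,1)
    at t_n = n*tau, given u at the grid points t_0, ..., t_n:

        D^alpha u(t_n) ~ tau^{-alpha} / Gamma(2 - alpha)
                         * sum_{k=0}^{n-1} b_k (u^{n-k} - u^{n-k-1}),
        b_k = (k+1)^{1-alpha} - k^{1-alpha}.
    """
    n = len(u_vals) - 1
    c = tau ** (-alpha) / math.gamma(2 - alpha)
    s = 0.0
    for k in range(n):
        b_k = (k + 1) ** (1 - alpha) - k ** (1 - alpha)
        s += b_k * (u_vals[n - k] - u_vals[n - k - 1])
    return c * s

# For u(t) = t, the Caputo derivative is t^{1-alpha} / Gamma(2-alpha).
alpha, tau, n = 0.5, 0.01, 100
t_n = n * tau
u = [i * tau for i in range(n + 1)]
approx = caputo_l1(u, tau, alpha)
exact = t_n ** (1 - alpha) / math.gamma(2 - alpha)
```

For linear u each increment equals τ, so the weights b_k telescope to n^(1-α) and the formula reproduces the exact value; for smooth nonlinear u the truncation error is O(τ^(2-α)), matching the temporal order in the estimate above.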
Abstract: A performer breathes fire during Chinese New Year celebrations at Binondo district, considered the world's oldest Chinatown, on January 29 in Manila, Philippines. The celebrations, lasting approximately 15 days, were filled with traditional activities such as family gatherings, lion dances, and the exchange of red envelopes, joining the vibrant cultural event observed by Chinese communities worldwide.
Abstract: Working toward an efficient duration and timeline for the preconstruction phase should be one of the main objectives for project owners. Failing to plan for and coordinate preconstruction decisions in order to control preconstruction duration and manage time variances can lead to financial insecurities, incomplete contract documents, permitting issues, and unrealistic schedules and resource allocation during this phase. To minimize time variances and ensure a productive decision-making process, project owners should be familiar with the critical elements of a project that cause variances in the preconstruction timeline. In this study, the impacts of eleven critical preconstruction elements on time variances were analyzed. These elements were considered critical either because they significantly impacted time variance during the preconstruction phase or because previous studies had identified them as critical, even where this study found no significant impact on time variances. In most previous studies of the elements impacting project schedules, data were collected by surveying construction professionals; in this study, objective and quantitative data related to project preconstruction elements were used rather than self-reported data. Using the results of this study, project owners and stakeholders will be able to evaluate the critical preconstruction elements impacting the timing of their projects and prioritize decisions related to these elements early in the preconstruction phase.
Abstract: To quantify physicians' risk and time preferences and explore the association between these preferences and their willingness to detect and disclose secondary findings (SFs) derived from genome-scale sequencing, we designed a web-based survey incorporating a multiple price list (MPL) as the instrument for risk and time preference measurement. The estimation was performed under the expected utility theory (EUT) and rank-dependent utility (RDU) frameworks, respectively, applying the isoelastic and power function utility models. We received responses from 87 physicians, among whom 46 completed the questionnaire (a completion rate of 52.9%). We observed positive risk-aversion coefficients under EUT (0.33, 95% CI 0.15-0.51) and RDU (0.51, 95% CI 0.32-0.71), suggesting that physicians were generally risk-averse. Respondents were likely to underestimate probabilities of low to moderate levels and slightly overestimate high-level probabilities. Physicians supporting the detection and disclosure of SFs had a larger risk-aversion coefficient and a smaller discounting parameter than non-supporters, suggesting that they were more risk-averse and discounted future utility less. Assuming heterogeneous risk perception, we found that respondents underestimated low to moderate risk and slightly overestimated high risk. This study indicates that physicians who are risk-averse and discount future utility only slightly are willing to detect and return SFs. The findings contribute to the debate surrounding SF disclosure and have implications for shared decision-making in clinical genome-scale sequencing.
Funding: Project supported in part by the National Natural Science Foundation of China (Grant Nos. 62005129 and 62175116).
Abstract: We experimentally analyze the effect of optical power on time delay signature identification and random bit generation in a chaotic semiconductor laser with optical feedback. Due to the inevitable noise introduced during photoelectric detection and analog-to-digital conversion, varying the output optical power changes the signal-to-noise ratio, which in turn affects time delay signature identification and random bit generation. Our results show that, when the optical power is less than -14 dBm, the actually identified time delay signature degrades and the entropy of the chaotic signal increases as the optical power decreases. Moreover, the random bit sequences extracted at lower optical power pass the randomness tests more easily.