Groundwater is a crucial water source for urban areas in Africa, particularly where surface water is insufficient to meet demand. This study analyses the water quality of five shallow wells (WW1-WW5) in Half-London Ward, Tunduma Town, Tanzania, using Principal Component Analysis (PCA) to identify the primary factors influencing groundwater contamination. Monthly samples were collected over 12 months and analysed for physical, chemical, and biological parameters. The PCA revealed between four and six principal components (PCs) for each well, explaining between 84.61% and 92.55% of the total variance in water quality data. In WW1, five PCs captured 87.53% of the variability, with PC1 (33.05%) dominated by pH, EC, TDS, and microbial contamination, suggesting significant influences from surface runoff and pit latrines. In WW2, six PCs explained 92.55% of the variance, with PC1 (36.17%) highlighting the effects of salinity, TDS, and agricultural runoff. WW3 had four PCs explaining 84.61% of the variance, with PC1 (39.63%) showing high contributions from pH, hardness, and salinity, indicating geological influences and contamination from human activities. Similarly, in WW4, six PCs explained 90.83% of the variance, where PC1 (43.53%) revealed contamination from pit latrines and fertilizers. WW5 also had six PCs, accounting for 92.51% of the variance, with PC1 (42.73%) indicating significant contamination from agricultural runoff and pit latrines. The study concludes that groundwater quality in Half-London Ward is primarily affected by a combination of surface runoff, pit latrine contamination, agricultural inputs, and geological factors.
The presence of microbial contaminants and elevated nitrate and phosphate levels underscores the need for improved sanitation and sustainable agricultural practices. Recommendations include strengthening sanitation infrastructure, promoting responsible farming techniques, and implementing regular groundwater monitoring to safeguard water resources and public health in the region.
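For readers unfamiliar with the method, the PCA workflow described in this abstract can be sketched as follows on synthetic monthly readings for one hypothetical well: standardize the parameters, decompose the correlation structure, and read off the variance explained by each PC. The data, the six-parameter layout, and the eigenvalue-greater-than-one retention rule are illustrative assumptions, not the study's actual dataset.

```python
import numpy as np

# Synthetic stand-in for 12 monthly samples of 6 water-quality parameters
rng = np.random.default_rng(0)
X = rng.normal(size=(12, 6))
X[:, 1] = X[:, 0] + 0.1 * rng.normal(size=12)     # make two parameters co-vary

Z = (X - X.mean(axis=0)) / X.std(axis=0)          # standardize each parameter

R = np.cov(Z, rowvar=False)                       # (near-)correlation matrix
eigvals, eigvecs = np.linalg.eigh(R)
order = np.argsort(eigvals)[::-1]                 # sort PCs by variance
eigvals, eigvecs = eigvals[order], eigvecs[:, order]

explained = eigvals / eigvals.sum()               # variance fraction per PC
retained = int((eigvals > 1).sum())               # Kaiser retention criterion
```

The loadings in `eigvecs` play the role of the per-parameter contributions (pH, EC, TDS, and so on) that the study interprets for each well.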
In the age of information explosion and artificial intelligence, sentiment analysis tailored for the tobacco industry has emerged as a pivotal avenue for cigarette manufacturers to enhance their tobacco products. Existing solutions have primarily focused on intrinsic features within consumer reviews and achieved significant progress through deep feature extraction models. However, they still face two key limitations: (1) neglecting the influence of fundamental tobacco information on analyzing the sentiment inclination of consumer reviews, resulting in a lack of consistent sentiment assessment criteria across thousands of tobacco brands; (2) overlooking the syntactic dependencies between Chinese word phrases and the underlying impact of sentiment scores between word phrases on sentiment inclination determination. To tackle these challenges, we propose the External Knowledge-enhanced Cross-Attention Fusion model, CITSA. Specifically, in the Cross Infusion Layer, we fuse consumer comment information and tobacco fundamental information through interactive attention mechanisms. In the Textual Attention Enhancement Layer, we introduce an emotion-oriented syntactic dependency graph and incorporate sentiment-syntactic relationships into consumer comments through a graph convolution network module. Subsequently, the Textual Attention Layer is introduced to combine these two feature representations. Additionally, we compile a Chinese-oriented tobacco sentiment analysis dataset, comprising 55,096 consumer reviews and 2074 tobacco fundamental information entries.
Experimental results on our self-constructed datasets consistently demonstrate that our proposed model outperforms state-of-the-art methods in terms of accuracy, precision, recall, and F1-score.
With the advent of the big data era, real-time data analysis and decision-support systems have been recognized as essential tools for enhancing enterprise competitiveness and optimizing the decision-making process. This study aims to explore the development strategies of real-time data analysis and decision-support systems, and to analyze their application status and future development trends in various industries. The article first reviews the basic concepts and importance of real-time data analysis and decision-support systems, and then discusses in detail key technical aspects such as system architecture, data collection and processing, analysis methods, and visualization techniques.
This paper analyzes the advantages of legal digital currencies and explores their impact on bank big data practices. By combining bank big data collection and processing, it clarifies that legal digital currencies can enhance the efficiency of bank data processing, enrich data types, and strengthen data analysis and application capabilities. In response to future development needs, it is necessary to strengthen data collection management, enhance data processing capabilities, and innovate big data application models, providing references for bank big data practices and promoting the transformation and upgrading of the banking industry in the context of legal digital currencies.
In order to attain good-quality transfer function estimates from magnetotelluric field data (i.e., smooth behavior and small uncertainties across all frequencies), we compare time series data processing with and without a multitaper approach for spectral estimation. There are several common ways to increase the reliability of Fourier spectral estimation from experimental (noisy) data; for example, to subdivide the experimental time series into segments, taper these segments (using a single taper), perform the Fourier transform of the individual segments, and average the resulting spectra.
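The segment-taper-average procedure just described can be sketched like this with a single Hann taper on an invented test signal (a 5 Hz tone in noise); a multitaper estimate would instead average spectra from a family of orthogonal tapers (e.g. DPSS) applied to each segment.

```python
import numpy as np

# Invented test signal: 60 s of a 5 Hz tone in Gaussian noise at fs = 100 Hz
rng = np.random.default_rng(1)
fs = 100.0
t = np.arange(0, 60, 1 / fs)
x = np.sin(2 * np.pi * 5.0 * t) + 0.5 * rng.normal(size=t.size)

nper = 512
taper = np.hanning(nper)                               # single taper
starts = range(0, x.size - nper + 1, nper // 2)        # 50%-overlapping segments

# Taper each segment, Fourier transform it, and average the periodograms
psd = np.mean(
    [np.abs(np.fft.rfft(x[i:i + nper] * taper)) ** 2 for i in starts], axis=0
)
freqs = np.fft.rfftfreq(nper, d=1 / fs)
peak_freq = freqs[np.argmax(psd)]                      # recovers ~5 Hz
```

Averaging over segments reduces the variance of the raw periodogram, which is the reliability gain the abstract refers to.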
In this paper, CiteSpace, a bibliometrics software package, was adopted to collect research papers published on the Web of Science relevant to biological models and effluent quality prediction in the activated sludge process in wastewater treatment. By way of trend maps, keyword knowledge maps, and co-cited knowledge maps, visualization analysis and identification of authors, institutions, and regions were carried out. Furthermore, the topics and hotspots of water quality prediction in the activated sludge process were determined through literature co-citation-based cluster analysis and literature citation burst analysis, which not only reflect the historical evolution of the field to a certain extent but also provide direction and insight into the knowledge structure of water quality prediction and the activated sludge process for future research.
In order to obtain better-quality cookies, food 3D printing technology was employed to prepare cookies. Taking the texture, color, deformation, moisture content, and temperature of the cookie as evaluation indicators, the influences of baking process parameters, such as baking time, surface heating temperature, and bottom heating temperature, on cookie quality were studied to optimize the baking process parameters. The results showed that the baking process parameters had obvious effects on the texture, color, deformation, moisture content, and temperature of the cookie. The surface heating temperature, bottom heating temperature, and baking time all had positive influences on the hardness, crunchiness, crispiness, and total color difference (ΔE) of the cookie. When the heating temperatures of the surface and bottom increased, the diameter and thickness deformation rates of the cookie increased. However, with the extension of baking time, the diameter and thickness deformation rates of the cookie first increased and then decreased. With a surface heating temperature of 180 ℃, a bottom heating temperature of 150 ℃, and a baking time of 15 min, the cookie was moderately crisp, with moderate deformation and uniform color, and showed no burning, achieving the desired quality. These results provide a theoretical basis for cookie manufacturing based on food 3D printing technology.
Sentiment analysis, a crucial task in discerning emotional tones within text, plays a pivotal role in understanding public opinion and user sentiment across diverse languages. While numerous scholars conduct sentiment analysis in widely spoken languages such as English, Chinese, Arabic, Roman Arabic, and more, resource-poor languages like Urdu remain a challenge. Urdu is a uniquely crafted language, characterized by a script that amalgamates elements from diverse languages, including Arabic, Parsi, Pashtu, Turkish, Punjabi, Saraiki, and more. Urdu literature, characterized by distinct character sets and linguistic features, presents an additional hurdle due to the lack of accessible datasets, rendering sentiment analysis a formidable undertaking. The limited availability of resources has fueled increased interest among researchers, prompting deeper exploration into Urdu sentiment analysis. This research is dedicated to Urdu-language sentiment analysis, employing sophisticated deep learning models on an extensive dataset categorized into five labels: Positive, Negative, Neutral, Mixed, and Ambiguous. The primary objective is to discern sentiments and emotions within the Urdu language, despite the absence of well-curated datasets. To tackle this challenge, the first step involves the creation of a comprehensive Urdu dataset by aggregating data from various sources such as newspapers, articles, and social media comments. Following this data collection, a thorough cleaning and preprocessing process is implemented to ensure the quality of the data.
The study leverages two well-known deep learning models, namely Convolutional Neural Networks (CNN) and Recurrent Neural Networks (RNN), for both training and evaluating sentiment analysis performance. Additionally, the study explores hyperparameter tuning to optimize the models' efficacy. Evaluation metrics such as precision, recall, and the F1-score are employed to assess the effectiveness of the models. The findings reveal that the RNN surpasses the CNN in Urdu sentiment analysis, achieving a significantly higher accuracy of 91%. This result accentuates the strong performance of the RNN, solidifying its status as a compelling option for sentiment analysis tasks in the Urdu language.
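As a reminder of how the reported metrics relate to one another, here is a toy binary computation; the labels are invented, and the study itself uses five sentiment classes (typically handled with macro-averaging).

```python
# Invented toy labels: 1 = positive sentiment, 0 = not positive
y_true = [1, 0, 1, 1, 0, 1, 0, 0, 1, 0]
y_pred = [1, 0, 1, 0, 0, 1, 1, 0, 1, 0]

tp = sum(t == 1 and p == 1 for t, p in zip(y_true, y_pred))  # true positives
fp = sum(t == 0 and p == 1 for t, p in zip(y_true, y_pred))  # false positives
fn = sum(t == 1 and p == 0 for t, p in zip(y_true, y_pred))  # false negatives

precision = tp / (tp + fp)                         # 4 / 5 = 0.8
recall = tp / (tp + fn)                            # 4 / 5 = 0.8
f1 = 2 * precision * recall / (precision + recall) # harmonic mean = 0.8
```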
Gravitational wave detection is one of the most cutting-edge research areas in modern physics, with its success relying on advanced data analysis and signal processing techniques. This study provides a comprehensive review of data analysis methods and signal processing techniques in gravitational wave detection. The research begins by introducing the characteristics of gravitational wave signals and the challenges faced in their detection, such as extremely low signal-to-noise ratios and complex noise backgrounds. It then systematically analyzes the application of time-frequency analysis methods in extracting transient gravitational wave signals, including wavelet transforms and Hilbert-Huang transforms. The study focuses on discussing the crucial role of matched filtering techniques in improving signal detection sensitivity and explores strategies for template bank optimization. Additionally, the research evaluates the potential of machine learning algorithms, especially deep learning networks, in rapidly identifying and classifying gravitational wave events. The study also analyzes the application of Bayesian inference methods in parameter estimation and model selection, as well as their advantages in handling uncertainties. However, the research also points out the challenges faced by current technologies, such as dealing with non-Gaussian noise and improving computational efficiency. To address these issues, the study proposes a hybrid analysis framework combining physical models and data-driven methods. Finally, the research looks ahead to the potential applications of quantum computing in future gravitational wave data analysis.
This study provides a comprehensive theoretical foundation for the optimization and innovation of gravitational wave data analysis methods, contributing to the advancement of gravitational wave astronomy.
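The matched-filtering idea at the heart of the abstract can be sketched as a sliding inner product of noisy data with a known template. This is a deliberately minimal time-domain illustration with an invented waveform and injection point; real gravitational-wave pipelines work in the frequency domain and whiten the data by the detector noise power spectral density first.

```python
import numpy as np

rng = np.random.default_rng(2)
n = 4096
t = np.linspace(0, 8, 256)
template = np.sin(2 * np.pi * t) * np.hanning(256)  # toy oscillatory waveform

data = rng.normal(size=n)                 # unit-variance Gaussian noise
t0 = 1500
data[t0:t0 + 256] += 2.0 * template       # inject the signal at sample 1500

# Sliding inner product of the data with the unit-norm template
tpl = template / np.linalg.norm(template)
snr = np.correlate(data, tpl, mode="valid")
detected = int(np.argmax(snr))            # peak lands near the injection point
```

The template bank optimization the study discusses amounts to repeating this correlation against many candidate waveforms and keeping the best-matching one.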
Based on the perception of flood risk factors derived from the lessons learned by the main stakeholders, namely the members of the National Emergency Response Plan (ORSEC) and the people affected by floods in the study area (Thies, Senegal), this work models flood risk using Hierarchical Process Analysis (HPA). This modelling made it possible to determine the consistency index (CI) and the consistency ratio, which were evaluated at 0.27% and 5%, respectively, according to the perception of the members of the ORSEC Plan, and at 0.28% and 5% according to the perception of the disaster victims. These results show that the working approach is coherent and acceptable. We then carried out Hierarchical Fuzzy Process Analysis (HFPA), an extension of HPA that seeks to minimize the margins of error. HFPA uses fuzzification of perception contributions, inference rules, and defuzzification to determine the Net Flood Risk Index (NFRI). Integrated with ArcGIS software, the NFRI is used to generate flood risk maps that reveal a high vulnerability of the main outlets occupied by human settlements.
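The consistency index and ratio cited above come from a standard check on hierarchical pairwise-comparison matrices: CI = (λmax − n) / (n − 1) and CR = CI / RI, where RI is a tabulated random index. The sketch below uses an invented 3×3 judgment matrix, not the study's stakeholder input.

```python
import numpy as np

# Invented pairwise comparison matrix (reciprocal, roughly consistent)
A = np.array([
    [1.0, 3.0, 5.0],
    [1/3, 1.0, 2.0],
    [1/5, 1/2, 1.0],
])
n = A.shape[0]
lam_max = np.linalg.eigvals(A).real.max()   # principal eigenvalue (>= n)

CI = (lam_max - n) / (n - 1)                # consistency index
RI = 0.58                                   # Saaty's random index for n = 3
CR = CI / RI                                # ratio; < 0.10 counts as acceptable
```

A CR below 10%, as reported for both stakeholder groups, indicates that the elicited judgments are internally consistent enough to use.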
Deep shale reservoirs are characterized by elevated breakdown pressures, diminished fracture complexity, and reduced modified volumes compared to medium and shallow reservoirs. Therefore, it is urgent to investigate particular injection strategies that can optimize breakdown pressure and fracturing efficiency to address the increasing demands of deep shale reservoir stimulation. In this study, the efficiency of various stimulation strategies, including multi-cluster simultaneous fracturing, modified alternating fracturing, alternating shut-in fracturing, and cyclic alternating fracturing, was evaluated. Subsequently, the sensitivity of factors such as the cycle index, shut-in time, cluster spacing, and horizontal permeability was investigated. Additionally, the flow distribution effect within the wellbore was discussed. The results indicate that, relative to multi-cluster simultaneous fracturing, modified alternating fracturing exhibits reduced susceptibility to the stress shadow effect, which results in earlier breakdown, extended hydraulic fracture lengths, and more consistent propagation despite an increase in breakdown pressure. Alternating shut-in fracturing benefits the increase of fracture length, which is closely related to the shut-in time. Furthermore, cyclic alternating fracturing markedly lowers breakdown pressure and contributes to uniform fracture propagation, in which the cycle count plays an important role. Modified alternating fracturing demonstrates insensitivity to variations in cluster spacing, whereas horizontal permeability is a critical factor affecting fracture length. The wellbore effect restrains the accumulation of pressure and flow near the perforation, delaying the initiation of hydraulic fractures. The simulation results can provide valuable numerical insights for optimizing injection strategies for deep shale hydraulic fracturing.
Accurate dynamic modeling of landslides could help understand movement mechanisms and guide disaster mitigation and prevention. Discontinuous deformation analysis (DDA) is an effective approach for investigating landslides. However, DDA fails to accurately capture the degradation in shear strength of rock joints commonly observed in high-speed landslides. In this study, DDA is modified by incorporating simplified joint shear strength degradation. Based on the modified DDA, the kinematics of the Baige landslide that occurred along the Jinsha River in China on 10 October 2018 are reproduced. The violent starting velocity of the landslide is considered explicitly. Three cases with different violent starting velocities are investigated to show their effect on the landslide movement process. Subsequently, the landslide movement process and the final accumulation characteristics are analyzed from multiple perspectives. The results show that the violent starting velocity affects the landslide motion characteristics and is found to be about 4 m/s for the Baige landslide. The movement process of the Baige landslide involves four stages: initiation, high-speed sliding, impact-climbing, and low-speed motion and accumulation. The accumulation states of sliding masses in different zones differ, which essentially corresponds to reality. The results suggest that the modified DDA is applicable to similar high-level rock landslides.
Traditional data-driven fault detection methods assume a unimodal distribution of process data, so they often perform poorly in chemical processes with multiple operating modes. In order to monitor multimode chemical processes effectively, this paper presents a novel fault detection method based on local neighborhood similarity analysis (LNSA). In the proposed method, prior process knowledge is not required, and only multimode normal operation data are used to construct a reference dataset. For online monitoring of the process state, LNSA applies a moving window technique to obtain a current snapshot data window. A neighborhood searching technique is then used to acquire the corresponding local neighborhood data window from the reference dataset. Similarity analysis between the snapshot and neighborhood data windows is performed, which includes the calculation of a principal component analysis (PCA) similarity factor and a distance similarity factor. The PCA similarity factor captures changes in data direction, while the distance similarity factor monitors shifts in the data center position. Based on these similarity factors, two monitoring statistics are built for multimode process fault detection. Finally, a simulated continuous stirred tank system is used to demonstrate the effectiveness of the proposed method. The simulation results show that LNSA can detect multimode process changes effectively and performs better than traditional fault detection methods.
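The PCA similarity factor mentioned above can be illustrated with a Krzanowski-style comparison of the leading principal directions of two data windows; the window sizes, variable count, and synthetic data below are assumptions for illustration, not the paper's process data.

```python
import numpy as np

def pca_loadings(X, k):
    """Loading vectors (columns) of the first k PCs of a mean-centered window."""
    Z = X - X.mean(axis=0)
    _, _, Vt = np.linalg.svd(Z, full_matrices=False)
    return Vt[:k].T

def pca_similarity(X1, X2, k=2):
    """Krzanowski-style PCA similarity factor between two windows, in [0, 1]."""
    L, M = pca_loadings(X1, k), pca_loadings(X2, k)
    return float(np.trace(L.T @ M @ M.T @ L)) / k

rng = np.random.default_rng(3)
base = rng.normal(size=(50, 4))                   # reference window
same = base + 0.01 * rng.normal(size=(50, 4))     # same operating mode
Q, _ = np.linalg.qr(rng.normal(size=(4, 4)))      # random rotation of directions
rotated = base @ Q                                # direction change (mode shift)

s_same = pca_similarity(base, same)               # close to 1
s_diff = pca_similarity(base, rotated)            # generally lower
```

A value near 1 means the two windows share the same dominant variation directions; the distance similarity factor would separately compare the window means.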
The genetic diversity of 18 processing apple varieties and two fresh varieties was evaluated using 12 simple sequence repeat (SSR) primer pairs previously identified in Malus domestica Borkh. A total of 87 alleles at 10 loci were detected using 10 polymorphic SSR markers, with 5-14 alleles per locus. All 20 varieties could be distinguished using two primer pairs, and they were divided into four groups by cluster analysis. The genetic similarity (GS) of the groups obtained from the cluster analysis varied from 0.14 to 0.83. The high-acid variety Avrolles separated from the other varieties, with a GS of less than 0.42. The second group contained Longfeng and Dolgo from Northeast China, which inherited genes from Chinese crab apple. The five cider varieties with high tannin contents, namely Dabinette, Frequin rouge, Kermerrien, M. Menard, and D. Coetligne, were clustered into the third group. The fourth group was mainly composed of 12 juice and fresh varieties. Principal coordinate analysis (PCO) also divided all the varieties into four groups. The juice and fresh apple varieties, and Longfeng and Dolgo, were clustered together in both analyses. Both analyses showed substantial differences between cider and juice varieties, cider and fresh varieties, and Chinese crab apple and western European crab apple, whereas juice and fresh varieties had similar genetic backgrounds. The genetic diversity and differentiation could be sufficiently reflected by combining the two analytical methods.
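As a hypothetical illustration of how a band-based genetic similarity (GS) score of the kind used with SSR fingerprints can be computed, here is the Dice coefficient GS = 2a / (2a + b + c), where a is the number of shared bands and b, c are the bands unique to each variety. The allele "bands" below are invented, not the study's SSR data.

```python
# Invented SSR band patterns for two hypothetical varieties
variety_a = {"SSR1_120", "SSR1_134", "SSR2_201", "SSR3_150"}
variety_b = {"SSR1_120", "SSR2_201", "SSR3_162", "SSR4_090"}

a = len(variety_a & variety_b)     # shared bands
b = len(variety_a - variety_b)     # bands only in variety A
c = len(variety_b - variety_a)     # bands only in variety B
gs = 2 * a / (2 * a + b + c)       # Dice similarity, here 0.5
```

Pairwise GS values like this feed the distance matrix on which the cluster analysis and PCO are run.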
This paper, based on material processes and relational processes, aims to analyze the deeper meaning of chapter one of Pride and Prejudice. The relevant theories will come first in this paper. I will then analyze this extract from three aspects: the analysis of the objective plane of narration, the analysis of Mrs. Bennet's discourse, and the analysis of Mr. Bennet's discourse.
Uncertainty analysis is an effective sensitivity analysis method for system model analysis and optimization. However, existing single-factor uncertainty analysis methods are not well suited to logistic support systems with multiple decision-making factors. The multiple transfer parameters graphical evaluation and review technique (MTP-GERT) is used to model the logistic support process in consideration of two important factors, support activity time and support activity resources, which are the two primary causes of logistic support process uncertainty. On this basis, a global sensitivity analysis (GSA) method based on covariance is designed to analyze the logistic support process uncertainty. The aircraft support process is selected as a case application, which illustrates the validity of the proposed method for analyzing support process uncertainty, and some feasible recommendations are proposed for aircraft support decision making on carriers.
Despite spending considerable effort on the development of manufacturing technology during the production process, manufacturing companies experience resource waste and adverse ecological impacts. To overcome the inconsistencies between energy saving and environmental conservation, a uniform way of reporting and classifying the information was presented. Based on the establishment of the carbon footprint (CFP) for machine tool operation, the carbon footprint per kilogram (CFK) was proposed as a normalized index to evaluate the machining process. Furthermore, a classification approach was developed as a tracking and analysis system for the machining process. In addition, a case study was used to illustrate the validity of the methodology. The results show that the approach is reasonable and feasible for machining process evaluation, providing a reliable reference for optimization measures in low-carbon manufacturing.
Fault diagnosis and monitoring are very important for complex chemical processes. Numerous methods have been studied in this field, among which effective visualization remains challenging. In order to achieve a better visualization effect, a novel fault diagnosis method combining the self-organizing map (SOM) with Fisher discriminant analysis (FDA) is proposed. FDA reduces the dimension of the data while maximizing the separability of the classes. After feature extraction by FDA, the SOM can clearly distinguish the different states on the output map and can also be employed to monitor abnormal states. The Tennessee Eastman (TE) process is employed to illustrate the fault diagnosis and monitoring performance of the proposed method. The results show that the SOM integrated with FDA is efficient and capable of real-time monitoring and fault diagnosis in complex chemical processes.
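The FDA feature-extraction step described above can be sketched for the two-class case: project the data onto the direction that maximizes between-class scatter relative to within-class scatter. In the paper a SOM is then trained on the projected scores; the synthetic "normal" and "fault" samples here merely stand in for TE process measurements.

```python
import numpy as np

rng = np.random.default_rng(4)
normal = rng.normal(loc=0.0, size=(100, 5))   # synthetic normal-operation data
fault = rng.normal(loc=1.5, size=(100, 5))    # mean-shifted fault class

m0, m1 = normal.mean(axis=0), fault.mean(axis=0)
# Within-class scatter; the Fisher direction is Sw^-1 (m1 - m0)
Sw = np.cov(normal, rowvar=False) + np.cov(fault, rowvar=False)
w = np.linalg.solve(Sw, m1 - m0)

scores_n, scores_f = normal @ w, fault @ w    # 1-D discriminant scores
separation = scores_f.mean() - scores_n.mean()
thr = (scores_n.mean() + scores_f.mean()) / 2
acc = ((scores_n < thr).mean() + (scores_f > thr).mean()) / 2
```

With more than two classes (as in the TE benchmark), the same idea generalizes to the leading eigenvectors of the between- over within-class scatter problem.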
[Objective] The aim was to analyze a cold wave weather process in Chengdu in March 2010. [Method] Based on NCEP 1°×1° 6-h interval reanalysis data and daily observation data, using synoptic analysis and diagnosis methods, and combining the cold wave forecast index for spring in Sichuan, a cold wave event covering the whole region between March 21 and 24, 2010 was analyzed from the aspects of circulation background, influencing weather systems, and weather causation. [Result] The results showed that the 500 hPa high-altitude cold vortex, 700-850 hPa low-layer shear, and ground cold front were the main systems that influenced this cold wave; there was a ridge from Lake Balkhash across Lake Baikal at 500 hPa. The early stage of the process was controlled by the high-pressure ridge, and the temperature increased obviously; the daily mean temperature was high. The range of the cold high pressure was large, and its central intensity was 1043.0 hPa; the cold air was strong and deep, in accordance with the strong surface temperature-reduction center. The strong northerly airstream from Lake Balkhash to Lake Baikal, changes in the intensity of the ground cold high-pressure center, north-south pressure and temperature differences, 850 hPa temperature changes, and the movement route and intensity of the cold advection were considered reference factors for forecasting cold wave intensity. [Conclusion] The study provides a theoretical basis for improving the ability to forecast cold wave weather.
The rapidly increasing demand and complexity of manufacturing processes motivate the use of manufacturing data, with the highest priority, to achieve precise analysis and control, rather than relying on simplified physical models and human expertise. In the era of data-driven manufacturing, the explosion in data volume has revolutionized how data are collected and analyzed. This paper overviews the advance of technologies developed for in-process manufacturing data collection and analysis. It can be concluded that groundbreaking sensing technology to facilitate direct measurement is an important leading trend for advanced data collection, owing to the complexity and uncertainty of indirect measurement. On the other hand, physical model-based data analysis contains inevitable simplifications and sometimes ill-posed solutions due to its limited capacity to describe complex manufacturing processes. Machine learning, especially deep learning, has great potential for making better decisions to automate the process when fed with abundant data, while trending data-driven manufacturing approaches have succeeded in using limited data to achieve similar or even better decisions. These trends can be demonstrated by analyzing typical applications in manufacturing processes.
Funding: Supported by the Global Research and Innovation Platform Fund for Scientific Big Data Transmission (Grant No. 241711KYSB20180002) and the National Key Research and Development Project of China (Grant No. 2019YFB1405801).
Abstract: In the age of information explosion and artificial intelligence, sentiment analysis tailored for the tobacco industry has emerged as a pivotal avenue for cigarette manufacturers to enhance their tobacco products. Existing solutions have primarily focused on intrinsic features within consumer reviews and achieved significant progress through deep feature extraction models. However, they still face two key limitations: (1) neglecting the influence of fundamental tobacco information when analyzing the sentiment inclination of consumer reviews, resulting in a lack of consistent sentiment assessment criteria across thousands of tobacco brands; (2) overlooking the syntactic dependencies between Chinese word phrases and the underlying impact of sentiment scores between word phrases on sentiment inclination determination. To tackle these challenges, we propose the External Knowledge-enhanced Cross-Attention Fusion model, CITSA. Specifically, in the Cross Infusion Layer, we fuse consumer comment information and fundamental tobacco information through interactive attention mechanisms. In the Textual Attention Enhancement Layer, we introduce an emotion-oriented syntactic dependency graph and incorporate sentiment-syntactic relationships into consumer comments through a graph convolutional network module. Subsequently, the Textual Attention Layer is introduced to combine these two feature representations. Additionally, we compile a Chinese-oriented tobacco sentiment analysis dataset comprising 55,096 consumer reviews and 2,074 entries of fundamental tobacco information. Experimental results on our self-constructed dataset consistently demonstrate that our proposed model outperforms state-of-the-art methods in terms of accuracy, precision, recall, and F1-score.
Abstract: With the advent of the big data era, real-time data analysis and decision-support systems have been recognized as essential tools for enhancing enterprise competitiveness and optimizing the decision-making process. This study aims to explore development strategies for real-time data analysis and decision-support systems, and to analyze their current applications and future development trends across various industries. The article first reviews the basic concepts and importance of real-time data analysis and decision-support systems, and then discusses in detail key technical aspects such as system architecture, data collection and processing, analysis methods, and visualization techniques.
Abstract: This paper analyzes the advantages of legal digital currencies and explores their impact on bank big data practices. By examining bank big data collection and processing, it shows that legal digital currencies can enhance the efficiency of bank data processing, enrich data types, and strengthen data analysis and application capabilities. To meet future development needs, it is necessary to strengthen data collection management, enhance data processing capabilities, and innovate big data application models. The paper provides a reference for bank big data practices, promoting the transformation and upgrading of the banking industry in the context of legal digital currencies.
Abstract: In order to attain good-quality transfer function estimates from magnetotelluric field data (i.e., smooth behavior and small uncertainties across all frequencies), we compare time series data processing with and without a multitaper approach for spectral estimation. There are several common ways to increase the reliability of Fourier spectral estimation from experimental (noisy) data; for example, to subdivide the experimental time series into segments, taper these segments (using a single taper), perform the Fourier transform of the individual segments, and average the resulting spectra.
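The segment-and-taper procedure described above (essentially Welch's method with a single taper per segment) can be sketched as follows. The Hann taper, segment length, and test signal are illustrative choices, not those of the paper:

```python
import numpy as np

def segment_averaged_spectrum(x, seg_len, fs=1.0):
    """Welch-style spectral estimate: split the series into
    non-overlapping segments, taper each with a Hann window,
    Fourier transform, and average the resulting periodograms."""
    n_seg = len(x) // seg_len
    taper = np.hanning(seg_len)
    norm = fs * np.sum(taper ** 2)          # power normalization
    psd = np.zeros(seg_len // 2 + 1)
    for k in range(n_seg):
        seg = x[k * seg_len:(k + 1) * seg_len]
        seg = (seg - seg.mean()) * taper    # remove mean, then taper
        spec = np.fft.rfft(seg)
        psd += np.abs(spec) ** 2 / norm
    psd /= n_seg                            # average over segments
    freqs = np.fft.rfftfreq(seg_len, d=1.0 / fs)
    return freqs, psd

# Sanity check on a noisy sinusoid at a known frequency
rng = np.random.default_rng(1)
fs, f0 = 64.0, 8.0
t = np.arange(4096) / fs
x = np.sin(2 * np.pi * f0 * t) + 0.1 * rng.normal(size=t.size)
freqs, psd = segment_averaged_spectrum(x, 256, fs=fs)
```

A multitaper estimate replaces the single Hann window with a family of orthogonal (Slepian) tapers per segment and averages across tapers as well, trading a little resolution for lower variance.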
Abstract: In this paper, CiteSpace, a bibliometrics software package, was used to collect research papers published on the Web of Science that are relevant to biological models and effluent quality prediction in the activated sludge process in wastewater treatment. Through trend maps, keyword knowledge maps, and co-citation knowledge maps, visualization analysis and identification of the authors, institutions, and regions were conducted. Furthermore, the topics and hotspots of water quality prediction in the activated sludge process were determined through literature co-citation-based cluster analysis and citation burst analysis, which not only reflect the historical evolution of the field to a certain extent, but also provide direction and insight into the knowledge structure of water quality prediction and the activated sludge process for future research.
Funding: Supported by the Heilongjiang Provincial Fruit Tree Modernization Agro-industrial Technology Collaborative Innovation and Promotion System Project (2019-13).
Abstract: In order to obtain better-quality cookies, food 3D printing technology was employed to prepare them. With the texture, color, deformation, moisture content, and temperature of the cookie as evaluation indicators, the influences of baking process parameters, such as baking time, surface heating temperature, and bottom heating temperature, on cookie quality were studied to optimize the baking process. The results showed that the baking process parameters had obvious effects on the texture, color, deformation, moisture content, and temperature of the cookie. The surface heating temperature, bottom heating temperature, and baking time all had positive influences on the hardness, crunchiness, crispiness, and total color difference (ΔE) of the cookie. When the surface and bottom heating temperatures increased, the diameter and thickness deformation rates of the cookie increased. However, with the extension of baking time, the diameter and thickness deformation rates first increased and then decreased. With a surface heating temperature of 180 ℃, a bottom heating temperature of 150 ℃, and a baking time of 15 min, the cookie was moderately crisp, with moderate deformation and uniform color, and showed no burning, achieving the desired quality. The results provide a theoretical basis for cookie manufacturing based on food 3D printing technology.
Abstract: Sentiment analysis, a crucial task in discerning emotional tones within text, plays a pivotal role in understanding public opinion and user sentiment across diverse languages. While numerous scholars conduct sentiment analysis in widely spoken languages such as English, Chinese, Arabic, and Roman Arabic, resource-poor languages like Urdu remain a challenge. Urdu is a uniquely crafted language, characterized by a script that amalgamates elements from diverse languages, including Arabic, Persian, Pashto, Turkish, Punjabi, Saraiki, and more. Urdu literature, with its distinct character sets and linguistic features, presents an additional hurdle due to the lack of accessible datasets, rendering sentiment analysis a formidable undertaking. The limited availability of resources has fueled increased interest among researchers, prompting a deeper exploration of Urdu sentiment analysis. This research is dedicated to Urdu-language sentiment analysis, employing sophisticated deep learning models on an extensive dataset categorized into five labels: Positive, Negative, Neutral, Mixed, and Ambiguous. The primary objective is to discern sentiments and emotions within the Urdu language, despite the absence of well-curated datasets. To tackle this challenge, the initial step involves the creation of a comprehensive Urdu dataset by aggregating data from various sources such as newspapers, articles, and social media comments. Following this data collection, thorough cleaning and preprocessing are implemented to ensure data quality. The study leverages two well-known deep learning models, Convolutional Neural Networks (CNN) and Recurrent Neural Networks (RNN), for both training and evaluating sentiment analysis performance. Additionally, the study explores hyperparameter tuning to optimize the models' efficacy.
Evaluation metrics such as precision, recall, and the F1-score are employed to assess the effectiveness of the models. The research findings reveal that RNN surpasses CNN in Urdu sentiment analysis, achieving a significantly higher accuracy of 91%. This result accentuates the strong performance of RNN, solidifying its status as a compelling option for sentiment analysis tasks in the Urdu language.
Abstract: Gravitational wave detection is one of the most cutting-edge research areas in modern physics, with its success relying on advanced data analysis and signal processing techniques. This study provides a comprehensive review of data analysis methods and signal processing techniques in gravitational wave detection. The research begins by introducing the characteristics of gravitational wave signals and the challenges faced in their detection, such as extremely low signal-to-noise ratios and complex noise backgrounds. It then systematically analyzes the application of time-frequency analysis methods in extracting transient gravitational wave signals, including wavelet transforms and Hilbert-Huang transforms. The study focuses on discussing the crucial role of matched filtering techniques in improving signal detection sensitivity and explores strategies for template bank optimization. Additionally, the research evaluates the potential of machine learning algorithms, especially deep learning networks, in rapidly identifying and classifying gravitational wave events. The study also analyzes the application of Bayesian inference methods in parameter estimation and model selection, as well as their advantages in handling uncertainties. However, the research also points out the challenges faced by current technologies, such as dealing with non-Gaussian noise and improving computational efficiency. To address these issues, the study proposes a hybrid analysis framework combining physical models and data-driven methods. Finally, the research looks ahead to the potential applications of quantum computing in future gravitational wave data analysis. This study provides a comprehensive theoretical foundation for the optimization and innovation of gravitational wave data analysis methods, contributing to the advancement of gravitational wave astronomy.
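The matched filtering step discussed above can be illustrated with a minimal time-domain sketch. Real pipelines whiten the data against the detector noise PSD and work in the frequency domain, which is omitted here; the template shape, injection amplitude, and offset are invented for illustration:

```python
import numpy as np

def matched_filter_output(data, template):
    """Slide a unit-energy template over the data and return the
    matched-filter output (proportional to SNR for white noise)."""
    t = template - template.mean()
    t = t / np.sqrt(np.sum(t ** 2))          # unit-energy template
    # 'valid' correlation: template fully inside the data
    return np.correlate(data, t, mode="valid")

rng = np.random.default_rng(2)
# Toy "chirp-like" template: windowed sinusoid, 200 samples
template = np.sin(2 * np.pi * 0.05 * np.arange(200)) * np.hanning(200)
noise = rng.normal(scale=1.0, size=2000)
data = noise.copy()
data[900:1100] += 5.0 * template             # inject a signal at index 900

snr = matched_filter_output(data, template)
best = int(np.argmax(np.abs(snr)))           # estimated arrival index
```

The peak of the filter output marks the signal's arrival time; a template bank repeats this correlation over many waveform shapes and keeps the best-matching one.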
Abstract: Based on the perception of flood risk factors derived from lessons learned by the main stakeholders, namely the members of the National Emergency Response Plan (ORSEC) and the people affected by floods in the study area (Thies, Senegal), this work models flood risk using Hierarchical Process Analysis (HPA). This modelling made it possible to determine the coherence index (CI) and the coherence ratio, which were evaluated at 0.27% and 5%, respectively, according to the perception of the members of the ORSEC Plan, and at 0.28% and 5% according to the perception of the disaster victims. These results show that the working approach is coherent and acceptable. We then carried out Hierarchical Fuzzy Process Analysis (HFPA), an extension of HPA that seeks to minimize the margins of error. HFPA uses fuzzification of perception contributions, inference rules, and defuzzification to determine the Net Flood Risk Index (NFRI). Integrated with ArcGIS software, the NFRI is used to generate flood risk maps, which reveal a high vulnerability of the main outlets occupied by human settlements.
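The coherence index and ratio used above follow the standard formulas of the analytic hierarchy process (consistency index CI = (λmax − n)/(n − 1) and ratio CR = CI/RI, in the usual terminology). The three-factor pairwise comparison matrix below is a made-up example, not the study's survey data:

```python
import numpy as np

# Saaty's random index (RI) for matrix orders 1..9
RI = {1: 0.0, 2: 0.0, 3: 0.58, 4: 0.90, 5: 1.12,
      6: 1.24, 7: 1.32, 8: 1.41, 9: 1.45}

def consistency(A):
    """Consistency index CI = (lambda_max - n) / (n - 1) and
    consistency ratio CR = CI / RI for a pairwise comparison
    matrix A (CR is meaningful for n >= 3)."""
    n = A.shape[0]
    lam_max = np.max(np.linalg.eigvals(A).real)  # Perron eigenvalue
    ci = (lam_max - n) / (n - 1)
    return ci, ci / RI[n]

# Hypothetical comparison of three flood-risk factors
# (e.g. rainfall intensity vs. drainage vs. land use)
A = np.array([[1.0,   3.0, 5.0],
              [1/3.0, 1.0, 2.0],
              [1/5.0, 1/2.0, 1.0]])
ci, cr = consistency(A)   # CR < 0.10 means acceptably consistent
```

A CR below the conventional 10% threshold, as reported for both stakeholder groups in the study, indicates that the pairwise judgments are internally consistent enough to use.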
Funding: Supported by the National Natural Science Foundation of China (NSFC) (Grant Nos. 42377156, 42077251, and 42202305).
Abstract: Deep shale reservoirs are characterized by elevated breakdown pressures, diminished fracture complexity, and reduced stimulated volumes compared to medium and shallow reservoirs. Therefore, it is urgent to investigate injection strategies that can optimize breakdown pressure and fracturing efficiency to address the increasing demands of deep shale reservoir stimulation. In this study, the efficiency of various stimulation strategies, including multi-cluster simultaneous fracturing, modified alternating fracturing, alternating shut-in fracturing, and cyclic alternating fracturing, was evaluated. Subsequently, the sensitivity of factors such as the cycle index, shut-in time, cluster spacing, and horizontal permeability was investigated. Additionally, the flow distribution effect within the wellbore was discussed. The results indicate that, relative to multi-cluster simultaneous fracturing, modified alternating fracturing exhibits reduced susceptibility to the stress shadow effect, which results in earlier breakdown, extended hydraulic fracture lengths, and more consistent propagation, despite an increase in breakdown pressure. Alternating shut-in fracturing benefits the increase of fracture length, which is closely related to the shut-in time. Furthermore, cyclic alternating fracturing markedly lowers breakdown pressure and contributes to uniform fracture propagation, in which the cycle count plays an important role. Modified alternating fracturing is insensitive to variations in cluster spacing, whereas horizontal permeability is a critical factor affecting fracture length. The wellbore effect restrains the accumulation of pressure and flow near the perforation, delaying the initiation of hydraulic fractures. The simulation results provide valuable numerical insights for optimizing injection strategies for deep shale hydraulic fracturing.
Funding: Supported by the National Natural Science Foundation of China (grant numbers U22A20601 and 52209142), the Opening Fund of the State Key Laboratory of Geohazard Prevention and Geoenvironment Protection (Chengdu University of Technology) (grant number SKLGP2022K018), the Science & Technology Department of Sichuan Province (grant number 2023NSFSC0284), and the Science and Technology Major Project of the Tibetan Autonomous Region of China (grant number XZ202201ZD0003G).
Abstract: Accurate dynamic modeling of landslides can help in understanding movement mechanisms and guide disaster mitigation and prevention. Discontinuous deformation analysis (DDA) is an effective approach for investigating landslides. However, DDA fails to accurately capture the degradation in shear strength of rock joints commonly observed in high-speed landslides. In this study, DDA is modified by incorporating simplified joint shear strength degradation. Based on the modified DDA, the kinematics of the Baige landslide, which occurred along the Jinsha River in China on 10 October 2018, are reproduced. The violent starting velocity of the landslide is considered explicitly. Three cases with different violent starting velocities are investigated to show their effect on the landslide movement process. Subsequently, the landslide movement process and the final accumulation characteristics are analyzed from multiple perspectives. The results show that the violent starting velocity affects the landslide motion characteristics and is found to be about 4 m/s for the Baige landslide. The movement process of the Baige landslide involves four stages: initiation, high-speed sliding, impact-climbing, and low-speed motion and accumulation. The accumulation states of sliding masses in different zones differ, which essentially corresponds to reality. The results suggest that the modified DDA is applicable to similar high-level rock landslides.
Funding: Supported by the National Natural Science Foundation of China (61273160, 61403418), the Natural Science Foundation of Shandong Province (ZR2011FM014), the Fundamental Research Funds for the Central Universities (10CX04046A), and the Doctoral Fund of Shandong Province (BS2012ZZ011).
Abstract: Traditional data-driven fault detection methods assume a unimodal distribution of process data, so they often perform poorly in chemical processes with multiple operating modes. In order to monitor multimode chemical processes effectively, this paper presents a novel fault detection method based on local neighborhood similarity analysis (LNSA). In the proposed method, prior process knowledge is not required and only multimode normal operation data are used to construct a reference dataset. For online monitoring of the process state, LNSA applies a moving window technique to obtain a current snapshot data window. Then a neighborhood searching technique is used to acquire the corresponding local neighborhood data window from the reference dataset. Similarity analysis between the snapshot and neighborhood data windows is performed, which includes the calculation of a principal component analysis (PCA) similarity factor and a distance similarity factor. The PCA similarity factor captures the change of data direction, while the distance similarity factor monitors the shift of the data center position. Based on these similarity factors, two monitoring statistics are built for multimode process fault detection. Finally, a simulated continuous stirred tank system is used to demonstrate the effectiveness of the proposed method. The simulation results show that LNSA can detect multimode process changes effectively and performs better than traditional fault detection methods.
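The PCA similarity factor between two data windows can be computed as Krzanowski's subspace similarity between their leading principal components. The sketch below is a generic implementation, not necessarily the paper's exact formulation, and the exponential form of the distance similarity factor is an assumption made here for illustration:

```python
import numpy as np

def pca_similarity(X1, X2, k=2):
    """Krzanowski PCA similarity factor between two data windows:
    S = trace(U1^T U2 U2^T U1) / k, where U holds the top-k PC
    directions. S = 1 when the k-dimensional subspaces coincide."""
    def top_pcs(X):
        Xc = X - X.mean(axis=0)
        # right singular vectors = principal directions
        _, _, vt = np.linalg.svd(Xc, full_matrices=False)
        return vt[:k].T                      # shape (n_vars, k)
    U1, U2 = top_pcs(X1), top_pcs(X2)
    M = U1.T @ U2
    return float(np.trace(M @ M.T)) / k

def distance_similarity(X1, X2):
    """Center-shift factor: decays with the distance between the
    two window means (assumed functional form, for illustration)."""
    d = np.linalg.norm(X1.mean(axis=0) - X2.mean(axis=0))
    return float(np.exp(-d))
```

A drop in the PCA factor flags a change in correlation direction, while a drop in the distance factor flags a shift of the operating point, the two effects the paper's monitoring statistics are built on.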
Abstract: The genetic diversity of 18 processing apple varieties and two fresh varieties was evaluated using 12 simple sequence repeat (SSR) primer pairs previously identified in Malus domestica Borkh. A total of 87 alleles at 10 loci were detected using 10 polymorphic SSR markers, with 5-14 alleles per locus. All 20 varieties could be distinguished using two primer pairs, and they were divided into four groups by cluster analysis. The genetic similarity (GS) of the groups varied from 0.14 to 0.83. The high-acid variety Avrolles separated from the other varieties, with a GS of less than 0.42. The second group contained Longfeng and Dolgo from Northeast China, which inherited genes from Chinese crab apple. The five cider varieties with high tannin contents, namely Dabinette, Frequin rouge, Kermerrien, M.Menard, and D.Coetligne, were clustered into the third group. The fourth group was mainly composed of 12 juice and fresh varieties. Principal coordinate analysis (PCO) also divided all the varieties into four groups. Juice and fresh apple varieties, as well as Longfeng and Dolgo, were clustered together in both analyses. Both analyses showed substantial differences between cider and juice varieties, between cider and fresh varieties, and between Chinese crab apple and western European crab apple, whereas juice and fresh varieties had similar genetic backgrounds. The genetic diversity and differentiation could be sufficiently reflected by combining the two analytical methods.
Abstract: This paper, based on material processes and relational processes, aims to analyze the deeper meaning of chapter one of Pride and Prejudice. The relevant theories are presented first. The extract is then analyzed from three aspects: the objective plane of narration, Mrs. Bennet's discourse, and Mr. Bennet's discourse.
Funding: Supported by the National Natural Science Foundation of China (71171008).
Abstract: Uncertainty analysis is an effective sensitivity analysis method for system model analysis and optimization. However, existing single-factor uncertainty analysis methods are not well suited to logistic support systems with multiple decision-making factors. The multiple transfer parameters graphical evaluation and review technique (MTP-GERT) is used to model the logistic support process in consideration of two important factors, support activity time and support activity resources, which are the two primary causes of uncertainty in the logistic support process. On this basis, a global sensitivity analysis (GSA) method based on covariance is designed to analyze the uncertainty of the logistic support process. The aircraft support process is selected as a case application, which illustrates the validity of the proposed method for analyzing support process uncertainty, and some feasible recommendations are proposed for aircraft support decision making on carriers.
Funding: Supported by the National Science & Technology Pillar Program during the Twelfth Five-Year Plan Period (No. 2012BAF01B02) and the National Science and Technology Major Project of China (No. 2012ZX04005031).
Abstract: Despite considerable effort spent on the development of manufacturing technology, manufacturing companies experience resource waste and adverse ecological impacts during the production process. To overcome the inconsistencies between energy saving and environmental conservation, a uniform way of reporting and classifying this information was presented. Based on the establishment of the carbon footprint (CFP) for machine tool operation, carbon footprint per kilogram (CFK) was proposed as a normalized index to evaluate the machining process. Furthermore, a classification approach was developed as a tracking and analyzing system for the machining process. A case study was used to illustrate the validity of the methodology. The results show that the approach is reasonable and feasible for machining process evaluation, providing a reliable reference for optimization measures in low-carbon manufacturing.
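The CFK normalization itself is a one-line computation: the total footprint of a machining operation divided by the mass of material processed. Every number and footprint component below is an invented placeholder, not data from the study:

```python
# Carbon footprint per kilogram (CFK): total CFP of a machining
# operation normalized by the mass of material processed.
# All figures are illustrative placeholders, not measured data.

energy_kwh = 12.5            # electricity used by the machine tool
grid_factor = 0.58           # kg CO2e per kWh (assumed grid factor)
coolant_co2e = 0.9           # kg CO2e attributed to coolant use

cfp = energy_kwh * grid_factor + coolant_co2e   # total kg CO2e

mass_kg = 4.2                # mass of workpiece material processed
cfk = cfp / mass_kg          # kg CO2e per kg of material
```

Normalizing by mass is what makes operations of different scales comparable, which is the point of using CFK rather than raw CFP as the evaluation index.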
Funding: Supported by the National Basic Research Program of China (2013CB733600), the National Natural Science Foundation of China (21176073), the Doctoral Fund of the Ministry of Education of China (20090074110005), the Program for New Century Excellent Talents in University (NCET-09-0346), the Shu Guang Project (09SG29), and the Fundamental Research Funds for the Central Universities.
Abstract: Fault diagnosis and monitoring are very important for complex chemical processes. Numerous methods have been studied in this field, among which effective visualization remains challenging. In order to obtain a better visualization effect, a novel fault diagnosis method combining the self-organizing map (SOM) with Fisher discriminant analysis (FDA) is proposed. FDA can reduce the dimension of the data by maximizing the separability of the classes. After feature extraction by FDA, SOM can clearly distinguish the different states on the output map, and it can also be employed to monitor abnormal states. The Tennessee Eastman (TE) process is employed to illustrate the fault diagnosis and monitoring performance of the proposed method. The results show that the SOM integrated with FDA is efficient and capable of real-time monitoring and fault diagnosis in complex chemical processes.
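The FDA feature-extraction step can be sketched as the classical eigen-problem on within- and between-class scatter matrices. This is a generic implementation demonstrated on synthetic two-class data, not the paper's TE-process setup or its SOM stage:

```python
import numpy as np

def fisher_directions(X, y, k=1):
    """Fisher discriminant analysis: directions maximizing
    between-class over within-class scatter, i.e. the leading
    eigenvectors of Sw^-1 Sb."""
    classes = np.unique(y)
    mu = X.mean(axis=0)
    d = X.shape[1]
    Sw = np.zeros((d, d))                    # within-class scatter
    Sb = np.zeros((d, d))                    # between-class scatter
    for c in classes:
        Xc = X[y == c]
        mc = Xc.mean(axis=0)
        Sw += (Xc - mc).T @ (Xc - mc)
        diff = (mc - mu)[:, None]
        Sb += Xc.shape[0] * (diff @ diff.T)
    eigvals, eigvecs = np.linalg.eig(np.linalg.solve(Sw, Sb))
    order = np.argsort(eigvals.real)[::-1]
    return eigvecs.real[:, order[:k]]        # top-k FDA directions
```

Projecting process data onto these directions compresses it while keeping fault classes apart, which is what makes the subsequent SOM map separate the states visibly.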
Abstract: [Objective] The aim was to analyze a cold wave weather process in Chengdu in March 2010. [Method] Based on NCEP 1°×1° 6-h-interval reanalysis data and daily observation data, using synoptic analysis and diagnosis methods, and combined with the spring cold wave forecast index for Sichuan, a cold wave event covering the whole region between March 21 and 24, 2010 was analyzed in terms of circulation background, influencing weather systems, and weather causation. [Result] The results showed that the 500 hPa high-altitude cold vortex, the 700-850 hPa low-layer shear, and the ground cold front were the main systems that influenced this cold wave; there was a ridge from Lake Balkhash across Lake Baikal at 500 hPa. The early stage of the process was controlled by the high pressure ridge, and the temperature was increasing obviously, with a high daily mean temperature. The range of the cold high pressure was large and its central intensity was 1043.0 hPa; the cold air was strong and deep, which was in accordance with the strong surface temperature reduction center. The strong northerly airstream from Lake Balkhash to Lake Baikal, changes in the intensity of the ground cold high pressure center, north-south pressure and temperature differences, 850 hPa temperature changes, and the route and intensity of cold advection were considered reference factors for forecasting cold wave intensity. [Conclusion] The study provides a theoretical basis for improving the forecasting of cold wave weather.
Funding: Supported by the National Natural Science Foundation of China (Grant No. 51805260), the National Natural Science Foundation for Distinguished Young Scholars of China (Grant No. 51925505), and the National Natural Science Foundation of China (Grant No. 51775278).
Abstract: The rapidly increasing demand and complexity of manufacturing processes motivate the use of manufacturing data with the highest priority to achieve precise analysis and control, rather than relying on simplified physical models and human expertise. In the era of data-driven manufacturing, the explosion of data volume has revolutionized how data is collected and analyzed. This paper overviews the advance of technologies developed for in-process manufacturing data collection and analysis. It can be concluded that groundbreaking sensing technology to facilitate direct measurement is one important leading trend in advanced data collection, due to the complexity and uncertainty of indirect measurement. On the other hand, physical model-based data analysis involves inevitable simplifications and sometimes ill-posed solutions due to its limited capacity to describe complex manufacturing processes. Machine learning, especially deep learning, has great potential for making better decisions to automate the process when fed with abundant data, while trending data-driven manufacturing approaches have succeeded in using limited data to achieve similar or even better decisions. These trends can be demonstrated by analyzing some typical applications of manufacturing processes.