An enterprise Business Intelligence (BI) system mines the data in an enterprise's existing databases and, through comprehensive processing, analyzes it according to customer requirements, offering high analysis efficiency and convenient operation. This paper mainly analyzes the application of enterprise BI data analysis systems in enterprises.
With the advent of the big data era, real-time data analysis and decision-support systems have been recognized as essential tools for enhancing enterprise competitiveness and optimizing the decision-making process. This study explores development strategies for real-time data analysis and decision-support systems and analyzes their application status and future development trends across industries. The article first reviews the basic concepts and importance of these systems, then discusses in detail key technical aspects such as system architecture, data collection and processing, analysis methods, and visualization techniques.
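The abstract names data collection and processing as a key technical aspect of real-time analysis. As a minimal, hypothetical illustration (not taken from the paper), a fixed-size sliding window is a common building block for computing live statistics over a metric stream:

```python
from collections import deque

class SlidingWindowStats:
    """Maintain mean/peak over the most recent `size` readings in O(size) memory."""
    def __init__(self, size):
        self.window = deque(maxlen=size)  # old readings fall off automatically

    def push(self, value):
        self.window.append(value)

    def mean(self):
        return sum(self.window) / len(self.window)

    def peak(self):
        return max(self.window)

# Simulated metric stream: the window only ever reflects the last 3 readings.
stats = SlidingWindowStats(size=3)
for reading in [10, 20, 30, 40]:
    stats.push(reading)
print(stats.mean())  # mean of [20, 30, 40] -> 30.0
```

Because `deque(maxlen=...)` discards old entries automatically, the dashboard query stays O(window size) no matter how long the stream runs.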
Maintaining the integrity and longevity of structures is essential in many industries, such as aerospace, nuclear, and petroleum. To achieve cost-effectiveness in large-scale petroleum drilling systems, a strong emphasis on structural durability and monitoring is required. This study focuses on the mechanical vibrations that occur in rotary drilling systems, which have a substantial impact on the structural integrity of drilling equipment. It specifically investigates axial, torsional, and lateral vibrations, which can lead to negative consequences such as bit-bounce, chaotic whirling, and high-frequency stick-slip. These events not only hinder drilling efficiency but also fatigue and damage the system's components, since they are difficult to detect and control in real time. The study investigates the dynamic interactions of these vibrations, specifically in their high-frequency modes, using field data obtained from measurement while drilling. The findings demonstrate the effect of strong coupling between the high-frequency modes of these vibrations on drilling system performance, and highlight the importance of considering their interconnected impacts when designing and implementing robust control systems. Integrating these components can increase the durability of drill bits and drill strings and improve the ability to monitor and detect damage. By exploiting these findings, the assessment of structural resilience in rotary drilling systems can be enhanced. Furthermore, the study demonstrates the capacity of structural health monitoring to improve the quality, dependability, and efficiency of rotary drilling systems in the petroleum industry.
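High-frequency vibration modes such as those discussed above are typically identified in the frequency domain. The following sketch illustrates the general technique (not the paper's actual pipeline) with a naive discrete Fourier transform picking out the dominant mode in a synthetic acceleration record:

```python
import cmath
import math

def dft_magnitudes(samples):
    """Naive O(N^2) discrete Fourier transform; returns |X_k| for each bin."""
    n = len(samples)
    return [abs(sum(samples[t] * cmath.exp(-2j * math.pi * k * t / n)
                    for t in range(n))) for k in range(n)]

def dominant_bin(samples):
    """Index of the strongest non-DC frequency bin (first half of the spectrum)."""
    mags = dft_magnitudes(samples)
    half = mags[1:len(mags) // 2]
    return 1 + half.index(max(half))

# Synthetic drill-string acceleration: a strong mode at bin 5, a weaker one at bin 12.
n = 64
signal = [math.sin(2 * math.pi * 5 * t / n) + 0.3 * math.sin(2 * math.pi * 12 * t / n)
          for t in range(n)]
print(dominant_bin(signal))  # -> 5
```

Real monitoring systems would use an FFT and windowing, but the principle — track which spectral bins carry the energy — is the same.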
DNA microarray technology is an extremely effective technique for studying gene expression patterns in cells, and the main challenge it currently faces is how to analyze the large amount of gene expression data generated. To address this, this paper employs a mixed-effects model to analyze gene expression data. For data selection, 1176 genes from a white mouse gene expression dataset were chosen under two experimental conditions, pneumococcal infection and no infection, and a mixed-effects model was constructed. After preprocessing the gene chip information, the data were imported into the model, preliminary results were calculated, and permutation tests were performed, with GSEA used to biologically validate the preliminary results. The final dataset consists of 20 groups of gene expression data from pneumococcal infection, which categorizes functionally related genes based on the similarity of their expression profiles, facilitating the study of genes with unknown functions.
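A permutation test of the kind the abstract mentions can be sketched simply: shuffle the group labels many times and count how often a relabeled difference in means is at least as large as the observed one. The expression values below are hypothetical, not from the study:

```python
import random
import statistics

def permutation_test(group_a, group_b, n_perm=2000, seed=0):
    """Two-sided permutation test on the difference of group means."""
    rng = random.Random(seed)
    observed = abs(statistics.mean(group_a) - statistics.mean(group_b))
    pooled = list(group_a) + list(group_b)
    hits = 0
    for _ in range(n_perm):
        rng.shuffle(pooled)                  # relabel samples at random
        perm_a = pooled[:len(group_a)]
        perm_b = pooled[len(group_a):]
        if abs(statistics.mean(perm_a) - statistics.mean(perm_b)) >= observed:
            hits += 1
    return hits / n_perm                     # p-value estimate

# Hypothetical expression levels for one gene: infected vs. uninfected mice.
infected = [8.1, 8.4, 8.0, 8.6, 8.3]
control = [5.2, 5.5, 5.1, 5.4, 5.3]
print(permutation_test(infected, control))   # small p-value: likely differential
```

With fully separated groups like these, only the handful of permutations that reproduce the original split can match the observed difference, so the estimated p-value is far below 0.05.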
This paper analyzes the advantages of legal digital currencies and explores their impact on bank big data practices. By examining bank big data collection and processing, it clarifies that legal digital currencies can enhance the efficiency of bank data processing, enrich data types, and strengthen data analysis and application capabilities. To meet future development needs, it is necessary to strengthen data collection management, enhance data processing capabilities, and innovate big data application models; the paper provides references for bank big data practices, promoting the transformation and upgrading of the banking industry in the context of legal digital currencies.
Semantic communication (SemCom) aims to achieve high-fidelity information delivery at low communication cost by guaranteeing only semantic accuracy. Nevertheless, semantic communication still suffers from unexpected channel volatility, so developing a re-transmission mechanism (e.g., hybrid automatic repeat request [HARQ]) becomes indispensable. In that regard, instead of discarding previously transmitted information, incremental knowledge-based HARQ (IK-HARQ) is deemed a more effective mechanism that can fully utilize the information semantics. However, considering the possible existence of semantic ambiguity in image transmission, a simple bit-level cyclic redundancy check (CRC) might compromise the performance of IK-HARQ. There therefore emerges a strong incentive to rework the CRC mechanism so as to more effectively reap the benefits of both SemCom and HARQ. In this paper, built on top of Swin Transformer-based joint source-channel coding (JSCC) and IK-HARQ, we propose a semantic image transmission framework, SC-TDA-HARQ. In particular, unlike a conventional CRC, we introduce a topological data analysis (TDA)-based error detection method, which digs out the inner topological and geometric information of images, to capture semantic information and determine the necessity of re-transmission. Extensive numerical results validate the effectiveness and efficiency of the proposed SC-TDA-HARQ framework, especially under limited-bandwidth conditions, and manifest the superiority of the TDA-based error detection method in image transmission.
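For contrast with the proposed TDA-based check, a conventional bit-level CRC rejects any flipped bit regardless of whether the change matters semantically. A minimal sketch (payload and perturbation are illustrative, not from the paper):

```python
import zlib

def crc_ok(payload: bytes, expected_crc: int) -> bool:
    """Conventional bit-level integrity check: any flipped bit fails it."""
    return zlib.crc32(payload) == expected_crc

original = b"semantic image payload"
crc = zlib.crc32(original)

# Flip a single low-order bit -- the kind of perturbation that may leave the
# *semantics* of an image intact but still triggers a CRC-driven re-transmission.
corrupted = bytearray(original)
corrupted[0] ^= 0x01

print(crc_ok(original, crc))           # True
print(crc_ok(bytes(corrupted), crc))   # False -> re-transmit, even if semantics survived
```

This semantics-blindness is exactly what motivates replacing the bit-level check with one that inspects the image's topological and geometric structure.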
This study investigates university English teachers' acceptance and willingness to use learning management system (LMS) data analysis tools in their teaching practices. The research employs a mixed-method approach, combining quantitative surveys and qualitative interviews to understand teachers' perceptions and attitudes, and the factors influencing their adoption of LMS data analysis tools. The findings reveal that perceived usefulness, perceived ease of use, technical literacy, organizational support, and data privacy concerns significantly impact teachers' willingness to use these tools. Based on these insights, the study offers practical recommendations for educational institutions to enhance the effective adoption of LMS data analysis tools in English language teaching.
Seeing is an important index for evaluating the quality of an astronomical site. To estimate seeing at the Muztagh-Ata site quantitatively as a function of height and time, the European Centre for Medium-Range Weather Forecasts reanalysis database (ERA5) is used. Seeing calculated from ERA5 is consistent with the Differential Image Motion Monitor seeing at a height of 12 m. Results show that seeing decays exponentially with height at the Muztagh-Ata site. In 2021, seeing decayed fastest with height in fall and most slowly in summer, and the seeing condition was better in fall than in summer. The median value of seeing at 12 m is 0.89 arcsec, with a maximum of 1.21 arcsec in August and a minimum of 0.66 arcsec in October; the median is 0.72 arcsec in the nighttime and 1.08 arcsec in the daytime. Seeing is a combination of annual and roughly biannual variations with the same phase as temperature and wind speed, indicating that its variation with time is influenced by temperature and wind speed. The Richardson number Ri is used to analyze atmospheric stability, and the variations of seeing are consistent with Ri between layers. These quantitative results can provide an important reference for telescope observation strategy.
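An exponential decay law like the one reported, v(h) = v0 * exp(-h/H), can be fitted by ordinary least squares on the logarithm, since ln v = ln v0 - h/H is linear in h. The sketch below uses synthetic values (v0 and the scale height H are made up for illustration, not the site's measured profile):

```python
import math

def fit_exponential_decay(heights, values):
    """Least-squares fit of v = v0 * exp(-h / H) via the log-linear form ln v = ln v0 - h/H."""
    n = len(heights)
    logs = [math.log(v) for v in values]
    mean_h = sum(heights) / n
    mean_l = sum(logs) / n
    slope = (sum((h - mean_h) * (l - mean_l) for h, l in zip(heights, logs))
             / sum((h - mean_h) ** 2 for h in heights))
    v0 = math.exp(mean_l - slope * mean_h)
    return v0, -1.0 / slope                  # (v0, scale height H)

# Synthetic seeing profile with v0 = 0.9 arcsec and scale height H = 300 m.
heights = [12, 50, 100, 200, 400]
seeing = [0.9 * math.exp(-h / 300) for h in heights]
v0, H = fit_exponential_decay(heights, seeing)
print(round(v0, 3), round(H, 1))             # -> 0.9 300.0
```

On noise-free data the fit recovers the generating parameters exactly; on real profiles the same regression yields the best-fit decay scale.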
To overcome the fact that multi-well typical-curve analysis of shale gas reservoirs is rarely applied in engineering, this study proposes a robust production data analysis method based on deconvolution for investigating multi-well inter-well interference. A multi-well conceptual trilinear seepage model for multi-stage fractured horizontal wells was established, and its Laplace-domain solutions under two different outer boundary conditions were obtained. An improved pressure deconvolution algorithm was then used to normalize the scattered production data, and typical-curve fitting was carried out using the production data and the seepage model solution. Finally, reservoir and fracturing parameters were interpreted, and the intensity of inter-well interference was compared. The effectiveness of the method was verified by analyzing the production dynamics of six shale gas wells in the Duvernay area. The results show that the typical-curve fitting was greatly improved by the mutual constraint between deconvolution parameter tuning and seepage model parameter tuning. Moreover, using the morphological characteristics of the log-log typical curves and the time corresponding to the intersection point of the curves of the two models under different outer boundary conditions, the strength of interference between wells on the same well platform was reliably judged. This work can provide a reference for optimizing well spacing and hydraulic fracturing measures for shale gas wells.
Peanut allergy is a major contributor to severe food-induced allergic reactions. Several foods, including cow's milk, hen's eggs, soy, wheat, peanuts, tree nuts (walnuts, hazelnuts, almonds, cashews, pecans, and pistachios), fish, and shellfish, are responsible for more than 90% of food allergies. Here, we provide promising insights from a large-scale data-driven analysis comparing the mechanistic features and biological relevance of the different ingredients present in peanuts, tree nuts (walnuts, almonds, cashews, pecans, and pistachios), and soybean. Additionally, we have analyzed the chemical composition of peanuts in different processed forms: raw, boiled, and dry-roasted. Using the data-driven approach, we are able to generate new hypotheses to explain why nuclear receptors such as the peroxisome proliferator-activated receptors (PPARs) and their isoforms, and their interaction with dietary lipids, may have a significant effect on allergic response. The results obtained from this study will direct future experimental and clinical studies on the role of dietary lipids and PPAR isoforms in exerting pro-inflammatory or anti-inflammatory functions on cells of the innate immunity and influencing antigen presentation to cells of the adaptive immunity.
The Internet of Multimedia Things (IoMT) refers to a network of interconnected multimedia devices that communicate with each other over the Internet. Recently, smart healthcare has emerged as a significant application of the IoMT, particularly in the context of knowledge-based learning systems. Smart healthcare systems leverage knowledge-based learning to become more context-aware, adaptable, and auditable while maintaining the ability to learn from historical data. In smart healthcare systems, devices capture images such as X-rays and magnetic resonance imaging scans. The security and integrity of these images are crucial for the databases used in knowledge-based learning systems to foster structured decision-making and enhance the learning abilities of AI. Moreover, in knowledge-driven systems, the storage and transmission of HD medical images burden the limited bandwidth of the communication channel, leading to data transmission delays. To address these security and latency concerns, this paper presents a lightweight medical image encryption scheme utilizing bit-plane decomposition and chaos theory. The experiments yield entropy, energy, and correlation values of 7.999, 0.0156, and 0.0001, respectively, validating the effectiveness of the proposed encryption system, which offers high-quality encryption, a large key space, key sensitivity, and resistance to statistical attacks.
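The two ingredients the abstract names, bit-plane decomposition and a chaotic keystream, can be sketched on a toy byte sequence. This is a generic illustration of the idea, not the paper's actual scheme; the logistic-map parameters x0 and r are arbitrary choices:

```python
def logistic_keystream(n, x0=0.7, r=3.99):
    """Chaotic logistic map x -> r*x*(1-x), quantized to key bytes."""
    stream, x = [], x0
    for _ in range(n):
        x = r * x * (1 - x)
        stream.append(int(x * 256) % 256)
    return stream

def bit_planes(pixels):
    """Split 8-bit pixels into 8 binary planes (plane 0 = least significant bit)."""
    return [[(p >> b) & 1 for p in pixels] for b in range(8)]

def encrypt(pixels, x0=0.7):
    key = logistic_keystream(len(pixels), x0)
    return [p ^ k for p, k in zip(pixels, key)]   # XOR is its own inverse

image = [0, 17, 34, 255, 128, 64]   # toy 8-bit "image"
cipher = encrypt(image)
restored = encrypt(cipher)          # decrypting reuses the same chaotic key
print(restored == image)            # -> True
print(bit_planes([5])[0][0], bit_planes([5])[2][0])  # 5 = 0b101: LSB=1, plane 2=1
```

Key sensitivity comes from the chaotic map: a tiny change in x0 produces a completely different keystream, so only the exact initial condition decrypts.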
Microsoft Excel is essential for the End-User Approach (EUA), offering versatility in data organization, analysis, and visualization, as well as widespread accessibility. It fosters collaboration and informed decision-making across diverse domains. Conversely, Python is indispensable for professional programming due to its versatility, readability, extensive libraries, and robust community support. It enables efficient development, advanced data analysis, data mining, and automation, catering to diverse industries and applications. However, one primary issue when using Microsoft Excel with Python libraries is compatibility and interoperability. While Excel is a widely used tool for data storage and analysis, it may not seamlessly integrate with Python libraries, leading to challenges in reading and writing data, especially in complex or large datasets. Additionally, manipulating Excel files with Python may not always preserve formatting or formulas accurately, potentially affecting data integrity. Moreover, dependency on Excel's graphical user interface (GUI) for automation can limit scalability and reproducibility compared to Python's scripting capabilities. This paper covers an integration solution that empowers non-programmers to leverage Python's capabilities within the familiar Excel environment, enabling users to perform advanced data analysis and automation tasks without extensive programming knowledge. Based on feedback solicited from non-programmers who tested the integration solution, the case study evaluates its ease of implementation, performance, and compatibility with different Excel versions.
With the rapid development of the Internet, many enterprises have launched their own network platforms. When users browse, search, and click the products on these platforms, most platforms keep records of these behaviors; the records are often heterogeneous and are called log data. Analyzing and managing these heterogeneous log data effectively lets enterprises grasp the behavioral characteristics of their platform users in time, make targeted recommendations to users, increase product sales, and accelerate enterprise development. We first design the system following the big data pipeline of collection, storage, analysis, and visualization. We then build a Hadoop cluster to process the log data, adopting HDFS storage, Yarn resource management, and Nginx load balancing, and analyze the log data with MapReduce processing and the Hive data warehouse to obtain results. Finally, the results are displayed visually, completing a log data analysis system. Practice has shown that the system effectively realizes the collection, analysis, and visualization of log data, can accurately support enterprises' product recommendations, and is stable and effective.
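The analysis stage described above follows the classic MapReduce pattern: a mapper emits key-value pairs from raw log lines and a reducer aggregates them by key. A single-machine sketch with a hypothetical tab-separated log format (not the system's real schema):

```python
from collections import Counter
from itertools import chain

def mapper(line):
    """Emit (user, product) pairs from one raw log line: 'user<TAB>action<TAB>product'."""
    user, action, product = line.split("\t")
    if action in ("click", "search", "browse"):
        yield (user, product)

def reducer(pairs):
    """Count interactions per (user, product) key, MapReduce-style."""
    return Counter(pairs)

logs = [
    "u1\tclick\tlaptop",
    "u1\tbrowse\tlaptop",
    "u2\tsearch\tphone",
    "u1\tclick\tlaptop",
]
counts = reducer(chain.from_iterable(mapper(line) for line in logs))
print(counts[("u1", "laptop")])   # -> 3
```

On a Hadoop cluster the same mapper/reducer logic runs in parallel across HDFS blocks; the per-user interaction counts it produces are exactly the kind of feature a recommendation step consumes.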
This research paper compares Excel and the R language for data analysis and concludes that R is more suitable for complex data analysis tasks. R's open-source nature makes it accessible to everyone, and its powerful data management and analysis tools make it suitable for handling complex tasks. It is also highly customizable, allowing users to create custom functions and packages to meet their specific needs. Additionally, R provides high reproducibility, making it easy to replicate and verify research results, and it has excellent collaboration capabilities, enabling multiple users to work on the same project simultaneously. These advantages make R a more suitable choice for complex data analysis tasks, particularly in scientific research and business applications. The findings will help readers understand that R is not just a language that can handle more data than Excel, and demonstrate that R is essential to the field of data analysis. They will also help users and organizations make informed decisions regarding their data analysis needs and software preferences.
Objective: To explain the use of concept mapping in a study about family members' experiences in taking care of people with cancer. Methods: This study used a phenomenological study design. We describe the analytical process of using concept mapping in our phenomenological studies about family members' experiences in taking care of people with cancer. Results: We developed several concept maps that aided us in analyzing the data collected from the interviews. Conclusions: The use of concept mapping is suggested to researchers who intend to analyze their data in any qualitative study, including those using a phenomenological design, because it is a time-efficient way of dealing with large amounts of qualitative data during the analytical process.
Gravitational wave detection is one of the most cutting-edge research areas in modern physics, with its success relying on advanced data analysis and signal processing techniques. This study provides a comprehensive review of data analysis methods and signal processing techniques in gravitational wave detection. The research begins by introducing the characteristics of gravitational wave signals and the challenges faced in their detection, such as extremely low signal-to-noise ratios and complex noise backgrounds. It then systematically analyzes the application of time-frequency analysis methods in extracting transient gravitational wave signals, including wavelet transforms and Hilbert-Huang transforms. The study focuses on discussing the crucial role of matched filtering techniques in improving signal detection sensitivity and explores strategies for template bank optimization. Additionally, the research evaluates the potential of machine learning algorithms, especially deep learning networks, in rapidly identifying and classifying gravitational wave events. The study also analyzes the application of Bayesian inference methods in parameter estimation and model selection, as well as their advantages in handling uncertainties. However, the research also points out the challenges faced by current technologies, such as dealing with non-Gaussian noise and improving computational efficiency. To address these issues, the study proposes a hybrid analysis framework combining physical models and data-driven methods. Finally, the research looks ahead to the potential applications of quantum computing in future gravitational wave data analysis. This study provides a comprehensive theoretical foundation for the optimization and innovation of gravitational wave data analysis methods, contributing to the advancement of gravitational wave astronomy.
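Matched filtering, which the review identifies as central to detection sensitivity, correlates the data stream against a known waveform template and looks for a correlation peak. Below is a noise-free, idealized time-domain sketch; real pipelines whiten the noise and work in the frequency domain:

```python
import math

def matched_filter(data, template):
    """Slide the template over the data; return (best offset, peak correlation)."""
    best_off, best_score = 0, float("-inf")
    for off in range(len(data) - len(template) + 1):
        score = sum(d * t for d, t in zip(data[off:], template))
        if score > best_score:
            best_off, best_score = off, score
    return best_off, best_score

# A known chirp-like template buried in an otherwise quiet record at offset 40.
template = [math.sin(0.3 * i * i) for i in range(16)]
data = [0.0] * 100
for i, t in enumerate(template):
    data[40 + i] += t

offset, _ = matched_filter(data, template)
print(offset)   # -> 40
```

By the Cauchy-Schwarz inequality, the correlation is maximized exactly where the data contain the template, which is why a bank of templates spanning the plausible source parameters is needed in practice.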
The connectivity of sandbodies is a key constraint on the exploration effectiveness of the Bohai A Oilfield. Conventional connectivity studies often use methods such as seismic attribute fusion, but the development of contiguous composite sandbodies in this area makes it challenging to characterize connectivity changes with conventional seismic attributes. To address this problem, this study proposes a big data analysis method based on the Deep Forest algorithm to predict sandbody connectivity. First, by compiling the abundant exploration and development sandbody data in the study area, typical sandbodies with reliable connectivity were selected. Then, sensitive seismic attributes were extracted to obtain training samples. Finally, based on the Deep Forest algorithm, a mapping model between attribute combinations and sandbody connectivity was established through machine learning. This method achieves the first quantitative determination of connectivity for continuous composite sandbodies in the Bohai Oilfield. Compared with conventional connectivity discrimination methods such as high-resolution processing and seismic attribute analysis, it can incorporate the sandbody characteristics of the study area in the machine learning process and judge connectivity jointly from multiple seismic attributes. The results show that the method has high accuracy and timeliness in predicting connectivity for continuous composite sandbodies. Applied to the Bohai A Oilfield, it successfully identified multiple sandbody connectivity relationships and provided strong support for subsequent exploration potential assessment and well placement optimization. It also provides a new approach for studying sandbody connectivity under similarly complex geological conditions.
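Deep Forest itself is a multi-layer cascade of random forests; as a much simpler stand-in, the sketch below classifies seismic-attribute vectors with k-nearest neighbors purely to illustrate the attribute-to-connectivity mapping the abstract describes. All attribute values and labels here are invented:

```python
import math

def predict_connectivity(train, labels, sample, k=3):
    """k-NN over seismic-attribute vectors: a simple stand-in for the paper's Deep Forest."""
    dists = sorted((math.dist(x, sample), y) for x, y in zip(train, labels))
    votes = [y for _, y in dists[:k]]
    return max(set(votes), key=votes.count)   # majority vote among the k nearest

# Hypothetical attribute vectors (amplitude, coherence, frequency) per sandbody pair,
# labeled by known connectivity from wells.
train = [(0.9, 0.8, 0.7), (0.85, 0.9, 0.75), (0.2, 0.3, 0.1), (0.25, 0.2, 0.15)]
labels = ["connected", "connected", "isolated", "isolated"]

print(predict_connectivity(train, labels, (0.88, 0.82, 0.72)))  # -> connected
```

The real method replaces this toy classifier with a cascade of forests, but the workflow is the same: labeled sandbodies supply training samples, and multiple attributes are judged jointly rather than one attribute at a time.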
The advent of the big data era has made data visualization a crucial tool for enhancing the efficiency and insights of data analysis. This theoretical research delves into the current applications and potential future trends of data visualization in big data analysis. The article first systematically reviews the theoretical foundations and technological evolution of data visualization, and thoroughly analyzes the challenges faced by visualization in the big data environment, such as massive data processing, real-time visualization requirements, and multi-dimensional data display. Through extensive literature research, it explores innovative application cases and theoretical models of data visualization in multiple fields including business intelligence, scientific research, and public decision-making. The study reveals that interactive visualization, real-time visualization, and immersive visualization technologies may become the main directions for future development and analyzes the potential of these technologies in enhancing user experience and data comprehension. The paper also delves into the theoretical potential of artificial intelligence technology in enhancing data visualization capabilities, such as automated chart generation, intelligent recommendation of visualization schemes, and adaptive visualization interfaces. The research also focuses on the role of data visualization in promoting interdisciplinary collaboration and data democratization. Finally, the paper proposes theoretical suggestions for promoting data visualization technology innovation and application popularization, including strengthening visualization literacy education, developing standardized visualization frameworks, and promoting open-source sharing of visualization tools. This study provides a comprehensive theoretical perspective for understanding the importance of data visualization in the big data era and its future development directions.
In the data envelopment analysis (DEA) literature, productivity change captured by the Malmquist productivity index, especially in terms of a deterministic environment and stochastic variability in inputs and outputs, has been somewhat ignored. Therefore, this study developed a firm-specific, DEA-based Malmquist index model to examine the efficiency and productivity change of banks in a stochastic environment. First, in order to estimate bank-specific efficiency, we employed a two-stage double bootstrap DEA procedure. Specifically, in the first stage, the technical efficiency scores of banks were calculated by the classic DEA model, while in the second stage, the double bootstrap DEA model was applied to determine the effect of the contextual variables on bank efficiency. Second, we applied a two-stage procedure for measuring productivity change in which the first stage included the estimation of stochastic technical efficiency and the second stage included the regression of the estimated efficiency scores on a set of explanatory variables that influence relative performance. Finally, an empirical investigation of the Iranian banking sector, consisting of 120 bank-year observations of 15 banks from 2014 to 2021, was performed to measure their efficiency and productivity change. Based on the findings, the explanatory variables (i.e., the nonperforming loan ratio and the number of branches) indicated an inverse relationship with stochastic technical efficiency and productivity change. The implication of the findings is that, in order to improve the efficiency and productivity of banks, it is important to optimize these factors.
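In the simplest single-input, single-output case, a CCR-style DEA efficiency score is each unit's productivity ratio divided by the best ratio, and a percentile bootstrap gives a rough confidence interval for the mean score. The bank figures below are hypothetical, and this toy omits the multi-input linear programs and the bias-corrected double bootstrap the study actually uses:

```python
import random

def ccr_efficiency(inputs, outputs):
    """Single-input/single-output CCR efficiency: productivity relative to the frontier."""
    ratios = [o / i for i, o in zip(inputs, outputs)]
    frontier = max(ratios)
    return [r / frontier for r in ratios]

def bootstrap_mean_efficiency(inputs, outputs, n_boot=1000, seed=1):
    """Percentile bootstrap confidence interval for the mean efficiency score."""
    rng = random.Random(seed)
    base = ccr_efficiency(inputs, outputs)
    means = []
    for _ in range(n_boot):
        resample = [rng.choice(base) for _ in base]   # resample scores with replacement
        means.append(sum(resample) / len(resample))
    means.sort()
    return means[int(0.025 * n_boot)], means[int(0.975 * n_boot)]

# Hypothetical banks: one input (operating cost) and one output (loans issued).
cost = [100, 120, 90, 150, 110]
loans = [80, 84, 81, 90, 77]
scores = ccr_efficiency(cost, loans)
low, high = bootstrap_mean_efficiency(cost, loans)
print(max(scores))   # the frontier bank scores exactly 1.0
```

The bootstrap step is what moves the deterministic DEA score toward the stochastic setting the abstract emphasizes: instead of one point estimate, each bank's efficiency gets an uncertainty band.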
An idea is presented for the development of a data processing and analysis system for ICF experiments, based on an object-oriented framework. The design and preliminary implementation of the data processing and analysis framework, based on the ROOT system, have been completed. Software for unfolding soft X-ray spectra has been developed to test the functions of this framework.
文摘Enterprise Business Intelligence(BI)system refers to data mining through the existing database of the enterprise,and data analysis according to customer requirements through comprehensive processing.The data analysis efficiency is high and the operation is convenient.This paper mainly analyzes the application of enterprise BI data analysis system in enterprises.
文摘With the advent of the big data era,real-time data analysis and decision-support systems have been recognized as essential tools for enhancing enterprise competitiveness and optimizing the decision-making process.This study aims to explore the development strategies of real-time data analysis and decision-support systems,and analyze their application status and future development trends in various industries.The article first reviews the basic concepts and importance of real-time data analysis and decision-support systems,and then discusses in detail the key technical aspects such as system architecture,data collection and processing,analysis methods,and visualization techniques.
文摘Maintaining the integrity and longevity of structures is essential in many industries,such as aerospace,nuclear,and petroleum.To achieve the cost-effectiveness of large-scale systems in petroleum drilling,a strong emphasis on structural durability and monitoring is required.This study focuses on the mechanical vibrations that occur in rotary drilling systems,which have a substantial impact on the structural integrity of drilling equipment.The study specifically investigates axial,torsional,and lateral vibrations,which might lead to negative consequences such as bit-bounce,chaotic whirling,and high-frequency stick-slip.These events not only hinder the efficiency of drilling but also lead to exhaustion and harm to the system’s components since they are difficult to be detected and controlled in real time.The study investigates the dynamic interactions of these vibrations,specifically in their high-frequency modes,usingfield data obtained from measurement while drilling.Thefindings have demonstrated the effect of strong coupling between the high-frequency modes of these vibrations on drilling sys-tem performance.The obtained results highlight the importance of considering the interconnected impacts of these vibrations when designing and implementing robust control systems.Therefore,integrating these compo-nents can increase the durability of drill bits and drill strings,as well as improve the ability to monitor and detect damage.Moreover,by exploiting thesefindings,the assessment of structural resilience in rotary drilling systems can be enhanced.Furthermore,the study demonstrates the capacity of structural health monitoring to improve the quality,dependability,and efficiency of rotary drilling systems in the petroleum industry.
Abstract: DNA microarray technology is an extremely effective technique for studying gene expression patterns in cells, and the main challenge it currently faces is analyzing the large amount of gene expression data generated. To address this challenge, this paper employs a mixed-effects model to analyze gene expression data. For data selection, 1176 genes from a mouse gene expression dataset were chosen under two experimental conditions, pneumococcal infection and no infection, and a mixed-effects model was constructed. After preprocessing the gene chip information, the data were imported into the model, preliminary results were calculated, permutation tests were performed, and the preliminary results were biologically validated using GSEA. The final dataset consists of 20 groups of gene expression data from pneumococcal infection, which categorizes functionally related genes based on the similarity of their expression profiles, facilitating the study of genes with unknown functions.
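The permutation-test step can be illustrated on synthetic data. This is a generic two-group permutation test on a gene-expression matrix, with made-up group sizes and effect sizes rather than the paper's 1176-gene dataset or its mixed-effects model:

```python
import numpy as np

rng = np.random.default_rng(0)
n_genes, n_per_group = 50, 10

# synthetic expression matrix: rows = genes, columns = samples
control = rng.normal(0.0, 1.0, size=(n_genes, n_per_group))
infected = rng.normal(0.0, 1.0, size=(n_genes, n_per_group))
infected[:5] += 3.0  # first five genes truly differentially expressed

expr = np.hstack([control, infected])
labels = np.array([0] * n_per_group + [1] * n_per_group)

def perm_pvalues(expr, labels, n_perm=500, seed=1):
    """Two-sided permutation p-values for a difference in group means."""
    rng = np.random.default_rng(seed)
    obs = expr[:, labels == 1].mean(1) - expr[:, labels == 0].mean(1)
    exceed = np.zeros(expr.shape[0])
    for _ in range(n_perm):
        perm = rng.permutation(labels)
        diff = expr[:, perm == 1].mean(1) - expr[:, perm == 0].mean(1)
        exceed += np.abs(diff) >= np.abs(obs)
    return (exceed + 1) / (n_perm + 1)  # add-one correction

pvals = perm_pvalues(expr, labels)
```

The truly perturbed genes receive small p-values, while the null genes do not; GSEA-style validation would then test whether the top-ranked genes are enriched for known functional categories.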
Abstract: This paper analyzes the advantages of legal digital currencies and explores their impact on bank big data practices. By examining bank big data collection and processing, it shows that legal digital currencies can enhance the efficiency of bank data processing, enrich data types, and strengthen data analysis and application capabilities. To meet future development needs, it is necessary to strengthen data collection management, enhance data processing capabilities, and innovate big data application models, providing a reference for bank big data practices and promoting the transformation and upgrading of the banking industry in the context of legal digital currencies.
基金supported in part by the National Key Research and Development Program of China under Grant 2024YFE0200600in part by the National Natural Science Foundation of China under Grant 62071425+3 种基金in part by the Zhejiang Key Research and Development Plan under Grant 2022C01093in part by the Zhejiang Provincial Natural Science Foundation of China under Grant LR23F010005in part by the National Key Laboratory of Wireless Communications Foundation under Grant 2023KP01601in part by the Big Data and Intelligent Computing Key Lab of CQUPT under Grant BDIC-2023-B-001.
Abstract: Semantic communication (SemCom) aims to achieve high-fidelity information delivery with low communication consumption by guaranteeing only semantic accuracy. Nevertheless, semantic communication still suffers from unexpected channel volatility, so developing a re-transmission mechanism (e.g., hybrid automatic repeat request [HARQ]) becomes indispensable. In that regard, instead of discarding previously transmitted information, incremental knowledge-based HARQ (IK-HARQ) is deemed a more effective mechanism that can fully utilize the information semantics. However, considering the possible existence of semantic ambiguity in image transmission, a simple bit-level cyclic redundancy check (CRC) might compromise the performance of IK-HARQ. There thus emerges a strong incentive to revolutionize the CRC mechanism and more effectively reap the benefits of both SemCom and HARQ. In this paper, built on top of Swin Transformer-based joint source-channel coding (JSCC) and IK-HARQ, we propose a semantic image transmission framework, SC-TDA-HARQ. In particular, unlike the conventional CRC, we introduce a topological data analysis (TDA)-based error detection method, which extracts the inner topological and geometric information of images to capture semantic information and determine the necessity of re-transmission. Extensive numerical results validate the effectiveness and efficiency of the proposed SC-TDA-HARQ framework, especially under limited bandwidth conditions, and manifest the superiority of the TDA-based error detection method in image transmission.
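The topological comparison behind TDA-based error detection can be sketched with a Betti-0 curve: the number of connected components of an image's sublevel sets across thresholds. This toy version uses scipy's connected-component labelling on synthetic images rather than persistent homology on real reconstructions, and shows how a corrupted image ends up topologically farther from the reference than a faithful one:

```python
import numpy as np
from scipy import ndimage

def betti0_curve(img, thresholds):
    """Connected-component counts of the sublevel sets {img <= t}."""
    return np.array([ndimage.label(img <= t)[1] for t in thresholds])

rng = np.random.default_rng(0)
yy, xx = np.mgrid[0:32, 0:32]
img = np.sin(xx / 5.0) * np.sin(yy / 5.0)                 # smooth reference image
recon_ok = img + rng.normal(scale=0.01, size=img.shape)   # faithful reconstruction
recon_bad = rng.uniform(-1, 1, size=img.shape)            # semantically corrupted

ts = np.linspace(-1, 1, 21)
ref = betti0_curve(img, ts)
dist_ok = np.abs(betti0_curve(recon_ok, ts) - ref).sum()
dist_bad = np.abs(betti0_curve(recon_bad, ts) - ref).sum()
# a large topological distance would trigger re-transmission
```

The same idea, with full persistence diagrams instead of component counts, gives a semantic-level check that is insensitive to bit-level noise but sensitive to structural corruption.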
Abstract: This study investigates university English teachers' acceptance of and willingness to use learning management system (LMS) data analysis tools in their teaching practices. The research employs a mixed-method approach, combining quantitative surveys and qualitative interviews, to understand teachers' perceptions and attitudes and the factors influencing their adoption of LMS data analysis tools. The findings reveal that perceived usefulness, perceived ease of use, technical literacy, organizational support, and data privacy concerns significantly impact teachers' willingness to use these tools. Based on these insights, the study offers practical recommendations for educational institutions to enhance the effective adoption of LMS data analysis tools in English language teaching.
基金funded by the National Natural Science Foundation of China(NSFC)the Chinese Academy of Sciences(CAS)(grant No.U2031209)the National Natural Science Foundation of China(NSFC,grant Nos.11872128,42174192,and 91952111)。
Abstract: Seeing is an important index for evaluating the quality of an astronomical site. To estimate seeing at the Muztagh-Ata site quantitatively as a function of height and time, the European Centre for Medium-Range Weather Forecasts reanalysis database (ERA5) is used. Seeing calculated from ERA5 is consistent with the Differential Image Motion Monitor seeing at a height of 12 m. Results show that seeing decays exponentially with height at the Muztagh-Ata site. In 2021, seeing decayed fastest with height in fall and most slowly in summer, and the seeing condition is better in fall than in summer. The median value of seeing at 12 m is 0.89 arcsec, with a maximum of 1.21 arcsec in August and a minimum of 0.66 arcsec in October. The median value of seeing at 12 m is 0.72 arcsec in the nighttime and 1.08 arcsec in the daytime. Seeing is a combination of annual and roughly biannual variations with the same phase as temperature and wind speed, indicating that seeing variation with time is influenced by temperature and wind speed. The Richardson number Ri is used to analyze atmospheric stability, and the variations of seeing are consistent with Ri between layers. These quantitative results can provide an important reference for telescope observation strategy.
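The gradient Richardson number used in the stability analysis is Ri = (g/θ)(∂θ/∂z) / [(∂u/∂z)² + (∂v/∂z)²]. A minimal finite-difference version on a toy two-level profile (all values illustrative, not taken from ERA5) looks like this:

```python
import numpy as np

g = 9.81  # gravitational acceleration (m s^-2)

# toy two-level profile; the values are illustrative, not ERA5 data
z = np.array([12.0, 100.0])        # heights (m)
theta = np.array([288.0, 289.5])   # potential temperature (K)
u = np.array([3.0, 7.0])           # zonal wind (m s^-1)
v = np.array([0.5, 2.0])           # meridional wind (m s^-1)

dz = np.diff(z)
shear2 = (np.diff(u) / dz) ** 2 + (np.diff(v) / dz) ** 2
Ri = (g / theta[:-1]) * (np.diff(theta) / dz) / shear2
# Ri above ~0.25 indicates a dynamically stable layer,
# generally associated with weaker optical turbulence
```

Applied level by level to a reanalysis profile, the same formula yields the layer-by-layer stability against which the seeing variations were compared.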
基金financial support from PetroChina Innovation Foundation。
Abstract: To overcome the fact that multi-well typical-curve analysis of shale gas reservoirs is rarely applied in engineering, this study proposes a robust production data analysis method based on deconvolution for studying multi-well interference. A multi-well conceptual trilinear seepage model for multi-stage fractured horizontal wells was established, and its Laplace-domain solutions under two different outer boundary conditions were obtained. An improved pressure deconvolution algorithm was then used to normalize the scattered production data, and typical-curve fitting was carried out using the production data and the seepage model solution. Finally, reservoir and fracturing parameters were interpreted, and the intensity of inter-well interference was compared. The effectiveness of the method was verified by analyzing the production dynamics of six shale gas wells in the Duvernay area. The results showed that the fit of the typical curves was greatly improved by the mutual constraint between tuning the deconvolution calculation parameters and tuning the seepage model parameters. In addition, using the morphological characteristics of the log-log typical curves and the time corresponding to the intersection point of the curves of the two models under different outer boundary conditions, the strength of interference between wells on the same well platform was reliably judged. This work can provide a reference for optimizing well spacing and hydraulic fracturing measures for shale gas wells.
Abstract: Peanut allergy is a major cause of severe food-induced allergic reactions. A small set of foods, including cow's milk, hen's eggs, soy, wheat, peanuts, tree nuts (walnuts, hazelnuts, almonds, cashews, pecans, and pistachios), fish, and shellfish, is responsible for more than 90% of food allergies. Here, we provide promising insights from a large-scale data-driven analysis comparing the mechanistic features and biological relevance of different ingredients present in peanuts, tree nuts (walnuts, almonds, cashews, pecans, and pistachios), and soybean. Additionally, we analyze the chemical composition of peanuts in different processed forms: raw, boiled, and dry-roasted. Using this data-driven approach, we generate new hypotheses to explain why nuclear receptors such as the peroxisome proliferator-activated receptors (PPARs) and their isoforms, and their interaction with dietary lipids, may have a significant effect on allergic response. The results of this study will direct future experimental and clinical studies on the role of dietary lipids and PPAR isoforms in exerting pro-inflammatory or anti-inflammatory functions on cells of the innate immune system and influencing antigen presentation to cells of the adaptive immune system.
Abstract: The Internet of Multimedia Things (IoMT) refers to a network of interconnected multimedia devices that communicate with each other over the Internet. Recently, smart healthcare has emerged as a significant application of the IoMT, particularly in the context of knowledge-based learning systems. Smart healthcare systems leverage knowledge-based learning to become more context-aware, adaptable, and auditable while maintaining the ability to learn from historical data. In smart healthcare systems, devices capture images such as X-rays and Magnetic Resonance Imaging (MRI) scans. The security and integrity of these images are crucial for the databases used in knowledge-based learning systems to foster structured decision-making and enhance the learning abilities of AI. Moreover, in knowledge-driven systems, the storage and transmission of HD medical images burden the limited bandwidth of the communication channel, leading to data transmission delays. To address these security and latency concerns, this paper presents a lightweight medical image encryption scheme utilising bit-plane decomposition and chaos theory. The experiments yield entropy, energy, and correlation values of 7.999, 0.0156, and 0.0001, respectively. This validates the effectiveness of the proposed encryption system, which offers high-quality encryption, a large key space, key sensitivity, and resistance to statistical attacks.
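A minimal version of bit-plane decomposition combined with a chaotic keystream can be sketched in a few lines. The logistic-map parameters, the plane-permutation key, and the toy 16x16 image below are all illustrative choices, not the scheme proposed in the paper:

```python
import numpy as np

def logistic_keystream(n, x0=0.7, r=3.99):
    """Byte stream from the chaotic logistic map x <- r*x*(1-x)."""
    xs = np.empty(n)
    x = x0
    for i in range(n):
        x = r * x * (1 - x)
        xs[i] = x
    return (xs * 255).astype(np.uint8)

rng = np.random.default_rng(7)
img = rng.integers(0, 256, size=(16, 16), dtype=np.uint8)

# 1) decompose into 8 bit planes and permute the plane order
planes = [(img >> b) & 1 for b in range(8)]
order = [3, 0, 7, 2, 5, 1, 6, 4]  # toy plane-permutation key
scrambled = sum(planes[order[b]] << b for b in range(8)).astype(np.uint8)

# 2) diffuse with the chaotic keystream
cipher = scrambled ^ logistic_keystream(img.size).reshape(img.shape)

# decryption reverses both steps with the same keys
descr = cipher ^ logistic_keystream(img.size).reshape(img.shape)
inv = [order.index(b) for b in range(8)]
planes2 = [(descr >> b) & 1 for b in range(8)]
recovered = sum(planes2[inv[b]] << b for b in range(8)).astype(np.uint8)
```

The chaotic map's sensitivity to `x0` and `r` is what gives such schemes their large key space and key sensitivity; this toy version omits the diffusion rounds a real scheme would add.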
Abstract: Microsoft Excel is essential for the End-User Approach (EUA), offering versatility in data organization, analysis, and visualization, as well as widespread accessibility. It fosters collaboration and informed decision-making across diverse domains. Conversely, Python is indispensable for professional programming due to its versatility, readability, extensive libraries, and robust community support. It enables efficient development, advanced data analysis, data mining, and automation, catering to diverse industries and applications. However, one primary issue when using Microsoft Excel with Python libraries is compatibility and interoperability. While Excel is a widely used tool for data storage and analysis, it may not seamlessly integrate with Python libraries, leading to challenges in reading and writing data, especially in complex or large datasets. Additionally, manipulating Excel files with Python may not always preserve formatting or formulas accurately, potentially affecting data integrity. Moreover, dependency on Excel's graphical user interface (GUI) for automation can limit scalability and reproducibility compared to Python's scripting capabilities. This paper covers an integration solution that empowers non-programmers to leverage Python's capabilities within the familiar Excel environment, enabling users to perform advanced data analysis and automation tasks without extensive programming knowledge. Based on feedback solicited from non-programmers who tested the integration solution, the case study evaluates its ease of implementation, performance, and compatibility with different Excel versions.
基金supported by the Huaihua University Science Foundation under Grant HHUY2019-24.
Abstract: With the rapid development of the Internet, many enterprises have launched their own network platforms. When users browse, search, and click the products on these platforms, most platforms keep records of these network behaviors; the records are often heterogeneous and are called log data. Effectively analyzing and managing these heterogeneous log data lets enterprises grasp the behavioral characteristics of their platform users in time, make targeted recommendations to users, increase product sales, and accelerate enterprise development. We first design the system following the big data pipeline of collection, storage, analysis, and visualization. We then build a Hadoop cluster to process the log data using HDFS storage, Yarn resource management, and Nginx load balancing, and analyze the log data with MapReduce processing and the Hive data warehouse. Finally, the results are displayed visually, yielding a complete log data analysis system. Practice has shown that the system effectively realizes the collection, analysis, and visualization of log data and can accurately support product recommendation by enterprises. The system is stable and effective.
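The Hive/MapReduce analysis stage boils down to a map step that emits key-value pairs from each log line and a reduce step that aggregates them per key. This pure-Python miniature, over a handful of made-up clickstream lines with an assumed `timestamp user action product` schema, mirrors that split:

```python
from collections import Counter
from itertools import chain

# toy clickstream log lines (assumed schema: timestamp user action product)
logs = [
    "2024-05-01T10:00 u1 view  p9",
    "2024-05-01T10:01 u2 click p9",
    "2024-05-01T10:02 u1 click p3",
    "2024-05-01T10:02 u3 view  p9",
    "2024-05-01T10:03 u2 click p9",
]

def map_phase(line):
    """Emit (product, 1) for every click event, mirroring a MapReduce mapper."""
    _, _, action, product = line.split()
    return [(product, 1)] if action == "click" else []

def reduce_phase(pairs):
    """Sum counts per key, mirroring the reducer."""
    counts = Counter()
    for product, n in pairs:
        counts[product] += n
    return counts

clicks = reduce_phase(chain.from_iterable(map_phase(l) for l in logs))
top = clicks.most_common(1)[0]
```

On a real cluster the mapper and reducer run as distributed MapReduce tasks, or the same aggregation is expressed as a Hive `GROUP BY` over the log table.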
Abstract: This research paper compares Excel and the R language for data analysis and concludes that R is more suitable for complex data analysis tasks. R's open-source nature makes it accessible to everyone, and its powerful data management and analysis tools make it suitable for handling complex analysis tasks. It is also highly customizable, allowing users to create custom functions and packages to meet their specific needs. Additionally, R provides high reproducibility, making it easy to replicate and verify research results, and it has excellent collaboration capabilities, enabling multiple users to work on the same project simultaneously. These advantages make R a more suitable choice for complex data analysis tasks, particularly in scientific research and business applications. The findings of this study will help people understand that R is not just a language that can handle more data than Excel, and demonstrate that R is essential to the field of data analysis. At the same time, they will help users and organizations make informed decisions regarding their data analysis needs and software preferences.
基金supported by Faculty of Medicine,Ministry of Education,Cultures,Research and Technology Tanjungpura University(No.3483/UN22.9/PG/2021)。
Abstract: Objective: To explain the use of concept mapping in a study of family members' experiences in taking care of people with cancer. Methods: This study used a phenomenological design. We describe the analytical process of using concept mapping in our phenomenological studies of family members' experiences in taking care of people with cancer. Results: We developed several concept maps that aided us in analyzing the data collected from the interviews. Conclusions: Concept mapping is recommended to researchers who intend to analyze data in any qualitative study, including those using a phenomenological design, because it is a time-efficient way of dealing with large amounts of qualitative data during the analytical process.
Abstract: Gravitational wave detection is one of the most cutting-edge research areas in modern physics, with its success relying on advanced data analysis and signal processing techniques. This study provides a comprehensive review of data analysis methods and signal processing techniques in gravitational wave detection. The research begins by introducing the characteristics of gravitational wave signals and the challenges faced in their detection, such as extremely low signal-to-noise ratios and complex noise backgrounds. It then systematically analyzes the application of time-frequency analysis methods in extracting transient gravitational wave signals, including wavelet transforms and Hilbert-Huang transforms. The study focuses on discussing the crucial role of matched filtering techniques in improving signal detection sensitivity and explores strategies for template bank optimization. Additionally, the research evaluates the potential of machine learning algorithms, especially deep learning networks, in rapidly identifying and classifying gravitational wave events. The study also analyzes the application of Bayesian inference methods in parameter estimation and model selection, as well as their advantages in handling uncertainties. However, the research also points out the challenges faced by current technologies, such as dealing with non-Gaussian noise and improving computational efficiency. To address these issues, the study proposes a hybrid analysis framework combining physical models and data-driven methods. Finally, the research looks ahead to the potential applications of quantum computing in future gravitational wave data analysis. This study provides a comprehensive theoretical foundation for the optimization and innovation of gravitational wave data analysis methods, contributing to the advancement of gravitational wave astronomy.
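Matched filtering, the workhorse technique discussed here, correlates the data stream against a waveform template and looks for a peak in the output. A toy numpy version with a made-up chirp template buried in Gaussian noise (not a physical inspiral waveform or real detector noise) looks like this:

```python
import numpy as np

rng = np.random.default_rng(42)
fs = 1024                     # sample rate (Hz), assumed
t = np.arange(0, 1.0, 1 / fs)

# toy chirp template with frequency sweeping upward, normalized to unit energy
template = np.sin(2 * np.pi * (50 * t + 40 * t ** 2))
template /= np.linalg.norm(template)

# bury the signal in white Gaussian noise at a known offset
offset = 300
data = rng.normal(scale=1.0, size=2 * fs)
data[offset:offset + template.size] += 8.0 * template

# matched filter output: correlation of the data with the template
snr = np.correlate(data, template, mode="valid")
peak = int(np.argmax(np.abs(snr)))
```

Real searches additionally whiten the data by the detector's noise power spectral density and scan a template bank across component masses and spins, which is where the template-bank optimization strategies reviewed above come in.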
Abstract: The connectivity of sandbodies is a key constraint on exploration effectiveness in the Bohai A Oilfield. Conventional connectivity studies often use methods such as seismic attribute fusion, but the development of contiguous composite sandbodies in this area makes it challenging to characterize connectivity changes with conventional seismic attributes. To address this problem, this study proposes a big data analysis method based on the Deep Forest algorithm to predict sandbody connectivity. First, by compiling the abundant exploration and development sandbody data in the study area, typical sandbodies with reliable connectivity were selected. Then, sensitive seismic attributes were extracted to obtain training samples. Finally, based on the Deep Forest algorithm, a mapping model between attribute combinations and sandbody connectivity was established through machine learning. This method achieves the first quantitative determination of connectivity for contiguous composite sandbodies in the Bohai Oilfield. Compared with conventional connectivity discrimination methods such as high-resolution processing and seismic attribute analysis, it can incorporate the sandbody characteristics of the study area into the machine learning process and judge connectivity jointly from multiple seismic attributes. The results show that the method has high accuracy and timeliness in predicting connectivity for contiguous composite sandbodies. Applied to the Bohai A Oilfield, it successfully identified multiple sandbody connectivity relationships and provided strong support for subsequent exploration potential assessment and well placement optimization. The method also offers a new approach for studying sandbody connectivity under similar complex geological conditions.
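The cascade idea behind Deep Forest (gcForest) can be sketched with scikit-learn: each level trains forests, and the next level sees the original attributes augmented with the previous level's out-of-fold class probabilities. The synthetic "seismic attribute" features and connectivity labels below are placeholders, and this two-level cascade is a simplification of the full algorithm:

```python
import numpy as np
from sklearn.ensemble import ExtraTreesClassifier, RandomForestClassifier
from sklearn.model_selection import cross_val_predict

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 6))                  # stand-in seismic attributes
y = (X[:, 0] + 0.5 * X[:, 1] > 0).astype(int)  # stand-in connectivity labels

features = X
for level in range(2):
    probas = []
    for Forest in (RandomForestClassifier, ExtraTreesClassifier):
        clf = Forest(n_estimators=50, random_state=0)
        # out-of-fold class probabilities, as in the cascade-forest scheme
        probas.append(cross_val_predict(clf, features, y, cv=3,
                                        method="predict_proba"))
    # next level sees the raw attributes plus the previous level's outputs
    features = np.hstack([X] + probas)

final = RandomForestClassifier(n_estimators=100, random_state=0).fit(features, y)
train_acc = final.score(features, y)
```

In the full algorithm the cascade depth is chosen adaptively by validation performance, and a multi-grained scanning stage can precede the cascade; both are omitted here for brevity.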
Abstract: The advent of the big data era has made data visualization a crucial tool for enhancing the efficiency and insights of data analysis. This theoretical research delves into the current applications and potential future trends of data visualization in big data analysis. The article first systematically reviews the theoretical foundations and technological evolution of data visualization, and thoroughly analyzes the challenges faced by visualization in the big data environment, such as massive data processing, real-time visualization requirements, and multi-dimensional data display. Through extensive literature research, it explores innovative application cases and theoretical models of data visualization in multiple fields including business intelligence, scientific research, and public decision-making. The study reveals that interactive visualization, real-time visualization, and immersive visualization technologies may become the main directions for future development and analyzes the potential of these technologies in enhancing user experience and data comprehension. The paper also delves into the theoretical potential of artificial intelligence technology in enhancing data visualization capabilities, such as automated chart generation, intelligent recommendation of visualization schemes, and adaptive visualization interfaces. The research also focuses on the role of data visualization in promoting interdisciplinary collaboration and data democratization. Finally, the paper proposes theoretical suggestions for promoting data visualization technology innovation and application popularization, including strengthening visualization literacy education, developing standardized visualization frameworks, and promoting open-source sharing of visualization tools. This study provides a comprehensive theoretical perspective for understanding the importance of data visualization in the big data era and its future development directions.
Abstract: In the data envelopment analysis (DEA) literature, productivity change captured by the Malmquist productivity index has been somewhat neglected, especially in terms of a deterministic environment and stochastic variability in inputs and outputs. Therefore, this study developed a firm-specific, DEA-based Malmquist index model to examine the efficiency and productivity change of banks in a stochastic environment. First, to estimate bank-specific efficiency, we employed a two-stage double bootstrap DEA procedure: in the first stage, the technical efficiency scores of banks were calculated by the classic DEA model, while in the second stage, the double bootstrap DEA model was applied to determine the effect of contextual variables on bank efficiency. Second, we applied a two-stage procedure for measuring productivity change, in which the first stage estimated stochastic technical efficiency and the second stage regressed the estimated efficiency scores on a set of explanatory variables that influence relative performance. Finally, an empirical investigation of the Iranian banking sector, consisting of 120 bank-year observations of 15 banks from 2014 to 2021, was performed to measure efficiency and productivity change. Based on the findings, the explanatory variables (i.e., the nonperforming loan ratio and the number of branches) showed an inverse relationship with stochastic technical efficiency and productivity change, implying that optimizing these factors is important for improving bank efficiency and productivity.
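The first-stage technical efficiency scores come from a standard DEA linear program. A minimal input-oriented CCR model in multiplier form can be solved with scipy's linprog; the three toy "banks" below are invented, and this classic model omits the bootstrap and Malmquist machinery of the study:

```python
import numpy as np
from scipy.optimize import linprog

def ccr_efficiency(X, Y, k):
    """Input-oriented CCR efficiency of DMU k (multiplier form):
    maximize u.y_k subject to v.x_k = 1 and u.y_j - v.x_j <= 0 for all j."""
    n, m = X.shape
    s = Y.shape[1]
    c = np.concatenate([-Y[k], np.zeros(m)])        # minimize -u.y_k
    A_eq = [np.concatenate([np.zeros(s), X[k]])]    # normalization v.x_k = 1
    A_ub = np.hstack([Y, -X])                       # u.y_j - v.x_j <= 0
    res = linprog(c, A_ub=A_ub, b_ub=np.zeros(n),
                  A_eq=A_eq, b_eq=[1.0],
                  bounds=[(0, None)] * (s + m))
    return -res.fun

# toy bank data: inputs (staff, branches), output (loans); values invented
X = np.array([[20.0, 5.0], [40.0, 12.0], [30.0, 6.0]])
Y = np.array([[100.0], [150.0], [160.0]])
effs = [ccr_efficiency(X, Y, k) for k in range(len(X))]
```

The Malmquist index is then built by evaluating such efficiency scores for each bank against the frontiers of two adjacent periods and combining the four resulting distance ratios.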
基金This project supported by the National High-Tech Research and Development Plan (863-804-3)
Abstract: An idea is presented for the development of a data processing and analysis system for ICF experiments based on an object-oriented framework. The design and preliminary implementation of the data processing and analysis framework, based on the ROOT system, have been completed. Software for unfolding soft X-ray spectra has been developed to test the functions of this framework.