Journal Articles
1,053 articles found
1. Turbo Message Passing Based Burst Interference Cancellation for Data Detection in Massive MIMO-OFDM Systems
Authors: Wenjun Jiang, Zhihao Ou, Xiaojun Yuan, Li Wang. China Communications (SCIE, CSCD), 2024, Issue 2, pp. 143-154
This paper investigates the fundamental data detection problem with burst interference in massive multiple-input multiple-output orthogonal frequency division multiplexing (MIMO-OFDM) systems. In particular, burst interference may occur only on data symbols but not on pilot symbols, which means that interference information cannot be premeasured. To cancel the burst interference, we first revisit the uplink multi-user system and develop a matrix-form system model, where the covariance pattern and the low-rank property of the interference matrix are discussed. Then, we propose a turbo message passing based burst interference cancellation (TMP-BIC) algorithm to solve the data detection problem, where the constellation information of target data is fully exploited to refine its estimate. Furthermore, in the TMP-BIC algorithm, we design one module to cope with the interference matrix by exploiting its low-rank property. Numerical results demonstrate that the proposed algorithm can effectively mitigate the adverse effects of burst interference and approach the interference-free bound.
Keywords: burst interference cancellation, data detection, massive multiple-input multiple-output (MIMO), message passing, orthogonal frequency division multiplexing (OFDM)
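The refinement step that "fully exploits the constellation information" is, in turbo detectors of this kind, typically an MMSE denoiser returning the posterior mean of a discrete symbol given a noisy pseudo-observation. A minimal sketch of that idea (not the paper's TMP-BIC code; the QPSK constellation and the noise model are illustrative assumptions):

```python
import math

# Hedged sketch: posterior-mean (MMSE) denoising of a QPSK symbol given
# obs = symbol + noise with per-component variance noise_var. This is the
# generic constellation-aware refinement step, not the paper's exact module.

QPSK = [(1, 1), (1, -1), (-1, 1), (-1, -1)]  # unnormalized QPSK points

def posterior_mean(obs, noise_var):
    """Posterior mean of a QPSK symbol under a Gaussian noise model."""
    weights = []
    for (re, im) in QPSK:
        d2 = (obs[0] - re) ** 2 + (obs[1] - im) ** 2
        weights.append(math.exp(-d2 / noise_var))
    z = sum(weights)
    est_re = sum(w * s[0] for w, s in zip(weights, QPSK)) / z
    est_im = sum(w * s[1] for w, s in zip(weights, QPSK)) / z
    return (est_re, est_im)

# At low noise the estimate snaps to the nearest constellation point;
# at high noise it shrinks toward the constellation mean.
print(posterior_mean((0.9, 1.2), 0.1))
```

In a turbo loop this denoiser output (and its variance) would be fed back as the extrinsic message for the next iteration.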
2. A study on fast post-processing massive data of casting numerical simulation on personal computers (cited 1 time)
Authors: Chen Tao, Liao Dunming, Pang Shenyong, Zhou Jianxin. China Foundry (SCIE, CAS), 2013, Issue 5, pp. 321-324
When castings become complicated and the demands for precision of numerical simulation become higher, the numerical data of casting simulation become more massive. On a general personal computer, these massive numerical data may exceed the capacity of available memory, resulting in failure of rendering. Based on the out-of-core technique, this paper proposes a method to effectively utilize external storage and reduce memory usage dramatically, so as to solve the problem of insufficient memory for massive data rendering on general personal computers. Based on this method, a new post-processor is developed. It can illustrate the filling and solidification processes of casting, as well as thermal stress, and provides fast interaction with simulation results. Theoretical analysis and several practical examples prove that the memory usage and loading time of the post-processor depend not on the size of the relevant files but on the proportion of cells on the surface. Meanwhile, rendering and mouse-driven value picking are fast enough to satisfy the demands of real-time interaction.
Keywords: casting numerical simulation, massive data, fast post-processing
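The out-of-core idea the abstract describes can be sketched as streaming the result file in fixed-size chunks, so peak memory depends on the chunk size rather than the file size. This toy version assumes a flat little-endian float32 field file, which is not necessarily the post-processor's actual format:

```python
import os
import struct
import tempfile

# Hedged sketch of out-of-core processing (not the paper's post-processor):
# compute a field statistic by reading bounded-size chunks from disk.

def write_field(path, values):
    """Write a field as consecutive little-endian float32 values."""
    with open(path, "wb") as f:
        for v in values:
            f.write(struct.pack("<f", v))

def field_range(path, chunk_floats=1024):
    """Min/max of a float32 field, one chunk in memory at a time."""
    lo, hi = float("inf"), float("-inf")
    with open(path, "rb") as f:
        while True:
            buf = f.read(4 * chunk_floats)
            if not buf:
                break
            vals = struct.unpack("<%df" % (len(buf) // 4), buf)
            lo, hi = min(lo, min(vals)), max(hi, max(vals))
    return lo, hi

path = os.path.join(tempfile.mkdtemp(), "temperature.bin")
write_field(path, [20.0 + 0.001 * i for i in range(10000)])
lo, hi = field_range(path)
print(lo, hi)
```

The same chunked pattern applies to rendering: only the cells needed for the current view are pulled in, which is why memory cost can track surface-cell count rather than total file size.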
3. Research on data load balancing technology of massive storage systems for wearable devices (cited 1 time)
Authors: Shujun Liang, Jing Cheng, Jianwei Zhang. Digital Communications and Networks (SCIE, CSCD), 2022, Issue 2, pp. 143-149
Because the amount of information in current wearable devices keeps growing while memory is limited, the processing capacity of the servers in the storage system cannot keep up with the speed of information growth, resulting in poor load balance, long load-balancing times and data-processing delays. Therefore, this paper applies a data load balancing technology to the massive storage systems of wearable devices. We first analyze the object-oriented load balancing method and formally describe the dynamic load balancing issue, treating load balancing as a mapping problem. Tasks are then assigned to each data node according to the actual processing capacity of that node, and different data are allocated to the corresponding storage nodes to compute each node's comprehensive weight. Using the load information that the scheduler collects from each storage node, the load weight of the current node is calculated and requests are distributed accordingly, realizing data load balancing for the massive storage system of wearable devices. The experimental results show that the average load-balancing time of this method is 1.75 h, much lower than that of traditional methods, and that the technology offers short balancing time, a high degree of load balance, strong data-processing capability and short processing time.
Keywords: wearable device, massive data, data storage system, load balancing, weight
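The comprehensive-weight dispatch described above can be sketched roughly as follows; the weight formula, the coefficients and the metric names are assumptions for illustration, not the paper's exact model:

```python
# Hedged sketch of weighted dispatch: each storage node gets a comprehensive
# weight combining its capacity with its spare CPU/memory/IO headroom, and a
# request goes to the node with the largest weight.

def comprehensive_weight(node, a=0.4, b=0.3, c=0.3):
    """Higher weight = more spare weighted capacity; loads are in [0, 1]."""
    spare = a * (1 - node["cpu"]) + b * (1 - node["mem"]) + c * (1 - node["io"])
    return node["capacity"] * spare

def pick_node(nodes):
    return max(nodes, key=comprehensive_weight)

nodes = [
    {"name": "n1", "capacity": 1.0, "cpu": 0.9, "mem": 0.8, "io": 0.7},  # busy
    {"name": "n2", "capacity": 1.0, "cpu": 0.2, "mem": 0.3, "io": 0.4},  # idle
    {"name": "n3", "capacity": 0.5, "cpu": 0.1, "mem": 0.1, "io": 0.1},  # small
]
print(pick_node(nodes)["name"])
```

A real scheduler would refresh the load metrics periodically and add hysteresis so requests do not thrash between nodes.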
4. Parallelized User Clicks Recognition from Massive HTTP Data Based on Dependency Graph Model (cited 1 time)
Authors: FANG Cheng, LIU Jun, LEI Zhenming. China Communications (SCIE, CSCD), 2014, Issue 12, pp. 13-25
With increasingly complex website structures and continuously advancing web technologies, accurate user click recognition from massive HTTP data, which is critical for web usage mining, becomes more difficult. In this paper, we propose a dependency graph model to describe the relationships between web requests. Based on this model, we design and implement a heuristic parallel algorithm to distinguish user clicks with the assistance of cloud computing technology. We evaluate the proposed algorithm with real massive data. The dataset, collected from a mobile core network, is 228.7 GB and covers more than three million users. The experiment results demonstrate that the proposed algorithm achieves higher accuracy than previous methods.
Keywords: cloud computing, massive data, graph model, web usage mining
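A much-simplified, single-machine sketch of the dependency-graph idea (the paper's parallel heuristic is more elaborate): each request is linked to the request appearing in its Referer header, and requests that start a dependency chain, or that other requests refer back to, are treated as user clicks rather than embedded objects:

```python
# Hedged sketch: classify requests as user clicks vs. embedded resources
# using only Referer dependencies. Log format is an assumption for the demo.

def user_clicks(requests):
    """requests: list of (url, referer) in time order; return likely clicks.

    A request counts as a click when it has no Referer (a typed-in or
    bookmarked page) or when its URL is the Referer of other requests
    (i.e., it is a page that embedded objects were fetched for).
    """
    referers = {ref for (_, ref) in requests if ref}
    return [url for (url, ref) in requests if ref is None or url in referers]

log = [
    ("http://site/a.html", None),                   # click: entry page
    ("http://site/a.css", "http://site/a.html"),    # embedded object
    ("http://site/a.js", "http://site/a.html"),     # embedded object
    ("http://site/b.html", "http://site/a.html"),   # click: followed link
    ("http://site/b.png", "http://site/b.html"),    # embedded object
]
print(user_clicks(log))
```

The parallel version in the paper would partition the log (e.g., per user) so each dependency graph can be built independently on a different worker.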
5. Study on Massive Vegetation Data Processing of FY-3 Based on RAM (h)
Authors: Manyun Lin, Xiangang Zhao, Cunqun Fan, Lizi Xie, Lan Wei. Journal of Geoscience and Environment Protection, 2017, Issue 4, pp. 75-83
The vegetation data of the Fengyun meteorological satellite are segmented according to latitude and longitude and can be written into 648 blocks. However, vegetation data processing efficiency is low because the data volume is massive. This paper presents a RAM (h) based processing method for Fengyun-3 vegetation data. First, we introduce a locality-aware model to segment the input data, then locate the data by geographic position, and finally fuse the independent data based on geographical location. Experimental results show that the proposed method effectively improves data processing efficiency.
Keywords: meteorological satellite, vegetation data, RAM (h), massive data processing
6. Hierarchical Visualized Multi-level Information Fusion for Big Data of Digital Image
Authors: LI Lan, LIN Guoliang, ZHANG Yun, DU Jia. Journal of Donghua University (English Edition) (EI, CAS), 2020, Issue 3, pp. 238-244
At present, digital image information fusion suffers from low data-cleaning accuracy and frequent omission of repeated data, resulting in poor fusion. In this regard, a visualized multi-component information fusion method for big data based on radar maps is proposed in this paper. The data model of the perceptual digital image is constructed using linear regression analysis. The ID tags of the collected image data, used as Transaction Identifications (TIDs), are compared; if two records share the same TID, duplicate detection is carried out. After testing, the dataset is processed repeatedly following this procedure to improve data-cleaning precision and reduce omissions. Based on the radar maps, hierarchical visualization of the processed multi-level information fusion is realized. Experiments show that the method can clean redundant data accurately and achieve efficient fusion of multi-level information in digital image big data.
Keywords: digital image, big data, multi-level information fusion
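The TID comparison step can be sketched as repeated deduplication passes keyed on the Transaction Identification; the record layout here is an assumption, not the paper's schema:

```python
# Hedged sketch of TID-based cleaning: records sharing a TID are treated as
# duplicates, and the cleaning pass is re-run to reduce omissions, as the
# abstract suggests.

def clean_by_tid(records, passes=2):
    for _ in range(passes):
        seen, kept = set(), []
        for rec in records:
            if rec["tid"] not in seen:
                seen.add(rec["tid"])
                kept.append(rec)  # keep first occurrence only
        records = kept
    return records

data = [{"tid": 1, "v": "a"}, {"tid": 2, "v": "b"}, {"tid": 1, "v": "a'"}]
print([r["tid"] for r in clean_by_tid(data)])
```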
7. Research on data-mining-based optimization of 5G Massive MIMO antenna weights (cited 4 times)
Authors: 田原, 张亚男, 贾磊, 李连本. 《电信工程技术与标准化》, 2021, Issue 11, pp. 81-86
Based on 4G/5G data mining and analysis, this paper presents an intelligent optimization method for 5G Massive MIMO antenna weights under NSA networking. The method combines 4G MDT and 5G MR data and uses clustering and beam-shaping analysis to obtain the ideal weight set for the cells to be optimized, so that the optimal weight combination can be found quickly among a massive number of weight factors. A risk-controlled adjustment algorithm then realizes intelligent, automated, iterative optimization of the Massive MIMO antenna weights.
Keywords: 4G/5G coordination, massive MIMO, antenna weights, data mining
8. UDP-based high-speed transmission design for massive infrared focal-plane detector data
Authors: 陈雅轩, 陈仁, 白伟. 《电工技术》, 2025, Issue 2, pp. 181-184
Infrared detection systems are increasingly adopting focal-plane detectors with larger arrays and higher frame rates, so system data volume is growing exponentially. To handle this, a 10-Gigabit Ethernet high-speed data transmission scheme was designed with an FPGA as the main control chip. The data transmission link uses the UDP protocol for high-speed transfer, with the transmission modules communicating over the AXI4-Stream protocol. The scheme was tested and verified in an infrared detection system: the data transmission rate reaches 9.7 Gbps, meeting the system's requirements, and large-volume transfers completed without data loss or errors, demonstrating the scheme's reliability and its practical engineering value.
Keywords: infrared focal-plane detector, high-speed transmission, UDP protocol, massive data
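The datagram transport such a link relies on can be illustrated with a loopback UDP sender/receiver pair (a host-side Python sketch, nothing like the FPGA implementation; the frame size is an arbitrary demo choice):

```python
import socket

# Hedged sketch of UDP frame transport: one bound receiver socket, one sender,
# one datagram carrying a fixed-size "detector frame" over loopback.

recv = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
recv.bind(("127.0.0.1", 0))        # let the OS pick a free port
addr = recv.getsockname()

send = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
frame = bytes(range(256)) * 4      # one 1024-byte frame
send.sendto(frame, addr)

data, _ = recv.recvfrom(2048)
print(len(data), data == frame)
send.close()
recv.close()
```

Unlike TCP, UDP gives no delivery guarantee, which is why a design like the one above must verify on its own that no frames are dropped or corrupted at 9.7 Gbps.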
9. Intelligent applications and scientific challenges of Sentinel-1 global massive wave-mode SAR data (cited 1 time)
Authors: 王臣, 李晓明, 李慧敏, 訾楠楠, 胡清清. 《海洋与湖沼》 (北大核心), 2025, Issue 1, pp. 25-41
Synthetic aperture radar (SAR) is one of the key sensors for satellite remote sensing of the Earth, and with continued progress in core technologies its role in ocean science has grown markedly. In particular, the European Space Agency's Sentinel-1 (S-1) wave-mode program of continuous observation over the open ocean has brought new opportunities and challenges for global SAR ocean research. Deep convolutional neural network classification models have been developed for this massive wave-mode SAR dataset, and the classification results have been used for preliminary diagnosis of scientific questions such as rainfall and the marine atmospheric boundary layer, but the huge potential of these global ocean SAR data remains to be tapped, especially given that S-1 has been in operational service for nearly 10 years and will continue for the next 30. Unlike traditional case or regional analyses, processing and analyzing massive global ocean SAR data has its own characteristics and often requires artificial intelligence methods and large-model techniques. This paper reviews the authors' team's work in this direction over the past few years, including global SAR sea-surface dynamic parameter retrieval, observation of common marine atmospheric boundary layer phenomena, diagnosis of upper-ocean dynamic processes, and monitoring of polar sea ice and icebergs. It highlights typical application scenarios of global ocean SAR data, discusses their potential for addressing key ocean-atmosphere scientific questions, and summarizes the knowledge transfer and scientific service capabilities of ocean SAR data, providing a basis for building a sustained SAR observation system in support of the maritime power strategy and directional guidance for the development and deployment of China's future SAR satellites.
Keywords: synthetic aperture radar (SAR), microwave ocean remote sensing, ocean-atmosphere phenomena, massive data, scientific applications, machine learning
10. Optimal decorrelated score subsampling for generalized linear models with massive data (cited 1 time)
Authors: Junzhuo Gao, Lei Wang, Heng Lian. Science China Mathematics (SCIE, CSCD), 2024, Issue 2, pp. 405-430
In this paper, we consider the unified optimal subsampling estimation and inference on the low-dimensional parameter of main interest in the presence of the nuisance parameter for low/high-dimensional generalized linear models (GLMs) with massive data. We first present a general subsampling decorrelated score function to reduce the influence of the less accurate nuisance parameter estimation with the slow convergence rate. The consistency and asymptotic normality of the resultant subsample estimator from a general decorrelated score subsampling algorithm are established, and two optimal subsampling probabilities are derived under the A- and L-optimality criteria to downsize the data volume and reduce the computational burden. The proposed optimal subsampling probabilities provably improve the asymptotic efficiency of the subsampling schemes in the low-dimensional GLMs and perform better than the uniform subsampling scheme in the high-dimensional GLMs. A two-step algorithm is further proposed for implementation, and the asymptotic properties of the corresponding estimators are also given. Simulations show satisfactory performance of the proposed estimators, and two applications to census income and Fashion-MNIST datasets also demonstrate its practical applicability.
Keywords: A-optimality, decorrelated score subsampling, high-dimensional inference, L-optimality, massive data
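A toy flavor of optimality-based subsampling for logistic regression (not the paper's decorrelated-score construction): sampling probabilities proportional to |y_i - p_i| · ||x_i||, an L-optimality-style score computed from a pilot estimate, which up-weights informative observations relative to uniform sampling:

```python
import math
import random

# Hedged sketch: L-optimality-style subsampling probabilities for logistic
# regression, built from a pilot coefficient vector. This is the generic
# recipe, not the paper's decorrelated-score version.

def subsample_probs(X, y, beta_pilot):
    scores = []
    for xi, yi in zip(X, y):
        eta = sum(b * v for b, v in zip(beta_pilot, xi))
        p = 1.0 / (1.0 + math.exp(-eta))          # pilot fitted probability
        norm = math.sqrt(sum(v * v for v in xi))  # ||x_i||
        scores.append(abs(yi - p) * norm)         # residual-times-leverage score
    total = sum(scores)
    return [s / total for s in scores]

random.seed(0)
X = [[1.0, random.gauss(0, 1)] for _ in range(5)]  # intercept + one covariate
y = [1, 0, 1, 1, 0]
probs = subsample_probs(X, y, beta_pilot=[0.0, 1.0])
print(probs)
```

One would then draw a subsample with these probabilities and fit a weighted (inverse-probability) GLM on it.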
11. Adaptive Distributed Inference for Multi-source Massive Heterogeneous Data
Authors: Xin YANG, Qi Jing YAN, Mi Xia WU. Acta Mathematica Sinica, English Series (SCIE, CSCD), 2024, Issue 11, pp. 2751-2770
In this paper, we consider distributed inference for heterogeneous linear models with massive datasets. Noting that heterogeneity may exist not only in the expectations of the subpopulations but also in their variances, we propose the heteroscedasticity-adaptive distributed aggregation (HADA) estimation, which is shown to be communication-efficient and asymptotically optimal regardless of homoscedasticity or heteroscedasticity. Furthermore, a distributed test for parameter heterogeneity across subpopulations is constructed based on the HADA estimator. The finite-sample performance of the proposed methods is evaluated using simulation studies and the NYC flight data.
Keywords: distributed estimation, heterogeneity, Levene's test, massive heterogeneous data
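Variance-adaptive aggregation in the spirit of HADA can be sketched as inverse-variance weighting of per-machine estimates (the paper's exact weighting differs; this is the textbook heteroscedasticity-aware combiner, and it only needs O(1) numbers communicated per machine):

```python
# Hedged sketch: combine local estimates by inverse-variance weighting, so
# noisy (heteroscedastic) subpopulations are down-weighted automatically.

def aggregate(locals_):
    """locals_: list of (estimate, variance_of_estimate), one per machine."""
    wsum = sum(1.0 / var for _, var in locals_)
    return sum(est / var for est, var in locals_) / wsum

# Two precise machines near 2.1 and one very noisy machine at 3.0:
machines = [(2.0, 0.1), (2.2, 0.1), (3.0, 10.0)]
print(aggregate(machines))
```

A naive unweighted average would be pulled to 2.4 by the noisy machine; the weighted combiner stays near the precise estimates.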
12. HSIT: A distributed similarity-query index for massive data
Authors: 姚回, 刘文. 《计算机与数字工程》, 2025, Issue 3, pp. 718-724
Similarity queries are widely used in information retrieval, biology, network security and other fields to analyze relationships between data. Traditional methods compare the query point against every record in the database, so the computation grows linearly with data volume. To improve the efficiency of distributed similarity queries over massive data, this paper proposes HSIT (HBase Similarity Index Tree), an HBase-based similarity-query index structure. The algorithm builds the similarity index tree dynamically as data are stored, and HSIT places data partitioned by a similarity threshold in adjacent HBase regions; at query time, the query node uses HSIT to retrieve the similar regions quickly. The index supports efficient pruning, so only similar regions need pairwise computation. In experiments running similarity queries on datasets growing exponentially from 20,000 to 1,280,000 records, HSIT improves efficiency compared with the DSCS-LTS algorithm.
Keywords: massive data, distributed, similarity query, index, HBase
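The region-pruning idea behind an HSIT-style index can be sketched with pivots and the triangle inequality (a simplification; the actual HBase row-key layout and tree structure are not modeled):

```python
# Hedged sketch: similar records live in the same region; each region keeps a
# pivot and a radius, and the triangle inequality lets a query skip whole
# regions that cannot contain any match within the threshold.

def dist(a, b):
    return abs(a - b)  # 1-D data for the demo

def query(regions, q, threshold):
    """regions: list of (pivot, radius, points); returns (hits, points_scanned)."""
    hits, scanned = [], 0
    for pivot, radius, points in regions:
        # For any p in the region: dist(q, p) >= dist(q, pivot) - radius,
        # so if that lower bound exceeds the threshold, prune the region.
        if dist(q, pivot) - radius > threshold:
            continue
        for p in points:
            scanned += 1
            if dist(q, p) <= threshold:
                hits.append(p)
    return hits, scanned

regions = [
    (10.0, 2.0, [8.5, 10.0, 11.9]),
    (50.0, 3.0, [48.0, 50.5, 52.9]),
]
print(query(regions, 11.0, 1.5))
```

Only the first region is scanned; the second is eliminated without touching its points, which is the source of the sub-linear query cost.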
13. Cloud storage technology for managing massive data in archive digitization
Author: 蒙勤. 《计算机应用文摘》, 2025, Issue 6, pp. 105-107
With continuous progress in information technology, cloud storage has become an important tool for handling massive data and digitizing archives. In archive digitization, the storage, processing, security and scalability of massive data are key challenges. With efficient resource management, flexible scaling and strong data-processing capability, cloud storage can address these challenges effectively. Storing archive data in the cloud greatly reduces the local hardware burden, improves the utilization of storage resources, and lowers the risk of data loss.
Keywords: cloud storage technology, archive digitization, massive data management, data security, scalability
14. Design and research of a massive multi-source remote sensing image data service system
Authors: 权西瑞, 王凯, 王博, 王小飞. 《测绘与空间地理信息》, 2025, Issue 2, pp. 70-72, 76
To address the problems in traditional sharing and distribution of remote sensing image data, and considering the storage management characteristics and distribution modes of massive multi-source remote sensing imagery, this paper analyzes the application requirements for remote sensing image data and the business functional requirements of the system. Based on fast image retrieval and automatic image metadata modeling, it builds the overall architecture of a massive multi-source remote sensing image data service system, proposes a technical route for image storage management and data distribution, and designs functions including image storage, query and display, image application, image approval, data distribution, and statistical analysis. The system realizes scientific, efficient whole-process management of massive multi-source remote sensing imagery and improves the efficiency and granularity of sharing and distributing remote sensing image data.
Keywords: remote sensing imagery, massive data, distributed storage, image distribution
15. Miniaturizing complex artificial intelligence models for massive power grid data
Authors: 英晓勇, 阙波, 曹刚, 李永欢. 《自动化与仪表》, 2025, Issue 4, pp. 157-161
To better process massive power grid data, this paper proposes a method for miniaturizing complex artificial intelligence models for such data. A miniaturized AI model is built in which standard convolutions are replaced with depthwise separable convolutions, reducing the parameters and computation of the deep learning model and miniaturizing its main structure. Miniaturized residual attention modules and miniaturized multi-scale feature fusion modules are introduced to improve the network's extraction of valuable features from massive grid data, with multi-scale feature fusion improving analysis precision. Experimental results show that the method reduces the parameter count by more than about 85% and the model size by about 86%, and that it significantly improves real-time performance and reduces memory use and energy consumption without degrading the model's grid anomaly detection accuracy.
Keywords: massive power grid data, artificial intelligence model, deep learning, miniaturization, depthwise convolution, attention mechanism
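The roughly 85% parameter saving quoted above is consistent with simple arithmetic for depthwise separable convolution; the layer sizes here are illustrative, not the paper's architecture:

```python
# Parameter count: a standard k x k convolution with c_in input and c_out
# output channels, versus its depthwise separable replacement
# (depthwise k x k per channel + pointwise 1 x 1 mixing). Biases ignored.

def standard_params(k, c_in, c_out):
    return k * k * c_in * c_out

def separable_params(k, c_in, c_out):
    return k * k * c_in + c_in * c_out  # depthwise + pointwise

k, c_in, c_out = 3, 256, 256          # an illustrative mid-network layer
std = standard_params(k, c_in, c_out)
sep = separable_params(k, c_in, c_out)
print(std, sep, 1 - sep / std)        # reduction is close to 0.89 here
```

The saving factor is roughly 1/c_out + 1/k², so for wide layers it approaches 1/k² (about 89% for 3x3 kernels), in line with the ~85-86% figures the abstract reports for the whole model.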
16. The Interdisciplinary Research of Big Data and Wireless Channel: A Cluster-Nuclei Based Channel Model (cited 23 times)
Author: Jianhua Zhang. China Communications (SCIE, CSCD), 2016, Issue S2, pp. 14-26
Recently, the internet has stimulated explosive progress in knowledge discovery from big-volume data resources, digging out valuable hidden rules by computing. Simultaneously, wireless channel measurement data also show a big-volume character, considering massive antennas, huge bandwidth and versatile application scenarios. This article first presents a comprehensive survey of channel measurement and modeling research for mobile communication, especially for the 5th Generation (5G) and beyond. Considering the progress of big data research, a cluster-nuclei based model is then proposed, which takes advantage of both stochastic and deterministic models. The novel model has low complexity thanks to the limited number of cluster nuclei, while the cluster nuclei have a physical mapping to real propagation objects. Combining the channel property variation principles with the antenna size, frequency, mobility and scenario information dug from the channel data, the proposed model can be extended to versatile applications to support future mobile research.
Keywords: channel model, big data, 5G, massive MIMO, machine learning, cluster
17. Loose architecture of multi-level massive geospatial data based on virtual quadtree (cited 4 times)
Authors: YANG ChongJun, WU Sheng, REN YingChao, FU Li, ZHANG FuQing, WANG Gang, TAN Jian, LIU DongLin, MA ChaoJi, LIANG Li (State Key Laboratory of Remote Sensing Science, jointly sponsored by the Institute of Remote Sensing Applications, Chinese Academy of Sciences, and Beijing Normal University, Beijing 100101, China). Science China (Technological Sciences) (SCIE, EI, CAS), 2008, Issue S1, pp. 114-123
This paper proposes a virtual quadtree (VQT) based loose architecture of multi-level massive geospatial data for integrating massive geospatial data dispersed in departments of different hierarchies in the same sector into a unified GIS (Geographic Information System) platform. By virtualizing the nodes of the quadtree, the VQT separates the structure of data organization from data storage and screens the difference between data stored on the local computer and on remote computers in a network environment. By mounting, the VQT easily integrates data from remote computers into the local VQT, implementing seamless integration of distributed multi-level massive geospatial data. Based on this mode, the paper built an application system with over 1,200 GB of geospatial data distributed across 12 servers deployed in 12 cities. The experiment showed that all data could be browsed seamlessly and rapidly, with smooth zooming in and out.
Keywords: massive geospatial data, virtual quadtree, loose architecture
18. Design and development of real-time query platform for big data based on Hadoop (cited 1 time)
Authors: 刘小利, Xu Pandeng, Liu Mingliang, Zhu Guobin. High Technology Letters (EI, CAS), 2015, Issue 2, pp. 231-238
This paper designs and develops a framework on a distributed computing platform for massive multi-source spatial data using a column-oriented database (HBase). The platform consists of four layers: an ETL (extraction, transformation, loading) tier, a data processing tier, a data storage tier and a data display tier, achieving long-term storage, real-time analysis and querying of massive data. Finally, a real dataset cluster is simulated, made up of 39 nodes including 2 master nodes and 37 data nodes, on which function tests of the data importing and real-time query modules are performed, along with performance tests of HDFS I/O, the MapReduce cluster, and batch loading and real-time querying of massive data. The test results indicate that the platform achieves high performance in terms of response time and linear scalability.
Keywords: big data, massive data storage, real-time query, Hadoop, distributed computing
19. Managing Computing Infrastructure for IoT Data (cited 1 time)
Authors: Sapna Tyagi, Ashraf Darwish, Mohammad Yahiya Khan. Advances in Internet of Things, 2014, Issue 3, pp. 29-35
Digital data have become a torrent engulfing every area of business, science and engineering, gushing into every economy, every organization and every user of digital technology. In the age of big data, deriving value and insight from big data using rich analytics becomes important for achieving competitiveness, success and leadership in every field. The Internet of Things (IoT) is causing the number and types of products that emit data to grow at an unprecedented rate. Heterogeneity, scale, timeliness, complexity and privacy problems with large data impede progress at all phases of the pipeline that can create value from data. With the push of such massive data, we are entering a new era of computing driven by novel and groundbreaking research innovation in elastic parallelism, partitioning and scalability. Designing a scalable system for analysing, processing and mining huge real-world datasets has become one of the challenging problems facing both systems researchers and data management researchers. In this paper, we give an overview of computing infrastructure for IoT data processing, focusing on architectural and major challenges of massive data, and briefly discuss emerging computing infrastructure and technologies that are promising for improving massive data management.
Keywords: big data, cloud computing, data analytics, elastic scalability, heterogeneous computing, GPU, PCM, massive data processing
20. A deep autoregressive model based algorithm for detecting abnormal power grid traffic (cited 1 time)
Authors: 李勇, 韩俊飞, 李秀芬, 王鹏, 王蓓. 《沈阳工业大学学报》 (CAS, 北大核心), 2024, Issue 1, pp. 24-28
To deal with the complex, diverse and numerous behaviors in power grids, an anomaly traffic detection algorithm based on an autoregressive model is proposed. The algorithm uses a deep autoencoder network to extract features from network traffic data automatically, shortening the analysis period of anomaly detection and automatically mining hierarchical relationships in the data. A support vector machine then classifies the extracted features to detect abnormal traffic. Simulation results show that the algorithm can analyze different attack vectors and avoid interference from noisy data, improving the precision of grid anomaly traffic detection, which is significant for traffic data processing.
Keywords: autoregressive model, deep learning, anomaly detection, massive data, analysis period, support vector machine