Abstract
In chemical processes, mastering the trend of key process parameters is crucial for eliminating potential fluctuations and maintaining stable operating conditions. However, traditional shallow static models struggle to accurately predict complex sequential data with significant nonlinearity and dynamics. To address this problem, a deep prediction model called TA-ConvBiLSTM is proposed, which integrates convolutional neural networks (CNN) and bi-directional long short term memory (BiLSTM) networks into a unified framework. The integrated model can not only automatically explore the hidden correlations among high-dimensional variables at each time step, but also adaptively extract useful deep temporal features across all time steps. In addition, a temporal attention (TA) mechanism is introduced to assign larger weights to the information that reflects the variation pattern of the target, preventing it from being obscured by an overly long input sequence and an excessive number of deep features. The effectiveness of the proposed method is verified in a case study of furnace tube temperature prediction in a domestic delayed coking unit.
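For illustration, the following is a minimal PyTorch sketch of a TA-ConvBiLSTM-style architecture as described in the abstract: a 1D convolution extracts cross-variable features at each time step, a bidirectional LSTM models temporal dependencies across the whole window, and a temporal attention layer weights the hidden states before the final prediction. All layer sizes, the kernel width, and the specific attention formulation are illustrative assumptions, not the paper's exact configuration.

import torch
import torch.nn as nn
import torch.nn.functional as F


class TAConvBiLSTM(nn.Module):
    """Illustrative TA-ConvBiLSTM-style model (layer sizes are assumptions)."""

    def __init__(self, n_vars, conv_channels=32, lstm_hidden=64):
        super().__init__()
        # 1D convolution over the variable dimension of each time step,
        # intended to capture correlations among high-dimensional process variables.
        self.conv = nn.Conv1d(1, conv_channels, kernel_size=3, padding=1)
        conv_feat = conv_channels * n_vars
        # Bidirectional LSTM extracts temporal features across all time steps.
        self.bilstm = nn.LSTM(conv_feat, lstm_hidden,
                              batch_first=True, bidirectional=True)
        # Temporal attention: score each time step's hidden state and form a
        # weighted context vector so that informative steps are emphasized.
        self.attn = nn.Linear(2 * lstm_hidden, 1)
        self.out = nn.Linear(2 * lstm_hidden, 1)

    def forward(self, x):                 # x: (batch, time, n_vars)
        b, t, v = x.shape
        z = x.reshape(b * t, 1, v)        # each time step as a 1-channel "signal"
        z = F.relu(self.conv(z))          # (b*t, conv_channels, n_vars)
        z = z.reshape(b, t, -1)           # flatten conv features per time step
        h, _ = self.bilstm(z)             # (b, t, 2*lstm_hidden)
        scores = self.attn(h)             # (b, t, 1) attention scores
        alpha = torch.softmax(scores, dim=1)
        context = (alpha * h).sum(dim=1)  # attention-weighted sum over time steps
        return self.out(context)          # predicted target (e.g., tube temperature)


# Example: predict from a window of 30 time steps with 16 process variables.
model = TAConvBiLSTM(n_vars=16)
y_hat = model(torch.randn(8, 30, 16))    # output shape: (8, 1)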
Authors
YUAN Zhuang
LING Yiqun
YANG Zhe
LI Chuankun
YUAN Zhuang; LING Yiqun; YANG Zhe; LI Chuankun (State Key Laboratory of Safety and Control for Chemicals, SINOPEC Research Institute of Safety Engineering Co., Ltd., Qingdao 266071, Shandong, China; China Petrochemical Corporation, Beijing 100728, China)
Source
《化工学报》
EI
CAS
CSCD
Peking University Core Journals (北大核心)
2022, No. 1, pp. 342-351 (10 pages)
CIESC Journal
Funding
Young Scientists Fund of the National Natural Science Foundation of China (21706291)
SINOPEC Major Science and Technology Project (321123-1).
Keywords
chemical process
prediction
neural networks
bi-directional long short term memory
convolutional neural networks
temporal attention mechanism