
Dynamic visual SLAM algorithm based on improved YOLOv5s
Abstract: To address the degradation of robustness and camera localization accuracy that dynamic targets cause in indoor simultaneous localization and mapping (SLAM) systems, a dynamic visual SLAM algorithm based on an object detection network is proposed. YOLOv5s, the model with the smallest depth and feature-map width in the YOLOv5 family, is chosen as the object detection network, and its backbone is replaced with the lightweight PP-LCNet. After training on the VOC2007+VOC2012 dataset, experimental results show that the PP-LCNet-YOLOv5s model reduces the number of network parameters by 41.89% and increases the running speed by 39.13% compared with the original YOLOv5s. In the tracking thread of the visual SLAM system, a parallel thread that combines the improved object detection network with a sparse optical flow method is introduced to cull dynamic feature points, so that only static feature points are used for feature matching and camera pose estimation. Experimental results show that, in dynamic scenes, the proposed algorithm improves camera localization accuracy by 92.38% compared with ORB-SLAM3.
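The culling step described in the abstract combines the detector's bounding boxes (marking potentially dynamic objects) with a sparse optical-flow consistency check. Below is a minimal Python/OpenCV sketch of that idea; the function name, the use of a RANSAC fundamental-matrix test as the flow-consistency criterion, and the thresholds are illustrative assumptions, not the authors' published implementation.

import cv2
import numpy as np

def cull_dynamic_points(prev_gray, curr_gray, points, dynamic_boxes, epi_thresh=1.0):
    """Return the subset of `points` treated as static (illustrative sketch).

    prev_gray, curr_gray : consecutive grayscale frames (np.uint8)
    points               : Nx2 float32 array of feature points in prev_gray
    dynamic_boxes        : list of (x1, y1, x2, y2) detector boxes for dynamic objects
    epi_thresh           : RANSAC epipolar residual threshold in pixels (assumed value)
    """
    pts = points.reshape(-1, 1, 2).astype(np.float32)
    # Track points into the current frame with pyramidal Lucas-Kanade sparse optical flow.
    next_pts, status, _ = cv2.calcOpticalFlowPyrLK(prev_gray, curr_gray, pts, None)
    ok = status.ravel() == 1
    p0 = pts.reshape(-1, 2)[ok]
    p1 = next_pts.reshape(-1, 2)[ok]
    if len(p0) < 8:
        return p0  # too few tracked points for a fundamental-matrix check; skip filtering

    # Geometric consistency: points violating the epipolar constraint of the
    # dominant (camera-induced) motion are treated as dynamic outliers.
    _, mask = cv2.findFundamentalMat(p0, p1, cv2.FM_RANSAC, epi_thresh, 0.99)
    if mask is None:
        return np.empty((0, 2), dtype=np.float32)
    geom_ok = mask.ravel() == 1

    keep = []
    for (x, y), g in zip(p0, geom_ok):
        # Reject points inside any detected dynamic-object box or flagged as flow outliers.
        in_box = any(x1 <= x <= x2 and y1 <= y <= y2 for (x1, y1, x2, y2) in dynamic_boxes)
        if g and not in_box:
            keep.append((x, y))
    return np.asarray(keep, dtype=np.float32)

In a full system, a routine of this kind would run in the parallel thread described above, passing only the surviving static points on to feature matching and camera pose estimation.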
Authors: 蒋畅江 JIANG Changjiang; 刘朋 LIU Peng; 舒鹏 SHU Peng (School of Automation, Chongqing University of Posts and Telecommunications, Chongqing 400065, China)
Source: Journal of Beijing University of Aeronautics and Astronautics (《北京航空航天大学学报》, PKU Core), 2025, No. 3, pp. 763-771 (9 pages)
Funding: National Natural Science Foundation of China (62277008).
Keywords: simultaneous localization and mapping (SLAM); target detection; dynamic feature point elimination; positioning accuracy; optical flow approach