Abstract
To address the long processing time of conventional scene matching methods based on local invariant features, a fast scene registration algorithm based on the FREAK local invariant descriptor is proposed. First, a feature point detection method named FAST-Difference is proposed and used to extract feature points from the reference image and the sensed image. Next, FREAK descriptors are computed for these points to form feature vectors. The Hamming distances between feature vectors are then evaluated with a cascade matching scheme to extract the matched point pairs. Finally, mismatches are removed with RANSAC, and the geometric transformation parameters between the two images are estimated by the least squares method to complete the registration. FAST-Difference is faster than previous feature point detectors; the retina-like sampling structure of the FREAK descriptor improves both the runtime and the robustness of the algorithm; and the saccadic search greatly accelerates the matching stage. Experimental results show that, compared with SIFT and SURF, the proposed algorithm is more robust to various transformations and markedly reduces the processing time, enabling real-time scene matching.
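As a rough illustration of the pipeline described above, the Python sketch below chains together off-the-shelf OpenCV components. It is not the paper's implementation: FAST-Difference and the cascade/saccadic matcher are the authors' own contributions and are not available in OpenCV, so plain FAST and a brute-force Hamming matcher with a ratio test serve as stand-ins, and a RANSAC-filtered homography stands in for the estimated geometric transform. The file names, FAST threshold, and ratio value are illustrative placeholders.

```python
import cv2
import numpy as np

# Load the reference image and the sensed image as grayscale
# (file names are placeholders).
ref = cv2.imread("reference.png", cv2.IMREAD_GRAYSCALE)
sen = cv2.imread("sensed.png", cv2.IMREAD_GRAYSCALE)

# Plain FAST corner detection, standing in for the paper's
# FAST-Difference detector.
detector = cv2.FastFeatureDetector_create(threshold=30)
kp_ref = detector.detect(ref, None)
kp_sen = detector.detect(sen, None)

# FREAK binary descriptors (requires opencv-contrib-python).
freak = cv2.xfeatures2d.FREAK_create()
kp_ref, des_ref = freak.compute(ref, kp_ref)
kp_sen, des_sen = freak.compute(sen, kp_sen)

# Brute-force Hamming matching with a ratio test, standing in for
# the cascade matching / saccadic search used in the paper.
matcher = cv2.BFMatcher(cv2.NORM_HAMMING)
raw = matcher.knnMatch(des_sen, des_ref, k=2)
good = [p[0] for p in raw if len(p) == 2 and p[0].distance < 0.8 * p[1].distance]

# Reject mismatches with RANSAC and estimate the geometric transform
# (at least 4 surviving matches are needed for a homography).
src = np.float32([kp_sen[m.queryIdx].pt for m in good]).reshape(-1, 1, 2)
dst = np.float32([kp_ref[m.trainIdx].pt for m in good]).reshape(-1, 1, 2)
H, inlier_mask = cv2.findHomography(src, dst, cv2.RANSAC, 5.0)

# Warp the sensed image into the reference frame to complete registration.
registered = cv2.warpPerspective(sen, H, (ref.shape[1], ref.shape[0]))
```

Note that cv2.findHomography with the cv2.RANSAC flag both discards outlier pairs and refines the transform on the surviving inliers by minimizing reprojection error, which loosely mirrors the RANSAC-plus-least-squares estimation step in the abstract.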
Source
Journal of Electronic Measurement and Instrumentation (《电子测量与仪器学报》)
Indexed in: CSCD; Peking University Core Journals
2015, No. 2, pp. 204-212 (9 pages)
Funding
Changchun Science and Technology Plan Project (2013270)
Jilin Province Science and Technology Development Plan Project (20126015)