A Robust Non-Contact Bridge Displacement Tracking Method via KCF-ORB with Scale-Space and Occlusion Handling

Xiaolin Ma, Hongju Hu

Abstract


With the growing demand for health monitoring in bridge engineering, non-contact displacement measurement techniques have received extensive attention. This study proposes a bridge displacement monitoring method that combines spatially constrained ORB feature extraction with a kernelized correlation filter (KCF) and the tracking-learning-detection (TLD) algorithm for long-short-term tracking. The method comprises two stages. First, traditional ORB feature matching is improved by introducing spatial location constraints to enhance the accuracy of feature point detection and description. Second, the improved features are combined with the KCF tracking framework, and a Gaussian pyramid (GP) is introduced to adapt to scale changes; the TLD algorithm is further integrated to handle occlusion and realize robust displacement tracking. The performance of the proposed method is experimentally validated on the OTB and LaSOT datasets. With the optimal parameters (descriptor length of 256 and four Gaussian scales), the feature extraction and tracking accuracy of the new model was close to 0.95, and the detection time was as short as 20 ms. The tracking loss rate of the proposed model under 50% occlusion was reduced to 15%. Compared with state-of-the-art models such as YOLOv5, Mask R-CNN, and Faster R-CNN, the proposed method performed better in precision (92.56%), recall (90.11%), F1 score (91.10%), and average displacement error (0.01 mm). These results show that the proposed method offers higher precision, stronger robustness, and better detection efficiency in complex bridge environments, providing an effective and reliable technical path for displacement monitoring in bridge engineering.
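To make the two-stage pipeline concrete, the following is a minimal sketch using OpenCV's stock ORB and KCF implementations as stand-ins for the improved versions described in the abstract. The function names, the spatial-constraint radius (max_shift), and the displacement bookkeeping are illustrative assumptions, not the authors' actual parameters; GP scale handling and TLD-based re-detection under occlusion are only indicated by comments.

import cv2
import numpy as np

def spatially_constrained_orb_matches(img_prev, img_curr, max_shift=30.0):
    """Match ORB keypoints between consecutive frames, keeping only pairs whose
    pixel displacement is below max_shift (a simple spatial location constraint)."""
    orb = cv2.ORB_create(nfeatures=500)
    kp1, des1 = orb.detectAndCompute(img_prev, None)
    kp2, des2 = orb.detectAndCompute(img_curr, None)
    if des1 is None or des2 is None:
        return []
    matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
    kept = []
    for m in matcher.match(des1, des2):
        p1 = np.array(kp1[m.queryIdx].pt)
        p2 = np.array(kp2[m.trainIdx].pt)
        if np.linalg.norm(p2 - p1) <= max_shift:  # discard spatially implausible matches
            kept.append((p1, p2))
    return kept

def track_displacement(video_path, roi):
    """Track a target ROI with KCF and report the per-frame pixel displacement of
    the ROI centre relative to the first frame (mm conversion would need a scale factor)."""
    cap = cv2.VideoCapture(video_path)
    ok, frame = cap.read()
    if not ok:
        raise IOError("cannot read video")
    tracker = cv2.TrackerKCF_create()  # cv2.legacy.TrackerKCF_create() on some OpenCV builds
    tracker.init(frame, roi)
    x0, y0 = roi[0] + roi[2] / 2.0, roi[1] + roi[3] / 2.0
    displacements = []
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        ok, box = tracker.update(frame)
        if not ok:
            # Tracking lost (e.g. occlusion): the paper's TLD detector would re-acquire here.
            displacements.append(None)
            continue
        cx, cy = box[0] + box[2] / 2.0, box[1] + box[3] / 2.0
        displacements.append((cx - x0, cy - y0))
    cap.release()
    return displacements

In this sketch the spatial constraint is a plain distance gate on matched keypoints; the paper's improved descriptor and the GP-based scale adaptation would replace the single-scale ORB detector and the fixed-size KCF window, respectively.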




DOI: https://doi.org/10.31449/inf.v49i27.8714

This work is licensed under a Creative Commons Attribution 3.0 License.