
A Following-Target Localization Method for Mobile Robots Incorporating Corner Constraints


    Abstract: To address target loss caused by successive geometric occlusions when autonomous following robots navigate outdoor corners, a following-target localization method incorporating corner constraints is proposed for mobile robots. First, a robust localization algorithm based on an adaptive extended Kalman filter (EKF) fuses LiDAR and draw-wire pan-tilt sensor data, suppressing mechanical jitter under high-dynamic conditions. Second, within occlusion blind zones, the target's azimuth is modeled as a uniform distribution by combining environmental geometry with the sensor-measured relative position, and an optimal position estimator is derived under the minimum mean square error (MMSE) criterion, forming a state-estimation framework for occluded intervals. Simulations show that under continuous multi-corner occlusions and non-stationary interference, the method keeps the target localization RMSE within 0.24 m, shortens the occlusion-recovery time to 0.8 s, and achieves a 100% multi-corner passing success rate. The method effectively suppresses tangential divergence of the target position estimate within blind zones, ensuring continuous, stable following by the mobile robot in complex geometrically occluded environments.
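The blind-zone estimator summarized above can be illustrated with a minimal sketch: if the target's azimuth θ is uniform on a corner-constrained interval [θ_lo, θ_hi] and the range r is treated as known (an illustrative simplification, not necessarily the paper's exact formulation; the function name and interval bounds here are hypothetical), the MMSE position estimate is the posterior mean E[r·(cos θ, sin θ)], which has a closed form.

```python
import numpy as np

def mmse_position(r, theta_lo, theta_hi):
    """Closed-form MMSE (posterior-mean) position estimate for a target
    at known range r whose azimuth is uniform on [theta_lo, theta_hi].
    E[cos θ] = (sin θ_hi - sin θ_lo) / (θ_hi - θ_lo)
    E[sin θ] = (cos θ_lo - cos θ_hi) / (θ_hi - θ_lo)"""
    dt = theta_hi - theta_lo
    ex = r * (np.sin(theta_hi) - np.sin(theta_lo)) / dt
    ey = r * (np.cos(theta_lo) - np.cos(theta_hi)) / dt
    return np.array([ex, ey])

# Monte Carlo cross-check of the closed form (illustrative numbers).
rng = np.random.default_rng(0)
r = 3.0
lo, hi = np.deg2rad(20.0), np.deg2rad(70.0)
theta = rng.uniform(lo, hi, 200_000)
mc = np.array([np.mean(r * np.cos(theta)), np.mean(r * np.sin(theta))])
est = mmse_position(r, lo, hi)
print(est, mc)
```

Note that the posterior mean lies strictly inside the circle of radius r (its norm is shorter than r whenever the interval has positive width), which is what damps the tangential drift of the estimate while the target is hidden.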
