HUANG Zhaoji, GAO Junli, TANG Zhaonian, SONG Haitao, GUO Jing. Slip Detection for Robot Grasping Based on Attention Mechanism and Visuo-Tactile Fusion[J]. INFORMATION AND CONTROL, 2024, 53(2): 191-198. DOI: 10.13976/j.cnki.xk.2023.2598


Slip Detection for Robot Grasping Based on Attention Mechanism and Visuo-Tactile Fusion



    Abstract: Slip detection is of great significance for robot grasping tasks. During grasping, vision and touch provide the key modal information for judging the grasp state, yet achieving their efficient fusion remains challenging. Based on the idea of visuo-tactile information fusion, a novel visuo-tactile fusion model is proposed to solve the slip-detection problem for robot grasping efficiently. First, the model extracts spatial and temporal features from the visual and tactile data using a convolutional neural network and a multiscale temporal convolutional network. Then, an attention mechanism assigns weights to the visuo-tactile features, and multimodal information fusion is performed through a multiscale temporal convolutional network. Finally, the detection result of the grasp state is output through a fully connected layer. Data acquisition is implemented with a 7-DOF XArm robotic arm, a D455 RGB camera, and a XELA tactile sensor. The experimental results show that the proposed model achieves a slip-detection accuracy of 98.98%, indicating good research and application value for the reliable and smooth execution of robot grasping tasks.
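The pipeline described in the abstract (per-modality feature extraction by a CNN and multiscale temporal convolutions, channel-wise attention weighting, fusion through another multiscale temporal convolution, and a fully connected classifier) can be sketched in PyTorch as below. This is a minimal illustrative sketch, not the paper's implementation: all layer sizes, kernel sizes, and the 48-channel tactile input (assumed from a taxel array with 3-axis readings) are hypothetical assumptions.

```python
# Illustrative sketch of an attention-based visuo-tactile fusion model.
# All shapes and hyperparameters are assumptions, not the paper's values.
import torch
import torch.nn as nn


class MultiScaleTCN(nn.Module):
    """Parallel 1-D convolutions with different kernel sizes over the time axis."""

    def __init__(self, in_ch, out_ch, kernel_sizes=(3, 5, 7)):
        super().__init__()
        self.branches = nn.ModuleList(
            nn.Conv1d(in_ch, out_ch, k, padding=k // 2) for k in kernel_sizes
        )

    def forward(self, x):  # x: (B, C, T)
        # Concatenate the multi-scale branch outputs along the channel axis.
        return torch.cat([b(x) for b in self.branches], dim=1)  # (B, 3*out_ch, T)


class VisuoTactileFusion(nn.Module):
    def __init__(self, n_classes=2):
        super().__init__()
        # Spatial features from each image frame, pooled to a 16-d vector.
        self.cnn = nn.Sequential(
            nn.Conv2d(3, 16, 3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),
        )
        self.vis_tcn = MultiScaleTCN(16, 16)   # temporal features of vision
        self.tac_tcn = MultiScaleTCN(48, 16)   # temporal features of touch
        # Channel-wise attention weights for the concatenated features.
        self.attn = nn.Sequential(nn.Linear(96, 96), nn.Sigmoid())
        self.fusion_tcn = MultiScaleTCN(96, 32)
        self.fc = nn.Linear(96, n_classes)

    def forward(self, vis, tac):
        # vis: (B, T, 3, H, W) image sequence; tac: (B, T, 48) taxel sequence
        B, T = vis.shape[:2]
        v = self.cnn(vis.flatten(0, 1)).view(B, T, 16).transpose(1, 2)  # (B,16,T)
        v = self.vis_tcn(v)                       # (B, 48, T)
        t = self.tac_tcn(tac.transpose(1, 2))     # (B, 48, T)
        f = torch.cat([v, t], dim=1)              # (B, 96, T)
        w = self.attn(f.mean(dim=2))              # attention weights per channel
        f = f * w.unsqueeze(-1)                   # reweight visuo-tactile features
        f = self.fusion_tcn(f).mean(dim=2)        # fuse and pool over time: (B, 96)
        return self.fc(f)                         # slip / stable logits
```

For a batch of 2 clips with 8 time steps, `VisuoTactileFusion()(torch.randn(2, 8, 3, 32, 32), torch.randn(2, 8, 48))` yields a `(2, 2)` logit tensor, one slip/stable score pair per clip.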
