HUANG Zhaoji, GAO Junli, TANG Zhaonian, SONG Haitao, GUO Jing. Slip Detection for Robot Grasping Based on Attention Mechanism and Visuo-Tactile Fusion[J]. INFORMATION AND CONTROL, 2024, 53(2): 191-198. DOI: 10.13976/j.cnki.xk.2023.2598

Slip Detection for Robot Grasping Based on Attention Mechanism and Visuo-Tactile Fusion

Slip detection is essential for reliable robot grasping. During grasping, vision and touch provide the key modal information for judging the grasp state, yet fusing the two modalities efficiently remains challenging. To address this, a novel visuo-tactile fusion model for slip detection is proposed. First, the model extracts spatial and temporal features from the visual and tactile data using a convolutional neural network and a multi-scale temporal convolutional network. Second, an attention mechanism assigns weights to the visuo-tactile features, and multimodal fusion is performed through another multi-scale temporal convolutional network. Finally, the grasp-state detection result is obtained from a single fully connected layer. Data acquisition is carried out with an XArm 7-DOF robotic arm, a D455 RGB camera, and a XELA tactile sensor. Experimental results show that the proposed model achieves a slip-detection accuracy of 98.98%. The model is of practical value for the reliable and smooth execution of robot grasping tasks.
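The following is a minimal PyTorch sketch of the pipeline outlined in the abstract: a CNN encodes each camera frame, multi-scale temporal convolutions extract temporal features from both modalities, an attention module re-weights the fused visuo-tactile features, and one fully connected layer outputs the slip/stable decision. All layer sizes, kernel scales, sequence lengths, and names (MultiScaleTemporalConv, VisuoTactileSlipDetector, the 48-dimensional tactile input) are illustrative assumptions rather than the paper's actual hyperparameters.

# Minimal visuo-tactile fusion sketch; dimensions are assumed, not from the paper.
import torch
import torch.nn as nn


class MultiScaleTemporalConv(nn.Module):
    """Parallel 1-D convolutions with different kernel sizes over the time axis."""

    def __init__(self, in_dim, out_dim, kernel_sizes=(3, 5, 7)):
        super().__init__()
        self.branches = nn.ModuleList(
            nn.Conv1d(in_dim, out_dim, k, padding=k // 2) for k in kernel_sizes
        )

    def forward(self, x):               # x: (batch, time, in_dim)
        x = x.transpose(1, 2)           # -> (batch, in_dim, time)
        feats = [torch.relu(b(x)) for b in self.branches]
        return torch.cat(feats, dim=1).transpose(1, 2)  # (batch, time, out_dim * n_scales)


class VisuoTactileSlipDetector(nn.Module):
    def __init__(self, tactile_dim=48, feat_dim=64, num_classes=2):
        super().__init__()
        # Spatial CNN encoder applied per visual frame (assumed 3x64x64 input).
        self.visual_cnn = nn.Sequential(
            nn.Conv2d(3, 16, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(16, 32, 3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
            nn.Linear(32, feat_dim),
        )
        # Multi-scale temporal convolutions for each modality.
        self.visual_tcn = MultiScaleTemporalConv(feat_dim, feat_dim)
        self.tactile_tcn = MultiScaleTemporalConv(tactile_dim, feat_dim)
        fused_dim = feat_dim * 3 * 2    # 3 kernel scales, 2 modalities
        # Attention over time steps of the fused sequence (a simplification of
        # the paper's attention mechanism).
        self.attn = nn.Linear(fused_dim, 1)
        # Fusion TCN and final fully connected classifier.
        self.fusion_tcn = MultiScaleTemporalConv(fused_dim, feat_dim)
        self.fc = nn.Linear(feat_dim * 3, num_classes)

    def forward(self, frames, tactile):
        # frames: (batch, time, 3, H, W); tactile: (batch, time, tactile_dim)
        b, t = frames.shape[:2]
        v = self.visual_cnn(frames.flatten(0, 1)).view(b, t, -1)
        v = self.visual_tcn(v)
        u = self.tactile_tcn(tactile)
        fused = torch.cat([v, u], dim=-1)            # (batch, time, fused_dim)
        w = torch.softmax(self.attn(fused), dim=1)   # attention weights over time
        fused = fused * w
        out = self.fusion_tcn(fused).mean(dim=1)     # temporal pooling
        return self.fc(out)                          # slip vs. stable logits


if __name__ == "__main__":
    model = VisuoTactileSlipDetector()
    frames = torch.randn(2, 8, 3, 64, 64)   # 8 camera frames per sample
    tactile = torch.randn(2, 8, 48)          # 48 taxel readings per time step
    print(model(frames, tactile).shape)      # -> torch.Size([2, 2])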
