Exoskeleton Motion Intention Recognition Based on Multi-sensor Information Fusion
SHI Lei1,2, YIN Peng1,2, YANG Ming2, QU Shengguan1
1. School of Mechanical and Automotive Engineering, South China University of Technology, Guangzhou 510000, China; 2. Guangzhou Shiyuan Electronic Technology Company Limited, Guangzhou 510300, China
Abstract: Accurate real-time identification and prediction of the wearer's motion intention are necessary to realize compliant motion control of exoskeleton robots. We therefore use a multi-sensor information fusion method to recognize the wearer's motion intention. A comparison of several machine learning algorithms with respect to recognition accuracy, resource consumption, and real-time performance showed that a support vector machine can recognize eight daily motion patterns (sitting, standing, walking, running, ramp ascent, ramp descent, stair ascent, and stair descent) with an average recognition accuracy of 95%. Neuro-fuzzy inference is adopted to predict the motion phase and motion-switching events. On the given test set, the phase recognition accuracy is 99%, and the mean absolute deviation between the predicted and actual state-switching moments is 61.5 ms, which meets the prediction-time requirements of exoskeleton compliance control.
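The classification stage described in the abstract can be illustrated with a brief sketch. This is not the authors' implementation: the feature layout, class labels, and the synthetic data standing in for windowed multi-sensor features (e.g. IMU and plantar-pressure statistics) are all assumptions made for demonstration; only the use of an RBF-kernel support vector machine to separate the eight motion patterns follows the abstract.

```python
# Hedged sketch of SVM-based motion-pattern recognition.
# Feature names and data are synthetic stand-ins, not the paper's dataset.
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

# The eight daily motion patterns listed in the abstract.
MOTIONS = ["sit", "stand", "walk", "run",
           "ramp_up", "ramp_down", "stair_up", "stair_down"]

rng = np.random.default_rng(0)
# Synthetic per-window feature vectors (6 illustrative features per window,
# 40 windows per motion class); real features would be statistics computed
# over fused IMU / pressure-sensor windows.
X = np.vstack([rng.normal(loc=i, scale=0.3, size=(40, 6))
               for i in range(len(MOTIONS))])
y = np.repeat(np.arange(len(MOTIONS)), 40)

# Standardize features, then fit an RBF-kernel SVM classifier.
clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=10.0))
clf.fit(X, y)
train_acc = clf.score(X, y)
print(f"training accuracy on synthetic data: {train_acc:.2f}")
```

In practice the hyperparameters (kernel, C) would be tuned by cross-validation, and accuracy would be reported on held-out windows rather than the training set.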
SHI Lei, YIN Peng, YANG Ming, QU Shengguan. Exoskeleton motion intention recognition based on multi-sensor information fusion[J]. Information and Control, 2023, 52(2): 142-153.