Research article (Special Issue)

A method for robotic grasping based on improved Gaussian mixture model

  • Received: 22 August 2019 | Accepted: 19 November 2019 | Published: 02 December 2019
  • This paper presents a method for robotic grasping based on an improved Gaussian mixture model. The improved model is obtained by incorporating Bayesian ideas into the Gaussian mixture model: grasping is first trained with a Gaussian mixture model in a limited region of the workspace, which we call the trained area, and the trained model then serves as the prior for the improved model. The proposed method refines the cumulative updates and the evaluation of the improved models so that the robot can adapt its grasping to untrained areas, giving it a semi-supervised, self-taught grasping ability. First, the observable variables of an object are measured with a camera. The robot is then dragged to grasp the object, and the mapping from the observable variables to the robot's joint angles is learned. New samples collected in a nearby untrained area are used to update the Gaussian mixture model, and with the new observable variables the robot grasps the object successfully (an illustrative sketch of the regression and prior-update steps is given after the citation below). Finally, the effectiveness of the method is verified by experiments and comparative tests on grasping real objects and by grasping simulations of the improved Gaussian models in the virtual robot experimentation platform (V-REP).

    Citation: Yong Tao, Fan Ren, Youdong Chen, Tianmiao Wang, Yu Zou, Chaoyong Chen, Shan Jiang. A method for robotic grasping based on improved Gaussian mixture model[J]. Mathematical Biosciences and Engineering, 2020, 17(2): 1495-1510. doi: 10.3934/mbe.2020077

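The abstract describes two computational steps: regressing joint angles from camera-observed object variables with a Gaussian mixture model trained by demonstration, and refreshing that model with a few samples from a nearby untrained area while treating the trained model as a Bayesian prior. The sketch below is a minimal, generic Python illustration of these two steps; the function names, the pseudo-count kappa, and the mean-only MAP-style update are assumptions made for illustration, not the authors' exact formulation or evaluation criterion.

    import numpy as np

    def gmr_predict(x, weights, means, covs, dx):
        """Gaussian mixture regression: predict joint angles q from pose features x.
        Each component k has weights[k], means[k] over z = [x; q] and covariance covs[k];
        dx is the dimension of the observed part x."""
        K = len(weights)
        h = np.zeros(K)
        cond_means = []
        for k in range(K):
            mu_x, mu_q = means[k][:dx], means[k][dx:]
            S_xx = covs[k][:dx, :dx]
            S_qx = covs[k][dx:, :dx]
            diff = x - mu_x
            # responsibility of component k for the observed x (marginal Gaussian density)
            h[k] = weights[k] * np.exp(-0.5 * diff @ np.linalg.solve(S_xx, diff)) \
                   / np.sqrt(np.linalg.det(2.0 * np.pi * S_xx))
            # conditional mean of q given x under component k
            cond_means.append(mu_q + S_qx @ np.linalg.solve(S_xx, diff))
        h /= h.sum()
        return sum(hk * m for hk, m in zip(h, cond_means))

    def map_update_means(new_z, weights, means, covs, kappa=5.0):
        """MAP-style mean update: the trained means act as priors with pseudo-count kappa
        and are pulled toward the few new samples new_z collected in the untrained area."""
        K, N = len(weights), len(new_z)
        resp = np.zeros((N, K))
        for k in range(K):
            diff = new_z - means[k]
            quad = np.sum(diff @ np.linalg.inv(covs[k]) * diff, axis=1)
            resp[:, k] = weights[k] * np.exp(-0.5 * quad) \
                         / np.sqrt(np.linalg.det(2.0 * np.pi * covs[k]))
        resp /= resp.sum(axis=1, keepdims=True)   # soft assignment of each new sample
        Nk = resp.sum(axis=0)
        updated = []
        for k in range(K):
            zbar = resp[:, k] @ new_z / max(Nk[k], 1e-12)   # weighted mean of new samples
            updated.append((kappa * means[k] + Nk[k] * zbar) / (kappa + Nk[k]))
        return updated

Treating the trained means as priors with strength kappa is one common conjugate-prior choice; the mixture weights and covariances could be updated in the same spirit, and the refreshed model is then used by gmr_predict to generate joint angles for objects in the untrained area.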




  • © 2020 the Author(s), licensee AIMS Press. This is an open access article distributed under the terms of the Creative Commons Attribution License (http://creativecommons.org/licenses/by/4.0)