Research article

A prostate seed implantation robot system based on human-computer interactions: Augmented reality and voice control


  • Received: 18 December 2023 Revised: 13 January 2024 Accepted: 24 January 2024 Published: 15 May 2024
  • The technology of robot-assisted prostate seed implantation has developed rapidly. However, problems remain during the procedure, such as non-intuitive visualization and complicated robot control. To improve the intelligence and visualization of the operation process, a voice control technology for a prostate seed implantation robot in an augmented reality environment was proposed. First, the MRI image of the prostate was denoised and segmented, and a three-dimensional model of the prostate and its surrounding tissues was reconstructed by surface rendering. Combined with a holographic application, the augmented reality system for prostate seed implantation was built. An improved singular value decomposition (SVD) three-dimensional registration algorithm based on the iterative closest point (ICP) method was proposed, and registration experiments verified that the algorithm effectively improved three-dimensional registration accuracy. A fusion algorithm based on spectral subtraction and a BP neural network was also proposed. Experimental results showed that the average delay of the fusion algorithm was 1.314 s and the overall response time of the integrated system was 1.5 s. The fusion algorithm effectively improved the reliability of the voice control system, and the integrated system met the responsiveness requirements of prostate seed implantation.
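The paper's improved SVD-based registration algorithm is not reproduced here, but the classical baseline it builds on, alternating nearest-neighbour correspondence with an SVD (Kabsch) pose update, can be sketched as follows. Function names, the brute-force matcher, and the iteration count are illustrative assumptions, not details from the paper:

```python
import numpy as np

def svd_rigid_transform(P, Q):
    """Best-fit rotation R and translation t mapping point set P onto Q (Kabsch/SVD)."""
    cp, cq = P.mean(axis=0), Q.mean(axis=0)
    H = (P - cp).T @ (Q - cq)                  # cross-covariance of centred sets
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))     # guard against reflections
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = cq - R @ cp
    return R, t

def icp(src, dst, iters=30):
    """Naive ICP: brute-force nearest-neighbour matching + SVD pose update."""
    cur = src.copy()
    for _ in range(iters):
        # for each current point, index of its closest point in dst
        idx = np.argmin(((cur[:, None] - dst[None]) ** 2).sum(-1), axis=1)
        R, t = svd_rigid_transform(cur, dst[idx])
        cur = cur @ R.T + t
    return cur
```

A production registration pipeline would replace the O(N·M) matcher with a k-d tree and add outlier rejection; the sketch only shows the SVD update at the core of the method.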

    Citation: Xinran Zhang, Yongde Zhang, Jianzhi Yang, Haiyan Du. A prostate seed implantation robot system based on human-computer interactions: Augmented reality and voice control[J]. Mathematical Biosciences and Engineering, 2024, 21(5): 5947-5971. doi: 10.3934/mbe.2024262
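The fusion algorithm described in the abstract combines spectral subtraction with a BP neural network; only the spectral-subtraction front end is sketched below, in a minimal textbook form. The frame size, hop, and spectral-floor factor are illustrative assumptions, not values from the paper:

```python
import numpy as np

def spectral_subtract(noisy, noise_est, frame=256, hop=128, beta=0.01):
    """Frame-wise magnitude spectral subtraction with a spectral floor."""
    win = np.hanning(frame)
    # average noise magnitude spectrum from a noise-only recording
    noise_mag = np.mean(
        [np.abs(np.fft.rfft(noise_est[i:i + frame] * win))
         for i in range(0, len(noise_est) - frame + 1, hop)], axis=0)
    out = np.zeros(len(noisy))
    norm = np.zeros(len(noisy))
    for start in range(0, len(noisy) - frame + 1, hop):
        seg = noisy[start:start + frame] * win
        spec = np.fft.rfft(seg)
        mag = np.abs(spec) - noise_mag                 # subtract noise magnitude
        mag = np.maximum(mag, beta * np.abs(spec))     # floor to limit musical noise
        clean = np.fft.irfft(mag * np.exp(1j * np.angle(spec)), frame)
        out[start:start + frame] += clean * win        # weighted overlap-add
        norm[start:start + frame] += win ** 2
    return out / np.maximum(norm, 1e-8)
```

In a voice-control pipeline of the kind the paper describes, the enhanced signal would then be passed to the recogniser (here, the paper's BP neural network) rather than used directly.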



  • © 2024 the Author(s), licensee AIMS Press. This is an open access article distributed under the terms of the Creative Commons Attribution License (http://creativecommons.org/licenses/by/4.0)
