Research article

BreaCNet: A high-accuracy breast thermogram classifier based on mobile convolutional neural network


  • Received: 17 September 2021 Accepted: 18 November 2021 Published: 03 December 2021
  • A well-trained mobile CNN model with a high accuracy rate is imperative for building a mobile-based early breast cancer detector. In this study, we propose a mobile neural network model, the breast cancer mobile network (BreaCNet), and its implementation framework. BreaCNet consists of an effective segmentation algorithm for breast thermograms and a classifier based on a mobile CNN model. The segmentation algorithm, which employs edge detection and second-order polynomial curve fitting, effectively captures the thermograms' region of interest (ROI), thereby facilitating efficient feature extraction. The classifier was developed from ShuffleNet by adding one block consisting of a convolutional layer with 1028 filters. The modified ShuffleNet demonstrated a good fit during learning, with 6.1 million parameters and a size of 22 MB. Simulation results showed that the modified ShuffleNet alone achieved a 72% accuracy rate, but performance improved to a 100% accuracy rate when it was integrated with the proposed segmentation algorithm. In terms of diagnostic accuracy on the normal and abnormal test sets, BreaCNet significantly improves the sensitivity rate from 43% to 100% while maintaining a specificity of 100%. We confirmed that feeding only the ROI of the input dataset to the network can improve the classifier's performance. Regarding the implementation of BreaCNet, on-device inference is recommended to ensure users' data privacy and to handle unreliable network connections.
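The segmentation step described above fits a second-order polynomial to the detected breast boundary and discards everything outside the ROI. A minimal NumPy-only sketch of the curve-fitting and cropping steps is shown below; it assumes boundary edge points have already been extracted (e.g., with a Canny detector), and the function names are illustrative rather than the authors' code.

```python
import numpy as np

def fit_breast_boundary(edge_points):
    """Fit a second-order polynomial y(x) to detected boundary edge points,
    mirroring the paper's curve-fitting step (illustrative sketch).

    edge_points: (N, 2) array of (x, y) pixel coordinates.
    Returns a callable polynomial y = a*x^2 + b*x + c.
    """
    xs, ys = edge_points[:, 0], edge_points[:, 1]
    coeffs = np.polyfit(xs, ys, deg=2)  # least-squares quadratic fit
    return np.poly1d(coeffs)

def crop_roi(image, boundary):
    """Zero out pixels below the fitted boundary curve, keeping only the ROI
    above it (a simplified stand-in for the paper's ROI extraction)."""
    h, w = image.shape[:2]
    roi = image.copy()
    for x in range(w):
        y_cut = int(np.clip(boundary(x), 0, h - 1))
        roi[y_cut:, x] = 0  # suppress everything below the curve
    return roi
```

Only the cropped ROI would then be resized and fed to the classifier, which is the step the abstract credits for the jump from 72% to 100% accuracy.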

    Citation: Roslidar Roslidar, Mohd Syaryadhi, Khairun Saddami, Biswajeet Pradhan, Fitri Arnia, Maimun Syukri, Khairul Munadi. BreaCNet: A high-accuracy breast thermogram classifier based on mobile convolutional neural network[J]. Mathematical Biosciences and Engineering, 2022, 19(2): 1304-1331. doi: 10.3934/mbe.2022060



  • © 2022 the Author(s), licensee AIMS Press. This is an open access article distributed under the terms of the Creative Commons Attribution License (http://creativecommons.org/licenses/by/4.0)

Metrics

Article views: 3561 · PDF downloads: 217 · Cited by: 18

Figures (12) · Tables (5)