Review

Deep learning for Flight Maneuver Recognition: A survey

  • Received: 14 July 2022 Revised: 18 September 2022 Accepted: 20 September 2022 Published: 24 October 2022
  • Deep learning for Flight Maneuver Recognition (FMR) covers flight maneuver detection and recognition tasks across several areas, including pilot training, aviation safety, and autonomous air combat. Although it is a key technology for these applications, FMR research remains underdeveloped, limited by domain knowledge and data sources. This paper presents a comprehensive survey of FMR studies since the 1980s, defining the research area precisely and describing its significance for the first time. By analogy with the flourishing field of Human Action Recognition (HAR), we divide deep learning for FMR into vision-based and sensor-based studies, comb through the literature, and draw on existing HAR reviews to demonstrate the similarities and differences between FMR and HAR in terms of problem essentials, research methods, and publicly available datasets. The paper also presents a dataset from the Civil Aviation Flight University of China, generated from real fixed-wing flight training there. Using this dataset, we reproduce and evaluate several important FMR methods and visualize the results. Based on the evaluation, the paper discusses the advantages, disadvantages, and overall shortcomings of these methods, as well as the challenges and future directions for deep learning for Flight Maneuver Recognition.

    Citation: Jing Lu, Longfei Pan, Jingli Deng, Hongjun Chai, Zhou Ren, Yu Shi. Deep learning for Flight Maneuver Recognition: A survey[J]. Electronic Research Archive, 2023, 31(1): 75-102. doi: 10.3934/era.2023005
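As an illustration of the sensor-based framing the survey describes, flight maneuver recognition can be posed as classification of windows of flight-parameter time series. The sketch below is a minimal, self-contained example of that framing; the maneuver classes, the two flight parameters, and the nearest-centroid classifier are illustrative assumptions, not the methods evaluated in the paper.

```python
import numpy as np

def make_window(maneuver, n=100, rng=None):
    """Synthesize a window of (climb_rate, roll_angle) samples for one
    hypothetical maneuver class. Real FMR systems would read such windows
    from recorded flight data instead."""
    if rng is None:
        rng = np.random.default_rng(0)
    if maneuver == "climb":
        climb = 5.0 + 0.1 * rng.standard_normal(n)   # steady positive climb rate
        roll = 0.0 + 0.5 * rng.standard_normal(n)    # wings level
    elif maneuver == "steep_turn":
        climb = 0.0 + 0.1 * rng.standard_normal(n)
        roll = 45.0 + 2.0 * rng.standard_normal(n)   # sustained bank angle
    else:  # level flight
        climb = 0.0 + 0.1 * rng.standard_normal(n)
        roll = 0.0 + 0.5 * rng.standard_normal(n)
    return np.stack([climb, roll], axis=1)          # shape (n, 2)

def features(window):
    """Mean and standard deviation of each flight parameter over the window."""
    return np.concatenate([window.mean(axis=0), window.std(axis=0)])

def fit_centroids(labeled_windows):
    """Average the feature vectors per maneuver label."""
    return {label: np.mean([features(w) for w in wins], axis=0)
            for label, wins in labeled_windows.items()}

def classify(window, centroids):
    """Assign the label of the nearest centroid in feature space."""
    f = features(window)
    return min(centroids, key=lambda lbl: np.linalg.norm(f - centroids[lbl]))
```

Deep-learning approaches surveyed in the paper replace the hand-crafted features and centroid matching with learned representations, but the input/output contract (parameter window in, maneuver label out) stays the same.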



  • © 2023 the Author(s), licensee AIMS Press. This is an open access article distributed under the terms of the Creative Commons Attribution License (http://creativecommons.org/licenses/by/4.0)