Generative AI orchestration competencies (the abilities to evaluate, adapt, and integrate AI outputs with human judgment) remain inadequately addressed by general AI literacy frameworks. This paper applies and extends the SAGE (Structured AI-Guided Education) framework to systems analysis and design education through three experimental assessments targeting requirements synthesis, formal modelling correction, and design evaluation. In the Brisbane baseline cohort ($n = 13$ groups), 84% demonstrated selective and justified AI use (Balanced Integrator or Selective Adapter), and no group reached expert-level synthesis; the subsequent cross-campus cohorts ($n = 5$ groups, yielding $N = 18$ template-compliant groups analysed overall out of 21 participating groups) corroborated these patterns. Students consistently recognised accessibility needs when working with user requirements and interface designs but almost entirely neglected them when producing system architecture diagrams, suggesting that this competency depends on continuous prompting rather than transferring automatically between tasks. When AI was used to generate formal data flow diagrams from the students' specifications, the resulting errors were not random: they clustered around system boundary classification, exception handling, and data lifecycle completeness. These findings carry implications for how educators structure AI collaboration in technical curricula, particularly the sequencing of human and AI contributions, the need for stage-specific scaffolding, and the role of explicit justification in developing professional judgment.
Citation: Mahmoud Elkhodr, Ergun Gide. AI leads, humans lead, or collaborate? Empirical findings and the SAGE roadmap for embedding GenAI in systems analysis and design education[J]. STEM Education, 2026, 6(2): 194-229. doi: 10.3934/steme.2026009