Research article

A ranking comparison of the traditional, online and mixed laboratory mode learning objectives in engineering: Uncovering different priorities


  • Received: 13 September 2023 Revised: 29 November 2023 Accepted: 05 December 2023 Published: 28 December 2023
  • The laboratory, an integral component of engineering education, can be conducted via traditional, online or mixed modes. Within these modes is a diverse range of implementation formats, each with different strengths and weaknesses. Empirical evidence investigating laboratory learning is rather scattered, with objectives measurement focused on the innovation in question (e.g., new simulation or experiment). Recently, a clearer picture of the most important laboratory learning objectives has formed. Missing is an understanding of whether academics implementing laboratories across different modes think about learning objectives differently. Using a survey based on the Laboratory Learning Objectives Measurement instrument, academics from a diverse range of engineering disciplines from across the world undertook a ranking exercise. The findings show that those implementing traditional and mixed laboratories align closely in their ranking choices, while those implementing online-only laboratories think about the objectives slightly differently. These findings provide an opportunity for reflection, enabling engineering educators to refine the alignment of their teaching modes, implementations and assessments with their intended learning objectives.

    Citation: Sasha Nikolic, Sarah Grundy, Rezwanul Haque, Sulakshana Lal, Ghulam M. Hassan, Scott Daniel, Marina Belkina, Sarah Lyden, Thomas F. Suesse. A ranking comparison of the traditional, online and mixed laboratory mode learning objectives in engineering: Uncovering different priorities[J]. STEM Education, 2023, 3(4): 331-349. doi: 10.3934/steme.2023020




    Practical work has learning benefits that extend across all levels of education [1]. Practical laboratory experiences, referred to as the traditional laboratory mode, have long supported engineering education, where students bridge the gap between theoretical knowledge and practical application [2]. Traditionally, these labs have been conducted in physical spaces involving hands-on learning experiences, but over many years, online laboratory modes have emerged as a viable alternative. Such viability was witnessed during the COVID-19 lockdowns when many traditional laboratory implementations were transitioned to online experiences [3,4]. The term online laboratory can be associated with various implementations, including remote, virtually represented, fully simulated and otherwise emulated laboratories [5]. This collection of approaches defines the online mode used within this study. A mixed laboratory refers to implementations that combine traditional and online modes. The definitions used here under the umbrella of modes are very broad, and a more granular introduction and description of a large spectrum of laboratory formats is outlined in the work of May et al. [6]. Different advantages are associated with the different modes; some key points are listed below [7,8,9,10]:

    - Traditional Labs:

    - Hands-On Learning: Allowing students to touch, feel and interact with physical equipment, fostering a deep understanding of engineering principles.

    - Immediate Feedback: Students receive instant feedback through real-time observations and measurements.

    - Collaboration: Traditional labs often encourage collaboration among students. Working in teams fosters communication and teamwork.

    - Safety: Engineers can work in dangerous environments. Traditional labs provide a controlled environment where safety protocols can be observed and practiced.

    - Instructor Guidance: Instructors can offer hands-on guidance, answer questions in real-time and provide valuable insights based on their expertise.

    - Troubleshooting: Implementing an experiment involves interacting with various tools and equipment, which can lead to errors, faulty equipment and problems. This provides students with a multi-sensory experience that replicates real-world challenges.

    - Online Labs:

    - Accessibility: Depending on the online format, online labs can be accessed from anywhere with an internet connection, allowing students to engage with experiments at their convenience. This was a significant benefit during COVID-19 lockdowns.

    - Cost-Efficiency: Setting up and maintaining physical labs can be expensive. Online labs often require fewer resources, making engineering education more cost-effective. Remote labs, for example, can give students in remote or underprivileged areas access to hardware that would otherwise not be possible.

    - Repeatability: Online labs can be repeated multiple times quickly. For example, students can change components and values and rerun a simulation at the click of a button. There is no need to find physical items or worry about wear and tear.

    - Data Analysis: Online labs often provide data analysis tools that allow students to process and visualize results more efficiently.

    - Self-Paced Learning: Online labs can accommodate various learning styles and paces, empowering students to tailor their educational experience to their needs.

    - Safety: Students can work in a safe environment without the risk of harm or equipment damage.

    - Scalability: Online labs can easily scale to accommodate larger numbers of students, making them an excellent choice for institutions with high enrolment.

    - Troubleshooting: Online labs allow students to focus on the core learning objectives, eliminating the need for complex fault finding and dealing with potential faulty equipment and logistical problems.

    By combining online and traditional components, mixed laboratories aim to build upon the strengths of each mode. For example, verification of theoretical concepts may be simulated first and followed up with traditional hands-on activities. Using a mixed approach, students have access to the various benefits associated with both modes [11,12].

    The list of advantages highlights that different laboratory modes have different strengths and weaknesses. Not surprisingly, this relationship also extends to differences in learning outcomes across modes [13]. Substantial evidence shows that online laboratories can provide equal or better cognitive learning outcomes than traditional approaches [14,15,16]. However, beyond the cognitive domain, little empirical evidence exists on learning outcomes across the psychomotor and affective domains, resulting in a knowledge gap [17]. Nikolic et al. [17] pinpointed two reasons for this. First, ease of data collection is a factor, particularly when it comes to cognitive learning. Second, the predominant focus of empirical data on learning revolves around gaining insights into an innovation, such as a new simulation, experimental setup or technology, rather than achieving a holistic understanding of laboratory-based learning. Therefore, due to the diversity of applications and intended outcomes in different laboratory implementations, direct comparisons are not necessarily helpful in developing a holistic understanding of learning [5]. There are several steps we can undertake to overcome this limitation. The first step in developing this holistic understanding is gaining insights into what learning is occurring beyond that defined in course learning objectives. The first author has made progress on this in terms of perceived learning [18] and is currently building evidence on real learning, that is, whether there is a relationship between what students think they learned and what they actually learned. The second step is to understand which learning objectives are important and to whom, and major inroads have been made on this front [19,20,21]. The third step is to confirm whether we are effectively assessing said objectives [17], because misalignment is probable [22]. For instance, Nightingale, Carew and Fung [22] found that a significant mismatch can occur between the stated learning objectives of subjects and how students are assessed. This is the next research phase for the team. By doing this, staff can associate the best mode, implementation and assessment with intended learning objectives.

    In this work, we focus on the second step, understanding the importance of learning objectives, as mentioned above. Collectively, the work of Nikolic et al. [19] found that the most important cognitive items were understanding, design/modeling and analysis. For the psychomotor-based objectives, the items ranked highest were successful experimentation, planning & execution and instrument use. For the affective-based objectives, the items ranked highest were teamwork, communication and independence. However, ranking order may be influenced by location or discipline, with discipline being the most influential factor behind the slight variances [19,20,21]. These differences are important to understand as they give meaning to the laboratory design decisions of different groups of engineering educators.

    Within the literature, the influence of laboratory mode on design decisions is unknown. That is, do academics designing laboratory implementations think about objectives differently when working in a specific mode? This study contributes to the field by exploring the targeted learning objectives of academics implementing traditional, mixed or online-only laboratory implementations. In doing so, the study answers the following research question: "How do academics implementing different laboratory modes of teaching think about laboratory learning objectives and rank them in terms of importance?". This study's findings will help classify which mode is best suited to which learning objective.

    There are 13 core laboratory learning objectives: Instrumentation, models, experiment, data analysis, safety, design, psychomotor, sensory awareness, learning from failure, creativity, communication, teamwork and ethics [2]. Learning also occurs across three domains: Cognitive, psychomotor and affective [23]. An instrument called Laboratory Learning Objectives Measurement (LLOM) [19] combines the laboratory learning objectives with Bloom's Taxonomy, providing an easy-to-use, context-modifiable, holistic template to explore learning in engineering laboratories. Using this instrument, it has been discovered that students perceive that learning occurs across all three domains during experimentation [18]. However, the scientific community has concentrated its efforts on measuring cognitive learning [5,17]. For example, Steger and Nitsche [24] compared learning of simulation and traditional implementations by exploring student achievement based on post-laboratory test results. Likewise, Singh and Mantri [15] investigated the differences through pre and post-laboratory tests. All tests focused on the cognitive domain.

    While perceived learning can highlight many benefits to the student experience [25], more effort is needed to understand real learning. To this end, over the last ten years, encouraged by higher-quality journals, research efforts have accelerated to analyze empirical data on student learning through learning instruments or student performance, rather than simply focusing on perceptions of learning through survey instruments [17]. Complicating the analysis and evidence collection is whether learning is considered on an immediate or long-term basis [26]. Furthermore, influences such as the impact of teaching staff can affect the results of studies if not carefully accounted for [24,27]. We can improve knowledge in this area by improving our processes and strategies for measuring learning [28].

    The community has built substantial evidence that cognitive learning occurs across any laboratory mode. For example, Uzunidis and Pagiatakis [29] showed that the average student grade for laboratory reports was similar across virtual and physical implementations. Similarities in cognitive learning were also found by Memik and Nikolic [30]. Growing evidence shows that combining modes increases learning benefits [14]. For example, Coleman and Hosein [31] found that the maximum marks for laboratory reports increased when a simulation was added and used in a traditional laboratory. A similar uplift in marks was seen by Gamo [12] and Kollöffel and de Jong [32]. However, a more significant problem across all studies is that there is no unity in the laboratory objectives being addressed; they are primarily targeted at a specific innovation [17]. Therefore, as innovation drives the researched learning outcomes, it is important to take a step back and determine whether the important learning objectives, whatever they may be, are being lost in this innovation-driven approach. Hence, it is necessary to determine which laboratory objectives are most important.

    Not much attention beyond perceptions of learning is given to psychomotor or affective learning, primarily because the data are harder to collect [17,33]. For example, a pre- and post-test is easy to implement [34]. Attempts to measure psychomotor or affective learning have primarily come from instructor observations and interviews [35,36], which are possibly more time-consuming and subjective. The problem with this deficiency of empirical data is that we do not have a holistic understanding of learning across all modes. If there are areas of weakness, the community can work towards innovative solutions. For example, teamwork is strongly associated with the traditional mode, but examples of collaboration in online modes have emerged [37]. Additionally, COVID-19 transitions to remote work environments have shown how the world can adjust to online forms of collaboration. Just as with the cognitive domain, some commonality is needed to better understand the laboratory objectives across the psychomotor and affective domains, hence the need for this study. We need a better understanding of which laboratory objectives are essential. Then, it will be possible to collectively test the impact of learning across modes using the best assessment methods.

    The Laboratory Learning Objectives Measurement (LLOM) instrument provides a holistic list of learning objectives that combines the laboratory objectives outlined in [2] with Bloom's Taxonomy [23], a hierarchical model used for the classification of educational learning objectives. It uses a template format in which keywords can be substituted for any engineering discipline or context. This allows it to be used in traditional labs and new, innovative labs (e.g., 3D printing). The template is outlined in Table 1. As an example of its use, item C1 could be written as "Understand the operation of soil testing equipment" for a civil laboratory, while it could be written as "Understand the operation of a multimeter" for an electronics laboratory. A comprehensive explanation of the instrument is available in [19].

    Table 1.  Laboratory learning objective items [19].
    Domain Item LLOM Objective*
    Cognitive C1 Understand the operation of equipment/software used within the laboratory
    Cognitive C2 Design experiments/models (physical or simulation) to verify course concepts
    Cognitive C3 Use engineering tools (e.g. [name of hardware/software used]) to solve problems
    Cognitive C4 Read and understand datasheets/circuit-diagrams/procedures/user-manuals/help-menus
    Cognitive C5 Draw & interpret relevant charts, graphs, tables & signals
    Cognitive C6 Recognise safety issues associated with laboratory experimentation
    Cognitive C7 Analyze the results from an experiment
    Cognitive C8 Write a conclusion summarising your findings from an experiment
    Cognitive C9 Write a laboratory report/entry into a logbook in a professional manner
         
    Psychomotor P1 Correctly conduct an experiment on [course equipment/software name, e.g. power systems]
    Psychomotor P2H Select and use appropriate instruments for the input, output and measurement of your circuit/system
    Psychomotor P2S Select appropriate commands and navigate interface to simulate/program a model
    Psychomotor P3 Plan and execute experimental work related to this course
    Psychomotor P4 Construct/code a working circuit/simulation/program
    Psychomotor P5 Interpret sounds, temperature, smells and visual cues and use tools to diagnose faults/errors
    Psychomotor P6H Operate instruments (e.g. [equipment name]) required for experimentation
    Psychomotor P6S Operate software packages (e.g. [software name]) required for coding/simulation
    Psychomotor P7 Take the reading of the output from circuits/instruments/sensors
         
    Affective A1 Work in a team to conduct experiments, diagnose problems and analyze results
    Affective A2 Communicate laboratory setup, fault diagnosis, readings and findings with others
    Affective A3 Work independently to conduct experiments, diagnose problems and analyze results
    Affective A4 Consider ethical issues in laboratory experimentation and communication of discoveries
    Affective A5 Creatively use software/hardware to design or modify an experiment to solve a problem
    Affective A6 Learn from failure (when experiment/simulation/code fails or results are unexpected)
    Affective A7 Motivate yourself to complete experiments and learn from the laboratory activities
    * Terms in italics are placeholders. Each term is to be substituted for one relevant to the experiment.
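
    To illustrate the keyword substitution described above, the following minimal Python sketch instantiates two template items for two hypothetical contexts; the placeholder names, the dictionary structure and the example terms are illustrative assumptions and are not part of the published LLOM instrument.

```python
# Minimal sketch of instantiating LLOM template items for a specific context.
# The placeholder names ("equipment", "tool") and the example contexts below are
# illustrative assumptions, not part of the published instrument.

LLOM_TEMPLATE = {
    "C1": "Understand the operation of {equipment} used within the laboratory",
    "C3": "Use engineering tools (e.g. {tool}) to solve problems",
}

CONTEXTS = {
    "civil": {"equipment": "soil testing equipment", "tool": "a triaxial test rig"},
    "electronics": {"equipment": "a multimeter", "tool": "an oscilloscope"},
}

def instantiate(discipline: str) -> dict:
    """Return the template items with placeholders replaced for one discipline."""
    terms = CONTEXTS[discipline]
    return {item: text.format(**terms) for item, text in LLOM_TEMPLATE.items()}

if __name__ == "__main__":
    for item, objective in instantiate("civil").items():
        print(item, "-", objective)
```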


    Through the use of this instrument, it is possible to develop a common understanding of what is perceived as the most important laboratory objectives. This allows for a reflection on the direction and thinking of academic communities across disciplines, locations and modes. Questions can then be asked, such as, are the perceived rankings optimum? Are some objectives more important in some disciplines than others? Moreover, in the case of this study, do academics with a traditional focus think about objectives differently from those that design online laboratories? Answering such questions allows for some positive reflection and possible realignment of actions.

    Through investigations to date, evidence suggests that students perceive that learning occurs across all three domains in a laboratory [18]. This was achieved by students rating their ability against the instrument items before the start of the first laboratory session and after the last laboratory session, with the differences equating to their perceived learning. In terms of determining the most and least important objectives, the LLOM items have been used in ranking exercises.
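
    A minimal sketch of that pre/post differencing is shown below, assuming self-ratings are stored as student-by-item arrays; the rating scale and the values are hypothetical and purely illustrative.

```python
import numpy as np

# Hypothetical self-ratings (1-5) for the same students on the same LLOM items,
# collected before the first and after the last laboratory session.
pre = np.array([[2, 3, 2], [3, 2, 3], [1, 2, 2]])   # rows: students, cols: items
post = np.array([[4, 4, 3], [4, 3, 4], [3, 3, 4]])

# Perceived learning is taken as the per-item mean of the post-minus-pre differences.
perceived_gain = (post - pre).mean(axis=0)
print(perceived_gain)  # e.g. approximately [1.67, 1.00, 1.33] across the three items
```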

    In terms of ranking, there is much commonality in order across continents [20]. As discovered by Nikolic et al. [19], even though a general common order is present, the most accurate rankings are determined by discipline. Through those findings, it has been possible to develop insights into why academics ranked particular objectives higher or lower. This allows for reflection and an opportunity to consider how the objectives correlate with the given assessment tasks. Interestingly, for the cognitive and psychomotor domains, the ranking order correlated somewhat with the hierarchical structure of Bloom's Taxonomy, but this was not the case for the affective domain [19]. Repeating this analysis with laboratory modes can open new insights.

    In 2021, over 3,000 academics worldwide were invited to participate in a survey that required ranking learning objectives using the LLOM instrument. The instrument has been used as a foundation for multiple papers [17,18,19,20,21] and has undergone a range of testing, including Cronbach's alpha and factor analysis (Kaiser rule, parallel analysis, optimal coordinates and acceleration factor), as outlined in [18]. Recruitment came from advertisements via direct email and through the social and professional networks of the research team, including professional networks on platforms such as Facebook and LinkedIn. From the invitations, there were 219 survey commencements and 160 completions. Given the high workload on the academic community and the cognitive load required to complete the rankings, the number of completions met expectations.

    Response distribution was 113 from Australasia, 25 from Europe, 12 from Asia, 9 from North America and 1 from South America. While Australasian responses dominate, an earlier study [20] found that, across the board, statistical differences in rankings were minimal across the cognitive and psychomotor domains but evident across the affective domain. Discipline response distribution was 2 Aeronautical, 7 Biomedical, 17 Chemical, 14 Civil, 17 Computer, 22 Electrical, 19 Electronics, 2 Industrial/Process, 10 Materials, 21 Mechanical, 8 Mechatronics, 1 Mining, 4 Other, 10 Software and 6 Telecommunications. Regarding laboratory teaching experience, 23% of respondents had less than five years of teaching experience, 20% had between five and ten years of experience and 57% had ten or more years of experience.

    Participants completed the survey through Qualtrics. They were provided with insights into how the template functioned and how it could be tailored for purpose. The survey required participants to rank the multi-domain objectives listed in the Laboratory Learning Objectives Measurement (LLOM) instrument in order of importance, from most important (ranking = 1) to least important. A fixed initial ranking, based on the order in Table 1, was used to determine if any rankings remained unchanged. None of the rankings were left in the default state for the responses analyzed.
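
    A minimal sketch of that default-order check is shown below, assuming each response is stored as an ordered list of item codes; the variable and function names are illustrative and not part of the actual survey tooling.

```python
# Default presentation order from Table 1 (cognitive items shown as an example).
DEFAULT_ORDER = ["C1", "C2", "C3", "C4", "C5", "C6", "C7", "C8", "C9"]

def left_in_default_state(response: list[str]) -> bool:
    """True if a participant submitted the ranking exactly as it was initially presented."""
    return response == DEFAULT_ORDER

# Example: this response reordered the items, so it would be retained for analysis.
print(left_in_default_state(["C2", "C1", "C7", "C3", "C5", "C4", "C6", "C8", "C9"]))  # False
```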

    The data was analyzed in five groups:

    - Collectively (n = 160): This included all responses.

    - Traditional (n = 56): This covered those that only implemented face-to-face styled laboratories.

    - Online (n = 13): This covered those that only implemented online-styled laboratories.

    - Mixed (n = 90): This covered those that implemented laboratories that combined traditional and online modes.

    - Other (n = 1): As classified by the respondents as not fitting any of the groups. This data was not investigated separately, only within the collective.

    It must be noted that the online-only cohort is a relatively small sample, which could create some noise within the ranking order. However, the data are still helpful, as the authors believe that the presented ratio probably reflects the current ratio of implementations.

    Limitations

    This study does have certain constraints. It relies on a self-selection approach, meaning that the viewpoints expressed might predominantly reflect those of academics who are more actively involved in and influenced by research in engineering education. Although we provided guidance on how to understand and use the LLOM template, there is no assurance that every participant comprehended all the elements and correctly applied the template, including identifying key terms within the context. Despite inviting approximately 3000 academics to participate, only a relatively small number completed the survey in its entirety. It is worth noting that such a limited response rate aligns with common patterns observed in previous experiences of this kind.

    The statistician on the team analyzed the results. R version 4.0.5 was used for the statistical analysis, with the results shown in Tables 2 (cognitive), 3 (psychomotor) and 4 (affective). Rankings were determined using averages: the lower the average, the more important academics ranked that objective relative to objectives with a higher average. In brackets, the 95% confidence interval (CI) is shown. When two confidence intervals do not overlap, a statistically significant difference in mean values can be concluded. The differences between the online-only and traditional groups are highlighted in green. Differences between the online group and the collective are shown in blue. For example, for P2S in Table 3, the online-only group has a confidence interval of (2.16, 4.95) and the traditional group has a confidence interval of (5.10, 6.15). As the intervals do not overlap (the upper endpoint of the online-only interval, 4.95, is below the lower endpoint of the traditional interval, 5.10), a statistically significant difference in mean values can be concluded.
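
    The sketch below illustrates, on hypothetical rank data, how a mean rank with a 95% confidence interval can be computed per group and how the overlap check described above would be applied. It is written in Python for illustration only (the actual analysis was performed in R), uses a t-based interval as an assumption and is not the authors' analysis script.

```python
import numpy as np
from scipy import stats

def mean_rank_ci(ranks, confidence=0.95):
    """Mean rank with a t-based 95% confidence interval for one group's responses."""
    ranks = np.asarray(ranks, dtype=float)
    mean = ranks.mean()
    sem = stats.sem(ranks)                      # standard error of the mean
    half_width = stats.t.ppf((1 + confidence) / 2, df=len(ranks) - 1) * sem
    return mean, (mean - half_width, mean + half_width)

def intervals_overlap(ci_a, ci_b):
    """True if two confidence intervals overlap (no significant mean difference inferred)."""
    return ci_a[0] <= ci_b[1] and ci_b[0] <= ci_a[1]

# Hypothetical rankings of one objective (1 = most important) from two groups.
online = [2, 4, 3, 5, 2, 3]
traditional = [6, 5, 7, 6, 5, 6, 7, 5]

m_on, ci_on = mean_rank_ci(online)
m_tr, ci_tr = mean_rank_ci(traditional)
print(m_on, ci_on, m_tr, ci_tr, intervals_overlap(ci_on, ci_tr))
```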

    Table 2.  Learning objectives cognitive domain (Averages with 95% confidence interval) and ranking order.
    Item Collectively Mixed Modes Online Only Traditional Non-Param. ANOVA
    C1 3.11 (2.79, 3.42) 2.89 (2.46, 3.54) 4.08 (3.06, 6.05) 3.21 (2.68, 3.75) 0.181
    C2 3.31 (2.92, 3.69) 3.52 (3.09, 4.51) 2.62 (0.56, 4.77) 3.16 (2.50, 3.82) 0.3979
    C3 4.06 (3.69, 4.43) 3.77 (3.37, 4.63) 3.69 (1.91, 5.43) 4.61 (3.87, 5.35) 0.1374
    C4 5.50 (5.14, 5.86) 5.48 (5.21, 6.53) 6.00 (3.77, 8.01) 5.41 (4.81, 6.01) 0.4055
    C5 5.10 (4.84, 5.36) 5.29 (4.80, 5.78) 4.77 (3.34, 6.00) 4.84 (4.43, 5.25) 0.4481
    C6 6.23 (5.85, 6.61) 5.92 (5.05, 6.36) 8.23 (7.21, 9.01) 6.34 (5.69, 6.99) 0.0363
    C7 3.86 (3.54, 4.18) 4.07 (3.27, 4.48) 3.23 (1.96, 4.48) 3.70 (3.17, 4.22) 0.6472
    C8 6.54 (6.22, 6.85) 6.49 (5.60, 6.80) 5.54 (4.18, 7.16) 6.82 (6.29, 7.35) 0.2609
    C9 7.29 (6.98, 7.61) 7.58 (6.71, 7.80) 6.85 (4.75, 8.36) 6.91 (6.33, 7.49) 0.0423
    Rank
    1 C1 C1 C2 C2
    2 C2 C2 C7 C1
    3 C7 C3 C3 C7
    4 C3 C7 C1 C3
    5 C5 C5 C5 C5
    6 C4 C4 C8 C4
    7 C6 C6 C4 C6
    8 C8 C8 C9 C8
    9 C9 C9 C6 C9

    Table 3.  Learning objectives psychomotor domain (Averages with 95% CI) and ranking order.
    Item Collectively Mixed Modes Online Only Traditional Non-Param. ANOVA
    P1 2.46 (2.19, 2.73) 2.44 (1.74, 2.51) 3.46 (1.75, 5.80) 2.27 (1.81, 2.72) 0.02731
    P2H 4.13 (3.78, 4.48) 3.86 (3.30, 4.45) 5.08 (3.86, 8.14) 4.32 (3.68, 4.97) 0.03486
    P2S 5.24 (4.93, 5.56) 5.16 (4.75, 5.79) 4.08 (2.16, 4.95) 5.63 (5.10, 6.15) 0.05501
    P3 3.02 (2.72, 3.33) 3.10 (2.77, 3.89) 3.85 (2.37, 5.85) 2.71 (2.22, 3.21) 0.20317
    P4 5.23 (4.87, 5.60) 5.04 (4.85, 5.99) 3.31 (1.75, 4.92) 5.93 (5.30, 6.56) 0.00148
    P5 6.86 (6.56, 7.16) 7.03 (6.70, 7.73) 7.54 (5.37, 8.85) 6.52 (6.01, 7.03) 0.19424
    P6H 4.90 (4.55, 5.25) 4.97 (4.19, 5.37) 6.31 (6.18, 7.60) 4.48 (3.90, 5.07) 0.00434
    P6S 6.50 (6.15, 6.85) 6.63 (5.68, 7.01) 5.46 (2.13, 6.75) 6.55 (6.01, 7.09) 0.01152
    P7 6.65 (6.29, 7.01) 6.77 (6.01, 7.27) 5.92 (3.90, 7.65) 6.59 (6.00, 7.18) 0.73002
    Rank
    1 P1 P1 P1 P1
    2 P3 P3 P4 P3
    3 P2H P2H P3 P2H
    4 P6H P6H P2S P6H
    5 P4 P4 P2H P2S
    6 P2S P2S P6S P4
    7 P6S P6S P7 P5
    8 P7 P7 P6H P6S
    9 P5 P5 P5 P7

    Table 4.  Learning objectives affective domain (Averages with 95% confidence interval) and ranking order.
    Item Collectively Mixed Modes Online Only Traditional Non-Param. ANOVA
    A1 2.49 (2.21, 2.77) 2.58 (1.87, 2.71) 3.92 (1.53, 5.81) 2.02 (1.65, 2.39) 0.00719
    A2 3.24 (3.01, 3.47) 3.39 (3.04, 3.90) 3.46 (2.78, 4.78) 2.93 (2.56, 3.29) 0.2859
    A3 3.58 (3.27, 3.88) 3.50 (2.97, 3.98) 2.77 (0.49, 4.18) 3.86 (3.36, 4.36) 0.163
    A4 5.50 (5.28, 5.72) 5.43 (5.03, 5.84) 5.46 (4.15, 6.29) 5.66 (5.36, 5.96) 0.82857
    A5 4.44 (4.13, 4.76) 4.40 (4.03, 5.07) 3.77 (2.51, 5.49) 4.64 (4.09, 5.20) 0.69254
    A6 4.23 (3.97, 4.49) 4.29 (3.84, 4.81) 3.92 (3.30, 5.15) 4.16 (3.75, 4.57) 0.9569
    A7 4.53 (4.20, 4.85) 4.41 (3.88, 5.03) 4.69 (3.35, 6.20) 4.73 (4.17, 5.29) 0.85778
    Rank
    1 A1 A1 A3 A1
    2 A2 A2 A2 A2
    3 A3 A3 A1 A3
    4 A6 A6 A5 A6
    5 A5 A5 A6 A5
    6 A7 A7 A7 A7
    7 A4 A4 A4 A4


    The value in the last column shows the p-value of the Kruskal-Wallis test, the non-parametric equivalent of ANOVA, used to account for non-Gaussian distributed data and better suited to the small sample sizes. The p-value tests for differences across groups: it examines whether, for a particular objective (e.g., C1), the mean responses differ across the laboratory modes. If the p-value is less than 5% (highlighted in grey), then responses differ across groups for that objective; otherwise, they do not.
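
    A minimal sketch of the per-objective Kruskal-Wallis comparison described above, using SciPy on hypothetical rank data; the group samples are assumptions for illustration only, and the original analysis was carried out in R.

```python
from scipy import stats

# Hypothetical rankings of a single objective (e.g., C1) by laboratory mode.
mixed = [3, 2, 4, 3, 2, 5, 3]
online_only = [4, 6, 5, 4, 5]
traditional = [3, 3, 2, 4, 3, 2]

# Kruskal-Wallis: non-parametric test of whether the groups' rank distributions differ.
h_stat, p_value = stats.kruskal(mixed, online_only, traditional)
print(f"H = {h_stat:.3f}, p = {p_value:.4f}")
# A p-value below 0.05 would indicate that responses for this objective differ across modes.
```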

    Each table also provides a visual representation of the objectives in ranking order. Visual representations can help develop a better understanding of data. Colour coding is used to show how the collective ranking differs across the laboratory modes. For example, in Table 2, C1 is light blue, so the different ranking of C1 for each laboratory mode can be easily observed by following the colour trend.

    From the sample, it was interesting that most respondents implemented mixed-mode laboratory activities. One reason could be that a substantial percentage of the academic community believes in the benefits of mixed-modal learning. Another reason could be that there is a growing opportunity to mesh online and hands-on skills for experimentation [7], such as in robotics [38].

    The results of the research questions are outlined in the upcoming discussion section.

    Each domain is discussed separately below.

    The data indicates that ranking preferences across the collective, traditional and mixed modes were mainly in alignment. The substantial differences came from academics implementing online-only laboratories. The online-only results show a firm preference for C2, the ability to 'design experiments/models to verify course concepts', as the most important objective, with a mean value lower than that of the highest-ranked objective in any other group, including the traditional group, which also ranked it first. However, it is important to note that this difference is not statistically significant.

    More interesting was that C1, 'understand the operation of equipment/software used within the laboratory', was ranked fourth by the online-only group. Across continents [20] and all disciplines apart from computer and software engineering [19], C1 was ranked first or second. While not statistically significant, this suggests a pattern in how academics working in computer-based contexts think about learning objectives. The other major differences are that C8, summarising findings, is ranked higher by the online group and that C6, 'safety', is ranked lowest (as expected). C6 is the only cognitive objective for which a statistical difference is found both across groups (grey highlight) and in mean values between the online-only group and both the collective (blue highlight) and traditional (green highlight) groups. C6 carries substantially more weight in the other groups. This is not surprising because safety is often touted as one of the main benefits of online laboratories [8,14], and it is therefore unlikely to be emphasised as a key learning objective. In traditional laboratories, by contrast, engaging with safe practices is one of the benefits [34,39] and hence receives higher emphasis. However, these insights overlook the fact that virtual reality-based online laboratory implementations may change this dynamic as the technology becomes prevalent, allowing for immersive experiences that bring safety front and center [40,41]; hence the need for this study to reflect on learning objectives.

    The opposing viewpoints in the traditional-only and online-only rankings would suggest that combining the two modes would diversify the cognitive focus of the learning experience. For example, online resources can play a supporting role in aiding understanding in a traditional laboratory [42]. However, the mixed-mode group data shows that those implementing such setups think broadly in line with academics implementing traditional-only laboratories.

    Across the psychomotor domain, the ranking pattern mimicked that found across the cognitive domain. There was a very clear alignment across the collective, mixed and traditional groups. P1, reflecting successful experimentation, consistently occupied the highest rank in each of these groups. The one noticeable outlier across the four groups was that P5, the psychomotor skills associated with fault finding, was ranked higher by the traditional group; however, this difference was not statistically significant. P5 had also been ranked mostly last or second last across the other comparisons undertaken in different studies by the researchers [19,20,21]. It seems appropriate that academics focusing more on the traditional laboratory approach, where things are more likely to go wrong, would rate P5 higher. The authors have previously argued [19] that there is merit in rethinking whether this objective should be ranked higher. It is apparent that objectives related to traditional implementations take higher precedence for those implementing mixed modes, just as was found in the cognitive domain.

    Across the groups, it was no surprise that the online rankings were the most different, as online and traditional modes have obvious differences in psychomotor opportunity. It is worth noting that virtual reality-based online laboratory implementations provide a platform to change this dynamic as they gradually become more immersive [40,43]. Four of the nine psychomotor items, P2H, P4, P6H and P6S, had statistically significant mean differences across groups.

    One standout observation involves P2H and P6H (statistically significant across both mean values and groups), representing the selection and operation of instruments. While in all other groups P2H is unanimously placed in the third position and P6H in fourth, they notably drop to fifth and eighth rank, respectively, when assessed solely within the online-only group. This is unsurprising, as hands-on activities would not be a primary focus in a hands-off environment. Another noticeable difference is that P4, construct/coding, moved up from fifth in the other groups to second in the online group. This divergence highlights a distinct pattern that sets the online-only group apart from the others in terms of their preferences and evaluations. This could be due to the less central role equipment plays in online modes compared with the activity of constructing/coding working circuits, simulations and programs.

    Different strengths and weaknesses of the modes were highlighted in the literature review, most with psychomotor implications; as such, engagement with psychomotor objectives differs considerably across modes. These findings show that the objectives that resonate most strongly with simulation/remote competencies were ranked higher than those associated with hardware. Interestingly, correctly conducting an experiment was the highest-ranked objective across all modes.

    Unlike the other two domains, there was almost complete alignment across all groups for the affective domain. In this particular domain, the collective, mixed and traditional groups exhibited a striking symmetry, mirroring one another perfectly. The significant outlier was the swapping of objectives A3 and A1 for the online group. That is, the online group ranked independent learning higher than teamwork, which is unsurprising as many online experimentation implementations are targeted at individual work. A1 was the only objective with a statistically significant difference recorded at the group level.

    Face-to-face learning easily enables many advantages of collaborative learning, especially soft skills [44]. This is not to say that teamwork is not possible in online modes; indeed, it is [37]. When synthesising the results from this study with the other three [19,20,21], it appears that discipline-based influences have the greatest impact on rankings across the affective domain. Ethics (A4) is ranked last across all groups. Given that data collection in online modes may more often occur away from the eyes of teaching staff, it may be wise to give this objective higher priority to ensure that the data being collected are the same as those being reported and analyzed, especially if marks are involved. It can be easy to manipulate data, and such practices must be discouraged as wrong. This is just one example, but ethics is clearly an area requiring greater consideration [45].

    As outlined in the literature review, most laboratory studies, especially those comparing laboratory mode implementations, focus on learning in the cognitive domain. As a result, the rankings in the cognitive domain correlate with such findings. The window of opportunity can be found in the psychomotor domain, where little effort has been made to gather non-perception-based empirical data on learning [5,17]. With a focus on different learning objectives, studies can attempt to measure and explore whether the ranking differences translate into differences in learning. Such knowledge can aid in making design decisions. It has never been more important to improve our understanding of psychomotor learning, given the impact ChatGPT and other AI technologies are about to have on cognitive learning experiences [46].

    We have been developing knowledge of laboratory learning, and we now better understand which learning objectives are important and to whom. The next step is synthesising this information and examining whether our assessment practices align, an area of which we need a much better understanding [22]. If they do not, we can start to make changes.

    We investigated the research question: "How do academics implementing different laboratory modes of teaching think about laboratory learning objectives and rank them in terms of importance?". An almost perfect alignment was found across all modes for the affective domain. The main difference was the swap in priority between independent and collaborative learning, which closely aligns with the typical experience a student may face engaging in such modes. Online-only academics prioritized independent learning over teamwork. While independent work may be the default approach when using many online technologies, collaborative learning is possible with the right technology and approach [37].

    For the cognitive and psychomotor domains, much similarity was found across the collective, traditional and mixed groups, with the greatest difference coming from the online-only group. These differences can, to some degree, be attributed to the technology associated with each mode. Specifically, the online-only group tended to assign higher rankings to items that were more relevant to their mode of learning. For example, 'safety' received the lowest ranking from the online-only group, most probably because students engaging in simulation or remote laboratory setups may not need to give significant consideration to safety due to the controlled, safe laboratory environment. However, if the focus of the technology changed, e.g., virtual reality bringing about highly immersive learning experiences where safety was the core learning objective, this could be very beneficial [40]. Similarly, virtual reality could simulate a great range of psychomotor activities. This reinforces the contribution of this study, allowing the academic community to reflect on the factors influencing ranking decisions, which, in turn, can influence their design choices. Academics can consider if the rankings are justified, optimal or need adjustment. The different areas of ranking priority identified can be used by researchers to home in on the strengths and weaknesses of different laboratory modes in their investigations, ultimately informing the development of more effective teaching strategies. One key takeaway from these findings is that academics should not let technology limitations guide their focus on the important laboratory objectives. The laboratory objective should be the focus.

    The first step in developing this holistic understanding is gaining insights into what learning is occurring beyond that defined in course learning objectives. Progress on this in terms of perceived learning has been made [18], and research is currently underway to build evidence regarding real learning. The second step is to understand which learning objectives are important and to whom, and major inroads have been made on this front [19,20,21]. The third step is to confirm if we are effectively assessing said objectives [17], the next research phase for the research team. By doing this, staff can associate the best mode, implementation and assessment with intended learning objectives. This will help engineering educators enhance the alignment of their teaching modes, implementations and assessments with their intended learning objectives.

    The authors declare they have not used Artificial Intelligence (AI) tools in the creation of this article.

    We would like to thank the constructive feedback provided by the reviewers. We also thank all the academics that took the time to participate in the survey. Without their time and input, this research would not have been possible.

    The authors have no conflicts of interest in this paper.

    This study was completed with UOW ethics approval number 2021/252.



    [1] Danković, D., Marjanović, M., Mitrović, N., Živanović, E., Danković, M., Prijić, A., et al., The Importance of Students' Practical Work in High Schools for Higher Education in Electronic Engineering. IEEE Transactions on Education, 2023, 66(2): 146–155. https://doi.org/10.1109/TE.2022.3202629 doi: 10.1109/TE.2022.3202629
    [2] Feisel, L.D. and Rosa, A.J., The Role of the Laboratory in Undergraduate Engineering Education. Journal of Engineering Education, 2005, 94(1): 121–130. https://doi.org/10.1002/j.2168-9830.2005.tb00833.x doi: 10.1002/j.2168-9830.2005.tb00833.x
    [3] Méndez Ruiz, J. and Valverde Armas, P., Designing a drinking water treatment experiment as a virtual lab to support engineering education during the COVID-19 outbreak. Cogent Engineering, 2022, 9(1): 2132648. https://doi.org/10.1080/23311916.2022.2132648 doi: 10.1080/23311916.2022.2132648
    [4] May, D., Morkos, B., Jackson, A., Beyette, F.R., Hunsu, N., Walther, J., et al., Switching from Hands-on Labs to Exclusively Online Experimentation in Electrical and Computer Engineering Courses. 2021 ASEE Virtual Annual Conference, 2021.
    [5] May, D., Alves, G.R., Kist, A.A. and Zvacek, S.M., Online Laboratories in Engineering Education Research and Practice. International Handbook of Engineering Education Research, 2023,525–552. https://doi.org/10.4324/9781003287483-29 doi: 10.4324/9781003287483-29
    [6] May, D., Terkowsky, C., Varney, V. and Boehringer, D., Between hands-on experiments and Cross Reality learning environments – contemporary educational approaches in instructional laboratories. European Journal of Engineering Education, 2023, 48(5): 783–801. https://doi.org/10.1080/03043797.2023.2248819 doi: 10.1080/03043797.2023.2248819
    [7] Ma, J. and Nickerson, J.V., Hands-on, simulated, and remote laboratories: A comparative literature review. ACM Computing Surveys (CSUR), 2006, 38(3): 7-es.
    [8] Stefanovic, M., Tadic, D., Nestic, S. and Djordjevic, A., An assessment of distance learning laboratory objectives for control engineering education. Computer Applications in Engineering Education, 2015, 23(2): 191–202. https://doi.org/10.1002/cae.21589 doi: 10.1002/cae.21589
    [9] May, D., Morkos, B., Jackson, A., Hunsu, N.J., Ingalls, A. and Beyette, F., Rapid transition of traditionally hands-on labs to online instruction in engineering courses. European Journal of Engineering Education, 2023, 48(5): 842–860. https://doi.org/10.1080/03043797.2022.2046707 doi: 10.1080/03043797.2022.2046707
    [10] Kruger, K., Wolff, K. and Cairncross, K., Real, virtual, or simulated: Approaches to emergency remote learning in engineering. Computer Applications in Engineering Education, 2022, 30(1): 93–105.
    [11] Campbell, J.O., Bourne, J.R., Mosterman, P.J. and Brodersen, A.J., The Effectiveness of Learning Simulations for Electronic Laboratories. Journal of Engineering Education, 2002, 91(1): 81–87. https://doi.org/10.1002/j.2168-9830.2002.tb00675.x doi: 10.1002/j.2168-9830.2002.tb00675.x
    [12] Gamo, J., Assessing a Virtual Laboratory in Optics as a Complement to On-Site Teaching. IEEE Transactions on Education, 2019, 62(2): 119–126. https://doi.org/10.1109/TE.2018.2871617 doi: 10.1109/TE.2018.2871617
    [13] Lindsay, E.D. and Good, M.C., Effects of laboratory access modes upon learning outcomes. IEEE Transactions on Education, 2005, 48(4): 619–631. https://doi.org/10.1109/TE.2005.852591 doi: 10.1109/TE.2005.852591
    [14] Balakrishnan, B. and Woods, P.C., A comparative study on real lab and simulation lab in communication engineering from students' perspectives. European Journal of Engineering Education, 2013, 38(2): 159–171. https://doi.org/10.1080/03043797.2012.755499 doi: 10.1080/03043797.2012.755499
    [15] Singh, G., Mantri, A., Sharma, O. and Kaur, R., Virtual reality learning environment for enhancing electronics engineering laboratory experience. Computer Applications in Engineering Education, 2021, 29(1): 229–243.
    [16] Ogot, M., Elliott, G. and Glumac, N., An Assessment of In-Person and Remotely Operated Laboratories. Journal of Engineering Education, 2003, 92(1): 57–64. https://doi.org/10.1002/j.2168-9830.2003.tb00738.x doi: 10.1002/j.2168-9830.2003.tb00738.x
    [17] Nikolic, S., Ros, M., Jovanovic, K. and Stanisavljevic, Z., Remote, Simulation or Traditional Engineering Teaching Laboratory: A Systematic Literature Review of Assessment Implementations to Measure Student Achievement or Learning. European Journal of Engineering Education, 2021, 46(6): 1141–1162. https://doi.org/10.1080/03043797.2021.1990864 doi: 10.1080/03043797.2021.1990864
    [18] Nikolic, S., Suesse, T., Jovanovic, K. and Stanisavljevic, Z., Laboratory Learning Objectives Measurement: Relationships Between Student Evaluation Scores and Perceived Learning. IEEE Transactions on Education, 2021, 64(2): 163–171. https://doi.org/10.1109/TE.2020.3022666 doi: 10.1109/TE.2020.3022666
    [19] Nikolic, S., Suesse, T.F., Grundy, S., Haque, R., Lyden, S., Hassan, G.M., et al., Laboratory learning objectives: ranking objectives across the cognitive, psychomotor and affective domains within engineering. European Journal of Engineering Education, 2023, 48(4): 559–614. https://doi.org/10.1080/03043797.2023.2248042 doi: 10.1080/03043797.2023.2248042
    [20] Nikolic, S., Suesse, T., Grundy, S., Haque, R., Lyden, S., Hassan, G.M., et al., A European vs Australasian Comparison of Engineering Laboratory Learning Objectives Rankings. SEFI 50th Annual Conference. European Society for Engineering Education (SEFI), 2023.
    [21] Nikolic, S., Suesse, T., Haque, R., Hassan, G., Lyden, S., Grundy, S., et al., An Australian University Comparison of Engineering Laboratory Learning Objectives Rankings. 33rd Australasian Association for Engineering Education Conference, 2022, 45–53.
    [22] Nightingale, S., Carew, A.L. and Fung, J., Application of constructive alignment principles to engineering education: have we really changed? AaeE Conference, 2007. Melbourne.
    [23] Krathwohl, D.R., A Revision of Bloom's Taxonomy: An Overview. Theory Into Practice, 2002, 41(4): 212–218.
    [24] Steger, F., Nitsche, A., Arbesmeier, A., Brade, K.D., Schweiger, H.G. and Belski, I., Teaching Battery Basics in Laboratories: Hands-On Versus Simulated Experiments. IEEE Transactions on Education, 2020, 63(3): 198–208.
    [25] Salehi, F., Mohammadpour, J., Abbassi, R., Cheng, S., Diasinos, S. and Eaton, R., Developing an Interactive Digital Reality Module for Simulating Physical Laboratories in Fluid Mechanics. Australasian Journal of Engineering Education, 2022, 27(2): 100–114. https://doi.org/10.1080/22054952.2022.2162673 doi: 10.1080/22054952.2022.2162673
    [26] Sriadhi, S., Sitompul, H., Restu, R., Khaerudin, S. and Wan Yahaya, W.A., Virtual-laboratory based learning to improve students' basic engineering competencies based on their spatial abilities. Computer Applications in Engineering Education, 2022, 30(6): 1857–1871.
    [27] Nikolic, S., Suesse, T.F., McCarthy, T.J. and Goldfinch, T.L., Maximising Resource Allocation in the Teaching Laboratory: Understanding Student Evaluations of Teaching Assistants in a Team Based Teaching Format. European Journal of Engineering Education, 2017, 42(6): 1277–1295. https://doi.org/10.1080/03043797.2017.1287666 doi: 10.1080/03043797.2017.1287666
    [28] Jackson, T., Shen, J., Nikolic, S. and Xia, G., Managerial factors that influence the success of knowledge management systems: A systematic literature review. Knowledge and Process Management, 2020, 27(2): 77–92. https://doi.org/10.1002/kpm.1622 doi: 10.1002/kpm.1622
    [29] Uzunidis, D. and Pagiatakis, G., Design and implementation of a virtual on-line lab on optical communications. European Journal of Engineering Education, 2023, 48(5): 913–928. https://doi.org/10.1080/03043797.2023.2173558 doi: 10.1080/03043797.2023.2173558
    [30] Memik, E. and Nikolic, S., The virtual reality electrical substation field trip: Exploring student perceptions and cognitive learning. STEM Education, 2021, 1(1): 47–59. https://doi.org/10.3934/steme.2021004 doi: 10.3934/steme.2021004
    [31] Coleman, P. and Hosein, A., Using voluntary laboratory simulations as preparatory tasks to improve conceptual knowledge and engagement. European Journal of Engineering Education, 2023, 48(5): 899–912. https://doi.org/10.1080/03043797.2022.2160969
    [32] Kollöffel, B. and de Jong, T., Conceptual Understanding of Electrical Circuits in Secondary Vocational Engineering Education: Combining Traditional Instruction with Inquiry Learning in a Virtual Lab. Journal of Engineering Education, 2013, 102(3): 375–393.
    [33] Nikolic, S., Suesse, T.F., Goldfinch, T. and McCarthy, T.J., Relationship between Learning in the Engineering Laboratory and Student Evaluations. Australasian Association for Engineering Education Annual Conference, 2015.
    [34] Cai, R. and Chiang, F.-K., A laser-cutting-centered STEM course for improving engineering problem-solving skills of high school students in China. STEM Education, 2021, 1(3): 199–224. https://doi.org/10.3934/steme.2021015
    [35] Vojinovic, O., Simic, V., Milentijevic, I. and Ciric, V., Tiered Assignments in Lab Programming Sessions: Exploring Objective Effects on Students' Motivation and Performance. IEEE Transactions on Education, 2020, 63(3): 164–172. https://doi.org/10.1109/TE.2019.2961647
    [36] Vial, P.J., Nikolic, S., Ros, M., Stirling, D. and Doulai, P., Using Online and Multimedia Resources to Enhance the Student Learning Experience in a Telecommunications Laboratory within an Australian University. Australasian Journal of Engineering Education, 2015, 20(1): 71–80. http://dx.doi.org/10.7158/D13-006.2015.20.1
    [37] Tang, H., Arslan, O., Xing, W. and Kamali-Arslantas, T., Exploring collaborative problem solving in virtual laboratories: a perspective of socially shared metacognition. Journal of Computing in Higher Education, 2023, 35(2): 296–319. https://doi.org/10.1007/s12528-022-09318-1
    [38] Romdhane, L. and Jaradat, M.A., Interactive MATLAB based project learning in a robotics course: Challenges and achievements. STEM Education, 2021, 1(1): 32–46. https://doi.org/10.3934/steme.2021003
    [39] Wahab, N.A.A., Aqila, N.A., Isa, N., Husin, N.I., Zin, A.M., Mokhtar, M., et al., A Systematic Review on Hazard Identification, Risk Assessment and Risk Control in Academic Laboratory. Journal of Advanced Research in Applied Sciences and Engineering Technology, 2021, 24(1): 47–62. https://doi.org/10.37934/araset.24.1.4762
    [40] Pedram, S., Palmisano, S., Skarbez, R., Perez, P. and Farrelly, M., Investigating the process of mine rescuers' safety training with immersive virtual reality: A structural equation modelling approach. Computers & Education, 2020, 153: 103891. https://doi.org/10.1016/j.compedu.2020.103891
    [41] Pedram, S., Palmisano, S., Miellet, S., Farrelly, M. and Perez, P., Influence of age and industry experience on learning experiences and outcomes in virtual reality mines rescue training. Frontiers in Virtual Reality, 2022, 3: 941225. https://doi.org/10.3389/frvir.2022.941225
    [42] Nikolic, S., Training laboratory: Using online resources to enhance the laboratory learning experience. 2014 International Conference on Teaching, Assessment and Learning (TALE), 2014, IEEE. https://doi.org/10.1109/TALE.2014.7062584
    [43] Marks, B. and Thomas, J., Adoption of virtual reality technology in higher education: An evaluation of five teaching semesters in a purpose-designed laboratory. Education and Information Technologies, 2022, 27(1): 1287–1305. https://doi.org/10.1007/s10639-021-10653-6
    [44] Carbone, G., Curcio, E.M., Rodinò, S. and Lago, F., A Robot-Sumo student competition at UNICAL as a learning-by-doing strategy for STEM education. STEM Education, 2022, 2(3): 262–274. https://doi.org/10.3934/steme.2022016
    [45] Gwynne-Evans, A.J., Chetty, M. and Junaid, S., Repositioning ethics at the heart of engineering graduate attributes. Australasian Journal of Engineering Education, 2021, 26(1): 7–24. https://doi.org/10.1080/22054952.2021.1913882
    [46] Nikolic, S., Daniel, S., Haque, R., Belkina, M., Hassan, G.M., Grundy, S., et al., ChatGPT versus Engineering Education Assessment: A Multidisciplinary and Multi-institutional Benchmarking and Analysis of this Generative Artificial Intelligence Tool to Investigate Assessment Integrity. European Journal of Engineering Education, 2023, 48(4): 559–614. https://doi.org/10.1080/03043797.2023.2213169
  • Author's biography

    Sasha Nikolic received a B.E. degree in telecommunications and a PhD in engineering education from the University of Wollongong, Australia, in 2001 and 2017, respectively. He is a Senior Lecturer of Engineering Education at the University of Wollongong. His interest is in developing career-ready graduates, involving research in teaching laboratories, artificial intelligence, industry engagement, work-integrated learning, knowledge management, communication, and reflection. Dr Nikolic has been recognised with many awards, including an Australian Award for University Teaching Citation in 2012 and 2019, and a 2023 AAEE Engineering Education Research Design Award. He is a member of the executive committee of AAEE and an Associate Editor for AJEE and EJEE.

    Sarah Grundy is an education-focused lecturer at the School of Chemical Engineering, The University of New South Wales. Sarah predominantly teaches design subjects at all levels (undergraduate to postgraduate) and has over 15 years of industry experience in research and development, manufacturing, and project management. Her passion is developing students into credible engineers who make an impact in whatever industry they join, through authentic learning practices.

    Dr. Rezwanul Haque is a Senior Lecturer specialising in Manufacturing Technology at the University of the Sunshine Coast and an inaugural member of the AAEE Academy. In 2019, Dr. Haque served as an Academic Lead at the School of Science and Technology, overseeing the launch of two new engineering programs and reviewing existing ones. His dedication to learning and teaching earned him Senior Fellow status at the Higher Education Academy (UK) in the same year. His research focuses on engineering education and material characterisation through neutron diffraction.

    Sulakshana Lal has a PhD in Engineering Education from Curtin University, Perth, WA, Australia. Her research focused on comparing the learning and teaching processes of face-to-face and remotely operated engineering laboratories. With a keen interest in the intersection of technology and education, Sulakshana has published several articles in reputable journals and presented her work at national and international engineering education conferences. Her expertise lies in understanding the nuances of different laboratory pedagogical settings and harnessing technology to enhance laboratory learning outcomes. She is passionate about sharing her knowledge and helping educators and students navigate the evolving landscape of engineering education.

    Dr. Ghulam M. Hassan is a Senior Lecturer in the Department of Computer Science and Software Engineering at The University of Western Australia (UWA). He received his PhD from UWA, and completed his MS and BS at Oklahoma State University, USA, and the University of Engineering and Technology (UET) Peshawar, Pakistan, respectively. His research interests span multidisciplinary problems, including engineering education, artificial intelligence, machine learning and optimisation in different fields of engineering and education. He is the recipient of multiple teaching excellence awards and received the AAEE Engineering Education Research Design Award in 2021 and 2023.

    Scott Daniel is a Senior Lecturer in Humanitarian Engineering at the University of Technology Sydney, and serves as Deputy Editor at the Australasian Journal of Engineering Education and on the Editorial Boards of the European Journal of Engineering Education, the African Journal of Teacher Education and Development, and the Journal of Humanitarian Engineering. Scott uses qualitative methodologies to explore different facets of engineering education, particularly humanitarian engineering. He won the 2019 Australasian Association for Engineering Education Award for Research Design for his work with Andrea Mazzurco on the assessment of socio-technical thinking and co-design expertise in humanitarian engineering.

    Dr. Marina Belkina is a Lecturer and First Year Experience Coordinator at Western Sydney University. She has taught a range of subjects and courses (Foundation, Diploma, first and second years of Bachelor's degrees, and an online Associate Degree). She has implemented numerous projects to support learning, including creating the YouTube channel Engineering by Steps, leading the development of HD videos for first-year engineering courses, developing an iBook for physics, creating 3D lectures and animations for Engineering Materials, and conducting research on students' barriers to higher education.

    Sarah Lyden completed her BSc-BE (Hons) at the University of Tasmania in 2011. From 2012 to 2015 she was a PhD candidate with the School of Engineering and ICT at the University of Tasmania. From March 2015 to February 2018 she was employed as the API Lecturer in the field of power systems and renewable energy, and since 2018 she has been a Lecturer in the School of Engineering. Sarah has been a member of the School of Engineering and ICT's STEM education and outreach team.

    Dr. Thomas F. Suesse completed his MSc (Dipl-Math) degree in mathematics at the Friedrich-Schiller-University (FSU) of Jena, Germany, in 2003, and then worked as a research fellow at the Institute of Medical Statistics, Informatics and Documentation (IMSID) at FSU. In 2005 he moved to Victoria University of Wellington (VUW), New Zealand, to begin his PhD in statistics; his degree was conferred in 2008 with the thesis 'Analysis and Diagnostics of Categorical Variables with Multiple Outcomes'. In 2009 Dr Suesse began working as a research fellow at the Centre for Statistical and Survey Methodology (CSSM) at the University of Wollongong, where he was appointed Lecturer in 2011 and promoted to Senior Lecturer in 2015. He is currently at FSU on a research fellowship.
  • © 2023 the Author(s), licensee AIMS Press. This is an open access article distributed under the terms of the Creative Commons Attribution License (http://creativecommons.org/licenses/by/4.0)