
Traditional virtual simulation interaction systems suffer data fragmentation when converting two-dimensional information into three-dimensional information, which reduces realism and leaves them unable to meet the teaching requirements of computer courses. How to integrate augmented reality (AR) technology into the educational environment therefore remains an urgent, unresolved issue. To address this problem, this paper investigates the data throughput limitations of virtual simulation interaction systems and proposes an application solution based on AR technology: a design for a virtual simulation interactive system tailored to computer-related courses. The system achieves its objectives through the collaborative interaction of AR hardware and supporting software algorithms. The AR hardware is subdivided into framework design and functional hardware design, while the software components encompass AR models, virtual interaction models, and fusion methods. Testing and comparing the data throughput of this system against two other virtual simulation interaction systems shows that the AR-optimized virtual simulation interactive system effectively increases data throughput and resolves the loss of realism in virtual interaction scenes caused by data fragmentation. The design provides a more realistic and efficient mode of interaction for teaching computer-related courses.
Citation: Juan Li, Geng Sun. Design of a virtual simulation interaction system based on enhanced reality[J]. Electronic Research Archive, 2023, 31(10): 6260-6273. doi: 10.3934/era.2023317
The continuous development and innovation of internet technology has driven the advancement of various new cutting-edge digital technologies, leading to significant social, political, and financial changes. It has also profoundly impacted education and has become an inseparable, important part of today's educational process [1]. Among these, virtual interaction technology is a common derivative of three-dimensional (3D) digital technology, film and television technology, image processing technology, and other media synthesis technologies [2]. Virtual interaction technology converts planar two-dimensional (2D) information into 3D spatial scenes and uses visual, auditory, and tactile signal communication processing to present 2D information more effectively, which improves the efficiency of information transmission and lowers the threshold of information acceptance. Based on these characteristics, virtual simulation interactive systems emerged in response and have been widely applied in specialized classes in the teaching domain. However, in the existing virtual simulation interaction systems applied in computer courses, there is an information rupture in the conversion from 2D information to the 3D virtual space [3]. The cause of this rupture is the weak data-stream throughput of existing virtual simulation interactive systems, which leads to congestion of the original data during 3D computation and reduces the fidelity of the restored virtual space image [4], hence producing an unsatisfactory teaching and learning experience.
Therefore, by introducing augmented reality (AR) technology, we designed a virtual simulation interactive system that can be applied in professional computer courses. AR has emerged as a rapidly developing technology in recent years [5]. It enables interactive experiences with the real world by enhancing the perception of real-world objects through computer-generated information. Today, mobile AR applications use tools such as head-mounted displays, cameras, global positioning system (GPS) sensors, smart glasses, smartphones, and tablets to blend the physical environment with digital content [6,7], merging the real world with virtual objects and projecting virtual objects into the real environment [4,8].
In the literature, numerous studies have discussed the effectiveness of AR in various disciplines, learning styles, and environments [9,10]. However, there is limited research addressing the issue of information discontinuity during the transition from two-dimensional information to a three-dimensional virtual space.
The research purpose of this paper is to understand whether the application of virtual interaction technology in the classroom can lead to a better teaching and learning experience and better outcomes. To answer this question, we considered the specific needs of our learners, the subject matter, and the available technological support. Beyond answering the research question, we identified and addressed a major hurdle to the successful application of virtual interaction systems in electronic teaching and learning: the problem of data discontinuity. Herein, an augmented reality model is proposed, from which a virtual interaction model is optimized for a better learning experience. We also investigated the feasibility of regularly updating and maintaining the virtual interaction system to keep the learning materials current and effective. The novelty of this study lies in the newly designed system, which incorporates a collaborative synergy of AR hardware and software: the 3D virtual scene calculation model is optimized to improve the system's data throughput and enhance its virtual interaction ability. Through the hardware and software design, the design of the virtual simulation interaction system was completed. The experimental results indicate that the designed virtual interactive simulation system achieves a significant increase in data throughput, effectively addressing the issue of information discontinuity in teaching.
Based on the existing virtual interaction and AR technologies, the hardware framework of the virtual simulation interaction system for computer professional courses was designed. The framework provides the construction and support for the AR hardware. The specific framework structure is shown in Figure 1.
In designing the hardware frame of the system, referring to the design parameters of VR equipment [9,10], we introduced a variety of access ports for the virtual interactive hardware and made the frame's power supply unit an independent module. This not only ensures the stability of the circuit but also enlarges the power management area, providing a circuit guarantee for the stable operation of the high-performance AR processing hardware [11].
To meet the AR hardware framework's requirement for high-pixel image acquisition, the structure of the corresponding camera photosensitive device in the hardware was optimized. The structure is shown in Figure 2.
The hardware design is based on integrated stack technology: three high-precision camera sensors are arranged horizontally to form the AR image acquisition module [12]. The three sensors are the ambient light camera sensor, the structured light camera sensor, and the motion camera sensor, which together satisfy the acquisition requirements. An infrared sensor and an infrared light are added on the left side of the sensor array. The VX766 video signal sensor, the SQL1057 audio sensor, and the EQS21J73 image sensor serve as the three core data sensors in the control design, while a Qualcomm Snapdragon 7 series mobile processor is used as the central processing unit (CPU) to ensure stable computational power. Since the above information is mainly image-based, the eight-channel QN217 signal mixer is used to fuse the data in the information flow fusion interaction design [13,14].
In the virtual data hardware design, the microphone audio data, gyroscope motion data, and video image data fusion signal are converted into a 3D visual space through visual communication integrated circuit (IC) processing [15]. There are two kinds of visual communication interfaces: the local data interaction protocol interface, which adopts a Process Design Language (PDL) coaxial design, and the external communication protocol interface, which adopts a Type-C interface and follows the USB 3.1 transmission standard.
In the hardware design of data signal processing, the control unit is used to separate the information signal processing from the visual data control, and the independent design is beneficial for the allocation of signal computing power; meanwhile, the virtual data variables can be dynamically monitored with different types of signal processing [16].
Based on this design idea, the control unit is composed of the Processor-in-the-Loop (PIL) modulus signal control unit, the SHG8021 signal shift register, high-speed control memory, and dynamic flash memory control. For power supply control, a two-way scheme is adopted: two circuit control ICs form a parallel control unit. In the normal state, the two power management ICs jointly control the allocation of circuit resources; when one IC becomes abnormal, the other can independently keep the whole circuit operating, ensuring the normal operation of the designed hardware.
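To make the two-way power control concrete, the following is a minimal Python sketch of the failover logic; the class name and the status-polling callables are hypothetical illustrations, not part of the actual IC design.

```python
from enum import Enum

class ICStatus(Enum):
    NORMAL = 0
    ABNORMAL = 1

class DualPowerController:
    """Parallel power-management unit: two ICs share the allocation in the
    normal state; if one becomes abnormal, the other takes over alone."""

    def __init__(self, read_status_a, read_status_b):
        # Hypothetical callables polling each power-management IC
        self.read_a = read_status_a
        self.read_b = read_status_b

    def allocate(self):
        a, b = self.read_a(), self.read_b()
        if a == ICStatus.NORMAL and b == ICStatus.NORMAL:
            return {"IC_A": 0.5, "IC_B": 0.5}   # both normal: shared control
        if a == ICStatus.NORMAL:
            return {"IC_A": 1.0, "IC_B": 0.0}   # IC_B abnormal: IC_A takes over
        if b == ICStatus.NORMAL:
            return {"IC_A": 0.0, "IC_B": 1.0}   # IC_A abnormal: IC_B takes over
        raise RuntimeError("both power-management ICs abnormal")

# Example: IC_B reports a fault, so IC_A controls the whole circuit alone
ctrl = DualPowerController(lambda: ICStatus.NORMAL, lambda: ICStatus.ABNORMAL)
print(ctrl.allocate())
```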
The concepts of AR models and virtual interaction models are distinct. An AR model refers to the technology that overlays virtual information onto the real world, while a virtual interaction model involves recognizing and determining objects (such as 2D, 3D, GPS, motion sensing, facial recognition, etc.) through devices, superimposing virtual information at a specific location relative to these recognized objects, and displaying it on the device screen, enabling real-time interaction with the virtual information. AR models and virtual interaction models can be combined for a better user experience. For instance, AR models can be used to create virtual objects; virtual interaction models can then be employed to interact with these objects. This combined usage can enhance the user experience and increase user engagement. Therefore, we introduce the construction of the AR model and of the virtual interaction model in the next two sections, respectively.
In order to realize the function of the AR hardware, a model of AR data processing is constructed in the system design. Using visual image distribution technology, the visual communication environment is constructed as a 3D scene, and a corresponding data algorithm restores the image information according to its spatial distribution characteristics. In this scene, the change coefficient of the virtual image data is related to the resolution during the image restoration process. Through signal fusion with the AR hardware, a human-machine mixed data simulation signal is obtained, and the distribution coefficient of the signal in the visual space is improved by simulating the spatial coordinate strength of the augmented signal, thus yielding the AR model. The overall structure of the model is shown in Figure 3.
In building the model, we take the characteristics of virtual simulation interactive scenes fully into account and use visual communication and 3D scene construction technology to restore the visual space scene from the image signals collected by the AR hardware. While keeping the original single-pixel features at their original positions unchanged, we combine adjacent pixel areas by superimposing their features through the visual synthesis algorithm. The combined single-pixel areas are then rearranged according to the arrangement structure of the visual communication pixels, and the mixed data of the enhanced signals are arranged according to the distribution characteristics of the red-green-blue (RGB) pixels in the visual space [17].
According to the above process, a set of different pixel positions is synthesized, and a virtual restored coordinate system $(X_v, Y_v, Z_v)$, a real coordinate system $(X_w, Y_w, Z_w)$, and an enhanced scene coordinate system $(X_c, Y_c, Z_c)$ are generated according to the distribution characteristics of the virtual space. The gray-level coefficients of the virtual enhanced pixels during interaction are obtained by coordinate fusion:
$$qixel\_S = \begin{bmatrix} V_{3\times 3} & W_{3\times 1} \\ O^{T} & 1 \end{bmatrix} \begin{bmatrix} X_c(X_w - X_v) \\ Y_c(Y_w - Y_v) \\ Z_c(Z_w - Z_v) \\ 1 \end{bmatrix}, \tag{1}$$
where $V_{3\times 3}$ represents the pixel composition matrix of the enhanced virtual signal and $W_{3\times 1}$ represents the visual space enhancement component.
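As a numerical illustration of Eq (1), the following sketch evaluates the coordinate fusion with assumed example values for $V_{3\times 3}$, $W_{3\times 1}$, and the three coordinate systems; all inputs here are placeholders, since the paper does not list concrete values.

```python
import numpy as np

# Assumed example inputs; the paper gives no concrete values.
V = np.eye(3)                          # pixel composition matrix of the enhanced virtual signal
W = np.array([[0.1], [0.2], [0.3]])    # visual-space enhancement component (3x1)

Xv, Yv, Zv = 1.0, 2.0, 3.0             # virtual restored coordinates
Xw, Yw, Zw = 1.5, 2.5, 3.5             # real coordinates
Xc, Yc, Zc = 0.8, 0.9, 1.1             # enhanced scene coordinates

# Block matrix [[V, W], [O^T, 1]] from Eq (1)
M = np.block([[V, W], [np.zeros((1, 3)), np.ones((1, 1))]])

# Homogeneous coordinate-difference vector
p = np.array([Xc * (Xw - Xv), Yc * (Yw - Yv), Zc * (Zw - Zv), 1.0])

qixel_S = M @ p                        # gray-level coefficients of the virtual enhanced pixels
print(qixel_S)
```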
The image virtual space coordinate weights in the AR model are then calculated. The original 2D data information is transformed into 3D space information through the differential fusion of the above three coordinate systems, and the 3D space structure of the model is reconstructed based on the 3D spatial distribution characteristics of the data bitmap and the virtual space structure. Combined with the mapping relationship between the distribution of the enhanced hardware signal and that of the enhanced virtual signal, the weighted coefficient of the pixel corresponding to the virtual space coordinates of the image in the AR model is obtained as follows:
$$Q = qixel\_S - \begin{bmatrix} \dfrac{f}{dx} & 0 & u_0 & 0 \\ 0 & \dfrac{f}{dy} & v_0 & 0 \\ 0 & 0 & 1 & 0 \end{bmatrix} \begin{bmatrix} X_c \\ Y_c \\ Z_c \\ 1 \end{bmatrix}, \tag{2}$$
where $(u_0, v_0)$ represents the virtual weight coefficient of the pixel of the enhanced image, and, after the weight calculation, $\frac{f}{dx}$ and $\frac{f}{dy}$ give the physical scale of a single enhanced virtual pixel along the x-axis and y-axis.
In the 3D space constructed by the virtual simulation interactive system, the pixel size along different coordinates determines the visual transfer coefficients of the different dimensions of the space. Therefore, according to the obtained physical size of the enhanced pixel, the center coordinate point of the enhanced pixel's three-dimensional space in the visual image space is set to $(u_i, v_i)$.
The output weight of the 3D virtual scene is determined by calculating the signal weight coefficient after fusion. However, because of the enhancement range associated with the weight coefficients, they cannot be calculated directly. To obtain the weight of the fusion signal more accurately, the design model calculates three component weights: the weight of the visual virtual scene, the weight of the enhancement coefficient, and the weight of the fusion coefficient. These three groups of weights are then fused, using the pixel signal fusion algorithm above, according to the distribution coefficient of the 3D visual virtual scene, yielding the output weight of the AR pixels in the visual 3D virtual scene. The output expression of the AR model is as follows:
$$G = \sum_{i=1}^{n} Q \cdot (u_i, v_i). \tag{3}$$
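A sketch of how Eqs (2) and (3) could be evaluated numerically follows. The paper leaves some dimensions implicit, so this code makes plausible choices: $qixel\_S$ is taken as a 3-component vector, and the sum in Eq (3) pairs the image-plane part of $Q$ with each assumed pixel center $(u_i, v_i)$; all numeric values are illustrative.

```python
import numpy as np

# Assumed camera-style parameters for the projection matrix in Eq (2)
f, dx, dy, u0, v0 = 35.0, 0.01, 0.01, 480.0, 480.0
K = np.array([[f / dx, 0.0,    u0,  0.0],
              [0.0,    f / dy, v0,  0.0],
              [0.0,    0.0,    1.0, 0.0]])

Xc, Yc, Zc = 0.8, 0.9, 1.1
qixel_S = np.array([0.4, 0.45, 0.55])   # illustrative 3-component value from Eq (1)

# Eq (2): weighted coefficient of the pixel for the virtual space coordinates
Q = qixel_S - K @ np.array([Xc, Yc, Zc, 1.0])

# Eq (3): output weight G, pairing Q's image-plane part with each center (u_i, v_i)
centers = [(479.0, 481.0), (480.5, 479.5)]   # assumed enhanced-pixel centers
G = sum(float(np.dot(Q[:2], np.array(c))) for c in centers)
print(G)
```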
Based on the AR model obtained by calculation, the establishment of the virtual interaction model in the design system can be further completed.
Since the information in the AR model established above only has display characteristics, it cannot autonomously complete information interaction and virtual scene docking under specific conditions. Therefore, based on the output formula of the AR model, the virtual information fusion clustering calculation method is used to perform interactive scene replacement of the information data in the model scene. In the RGB space value domain, we extract a single pixel as an illustration.
In the pixel array, each pixel is composed of micro-pixels whose arrangement factor is determined by the mixing coefficient of the single pixel's own area and the resolution of the global pixel-composed image. Therefore, according to the virtual coordinate distribution features of the 3D space and the symmetry of the information flow of the corresponding RGB-F micro-pixels [18], the original pixel information in the virtual scene can be dynamically decomposed. Following the development of the time variable, the corresponding micro-pixel information flow replaces the original pixel coordinate positions one by one, and the corresponding time relationship is updated. The calculation function realizing this information exchange under different time variables is as follows:
$$H = \cos\left(\varepsilon^{2} \log_{2}(G)\right), \tag{4}$$
where $\varepsilon$ represents the update coefficient corresponding to the time variable of the micro-pixel information stream.
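A one-line helper makes Eq (4) concrete; the values for $G$ and $\varepsilon$ below are assumed purely for illustration.

```python
import math

def micro_pixel_update(G: float, eps: float) -> float:
    """Eq (4): H = cos(eps^2 * log2(G)), with eps the update coefficient
    of the micro-pixel information stream."""
    if G <= 0:
        raise ValueError("G must be positive for log2")
    return math.cos(eps ** 2 * math.log2(G))

print(micro_pixel_update(G=8.0, eps=0.5))  # assumed example values
```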
On this basis, the difference between the micro-pixel update time weight and the original pixel replacement information flow is defined as a virtual space interaction variable to realize the unification of the information interaction variable and the virtual space interaction variable.
According to the consistency of the corresponding windows between the two, the window scale of the virtual space interaction variable is obtained as follows:
$$R_i = \frac{1}{n-1} \sum_{t=2}^{n} \left| \mu_{ti} - \mu_{(t-1)i} \right|, \tag{5}$$
where $\mu_i$ represents the mean window scale at the $i$-th pixel point, and $\mu_{ti}$ represents the scale of the $i$-th pixel in the $t$-th virtual pixel array. To reduce the disturbance coefficient in the virtual docking process, wavelet decomposition is applied, according to the window scale of the interactive pixels, to denoise the reality-augmented model; the membership function of the virtual interactive pixel after the disturbance factor is removed is as follows:
$$\sigma_i = (1-\alpha) R_i + \alpha \left| \mu_{i+1} + \mu_i \right|, \tag{6}$$
where $\alpha$ represents the noise reduction coefficient corresponding to the pixel scale window. According to the pixel scale distribution characteristics of the denoised model, the fusion characteristics of the interactive pixel window scale during the virtual docking process are as follows:
$$qixel\_M = \max\left( \sum_{i=1}^{8} (\sigma_i - qixel\_S) \right). \tag{7}$$
If $qixel\_S < qixel\_M$, the pixel distribution characteristics of the window can be calculated; the calculated value is then substituted into the AR model for data fusion, and the virtual interaction model is obtained as follows:
$$GRAY(x,y) = \begin{cases} 1, & GRAY(x,y) \geq threshold \\ 0, & GRAY(x,y) < threshold \end{cases}, \tag{8}$$
where $threshold$ represents the virtual docking component of pixels of different window scales.
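Putting Eqs (5)-(8) together, the following sketch runs the virtual interaction pipeline on assumed array data. The window count of 8 follows Eq (7), while the docking threshold and the single-window treatment of the outer max in Eq (7) are illustrative assumptions.

```python
import numpy as np

def window_scale(mu):
    """Eq (5): R_i = (1/(n-1)) * sum_{t=2..n} |mu[t,i] - mu[t-1,i]|;
    mu has shape (n, num_pixels)."""
    n = mu.shape[0]
    return np.abs(np.diff(mu, axis=0)).sum(axis=0) / (n - 1)

def membership(R, mu_mean, alpha):
    """Eq (6): sigma_i = (1 - alpha) * R_i + alpha * |mu_{i+1} + mu_i|."""
    return (1 - alpha) * R[:-1] + alpha * np.abs(mu_mean[1:] + mu_mean[:-1])

def fusion_feature(sigma, qixel_S):
    """Eq (7) for a single candidate window; the outer max would range
    over candidate windows, of which only one is shown here."""
    return float(np.sum(sigma[:8] - qixel_S))

def docking_map(gray, threshold):
    """Eq (8): binarize GRAY(x, y) against the virtual docking threshold."""
    return (gray >= threshold).astype(np.uint8)

# Assumed data: 10 virtual pixel arrays of 16 pixels each
mu = np.random.default_rng(0).random((10, 16))
R = window_scale(mu)
sigma = membership(R, mu.mean(axis=0), alpha=0.3)
qixel_S = 0.2
qixel_M = fusion_feature(sigma, qixel_S)
if qixel_S < qixel_M:                      # fusion condition preceding Eq (8)
    print(docking_map(mu.mean(axis=0).reshape(4, 4), threshold=0.5))
```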
The steps to realize the virtual interaction of computer-specialized courses based on AR are illustrated as follows.
Supported by the designed hardware framework, the AR hardware converts the course information data into an image information stream. At the hardware output end, Java Eclipse serves as the development environment [19], MySQL provides the virtual data processing resource base [20], and a 3D virtual space processing algorithm under the B/S architecture [21] maps the information flow data to spatial coordinate positions to constitute the virtual interactive scene. Throughout the process, the hardware's information collection, data processing, and information fusion convert the course information data into visual information, and AR technology enhances the information weights of the visual transmission; the AR model and the virtual interaction model from the software part then process the visual communication information flow. The virtual interaction function of the output scene is thereby realized, completing the design of the AR-based virtual simulation interaction system for computer professional courses. The implementation flow is shown in Figure 4.
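The implementation flow can be summarized schematically as below. The actual system uses Java Eclipse, MySQL, and a B/S architecture, so this Python sketch only mirrors the order of the stages; every helper function is a placeholder named after a stage described in the text, not a real system API.

```python
import math

def acquire_image_stream(course_data):
    """AR hardware stage: convert course data into an image information stream."""
    return [ord(ch) / 255 for ch in course_data]

def fetch_virtual_resources(stream):
    """Resource-base lookup stage (MySQL in the actual system)."""
    return {"pixels": stream}

def build_ar_model(resources):
    """Stand-in for Eqs (1)-(3): produce the AR model output G."""
    return sum(resources["pixels"])

def build_interaction_model(G, eps=0.5):
    """Stand-in for Eqs (4)-(8): derive the interaction value from G."""
    return math.cos(eps ** 2 * math.log2(G)) if G > 0 else 0.0

def map_to_scene(value):
    """Map the information flow onto 3D coordinates to form the scene."""
    return {"scene_weight": value}

def run_pipeline(course_data: str):
    """Schematic order of the implementation flow shown in Figure 4."""
    stream = acquire_image_stream(course_data)
    resources = fetch_virtual_resources(stream)
    G = build_ar_model(resources)
    H = build_interaction_model(G)
    return map_to_scene(H)

print(run_pipeline("computer course data"))
```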
The performance of the AR-based virtual simulation interactive system was tested. The test was completed using the GNU's Not Unix (GNU) development test tool.
To place the designed system and the comparison systems in a unified test environment, a single development tool was used to test all systems. The development tool was selected according to the following six criteria:
1) The authority of the development tool: whether the performance data obtained by the development test tool can represent the actual performance of the test system and whether it can truthfully reflect the parameter changes of the system;
2) The flexibility of the development test tool: whether it supports different application logics in the same scenario;
3) The convergence of the development test tool: the controllability of the obtained test parameters and of each parameter during the test process;
4) The extensibility of the development test tool: whether the test tool has the function of adding a tool library;
5) The stability of the development test tool: whether the scheduling status of the overall tool operation resources during the test is stable; and
6) The data analysis ability of the development test tool: the ability to analyze the various data of the test system obtained during the test.
According to the above six criteria, the GNU development test tool was selected from among many candidates for this test.
In the test, the virtual interactive image testing function of GNU was used to evaluate the enhancement effect of the virtual interactive scene. To show the test results more intuitively, two comparison systems were included. In addition, to ensure consistent standards for the parameter data obtained during testing, the experimental parameters of the testing systems were set as listed in Table 1 below:
Table 1. Experimental parameter settings.

| Project | Parameter | Project | Parameter |
|---|---|---|---|
| Output image resolution before enhancement of the virtual simulation interactive system | 180 × 240 | Output virtual interactive image resolution after AR | 960 × 960 |
| Test system virtual interactive scene size | 240 × 240 × 240 | Comparison template specifications | 8 × 8 |

Based on the experimental parameters set in Table 1, a test scenario was created in which a cloud computer with virtual acquisition equipment ran the AR-based virtual simulation interactive system for professional computer courses, as shown in Figure 5.
Without considering the influence of third-party factors, in the designed test scenario we ran a comparison test of the multi-relevance Virtual Reality Modeling Language (VRML) collaborative virtual reality simulation system, the homework virtual simulation training system, and the designed system, and tested the information flow throughput capability of each virtual simulation interactive system for professional courses. The test result is shown in Figure 6.
The test analysis in Figure 6 compares the data throughput of the three systems when the same 2D data information is converted into the same 3D virtual interactive scene in the same data interaction environment. As shown in Figure 6(a), the data throughput of the proposed AR-based system for professional computer courses is relatively high. As shown in Figure 6(b),(c), among the two comparison systems, the multi-relevance VRML collaborative virtual reality simulation system outperforms the homework virtual simulation training system, and both fall below the throughput of the proposed design. From the data throughput and the corresponding time, the specific throughput value of each virtual simulation interactive system per unit time can be calculated, as shown in Table 2, where the multi-relevance VRML collaborative virtual reality simulation system is denoted as test system 1 and the homework virtual simulation training system as test system 2.
Table 2. Data throughput test results per unit time.

| System | Data interaction gain (KBPS) | Maximum data virtual capacity (KBPS) | Mean value of virtual data interaction (KBPS) | Virtual interaction stability coefficient | Data congestion factor |
|---|---|---|---|---|---|
| Design system | 560 | 1331 | 550 | 2.42 | 0 |
| Test system 1 | – | 872 | 351 | 2.48 | 1.4 |
| Test system 2 | – | 88 | –3496 | –0.02 | 0.1 |
Comparing the data in Table 2, the designed system exhibits the best virtual interaction ability, the best overall stability, and the best virtual interaction quality under AR optimization.
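Per-unit-time summaries such as those in Table 2 can be derived from sampled throughput traces like those in Figure 6. The sketch below uses illustrative assumed data, not the paper's measurements, and the reading of the "stability coefficient" as a mean-over-spread ratio is an assumption.

```python
import numpy as np

def throughput_stats(volume_kb, times_s):
    """Summarize per-unit-time throughput from a sampled trace:
    volume_kb[i] is the cumulative data volume (KB) observed at times_s[i]."""
    rates = np.diff(volume_kb) / np.diff(times_s)      # KB/s between samples
    std = rates.std()
    return {
        "max_kbps": float(rates.max()),
        "mean_kbps": float(rates.mean()),
        # One plausible reading of the stability coefficient: mean over spread
        "stability": float(rates.mean() / std) if std > 0 else float("inf"),
    }

# Illustrative assumed trace (not the paper's measured data)
t = np.arange(0.0, 10.0)
volume = np.cumsum(np.full(10, 550.0))                 # roughly 550 KB/s stream
print(throughput_stats(volume, t))
```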
The utilization of AR in engineering education, particularly in computer-specific subjects, has been extensively examined, and its practicality has been thoroughly explored. Incorporating AR technology into the learning process is believed to enhance the engagement and immersiveness of learning activities [22]. Both teachers and students can benefit from virtual practices, hands-on simulations, interactive labs, gamification, and so on, provided that the AR environment is well supported by the integration of hardware and software [23]. Based on the characteristics of the designed hardware and the mechanism of 2D data transformation in a 3D virtual space, two models, an AR model and a 3D virtual interaction model, are established, and the model data fusion algorithm is applied to the system's 3D scene to improve the scene effect and solve the problem of low virtual interaction data throughput [24].
However, the design leaves some open questions: whether the variables in the 2D information conversion process remain within the controllable range of the AR model; whether the data symmetry of the converted virtual simulation interactive scene is stable; how the information of different 2D courses differs when converted into 3D virtual variables; and whether the system database can be optimized to its best state by updating the data of different virtual scenes. Since a better learning experience can improve both students' learning interest and their knowledge comprehension, the collaboration of hardware and software and the existing AR and virtual interaction models applied in computer-specific and other engineering subjects are worthy of further exploration and redevelopment.
This paper presents a pioneering study that employs a novel system harnessing the collaborative advantages of AR hardware and software. The focus of this research is on addressing the issue of diminished realism due to inefficient data throughput, particularly data fragmentation. The significance of this study lies in its introduction of an AR-driven virtual interactive approach tailored for computer-related courses. Alongside adopting this approach, two innovative models are developed for AR and virtual interaction.
Empirical research results validate the feasibility and effectiveness of the proposed methodology. Adopting this new instructional approach not only enhances student engagement but also assists them in better comprehending and mastering complex computer concepts, thereby significantly improving learning outcomes. Our forthcoming efforts will be dedicated to expanding the application of this vivid, immersive, and naturally interactive method to a broader educational context, including non-computer-related subjects. Additionally, we will design a more user-friendly method for generating learning content, thereby providing a new impetus to the advancement of education.
The authors declare they have not used Artificial Intelligence (AI) tools in the creation of this article.
This research study is sponsored by these projects: the Jiangsu Higher Education Reform Research Project (2021jsjg641), the Jiangsu Educational Science "14th five-year plan" Project (B/2021/01/13), and the Science and Technology Research Program of Chongqing Municipal Education Commission (Grant No. KJZD-M202201801 and Grant No. KJZD-K202001801).
The authors declare no conflicts of interest.
[1] C. Troussas, A. Krouska, C. Sgouropoulou, Impact of social networking for advancing learners' knowledge in E-learning environments, Educ. Inf. Technol., 26 (2021), 4285–4305. https://doi.org/10.1007/s10639-021-10483-6
[2] Z. Yu, Z. Ran, Simulation research on touch perception control of virtual interactive system based on VRML, Comput. Simul., 37 (2020), 193–197.
[3] R. Azuma, Y. Baillot, R. Behringer, S. Feiner, S. Julier, B. MacIntyre, Recent advances in augmented reality, IEEE Comput. Graphics Appl., 21 (2001), 34–47. http://doi.org/10.1109/38.963459
[4] C. Yang, Research on the application of computer virtual reality technology, Inf. Commun., 10 (2020), 161–163.
[5] E. Liu, S. Cai, Z. Liu, L. Liu, WebART: Web-based augmented reality learning resources authoring tool and its user experience study among teachers, IEEE Trans. Learn. Technol., 16 (2023), 53–65. https://doi.org/10.1109/TLT.2022.3214854
[6] C. Cong, J. Li, K. Qin, Design and application of virtual simulation training system in urban rail transit operation, Urban Mass Transit, 23 (2020), 44–49. https://doi.org/10.16037/j.1007-869x.2020.08.011
[7] D. Li, Design of plane image interactive system based on virtual reality technology, Mod. Electron. Tech., 43 (2020), 158–160. https://doi.org/10.16652/j.issn.1004-373x.2020.08.041
[8] Y. Dong, Implementation of multi-associative VRML collaborative virtual reality simulation system, Comput. Simul., 36 (2019), 372–376.
[9] B. Gan, C. Zhang, Y. Chen, Y. Chen, Research on role modeling and behavior control of virtual reality animation interactive system in Internet of Things, J. Real-Time Image Process., 18 (2021), 1069–1083. https://doi.org/10.1007/s11554-020-01046-y
[10] Y. Zhang, C. Liu, Research on the integrated application of VR, AR technology in long and short weapons events, Bull. Sport Sci. Technol., 29 (2021), 187–189.
[11] A. Marougkas, C. Troussas, A. Krouska, C. Sgouropoulou, A Framework for Personalized Fully Immersive Virtual Reality Learning Environments with Gamified Design in Education, IOS Press, Amsterdam, 2021.
[12] Y. Wang, Complex virtual training simulation system design, Mod. Def. Technol., 43 (2015), 215–222.
[13] Q. Niu, Motion Simulation and Virtual Disassembly System Design of Automobile Rear Axle in Augmented Reality Environment, Master thesis, Wuhan University of Technology in Hubei, 2019. https://doi.org/10.27381/d.cnki.gwlgu.2019.000894
[14] J. Yong, Y. Wang, B. Yue, W. Wang, Research on the construction of virtual simulation experimental teaching resource system based on augmented reality technology, Ind. Inf. Technol. Educ., 10 (2019), 85–89.
[15] W. Wang, W. Zhang, Y. Li, The application of augmented reality technology in industrial robot teaching, Intern. Combust. Eng. Parts, 20 (2019), 251–252. https://doi.org/10.19475/j.cnki.issn1674-957x.2019.20.133
[16] C. Papakostas, C. Troussas, A. Krouska, C. Sgouropoulou, On the Development of a Personalized Augmented Reality Spatial Ability Training Mobile Application, IOS Press, Amsterdam, 2021.
[17] Y. Chen, W. Zhang, S. Chen, The analysis of the effects of museum learning based on augmented reality: taking the "AR box" virtual simulation learning environment as an example, Mod. Distance Educ. Res., 32 (2020), 104–112.
[18] S. Cai, E. Liu, Y. Shen, C. Liu, S. Li, Y. Shen, Probability learning in mathematics using augmented reality: Impact on student's learning gains and attitudes, Interact. Learn. Environ., 28 (2020), 560–573. https://doi.org/10.1080/10494820.2019.1696839
[19] Z. Liu, X. Jian, B. Shi, H. Zhang, Operation and maintenance simulation for smart substation equipment based on augmented reality technology, South. Power Syst. Technol., 13 (2019), 69–75.
[20] X. Xing, The application of augmented reality technology in medical clinical teaching, China Mod. Educ. Equip., 19 (2021), 32–34.
[21] J. Chen, Y. Zhou, J. Zhai, The application of virtual reality and augmented reality technologies in museum learning, Mod. Educ. Technol., 31 (2021), 5–13.
[22] Z. Turan, G. Atila, Augmented reality technology in science education for students with specific learning difficulties: Its effect on students' learning and view, Res. Sci. Technol. Educ., 39 (2021), 506–524. https://doi.org/10.1080/02635143.2021.1901682
[23] T. Chiang, S. Yang, G. Hwang, An augmented reality-based mobile learning system to improve students' learning achievements and motivations in natural science inquiry activities, Educ. Technol. Soc., 17 (2014), 352–365.
[24] C. Papakostas, C. Troussas, A. Krouska, C. Sgouropoulou, User acceptance of augmented reality welding simulator in engineering training, Educ. Inf. Technol., 27 (2022), 791–817. https://doi.org/10.1007/s10639-020-10418-7